Yesterday AT&T opened the doors to one of its Foundry Innovation
Centers to showcase and demo some of the technologies currently under
development at AT&T Labs and the Foundry in Palo Alto. The company has
three Foundry centers (the other two are located in Plano, Texas and
Tel Aviv, Israel), all designed specifically to bring teams together to
advance developers’ projects through collaboration.
There
were fifteen such projects being shown, spread across a variety of
topics and features but all focused on using, integrating, or involving
the network. From a Remote Patient Monitoring project that incorporates
video calls, tablets, and Bluetooth to keep doctors updated on patient
vitals, to the Alpha API Platform, which acts as a personal feedback
system for developers, the projects displayed ingenious ideas designed
to improve existing technology or build on the existing network to
provide new services and functionality. Here are a few more projects
that caught my eye yesterday.
There are a few things in this
world that I passionately hate: the post office, the so-called
“mid-season” break of television shows, and a phone tree are all high on
that list. I don’t want to fight with robots; I just want to be
connected to a person who can handle my problem. The Visual Interactive
Voice Response was designed specifically to solve the difficulties of
maneuvering through an automated messaging system by creating a visual
navigation tree. Much like the difference between original voicemail and
visual voicemail, Visual IVR lets you opt in to a visual session by
pressing a button; it then sends you an SMS message with a link that
leads you to the visual tree, which looks very similar to a mobile
app. You can then navigate through the session to, say, book a flight,
or you can opt to connect to an agent to assist you. The best part is
that you can switch between the two seamlessly: the program can resume
right where you left off, so you won’t need to duplicate any steps.
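To make that channel hand-off concrete, here is a minimal sketch of how a shared session state could let a caller move between the voice flow and the SMS-linked visual tree without repeating steps. Everything here (the VisualIVRSession class, the menu tree, the link format) is a hypothetical illustration, not AT&T's actual implementation.

```python
# Hypothetical sketch: one shared session record that both the voice flow and
# the SMS-linked visual tree read and write, so switching channels resumes at
# the same menu node instead of starting over.

MENU_TREE = {
    "root":    {"prompt": "Main menu",                  "options": {"1": "flights", "2": "agent"}},
    "flights": {"prompt": "Book a flight",              "options": {"2": "agent"}},
    "agent":   {"prompt": "Connecting you to an agent", "options": {}},
}

class VisualIVRSession:
    def __init__(self, caller_id):
        self.caller_id = caller_id
        self.node = "root"
        self.channel = "voice"   # "voice" or "visual"

    def opt_in_to_visual(self):
        """Caller pressed the opt-in key: switch channels and return the link
        that would be sent over SMS. The node is untouched, so the visual
        tree opens exactly where the voice flow left off."""
        self.channel = "visual"
        return "https://example.invalid/ivr/%s?node=%s" % (self.caller_id, self.node)

    def select(self, option):
        """Advance the session, no matter which channel made the choice."""
        next_node = MENU_TREE[self.node]["options"].get(option)
        if next_node:
            self.node = next_node
        return MENU_TREE[self.node]["prompt"]

# Start on the phone, opt in, and finish in the visual tree without repeating steps.
session = VisualIVRSession("caller-42")
session.select("1")                  # chose "flights" by voice
sms_link = session.opt_in_to_visual()
print(session.select("2"))           # continues from "flights": "Connecting you to an agent"
```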
Designed to be a highly
accessible remote app for those with visual or hearing impairments, the
U-Verse Easy Remote App responds to voice commands and is powered by
AT&T Watson speech recognition technology. Speech is captured by the
app and sent to cloud servers, where it is matched against available
programming and results are returned. You can search for a program by
name or by the name of a cast member, and it will show a list of
results that are currently on U-verse. The app uses the AT&T Speech
API to recognize a speaker and improve accuracy over time. It can
remember favorite functions and recognize gestures, and it plays nicely
with the iPhone’s VoiceOver screen reader.
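As a rough illustration of that capture, transcribe, and match loop, here is a small sketch in Python. The transcribe() stub and the tiny program guide stand in for the cloud speech service and the live U-verse listings; they are placeholders, not the actual AT&T Speech API.

```python
# Hypothetical sketch of the voice-search flow: captured speech goes to a
# cloud speech service, the recognized text comes back, and it is matched
# against the current program guide by title or cast member.

LISTINGS = [
    {"title": "Example Cooking Show", "cast": ["Jane Doe"]},
    {"title": "Example Mystery Hour", "cast": ["John Smith", "Jane Doe"]},
]

def transcribe(audio_bytes):
    # In the real app, the captured audio would be posted to the speech
    # service here and the recognized text returned. Stubbed for the sketch.
    return "jane doe"

def search_programs(audio_bytes):
    query = transcribe(audio_bytes).lower()
    matches = []
    for show in LISTINGS:
        in_title = query in show["title"].lower()
        in_cast = any(query in member.lower() for member in show["cast"])
        if in_title or in_cast:
            matches.append(show["title"])
    return matches

print(search_programs(b"raw-microphone-audio"))
# ['Example Cooking Show', 'Example Mystery Hour']
```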
Another program using AT&T Watson speech technology, Text Translation,
also takes advantage of AT&T’s extensive SMS services to “directly
transcribe language for various applications.” Basically, when I type
out a text in English and send it to my friend in Spain, it arrives
for him in Spanish, and vice versa. The application uses regular
SMS messages and works by sending your message to the cloud in
AT&T’s network, where it is translated and then delivered to the
receiver in real time. There’s no app to download, install, or open;
it’s as simple as checking a box to indicate that you want the feature
for your SMS messages. Going forward, there are plans to extend the
reach of the project so that incoming texts can be read aloud to the
receiver, and to make it work across multiple carriers.
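Here is a minimal sketch of what that network-side relay might look like: the message is intercepted, translated into the recipient's preferred language, and delivered as an ordinary SMS. The preference table, phrasebook, and function names are all invented for illustration, not AT&T's actual services.

```python
# Hypothetical sketch of the in-network translation relay: intercept the SMS,
# translate it into the recipient's preferred language, then deliver it.

PREFERRED_LANGUAGE = {            # recorded when a subscriber checks the opt-in box
    "+14155550100": "en",
    "+34915550123": "es",
}

# Tiny stand-in dictionary so the sketch runs without a real translation service.
PHRASEBOOK = {("en", "es"): {"hello, see you tomorrow": "hola, hasta mañana"}}

def translate(text, source, target):
    if source == target:
        return text
    return PHRASEBOOK.get((source, target), {}).get(text.lower(), text)

def relay_sms(sender, receiver, text):
    """Translate the message inside the network before final delivery."""
    source = PREFERRED_LANGUAGE.get(sender, "en")
    target = PREFERRED_LANGUAGE.get(receiver, "en")
    return translate(text, source, target)

print(relay_sms("+14155550100", "+34915550123", "Hello, see you tomorrow"))
# -> hola, hasta mañana
```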
Think of it as an always-on Google Hangout, with cameras set all over
the office, something even the developer himself admitted could enable
some “potentially creepy privacy issues,” which they plan to address
with both filters that let you put a slider on your visual presence and
encryption of the videos.
While you can go back and review video conversations that have happened
in the past, you would need permission from all the video’s
participants to do so. The goal is for distant coworkers to still be
able to reap the benefits of collaboration and to enjoy a seamless
connection with colleagues. Currently it’s being used at AT&T Labs;
in the future the plan is to deploy it more broadly and incorporate
more advanced features such as facial recognition and augmented
reality.
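To illustrate that consent rule on recorded conversations, here is a short sketch in which playback stays blocked until every participant has granted permission. The class and field names are mine, purely for illustration, and are not drawn from the actual system.

```python
# Hypothetical sketch of consent-gated playback: a recorded conversation can
# only be replayed once every participant in it has approved.

class Recording:
    def __init__(self, recording_id, participants):
        self.recording_id = recording_id
        self.participants = set(participants)
        self.approvals = set()

    def approve(self, participant):
        if participant in self.participants:
            self.approvals.add(participant)

    def can_replay(self):
        # Playback is allowed only when every participant has consented.
        return self.approvals == self.participants

clip = Recording("clip-001", ["alice", "bob"])
clip.approve("alice")
print(clip.can_replay())   # False: bob has not consented yet
clip.approve("bob")
print(clip.can_replay())   # True
```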