These are my notes from UX LX 2016, recently recovered from Evernote and therefore being published outside of their natural sequence. For the rest of the content, open the UXLX16 story page.
Humans respond to the volume they hear.
Today we are going to cover the process of conversation: how we speak to machines, and how to make machines with voice interfaces stronger.
When we walk down the street, we don't make eye contact with other humans.
Making eye contact is a form of recognition, a signal that we would like to engage in conversation.
Learning about Gricean Maxims (and how fictional narratives constantly break them) from @jonesabi #uxlx https://www.sas.upenn.edu/~haroldfs/dravling/grice.html
Computers need to wait for us to stop talking in order to process what we said.
The system then extracts tiny parts of speech and runs sound recognition to understand the natural language.
The dialogue manager keeps track of this process and identifies the different elements of the sentence.
Finally, it generates natural language in response.
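The steps above can be sketched as a pipeline. This is a toy illustration, not a real speech API: `recognize_speech`, `parse_intent`, and the booking intent are all made-up stand-ins for the stages the talk described.

```python
# Toy sketch of the listen -> understand -> track -> respond pipeline.
# All functions and the "book_table" intent are hypothetical stand-ins.

def recognize_speech(audio):
    """Stand-in for sound recognition: audio in, text transcript out."""
    return "book a table for two"  # placeholder transcript

def parse_intent(text):
    """Stand-in for language understanding: text in, intent + elements out."""
    return {"intent": "book_table", "slots": {"party_size": "two"}}

class DialogueManager:
    """Keeps track of the conversation and the elements of each sentence."""
    def __init__(self):
        self.history = []

    def handle(self, parsed):
        self.history.append(parsed)
        if parsed["intent"] == "book_table":
            size = parsed["slots"].get("party_size", "some")
            return f"Booking a table for {size}."
        return "Sorry, can you rephrase that?"

def generate_reply(audio):
    text = recognize_speech(audio)           # 1. wait for silence, transcribe
    parsed = parse_intent(text)              # 2. extract the sentence elements
    return DialogueManager().handle(parsed)  # 3. track dialogue, 4. generate language

print(generate_reply(b""))  # -> Booking a table for two.
```

A real assistant swaps each stand-in for a trained model, but the division of labour between recognizer, parser, and dialogue manager is the same.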
Paris - France + Italy = Rome.
Baseball - Bat + Racket = Tennis.
There is a hidden layer that humans don't have access to, and it processes this information to come up with the most probable outcome.
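The "Paris - France + Italy = Rome" trick is word-vector arithmetic. A minimal sketch with made-up 2-D vectors (real systems learn hundreds of dimensions from text, and these particular coordinates are my assumption for the demo):

```python
# Toy word-vector analogy: Paris - France + Italy ≈ Rome.
# The 2-D vectors are invented for illustration only.
import math

vectors = {
    "paris":  (1.0, 3.0),
    "france": (1.0, 1.0),
    "italy":  (2.0, 1.0),
    "rome":   (2.0, 3.0),
    "tennis": (5.0, 0.0),
}

def add(a, b): return (a[0] + b[0], a[1] + b[1])
def sub(a, b): return (a[0] - b[0], a[1] - b[1])

def nearest(v, exclude=()):
    """Word whose vector lies closest (Euclidean distance) to v."""
    return min((w for w in vectors if w not in exclude),
               key=lambda w: math.dist(v, vectors[w]))

query = add(sub(vectors["paris"], vectors["france"]), vectors["italy"])
print(nearest(query, exclude={"paris", "france", "italy"}))  # -> rome
```

The "hidden layer" the talk mentions is exactly this learned vector space: the arithmetic happens in coordinates no human ever inspects directly.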
Cohesion is the glue of discourse.
— Cohen, Giangola & Balogh
My kid only eats rice. He can't survive on that.
Computers can now understand pronouns.
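What "understanding pronouns" means can be shown with a deliberately naive sketch: link "he" back to the most recent animate noun. Real systems use trained coreference models, not this heuristic, and the word lists below are assumptions for the demo:

```python
# Naive pronoun resolution: bind a pronoun to the most recent animate noun.
# Real coreference resolution is learned, not rule-based like this.
PRONOUNS = {"he", "she", "it", "they"}
ANIMATE = {"kid"}  # tiny hand-made lexicon for the demo

def resolve_pronouns(text):
    tokens = text.lower().replace(".", "").split()
    resolved, last_animate = [], None
    for tok in tokens:
        if tok in ANIMATE:
            last_animate = tok
        if tok in PRONOUNS and last_animate:
            resolved.append((tok, last_animate))
    return resolved

print(resolve_pronouns("My kid only eats rice. He can't survive on that."))
# -> [('he', 'kid')]
```

Even this toy version captures the core idea of the rice example: "he" points back across the sentence boundary to "kid", and a machine has to track that link to follow the discourse.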
Rhythm, stress and intonation.
For every word spoken to a computer, we need to give different levels of intonation
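One way intonation is made explicit for machines is SSML, the W3C markup used by speech synthesizers; a minimal fragment (the exact pitch and rate values are arbitrary choices for illustration):

```xml
<!-- SSML sketch: per-word control of intonation and stress -->
<speak>
  <prosody pitch="high" rate="slow">Really?</prosody>
  <emphasis level="strong">No</emphasis> way.
</speak>
```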
Context is an agent’s understanding of the relationships between the elements of the agent’s environment.
— Andrew Hinton
People will only use voice interfaces when failure modes are handled gracefully: instead of simply saying it can't do that, the computer should ask for more information.
A conversation interface that really works needs to be able to match all these requirements.
Really interesting talk about voice interfaces by @jonesabi More info here https://t.co/IYWEAg95hn #uxlx #design #ux
— Andrés-Leonardo Martínez-Ortiz, PhD (@davilagrau) May 26, 2016
recording of ‘how we talk, how machines listen’ 20160526 11:01:31.m4a