
VincereAutPereo t1_itxy4d2 wrote

It's not the same thing. Your smart device has a few pre-programmed keywords it can process locally, like "hey Siri". However, actually doing full speech recognition takes a lot more processing power than your phone has. If you're worried about your phone sending out what you say, consider what your data usage would look like if your phone were sending a constant stream of raw recordings. It would have to be raw because, again, your phone isn't really able to process "I am going camping next weekend" on its own.
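A rough back-of-the-envelope check (a sketch in Python, assuming typical 16 kHz, 16-bit mono speech audio) shows why a constant raw stream would be hard to hide:

```python
# Estimate data usage for continuously streaming raw speech audio.
# Assumes 16 kHz sample rate, 16-bit mono PCM -- typical for speech.
SAMPLE_RATE_HZ = 16_000
BYTES_PER_SAMPLE = 2  # 16-bit samples

bytes_per_second = SAMPLE_RATE_HZ * BYTES_PER_SAMPLE  # 32,000 B/s
gb_per_day = bytes_per_second * 60 * 60 * 24 / 1e9    # seconds in a day

print(f"~{bytes_per_second / 1000:.0f} kB/s, ~{gb_per_day:.1f} GB/day")
# -> ~32 kB/s, ~2.8 GB/day of upload -- it would show up on any data plan
```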

13

EnglishDutchman t1_ity3fre wrote

To be fair, Siri never understood a single thing I said to it anyway. It barely had a 5% success rate, which is one of the other reasons I turned it off. Utterly pointless. It had no concept of context, so between that and getting 95% of things wrong, it took longer to constantly check everything it was doing than to just do it the old-school way.

3

sexytokeburgerz t1_itz1w0x wrote

This is almost always due to improper configuration: Siri needs you to say a few phrases in its settings to build a voice profile for you, and if those recordings have poor audio quality (wet phone, talking or music in the background, too far away), it won’t work very well.

It also improves the more you use it.

If you have a really deep voice, that’s a separate cause: you’d just have to talk a bit higher, both during setup and in everyday use. If the fundamental frequency of your voice sits in the low-frequency (LF) rolloff of the microphone filter, the processor will pick up the second or third harmonics of your voice instead, which will SERIOUSLY fuck up formant detection. In layman’s terms, deep voices need some work with these things.
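A toy sketch of that missing-fundamental effect (assuming, purely for illustration, a 4th-order high-pass as the mic’s rolloff and a naive strongest-FFT-peak "pitch" picker; real pitch trackers are more robust than this):

```python
# Toy demo: when the fundamental sits below the mic's high-pass
# cutoff, a naive spectral peak picker latches onto a harmonic.
import numpy as np
from scipy.signal import butter, sosfilt

fs = 16_000                    # sample rate (Hz)
t = np.arange(fs) / fs         # 1 second of samples
f0 = 80                        # deep voice: 80 Hz fundamental

# Harmonic-rich "voice": fundamental plus decaying harmonics
voice = sum((1 / k) * np.sin(2 * np.pi * k * f0 * t) for k in range(1, 8))

# Model the mic's low-frequency rolloff as a high-pass at 150 Hz
sos = butter(4, 150, btype="highpass", fs=fs, output="sos")
filtered = sosfilt(sos, voice)

def strongest_peak_hz(x):
    spectrum = np.abs(np.fft.rfft(x))
    return np.fft.rfftfreq(len(x), 1 / fs)[np.argmax(spectrum)]

print(f"peak before rolloff: {strongest_peak_hz(voice):.0f} Hz")     # ~80 Hz
print(f"peak after rolloff:  {strongest_peak_hz(filtered):.0f} Hz")  # ~160 Hz
```

After filtering, the second harmonic wins, so anything downstream that keys off that peak is analyzing the wrong "voice".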

4

EnglishDutchman t1_iu1teb1 wrote

I have a reasonably bland U.K. accent. I’ve tried Siri in English, American and Australian. Australian had about a 10% success rate; the rest were less functional. Most of the time it just stops listening. So if I dictate a message like “tell Mike I’ve left the house and will grab some food on the way”, it gets as far as “I’ve left the house and” and then just sends the message. The lack of context really irritated me, though. I couldn’t say “what’s next in my calendar?” and then, once it replied, ask “and what’s the location?” It treats them as two separate queries. Although mostly it would just reply “here’s what I found on the web for water rotation”.

2

sexytokeburgerz t1_iu1u75i wrote

Interesting, I hadn’t considered that. I live in a major US city, so there are quite a few Brits (and Australians) around, and I’ve seen them use it flawlessly and often. Maybe it’s because their accents are more West-Americanized.

Is this failure common in the UK?

As for query permanence, yeah, I have never used it past one line with success. It’s pretty much meant to be used like a search engine, I think.

1

EnglishDutchman t1_iu220cc wrote

Most of the people I know in the U.K. turned it off years ago. They can’t be bothered with the hassle of it. For dictation it needs to be 100% accurate, especially while driving. When it can’t even get a five-word sentence right, you give up after a while. I’m sure it works great for some people, but I’ve yet to meet one of those people. Example: even if I said the simplest thing, “hey Siri, next track”, it would either say “something went wrong” or “here’s what I found on the web for wet sack”. I found it to be a novelty more than anything else. Couldn’t tell me F1 race positions. Couldn’t tell me the weather. Couldn’t find stuff in my calendar. Didn’t understand requests to navigate.

1