The iMac on this desk needs Siri about as much as a nagging wife, a painful toothache, and a bill collector who won't stop calling. What thinking brings this annoying baggage to the desktop? I don't recall anyone expressing interest in trying to turn desktop computers into cellphones. (AppleInsider: Siri for Mac: How it works in Apple's macOS Sierra and what it's capable of)
A sample statement from the article shows Siri may be capable of doing something, though capability doesn't do much for awkward sentence construction:
"The system will moreover support contextual follow-up commands, which can narrow down results."
Apple's relentless drive to squeeze its entire future into a four-inch screen has done nothing for the corporation so much as vault it to the top of the heap for mindless consumerism of replaceable commodities. Every year there will be another iPhone, so buy one now to keep up with what the cool kids are doing. It's not clear whether Apple plays for high tech or a high school sock hop.
It seems the only substantive progress from Apple toward much of anything in recent years has been the Swift programming language, since it is apparently one of the few languages fully interoperable across platforms with full function on all of them. Interoperability is always good for the general flexibility it provides, but it's still not a substitute for forward evolution.
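For what it's worth, that cross-platform claim is easy enough to see in practice. Here is a minimal sketch, a toy of my own and not anything Apple published, of one Swift source file that builds and runs unchanged on macOS or Linux:

    // One source file, no edits, built with the same Swift toolchain on macOS or Linux.
    import Foundation

    #if os(macOS)
    let platform = "macOS"
    #elseif os(Linux)
    let platform = "Linux"
    #else
    let platform = "some other Swift platform"
    #endif

    // Foundation's Date and DateFormatter behave the same on both.
    let formatter = DateFormatter()
    formatter.dateStyle = .long
    print("Hello from Swift on \(platform); today is \(formatter.string(from: Date()))")

Run it with the same swift command on either machine and the only difference in the output is the platform string. That's the flexibility; it's still not evolution.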
Apple seems to run from the software business in general faster than a roaming Casanova from a jealous husband. Maybe even Final Cut is in trouble because it's too big to run on an iPad. There's some ludicrous acceptance that making an iPad version of anything doesn't reduce its function, and it's mystifying how Apple ever sold that idea.
The devolution is disappointing to see from a company which has been immensely important to my own artistic evolution, now that its biggest claim to fame is populating Instagram with more low-quality photographs than Kodak ever dreamed of for its Brownie camera.
Damn shame it is to see.
2 comments:
The phenomenon of trying to turn a computer with a 29-inch screen into a phone isn't unique to Apple. They're all doing it these days, and it annoys the hell out of me. I bought that video real estate for a reason, and I don't want it trivialized for the convenience of some hack of a programmer who can't handle real coding.
As to AAPL, it looks like they're hanging their hat on a new, flexible, wrap-around screen for the iPhone. /shrug For me it falls in the "well, that's cute, but how about something of substance" category.
Now, related to the iPad, well, that entire platform is dying a slow and painful death. The "phablets," i.e. phones with screens larger than 5 inches, are rapidly replacing it. You still see tablets in use in the corporate world and in the service (repairman) industry, but for the general consumer, they're dead.
AI - or rather, predictive intelligence - is growing fast, and now AAPL is playing catch-up with Siri. Google already blows it away, and the technology is improving daily. Voice recognition is absolutely amazing, and I use voice to text whenever possible. The problem, though, is it's not practical in the business world. So if they really want to impress me, they need to continue the research Sony was doing several years ago and perfect the ability to read brain patterns. I don't want to speak to a device; I want to THINK to the device. Now, that would be useful technology!
This Where's the Beef problem has been bugging me for quite a while as these phabulous phones took over everything. That would have been fun if not for the cost to, as you say, improvements of substance. I suspect it will be quite a while before anything reads brain waves, or whatever, with the precision I need for any kind of data entry. The most obvious option after that is some kind of electronic implant.
Skipping past any ethical aspect, it's all very well if the chip gives the head the ability to produce machine-readable data, but it still doesn't get anything into a machine without some analog of Bluetooth. Typing is so archaically primitive, not only because of the mindless tapping motions to move the keys but also because of the posture required to do them.
Presumably any kind of designing requires some sort of surface on which to visualize bits and pieces of whatever comprises this Incredible Thing. I don't need to be free of that surface and don't even want to be, but I do want to be free to move around, maybe interact with others, during the course of I/O to it.
All for the cost of a chip in my head ... which probably causes cancer ... and warts ... of course also autism.