The iOS devices lined up by Apple this year could predict the best
form of communication in a given situation while giving persons with
disabilities better options for controlling iPhones and iPads.
In a new patent approved by U.S. authorities this week, Apple
describes upcoming smartphone features built around a "method and
system for facilitating contacting people using electronic devices."
When deployed, the service will deliver appropriate information to
users on when and how to contact a specific person, Apple Insider said
in a report.
Prior to sending a message or making a call, the iPhone feature would intelligently determine the best way to get in touch with a contact. For example, if a person is in a business meeting and cannot pick up the phone, the iPhone would suggest sending a message instead, via SMS or email.
Or, if that person is in the same area as the initiating party, the same service would simply suggest an impromptu meeting.
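The decision logic described above could be sketched as a simple rule set. This is purely an illustration of the idea, not the patent's actual algorithm; the function name, fields such as `in_meeting` and `distance_km`, and the thresholds are all invented for this example.

```python
def suggest_contact_method(contact):
    """Suggest how to reach a contact based on hypothetical status signals.

    `contact` is a dict with assumed fields: `in_meeting` (bool) and
    `distance_km` (float, distance from the caller). These names are
    illustrative; the patent describes richer sensor inputs.
    """
    if contact["in_meeting"]:
        # The contact cannot take a call: fall back to an async channel.
        return "send SMS or email"
    if contact["distance_km"] < 0.5:
        # Both parties are nearby: an in-person meeting may be simplest.
        return "suggest an impromptu meeting"
    # No constraint detected: a phone call is fine.
    return "place a call"
```

A real implementation would draw these signals from the monitoring sources the patent mentions (calendar, GPS, and so on) rather than a hand-built dictionary.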
For the feature to behave correctly, it will depend on data fed by
monitoring devices such as GPS, cameras, microphones and personal
computers.
The service will likely be packaged into an iOS application, giving it a user interface that is easy to manipulate and program.
In another patent, Apple is considering the adoption of "specialized
techniques for controlling a device with touchscreen inputs using a
non-touch hardware accessory," Apple Insider noted.
This new set of services is most useful for persons with
disabilities, tearing down the barriers that limit their full use of
iPhones and iPads.
"For a user to interact with a touch screen device using touch input,
the user must generally be physically present at the device in order to
touch the screen," the Apple patent said.
"The user must also be capable of providing touch input and capable
of viewing the screen of the device or otherwise knowing where input is
needed," it further explained.
Essentially, the invention would further improve Apple's existing
accessibility options, providing easier or more direct manipulation of
numerous touch screen device functions such as page scrolling, text
input, text cut, text copy and text-to-voice.
The process involves the interaction between a joystick or a button
pad and an iPhone. These peripherals accept input commands, which
are then dispatched to the host handset for interpretation.
The data is then sent back to the accessibility instruments, this time translated into a sound file or Braille data.
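The round trip just described (accessory input → host interpretation → accessible feedback) can be sketched as follows. Every event name, mode, and mapping here is hypothetical; the patent describes the loop only in general terms.

```python
def handle_accessory_event(event, user_mode):
    """Interpret a raw event from a non-touch accessory and produce
    feedback for it. Names and payloads are invented for illustration.
    """
    # Map hardware inputs (joystick/button pad) to touchscreen actions.
    actions = {
        "stick_up": "scroll up",
        "stick_down": "scroll down",
        "button_a": "select",
    }
    action = actions.get(event, "ignored")
    # Translate the result into the feedback channel the user relies on,
    # mirroring the sound-file or Braille output mentioned in the patent.
    if user_mode == "audio":
        feedback = "speak: " + action
    else:  # assume a Braille display
        feedback = "braille: " + action
    return action, feedback
```

The key design point is that the handset, not the accessory, does the interpretation, so the same simple peripheral can drive many different on-screen functions.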
Source: International Business Times, 1 May 2013