Apple developers made more than just superficial changes when they created the iPhone X.
Inside the phone, a special “neural engine” gives the device artificial intelligence (AI) and augmented reality (AR) capabilities. The engine could point to the future of smartphones that rely on machine-learning algorithms. What could the arrival of iOS 11 mean for UX designers? Let’s explore.
The iPhone X is replete with exciting new features, from face-scanning ID recognition to the elimination of the home button. The most dramatic change, however, is inside the phone: the A11 Bionic processor. The A11 processor Apple developed to power the iPhone X is a 64-bit, six-core chip that the company described as the most powerful ever in a smartphone. Here’s a breakdown of what makes the A11 Bionic so groundbreaking:
Part of the A11 Bionic processing system is the neural engine, a dual-core engine that can handle 600 billion operations per second. Apple designed the neural engine specifically to accelerate AI software, using “artificial neural networks” that can process speech and images efficiently, and it does so entirely on the device, without sending any data to Apple. The new capabilities the neural engine brings to the iPhone could change the future of smartphones everywhere.
Apple’s neural engine is a glimpse into the potential future of smartphone technology. A breakdown of the engine shows a pair of processing cores dedicated to machine-learning algorithms. These algorithms give the iPhone X the ability to recognize facial features, create Animojis, and handle augmented reality apps with finesse. The phone takes on complex AI tasks without lag time thanks to those 600 billion operations per second.
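To make that concrete, here is a minimal, hypothetical sketch of the kind of on-device workload the neural engine is built to accelerate: an image-classification request written with Core ML and Vision, the machine-learning frameworks Apple shipped alongside iOS 11. The “MobileNet” model name is an assumption for illustration; in practice you would add whatever trained .mlmodel file suits your app.

```swift
import CoreML
import Vision
import CoreGraphics

// Hedged sketch: classify an image entirely on the device with Core ML + Vision.
// "MobileNet" stands in for any .mlmodel you add to the project yourself;
// it is not bundled with iOS.
func classify(_ image: CGImage, completion: @escaping (String?) -> Void) {
    guard let model = try? VNCoreMLModel(for: MobileNet().model) else {
        completion(nil)
        return
    }

    // Vision hands the image to the Core ML model and returns ranked labels.
    let request = VNCoreMLRequest(model: model) { request, _ in
        let best = (request.results as? [VNClassificationObservation])?.first
        completion(best?.identifier)
    }

    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    do {
        try handler.perform([request])
    } catch {
        completion(nil)
    }
}
```

The point of the sketch is the design choice: the app describes what it wants (a classification), and the system decides how to run it on Apple’s silicon, with no round trip to a server.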
The standout feature of the new engine is its AI and AR capabilities. Apple’s approach to artificial intelligence is changing how users experience AI on their mobile devices. Historically, apps have offloaded AI features to the cloud. Using the cloud conserves battery power, but it’s less convenient and less secure. Apple found a way to bring users the benefits of AI on mobile without the cloud-related drawbacks – dedicating part of the A11 processor to AI workloads.
Apple brought AI straight to mobile back in June 2016, when it introduced differential privacy – the company’s way of statistically masking users’ identities when collecting data. The neural engine, however, brings new life to having AI-capable hardware on a phone. For these features, Apple no longer needs to collect or analyze user data in the cloud; the AI hardware on the phone itself eliminates the need for the cloud as a middleman. The neural engine isn’t just Apple’s brainchild – it’s the direction of the entire industry. Other companies are taking the same on-device approach to AI, including Huawei, Google, and Qualcomm.
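For readers unfamiliar with the idea, here is a tiny, hypothetical Swift sketch of the classic “randomized response” technique that underlies differential privacy. It is not Apple’s actual implementation – just an illustration of how noise added on the device lets a collector learn population-level trends without learning any individual’s true answer.

```swift
import Foundation

// Illustrative only: each user flips coins locally, so the collector
// only ever sees a noisy answer, never the true one.
func randomizedResponse(truth: Bool) -> Bool {
    // With probability 0.5 report the truth; otherwise report a random bit.
    return Bool.random() ? truth : Bool.random()
}

// Simulate 10,000 users, ~30% of whom truly have the sensitive attribute.
let reports = (0..<10_000).map { _ in
    randomizedResponse(truth: Double.random(in: 0...1) < 0.3)
}

// Observed rate ≈ 0.25 + 0.5 × true rate, so the collector can still
// estimate the population rate without knowing any individual's answer.
let observedRate = Double(reports.filter { $0 }.count) / Double(reports.count)
let estimatedTrueRate = (observedRate - 0.25) / 0.5
print("Estimated rate of the sensitive attribute:", estimatedTrueRate)
```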
When the iPhone X hit the market, UX design changed forever. The phone’s unique features – new screen size, no home button, rounded corners, and richer colors – raised the question of whether mobile website design needed to change. But while the aesthetics of the iPhone X are exciting, it’s the neural engine that may impact UX design the most. Now that Apple has placed the power of augmented reality and artificial intelligence in the hands of the masses, UX designers need to broaden the scope of their work.
New and improved AI and AR in iOS 11 mean that developers and designers no longer have to bake AR capabilities into their apps from scratch – an approach that is expensive and makes the user experience difficult to control. Instead, developers can rely on the AR components backed by the phone’s neural engine rather than devising and integrating their own AR solutions. The new engine has opened the door to an enormous AR-ready consumer base that’s easier than ever for developers to tap into, as the sketch below illustrates.
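As a rough illustration of how little AR plumbing an app now has to own, here is a minimal, hypothetical ARKit view controller for iOS 11+. The class and outlet names are assumptions for the example; the system frameworks (ARKit and SceneKit’s ARSCNView) do the tracking and rendering work.

```swift
import UIKit
import ARKit

// Minimal sketch: lean on the system's AR stack instead of shipping a
// custom AR engine inside the app. Assumes a storyboard with an ARSCNView.
class ARViewController: UIViewController {
    @IBOutlet var sceneView: ARSCNView!

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // World tracking uses the cameras and motion sensors; the heavy
        // lifting runs on Apple's hardware and frameworks, not in app code.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal
        sceneView.session.run(configuration)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        sceneView.session.pause()
    }
}
```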
Specialized AR applications will continue to be a hot trend in UX. The iPhone X marks the beginning of what is sure to be a revolution in the efficiency and dependability of AR applications on mobile. To keep up with the times, UX design will have to adapt. The only thing limiting future AR apps is the imagination of designers and developers. Bringing AR to the mainstream in a convenient, consistent, and reliable way will naturally change the way developers need to think about AR for mobile.
Apple has proven time and time again that change can be a very good thing. The iPhone X’s neural engine is certainly transforming AI and AR on mobile, but that only means developers have new tools at their disposal for creating exceptional user experiences. The trick is to understand what the new engine means for UX and UI, and how to turn its capabilities to the designer’s advantage.
Predictions for augmented and virtual reality place worldwide revenues at $162 billion by 2020. It is critical for UX professionals to learn about the neural engine in the iPhone X, recognize that this technology is spreading worldwide, and take advantage of new AR and AI technologies. The task UX designers now face is to come up with new ideas for AR applications that consumers will embrace, engage with, and truly enjoy. It’s a challenge that UX designers can turn into limitless opportunities with the right tools, knowledge, and a progressive mindset.
Main image source: developer.apple.com
Stephen Moyers has over a decade of experience as a technology consultant and web marketing manager. Since 2010, he has specialized in various technologies, bringing a...