Introducing ARCore: Android’s Answer To iOS’ ARKit
Close to three months ago, during Apple’s annual Worldwide Developers Conference (WWDC), the iPhone maker officially announced ARKit, its augmented reality (AR) platform for iOS. ARKit lets developers build AR apps easily, since any device running the upcoming iOS 11 will support the platform. Since then, ARKit has helped give birth to new AR apps such as a virtual pet game, a restaurant app that can show food on a dish, and a music video inspired by 1980s group A-ha’s hit Take On Me, among many others.
Not to be outdone, Google unveiled its own AR platform this week: ARCore. But wait, doesn’t Google already have an AR project called Tango? Isn’t this the same thing? Not really. Whereas Tango needs a specific set of hardware (such as depth-sensing cameras and sensors) to work, ARCore does not. Much like Apple’s ARKit, Google’s new AR platform needs only the user’s smartphone to work its magic.
Another thing ARCore has going for it: Google has made the platform available right now. Apple’s ARKit, by contrast, will only go live once iOS 11 is commercially rolled out, which will likely happen later this year (iOS 11, after all, was only unveiled back in June).
But with Google’s ARCore, developers can already start making full use of the platform on Pixel devices and Samsung’s Galaxy S8 smartphones, provided they run Android 7.0 Nougat or newer. In the long term, Google is looking to have ARCore run on more Android handsets made by some of the world’s top phone makers, including Samsung, Huawei, LG, and Asus.
For those who do not mind being bombarded with the technical stuff, know that like ARKit, ARCore can work with Java/OpenGL, Unity, and Unreal. Moreover, developers can expect ARCore to deliver on a trio of aspects: motion tracking (making use of the handset’s camera to determine the user’s position relative to its surroundings), environmental understanding (recognizing horizontally oriented surfaces such as floors and tables), and light estimation (rendering virtual objects consistently with the lighting and shadows of the immediate environment).
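To make that trio a little more concrete, here is a rough, non-runnable sketch of how those three capabilities surface in ARCore’s Java API. This is illustrative only: it assumes an Android app where an ARCore `Session` has already been created and resumed, it must run on a supported device, and the class and method names reflect the ARCore SDK rather than anything described in this article.

```java
// Illustrative sketch only: requires an Android device with ARCore support
// and a Session already created and resumed elsewhere in the app.
import com.google.ar.core.Frame;
import com.google.ar.core.LightEstimate;
import com.google.ar.core.Plane;
import com.google.ar.core.Pose;
import com.google.ar.core.Session;
import com.google.ar.core.exceptions.CameraNotAvailableException;

public class ArCoreSketch {
    private final Session session;

    public ArCoreSketch(Session session) {
        this.session = session;
    }

    // Called once per rendered frame from the app's render loop.
    public void onDrawFrame() throws CameraNotAvailableException {
        Frame frame = session.update();

        // 1. Motion tracking: where the camera (and thus the user) is
        //    relative to the world ARCore is tracking.
        Pose cameraPose = frame.getCamera().getPose();

        // 2. Environmental understanding: surfaces ARCore has detected,
        //    e.g. floors and tabletops, to anchor virtual objects on.
        for (Plane plane : session.getAllTrackables(Plane.class)) {
            // place content at plane.getCenterPose(), for example
        }

        // 3. Light estimation: match virtual lighting to the real scene.
        LightEstimate light = frame.getLightEstimate();
        float intensity = light.getPixelIntensity();
    }
}
```

In practice a renderer would feed `cameraPose` into its view matrix and scale its shader lighting by `intensity`, but the three calls above map one-to-one onto the three capabilities described.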