Google and Android fans around the world tuned in to yesterday’s big hardware event in anticipation of the announcement of Google’s own Pixel phone. Our Android developers gathered in the kitchen to watch, and afterwards we asked for their feedback on the event and the many new products Google announced.
What were the biggest surprises coming out of the event?
Tyler McCraw: The biggest surprise for me was Google announcing the highest-rated built-in phone camera. We all know that Google and Apple are trying to one-up each other all the time, but I wasn’t expecting Google to do it with the Pixel’s camera. Not to mention, with the Pixel you get unlimited high-quality backup for your photos and videos! How can they even offer that!?
Nick Cook: I was also surprised by the unlimited photo and video backup. That’s a huge commitment from Google. It’s a bold one, too, since a lot of companies are turning to charging users for storage to generate income. But Google does like pushing the boundaries of industry expectations.
Austin Lanier: I’m going to semi-echo Tyler on this one. I was also excited about the camera improvements. I came to Android from iOS, and the camera has occasionally been a pain point for me when comparing functionality between the two platforms. I was excited to see this area get the attention it deserves, since it’s an improvement that requires control over the whole stack to really do something notable.
What are you most excited about?
Nick C: I’m excited about the new advances Google has made in their natural language processing and translation technologies. As someone who loves languages and learning new ones, I’m glad to see Google take a new approach to translation. It should provide much more accurate translations for users. It will also be useful for Google Assistant, since it can provide real-time translation and ensure that any responses it gives are easily accessible to users. It should open up Google’s products to an even wider language audience.
Austin L: I like the movement toward improved product integration. This isn’t a new thing, but I like the emphasis on it with the latest update. iOS has always marketed itself as having its own ecosystem, and a lot of iOS users really enjoy the product integration Apple provides and are willing to pay more for it. That’s something Android has historically lacked compared to its iOS counterpart. With Google taking the lead on these products, they’re better positioned to integrate with one another, which will help build the product-to-product ecosystem Android should have.
Tyler M: The piece I’m most excited about from today’s talk is Google’s new initiative to build Google Assistant into their devices and to expose the Assistant APIs (and other machine learning technologies, like TensorFlow) to developers. Google Assistant is just the first leap toward letting people have an AI companion alongside them every day. Google Now, Siri, and Microsoft’s Cortana were all great, but Google Assistant is going to push us to the next level. If other companies buy into developing features that work with AI, it’s going to make all of our lives so much easier. Our kids are going to look back at us one day and ask why we had to go to a website to order a pizza instead of just asking an AI to order it for us.
Is Pixel a Galaxy killer?
Nick C: Not necessarily. Samsung is currently one of the most popular brands of Android smartphones, if not the most popular. A lot of people like the Samsung skin and the experience it provides. It’ll take a lot for enough users to move away from it to kill the Galaxy.
Tyler M: Even with the recent issue of Galaxy Note 7s shipping with faulty batteries, Samsung holds a strong position in the market. As of this writing, Samsung still has the top five Android phones on the market, according to Google Play statistics gathered by AppBrain. However, I think they’re going to have more competition now that Google has released the Pixel, a phone with better specifications than anything in Samsung’s lineup. Not to mention, it comes with all of Google’s latest Android software by default.
Austin L: I’m not sure if I should, but I still want to take a blunt stance on this one. I’m not really a Samsung fanboy. They make good hardware (the current Note 7 battery debacle aside), but they fall short in some of the software additions and other areas that make a big difference from a usability and developer perspective. As someone who mostly uses stock, or “pure,” Android, I still occasionally find that navigating a Samsung device is more of a transition than I would expect. I’m more excited to see the potential of another leading competitor.
Nick C: I definitely prefer the stock Android experience. Whenever I’m on a Samsung device, I don’t find the navigation to be particularly intuitive.
What’s the significance of Google owning the hardware?
Nick C: I think Google has to be careful not to neglect other phone manufacturers going forward. If they limited new features to Google-made phones, it would make a lot of users angry. It would also anger the manufacturers, who would feel unsupported by Google while losing a lot of their customers.
Tyler M: I think you’re right, Nick. Google has gotten in trouble in the past for excluding other companies from their services or promoting their own offerings over others’, so they need to be tactical about how they sell their hardware. However, because Google is still partnering with HTC to make the Pixel, I think there’s still hope out there for other manufacturers. And because Google is now branding it as their own, they can set the bar higher for the industry, which is what Google loves to do: disrupt an industry and raise the bar.
Nick C: True. I think that if Google continues to partner with manufacturers and make their features available (like they’re doing with the Assistant SDK), they have a good chance of doing well with their own branded hardware.
Austin L: All of this is fair, but sometimes, to really improve something that’s already good, you have to have control over the whole process. Google can put out standards all day, but if they’re not properly followed, you’re going to have deficits. This circles back to my camera experience and why I found that announcement exciting. I’m hoping that, after these changes, the Pixel will be more on par with the competition, rather than the blowout I experienced between my roommate’s iPhone 6 Plus and my Nexus 6 when we toured the West Coast. The quality difference between our pictures in Yosemite was hard for me to ignore (he basically ended up taking pictures the entire trip), and the camera stack is definitely an area where you need control over both the hardware and the software to see significant improvement.
Tyler M: And let’s not forget about the standard they’re effectively setting with a super speedy charge time. Seven hours of battery life from a 15-minute charge is an extremely difficult new standard to match, and it won’t go unnoticed. With that kind of capability, I foresee fewer people needing to buy battery packs and more people carrying their phones with them for longer stretches. People will be able to get more use out of their phones, rather than having to sit around an outlet while they charge.
What is the significance of Google’s commitment to AI?
Tyler M: Ever since Google created Google Now and acquired DeepMind Technologies, we’ve known Google to be involved in the development of machine learning and artificial intelligence. Google even has a Machine Intelligence research organization, which released an open-source machine learning library called TensorFlow. So we know just how committed Google is to building better AI. But after the talk today, I think more consumers will realize the benefits of having an AI companion with them at all times. Billions of people on the planet already carry a phone with them every day, and now AI can make our lives easier.
Nick C: I agree with Tyler here. I think Google is going to make a lot of users realize the benefits of having their own personal AI. Google said in the talk today that it’s going to be the next big leap in computing, and I agree.
Austin L: Yeah, AI is the next big thing, whether you want it to be or not. Ever since the announcement of Google Photos and its unlimited photo uploads a while back, this has been pretty apparent to me. We’re helping to feed the learning! The fact that a system like TensorFlow can learn from input data, and is openly available to anyone, is cool enough. And just look at the products coming out of Google with it as the foundation. It’s really empowering to know that it’s open source and anyone can tinker with it.
Tyler M: Definitely. Now that Google has open-sourced TensorFlow and opened an API for the Google Assistant, called “Actions on Google,” I’m excited to see what companies are going to do with these tools. It was amazing to see in today’s demos how easily the Google Assistant could create an OpenTable reservation or book an Uber for the user. And let’s not forget that they’re going to release an SDK for Google Assistant for embedded devices and systems. All of this motivates me, as an Android developer, to implement Google Assistant functionality in the apps I work on.
What does this mean for you as an Android developer?
Nick C: Google’s development of new AI technologies and its release of the Assistant mean that users are going to be interacting with their phones and apps in new ways. It’s going to be up to us as developers to make sure our apps implement these new technologies and expose our features in new and exciting ways. Users will soon expect a less traditional approach to apps.
Austin L: Hopefully not much. It’s Android, so it should be a seamless transition: just another device to test on, and ideally fewer bugs arising from testing on that device!
Tyler M: Yeah, as far as the new Pixel goes, I don’t think we’ll need to do much specifically for the device. But it seems that fingerprint sensors are becoming more and more common on Android devices, so we’ll need to start accounting for them when building applications that involve a user login. I’ve actually already started working on a library to make it easier to use fingerprint scanning as a sign-in mechanism for applications (a rough sketch of the underlying API is included at the end of this post). We’ll also probably start researching the “Actions on Google” APIs and TensorFlow. You’ve already started working on an application that integrates TensorFlow, right, Austin?
Austin L: Definitely. I’ve been playing with it internally during my labs time. I’ve been experimenting with TensorFlow by working through the installation and running a neat Neural Style repo to test it out. I’ve also been comparing it against a similar implementation in Torch.
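Tyler’s fingerprint library isn’t shown here, but as a rough illustration of the platform API it would sit on, here is a minimal sketch of fingerprint-based sign-in on Android 6.0 (API 23) and above using the framework’s FingerprintManager. The class and listener names (FingerprintSignIn, Listener) are illustrative placeholders, and a production version would also pass a CryptoObject to tie the scan to a keystore key.

    import android.Manifest;
    import android.content.Context;
    import android.content.pm.PackageManager;
    import android.hardware.fingerprint.FingerprintManager;
    import android.os.CancellationSignal;
    import android.support.v4.content.ContextCompat;

    public class FingerprintSignIn {

        /** Callbacks for the app's sign-in flow; names are illustrative. */
        public interface Listener {
            void onSignedIn();
            void onError(CharSequence message);
        }

        private final Context context;
        private final FingerprintManager fingerprintManager;
        private CancellationSignal cancellationSignal;

        public FingerprintSignIn(Context context) {
            this.context = context.getApplicationContext();
            this.fingerprintManager =
                    (FingerprintManager) this.context.getSystemService(Context.FINGERPRINT_SERVICE);
        }

        /** True if USE_FINGERPRINT is granted, a sensor exists, and a fingerprint is enrolled. */
        public boolean isAvailable() {
            return ContextCompat.checkSelfPermission(context, Manifest.permission.USE_FINGERPRINT)
                    == PackageManager.PERMISSION_GRANTED
                    && fingerprintManager.isHardwareDetected()
                    && fingerprintManager.hasEnrolledFingerprints();
        }

        /** Starts listening for a fingerprint touch; call cancel() when the screen goes away. */
        public void startListening(final Listener listener) {
            cancellationSignal = new CancellationSignal();
            fingerprintManager.authenticate(
                    null /* CryptoObject: a real sign-in flow would back this with a keystore key */,
                    cancellationSignal,
                    0 /* flags */,
                    new FingerprintManager.AuthenticationCallback() {
                        @Override
                        public void onAuthenticationSucceeded(
                                FingerprintManager.AuthenticationResult result) {
                            listener.onSignedIn();
                        }

                        @Override
                        public void onAuthenticationError(int errorCode, CharSequence errString) {
                            listener.onError(errString);
                        }
                    },
                    null /* handler: null uses the framework's default */);
        }

        public void cancel() {
            if (cancellationSignal != null) {
                cancellationSignal.cancel();
            }
        }
    }

A sign-in screen would check isAvailable() before showing a fingerprint prompt, call startListening() in onResume(), and call cancel() in onPause() so the sensor isn’t held while the app is in the background.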