One thing is for sure: Apple is working on something big. We're just not exactly sure how big yet.
Each year, we set aside two full days of work to explore emerging technologies, challenge ourselves to learn something new, collaborate with our co-workers, and make something amazing. Bottle Rocket’s annual two-day hackathon, Rocket Science, is a chance for all Rocketeers to explore new technologies or interests while creating something fun for the office or potentially life-changing for its users.
With nearly 30 projects this year, a full recap would be a little hefty. So, here are five that we can't stop talking about.
Project Title: CF Alert
Tech Explored: Progressive Web Apps (PWA)
Cystic Fibrosis is a devastating disease afflicting more than 30,000 Americans, with more than 75 percent of those diagnosed before turning two years old. As if that weren’t enough, children with Cystic Fibrosis cannot be within approximately 20 feet of each other due to the risk of transferring bacteria that can lead to serious infection in other patients. With that in mind, team CF Alert built a proof-of-concept PWA that would notify parents when another parent of a child with Cystic Fibrosis was within 20 feet of them. The concept was simple, elegant, and easy to use, and could potentially be life-saving for families coping with this serious condition.
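The core of such an alert is just a distance check between two reported locations. Here is a minimal sketch of that logic, assuming each family's app periodically reports geolocation coordinates to a shared backend; the function names and the alert radius handling are our own illustration, not the team's actual code.

```python
from math import radians, sin, cos, asin, sqrt

ALERT_RADIUS_FT = 20.0
EARTH_RADIUS_FT = 20_902_231  # mean Earth radius, in feet

def distance_ft(lat1, lon1, lat2, lon2):
    """Great-circle distance between two coordinates, in feet (haversine formula)."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * EARTH_RADIUS_FT * asin(sqrt(a))

def should_alert(me, others, radius_ft=ALERT_RADIUS_FT):
    """Return True if any other tracked family is within the alert radius."""
    return any(distance_ft(*me, *other) <= radius_ft for other in others)
```

In a real PWA, `should_alert` would run server-side against recent position reports, and a push notification would be sent when it fires; GPS accuracy at a 20-foot radius would also need careful handling.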
Project Title: How Metal Am I?
Tech Explored: Create ML and CoreML
Leveraging Machine Learning for image and audio frequency analysis, this team looked to answer just how METAL something was. Simply snap a photo or let the “Metal Detector” app listen to the sounds around you, and it will provide a Metal Quotient for how hardcore it is. While showcasing the app in the photo above, Russell took a selfie that earned a 34% Metal score, likely due in large part to the Led Zeppelin shirt.
Project Title: Empathy Lab
Tech Explored: Analog
Several accessibility experts at Bottle Rocket worked together to create a truly interactive experience to help others empathize with those who are unable to use traditional inputs for the technology we interact with daily. For instance, how do you tap on a touch screen when you cannot use your hands? This team set out to convey those struggles. Seen at the top of this page, the blue bar in the image is actually a form of crosshair that moves across the screen and stops when a Switch is triggered. From there, a horizontal line appears, and the process is repeated. The user must time their taps so the blue lines intersect over the icon or button they want to click. It’s difficult to imagine having to use these devices in everyday life, but this experience definitely shed light on the importance of designing with these needs in mind.
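This interaction pattern is known as two-pass switch scanning, and it can be modeled in a few lines. The following is a toy model of the selection logic described above; the function names and tick-based timing are our own simplification, not the Empathy Lab's implementation.

```python
def scan(positions, press_tick):
    """A cursor sweeps repeatedly over `positions`, one per tick.

    The position under the cursor when the user presses their switch
    (at `press_tick`) is the one that gets locked in.
    """
    return positions[press_tick % len(positions)]

def select_cell(columns, rows, vertical_press_tick, horizontal_press_tick):
    """Two-pass selection: first lock a column, then a row.

    The intersection of the two locked lines is the 'tap' target.
    """
    col = scan(columns, vertical_press_tick)
    row = scan(rows, horizontal_press_tick)
    return (col, row)
```

Even in this toy form, the difficulty is apparent: a mistimed press on either pass means restarting the whole sweep, which is exactly the friction the Empathy Lab wanted sighted, able-handed users to feel.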
Project Title: Device Manager Voice App
Tech Explored: Google Assistant and Firebase
As Digital Voice Assistant adoption continues to rise, we continue to look for applications of the devices that go beyond simple “questions and answers.” Just as the name says, this team was able to use a Google Assistant App to manage our Quality Assurance device cabinet (which by the way currently includes over 400 devices). Instead of picking out a device, walking up to the checkout system, entering your name, scrolling to find the device (you get the picture), Rocketeers can now walk up and have a conversation with Google to expedite the checkout process.
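In practice, a voice checkout like this means routing a recognized intent ("check out this device, for this person") to the inventory database. Below is a hedged sketch of that fulfillment logic with a plain dictionary standing in for the Firebase-backed inventory; the device names, function name, and reply strings are hypothetical, and the real app would answer these intents through a Google Assistant fulfillment webhook.

```python
# In-memory stand-in for the Firebase-backed device inventory.
INVENTORY = {
    "iPhone X": {"available": True, "holder": None},
    "Pixel 2": {"available": False, "holder": "Dana"},
}

def handle_checkout_intent(device_name, rocketeer):
    """Fulfil a 'check out <device>' voice request; return the spoken reply."""
    device = INVENTORY.get(device_name)
    if device is None:
        return f"I couldn't find {device_name} in the cabinet."
    if not device["available"]:
        return f"{device_name} is already checked out to {device['holder']}."
    device.update(available=False, holder=rocketeer)
    return f"Done. {device_name} is checked out to {rocketeer}."
```

The win over the old flow is that the name lookup, the device search, and the status update all collapse into one spoken sentence.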
Project Title: BLAST'EM
Tech Explored: Image recognition, Arduino, Machine Learning
Tired of Imperials marching across your lawn? BLAST'EM has a solution to rid you of those pesky Stormtroopers. By combining Machine Learning with Arduino hardware, this team created an automated Nerf gun that could recognize a Stormtrooper and fire once it found its target. The best part? They saw about a 25% success rate when using a paper mask, but a nearly 100% success rate once Founder and CEO Calvin Carter ran over and grabbed his homemade Stormtrooper helmet.
As this year’s 7th annual Rocket Science comes to a close, we would be remiss if we didn’t mention just a few other notable projects that our Rocketeers conceived and created over the course of 48 hours.
Our “favorite fail” of the year goes to a team that attempted to teach a computer to play Mario Kart on an iPad, only to find out the emulator and Machine Learning model required different versions of iOS.
Want to learn more about the technologies used in this year’s Rocket Science and how they can benefit your business? Contact us today.
This year, it seems many tech companies are focusing on “quality of life” (QOL). Usually the phrase “quality of life update” refers to a software update that improves the overall experience of an application or game – a combination of bug fixes, interface tweaks, performance enhancements, and anything else that makes a particular piece of software more pleasant to use. Recently, however, we’ve noticed more and more emphasis being placed on the user’s QOL rather than the software’s. Both Apple and Google have released features to help users spend less time on their phones and more time with those around them. Digital Health is not a new concept, but it does seem to have gone by the wayside in recent years. And while it wasn’t the one key takeaway we chose to highlight in this article (it was a close second), we would be remiss not to mention the hot topic of app optimization. Quite a bit of time was spent covering how developers could and should optimize apps in every way possible – in file size, in performance, and in the amount of time users need to spend in them to accomplish a task (which you should be doing anyway).
Speaking of QOL, Apple spent a majority of the keynote announcing new features for their apps and devices: Search Suggestions for Photos, updates and UI changes for several first-party apps, new workouts on the Apple Watch, and much more. They also announced that you can FaceTime with up to 32 people while using your own emoji, aptly named Memoji (below).
A majority of these updates benefit the ultimate end users of Apple devices, while some help developers more easily and effectively build on Apple’s platforms. There was, however, one update that stood out above the rest as the “killer feature” for apps this year. And that feature is Siri Shortcuts.
These. Are. Big. Siri Shortcuts will change how a lot of people interact with a lot of apps. Since the emergence of DVAs (Digital Voice Assistants), the biggest barrier to adoption has been the learning curve for users. “What can I ask it? Was it how I phrased it? I didn’t want it to open that app to do ____.” are all things you may have muttered to yourself when trying to communicate with your Google Home, HomePod, or Amazon Echo. But Siri Shortcuts are going to change that. Instead of adding voice-controlled features to an app that users may or may not ever discover, developers can now prompt users with a button to “Add to Siri.” This does not add a particular action to Siri; instead, it allows users to create their own custom phrase to activate a certain feature the app offers. For example, instead of having to say “Hey Siri, play my ‘Running’ playlist in Spotify,” someone can create a custom phrase for “Hey Siri, I’m going on a run” and the outcome will be the same.
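Conceptually, the mechanism is a user-chosen phrase mapped to an app-provided action. Here is a platform-agnostic sketch of that idea; Apple's actual API (SiriKit's shortcut donation) works differently under the hood, and the class, method names, and playlist example below are our own illustration.

```python
class ShortcutRegistry:
    """Toy model: map a user's custom trigger phrase to an app action."""

    def __init__(self):
        self._shortcuts = {}

    def add_to_siri(self, phrase, action):
        # The app proposes the action; the *user* picks the trigger phrase.
        self._shortcuts[phrase.lower()] = action

    def handle(self, spoken_phrase):
        """Dispatch a spoken phrase to its registered action, if any."""
        action = self._shortcuts.get(spoken_phrase.lower())
        return action() if action else "Sorry, I don't know that one."

registry = ShortcutRegistry()
registry.add_to_siri("I'm going on a run",
                     lambda: "Playing your 'Running' playlist in Spotify")
```

The key design point is that discovery is inverted: rather than users guessing what the assistant understands, the assistant only ever hears phrases the user invented, so the phrase always works.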
This may not sound like much, but it could change Siri’s role for many users from a peripheral accessory of the iPhone to an app necessity. Instead of having to try several times to get a request to work, users can simply make their own. We aren’t exactly sure how this will work just yet, but we assume it will be based on deep linking.
Another reason apps need to be Siri-ready is that Shortcuts will not just cover individual actions, but series of actions. Seen above, when asked “how’s the surf,” Siri began running through the requests the user had previously set up – like checking the weather and getting directions to the beach. Other examples Apple provided were Siri Shortcuts for “time to go home” or “let’s go to work.” In the “let’s go to work” example, Siri automatically knew to order the coffee from Starbucks that the user gets on the way to the office every day. So, for example, if your brand offers grocery pick-up, you may want to integrate Siri so that people can build a list of the common items they need each week and place the order with a single phrase.
By creating useful Siri integrations that can become part of a larger, daily/weekly/monthly routine instead of a one-off request, branded apps can quickly become a necessity of life even if they aren’t being manually launched. Like in the example above, the user with the morning routine didn’t open the Starbucks app, but they still bought a coffee.
Stay tuned for more from Apple’s developer conference or contact us today to learn more about Siri Shortcuts and how your brand can best leverage them.
Each year, the tech giants that power the computers in our pockets release a wide range of new features and capabilities for their platforms – though some features you might not see in the wild until the following year. Whether a feature is in beta or we simply haven’t found a need for it in a client’s app yet, it is important to take the time to learn it and play with it anyway. We’d rather run into every roadblock while building an app with no real stakes than discover those problems in the middle of an active project.
When asked about hackathons, a Rocketeer involved in the event said, “I used to think hackathons were just for kids and college students, but I couldn’t be more wrong. They’re fun, fast-paced, and exciting, which is a great way to get students involved, but they’re still just as fun for adults.” Hackathons are a time away from the pressure of client work, where failure is without consequence. There’s no breaking of keyboards, just laughing along with the insanity of it all.
Unknowns are the bane of project estimations. The more we know about the tech in question, the fewer variables there are in planning. Instead of jumping into a project that suddenly doubles in duration due to unforeseen problems with a feature or how to best implement it, hackathons give us the time to learn everything we need to know. Then, when the time comes to add a new feature for a client, we already know what to expect.
But enough about hackathons, here’s what happened in our latest event.
This team (seen in the photo above) explored the new Android feature Picture-in-Picture mode. The goal was to have a live scoreboard of country medal counts displayed while you use your phone for other day-to-day tasks. They completed a functioning prototype and even discovered a few fun facts along the way. For instance, they found that if you make the aspect ratio of the minimized screen too extreme, it’ll crash. They also found that you could make the panel transparent – a totally useless feature unless you want to confuse your users, but a cool find nonetheless.
Seeking to better understand the capabilities of connectivity in iOS 11, The Gas House Gang made a curling(-like) game in which you slide pucks into an area marked with a target. What makes this interesting is that they had it running in two different applications – not the same application on two devices, but two different applications. While this might not sound like a feat, it shows the potential of two completely different applications updating in real time to display information from each other.
Building on the concept of tracking athletes’ diets, The Things explored one of the newer sides of IoT devices in the Android ecosystem – Android Things. Using a Raspberry Pi, the team built a functioning prototype that allowed them to photograph the food they were about to eat and automatically post it to a channel of their choice. This could, for example, allow athletes to snap a photo and share it with their dietitian at the click of a button.
Everyone’s a winner in our eyes when it comes to hackathons. While some may not have finished, others found that the technology they were trying to leverage was actually not ready for the spotlight.
Brand New Wizbangs
This team put Apple’s ARKit 2D image detection and face mapping to the test with this face-painting experience. However, they ran into a roadblock trying to merge some features as the application would not allow it and crashed during each attempt to do so.
Leveraging Android’s Autofill Framework and Firebase Realtime Database, Javelin worked to create an up-to-date, real-time database that could update information and provide it as an autofill suggestion on the fly.
Alloy explored Metal, Apple’s proprietary graphics processing engine, and was able to animate the five Olympic rings using 1,000 triangles.
Got an idea for your business but not sure where or how to start? Reach out to Bottle Rocket. We may just have an answer for you.
Mobile software testing involves many variables such as operating system versions, types of connectivity, networks, devices, and environments. To account for so many variables, the approach to testing apps poses an interesting dilemma: Should testing be performed on physical devices, simulators, emulators, or cloud-based devices? Is there a unique value in sticking to physical devices? At Bottle Rocket, we use a combination of these with an emphasis on physical devices—and we do mean emphasis. We recently surpassed 400 test devices in our library.
Users expect apps, including those with the ability to sync between devices with shared accounts, to work flawlessly on their devices. So, a comprehensive library of devices to mimic the user experience is essential in testing. However, the number of combinations of devices and operating systems that users have is huge. For this reason, many companies opt to rely mostly on simulators, emulators, and/or cloud-based devices for their software testing.
These device alternatives can be very helpful in the beginning of the application development lifecycle, but they are not ideal as the sole testing resource. Many important pieces of mobile interaction are lost when physical devices are left out.
Real-world users are equipped with real devices in different environments, meaning the effects of factors such as network conditions, battery state, hardware quirks, and interruptions can vary in ways simulators cannot reproduce.
Recreating the user experience as closely as possible is what Bottle Rocket strives for. In our efforts to do this, we recently surpassed four hundred test devices in our QA arsenal! This is a milestone in itself since one of the biggest challenges that exists in application testing is the cost of obtaining and maintaining such a large volume of physical devices for testing purposes.
Our device lab contains Griffin cabinets with a capacity to hold and charge up to eighty devices at a time. Rocketeers pick out the devices they need, check them out library-style with our device checkout software, and begin their testing. All Rocketeers can access the checkout software remotely, so devices can be easily located and accounted for conveniently.
This system allows testers to plan ahead and dedicate certain devices to their project for as long as they are needed. Once a project starts, the lead tester maps out which devices will be needed during the testing process. They are checked out from the device cabinets and used throughout the project, only being returned when all testing is completed. When finished, checking the devices back in is as easy as pushing a button.
Solely relying on physical devices to test requires a great deal of time and money. Maintaining such a library involves frequent upgrades in order to remain relevant and effective. If our system doesn’t contain the latest and greatest mobile devices, a big chunk of tech-savvy users and their experiences will go unaccounted for. We need to stay ahead when it comes to testing on the latest hardware, so we pay close attention to mobile announcements such as Google I/O and WWDC every year to gather intel on what’s coming, preordering new devices and scoping out the latest operating system and its release date. Why all the focus on what’s coming? It’s important to know adoption trends, new hardware features such as changed aspect ratios, and whether any new peripherals are announced.
New operating systems are especially impactful as they can touch any part of the device and installed applications. New features and tweaks to existing operating systems can affect corresponding areas of our applications, so we must look into the patch notes of any release and test our applications accordingly. A recent example of this is the release of iOS 11 – for every new beta released, we update our device operating systems and run tests on all of our applications. It’s a quick sanity check to verify all features cooperate with the updates. Developers have taken advantage of these betas as well in order to research and experiment with the new ARKit.
It is beneficial to have varying device types – high end and low end, new and old. Keeping old operating systems on devices is part of this method. Android users especially are not pushed to upgrade operating systems as aggressively as users on other platforms. Android OS 6 is still the most widely used according to Android’s developer portal, despite OS 7 having been out for more than a year. Covering OS 4, 5, 6, and 7 is necessary to ensure quality for all users.
In summary, testing on physical devices assures the most accurate results, since it is the only way to duplicate the diversity of users’ environments, network settings, operating system versions, and hardware. Keeping up with current and future devices and operating system versions will also ensure quality results. Bottle Rocket will continue to build its device library – we’re beyond 400 devices now, and we have no plans of stopping!
Interested in testing your app in our device library? Let us know at [email protected]
© 2019 Bottle Rocket. All Rights Reserved.