Tag Archives: Augmented Reality

This app lets you program objects by drawing lines


Like something out of science fiction, the Reality Editor lets you connect and manipulate the functionality of physical objects. 


Back in 2013, a team from MIT Media Lab’s Fluid Interfaces Group developed a method of creating Spatially-Aware Embodied Manipulation of Actuated Objects through augmented reality. The project was an effort to extend a user’s touchscreen interactions into the real world. Earlier this year, the crew released libraries and examples that allow others to do the same. With Open Hybrid, you can directly map a digital interface onto a physical thing and program hybrid objects using Arduino and other popular hardware/software environments.


Now, the researchers have taken the project to a whole new level. The Reality Editor is a futuristic tool that empowers you to connect and manipulate the functionality of any gizmo or gadget. Just point your smartphone camera at an item and an overlay with its invisible capabilities will appear on the screen for you to edit. Drag a virtual line from one object to another to form a new relationship between the two.

Although the ultimate goal of the IoT is to make the ordinary objects in our life smart, most things are still pretty ‘dumb.’ They don’t communicate with one another, and most are only capable of one function. Take a smart bulb, for instance: it can dim and brighten, but it can’t change the channel on your TV. This is where the Fluid Interfaces Group’s app comes in.


The Reality Editor lets you define simple actions, change the functionality of objects around you, and remix how things work and interact. Essentially, the app gives you the power to turn something that is virtual into something that is physical and vice versa. The best part? It’s as easy as connecting dots.

“That light switch in your bedroom you always need to stand up in order to turn off — just point the Reality Editor at an object next to your bed and draw a line to the light. You have just customized your home to serve your convenience,” the team writes. “From now on you will use your spatial coordination and muscle memory to easily operate the object next to your bed as a tool for controlling the light.”


What’s more, you can ‘borrow’ functionalities from one object and use them on another. For example, you could employ your TV’s sleep timer as a way to switch your lights on and off, or even have the air conditioning at your house adjust the temperature when you hop into your car to head home. The possibilities are endless.

At the moment, the Reality Editor utilizes QR-like codes to identify smart devices. It works by loading an HTML webpage that overlays a particular object’s functionalities onto the smartphone screen so you can program it. Soon, however, the app will be able to recognize objects directly as they are viewed through the camera.

The Reality Editor can be downloaded and used along with the group’s open source platform Open Hybrid to build a new generation of Hybrid Objects. This isn’t solely geared towards designers and engineers, but Makers and other high-tech enthusiasts as well. Safe to say, a Minority Report-like future is quickly approaching.
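
Open Hybrid’s basic idea, as described above, is that a hybrid object carries its own web-served interface and exposes inputs and outputs that the Reality Editor can read and link. The sketch below is only a rough illustration of that pattern on an Arduino-compatible board with Wi-Fi; it is not the actual Open Hybrid library API, and the network credentials, the /io/brightness endpoint and the pin assignment are invented for illustration.

```cpp
// Hypothetical sketch of a "hybrid" lamp exposing one I/O point over HTTP.
// This is NOT the Open Hybrid library API -- credentials, the /io/brightness
// endpoint and the pin assignment are invented for illustration.
#include <SPI.h>
#include <WiFi.h>                 // classic Arduino WiFi Shield library

char ssid[] = "museum-net";       // assumed network credentials
char pass[] = "secret";
const int lampPin = 9;            // PWM pin driving the lamp

WiFiServer server(80);

void setup() {
  pinMode(lampPin, OUTPUT);
  WiFi.begin(ssid, pass);         // join the local network
  while (WiFi.status() != WL_CONNECTED) {
    delay(500);
  }
  server.begin();                 // wait for requests from the editor or other objects
}

void loop() {
  WiFiClient client = server.available();
  if (!client) return;

  // e.g. "GET /io/brightness?value=128 HTTP/1.1"
  String request = client.readStringUntil('\r');
  client.flush();

  int idx = request.indexOf("value=");
  if (idx >= 0) {
    int value = request.substring(idx + 6).toInt();   // parse 0..255
    analogWrite(lampPin, constrain(value, 0, 255));   // actuate the lamp
  }

  client.println("HTTP/1.1 200 OK");
  client.println("Content-Type: text/plain");
  client.println();
  client.println("ok");
  client.stop();
}
```

In that model, drawing a line in the Reality Editor from, say, a switch’s output amounts to pointing it at an input endpoint like this one on another object.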


How Bluetooth beacons can put an end to QR codes


Bluetooth beacons can enhance experiences in a way that is truly indistinguishable from magic.


Arthur C. Clarke once stated, “Any sufficiently advanced technology is indistinguishable from magic,” something that holds true when it comes to our ever-connected world. Take a look around and you will surely notice that the Internet of Things phenomenon is growing quite rapidly. So much so that some adopters have become a part of the IoT without even knowing. Many times, these cloud-based data processing solutions appear to the user as only a familiar webpage or mobile application.


Part of making IoT ubiquitous and nearly magical is awareness of where you are. GPS and cellular location can certainly do a great job outdoors. Cell tower-based location can give a very rough prediction of location indoors or outdoors. Using GPS or tower location, it is likely that an application running on a mobile device would know that you just walked into a particular store or venue.

But what happens if you need to know a more precise location indoors? Take, for instance, retailers and venues that want to deliver very specific content based on the exact location of a customer, like a promotion for a particular product on a nearby shelf.

Today, many museums and public venues, such as malls and arenas, have strategically employed QR code barcodes to allow for on-demand access to location-specific information. Patrons can scan the code and automatically launch cloud-based content into an app or browser that is related to particular exhibits and locations. As great as it may be, I have come to realize that it is a real pain because it requires scanning the QR code at every exhibit. For me, this involves entering my PIN to unlock my cellphone, then looking for my QR code scanner app. This takes my attention away from my family and the overall museum experience. Usually by the time I have accessed the information, my family has moved on to the next exhibit without me.

I recently visited the North Carolina Aquarium in Pine Knoll Shores. It is a nice aquarium with thousands of examples of aquatic life from North Carolina’s many inland freshwater bodies, as well as the sea, in smaller exhibits culminating in the large 300,000-gallon tank holding a replica of the German U-352 that was sunk off the coast of North Carolina during WWII. What’s more, there is a 50,000-gallon installation that re-creates the scene as divers discovered the wreck of the Queen Anne’s Revenge, a ship once commanded by the most infamous pirate of them all – Blackbeard. The ship was last seen sinking off the North Carolina coast in 1718. The point is that, as with most exhibits, there are stories to be told that are specific to each one, and easy access to those stories enhances the overall visitor experience.


I noticed that several of the smaller exhibits at the North Carolina Aquarium featured interactive electronic experiences that weren’t working because they had fallen into disrepair. A prime example was the amphibian exhibit, where you can press an old-fashioned button to hear what a frog call sounds like.

I can imagine the electronics behind this antiquated pushbutton: probably a voice recorder circuit from the 1990s along with a power supply and speaker. The button most likely stopped working after a few thousand kids pressed it dozens of times each, or the contacts became oxidized and non-conducting because the current through the switch was insufficient to keep the oxidation burned off. Design of switch circuits is another topic and one that hopefully will need to be addressed much less going forward thanks to innovations like capacitive touch for buttons, sliders, wheels, and other user interface elements.


In this case, the old-school pushbutton that doesn’t work is far from advanced, let alone “indistinguishable from magic.” And for that matter, the QR codes strategically placed at exhibits are clunky as well.

Instead, what if there were little radio transmitters at each exhibit that your mobile device could detect to reliably determine its location? As you are well aware, your mobile device comes equipped with Bluetooth and Wi-Fi radios, as well as GPS, cellular and NFC. Of these technologies, we can use Bluetooth to interact with the exhibits by letting the phone seamlessly know where in the building it is located. Introducing self-contained Bluetooth Smart beacons, or iBeacons, as a solution to this problem should not be difficult.

These beacons consist of a power source, a Bluetooth Smart radio and an antenna, all housed inside an enclosure. Beacons work by sending out a packet of data at regular intervals, called the advertising interval. In a museum or aquarium where people walk around, the advertising interval could be one second or more. With an advertising interval of a second, a Bluetooth Smart beacon using Atmel’s BTLC1000 SoC can operate at an average current of under 7 µA and last up to four years on a low-cost CR2032 lithium coin cell, or longer on a pair of AAA batteries. And the best part is that there are no moving parts — nothing to be loaded onto the beacon except a unique ID or serial number associated with the specific location in the museum or other venue. The technology is real today: Apple’s iBeacon technology is already being deployed in select retail locations such as Disney stores and throughout Apple’s own stores. Some iBeacon deployments simply run on iPhones and iPads, while others use dedicated low-power, low-cost hardware.
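
To make the numbers concrete, here is a rough sketch of the advertisement an iBeacon-style beacon repeats at each interval, along with the back-of-envelope battery arithmetic behind the years-on-a-coin-cell claim. The layout follows Apple’s published iBeacon format, but the UUID, major and minor values are placeholders, and this is not firmware for any particular SoC.

```cpp
#include <stdint.h>
#include <stdio.h>

// Layout of an iBeacon-style BLE advertisement (follows Apple's published format).
// The UUID, major and minor below are placeholders for a venue's own values.
struct IBeaconAdv {
  uint8_t flags[3]        = {0x02, 0x01, 0x06};        // BLE flags: general discoverable
  uint8_t mfr_header[4]   = {0x1A, 0xFF, 0x4C, 0x00};  // length, manufacturer-specific, Apple ID
  uint8_t ibeacon_type[2] = {0x02, 0x15};              // iBeacon indicator + 21 payload bytes
  uint8_t uuid[16]        = {0};                       // venue-wide proximity UUID (placeholder)
  uint8_t major[2]        = {0x00, 0x01};              // e.g. building or wing
  uint8_t minor[2]        = {0x00, 0x2A};              // e.g. exhibit number 42
  int8_t  tx_power        = -59;                       // measured RSSI at 1 m, used for ranging
};

int main() {
  printf("Advertisement payload: %zu bytes\n", sizeof(IBeaconAdv));  // 30 of the 31 allowed

  // Back-of-envelope battery life at a one-second advertising interval:
  const double avg_current_uA = 7.0;      // average current claimed for the beacon
  const double cr2032_uAh     = 225000;   // typical CR2032 capacity (225 mAh)
  double hours = cr2032_uAh / avg_current_uA;                   // ~32,000 hours
  printf("Estimated life: %.1f years\n", hours / (24 * 365));   // ~3.7 years
  return 0;
}
```

The phone never connects to the beacon at all; it only scans for these advertisements and matches the ID against its content database.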


Let’s consider the entire system and the lifecycle cost of a location-based system of beacons and a smartphone application versus individual content loaded at particular exhibit locations. In this scenario, the largest upfront cost of the solution will be that of developing the website and/or the app. The price of the beacons will be negligible by comparison.

Deployment of the beacons can be accomplished using a different app that can register each beacon to a location and associate it with specific content. Once deployed, the beacons need not be reprogrammed or upgraded. Their ID is simply linked to content located on a server, which can be updated whenever necessary.
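
Put differently, the beacon stays dumb and all of the logic lives in a lookup table on the server that maps each beacon ID to a content URL. A minimal sketch of that registry, with made-up IDs and example URLs, might look like this:

```cpp
#include <cstdint>
#include <iostream>
#include <map>
#include <string>

// Hypothetical registry kept on the venue's server: beacon (major, minor) -> content URL.
// The beacon hardware never changes; only this table is edited when content is updated.
int main() {
  std::map<std::pair<std::uint16_t, std::uint16_t>, std::string> exhibits = {
      {{1, 42}, "https://example.org/exhibits/u-352-wreck"},
      {{1, 43}, "https://example.org/exhibits/queen-annes-revenge"},
      {{2, 7},  "https://example.org/exhibits/frog-calls"},
  };

  std::pair<std::uint16_t, std::uint16_t> seen{2, 7};   // ID reported by a visitor's phone
  auto it = exhibits.find(seen);
  if (it != exhibits.end()) {
    std::cout << "Show content: " << it->second << "\n";
  }
  return 0;
}
```

Updating an exhibit then means editing a row in this table rather than touching hardware on the museum floor.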

Another nice feature of this system is that trained employees are accustomed to loading content onto web servers. There are very few people who are adept at re-recording audio files onto a 20-year-old talking box or repairing its worn-out pushbutton. Deployment of the app would be done through the app stores for Google, Apple and other phone OS suppliers. Maybe you could even get started by scanning a single QR code when you enter the venue. But that would be the last of the dreaded QR codes you would need to scan.

Using Bluetooth beacons, an experience such as the North Carolina Aquarium could actually be enhanced by technology in a way that is truly indistinguishable from magic. Some other applications, many of them not new, that I think could benefit from this technology include:

  • Sports like skating, motorsports, and swimming/diving: to enhance safety and enjoyment.
  • Retail stores: to provide special discounts and on-the-spot information.
  • Car dealerships: to offer information to those driving by.
  • Amusement parks: to advise patrons about waiting times or to help staff manage crowd traffic.
  • Art galleries: to improve spectators’ experiences without taking anything visual away from the exhibits by cluttering the gallery with QR codes.
  • In the dining room: Based on being near a beacon, the entire family’s devices can go into a silent “family time” mode that turns off ringers and even disables texting. Similarly, restaurants, churches, funeral homes, conference rooms and other settings could implement an automatic cellphone quiet zone for those who would otherwise forget to turn off their ringers.
  • At home or in the car: to customize the operation of a phone or tablet in specific ways based on a person’s preferences.
  • Public buildings or on streets: to ease wayfinding for the visually-impaired.
  • Senior centers: to help the elderly or those with disabilities regain independence by pairing with a wearable device.

Coincidentally, I saw this on the way home the other day. While I still don’t know any details, the concept of using beacons got me thinking.

[Image: city-wide rezoning notice sign]

What are the chances that someone will pull their car over, get out, and scan the QR code on this outdoor sign? If they’re like me, probably slim to none. The same goes for those who are looking to buy real estate and are driving in their vehicles. What good is the QR code to you in this situation?

[Image: RE/MAX Encore real estate sign]

Unless I’m walking or want to go through the trouble of getting out of my vehicle to scan the sign, or worse yet try to scan the sign while driving, I probably won’t utilize the attached QR code. Using beacons would not only eliminate those risks, but also expedite the process altogether. What if we gave real estate apps access to the mobile device’s Bluetooth? Then we could look for Bluetooth beacons placed strategically at properties that are for sale and collect information about them without getting out of the vehicle, and even more importantly, without taking our eyes off the road.

There is enormous potential for the use of Bluetooth Smart beacons anywhere signs are posted and wherever further information is available online. The real estate market is just one of many example use cases, where the implementation of beacons could be a key differentiator for companies willing to become early adopters.

You do have to focus on the revenue-generating applications, but there are countless other cases where QR codes on larger signs could be replaced by beacons to make it easier to access information and to reduce the total size and number of signs.

One example is this QR code-equipped sign to encourage people to walk instead of driving their cars…

[Image: QR code-equipped walking-navigation sign]

Or this one that provides fitness information to those taking a stroll along the public greenway trail…

[Image: fitness information sign along the greenway trail]

These are just a few of the ways that Bluetooth beacons can help make the world a better place. New thinking about apps, and about getting people to install them, will be necessary for success. However, if the value of the information becomes high enough, it will happen. Hopefully you will think of more applications and ways to design Bluetooth Smart beacons to support them. And when you do, be sure to look at the lowest power and lowest total bill-of-material cost solutions from Atmel.

Time traveling through augmented reality and smell


This project uses augmented reality to overlay virtual images onto a real-world landscape, while emitting scents to make it feel as if you were really there.


When one wants to learn about history, he or she will typically head to a museum, read a book, or browse the web. While these resources may offer a glimpse into the past, their static displays can’t actually emulate what it was like during an earlier age. That was until now.

A researcher at University College London’s Institute of Archaeology has found a way to blur the lines between yesterday and today, giving people the illusion that they have indeed traveled through time. Almost sounds like a scene straight out of a Hollywood script, right?


The Dead Men’s Eyes app was initially created by Dr. Stuart Eve as a way to explore the use of augmented reality within archaeological practice. As a user holds the iPad’s camera up to the landscape, virtual renderings are positioned on top of the real world images, while the iPad’s GPS helps to pinpoint the user’s location. This enables the reconstructions to change in real-time as the user moves about their environment.

To bring this project to life, Dr. Eve uses a combination of the Unity3D game development platform and Qualcomm’s Vuforia mobile vision SDK to place the virtual layers in their correct location and with the proper perspective. The placement is derived from archaeological data and then compared against where the user is standing.
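
The article doesn’t detail the placement code, but the underlying step is straightforward: convert the difference between the user’s GPS fix and a reconstruction’s surveyed coordinates into local east/north offsets so the engine can position the virtual model relative to the viewer. The standalone sketch below shows that math under an equirectangular approximation; the coordinates are placeholders and this is not the project’s actual code.

```cpp
#include <cmath>
#include <cstdio>

// Rough east/north offset (in metres) from the user's GPS fix to a surveyed feature,
// using an equirectangular approximation that is adequate over a small site.
struct Offset { double east_m; double north_m; };

Offset localOffset(double userLat, double userLon, double siteLat, double siteLon) {
  const double PI = 3.14159265358979323846;
  const double R  = 6371000.0;                          // mean Earth radius, metres
  double dLat    = (siteLat - userLat) * PI / 180.0;
  double dLon    = (siteLon - userLon) * PI / 180.0;
  double meanLat = (siteLat + userLat) * 0.5 * PI / 180.0;
  return { R * dLon * std::cos(meanLat),                // east
           R * dLat };                                  // north
}

int main() {
  // Placeholder coordinates purely for illustration.
  Offset o = localOffset(50.5940, -4.5550, 50.5952, -4.5531);
  std::printf("Place the reconstruction %.1f m east and %.1f m north of the user\n",
              o.east_m, o.north_m);
  return 0;
}
```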


A smell delivery device was also implemented to make the experience even more immersive. The aptly-dubbed Dead Men’s Nose emits scents based on the environment to make it as if one were really transplanted into another era. The system itself is based on an Arduino, an Arduino Wi-Fi Shield (AVR UC3), and a cheap computer fan. The device can either be worn or placed around the landscape, and more importantly, can be used with any odor of the user’s liking.


On the software side, the device connects to a web server, and a fragrance is fired off by the Unity3D software. As a result, the smells are released in the right place at the right time as the user explores their surroundings. Dr. Eve notes that future models will include multiple aromas as well as an improved 3D-printed enclosure.
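
Dr. Eve hasn’t published the firmware in this writeup, so the following is only a guess at the simplest possible arrangement: the Arduino polls a web endpoint that the Unity3D application sets, and spins the fan over the scent capsule for a few seconds when the flag is raised. The host, endpoint, pin numbers and timings are all assumptions.

```cpp
// Hypothetical firmware for the scent dispenser: poll a web flag, then run the fan.
// NOT the project's published code; host, endpoint, pins and timings are assumptions.
#include <SPI.h>
#include <WiFi.h>                       // Arduino WiFi Shield library

char ssid[] = "site-wifi";
char pass[] = "secret";
const char host[] = "192.168.1.10";     // machine running the Unity3D scene / web server
const int  fanPin = 5;                  // transistor driving the computer fan

WiFiClient client;

void setup() {
  pinMode(fanPin, OUTPUT);
  WiFi.begin(ssid, pass);
  while (WiFi.status() != WL_CONNECTED) delay(500);
}

void loop() {
  if (client.connect(host, 80)) {
    client.println("GET /smell/status HTTP/1.1");   // hypothetical flag endpoint
    client.print("Host: ");
    client.println(host);
    client.println("Connection: close");
    client.println();

    String response;
    while (client.connected() || client.available()) {
      if (client.available()) response += (char)client.read();
    }
    client.stop();

    if (response.indexOf("FIRE") >= 0) {   // the Unity side sets the flag
      digitalWrite(fanPin, HIGH);          // waft the scent toward the visitor
      delay(3000);
      digitalWrite(fanPin, LOW);
    }
  }
  delay(1000);                             // poll roughly once per second
}
```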


“Technological development is moving at an incredible rate, and already it is possible to wear transparent glasses with forward-facing cameras to overlay the AR information directly onto your field of vision, rather than having to use a portable handheld device such as a mobile telephone,” the archaeologist told Daily Mail in a recent writeup. “As this develops further, this will go some way towards mitigating the disconnectedness of having to hold up a mobile device in order to experience the virtual objects.”

Intrigued? Head over to the project’s official page here.

8 trends shaping the future of making


Our friends at Autodesk explore the significant design and technology trends for 2015. 


Mass personalization will march toward the mainstream

Normal lets its customers take a few pictures of their ears and uses them to create personalized 3D-printed headphones that fit perfectly. Normal CEO Nikki Kaufman describes it best as “Personalized, customized products built for you and your body.” In the last few years, we’ve seen companies offer customers the ability to customize their products by selecting from pre-defined options. Diego Tamburini, Manufacturing Industry Strategist at Autodesk, predicts that customers will demand products that are uniquely tailored to their needs, tastes and bodies.

(Source: Normal)

Big data will inform our urban landscapes

The design and construction of buildings, infrastructure and the cities they reside in are far too complex to rely on the wooden scale models of old. Architects, engineers and city planners are now able to do things that were not possible in the past. As Phil Bernstein, V.P. Strategic Industry Relations at Autodesk, put it, “Scale models, however beautifully made, are hardly up to the job of understanding how a building operates in the context of a city.”

Thanks to advances in laser scanning, sensors and cloud-based software, cities are now being digitized into 3D models that can be viewed from every angle, changed and analyzed at a moment’s notice.

Cities like Los Angeles, Chicago, Singapore, Tokyo and Boston are working to digitize not just the shapes and locations of their buildings, but to create a data-rich, living model of the city itself — complete with simulated pedestrian traffic, energy use, carbon footprint, water distribution, transportation, even the movement of infectious diseases.

(Source: Autodesk)

Our relationship with robots will be redefined

In the future, humans and robots will collaborate and learn from each other. Today, robots receive data and use machine learning techniques to make sense of the world and provide actionable analytics for themselves and humans. Nevertheless, robots are not artists, and they will need inspiration and guidance from us for the foreseeable future. In the words of Autodesk Technology Futurist Jordan Brandt, “A robot is no more a craftsman than an algorithm is a designer.”

(Source: Autodesk Gallery France Pop-Up)

Designs will “grow”

When Lightning Motorcycles wanted to develop a next-generation swing arm for their electric motorcycle, they adopted a new Autodesk approach for the project: a computer-aided design (CAD) system called Project Dreamcatcher that automatically generates tens, hundreds, or even thousands of designs that all meet your specific design criteria.

Software like Autodesk’s Project Dreamcatcher is ushering in a new era of design, best described by Autodesk CTO Jeff Kowalski: “We’ll start to see more intensely complex forms that could appear very organic, or very mathematic.”
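
The article doesn’t describe Dreamcatcher’s internals, but the basic generative-design loop is easy to illustrate: sample many candidate parameter sets, discard those that violate the design constraints, and rank the rest by an objective such as mass. The constraints and the stand-in “strength” formulas below are invented purely for illustration, not anything Dreamcatcher actually computes.

```cpp
#include <algorithm>
#include <cstdio>
#include <random>
#include <vector>

// Toy generative-design loop: sample candidates, keep the feasible ones, rank by mass.
// The "swing arm" model and criteria here are invented stand-ins, not Dreamcatcher's solver.
struct Candidate { double thickness_mm, width_mm, mass_kg, stiffness; };

int main() {
  std::mt19937 rng(42);
  std::uniform_real_distribution<double> thickness(2.0, 10.0), width(30.0, 120.0);

  std::vector<Candidate> feasible;
  for (int i = 0; i < 10000; ++i) {
    Candidate c{thickness(rng), width(rng), 0.0, 0.0};
    c.mass_kg   = 0.002 * c.thickness_mm * c.width_mm;                 // toy mass model
    c.stiffness = 0.5 * c.thickness_mm * c.thickness_mm * c.width_mm;  // toy stiffness model
    if (c.stiffness >= 800.0 && c.mass_kg <= 1.5)                      // invented criteria
      feasible.push_back(c);
  }

  std::sort(feasible.begin(), feasible.end(),
            [](const Candidate& a, const Candidate& b) { return a.mass_kg < b.mass_kg; });

  if (!feasible.empty()) {
    std::printf("%zu feasible designs; lightest is %.2f kg (%.1f mm x %.1f mm)\n",
                feasible.size(), feasible[0].mass_kg,
                feasible[0].thickness_mm, feasible[0].width_mm);
  }
  return 0;
}
```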

(Source: Lightning Motorcycles)

Manufacturing in space

Made In Space is focused on one thing: making and manufacturing in space. With over 30,000 hours of 3D printing technology testing, Made In Space has developed the first 3D printers designed and built for use on the International Space Station. As Made In Space CTO Jason Dunn explains, “2015 will be the year of space manufacturing. No longer do engineers need to design around the burdens of launch — instead, in 2015 we will begin designing space systems that are actually built in the space environment. This opens an entirely new book on space system design, a book where complex 3D printed structures that could only exist in zero-gravity become possible.”

(Source: Made In Space)

Live materials will be integrated into our buildings

Today, buildings are dead, but new materials and technology are enabling living structures. For example, David Benjamin, founding principal of the design and research studio The Living, is collaborating with plant biologists at the University of Cambridge in England to grow new composite materials from bacteria, a process that uses renewable sugars as a raw material rather than non-renewable petroleum used for plastics. In 2014, The Living delivered Hy-Fi, a “living” installation for the Museum of Modern Art and MoMA PS1’s Young Architects Program competition. The temporary installation involved a 40-foot-tall tower with 10,000 bricks grown entirely from compostable materials — corn stalks and mushrooms — and developed in collaboration with innovative materials company Ecovative. That building was disassembled at the end of the summer and all of the bricks have been composted, returning to grade A soil.

(Source: The Living)

Virtual and augmented reality will be integrated into everyday apps

New virtual devices like the Oculus Rift and augmented reality applications will require an innovative generation of spatial designers. According to Autodesk Technology Futurist Jordan Brandt, current touchscreen interaction will give way to ‘Immersion Design’ that leverages the spatial dimensions offered through emerging augmented and virtual reality platforms.

There’s a bright future for architecture students, game designers and multi-dimensional talent to join app development teams.

(Source: Autodesk and Neoscape)

The amount of 3D data will rapidly increase

“With the ability to create 3D models on mobile devices through apps like 123D Catch or the Structure sensor, virtually anyone can begin to capture the spatial world around them. Coupled with the broader adoption of WebGL technology and 3D printing, we can expect an explosion in the amount of 3D data available in 2015. Responding to user demand, social platforms will enable direct sharing of 3D data and start to provide immersive, collaborative experiences.” — Autodesk Technology Futurist, Jordan Brandt

(Source: 123D Catch)

This article written by the Autodesk team originally appeared on Medium.


Report: 30% of smart wearables will be inconspicuous by 2017

As previously reported on Bits & Pieces, we can expect to see wearable technology become less obtrusive over the next couple of years. Aside from an emergence of smart clothing and e-textiles, a new study from Gartner has revealed that the wearables market will continue to expand and evolve, with 30% of devices becoming completely unobtrusive to the eye by 2017.


“Already, there are some interesting developments at the prototype stage that could pave the way for consumer wearables to blend seamlessly into their surroundings,” explained Annette Zimmermann, Gartner Research Director. “Smart contact lenses are one type in development. Another interesting wearable that is emerging is smart jewelry. There are around a dozen crowdfunded projects competing right now in this area, with sensors built into jewelry for communication alerts and emergency alarms. Obtrusive wearables already on the market, like smart glasses, are likely to develop new designs that disguise their technological components completely.”

Gartner went on to share several other predictions around the consumer devices market, including:

  • By 2018, more than 25 million head-mounted displays (HMDs) will have been sold as immersive devices and virtual worlds will have transitioned from the fringe to the mainstream.
  • Interest in HMD devices — which power virtual reality (VR), augmented reality (AR) and other smart glass apps — will continue to rise. So much so that, by 2018, the technology behind them will be found throughout both consumer and business scenarios.
  • More stylish, consumer-grade video eyeglasses will result in explosive growth for HMDs — driving device adoption when paired with VR and AR content.
  • By 2016, biometric sensors will be featured in 40% of smartphones shipped to end users.
  • Fingerprint scanning will be the primary biometric feature introduced by most vendors, given its intuitive and unobtrusive usage.
  • Other biometrics, ranging from facial and iris to voice and palm vein authentication, will also surface yet will remain relatively niche.
  • Through 2017, one-third of consumers in emerging markets will have never owned a Windows device.
  • In mature markets, PC penetration is still relatively high with more than 90% of consumers currently using a Windows PC.
  • The rise in smartphones and their subsequent drop in price will lead some users to purchase their first smartphone for under $50.

Interested in learning more? You can read the entire press release and access the report here.