Tag Archives: Virtual Reality

PowerUp FPV turns your paper plane into a live-streaming drone

PowerUp FPV lets you experience flight from a first-person view paper airplane drone with a live-streaming camera.

Who could ever forget the simple joy of folding a piece of paper into a plane, throwing it and then watching it soar through the air? As a child, it was tons of fun. As you got older, not so much. This was something Shai Goitein and the PowerUp Toys team wanted to change. With aspirations of taking the age-old form of entertainment to new heights, the Tel Aviv-based startup has created PowerUp FPV — a kit that lets you outfit your paper plane with motorized propellers and a first-person view live-streaming camera.


Built in collaboration with Parrot, PowerUp FPV is a super lightweight camera-and-propeller rig that keeps your plane up in the air for up to 10 minutes per charge, while being capable of achieving speeds up to 20mph with a range of up to 300 feet. The kit enables you to feel as if you’ve been shrunken down and placed inside the cockpit of the paper plane.

That in itself is great, but what’s truly remarkable about PowerUp FPV is that it even has a 360-degree wide-view camera that transmits images back to you in real-time. This can be anything from a quick snapshot over the wings to the ultimate ‘selfie’ with a rear-view shot as you launch your plane.


What’s more, it can be controlled either through a Google Cardboard headset, another head-mounted display or an on-screen gamepad in PowerUp’s accompanying app. Connecting via Wi-Fi, you can watch the footage or wirelessly transfer it to your smartphone, then upload it to YouTube, Facebook, Twitter and several other networks. Unlike the paper planes of yesterday, PowerUp FPV also has an auto-pilot mode for easy flying and a fast-acting crash detection system that automatically shuts down its motors and rotating blades.
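To give a rough sense of how such a cutoff could work, here’s a minimal sketch. PowerUp hasn’t published its firmware, so the function names and the threshold value are purely illustrative, assuming a simple accelerometer-spike check:

```python
def detect_crash(accel_g, threshold_g=4.0):
    """Flag a crash when the measured acceleration spikes past a threshold."""
    return abs(accel_g) >= threshold_g

def update_motors(accel_g, throttle):
    """Return the throttle to apply: cut to zero on a detected crash."""
    return 0.0 if detect_crash(accel_g) else throttle
```

A real implementation would also debounce the reading over several samples so a hard throw isn’t mistaken for a crash.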


In terms of hardware, PowerUp FPV employs an Atmel MCU along with a 500mAh LiPo battery, a microUSB port for charging, a microSD card, dual-band MIMO antennas, a compass sensor, a three-axis accelerometer and gyroscope, a barometer, a Wi-Fi module for connectivity, a swivel wide-angle camera, a microphone, a buzzer, as well as a durable carbon fiber and nylon-reinforced frame with a crash-proof bumper.

Ready for an immersive paper airplane experience? Apparently so are thousands of others, as PowerUp FPV soared past its $100,000 goal on Kickstarter in just four hours. The team hopes to begin shipping in June 2016.

This wearable device lets you touch your virtual reality world

UnlimitedHand is the world’s first video game controller with newly-developed haptic feedback technology.

It’s safe to say that virtual reality has grown leaps and bounds in recent years; however, despite these advancements, one thing that has been lacking is an interface that actually lets you ‘touch’ the VR world. This is exactly what one Tokyo-based startup set out to develop.


The team over at H2L has created what they’re calling the world’s first video game controller that enables you to feel things as if they existed in your real environment. UnlimitedHand is essentially a haptic sensor that goes around your arm and syncs with your hand to interact with onscreen objects.

Through Bluetooth, the wearable contraption delivers your finger and arm movements to the game while receiving data back that is felt in the form of haptic feedback. In other words, you will be able to grab, push, throw, hit and manipulate your digital surroundings as if you were really there.


The device, which straps around your forearm like an Ace bandage, is equipped with a muscle sensor, a 3D motion sensor, a multi-channel electronic muscle stimulator (EMS) and a vibration motor. The embedded motion and muscle sensors are tasked with recognizing user input, just like any other haptic gadget. UnlimitedHand then stimulates your muscles through EMS, controlling your fingers and hands while mirroring what’s happening in the game. By integrating this technology into the accessory, you will be able to ‘feel’ whatever your character experiences.
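The event-to-stimulation mapping described above can be sketched roughly as follows. H2L hasn’t published its protocol, so the channel numbers, event names and API shape here are all invented for illustration:

```python
# Hypothetical mapping of in-game events to EMS output channels.
EVENT_TO_CHANNELS = {
    "grab":   [0, 1, 2],  # stimulate the finger flexors
    "recoil": [3],        # jolt the wrist extensors
}

def ems_pulses(event, intensity):
    """Return (channel, intensity) pulses mirroring an in-game event."""
    return [(ch, intensity) for ch in EVENT_TO_CHANNELS.get(event, [])]
```

The key idea is the direction of data flow: the motion sensors read the hand, while the EMS channels write back to it.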


In a move that will surely appeal to developers, H2L has made UnlimitedHand easy to integrate with existing games through Unity plug-ins. Furthermore, its circuit is Arduino-compatible, allowing Makers to hack it for a wide range of other applications. Whether it’s feeling the strings of a virtual guitar as it’s being played or commanding a robotic arm, the possibilities are endless. Intrigued? Head over to its Kickstarter campaign, which has already garnered well over its asking goal of $20,000. Units are expected to begin shipping in March 2016.

These smart gloves will actually let you feel virtual reality

Gloveone is a pair of embedded gloves that lets you feel your way around the virtual reality world.

The relationship between video gamers and gloves hasn’t been all too dandy, to say the least. Remember the Nintendo Power Glove from the late ‘80s? The accessory was designed to provide players with buttons conveniently located on their forearm. Wearing the controller, the user could perform various hand motions to command a character on-screen. Unfortunately, the trend never really caught on.


Fast forward 25 years and video games have come a long way, not just with killer graphics but with more immersive experiences than ever before thanks to virtual reality. Cognizant of this, one Miami-based startup has set out to create a pair of gloves that works alongside a VR headset to offer users a sense of texture and depth. And with industry heavyweights like Oculus, Google, Samsung and HTC each debuting simulated-reality goggles of their own, this innovation couldn’t have come at a better time.

Surely more exciting than Nintendo’s ill-fated attempt at a body-adorned gaming device, the aptly named Gloveone slips onto a person’s hands while sensation and texture are created through a series of complex vibrations. The wearable, which is based on an ATmega32U4 MCU, features 10 actuators distributed across the fingertips and palm that translate touch into haptic feedback at various frequencies, durations and intensities to accurately reproduce sensations from the VR world.
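The frequency-and-intensity idea can be illustrated with a small sketch. NeuroDigital hasn’t documented its actuator protocol, so the preset names and values below are hypothetical, assuming each sensation maps to a per-actuator drive pattern:

```python
def vibration_pattern(sensation):
    """Map a virtual sensation to a (frequency_hz, intensity) pair
    for each of the ten actuators. Presets are invented for illustration."""
    presets = {
        "raindrop": (150, 0.2),  # light, high-frequency tap
        "impact":   (60, 0.9),   # heavy, low-frequency thump
    }
    freq, amp = presets[sensation]
    return [(freq, amp)] * 10  # ten actuators: fingertips plus palm sites
```

In practice each actuator would get its own pattern (a raindrop might hit only one fingertip), but the principle of varying frequency and intensity per sensation is the same.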


What’s more, the device is embedded with IMU sensors to track and mimic movement on-screen in a natural manner, a LiPo battery good for four hours of power, a microUSB port for a low-latency wired mode, and Bluetooth for wireless communication (meaning no getting tangled up in wires). Those who work up a sweat during gameplay can take comfort in knowing that the gloves are made of breathable, anti-bacterial fabric.

The gadget enables users to perceive texture, sense sound and temperature, as well as distinguish between the weights of objects. In other words, a wearer can feel the heat of a fire burning in a game, a raindrop falling from the virtual sky, or even tell whether one augmented item is heavier than another. Beyond that, four sensors located in the palm, thumb, index and middle fingers communicate with one another, allowing a user to shoot a cannon, grab a flower petal or simply control the main menu. Unlike other gesture recognition systems, contact-triggered commands do not suffer from false positives or negatives, which can oftentimes be very frustrating for users.
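A contact-triggered command scheme like the one described might look something like this sketch. The sensor site names and gesture rules are assumptions, not Gloveone’s actual logic:

```python
def gesture(contacts):
    """Classify a gesture from the set of contact-sensor sites
    currently touching a virtual object."""
    if {"thumb", "index", "middle"} <= contacts:
        return "grab"       # three-finger pinch closes around an object
    if "index" in contacts:
        return "trigger"    # index contact alone fires, e.g., a cannon
    return None
```

Because a command only fires on actual contact, there is no ambiguous mid-air pose to misclassify, which is why this approach sidesteps the false positives of camera-only gesture systems.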


Aside from enhancing the gaming industry, Gloveone could surely play an integral role in bringing sci-fi-like technology into the healthcare setting by assisting those with impaired mobility to re-learn movements such as picking up and holding an item or even walking.

Currently available in three sizes (XS-S, M-L, XL-XXL), the system does rely upon auxiliary sensors like Leap Motion or Intel RealSense to track a user’s hands. However, it can also work with other tech including Microsoft Kinect and OpenCV. Sound like something you’d like to use? Head over to its official Kickstarter campaign, where the NeuroDigital Technologies crew is seeking $150,000. Delivery is slated for February 2016.

DORA is an immersive teleoperated robotic platform

DORA is bridging the gap between immersive virtual simulations and real world physical telepresence.

Telepresence robots have been used in a wide range of applications, from remotely attending a meeting and visiting a museum to exploring space and scoping out a battlefield. As the name implies, these machines make it seem as though the user is standing in a distant location by letting them navigate an environment via a robotic surrogate. Yet, despite advancements in technology, the experience is still not exactly like real life. That may soon all change, especially if left in the hands of University of Pennsylvania engineers.


One team of researchers has set out to revolutionize telepresence robotics by building what they call Dexterous Observational Roving Automaton (DORA), which works with the Oculus Rift VR headset to establish a groundbreaking physical-virtual interface. Nowadays, most commercial devices are merely screens or tablets on moving platforms. However, DORA aspires to make it as though a user has actually been transported to another place.

In an effort to offer such an immersive experience, the remote robot is equipped with a pair of cameras that not only stream three-dimensional views of its terrain, but also look up/down, forward/backward and left/right as the VR headset wearer turns their own head. This is accomplished by precisely matching the movements of the wearer’s neck in all six degrees of freedom through an inertial measurement unit (IMU) and infrared beacon tracking. That data is wirelessly transmitted to the robot’s embedded Atmel-based Arduino and Intel Edison boards, prompting its camera-equipped head to mimic the motions of the user.
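One small but necessary step in a pipeline like this is clamping the headset angles to the robot neck’s mechanical range before commanding the servos. The team hasn’t published DORA’s code, so the function below is a hypothetical sketch of that step:

```python
def head_servo_targets(yaw_deg, pitch_deg, roll_deg, limit_deg=90.0):
    """Clamp headset orientation angles to the robot neck's
    assumed mechanical range of +/- limit_deg."""
    clamp = lambda angle: max(-limit_deg, min(limit_deg, angle))
    return clamp(yaw_deg), clamp(pitch_deg), clamp(roll_deg)
```

The translational degrees of freedom (forward/backward, side-to-side, up/down) would be handled the same way, each mapped onto whatever actuator travel the neck mechanism allows.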


“DORA is based upon a fundamentally visceral human experience—that of experiencing a place primarily through the stimulation of sight and sound, and that of social interaction with other human beings, particularly with regards to the underlying meaning and subtlety associated with that interaction. At its core, the DORA platform explores the question of what it means to be present in a space, and how one’s presence affects the people and space around him or her,” its creators tell IEEE Spectrum.

Still in its prototyping stage, DORA operates over a radio link with a line-of-sight range of just over four miles. However, the team is looking to improve its responsiveness to a lag of less than 60ms and to transition to Wi-Fi or 4G connections. This will allow the system to be used in a variety of settings, such as virtual tourism, emergency response, and maybe one day even video chat.

Intrigued? Head over to the team’s official page to explore the project in more detail.

The Hands Omni glove will let gamers feel virtual objects

Rice University students create a feedback wearable device for virtual reality environments.

Though virtual reality has grown by leaps and bounds over the years, a vast majority of recent advancements have focused on the audible and visual senses — touch, not so much. With that in mind, a team of Rice University engineering students has unveiled a haptic glove that lets a wearer feel simulated objects as if they’re actually there. In other words, it makes virtual reality even more “real.”


The Hands Omni glove was designed to provide a way for gamers and others interested in VR to experience the environments they inhabit through the likes of three-dimensional heads-up displays. The prototype — which was introduced at the George R. Brown School of Engineering Design Showcase and developed in collaboration with gaming technology company Virtuix — works by providing force feedback to a user’s fingertips as they touch, press or grip things inside their virtual world.

The right-handed glove is comprised of inflatable bladders that sit underneath each finger, and expand and contract as necessary. What’s more, the wearable is wireless to allow the user a full range of motion without ever having to worry about unwanted cables getting in the way during gameplay.


While the team’s agreement with its sponsor Virtuix means the underlying technology of the glove must remain top-secret, the students did reveal that an Atmel-based Arduino is at the heart of the system. Its creators also point out that programmers will find it pretty straightforward to implement the glove’s protocols in future games and other immersive projects.

Basically, as a game is played, signals are sent from a computer through the Arduino to the glove’s proprietary system, which in turn inflates the individual bladders. The fingers are individually addressable, though pressure on the ring and little fingers is triggered as one unit in the prototype.
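The addressing scheme described, with the ring and little fingers ganged together, might look like this in code. The channel numbers and command shape are assumptions for illustration, since the actual system is under wraps:

```python
# Ring and little fingers share one inflation channel in the prototype.
FINGER_TO_CHANNEL = {"thumb": 0, "index": 1, "middle": 2,
                     "ring": 3, "little": 3}

def inflation_command(finger, pressure):
    """Return the (channel, pressure) pair sent to the bladder driver."""
    return (FINGER_TO_CHANNEL[finger], pressure)
```

Ganging two fingers onto one channel trades a little fidelity for a simpler, lighter pneumatic system, a sensible prototype compromise.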

For example, say you come across an apple, a baseball or even some sort of weapon in a Call of Duty-style game and want to pick it up; Hands Omni will enable you to simply reach out, making it as if you were touching a physical object.


The Hands Omni glove weighs around 350 grams (just over 12 ounces), which its creators say makes it light enough to be comfortably worn on a hand for long sessions without ever noticing it’s there.

“We had our own constraints based on testing to determine the amount of perceptible weight that could be strapped to your fingers, arms, legs and limbs — the maximum weight that is perceptible to users — and we came up with 660 grams on the forearm and much less than that on the back of the hand or on the fingers,” explains team member Kevin Koch. “We wanted as much mass as far back on the hand as possible, and that’s exactly what we’re doing.”

Intrigued? You can head over to the project’s official page here.

Widerun is bringing virtual reality to indoor cycling

This interactive bike trainer is designed to deliver engaging fitness sessions through VR headsets. 

Let’s face it, stationary biking can be boring. But what if, during your workout, you were suddenly immersed in an intense uphill battle in the Tour de France, a leisurely ride along the picturesque Pacific Coast Highway, or a thrilling escape from zombies in a Walking Dead-like post-apocalyptic world? That may soon be a reality thanks to one Italian startup that has debuted on Kickstarter.


While turbo trainers that allow avid cycling enthusiasts to use their actual bikes indoors are fairly common, Widerun is a smart bike trainer designed to connect to virtual reality head-mounted units. At the moment, the system supports the Oculus Rift and Samsung Gear VR, as well as other mobile VR displays. Widerun pairs to either a PC or smartphone via Bluetooth Low Energy with a theoretical range of over 100 meters.

Everything on the bike functions as it would had you actually been riding in these settings. Meaning, when you switch gears to cycle faster or slower, Widerun transmits the real-world changes made by the cyclist into the virtual world. As a true plug-and-play system, users don’t need a special bike to enjoy an immersive VR cycling experience. Instead, Widerun accommodates any bike with a wheel radius between 26″ and 29″ — no adjustments necessary.


What’s more, Widerun features real-time, coherent feedback between your movements and their effect in the VR world.

“One of the crucial aspects into delivering the best immersive virtual reality biking experience is the possibility to regulate the resistance and the inertia on the rear wheel according to the position in the 3D VR world,” the team writes. In other words, a rider will feel as if they are climbing mountains, breezing through forests or descending steep hills, as the trainer will automatically regulate its resistance.
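A slope-to-resistance mapping of this kind can be sketched very simply. Widerun hasn’t published its control algorithm, so the scaling constants and level range below are invented for illustration:

```python
def resistance_level(gradient_pct, max_level=10):
    """Scale brake resistance with the virtual road gradient;
    flat ground sits mid-range, descents ease the brake off."""
    level = round(5 + 0.5 * gradient_pct)
    return max(0, min(max_level, level))
```

A production trainer would also factor in rider speed and simulated inertia, but the core idea is the same: the 3D world drives the brake, not the other way around.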

Beyond that, Widerun also offers gamification and community elements that encourage users to choose among various VR settings to ride, engage with other cyclists, locate people to challenge, and monitor their performance history.


To create the best VR biking experience possible, the team designed Widerun with two components: the trainer itself and a steering unit that appears to be based on an Arduino Micro (ATmega32U4). An embedded MCU receives inputs from both the game environment and the bike trainer, and adjusts the electrical signals to match the virtual experience the user is having, such as steering angle, speed and ground resistance.

Widerun hopes to get other VR software developers onboard in the coming months. The team notes, “We believe that there are many amazing wizards in game design and development out there able to create even better 3D environments where to bike through Widerun. We decided to include with any type of pledge the complete SDK to let you build and (if you like) upload your own VR worlds! We’re looking forward to bike in your VR creations!”

Interested? Head over to its official Kickstarter page, where the Widerun team is currently seeking £30,000. Shipment is expected to begin early next year. 

Time traveling through augmented reality and smell

This project uses augmented reality to lay virtual images onto a real world landscape, while emitting scents to make it as if you are there.

When people want to learn about history, they typically head to a museum, read a book or browse the web. While these resources may offer a glimpse into the past, their static displays can’t actually emulate what it was like to live in an earlier age. That is, until now.

An Institute of Archaeology University College London researcher has found a way to blur the lines between yesterday and today, giving people the illusion that they have indeed traveled through time. Almost sounds like a scene straight out of a Hollywood script, right?


The Dead Men’s Eyes app was initially created by Dr. Stuart Eve as a way to explore the use of augmented reality within archaeological practice. As a user holds the iPad’s camera up to the landscape, virtual renderings are positioned on top of the real world images, while the iPad’s GPS helps to pinpoint the user’s location. This enables the reconstructions to change in real-time as the user moves about their environment.

To bring this project to life, Dr. Eve uses a combination of the Unity3D gaming development platform and Qualcomm’s Vuforia mobile vision SDK to place the virtual layers in their correct location and provide the proper perspective. The placement is derived from archaeological data and then matched against where the user is standing.


A smell delivery device was also implemented to make the experience even more immersive. The aptly-dubbed Dead Men’s Nose emits scents based on the environment to make it as if one were really transplanted into another era. The system itself is based on an Arduino, an Arduino Wi-Fi Shield (AVR UC3), and a cheap computer fan. The device can either be worn or placed around the landscape, and more importantly, can be used with any odor of the user’s liking.


In terms of software, the mechanism connects to a web server and a fragrance is fired off by the Unity3D software. As a result, the smells are released in the right place at the right time as the user explores their surroundings. Dr. Eve notes that future models will include multiple aromas as well as an improved 3D-printed enclosure.
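Since the scent device sits behind a small web server, the trigger from the game engine is presumably just an HTTP request. The endpoint path and parameters below are hypothetical, sketched only to show the shape of such a trigger:

```python
def scent_request(host, scent, duration_ms):
    """Build the raw HTTP GET a game engine might send to the
    fan controller's web server (endpoint names are assumed)."""
    return (f"GET /emit?scent={scent}&ms={duration_ms} HTTP/1.1\r\n"
            f"Host: {host}\r\n\r\n")
```

On the Arduino side, the Wi-Fi shield would parse this request and spin the fan for the requested duration, releasing the chosen odor.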


“Technological development is moving at an incredible rate, and already it is possible to wear transparent glasses with forward-facing cameras to overlay the AR information directly onto your field of vision, rather than having to use a portable handheld device such as a mobile telephone,” the archaeologist told Daily Mail in a recent writeup. “As this develops further, this will go some way towards mitigating the disconnectedness of having to hold up a mobile device in order to experience the virtual objects.”

Intrigued? Head over to the project’s official page here.

16 tech trends Andreessen Horowitz is most excited about

This list lets you inside the mind of Marc Andreessen and Ben Horowitz. 

One of, if not the, most prominent VC groups in Silicon Valley has revealed the hottest tech trends changing the world right now. For those wondering, that firm is Andreessen Horowitz and we’re referring to none other than its “16 Things” list. The breakdown, which highlights the most investable spaces at the moment, unsurprisingly includes the Internet of Things, digital health, crowdfunding and security — a few areas we know a little something about.


“We don’t invest in themes; we invest in special founders with breakthrough ideas,” Andreessen Horowitz writes. “Which means we don’t make investments based on a pre-existing thesis about a category. That said, here are a few of the things we’ve been observing or thinking about.”

While the list — which includes several themes that were evident throughout the CES 2015 show floor — will likely change over time, it does provide a nice glimpse into the firm’s thinking at the start of this year. Just in case you don’t feel like clicking through and navigating a16z in its entirety, here’s a brief overview of those breakthrough areas.

Virtual Reality

“VR will be the ultimate input-output device. Some people call VR “the last medium” because any subsequent medium can be invented inside of VR, using software alone. Looking back, the movie and TV screens we use today will be seen as an intermediate step between the invention of electricity and the invention of VR. Kids will think it’s funny that their ancestors used to stare at glowing rectangles hoping to suspend disbelief.”

Sensorification of the Enterprise

“For enterprise, the value of the sensors is in being a shortcut for the user interface, potentially even replacing typing so we can concentrate on the easy, fun, creative things.”

Machine Learning and Big Data

“The key here is in more automated apps where big data drives what the application does, and with no user intervention.”

Full Stack Startups

“The old approach startups took was to sell or license their new technology to incumbents. The new, ‘full stack’ approach is to build a complete, end-to-end product or service that bypasses incumbents and other competitors.”


Containers

“The next step in containerization is treating the datacenter, with all its containers, like one giant computer or server. Many applications today are really just distributed systems: Applications aren’t necessarily confined to just one container.”

Digital Health

“Tomorrow? To understand your personal diagnostic data, you might soon depend more upon an iPhone app developed in a garage than on your local MD.”

Online Marketplaces

“We’re continuing to see tremendous innovation in marketplaces. The first generation of net companies saw a few big horizontal marketplace winners like eBay and Craigslist. But entrepreneurs are continuing to create the next generation of online marketplaces.”


Security

“There are two things now driving the security industry: (1) The bad guys are already inside. (2) New platforms — cloud and mobile — have arrived… Both are forcing a different set of technologies, and the creation of new kinds of companies.”

Bitcoin (and Blockchain)

“The clock has just begun on Bitcoin’s acceptance more broadly. Crash or no crash, we should expect a significant increase in the level of institutional adoption this year. Specifically, a large number of companies will put together groups focused on what Bitcoin means to them.”

Cloud-Client Computing

“Endpoints aren’t just phones; they could be wearables and other small devices and screens connected to the internet. Beyond the devices themselves, it all adds up to a massive amount of compute power. The next decade of computing will be about doing something with it.”


Crowdfunding

“Crowdfunding is going somewhere it never has — into the mainstream. That, in turn, will change all sorts of other things.”

Internet of Things

“Something often overlooked when we talk about all the shiny new connected gadgets emerging out of the Internet of Things is what happens to all the old things. I’m fascinated by the power of adding multiple sensors to old things and then connecting them to the Internet…. With the IoT we’re headed to a world where things aren’t liable to break catastrophically — or at least, we’ll have a hell of a heads up.”

Online Video

“What we do know is that online video is far from done… so it will be interesting to see what even a little competition will do here.”


Insurance

“Insurance is all about distributing risk. With dramatic advances in software and data, shouldn’t the way we buy and experience our insurance products change dramatically? Software will rewrite the entire way we buy and experience our insurance products — medical, home, auto, and life.”


DevOps

“The rise of the hyperscale cloud datacenter has now made this job much harder as developers have had to hack together tools and complex scripts for pushing code to thousands of pancake servers. This complex cloud infrastructure — coupled with the growth of the DevOps movement today — has opened up many opportunities, starting with helping developers and companies to manage the entire process … to much more.”


Failure

“The goal is not to fail fast. The goal is to succeed over the long run. They are not the same thing.”

8 trends shaping the future of making

Our friends at Autodesk explore the significant design and technology trends for 2015. 

Mass personalization will march toward the mainstream

Normal allows its customers to take a few pictures of their ears and uses them to create personalized 3D-printed headphones that fit perfectly. Normal CEO Nikki Kaufman describes it best as “personalized, customized products built for you and your body.” In recent years, we’ve seen companies offer customers the ability to customize their products by letting them select from pre-defined options. Diego Tamburini, Manufacturing Industry Strategist at Autodesk, predicts that customers will increasingly demand products that are uniquely tailored to their needs, tastes and bodies.

(Source: Normal)

Big data will inform our urban landscapes

The design and construction of buildings, infrastructure and the cities they reside in are far too complex to rely on the wooden scale models of old. Architects, engineers and city planners are now able to do things that were not possible in the past. As Phil Bernstein, V.P. Strategic Industry Relations at Autodesk, put it, “Scale models, however beautifully made, are hardly up to the job of understanding how a building operates in the context of a city.”

Thanks to advances in laser scanning, sensors and cloud-based software, cities are now being digitized into 3D models that can be viewed from every angle, changed and analyzed at a moment’s notice.

Cities like Los Angeles, Chicago, Singapore, Tokyo and Boston are working to digitize not just the shapes and locations of the buildings but create a data-rich, living model of the city itself — complete with simulated pedestrian traffic, energy use, carbon footprint, water distribution, transportation, even the movement of infectious diseases.

(Source: Autodesk)

Our relationship with robots will be redefined

In the future, humans and robots will collaborate and learn from each other. Today, robots receive data and use machine learning techniques to make sense of the world, providing actionable analytics for themselves and for humans. Nevertheless, robots are not artists, and they will need inspiration and guidance from us for the foreseeable future. In the words of Autodesk Technology Futurist Jordan Brandt, “A robot is no more a craftsman than an algorithm is a designer.”

(Source: Autodesk Gallery France Pop-Up)

Designs will “grow”

When Lightning Motorcycles wanted to develop a next-generation swing arm for their electric motorcycle, they adopted a new Autodesk approach for the project: a computer-aided design (CAD) system called Project Dreamcatcher that automatically generates tens, hundreds or even thousands of designs that all meet specific design criteria.

Software like Autodesk’s Project Dreamcatcher is ushering in a new era of design, best described by Autodesk CTO Jeff Kowalski: “We’ll start to see more intensely complex forms, that could appear very organic, or very mathematic.”

(Source: Lightning Motorcycles)

Manufacturing in space

Made In Space is focused on one thing: making and manufacturing in space. More than 30,000 hours of 3D printing technology testing have led to the first 3D printers designed and built for use on the International Space Station. As Made In Space CTO Jason Dunn explains, “2015 will be the year of space manufacturing. No longer do engineers need to design around the burdens of launch — instead, in 2015 we will begin designing space systems that are actually built in the space environment. This opens an entirely new book on space system design, a book where complex 3D printed structures that could only exist in zero-gravity become possible.”

(Source: Made in Space)

Live materials will be integrated into our buildings

Today, buildings are dead, but new materials and technology are enabling living structures. For example, David Benjamin, founding principal of the design and research studio The Living, is collaborating with plant biologists at the University of Cambridge in England to grow new composite materials from bacteria, a process that uses renewable sugars as a raw material rather than non-renewable petroleum used for plastics. In 2014, The Living delivered Hy-Fi, a “living” installation for the Museum of Modern Art and MoMA PS1’s Young Architects Program competition. The temporary installation involved a 40-foot-tall tower with 10,000 bricks grown entirely from compostable materials — corn stalks and mushrooms — and developed in collaboration with innovative materials company Ecovative. That building was disassembled at the end of the summer and all of the bricks have been composted, returning to grade A soil.

(Source: The Living)

Virtual and augmented reality will be integrated into everyday apps

New virtual devices like the Oculus Rift and augmented reality applications will require an innovative generation of spatial designers. According to Autodesk Technology Futurist Jordan Brandt, current touchscreen interaction will give way to ‘Immersion Design’ that leverages the spatial dimensions offered through emerging augmented and virtual reality platforms.

There’s a bright future for architecture students, game designers and multi-dimensional talent to join app development teams.

(Source: Autodesk and Neoscape)

The amount of 3D data will rapidly increase

“With the ability to create 3D models on mobile devices through apps like 123D Catch or the Structure sensor, virtually anyone can begin to capture the spatial world around them. Coupled with the broader adoption of WebGL technology and 3D printing, we can expect an explosion in the amount of 3D data available in 2015. Responding to user demand, social platforms will enable direct sharing of 3D data and start to provide immersive, collaborative experiences.” — Autodesk Technology Futurist, Jordan Brandt

(Source: 123D Catch)

This article written by the Autodesk team originally appeared on Medium.