The latest innovation from Bang & Olufsen is an intelligent and sociable music system that integrates your music collection and streaming services into one.
Back at CES 2015, Danish audio maker Bang & Olufsen debuted its incredibly innovative BeoSound Moment, which integrates sound collections and services into a playful music system boasting what is surely the world's very first touch-sensitive wood interface. As advocates of both capacitive touch and Internet-enabled gadgets, we couldn't help but fall in love with this musical masterpiece. The smart device is packed with features, including the company's PatternPlay technology, which enables the system to learn its users' listening patterns, suggest music or programs that fit a specific time of day, and memorize preferences. It makes listening both familiar and explorative, with access to more than 35 million songs from the streaming service Deezer.
“Over time, BeoSound Moment will gradually start to know your taste in music, and be able to play what you most likely want to hear, without you even having to ask. Just like friendship, it only gets better with time.”
With just one touch of the elegant oak panel, music begins to play based on a user's personal preferences. The BeoSound Moment comes in two parts: a dock/base station and a detachable wireless controller with a wooden interface. The double-sided UI enables two different listening experiences. Those seeking a more traditional, controllable style will gravitate toward its aluminum panel, which is equipped with a touchscreen for direct interaction. In essence, it's a tablet.
Flip it over, however, and users will find an entirely different look: an oak side sporting a wheel control designed for one-touch access to exactly the sound experience that fits the listener's daily rhythm. The beautiful panel of touch-sensitive wood (embedded with capacitive sensors just under a thin layer of veneer) lets users have their favorite music flowing from the speakers with a single touch on the wheel.
The dual-part BeoSound Moment system is compatible with B&O's entire range of wired and wireless speakers, and its MoodWheel surfaces the digital tunes that best suit a listener's mood. Selection depends on how close the finger is to the wheel's center: the very middle plays only favorites, while the outer perimeter tempts listeners with more adventurous songs. The wheel is also divided into a color gamut that ranges from melancholic blue through a passionate red zone to an energetic yellow area. Combined, these two dimensions of the intuitive MoodWheel offer nearly limitless possibilities for defining your selection of music.
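The two dimensions described above (angle for mood, radius for familiarity versus adventurousness) can be sketched as a simple mapping. This is purely illustrative, not B&O's actual algorithm; the function name, sector layout, and scale are assumptions.

```python
import math

# Hypothetical sketch of a MoodWheel-style mapping: the angle around the
# wheel picks a mood color, and the distance from the center trades
# familiarity (0.0, favorites only) for adventurousness (1.0, the rim).

MOODS = ["melancholic (blue)", "passionate (red)", "energetic (yellow)"]

def moodwheel_select(x, y, radius=1.0):
    """Map a touch at (x, y) on a wheel of the given radius to a
    (mood, adventurousness) pair."""
    r = math.hypot(x, y)
    if r > radius:
        raise ValueError("touch outside the wheel")
    angle = math.atan2(y, x) % (2 * math.pi)
    mood = MOODS[int(angle / (2 * math.pi) * len(MOODS)) % len(MOODS)]
    adventurousness = r / radius
    return mood, adventurousness

# A touch near the center plays mostly favorites:
print(moodwheel_select(0.1, 0.0))
```

A real implementation would feed the adventurousness value into the recommendation engine as a weighting between a favorites playlist and the wider streaming catalog.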
It’s quite difficult to believe that just decades ago, the mere concept of a touch-enabled device could only be found in sci-fi flicks and novels. Nowadays, it’s practically impossible not to have a touchscreen gadget within an arm’s length. In fact, touchscreens are everywhere — from the thermostats in your homes, to center consoles of your cars, to the phones in your pockets.
The Electronic Sackbut is designed by Hugh Le Caine at his home studio in Ottawa, Ontario.
E.A. Johnson invents the finger-driven, capacitive touchscreen at the Royal Radar Establishment in Malvern, United Kingdom.
(Source: Ars Technica)
Dr. G. Samuel Hurst designs the first resistive touchscreen — almost by accident.
PLATO IV not only became one of the first generalized computer-assisted instruction systems, but the first to be used in a classroom setting. Students could answer questions with the tap of a finger using the device’s infrared touch panel.
One of the early implementations of mutual capacitance touchscreen technology is developed at CERN.
University of Toronto’s Nimish Mehta develops the first human-controlled multi-touch device, dubbed the “Flexible Machine Interface.”
HP Series 100 HP-150 becomes one of the earliest touchscreen computers.
Myron Krueger introduces Video Place, a vision-based system capable of tracking hands, fingers and people using a set of gestures.
Bob Boie of Bell Labs officially develops the first multi-touch overlay.
Casio rolls out its AT-550 watch with a touchscreen.
The Buick Riviera features a touchscreen in its Graphic Control Center.
(Source: Popular Mechanics)
IBM and BellSouth debut the first-ever touchscreen phone, the Simon Personal Communicator.
Palm Inc. releases the Pilot, the first generation of its PDA devices.
Wayne Westerman and John Elias create FingerWorks, a company that specializes in multi-gesture input devices.
Alias|Wavefront launches the Portfolio Wall for large design and 3D animation teams.
(Source: Car Design News)
Sony introduces mutual capacitive touch recognition with SmartSkin.
Andrew D. Wilson develops a gesture-based, 3D-capable imaging touchscreen called TouchLight.
(Source: Seattle Pi)
JazzMutant releases the Lemur, a music controller with a multi-touch screen.
Jeff Han unveils an interface-free, touch-driven computer screen at TED.
Apple successfully releases its touchscreen-equipped iPhone.
Microsoft introduces the Surface table.
Nortd Labs launches TouchKit, a DIY modular development solution to make multi-touch readily available in an open source manner.
(Source: Nortd Labs)
Apple introduces the iPad.
(Source: Discovery News)
Microsoft and Samsung partner to introduce the SUR40 touch-capable surface with PixelSense.
Atmel XSense is introduced to the world, enabling future curved surfaces and flexible displays.
The Atmel team exhibits AvantCar, a fully-functional center console equipped with two large curved touchscreen displays – without mechanical buttons.
The burgeoning Maker Movement paves the way for Bare Conductive to launch its [ATmega32U4 powered] Touch Board, now enabling everyone to easily transform any material or surface into a touch sensor.
Whirlpool imagines a kitchen of the future with a touchscreen stovetop capable of displaying recipes, social feeds, weather and more.
A team from Carnegie Mellon University’s Future Interfaces Group creates Skin Buttons, touch-sensitive projected icons made on a user’s skin.
(Source: Atmel Blog)
The Centre for Process Innovation devises an idea to remove passenger plane windows and replace them with OLED touchscreens.
(Source: Centre for Process Innovation)
What will be next? As we gaze into the future, unlimited-touch capability will open up endless possibilities for interface designers. From our touchscreen controllers to touch sensors and everything in between, Atmel has provided, and will continue to provide, the next-gen technologies enabling innovative and differentiated designs.
While smartwatches are a promising new interactive platform, their small size makes even basic actions cumbersome. As a result, the Carnegie Mellon team has designed a new way to “expand the interactive envelope around smartwatches, allowing human input to escape the small physical confines of the device.”
Tiny laser projectors integrated into the smartwatch render touch-sensitive icons, expanding the interaction region without increasing device size or, more importantly, sacrificing precious real estate on a wearer's arm.
“Maybe in 15 or 20 years you’ll have a device that’s as powerful as a smartphone but has no screen at all,” explained Chris Harrison, Head of the Future Interfaces Group. “Instead it’s like a little box of matches that you plunk down on the table in front of you, and now all of a sudden that table is interactive. Or a watch that’s screen-less. You could just snap your fingers and your whole arm becomes interactive.”
The proof-of-concept implementation can be used for a range of applications, many of which are typically found on a mobile device, such as accessing music, reading emails and text messages, and checking the time or setting an alarm.
The prototype smartwatch contains four fixed-icon laser projectors along with accompanying infrared proximity sensors. These are connected to an ATmega328P-based Femtoduino board, which communicates over USB with a host computer. Additionally, a 1.5-inch TFT LCD display is driven from the host computer. While the team used an external computer for prototyping, a commercial model would presumably be self-contained.
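One way such a system might turn raw proximity readings into tap events is a simple rising-edge threshold: a finger covering a projected icon drives that icon's IR sensor reading sharply upward. This sketch is illustrative only; the threshold value and function names are assumptions, not details of the CMU implementation.

```python
# Hypothetical tap detection for one projected skin icon: report the
# sample indices where the proximity reading crosses the threshold
# upward, i.e. a finger has just arrived over the icon.

TAP_THRESHOLD = 600  # raw ADC counts; assumed value for illustration

def detect_taps(samples, threshold=TAP_THRESHOLD):
    """Return indices of upward threshold crossings in a stream of
    proximity-sensor samples."""
    taps = []
    prev = 0
    for i, sample in enumerate(samples):
        if sample >= threshold and prev < threshold:
            taps.append(i)
        prev = sample
    return taps

readings = [120, 130, 650, 700, 200, 640, 610, 100]
print(detect_taps(readings))  # rising edges at indices 2 and 5
```

In the actual prototype this logic would run on the host computer (or, in a self-contained product, on the microcontroller), with debouncing and per-icon calibration added on top.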
“If you put a button on your skin, you expect people to be like, “What the, this is totally insane!” Harrison told Wired. “But actually people don’t generally react like that. People think it’s cool but they get over the coolness really fast and just start using it.”
Electronic Design Technology Editor Bill Wong recently had the chance to catch up with Patrick Hanley, Atmel Product Marketing Manager for Touch Technology, to talk about recent market trends as well as the company’s latest offerings. The interview, which was published on September 26, 2014, can be found below.
Wong: The world of touch-enabled devices is skyrocketing; from the proliferation of smartphones to tablets, almost everyone wants to tap a screen even if it’s not touch-enabled. What do you think has led to the widespread adoption?
Hanley: With the introduction of the iPhone in 2007, the general consumer market became more comfortable with and aware of capacitive touch-enabled products, which have since infiltrated our lives. For years prior, capacitive touch was an unfamiliar concept that consumers were less comfortable with.
Today most individuals approach all displays with the assumption it is touch-enabled. The world of touch can be seen in a vast range of formats and devices, at its most basic levels in buttons, sliders, and wheels, to more advanced touchscreens that provide multiple, true X/Y coordinates. These touch devices also reach a multitude of applications. From GPS systems to wearables to all-in-one PCs, there is a place for touch in all of these devices.
Wong: Atmel recently announced the latest in touch with the introduction of the mXT106xT family. Can you elaborate?
Hanley: The mXT106xT family is a continuation of our T-series family of products. It is aimed at the fastest-growing touchscreen market, screens between 7 and 8.9 inches. We introduced adaptive sensing, which is a hybrid of mutual and self capacitance. This enables the best glove, finger-hover, and stylus support available, even in the presence of moisture. Adaptive sensing is crucial, as it enables touch classification: the touch controller can determine the difference between a single finger, multi-touch, glove, hover, and stylus, and react to the user appropriately.
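The touch-classification idea described above can be sketched as a decision over the two sensing modes: mutual capacitance localizes contacts precisely, while self capacitance is sensitive to objects hovering above the panel. The categories come from the interview; the signal ranges, thresholds, and function name below are assumptions for illustration only, not Atmel's algorithm.

```python
# Illustrative touch classification from hybrid mutual/self capacitance
# signal deltas (arbitrary normalized units, assumed thresholds).

def classify_touch(mutual_delta, self_delta):
    """Classify a contact from its mutual- and self-capacitance deltas."""
    if mutual_delta > 0.8:
        return "finger"   # strong, localized mutual signal
    if mutual_delta > 0.3:
        return "glove"    # mutual signal attenuated through fabric
    if mutual_delta > 0.05 and self_delta < 0.2:
        return "stylus"   # small tip: weak but sharp mutual signal
    if self_delta > 0.3:
        return "hover"    # self capacitance sees a finger above the screen
    return "none"

print(classify_touch(0.9, 0.5))   # finger
print(classify_touch(0.4, 0.3))   # glove
print(classify_touch(0.1, 0.1))   # stylus
print(classify_touch(0.02, 0.6))  # hover
```

A production controller would add temporal filtering and moisture rejection before committing to a classification, but the core idea of branching on the two capacitance measurements is the same.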
We unveiled several new features, including the peripheral touch controller (PTC), the first touch controller that enables capacitive-button capabilities within the same controller without consuming any additional X/Y lines. The PTC improves noise immunity, eliminates external components, and simplifies the sensor design. Additional features include voltage triplers and non-HDI (high-density interconnect) packages. The voltage tripler reduces external BOM components, saving the customer space and cost. The non-HDI package enables customers to reduce PCB layers, further reducing costs.
Wong: Sounds interesting. So, we all know device features are everything, starting from the initial touch performance carrying through to everything else that influences the UI. How is Atmel aiming to continue improving these features?
Hanley: The user interface can make or break the success of a product. An intuitive, yet attractive, UI can create demand for products where customers “have to have” these new products. This is the easiest way for an OEM to differentiate their end product.
Improving stylus performance is vital for a variety of applications and vertical markets. Active stylus support is becoming a must-have for higher-end tablets, which are typically aimed at professional or artistic uses. Alternatively, passive stylus support is geared toward free-writing capabilities for general users and everyday uses. Passive stylus support works with virtually any stylus, even something as simple as a No. 2 pencil, ultimately revolutionizing the “pen-to-paper” experience.
Atmel also offers features like hover support. We continuously improve range and accuracy while decreasing manufacturing costs through the flexibility of new materials, as well as enable immersive features like advanced gesturing. Features such as hover empower our devices to be able to think beyond the surface, creating the next wave of smart, intuitive products.
Wong: I also see that Atmel’s maXStylus was announced earlier this year at CES. How is this transforming the “pen-to-paper” experience?
Hanley: Historically, to achieve high performance with active stylus solutions, OEMs were spending upwards of $30, adding more inductive layers to the sensor stack-up. The maXStylus is the first capacitive active stylus to provide accurate active-pen performance without an additional sensor layer. This reduces costs for tablets, laptops, and smartphones while maintaining excellent performance. The result for the user is fewer missed strokes and false detections, longer pen-hover range, and more accurate, readable letters and characters. You can even switch from the stylus to your fingers without compromising performance or battery life.
Wong: What upcoming trends and user-interface technologies are you most excited about?
Hanley: Fingerprint security is exciting. It enables improved security with ease-of-use capabilities and more. 3D gesturing is another interesting and popular technology. As seen in the film Minority Report, technologies such as 3D gesturing and motion control allow users to interact with their devices without touching them. It gives you freedom both mentally and physically.
Additionally, Atmel is the leader in sensor hubs, which enable sensor fusion. Sensor fusion leads to more accurate readings of the movements, locations, temperatures, etc., of an object, all while increasing the battery life of the product despite the always-on capabilities.
At Atmel, we believe that these technologies are allowing OEMs and developers to create best-in-class products that let industry leaders create what they have always imagined.
For the system designer, adding haptics capability and tactile feedback to a system is both easy and difficult. In general, the sensor has little or moderate impact on BOM, space, and cost, while the software has no BOM and minimal cost implications. But embedding the hardware drive circuitry and the physical actuator requires considerable assessment of design issues, power consumption, PC-board and enclosure real estate, and cost.
There are four key elements to implementing a haptics function in a system design:
1) The sensor, which may not be needed, depending on the system,
2) The software, which decides what output waveform is needed,
3) The hardware driver for the haptics actuator, and
4) The actuator itself.
These four blocks interact to close the loop and meet the unique requirements of haptics functionality. This article will take a deeper look at the sensor, hardware driver and the actuator in a haptics system.
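The four blocks above can be sketched as a minimal software pipeline. This is a toy illustration of the data flow only; every function name, event type, and waveform description is hypothetical.

```python
# Toy sketch of the four haptics blocks chained together.

def sensor(event):
    """Block 1: the sensor reports what the user did (may be absent in
    systems, like game rumble, where the software already knows)."""
    return {"type": event, "position": (120, 80)}

def choose_waveform(touch):
    """Block 2: the software decides which output waveform fits the event."""
    return "click" if touch["type"] == "button_press" else "buzz"

def driver(waveform):
    """Block 3: the hardware driver turns the waveform choice into a
    drive signal for the actuator (described here as text)."""
    return {"click": "short 150 Hz pulse", "buzz": "200 ms 175 Hz burst"}[waveform]

def actuator(drive):
    """Block 4: the actuator converts the drive signal into motion."""
    return f"vibrating: {drive}"

print(actuator(driver(choose_waveform(sensor("button_press")))))
```

In a real design the "loop" closes physically rather than in software: the motion the actuator produces is felt by the same finger the sensor is watching.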
While many haptics functions have or add a sensor to sense a physical parameter, others do not need a sensor function at all. For example, in a basic gaming system the software determines what the user is seeing and doing and generates outputs, such as controller rumble, that correspond to the action. Other systems make use of existing sensors and only need to add additional software and output drive. For example, the Atmel AT42QT1085 QTouch microcontroller is specifically designed for use with the existing capacitive-touch sensors of a screen (Fig. 1). It embeds software to interpret the user’s finger actions on various on-screen sliders, buttons and wheels, and then decides the appropriate response.
Fig. 1: The Atmel AT42QT1085 QTouch microcontroller is designed to work with a standard capacitive-sensing touchscreen and function as the interface between the panel and system, by assessing the meaning of the user’s finger swipes. (Source: Atmel)
More complicated designs need a haptics-specific sensor as their signal and data source, such as a multi-axis accelerometer that determines the motion of a game controller or robotic control. This accelerometer is almost always a MEMS device, and may provide a digitized output directly or an analog one that the system microcontroller must digitize. Fortunately, the resolution, accuracy and speed requirements are relatively modest: 8 to 10 bits at just several hundred samples per second or less.
Not all haptics inputs are based on motion and position. Some also sense the temperature of a user’s hands, which can be done with an inexpensive diode-based sensor or a low-cost easy-to-interface IC, such as a member of the Analog Devices AD590 family.
Driver and Actuator
In order to understand the hardware driver needed, a design engineer first must know what sort of haptics actuator will be used. This is one of the most difficult choices because the laws of physics clearly show that such actuation implies motion, and motion means “work,” so power needs to be delivered in the form of current and voltage. Each of the four most common actuators has pros and cons that affect the trade-off decision.
The design team must not only weigh the characteristics of the actuator, but also its unique electronic drive requirements. In contrast, the software that controls each actuator is relatively simple; OEMs can write their own code, they may be able to get it directly from the actuator vendor, or it may be available as a licensed library from the IC vendor or third-party sources, similar to what is done for audio sound-effect clips.
The four most common actuator options in use are:
1) An eccentric rotating mass (ERM) is a tiny motor with an off-balance rotating weight. The ERM is low cost, since it is used for “rumble” effects in high-volume applications such as gaming controllers. Its operating current is in the 150-mA range, and it vibrates between 100 and 200 Hz. The ERM’s response time is somewhat slow (50 ms), so it is not a good choice for keyboard or button response. The device is approximately 10 mm long, with a 5 mm swing of the eccentric mass.
2) The linear resonant actuator (LRA) is a motor-like component that looks like a small puck, typically about 10 mm in diameter and 4 mm thick. LRAs vibrate at a single frequency between 100 and 200 Hz, with a response that is somewhat faster than ERMs, at about 30 ms. They are more costly than ERMs but need less current (about 75 mA) for actuation, as well as an AC drive signal.
3) Piezoelectric modules can provide high vibration fidelity due to their fast response time (5 ms) over a wide frequency range of approximately 150 to 300 Hz, and thus are a good choice for typing feedback applications. They come in a long, relatively slim form factor, about 3.5 mm square and 40 mm long. Piezo modules are high performance actuators that can be used to create realistic haptic effects and can be applied in smartphones or tablets. Although they draw no more aggregate power than ERMs or LRAs, they do need a relatively high current pulse of 300 mA at around ±75 V (often derived from a 3 V supply). Piezo modules are a highly capacitive load, which the interface IC must be able to drive. Although they are ceramic-based transducers and might be assumed to be brittle, they are actually rugged due to their small mass and layered construction.
4) The electroactive polymer (EAP) actuator is a thin, flat, durable panel that is less than 1 mm thick. It requires a somewhat complex mounting arrangement on a sled to act in conjunction with the associated movable mass. Due to its fast response time of just a few milliseconds and resonance at around 100 Hz, it is a good choice for touchscreen feedback and is most commonly found in top-of-the-line smartphones. The biggest challenge with EAPs is that they require a drive voltage of approximately 800 V, which means a sophisticated and somewhat costly boost power supply subsystem.
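The trade-offs listed for the four actuators can be encoded as a small lookup table so they are easy to compare during part selection. The figures are taken from the text above; the EAP response time is stated there only as "a few milliseconds," so the 3 ms entry is an assumption, as are the structure and function names.

```python
# The four common haptics actuators, with figures from the article.
# EAP response time is an assumed point value within "a few milliseconds".

ACTUATORS = {
    "ERM":   {"response_ms": 50, "freq_hz": (100, 200), "drive": "150 mA DC"},
    "LRA":   {"response_ms": 30, "freq_hz": (100, 200), "drive": "75 mA AC"},
    "piezo": {"response_ms": 5,  "freq_hz": (150, 300), "drive": "300 mA pulses at +/-75 V"},
    "EAP":   {"response_ms": 3,  "freq_hz": (100, 100), "drive": "~800 V boost supply"},
}

def fast_enough(max_response_ms):
    """Return the actuators that respond within the given time budget,
    e.g. for crisp keypress feedback."""
    return sorted(name for name, a in ACTUATORS.items()
                  if a["response_ms"] <= max_response_ms)

# Only piezo and EAP react within a ~10 ms keypress-feedback budget:
print(fast_enough(10))
```

Response time is only one axis of the decision, of course; the drive-voltage column is usually what dominates the BOM discussion, as the following sections explain.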
There is no simple “best” answer for providing tactile haptics feedback to the user, as each actuator type has unique and sometimes complex electronic drive requirements. IC vendors have responded to this need with a variety of drivers that bridge the voltage/current differential between the conventional low voltage and currents of most systems and the needs of the various actuators.
For example, Texas Instruments offers the DRV2605 driver for ERM and LRA actuators, using the I2C bus for the processor interface (Fig. 2). It embeds a smart-loop architecture, which simplifies achieving auto-resonant drive for the LRA as well as feedback-optimized drive for the ERM. This feedback feature provides automatic overdrive and braking, which in turn allows creating a simplified input-waveform model, along with reliable motor control and consistent motor performance. An audio-to-haptics mode automatically converts an audio input signal to consistent haptic effects. There is also no need to design haptic waveforms with this IC, since the vendor offers a royalty-free library of over 100 haptics effects. An evaluation kit and board aids design and experimentation (Fig. 3).
Fig. 2: Texas Instruments offers the DRV2605 haptic driver IC for LRA and ERM transducers; it incorporates features which help eliminate the design complexities of haptic motor control, and also has an available library of royalty-free haptics drivers. (Source: Texas Instruments)
Fig. 3: For those unfamiliar with either haptics in general, or specific software and hardware issues related to LRAs and ERMs, the available DRV2605EVM-CT evaluation kit eases the implementation process and reduces the design-in time for the DRV2605 haptic driver. (Source: Texas Instruments)
For piezo transducers, driver components such as the Texas Instruments DRV2667 are available (Fig. 4). Like the ERM/LRA driver IC, it interfaces to the processor via the I2C bus, but the similarities end there. It integrates a 105 V boost switch, power diode, fully differential amplifier, and digital front-end including waveform synthesizer. Its output is designed for capacitive loads. For example, it can drive a 330 nF load at 100 Vp-p and 300 Hz, and a 680 nF load at 50 Vp-p, also at 300 Hz. As with most haptics drivers/transducers, vendors offer evaluation kits (Fig. 5), so designers can explore operating characteristics and subtle design issues of these somewhat unique electrical-to-motion transducers.
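The drive-current demand of such a capacitive load follows directly from i = C·dv/dt. As a sanity check on the operating points quoted above, for a sine drive v(t) = V_pk·sin(2πft) the peak current is C·2πf·V_pk. The helper name below is my own; the component values come from the text.

```python
import math

# Peak current needed to drive a capacitive piezo load with a sine wave:
# i_peak = C * 2*pi*f * V_pk, where V_pk is half the peak-to-peak swing.

def peak_drive_current(c_farads, vpp, freq_hz):
    v_pk = vpp / 2
    return c_farads * 2 * math.pi * freq_hz * v_pk

# The two DRV2667 operating points quoted in the text:
i1 = peak_drive_current(330e-9, 100, 300)  # 330 nF at 100 Vp-p, 300 Hz
i2 = peak_drive_current(680e-9, 50, 300)   # 680 nF at 50 Vp-p, 300 Hz
print(f"{i1 * 1e3:.1f} mA, {i2 * 1e3:.1f} mA")  # ~31 mA and ~32 mA
```

Both points land near 31-32 mA of continuous sinusoidal current, which is why the driver's high-current capability matters mainly for the fast slewing edges of sharper haptic waveforms rather than for steady-state sine drive.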
Fig. 4: For piezo actuators, the DRV2667 from Texas Instruments provides the fast-slewing high-voltage drive into a capacitive load that is required by this actuator technology. (Source: Texas Instruments)
Fig. 5: Evaluation kits such as the DRV2667EVM allow designers to easily interface to and drive a piezo transducer and thus understand its special operating characteristics. (Source: Texas Instruments)
There’s no doubt that adding haptics to a design adds another dimension to the user experience and human machine interface (HMI). For system designers, haptics technology also brings new challenges of actuator considerations, associated drive circuitry, and increases in space as well as power requirements.
According to market research firm DisplaySearch, the share of flexible smartphones in the overall smartphone market is expected to reach 40% in 2018, up from merely 0.2% last year. This should come with little surprise following recent analyst forecasts projecting the flexible display market to cross the $3.89 billion threshold by 2020 – growing at an impressively high CAGR from 2014 to 2020.
It should also be noted that Jennifer Colegrove, who runs Touch Display Research in Santa Clara, California, says the potential market for XSense and similar technologies will increase from $200 million in 2013 to $4 billion by 2020, primarily for tablet computers and other larger mobile devices.
So far, tech giants Samsung and LG have jumped into the curved smartphone waters as seen during last October’s unveilings of both the Galaxy Round and LG G-Flex, respectively.
“Touchscreens that are thin, light, responsive, sleek and flexible create a multitude of possibilities for the future of design beyond familiar industrial and consumer applications, including wearables, mobile devices, automotive infotainment and other curved surfaces,” explained Jalil Shaikh, Atmel’s Vice President and GM.
As we’ve previously discussed on Bits & Pieces, Atmel’s XSense continues to play a role in the rapidly evolving flexible display market. Essentially, XSense is a high-performance, highly flexible touch sensor which allows engineers to design devices with curved surfaces and even add functionality along product edges. This offers manufacturers the capability to build light-weight, sleek, edgeless smartphones, tablets and other touch-enabled devices.
Atmel’s Tech on Tour (ToT) crew has tirelessly crisscrossed the globe for many years, offering hands-on technical training for a wide range of company products. This month, Atmel kicks off a new ToT era with a tricked-out mobile trailer that is hitting the road.
In addition to hands-on training, Atmel will leverage the fact that it is at the heart of the Maker Movement and well positioned at the center of IoT innovation. From my perspective, the IoT will be led by a rising generation of tinkerers, inventors and innovators. These are dedicated people who are working out of universities, garages and small companies. We will go and meet them.
Our mobile Tech on Tour trailer provides a familiar setting for customers, engineers and Makers, as well as designers, students, professors and executives. We want to meet people in the market working on projects like electronics, robotics, transportation, alternative energy and sustainable agriculture. That is why we are offering hands-on training and access to soldering irons, along with a chance to brainstorm about the future together.
To be sure, the ToT trailer is quite a scalable platform, functioning not only as a mobile training center, a showroom and conference center, but also as a trade show booth, entertainment center, content creation platform, executive meeting center, recruitment platform, tech support center and employee engagement engine.
On top of that, we are partnering with all global distribution partners, customers, third parties, Makers, government officials and universities to bring Atmel to the market. We are very excited about the concept and the pull from the market and distribution partners has been very promising.
So if you are attending one of our ToT events, or happen to see us stopping to refuel, be sure to come on over and take a selfie with the Atmel crew and our tech-packed mobile trailer. Don’t be camera shy, because you could win a brand new Samsung Galaxy Tab 3!
Going to the 2013 International Consumer Electronics Show (CES) in Las Vegas next month? In our meeting room at the show, Atmel will showcase embedded technologies that inspire smart, connected designs. Among our many demos:
QTouchADC and QMatrix touch sensing algorithms, proximity sensing, haptics and buttons, sliders and wheels functionality
To schedule a meeting with Atmel executives and Tech Experts at CES, contact your local sales representative or send an email to firstname.lastname@example.org. We will also be in the ZigBee Pavilion with demos of our ZigBee Light Link and Wireless Composer/Sniffer solutions.