Monday, November 30, 2015

TerraSwarm Funded Researcher Roozbeh Jafari's Group Developing Wearable Technology to Translate Sign Language into Text

Using a system of sensors to record and decode the muscle activity associated with each sign, TerraSwarm funded researcher Roozbeh Jafari's group at Texas A&M University is developing a wearable technology to translate sign language into text. The technology could lower communication barriers for people who use sign language.

The current proof-of-concept device sends the resulting translation to a computer or smartphone over Bluetooth. The system is designed to learn from its user, adapting to each person's individual movements. Sophisticated algorithms handle this adjustment and also enable real-time translation. Going forward, the team hopes to develop a smaller device that can be worn like a watch, add a synthetic voice, and translate full sentences rather than individual signs.
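
The adjustment process can be pictured as updating a per-sign template as the wearer provides labeled examples. The sketch below is illustrative only, not the group's algorithm: it assumes windowed sensor features and a simple nearest-centroid classifier, with made-up feature values.

```python
import numpy as np

class AdaptiveSignClassifier:
    """Nearest-centroid sign classifier whose templates adapt to the wearer."""

    def __init__(self, learning_rate=0.1):
        self.centroids = {}     # sign label -> stored feature template
        self.lr = learning_rate

    def enroll(self, label, features):
        """Create or refine the template for one sign from a user example."""
        v = np.asarray(features, dtype=float)
        if label not in self.centroids:
            self.centroids[label] = v.copy()
        else:
            # Nudge the template toward this wearer's own execution of the
            # sign, so recognition tracks individual movement over time.
            self.centroids[label] += self.lr * (v - self.centroids[label])

    def classify(self, features):
        """Return the sign whose template is nearest to the new window."""
        v = np.asarray(features, dtype=float)
        return min(self.centroids,
                   key=lambda label: np.linalg.norm(v - self.centroids[label]))

# Hypothetical 3-element feature windows for two signs.
clf = AdaptiveSignClassifier()
clf.enroll("hello", [0.9, 0.1, 0.2])
clf.enroll("thanks", [0.1, 0.8, 0.7])
print(clf.classify([0.85, 0.15, 0.25]))   # -> hello
```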

Tuesday, November 17, 2015

TerraSwarm funded paper “Fast Redistribution of a Swarm of Heterogeneous Robots” Receives Best Paper Award at the 9th EAI International Conference on Bio-inspired Information and Communications Technologies

The paper, "Fast Redistribution of a Swarm of Heterogeneous Robots," received the Best Paper Award at the 9th EAI International Conference on Bio-inspired Information and Communications Technologies in New York.

The paper states: "The authors gratefully acknowledge the support of ONR grants N00014-15-1-2115 and N00014-14-1-05-10, ARL grant W911NF-08-2-0004, NSF grant IIS-1426840, and TerraSwarm, one of six centers of STARnet, a Semiconductor Research Corporation program sponsored by MARCO and DARPA."

Amanda Prorok, M. Ani Hsieh, Vijay Kumar. Fast Redistribution of a Swarm of Heterogeneous Robots. Proceedings of the 9th EAI International Conference on Bio-inspired Information and Communications Technologies, December 3-5, 2015, New York.

Abstract:
We present a method that distributes a swarm of heterogeneous robots among a set of tasks that require specialized capabilities in order to be completed. We model the system of heterogeneous robots as a community of species, where each species (robot type) is defined by the traits (capabilities) that it owns. Our method is based on a continuous abstraction of the swarm at a macroscopic level, as we model robots switching between tasks. We formulate an optimization problem that produces an optimal set of transition rates for each species, so that the desired trait distribution among the tasks is reached as quickly as possible. Our solution is based on an analytical gradient, and is computationally efficient, even for large choices of traits and species. Finally, we show that our method is capable of producing fast convergence times when compared to state-of-the-art methods.
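
To make the structure of the method concrete, here is a toy sketch with invented numbers. The paper derives an analytical gradient; this sketch substitutes a numerical one, but the pieces are the same: simulate each species' task-switching dynamics under candidate rates, measure how far the resulting trait distribution is from the target, and descend on the rates.

```python
import numpy as np

# Toy instance: 2 species (robot types), 3 tasks, 2 traits.
# Q[s, u] = 1 if species s owns trait u.
Q = np.array([[1.0, 0.0],
              [0.0, 1.0]])
x0 = np.array([[30.0,  0.0, 0.0],    # species-0 robots per task
               [ 0.0, 20.0, 0.0]])   # species-1 robots per task
y_star = np.array([[10.0, 10.0, 10.0],   # desired trait presence per task
                   [ 5.0,  5.0, 10.0]])
T, dt, n_tasks = 4.0, 0.02, 3

def simulate(rates):
    """Evolve task populations under each species' transition-rate matrix."""
    x, eye = x0.copy(), np.eye(n_tasks)
    for _ in range(int(T / dt)):
        for s in range(x.shape[0]):
            R = rates[s] * (1 - eye)         # R[i, j]: rate of task j -> i
            K = R - np.diag(R.sum(axis=0))   # generator (columns sum to 0)
            x[s] = x[s] + dt * (K @ x[s])
    return x

def cost(rates):
    y = Q.T @ simulate(rates)                # trait distribution at time T
    return np.sum((y - y_star) ** 2)

def num_grad(rates, eps=1e-4):
    g, base = np.zeros_like(rates), cost(rates)
    for idx in np.ndindex(*rates.shape):
        pert = rates.copy()
        pert[idx] += eps
        g[idx] = (cost(pert) - base) / eps
    return g

rates = np.full((2, n_tasks, n_tasks), 0.1)
for _ in range(40):                          # projected gradient descent
    rates = np.clip(rates - 2e-4 * num_grad(rates), 0.0, 2.0)
print("cost after optimization:", round(cost(rates), 2))
```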

Wednesday, November 4, 2015

TerraSwarm Funded PI Vijay Kumar Receives $5.5 Million DARPA Grant

A three-year, $5.5 million grant has been awarded to the GRASP Laboratory at the University of Pennsylvania by the Defense Advanced Research Projects Agency to create new robots capable of navigating unfamiliar environments independently and swiftly.

TerraSwarm funded PI Vijay Kumar and fellow researchers Daniel Lee, Camillo J. Taylor, Kostas Daniilidis, and Jianbo Shi will combine their expertise to develop autonomous robots that weigh less than three kilograms and can fly at 20 meters per second. Possible applications include search and rescue and disaster response, particularly in situations that are hazardous for humans.

Friday, September 25, 2015

TerraSwarm funded paper "Robust Online Monitoring of Signal Temporal Logic" Receives Best Paper Award at Runtime Verification '15, Vienna

The paper, "Robust Online Monitoring of Signal Temporal Logic," received the Best Paper Award at Runtime Verification '15 in Vienna.

The paper states: "This work was supported in part by TerraSwarm, one of six centers of STARnet, a Semiconductor Research Corporation program sponsored by MARCO and DARPA, by NSF Expeditions grant CCF-1139138, and by Toyota under the CHESS center at UC Berkeley."

Citation:

Jyotirmoy V. Deshmukh, Alexandre Donze, Shromona Ghosh, Xiaoqing Jin, Garvit Juniwal, Sanjit A. Seshia. Robust Online Monitoring of Signal Temporal Logic, Runtime Verification '15, Vienna, September 22, 2015.

Abstract:

Signal Temporal Logic (STL) is a formalism used to rigorously specify requirements of cyber-physical systems (CPS), i.e., systems mixing digital or discrete components in interaction with a continuous environment or analog components. STL is naturally equipped with a quantitative semantics which can be used for various purposes: from assessing the robustness of a specification to guiding searches over the input and parameter space with the goal of falsifying the given property over system behaviors. Algorithms have been proposed and implemented for offline computation of such quantitative semantics, but only a few methods exist for an online setting, where one would want to monitor the satisfaction of a formula during simulation. In this paper, we formalize a semantics for robust online monitoring of partial traces, i.e., traces for which there might not be enough data to decide the Boolean satisfaction (and to compute its quantitative counterpart). We propose an efficient algorithm to compute it and demonstrate its usage on two large-scale real-world case studies coming from the automotive domain and from CPS education in a Massively Open Online Course (MOOC) setting. We show that savings in computationally expensive simulations far outweigh any overheads incurred by an online approach.
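
As a much-simplified illustration of the idea (not the authors' algorithm or tool), consider online monitoring of the single property G_[0,H](x < c), "x stays below c over the horizon H." Each new sample tightens an interval that brackets the robustness of every possible completion of the partial trace, so an expensive simulation can be stopped as soon as the sign of the robustness is decided:

```python
import math

class GloballyMonitor:
    """Online interval robustness for G_[0,H](x < c) over a partial trace.

    [lo, hi] brackets the robustness of every possible completion of the
    trace, assuming the signal stays within a known bound x <= x_max.
    """

    def __init__(self, c, horizon, x_max):
        self.c, self.horizon = c, horizon
        self.future_lo = c - x_max     # worst case from samples not yet seen
        self.prefix_min = math.inf     # min of (c - x(t)) over the prefix

    def update(self, t, x):
        self.prefix_min = min(self.prefix_min, self.c - x)
        hi = self.prefix_min
        lo = hi if t >= self.horizon else min(hi, self.future_lo)
        return lo, hi

mon = GloballyMonitor(c=1.0, horizon=3.0, x_max=2.0)
for t, x in [(0.0, 0.2), (1.0, 0.5), (2.0, 1.4)]:
    lo, hi = mon.update(t, x)
    print(f"t={t:.0f}: robustness in [{lo:.2f}, {hi:.2f}]")
    if hi < 0:
        print("falsified; the simulation can be stopped early")
```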

Thursday, September 24, 2015

TerraSwarm researcher Alex Halderman is one of PopSci's Brilliant 10

University of Michigan Professor and TerraSwarm-funded researcher J. Alex Halderman has been named one of PopSci's Brilliant 10.  The article "Brilliant 10: Alex Halderman Strengthens Democracy Using Software," discusses Halderman's 2010 efforts surrounding a mock Washington D.C. election where researchers were invited to break into the electronic voting system.  Halderman and his group were able to take complete control of the system, which they directed to play the Michigan fight song each time a vote was cast.

The article also covers TapDance, software that allows citizens of countries with restrictive firewalls to bypass government censors. TapDance is described in the following paper:

* Eric Wustrow, Colleen Swanson, Alex Halderman. TapDance: End-to-Middle Anticensorship without Flow Blocking, USENIX Security 2014, August 20, 2014.

Abstract: In response to increasingly sophisticated state-sponsored Internet censorship, recent research has proposed a new approach to censorship resistance: end-to-middle proxying. This concept, developed in systems such as Telex, Decoy Routing, and Cirripede, moves anticensorship technology into the core of the network, at large ISPs outside the censoring country. In this paper, we focus on two technical barriers to the deployment of end-to-middle proxy designs: the need to selectively block flows, and the need to observe both directions of a connection. We propose a new construction, TapDance, that avoids these shortcomings. To accomplish this, we employ a novel TCP-level technique that allows the anticensorship station at an ISP to function as a passive network tap, without an inline blocking component. We also apply a novel steganographic encoding to embed control messages in TLS ciphertext, allowing us to operate on HTTPS connections even with asymmetric flows. We implement and evaluate a proof-of-concept prototype of TapDance with the goal of functioning with minimal impact on normal ISP operations.

This work was supported in part by TerraSwarm, one of six centers of STARnet, a Semiconductor Research Corporation program sponsored by MARCO and DARPA.


Note that in 2014, University of Michigan Professor and TerraSwarm-funded researcher Prabal Dutta was named one of the Brilliant 10.  See "The Brilliant Ten: Prabal Dutta Powers The Internet of Things."

Madhur Behl wins the Best-in-Session award (IoT Systems session) at the 2015 SRC TECHCON

Madhur Behl (a final-year PhD student at UPenn) won the Best-in-Session award for the IoT Systems session at the 2015 SRC TECHCON on September 22, 2015. The award recognizes his TerraSwarm work on the Data-driven Demand Response Recommender System (DR-Advisor), presented under the title "Sometimes, Money Does Grow on Trees".

Last year, Zhihao Jiang (also a final-year PhD student at UPenn) won the Best-in-Session award for his TerraSwarm work on Formal Verification of Closed-loop Medical Devices.

Wednesday, September 16, 2015

TerraSwarm Robot Swarm, Connected Sensors, Video Summarization and Audio Machine Learning Technology at DARPA Wait, What? Conference

TerraSwarm participated in the DARPA Wait, What? technology forum held in St. Louis, September 9-11, 2015.

The Wait, What? demo illustrated TerraSwarm infrastructure innovations in robot swarms and connected sensors. The demonstration included cooperating robots solving more than one problem, and on-the-fly re-tasking to move from one problem to the next.

At the DARPA Wait, What? conference, the TerraSwarm team from five universities integrated five exciting, best-in-class technologies to showcase the types of applications enabled by the TerraSwarm research center. The demo focused on a robot delivery service in which a team of Scarab robots (designed, built, and programmed at the University of Pennsylvania) delivered snacks to onlookers at the touch of a button. The user interface for the demo was a smartphone app running on unmodified smartphones. After an attendee made a selection, the system dispatched a robot carrying the snack directly to them, even as they walked around. While the robots' main goal was the delivery application, a context-aware machine learning application ran in the background that could interrupt the robots in response to an event, demonstrating how they could be repurposed in real time. Finally, one of the robots simultaneously performed a surveillance task by carrying a video camera. The video stream was fed to a video summarizer, which extracted the most interesting and novel clips from the stream in real time. Instead of requiring an hour of viewing, what the robot had seen was compressed into a minute-long summary of several interesting clips.

This composition of various technologies highlights what can be done when independent systems work together. Applications in smart manufacturing, warehouse management, in-the-field delivery, disaster relief, and monitoring and security can all benefit from these services interoperating.

The demo centered on our smartphone app, which both presented visitors with the snack options the robots could deliver and ran an indoor localization service known as ALPS (developed by a team at Carnegie Mellon University) in the background. ALPS uses fixed anchor nodes, positioned around the demo space, that periodically transmit ultrasonic chirps to localize off-the-shelf smartphones. While the chirps are at a frequency above the human hearing range, they are still audible to the microphone circuitry in phones; based on when the phone receives each chirp and the speed of sound, the ALPS system can calculate the location of the phone.
Figure 1. Students using the app to request a delivery.
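
ALPS handles many practicalities this sketch ignores (chirp detection, unsynchronized clocks, anchor scheduling), but the core geometry is simple: each chirp's time of flight times the speed of sound gives a range to a known anchor, and a least-squares fit over several anchors yields the position. All numbers below are invented for illustration.

```python
import numpy as np

SPEED_OF_SOUND = 343.0   # meters per second at room temperature

def trilaterate(anchors, ranges):
    """Least-squares 2-D position from anchor positions and chirp ranges."""
    a0, r0 = anchors[0], ranges[0]
    # Subtracting the first range equation linearizes the problem:
    # 2 (a_i - a_0) . p = (|a_i|^2 - |a_0|^2) - (r_i^2 - r_0^2)
    A = 2 * (anchors[1:] - a0)
    b = (np.sum(anchors[1:] ** 2, axis=1) - np.sum(a0 ** 2)
         - (ranges[1:] ** 2 - r0 ** 2))
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos

# Invented 8 m x 6 m room with four anchors and a phone at (3, 2).
anchors = np.array([[0.0, 0.0], [8.0, 0.0], [0.0, 6.0], [8.0, 6.0]])
true_pos = np.array([3.0, 2.0])
time_of_flight = np.linalg.norm(anchors - true_pos, axis=1) / SPEED_OF_SOUND
ranges = SPEED_OF_SOUND * time_of_flight   # what the phone would measure
print(trilaterate(anchors, ranges))        # ~ [3. 2.]
```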

The attendee's snack request and current location are fed to the application coordinator, an Accessors-based application running on the Ptolemy software platform (both developed by a team at UC Berkeley). Ptolemy is a visual, actor-oriented programming environment in which applications are composed by connecting the inputs and outputs of system blocks. Accessors enable those blocks to represent external, real-world devices, such as the robots or the phones running ALPS. Ptolemy provides a central point to describe and implement the application, as well as to manage the data flowing through the system.
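
The sketch below mimics that composition style in miniature; it is not the Accessors or Ptolemy API, just an illustration of wiring independently developed blocks together by their ports.

```python
class Actor:
    """A toy actor: named output ports wired to other actors' input ports."""

    def __init__(self, name, fire):
        self.name, self.fire = name, fire
        self.wires = {}   # output port -> list of (destination, input port)

    def connect(self, out_port, dest, in_port):
        self.wires.setdefault(out_port, []).append((dest, in_port))

    def receive(self, port, value):
        outputs = self.fire(port, value) or {}
        for out_port, out_value in outputs.items():
            for dest, in_port in self.wires.get(out_port, []):
                dest.receive(in_port, out_value)

# Hypothetical blocks for the snack-delivery demo, wired by ports only.
phone = Actor("phone", lambda port, v: {"request": v})
alps = Actor("alps", lambda port, v: {"location": v})
robot = Actor("robot", lambda port, v: print(f"robot <- {port}: {v}"))

phone.connect("request", robot, "goal")
alps.connect("location", robot, "waypoint")

phone.receive("tap", {"snack": "candy", "user": 42})   # robot <- goal: ...
alps.receive("chirp", (3.0, 2.0))                      # robot <- waypoint: ...
```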
Figure 2. Scarab Robot with camera used for video summarization and candy. (source: Ilge Akkaya)


The robots operate semi-autonomously and use the ALPS position updates as waypoints to complete their deliveries. With a known map of the space, the robots receive their next waypoint, calculate a path to that goal, and navigate that path independently. If any obstacles present themselves, including other robots, people walking around, or changes in the environment, the robots detect the change and navigate around the obstruction (a minimal sketch of this replanning loop follows the figures below).
Figure 3. The map showing the direction of robots. (source: Ilge Akkaya)

Figure 4. Scarab robots in motion, making deliveries. (source: Ilge Akkaya)
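
The robots' actual planners are more capable than this, but the replan-to-waypoint loop can be sketched with a shortest-path search over an occupancy grid; whenever a new waypoint arrives or an obstacle changes the grid, the robot simply plans again.

```python
from collections import deque

def plan_path(grid, start, goal):
    """Shortest path on an occupancy grid (0 = free, 1 = blocked) via BFS."""
    rows, cols = len(grid), len(grid[0])
    parent = {start: None}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == goal:                      # walk parents back to start
            path = []
            while cell is not None:
                path.append(cell)
                cell = parent[cell]
            return path[::-1]
        r, c = cell
        for step in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = step
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0 and step not in parent):
                parent[step] = cell
                queue.append(step)
    return None                               # waypoint unreachable

# A person steps into row 1; the robot replans around the new obstacle.
grid = [[0, 0, 0, 0],
        [0, 1, 1, 0],
        [0, 0, 0, 0]]
print(plan_path(grid, (0, 0), (2, 3)))
```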


While the robots are making deliveries, a microphone passively listens in the background, running a machine learning model trained to detect applause (as set up by the UC Berkeley team). The relevant features of the audio stream are extracted locally and then processed by the GMTK machine learning toolkit (developed at the University of Washington) to determine whether there was applause in the space. When applause is detected, the robots are interrupted from their deliveries, spin in place, and then continue serving snacks. This reconfiguration is a placeholder for a more serious event, such as the robots re-tasking themselves to respond when a disaster is detected.
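
A full applause detector would use trained models (GMTK, in the demo); the sketch below substitutes two classic hand-built audio features, broadband energy and zero-crossing rate, both of which are high for the noise-like sound of applause. Thresholds and signals are invented.

```python
import numpy as np

def applause_features(frame):
    """Crude applause cues: frame energy and zero-crossing rate."""
    energy = np.mean(frame ** 2)
    zcr = np.mean(np.abs(np.diff(np.signbit(frame).astype(int))))
    return energy, zcr

def detect_applause(frames, energy_thresh=0.01, zcr_thresh=0.3):
    for frame in frames:
        energy, zcr = applause_features(frame)
        # If detected, the coordinator could interrupt the robots here.
        yield energy > energy_thresh and zcr > zcr_thresh

# Synthetic test: a quiet hum, then a burst of white noise ("applause").
rng = np.random.default_rng(1)
quiet = 0.01 * np.sin(np.linspace(0, 40, 1024))
noisy = 0.5 * rng.standard_normal(1024)
print(list(detect_applause([quiet, noisy])))   # [False, True]
```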

The robot that is filming its environment streams the video feed to a video summarization tool (developed by a team at the University of Washington). The video summarizer analyzes the stream in real time for interesting and novel clips that best describe the video as a whole, and it aggregates these clips into a summary. Essentially, the information inside the video is preserved, but the length is significantly reduced, making an otherwise intractable problem (monitoring the video feeds from a swarm of robots) feasible.
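
Real summarizers use far richer notions of novelty, but the skeleton is recognizable in a few lines: score each clip by how much it differs from what the camera has typically seen, then keep the top-scoring clips. Everything below is synthetic.

```python
import numpy as np

def summarize(frames, clip_len=30, keep=3):
    """Pick the `keep` most novel clips by difference from a running mean."""
    mean_frame = np.zeros_like(frames[0], dtype=float)
    scores = []
    for start in range(0, len(frames) - clip_len + 1, clip_len):
        clip = frames[start:start + clip_len]
        novelty = float(np.mean([np.mean(np.abs(f - mean_frame)) for f in clip]))
        scores.append((novelty, start))
        # Slowly track "typical" content so repeated scenes stop being novel.
        mean_frame = 0.9 * mean_frame + 0.1 * np.mean(clip, axis=0)
    top = sorted(scores, reverse=True)[:keep]
    return sorted(start for _, start in top)   # clip start indices, in order

# Synthetic "video": mostly static frames with two bursts of change.
rng = np.random.default_rng(2)
frames = [np.zeros((8, 8)) for _ in range(300)]
for i in list(range(60, 90)) + list(range(200, 230)):
    frames[i] = rng.random((8, 8))
print(summarize(frames))   # clips covering the two bursts rank highest
```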

These systems represent the cutting-edge technologies and research under way inside the TerraSwarm research center. To demonstrate how they can be connected to create otherwise impossible applications, the team at the University of Michigan focused on the systems engineering aspect: architecting the application, bringing together the components, defining interfaces between them, and ensuring the demo worked as expected. By successfully incorporating state-of-the-art research projects from multiple universities, the Wait, What? demo showcased the true potential of multi-institution research collaborations like TerraSwarm.

(Source of text: Brad Campbell)

In addition, Professor Alberto Sangiovanni-Vincentelli gave the following presentation:
Design Technology for the Trillion-Device Future, September 10, 2015.

See also: Rebecca Boyle, "Wait, What? The Most Amazing Ideas From DARPA's Tech Conference," September 15, 2015, popsci.com.

Saturday, September 12, 2015

IoT Lacks Tools: DARPA can play a role in building the infrastructure

On September 10, 2015, TerraSwarm researcher Professor Alberto Sangiovanni-Vincentelli spoke at DARPA's Wait, What? conference about "Design Technology for the Trillion-Device Future" (slides, video). His talk was covered by Rick Merritt in the EE Times article "IoT Lacks Tools, Says EDA Vet."

In the talk, Professor Sangiovanni-Vincentelli suggested that IoT tools should be similar to the EDA tools used to design large chips. However, IoT is far more heterogeneous, and in IoT the notion of time must be treated as a first-class citizen.

Professor Sangiovanni-Vincentelli was quoted as stating: "The end devices have to cost a few cents each, so how can you make money from that? Another question is who will build the infrastructure…DARPA can play a role as an independent partner to coalesce people and help build the infrastructure."

The talk was also covered by Sharon Gaudin for Computerworld in an article titled "Get ready to live in a trillion-device world," which includes the following: "The brain-machine interface will have sensors placed in our brains, collecting information about what we think and transmitting it to this complex world that is surrounding us," said Sangiovanni-Vincentelli. "I think I'd like to have an espresso and then here comes a nice little robot with a steaming espresso because I thought about it."

Thursday, August 13, 2015

TerraSwarm Students Win Best Environmental Impact Award for AC Plug-Load Meter

TerraSwarm students Sam DeBruin and Branden Ghena recently won the ‘Best Environmental Impact Award’ at the 2015 TI Innovation Challenge North America Design Contest for their AC plug-load meter, PowerBlade.

According to the students' webpage and project submission page, the new design enables affordable large-scale studies of plug-load energy usage, an area of growing national importance. PowerBlade is the smallest, lowest-cost, and lowest-power AC plug-load meter that measures real, reactive, and apparent power, and it reports these data, along with cumulative energy consumption, over an industry-standard Bluetooth Low Energy radio. PowerBlade’s footprint is only 1.1 inch x 1.1 inch with a 1/16-inch profile, allowing it to sandwich between the plug and the wall outlet. It operates on a budget of less than 1 mA, which still provides usable bandwidth for its application.
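
The three quantities PowerBlade reports are related by S^2 = P^2 + Q^2: apparent power S is the product of RMS voltage and RMS current, real power P is the average of instantaneous power, and reactive power Q is what remains. A minimal sketch of that arithmetic on synthetic 60 Hz waveforms (this is not PowerBlade firmware, and the load values are invented):

```python
import numpy as np

def power_metrics(v, i):
    """Real (P), apparent (S), and reactive (Q) power from sampled v and i."""
    p_real = np.mean(v * i)                    # average instantaneous power
    s_apparent = np.sqrt(np.mean(v ** 2)) * np.sqrt(np.mean(i ** 2))  # Vrms*Irms
    q_reactive = np.sqrt(max(s_apparent ** 2 - p_real ** 2, 0.0))
    return p_real, s_apparent, q_reactive

# Synthetic 60 Hz load: 120 Vrms, 1 Arms, current lagging voltage by 30 deg.
t = np.linspace(0, 1 / 60, 1000, endpoint=False)
v = 120 * np.sqrt(2) * np.sin(2 * np.pi * 60 * t)
i = 1.0 * np.sqrt(2) * np.sin(2 * np.pi * 60 * t - np.pi / 6)
P, S, Q = power_metrics(v, i)
print(f"P={P:.1f} W  S={S:.1f} VA  Q={Q:.1f} VAR")   # ~103.9 W, 120 VA, 60 VAR
```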

Sam DeBruin and Branden Ghena are graduate students at the University of Michigan and are advised by Assistant Professor Prabal Dutta.

To read more about PowerBlade, visit the team's website or their contest submission page.

For more information about the TI Innovation Challenge, see the contest website.

Wednesday, July 29, 2015

TerraSwarm Students at UTD Win Second Place in 2015 TI Innovation Challenge

TerraSwarm UTD students Jian Wu, Zhongiun Tian, and Lu Sun won second place in the highly competitive 2015 Texas Instruments Innovation Challenge for their project entitled “Real-time American Sign Language Recognition Using Wrist-worn Inertial and Surface EMG Sensors”. Their second-place finish, out of a field of 276 competitors, earned a $7,500 cash award.

Their submission for the contest, a real-time American Sign Language recognition system, combines surface electromyography (sEMG) with a wrist-worn inertial sensor at the feature level. The inertial sensor-based system is capable of capturing hand orientations and hand and arm movements during a gesture while the sEMG measures finger movements and the muscle activity patterns for the hand and arm. Coupling these two systems increases accuracy, thus enabling recognition of a larger vocabulary of signs.
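
Feature-level fusion means the two modalities' features are concatenated into a single vector before classification, rather than each sensor voting separately on the sign. A minimal sketch with generic time-domain features (the team's actual feature set is not described here):

```python
import numpy as np

def fuse_features(imu_window, semg_window):
    """Feature-level fusion: one vector combining both sensing modalities."""
    feats = []
    for sig in (imu_window, semg_window):   # each: (samples, channels)
        feats += [sig.mean(axis=0),                           # mean level
                  sig.std(axis=0),                            # variability
                  np.abs(np.diff(sig, axis=0)).mean(axis=0)]  # waveform length
    return np.concatenate(feats)

# One 200-sample window: 6-axis IMU (orientation, hand/arm motion) plus
# 4-channel sEMG (finger and muscle activity). The fused vector would then
# feed a single sign classifier.
rng = np.random.default_rng(3)
imu = rng.standard_normal((200, 6))
semg = rng.standard_normal((200, 4))
fused = fuse_features(imu, semg)
print(fused.shape)   # (30,) -> 3 features x (6 + 4) channels
```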

The Texas Instruments Innovation Challenge is an annual contest designed to encourage engineering students to design projects that utilize TI technology. All designs must incorporate the use of three different TI devices, as well as TI integrated circuits (ICs) and a TI processor. A panel of judges selects the winners based on originality, creativity, real-world application, level of engineering analysis, usage of TI ICs and processors, and a working demonstration video of the design.

See the UTD team’s video submission and read more about the technology they have developed at: 
https://e2e.ti.com/group/universityprogram/students/m/students_repository/665999 .


Thursday, May 14, 2015

Advancing the State-of-the-Art in Medical and Cyber-Physical Systems

TerraSwarm PIs Vijay Kumar and Rahul Mangharam will participate in two NSF projects awarded $8.75 million over five years to advance the state of the art in medical applications and cyber-physical systems (CPS).

UPenn professor Vijay Kumar will lead the bioCPS project, which aims to combine microrobots with synthetic cells to perform functions that may one day lead to tissue and organ regeneration. He will join computer scientists, roboticists, and biologists from Boston University and MIT to develop a system that combines the capabilities of nanoscale robots with specially designed synthetic organisms. Together, they believe this hybrid "bio-CPS" will be capable of performing previously impossible functions, from microscopic assembly to cell sensing within the body. Researchers so far have not been able to control and coordinate the behavior of synthetic cells in isolation, but the introduction of externally controllable microrobots may change that.

The Cyberheart Project is a virtual, patient-specific human heart model that can be used to improve and accelerate medical-device testing. This effort is led by UPenn associate professor Rahul Mangharam. He will join researchers from six universities and centers to develop far more realistic cardiac models than currently exist. Such models are essential to improving the software that governs implantable devices such as pacemakers, defibrillators, and cardiac rhythm therapy devices. The team's "Cyberheart" can be used to test and validate medical devices faster and at a far lower cost than existing methods. The models can also be used to design optimal procedures on a patient-specific heart with fewer risks to the patient.

To read more about these projects, go to the NSF press release "Exploring a new frontier of cyber-physical systems: The human body" or the University of Pennsylvania news release "Penn Researchers Join Two NSF Projects on Medical Cyber-Physical Systems".

Tuesday, April 21, 2015

UTD announces official opening of their TI Innovation Lab

The University of Texas at Dallas (UTD) announced the official opening of its new Texas Instruments Innovation Lab (Texas Instruments Supports Innovative Research with New Lab). The 1,100-square-foot facility will give students and faculty in engineering and computer science a place to develop electronic and electromagnetic solutions to a variety of engineering problems. The TI lab, located within the UTDesign Studio, will be used for academic classes, independent and collaborative student projects, and workshops taught by TI staff. The TI workshops will focus on how to design well-rounded products that account for reliability, cost savings, product size, and harsh environments.

The official opening included a demonstration of new technology developed by TerraSwarm PI and UTD associate professor Roozbeh Jafari and his students, in which signs made by a student wearing an electrode-studded glove were translated into text on a computer screen. It’s a promising new technology that may one day help the hearing impaired communicate more easily.
To read the announcement in full, go to: Texas Instruments Supports Innovative Research with New Lab.