Session Abstracts

December 2nd

Building End-to-End Machine Learning Workflows with Arm

Cloud Service with Austin Blackstone | Developer Evangelist at Arm

December 2nd | 10:40am - 12:10pm

Gateway Solution with Wei Xiao and Gian Marco Iodice | AI Ecosystem and Staff ML Software Engineer for the Machine Learning Group at Arm

December 2nd | 1:00pm - 3:50pm

Machine learning workflows consist of tasks such as data collection and aggregation, model training, evaluation, fine-tuning, and deployment. Arm not only provides processors that run complex ML workloads locally, but also SDKs and tools that enable rapid application development and optimization, as well as services that orchestrate and automate end-to-end ML tasks.

In this series of ML end-to-end workshops, we are going to introduce how to use Pelion Device Management and Treasure Data to train, validate and deploy your ML models (cloud service workshop), run and optimize ML algorithms on CPUs, GPUs, and NPUs with Arm NN and Arm Compute Library (gateway solution workshop), and use ML frameworks to push machine smarts to the tiniest Arm MCUs (endpoint solution workshop).  

A warming climate and population sprawl threaten California's future with more destructive wildfires. Machine learning could help combat blazes, aid in recovery, and prevent wildfires from starting in the first place. In the gateway workshop, we are going to show how you can use the Arm NN SDK and Arm Compute Library to build a fire prediction and detection system.

Scalable ML acceleration with ONNX Runtime

Manash Goswami | Principal Program Manager at Microsoft

December 2nd | 10:40am - 11:10am

This session demonstrates using the Arm64-based Jetson Nano to execute ONNX models with the ONNX Runtime inference engine. ONNX Runtime is an open-source inference engine for ONNX models. ONNX Runtime Execution Providers (EPs) enable the execution of any ONNX model through a single set of inference APIs that provide access to the best hardware acceleration available.

In simple terms, developers no longer need to worry about the nuances of hardware-specific custom libraries to accelerate their machine learning models. This technical session will demonstrate that by running the same code on different hardware platforms, using each platform's AI acceleration libraries for optimized execution of the ONNX model.
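Conceptually, the Execution Provider list is a priority-ordered fallback chain: each operator in the model graph is assigned to the first provider that supports it, with the CPU as the universal fallback. A toy sketch of that idea in plain Python (illustrative only; the provider and op names are made up, and the real API is `onnxruntime.InferenceSession(..., providers=[...])`):

```python
# Toy model of the Execution Provider (EP) priority idea: providers are
# tried in order, and each graph node runs on the first one that
# supports it. Not the actual ONNX Runtime API.

class Provider:
    def __init__(self, name, supported_ops):
        self.name = name
        self.supported_ops = supported_ops

def assign_nodes(graph_ops, providers):
    """Map each graph op to the highest-priority provider that supports it."""
    placement = {}
    for op in graph_ops:
        for p in providers:
            if op in p.supported_ops:
                placement[op] = p.name
                break
        else:
            raise RuntimeError(f"no provider supports op {op!r}")
    return placement

# Accelerator first, CPU as the universal fallback.
providers = [
    Provider("AcceleratorEP", {"Conv", "MatMul"}),
    Provider("CPUExecutionProvider", {"Conv", "MatMul", "Softmax", "Add"}),
]

placement = assign_nodes(["Conv", "Add", "MatMul", "Softmax"], providers)
print(placement)
# Conv and MatMul land on the accelerator; Add and Softmax fall back to CPU.
```

The same model therefore runs unmodified whether or not an accelerator EP is present, which is the portability point the session makes.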

TinyML Application Development for Everyone

Sandeep Mistry | Senior Software Engineer at Arduino

December 2nd | 10:40am - 12:10pm

Step through sensor data capture, training, and model deployment to build an ML application based on data you'll sample yourself in the session. You'll learn how to use your gestures to train a classifier in TensorFlow and then deploy it to an Arm Cortex-M board running Mbed OS.

This easy-to-follow workshop focuses on gesture recognition - but the sensor sources onboard the hardware provided include accelerometer, gyroscope, color, ambient light, proximity, temperature, humidity, and pressure. You can be as creative and ambitious with your TinyML applications as you like!
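The capture-train-classify loop described above can be sketched with a much simpler stand-in than the TensorFlow model the workshop builds: a nearest-centroid classifier over fixed-length accelerometer windows. All data below is synthetic and the gesture names are invented, purely to show the shape of the flow:

```python
import math

# Minimal stand-in for the train-then-classify flow (the workshop itself
# uses TensorFlow): nearest-centroid classification of fixed-length
# accelerometer windows. Data is synthetic, for illustration only.

def centroid(windows):
    """Element-wise mean of equal-length sample windows."""
    n = len(windows)
    return [sum(w[i] for w in windows) / n for i in range(len(windows[0]))]

def classify(window, centroids):
    """Return the label whose centroid is nearest in Euclidean distance."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(centroids, key=lambda label: dist(window, centroids[label]))

# "Training": average a few captured windows per gesture.
train = {
    "flick":  [[0.9, 1.1, 0.2, 0.1], [1.1, 0.9, 0.1, 0.2]],
    "circle": [[0.1, 0.2, 1.0, 0.9], [0.2, 0.1, 0.9, 1.1]],
}
centroids = {label: centroid(ws) for label, ws in train.items()}

# "Deployment": classify a freshly sampled window.
print(classify([1.0, 1.0, 0.15, 0.1], centroids))  # flick
```

A neural network replaces the centroid distance with a learned decision boundary, but the capture, train, and deploy stages are the same ones you will walk through in the session.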

Running AI/Neural networks on microcontrollers made simple with STM32Cube.AI

Markus Mayr | Product Marketing Manager at STMicroelectronics

December 2nd | 11:10am - 11:40am

This session will focus on the STM32Cube.AI software tool and its ecosystem. The STM32Cube.AI toolbox generates optimized code to run neural networks on STM32 Arm Cortex-M based microcontrollers. It brings AI to microcontroller-powered intelligent devices at the edge, on the nodes, and to deeply embedded devices across IoT, smart building, industrial, and medical applications. With STM32Cube.AI, developers can convert pre-trained neural networks into C code that calls functions in optimized libraries that run on STM32 MCUs.
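What "convert a trained network into code" boils down to, conceptually, is that the tool freezes the trained weights into constants and emits per-layer calls into optimized kernels. A plain-Python sketch of one frozen dense layer (illustrative only; STM32Cube.AI emits C against its own optimized libraries, and these weights are made up):

```python
# Conceptual sketch of a converted layer: weights frozen at conversion
# time, inference reduced to a call per layer. (Illustrative; the real
# tool emits C code calling optimized STM32 kernels.)

# Tiny 2-input, 2-output dense layer, weights baked in as constants.
W = [[0.5, -0.2],
     [0.1,  0.8]]
B = [0.0, 0.1]

def dense_relu(x):
    """One fully connected layer followed by ReLU."""
    out = []
    for row, b in zip(W, B):
        acc = b + sum(w * xi for w, xi in zip(row, x))
        out.append(max(0.0, acc))  # ReLU clamps negatives to zero
    return out

print(dense_relu([1.0, 2.0]))
```

Because the weights are compile-time constants, the generated code needs no model file, file system, or interpreter at runtime, which is what makes the approach fit on an MCU.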

Let’s Talk Do It Yourself Autonomous Cars

Chris Anderson | CEO at 3D Robotics

Rahul Ravikumar | DIY Robocars Community Member and Software Engineer at Google

December 2nd | 11:10am - 12:10pm

Ever thought of building an autonomous car? What about racing one? This is your chance to join Chris Anderson, the creator of DIY Robocars, one of the largest autonomous race car communities in the world. These might be small and inexpensive cars, but don't underestimate them: they can run a full suite of self-driving car software. Chris will be joined by Rahul Ravikumar, an active community member, to discuss the pros and cons of the various techniques used by DIY Robocars competitors: behavioural cloning, reinforcement learning, simple supervision, computer vision, SLAM, etc.

Growing Trends or Waste of Time? What 16,000 Developers Told Us About Emerging Tech.

Moschoula Kramvousanou | Head of Client Relations at SlashData

December 2nd | 11:10am - 12:10pm

Developers' interest in new technologies and their eventual adoption is not a linear progression. It can accelerate or stagnate depending on many factors from a new innovation's complexity and availability to its potential to make a global impact.

SlashData's Developer Economics surveys are a key source for the tech community to learn where developers are going and how to better serve them. This discussion will present developer adoption trends for emerging topics like: Quantum Computing, Self-driving Cars, Robotics, Computer Vision, Biometrics, Drones, Blockchain applications, Conversational platforms, Cryptocurrencies, Fog/Edge Computing, Mini Apps, and DevOps. We will dig into what could be blocking or driving each trend, and what's next.

Building Noise-Immune Speech Interfaces for IoT

Chris Rowen | CEO at BabbleLabs

December 2nd | 1:00pm - 1:30pm

IoT devices span a huge range of tasks and applications. Some are so remote or distributed that people rarely see them or interact with them. But many others monitor or control services that people use every day. Making those interactions hands-free, robust, and intuitive naturally begs for speech interfaces. Unfortunately, today's speech UIs face a number of limitations:
· they work quite poorly in noisy environments;
· they often rely on continuous cloud connectivity, which can compromise 24/7/365 availability and data privacy;
· they require significant computing resources, either locally or in the cloud, making low-cost and low-power operation difficult;
· long response latency precludes their use in many high-interaction and mission-critical control tasks.

BabbleLabs has taken key lessons from its deep learning speech enhancement technology to build a new configuration and runtime software solution for optimized speech interfaces with modest-sized vocabularies, typically up to about 100 short phrases per application. This class of UIs is increasingly important across industrial, consumer, and automotive command-and-control systems. The Clear Command software subsystems run on the Arm Cortex-M microcontroller line, and fit in a memory footprint of tens to hundreds of kilobytes for code, model, and working memory combined. Despite tiny compute budgets, these neural networks can extract far-field speech from the background with remarkable accuracy, achieving phrase error rates more than 4x better than cloud-based speech recognition systems under the same noisy conditions.

Privacy-Focused Voice AI in Intelligent Robotics

Samreen Islam | Community Manager at MATRIX Labs
Carlos Chacin | Software Developer at MATRIX Labs
Alfred Gonzalez-Cuzan | Innovation Manager at MATRIX Labs

December 2nd | 1:00pm - 2:30pm

We are at a point in time when voice AI is integrating seamlessly into our lives. We have voice assistants in our homes, on the go through our phones, and increasingly embedded into our everyday devices: headphones, cars, and TVs, to name a few.

It should come as no surprise that voice-control for robots is also becoming increasingly popular due to its hands-free and potentially conversational nature, whether it be with a humanoid, robot arm, or autonomous rover. This workshop will go over how you can easily employ a privacy-focused voice assistant, Snips, with MATRIX devices, an edge & IoT development platform powered by Arm microcontrollers and Xilinx FPGAs, to quickly deploy an effective voice-enabled robot of your choice, complete with sensors, feedback loops, and motor control.


Attendees should have these on their computers before the workshop:
Node.js (a dependency for the Sam CLI tool)
Snips' Sam CLI tool (creates and manages Snips assistants on your Pi)
A registered account
An SFTP client (e.g. for Mac & Windows; Linux can mount in the file system)
A text editor of choice (we like Visual Studio Code)

Resources attendees can check out:
Documentation
GitHub Source Code
Hackster Project Page
Community Forum

Get Started with Drone Development on PX4

Jinger Zeng | Community & Partnership Manager at Auterion
Iain Galloway | Drone Program Lead at NXP Semiconductors

December 2nd | 1:00pm - 2:30pm


All attendees need to bring a laptop.

Please pre-install MAVSDK on your laptop.

Pelion Unified ID - The Key to IoT at Scale

Alan Tait | Director of Engineering at Arm

December 2nd | 1:30pm - 2:00pm

Driven by a vision of a trillion connected devices by 2035, Arm has been working with partners to ease the ways in which stakeholders can establish and use device identity. As an ever-increasing array of devices gets connected to the IoT, a device's identity across the connectivity, (device) lifecycle management, and application layers needs to be unified and simple to manage in order to acquire and use data for business outcomes. This is increasingly important as we move from 10s of devices to 100,000s of devices.

For years, SIM cards have provided a robust, trusted, and highly tested mechanism for secure identity for cellular-connected devices at scale, and they offer a perfect credential store. Arm Pelion, along with partners, is easing the management of identity and its context across device, account or organisational level, and application clouds. In this session, you'll work through how a Pelion SIM-connected device can today be configured very simply, in the field with no need for physical access, from power-on through to changing applications across the clouds of your choice.

WebThings by Mozilla

Kathy Giori | Sr Product Manager at Mozilla

December 2nd | 1:30pm - 2:30pm

Want to manage your own private smart home? Want your connected things to be interoperable across brands, securely accessible and controllable over the web? Come see how Arm processors are powerful enough to run your entire smart home on the edge, in your own home, no clouds required! We will demonstrate how you can run the WebThings Gateway on a Raspberry Pi (or in a Docker container on your favorite platform), and manage IoT devices that you build or buy. And yes, we'll demonstrate how to build your own "web things", in minutes, using open source WebThings framework libraries.
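Under the hood, every WebThings device advertises itself with a JSON "Thing Description" listing its properties; the framework libraries build and serve documents like this, plus the REST API behind them. A simplified description for a dimmable lamp (field names follow the general WebThings schema but are illustrative; check the framework docs for the exact contract):

```python
import json

# A simplified Web Thing Description for a dimmable lamp. The WebThings
# framework libraries generate and serve documents like this for you;
# the field names here are illustrative, not a normative schema.
lamp = {
    "title": "My Lamp",
    "@type": ["Light", "OnOffSwitch"],
    "properties": {
        "on": {
            "@type": "OnOffProperty",
            "type": "boolean",
            "title": "On/Off",
        },
        "brightness": {
            "@type": "BrightnessProperty",
            "type": "integer",
            "minimum": 0,
            "maximum": 100,
            "unit": "percent",
        },
    },
}

print(json.dumps(lamp, indent=2))
```

Because the gateway discovers capabilities from this self-description rather than from a vendor cloud, things you build yourself and things you buy can sit side by side in the same UI.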

Note: There will be Adafruit Circuit Playground Express and BBC micro:bit boards available to borrow for the hands-on portion of the BoF.

A Full Arm IoT Stack from Sensors to the Cloud

Carl Perry | Ecosystem Engineer at Packet
Rahul Ravikumar | Software Engineer at Google
Ed Murphey | Software Engineer at June Oven

December 2nd | 2:40pm - 3:10pm

Connecting trillions of devices will need a significant rethinking of how infrastructure is built and delivered, and the Arm Neoverse initiative was created to address those challenges.

Workloads are being brought back from the Cloud, and moving to the Edge, closer to the end users or to IoT endpoints, which improves the service delivery experience. miniNodes is building a complete demonstration of connected Cloud Servers, Edge Servers, and IoT Devices, running entirely on Arm. Environmental data will be captured by IoT endpoints running Arm Mbed, provisioned via Arm Pelion, feeding data to Edge servers, that will in turn connect to an Ampere eMAG server hosted by

We will talk through the infrastructure build, issues and challenges along the way, and potential use cases. Then we will open up to discussion, to solicit feedback and hopefully find solutions to the challenges faced.


Pete Warden | TensorFlow Lite Engineering Lead at Google

December 2nd | 2:40pm - 3:50pm

Join Pete Warden to discuss how to run machine learning on embedded systems. You will be able to discuss technologies like TensorFlow Lite Micro, CMSIS-NN, and other related topics.

This is a great opportunity to get your questions answered directly by the experts, and share your own knowledge with the community.

Agora – Develop, Deploy, and Maintain Devices Profitably

Garrett LoVerde | IoT Systems Engineer at Embedded Planet

December 2nd | 2:40pm - 3:50pm

Embedded Planet will introduce a general-purpose hardware platform and a guide to deploying application-specific IoT devices on budget and on time. Agora is a multi-connectivity, multi-sensor IoT development platform with seven on-board sensors that enables prototyping of Cellular, Bluetooth, and LoRaWAN applications.

Learn how to:
· Ease development with a versatile MCU-based IoT hardware platform driven by Mbed OS.
· Develop intelligent applications that transform environmental factors such as temperature, humidity, air quality, audio, proximity, and force into information.
· Discover the value this information adds for businesses.
· Go to production immediately with the preconfigured subset of the development kit that most closely meets your application's needs.
· Go to production at scale with a fully customized subset of the development kit that best matches your application.

The IoT Developer Journey - Understanding the Developer Experience in IoT When You Seem to Have All the Wrong Skills.

Bert Froeba | Principal Engineer at Arm

December 2nd | 2:40pm - 3:50pm

IoT accelerates the collision of embedded and web technologies. One moment you are building a server, looking at databases and JavaScript libraries; the next you're wrangling toolchains and C++ dependencies. Then you're getting to grips with deploying to a cloud service while trying to reduce the noise coming out of your device's sensors.

Come chat about the perils and pitfalls of getting into IoT development, and have an open discussion about how to make it easier for other developers and reduce the common problems we encounter every day.

Getting Started with a Self-Driving RC Car

Rahul Ravikumar | Software Engineer at Google

December 2nd | 3:10pm - 3:50pm

This talk explores Donkeycar, an open source, high-level self-driving library written in Python, with a special emphasis on the software and the algorithms being used. First, we will look at the anatomy of a self-driving RC car and do a quick run-through of the typical hardware used to build one. We will then dive into different algorithmic approaches to self-driving, including techniques from computer vision and deep learning. Finally, we will explore the available software stack and how you can customize your car and go beyond just getting things to work.
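The simplest of the computer-vision approaches the talk covers can be boiled down to: locate the lane marking in the camera image and steer proportionally to its offset from center. A toy sketch on a synthetic pixel row (the real Donkeycar pipelines, whether OpenCV-based or deep-learning-based, are far more involved):

```python
# Toy version of the simplest CV steering idea: find the lane marking
# in a pixel row and steer proportionally to its offset from center.
# (Conceptual only; real pipelines work on full frames with filtering.)

def steering_from_row(row):
    """Return steering in [-1, 1] from the brightest pixel's offset."""
    lane_x = row.index(max(row))          # column of the lane marking
    center = (len(row) - 1) / 2.0
    return (lane_x - center) / center     # negative means steer left

# Synthetic 9-pixel row: a bright lane marking right of center.
row = [10, 12, 11, 10, 13, 11, 250, 12, 10]
print(steering_from_row(row))             # 0.5, so steer right
```

Behavioural cloning replaces this hand-written rule with a network trained on your own driving, but the input (camera pixels) and output (a steering value) stay the same.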

This talk will help you get started on your own journey of building a self-driving RC car, and competing in races.

AIoT Dev Summit Community Reception

December 2nd | 5:00pm - 7:00pm

Join our Dev Summit community reception to unwind after a long day of workshops and talks and tuck into free food and drink! You'll get the opportunity to ask the panelists, speakers and attendees questions about what you have learnt throughout the Summit, plus you'll get the chance to learn more about how to get involved with community groups from local representatives. Join other developers with similar passions and work on important causes around diversity, sustainability, and common interests.

December 3rd

Arduino ML Station

Massimo Banzi | Co-founder and CTO at Arduino

December 2nd | 10:30am - 11:10am

December 2nd | 3:50pm - 5:00pm

Join the Arduino team to build your first tiny ML project using TensorFlow Lite Micro deployed to an Arm Cortex-M board running Mbed OS. The Arduino Nano 33 BLE Sense includes an accelerometer, gyroscope, and compass, plus color, ambient light, proximity, temperature, humidity, and pressure sensors. You can be as creative with your tiny ML applications as you like!

Building End-to-End Machine Learning Workflows with Arm

Endpoint Solution with Neil Tan and Austin Blackstone | Developer Evangelists at Arm

December 3rd | 10:40am - 12:10pm

Machine learning workflows consist of tasks such as data collection and aggregation, model training, evaluation, fine-tuning, and deployment. Arm not only provides processors that run complex ML workloads locally, but also SDKs and tools that enable rapid application development and optimization, as well as services that orchestrate and automate end-to-end ML tasks.

In this series of ML end-to-end workshops, we are going to introduce how to use Pelion Device Management and Treasure Data to train, validate and deploy your ML models (cloud service workshop), run and optimize ML algorithms on CPUs, GPUs, and NPUs with Arm NN and Arm Compute Library (gateway solution workshop), and use ML frameworks to push machine smarts to the tiniest Arm MCUs (endpoint solution workshop).

Extending Machine Learning to Industrial IoT Applications at The Edge with AWS

Ian Perez Ponce | Global Business Development IoT at Amazon Web Services

December 3rd | 10:40am - 11:10am

In this presentation, we will discuss real-world use-case trends that intersect Industrial IoT (IIoT) applications and the cloud, with an emphasis on how customers are using distributed computing with machine learning and analytics at the edge to accelerate innovation and time to value.

End-to-End Security with Arm-based Edge Devices & IoTeX Blockchain

Raullen Chai | CEO & Co-founder at IoTeX

Xinxin Fan | Head of Cryptography at IoTeX

December 3rd | 10:40am - 12:10pm

Trusted Tracker is a GPS + environmental tracking device, developed by IoTeX with Arm TrustZone and CryptoCell technology. We will provide Trusted Tracker devices and giveaways for the workshop.

1. Trusted Tracker Hardware Overview (~15 min)
Describe major hardware components of Trusted Tracker: Cortex-M33 processor with Arm TrustZone, Arm CryptoCell 310 security subsystem, Connectivity module (LTE-M, NB-IoT), and sensors.

2. IoTeX Blockchain Platform Overview (~15 min)
Introduce high-level design of IoTeX blockchain, including system architecture, trusted computing framework, smart contracts, and core APIs.

3. End-to-End Security Architecture (~45 min)
Showcase the end-to-end security architecture of Trusted Tracker.

Listen to your data: Swim builds AI models and predicts in real-time

Dr. Simon Crosby | CTO at

December 3rd | 11:10am - 11:40am

Dr. Crosby leads the open source Swim project. Swim automatically builds, runs, and manages scalable, distributed, intelligent dataflow pipelines that analyze, learn, and predict insights on the fly, and stream those insights to apps or to humans in real time. The talk will present the architecture, a demo, and use cases.

Swim automates building and operating intelligent dataflow pipelines that analyze, learn and predict directly from streaming data - without needing to store then analyze, or train then deploy models. It cuts costs using stateful computation and a distributed fabric that exploits lightweight Arm edge devices, fog and cloud resources. It makes it easy for anyone with streaming data to get deep insights – in real time.

Examples: Swim tracks the real-time state of every mobile handset for a mobile operator for real-time customer care. Each handset digital twin analyzes, learns and predicts from its own data. Swim is used for traffic prediction in US cities, for failure prediction in PCB assembly, for automation in aircraft assembly, for smart grid optimization, and security in Oil & Gas.
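The core pattern behind those per-handset digital twins is one small stateful object per data source, updated incrementally as events stream in, with no store-then-analyze step. A sketch of that pattern (illustrative plain Python, not Swim's API; the handset names and signal threshold are made up):

```python
# Core pattern behind per-entity "digital twins": one stateful object
# per data source, folded forward incrementally from its own stream.
# (Illustrative sketch, not Swim's API.)

class HandsetTwin:
    def __init__(self):
        self.count = 0
        self.mean_signal = 0.0

    def update(self, signal_dbm):
        """Incrementally fold one measurement into the running mean."""
        self.count += 1
        self.mean_signal += (signal_dbm - self.mean_signal) / self.count

    def degraded(self, threshold=-100.0):
        return self.count > 0 and self.mean_signal < threshold

twins = {}
stream = [("handset-a", -95.0), ("handset-b", -110.0),
          ("handset-a", -97.0), ("handset-b", -108.0)]

for handset_id, signal in stream:
    twins.setdefault(handset_id, HandsetTwin()).update(signal)

print({h: round(t.mean_signal, 1) for h, t in twins.items()})
# handset-b's twin flags degraded service; handset-a's does not.
```

Because each twin carries its own state, the work distributes naturally across edge, fog, and cloud nodes instead of funnelling raw events into one central store.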

Accelerate IoT Development with Docker

Jenny Fong | Sr. Director of Product Marketing at Docker

December 3rd | 11:10am - 12:10pm

Join us to learn to build IoT applications with the Docker platform and discover how Docker technology accelerates development of IoT workloads!

Together with Arm, we have released a number of new features to simplify the development process while balancing security and governance needs. We’ll talk about the latest features and best practices when developing with Docker tools specifically targeting IoT devices running Arm Cortex-A CPUs. As workloads move from cloud to edge, these tools are essential to manage your applications at scale.

Arduino Community Room

Massimo Banzi | CTO at Arduino
Dominic Pajak | VP Business Development at Arduino
Sandeep Mistry | Senior Software Engineer at Arduino

December 3rd | 11:10am - 12:10pm

Join the Arduino team to discuss and ask questions about the latest developments in IoT and Machine Learning.

From Concept to Production: Design and Optimize Enterprise Computer Vision Applications on Embedded Arm Devices

Dr. Nitin Gupta | VP Product at Dori AI

December 3rd | 11:40am - 12:10pm

Launching an embedded computer vision application doesn't stop with just a trained model; there are a number of steps and best practices that should be followed to have a successful launch and support future upgrades of your models/use cases. This talk will cover what to consider when trying to annotate, train, optimize, benchmark, deploy, and monitor a computer vision ML application for embedded Arm devices.

CI in IoT/Embedded: Creating a Simple, Scalable and Automated Workflow

Charles McCann | Solutions Architect at Arm

December 3rd | 1:00pm - 1:30pm

IoT/embedded software developers are being asked to do more and more, especially in small/medium-sized companies. Beyond developing software, the roles of device management, test generation, and application deployment (to name a few) are moving under the 'software developer' job scope. The creation and automation of software testing and delivery, historically a dev-ops role, now often adds complexity to the developer's work.

This talk aims to simplify creating, maintaining, and scaling a quality continuous integration (CI) pipeline. Both software engineers and dev-ops professionals working with IoT/embedded devices will learn realistic best-practices for CI/CD in software development. Two use-cases will be highlighted: one regarding rich IoT (a Linux-based app), the other constrained IoT (a bare-metal-based app). The steps to set up software tests, testing environments, automated testing pipelines, and delivery mechanisms will be discussed through practical, reproducible examples.

Secure IoT With Microchip and Kinibi-M

Richard Hayton | CTO at Trustonic

December 3rd | 1:00pm - 2:30pm

In this session you will learn how to program a Microchip SAML11 microcontroller to generate secure messages that a server/cloud can validate came from your device, and can decrypt and display.

A SAM L11 Xplained Pro board will be provided (which you can take away), and the session will cover an introduction to Kinibi-M and the steps required to write the secure on-device code that talks to your PC or a provided cloud service.
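The pattern underneath "messages the cloud can validate came from your device" is message authentication: the device signs each payload with a key only it holds, and the server verifies with its copy. A conceptual sketch with Python's stdlib `hmac` (on the SAML11 the key lives inside the TrustZone-protected environment rather than in application code, and Kinibi-M's actual API differs):

```python
import hashlib
import hmac
import json

# Conceptual sketch of authenticated device messages: sign with a
# per-device secret, verify server-side. On the SAML11 the secret is
# held in the TrustZone-protected environment, not in plain code.

DEVICE_KEY = b"provisioned-per-device-secret"

def sign(payload: dict) -> dict:
    """Device side: attach an HMAC-SHA256 tag to the payload."""
    body = json.dumps(payload, sort_keys=True).encode()
    tag = hmac.new(DEVICE_KEY, body, hashlib.sha256).hexdigest()
    return {"body": body.decode(), "tag": tag}

def verify(message: dict) -> bool:
    """Server side: recompute the tag and compare in constant time."""
    expected = hmac.new(DEVICE_KEY, message["body"].encode(),
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, message["tag"])

msg = sign({"temp_c": 21.5, "device": "saml11-001"})
print(verify(msg))                                     # True
msg["body"] = msg["body"].replace("21.5", "99.9")
print(verify(msg))                                     # False: tampered
```

Keeping the key out of the normal application image, which is what the secure environment provides, is the difference between this sketch and a production design.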

Unlock the Potential of IoT Security with Microsoft's Azure Sphere

Brian Willess | Technical Solutions Engineer at Avnet

December 3rd | 1:00pm - 2:30pm

As billions of new devices are connected, organizations need to secure them to help protect data, privacy, physical safety, and infrastructure.

Join our hands-on Azure Sphere workshop and learn how this new security solution integrates hardware, software, and cloud to provide a turnkey solution for IoT devices.

Get to Know Your Users – Learning How to Build User Personas on Your Own

Wen Chou | Senior UX Research Specialist at Arm

December 3rd | 1:00pm - 2:00pm

Ask yourself: do you really understand your users? How well do you know them? Knowing your users' behaviour and their needs before you start development can determine the success of your product.

Come along to learn tips and tricks for building personas in this 1-hour hands-on workshop.

The Future of SBCs

Maurizio Caporali | IoT CTO and UDOO Product Manager at SECO

December 3rd | 1:00pm - 2:00pm

Join the UDOO team for a chance to hear about our plans and help us define the next killer Arm-based single board computer (SBC). This is a chance for you to discuss what you would like to see on the next UDOO developer board. Do you have a use case or an application that needs a powerful and reliable Arm-based developer platform? Do you have specifications for your next proof of concept that are not currently met by the available SBCs out there? UDOO is designed for developers like you; join our conversation and let's design it together!

Building Robots Together - Online, Hands-on and Face to Face

Steve Okay | Systems Integrator & Freelance Roboticist

Jillian Ogle | Roboticist in Residence at Circuit Launch

Leah Lucas | Director Education & Engagement at Circuit Launch

December 3rd | 1:00pm - 2:00pm

Is your cupboard full of half-finished hardware projects? Did you start a self-driving MOOC only to drop out halfway? In our experience, hardware isn't that hard when you build with a community. Let's share stories about what works, and what doesn't. Bring your own half-finished demos and meet the Silicon Valley Robotics' 'Roboticists in Residence' Steve Okay and Jillian Ogle, and Leah Lucas from Circuit Launch, the cowork space for hardware.

Innovator Sessions | OpenMV Cam H7 Pro

Kwabena Agyeman | President at OpenMV

December 3rd | 1:00pm - 2:30pm

We'll have TensorFlow Lite for Microcontrollers running on the OpenMV Cam H7 with a person detector model running onboard. Grab an OpenMV Cam today and get started with TensorFlow on the edge!

Next Generation Machine Learning for Mobile and Embedded Platforms

Dr. Rajen Bhatt | Director of Engineering, Machine Learning and Data Science at Qeexo

December 3rd | 1:30pm - 2:00pm

Most companies are only thinking about heavyweight machine learning: huge cloud servers with big cost, big latency, and big privacy implications. At Qeexo, we think about the billions of small, microcontroller-powered devices that occupy our world and would benefit from intelligence. We’ve spent the last five years developing a lightweight machine learning platform and embedded engine that enables radical new capabilities on device categories previously overlooked.

Typically, on such constrained platforms, one must choose between latency, model size or classification accuracy, but this need not be the case with the right tools and algorithms. You will learn commercial uses of Qeexo’s engine, and the processes and suite of tools we used to deliver new interactive experiences to hundreds.

AI Workflow for Large Scale Deployment of Far-Edge ML devices

Kabir Manghnani | ML Platform Engineer at Shoreline IoT

Mark Stubbs | Principal Architect at Shoreline IoT

December 3rd | 2:00pm - 2:30pm

As ML on the far edge begins to reshape traditional notions of asset management, many questions remain unanswered regarding how specialized machine intelligence can be brought to users unfamiliar with the technology.

In this talk we discuss a set of novel ML workflows that take advantage of containerized cloud computing to allow for simultaneous training and deployment of millions of ML models trained specifically for the sensors they operate on.

Such workflows allow ML engineers to easily develop, train, and deploy models onto many far-edge devices (without writing embedded C/C++ code), as well as open the door to adaptive models that don't ask users to fine-tune them for their applications.

How We Got Rid of Passwords in IoT

Carel Grove | Technical Trainer at Arm

December 3rd | 2:40pm - 3:10pm

One of the key issues in IoT is security, and the identity management system in Pelion Device Management provides a strong solution for ensuring the security of our devices.

We will present a session that illustrates where all the certificates and keys that end up on a Pelion IoT device originate, and their journey onto the device in the factory. We have presented this material in Arm internal training, and it has been very popular.

Representing it visually makes it possible to quickly grasp how certificates in factory provisioning for Pelion devices work. Until we created this material, factory provisioning of certificates was understood by the team who built it, but not that clearly within ISG or the rest of Arm.

It is relevant to the wider community because it is the bedrock of our IoT security differentiator. It is what we do to completely get rid of passwords in IoT.

Bringing your idea to life

Alex Glow | Hardware Nerd at

December 3rd | 2:40pm - 3:40pm

Let's chat about going from an idea to a product prototype! We have some great tips for turning your product designs into real objects, we can help you with prototyping enclosures, PCBs and much more. Come ready to share your favorite tools, services, and materials that make prototyping a snap.

ROS Community Best Practices

Katherine Scott | Developer Advocate at Open Robotics

December 3rd | 2:40pm - 3:40pm

In this BoF session we'll talk about the emergent ROS hardware world and discuss best practices for creating open source packages for robotics hardware and software with ROS 1 and ROS 2.

Build Your Own Harry Potter Wand with TensorFlow Lite Micro

Pete Warden | TensorFlow Lite engineering lead at Google
Kirk Benell | CTO at SparkFun
Rob Reynolds | Creative Technologist at SparkFun
Scott Hanson | CTO and Founder at Ambiq Micro
Arpit Shah | Director of Technology and Partner Enablement at Ambiq Micro

December 3rd | 2:40pm - 5:00 pm

This is an experiential workshop that focuses on the use of TensorFlow Lite on a low-power microcontroller to perform machine learning. Modalities covered include word recognition in speech; gesture recognition using accelerometer values; and human presence detection using imagery.

The workshop will cover machine learning model training, deployment, and operation, with a majority of time spent on a gesture-recognition activity, “Magic Wand”, that is based on the content of Pete Warden’s forthcoming book, TinyML: Machine Learning with TensorFlow on Arduino and Ultra-Low-Power Microcontrollers.

All example applications will use the SparkFun Edge board, which integrates Ambiq Micro's Apollo3 Blue Arm Cortex-M4F microcontroller, microphones, camera access, and on-board low-power management to deliver a complete package that is spec'd to run TensorFlow Lite using only 6 µA/MHz. The SparkFun Edge board can run solely on a CR2032 coin cell battery for up to 10 days.

Containers in an IoT World - How Docker Can Make It Easier to Deploy Your Application on an Edge Device

Jenny Fong | Sr. Director of Product Marketing at Docker
Maurizio Caporali | IoT CTO and UDOO Product Manager at SECO
Alessandro Genovese | IoT and UDOO Software Engineer at SECO

December 3rd | 2:40pm - 5:00 pm

The growth of Arm-based devices in recent years is exponential. Last year alone, Arm and its partners shipped 23 billion Arm processors. With such growth in Arm devices, IoT and embedded software developers are looking for easier and more scalable ways of creating portable code that can be deployed easily on different platforms while reducing time to market.

With the partnership between Arm and Docker, it is now possible to easily build and deploy containerized applications everywhere from edge to cloud. This allows software developers to scale from a PoC to a manageable and robust IoT solution.

Developers will gain hands-on experience in developing and deploying a reusable Docker application on an IoT device. They will also learn about specific benefits containerization has over more classical software development approaches.

Bootstrapping Edge Infrastructure for AIOT Applications with Open Source Software

Galem Kayo | Product Manager for IoT at Canonical

December 3rd | 3:10pm - 3:40pm

Deploying AI-powered IoT applications at the edge can lower latency while increasing privacy and autonomy. However, access to infrastructure is a prerequisite to such private deployments. We believe the pace of innovation will be accelerated in a world where developers can bootstrap AIoT infrastructure with COTS hardware and open-source software. Consequently, in this session, we will be presenting and demonstrating open source software primitives for AIoT deployments at the edge. We will explore edge cloud setups with MicroStack, AI workload orchestration with MicroK8s, simple edge AI/ML pipeline creation with Kubeflow, gateway and IoT device provisioning with MAAS, and embedded application lifecycle management with Ubuntu Core.

Why AI is Being Adopted on the Edge

Amit Goel | Sr. Product Manager Autonomous Machines at Nvidia

December 3rd | 3:40pm - 4:10pm

We are entering a new era of software-defined autonomous machines. Tomorrow's autonomous machines are being defined by AI software, an approach that gives them a new level of capability and robustness that could not be achieved in the past. The Jetson platform's embedded system-on-module (SoM), powered by the Arm CPU architecture along with state-of-the-art Nvidia GPUs, makes it possible to build these autonomous machines, from smart cameras to industrial robots. We will discuss the industry trends that are driving adoption of AI at the edge.

Robot Operating System (ROS): Current and Future Capabilities on Embedded Systems

Katherine Scott | Developer Advocate, Open Robotics

December 3rd | 4:10pm - 5:00pm

Robot Operating System (ROS) is a collection of free and open-source software packages used by a large and growing developer community to build, simulate, and test robotic systems. While ROS is generally thought of as a high-level API for robot construction there is now a growing community of developers bringing ROS compatibility to embedded systems. This talk will cover the current embedded system capabilities in ROS 1 and the future of these systems in ROS 2.

This talk will demonstrate some of the core primitives of ROS, how they can be implemented on embedded devices, and how this approach can create re-usable hardware in an open-source robot ecosystem.
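The central primitive the talk demonstrates is the topic: anonymous publish/subscribe that decouples the nodes producing data from the nodes consuming it. A toy in-process version of that idea (conceptual only; real ROS nodes use rclpy/rclcpp and communicate across processes and machines, and the topic name and threshold here are invented):

```python
from collections import defaultdict

# Toy in-process version of ROS's core primitive, the topic: publishers
# and subscribers never reference each other, only a named channel.
# (Conceptual; real ROS uses rclpy/rclcpp over a network middleware.)

class TopicBus:
    def __init__(self):
        self._subs = defaultdict(list)

    def subscribe(self, topic, callback):
        self._subs[topic].append(callback)

    def publish(self, topic, msg):
        for cb in self._subs[topic]:
            cb(msg)

bus = TopicBus()
log = []

# A "node" reacting to range readings from a distance sensor.
bus.subscribe("/scan", lambda r: log.append("STOP" if r < 0.5 else "GO"))

for reading in (2.0, 1.1, 0.3):   # a simulated sensor "node" publishing
    bus.publish("/scan", reading)

print(log)   # ['GO', 'GO', 'STOP']
```

This decoupling is exactly what lets an embedded sensor board and a desktop planner swap in and out of the same robot graph, which is the re-usable-hardware point the talk makes.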

Copyright © 2019 Arm - All Rights Reserved.

This event is brought to you by Arm. Arm technology is at the heart of the computing and connectivity revolution that is transforming the way people live and businesses operate. From the unmissable to the invisible, Arm's advanced, energy-efficient processor designs are enabling the intelligence in more than 150 billion silicon chips and securely powering products from the sensor to the smartphone to the supercomputer. Learn more about Arm, by visiting our developer website: