SISA2015 holds two tutorial sessions for students and researchers, covering the following topics:
- Hardware Connectivity and Real-Time Processing with MATLAB and Simulink
- Developing Video Signal Processing Algorithms for Embedded Vision Systems
Each session is led by a leading researcher in the field; we believe the sessions will be interesting and useful for implementing smart-information systems.
All SISA2015 attendees are admitted free of charge.
(These tutorial sessions are held in cooperation with Chiba Institute of Technology.)
Date:
Aug. 26, 2015 (Wednesday)
Place:
Chiba Institute of Technology, Tsudanuma Campus, No. 6 Building, 1st Floor
Session 1: (15:30-16:30)
Title:
Hardware Connectivity and Real-Time Processing with MATLAB and Simulink
Speaker:
Yoshio Okita (Academic Technical Evangelist, MathWorks Japan)
Abstract:
In recent years, the maker movement has embraced high-performance, low-cost hardware, such as Arduino®, Raspberry Pi™, and Robot Operating System (ROS)-enabled robots. This hardware is also used in engineering education, specifically project-based learning. MathWorks has developed hardware support in MATLAB® and Simulink® to facilitate the development of teaching materials using this hardware. In this session, we introduce the MATLAB and Simulink hardware support that enables direct communication with, and embedded system deployment to, the target device. Input/output functions and device-specific functions (sensors, actuators) are provided as MATLAB functions or Simulink blocksets. Users can generate C code from these functions or blocksets without programming in a separate development environment, such as an SDK. This session also introduces real-time stream processing with System objects in MATLAB. Most real-time signal processing applications use stream processing, a memory-efficient technique for handling large amounts of data. Since System objects manage buffer updates and frames of streaming data automatically, just a few lines of MATLAB code are needed to implement stream processing on the target hardware device.
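As a rough illustration of the stream-processing idea mentioned in the abstract (not the presenter's own material), the sketch below chains two DSP System objects in a frame-by-frame loop. It assumes the DSP System Toolbox and Signal Processing Toolbox and runs on the host rather than on target hardware; the object and property names follow their documented APIs.

```matlab
% Minimal stream-processing sketch with System objects (assumes DSP System
% Toolbox and Signal Processing Toolbox; runs on the host, not on hardware).
src   = dsp.SineWave('Frequency',440,'SampleRate',8000,'SamplesPerFrame',256);
lpf   = dsp.FIRFilter('Numerator',fir1(32,0.25));    % 32nd-order low-pass FIR
scope = dsp.TimeScope('SampleRate',8000, ...
                      'TimeSpanSource','Property','TimeSpan',0.1);

for k = 1:100                % process 100 frames, one frame per iteration
    frame = step(src);       % the source object delivers the next frame
    y     = step(lpf,frame); % filter state is carried across frames for you
    step(scope,y);           % display the streamed output
end
release(scope);              % release resources held by the scope
```

The point, matching the abstract, is that buffering and state management live inside the System objects, so the processing loop itself stays at a few lines.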
Session 2: (16:40-17:40)
Title:
Developing Video Signal Processing Algorithms for Embedded Vision Systems
Speaker:
Shogo Muramatsu (Associate Professor, Niigata University)
Abstract:
This tutorial explains how to develop and evaluate video signal processing algorithms for embedded vision systems. Some recent single-board computers are reviewed, and MathWorks MATLAB/Simulink is introduced as a development environment. Using a gradient filter as an example, a series of development steps is demonstrated. First, a way of simulating an image processing algorithm is illustrated. Second, the algorithm is extended to a System object, which can be used for code generation targeting an embedded system. The MATLAB unit testing framework is also covered. Then, it is explained how to use the System object to model video stream processing, and it is shown that the model can be deployed to a Raspberry Pi 2 in a standalone manner. Some examples of educational and research activities using this approach are presented. Finally, a hardware/software co-design example is shown for developing a system on the Xilinx Zynq-7000 programmable SoC, which integrates both an FPGA and a CPU.
EmbVision Tutorial: http://msiplab.eng.niigata-u.ac.jp/embvision/en/
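As a hedged sketch of the first step described in the abstract (simulating the gradient filter), the lines below compute a Sobel gradient magnitude on a shipped sample image; they assume the Image Processing Toolbox and are illustrative only, not the tutorial's own code.

```matlab
% Gradient-filter simulation sketch (assumes Image Processing Toolbox).
img  = im2double(rgb2gray(imread('peppers.png')));  % shipped sample image
gmag = imgradient(img,'sobel');                     % Sobel gradient magnitude
imshowpair(img,gmag,'montage');                     % compare input and result
```

The second step, wrapping the algorithm as a System object so it can be reused in a streaming model, might look roughly like the class below. The class name GradientFilter is hypothetical; matlab.System and stepImpl are the documented extension points.

```matlab
% Hypothetical System object wrapping the gradient filter
% (save as GradientFilter.m in its own file).
classdef GradientFilter < matlab.System
    methods (Access = protected)
        function gmag = stepImpl(~,frame)
            % Per-frame processing: Sobel gradient magnitude of one frame
            gmag = imgradient(frame,'sobel');
        end
    end
end
```

An object of this class is then driven frame by frame, e.g. `f = GradientFilter(); y = step(f, frame);`. Unit testing, Raspberry Pi 2 deployment, and the Zynq-7000 co-design flow are covered in the session itself and on the EmbVision page linked above.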