Accessathon - Accessibility Hackathon at Rhine-Waal University

How do I participate?

First of all - hurry up and register, spots are limited! Then BYOD - bring your own device - and we will take care of the rest! There will be snacks, drinks and the occasional pizza delivery, so that you can focus on your projects. Of course, there will also be enough time for relaxed networking and a lot of fun in between! You can use state-of-the-art technology or pen and paper to work on your ideas. Feel free to explore something new during the Hackathon! Do you already have a project idea? Share your thoughts with us during registration and we will propose your project to the participants!

This time we're introducing something new: an extra presentation and competition day on 16 June! Finalize your project idea, present it to the audience (students, representatives from companies and associations, and anyone else who's interested) and to the jury, and win cool prizes. The jury will consist of experts from inside and outside the university, and the best three teams will be rewarded. In addition to the presentations in front of the audience, every team can present their project at its own booth, so you have the chance to network with anyone who might be interested in your idea.

Agenda

Speaker | Topic / Description | Time

The Accessathon Team

Welcome! Nice to meet you!
Check-in / Registration for the Hackathon.

15:00 - 16:00

The Accessathon Team

Quick Intro

Why are we here? What happened before? Impressions of the last two accessibility hackathons

16:00 - 16:10

Nils Beinke

MakersHelpCare

Nils Beinke is a special education teacher with a 3D printer. In his spare time he designs DIY solutions for his students with special needs. On his blog MakersHelpCare.de he shows his designs and other cool stuff.

16:10 - 16:30

Prof. Dr. Kai Essig

Rhine-Waal University

ADAMAAS - Individual assistance with Augmented Reality

Recently, different stationary and mobile assistive systems have been developed, such as smart glasses (e.g., Google Glass, Microsoft HoloLens), head-mounted displays, virtual reality devices (e.g., Oculus Rift, Samsung Gear VR, HTC Vive) and mobile eye and motion tracking systems. These systems provide new possibilities for recording, analyzing and optimizing people's performance and can therefore offer individualized support, coaching or assistance in various application fields, such as working and training environments, human-machine interaction, speech and action assistance, and the use of household appliances.

ADAMAAS (Adaptive and Mobile Action Assistance System) can be applied to support elderly people and people with or without disabilities in everyday scenarios and rehabilitation. ADAMAAS combines techniques from memory research, eye tracking, vital parameter measurement (such as pulse or heart rate), object and action recognition (computer vision), and Augmented Reality (AR) with modern diagnostics and corrective intervention techniques. The system is able to identify problems in ongoing action processes, to react when mistakes are made, and to display situation- and context-dependent assistance in textual, visual or avatar-based form, superimposed on a transparent virtual plane in the user's field of view.

Kai Essig graduated in 1998 in Computer Science and Chemistry (M.Sc.) at Bielefeld University, Germany. He joined the Neuroinformatics Group at Bielefeld University in 1998 and received a Ph.D. in Computer Science in 2007. From 2008 he worked in the Neurocognition and Action - Biomechanics Research Group at the Faculty of Psychology and Sport Sciences, as well as at the Center of Excellence "Cognitive Interaction Technology" (CITEC) at Bielefeld University. Since September 2017 he has been Professor of Human Factors, Intelligent Systems at the Faculty of Communication and Environment at Rhine-Waal University.

16:30 - 16:50

Adriana Cabrera

Researcher / Teacher
Rhine-Waal University

Soft and Wearable Electronics

Adriana Cabrera is one of the leading researchers in the field of experimental wearables design. Working together with makers from all over the world she provides great opportunities for students of the Rhine-Waal University to learn and create beautiful and practical things. Among other amazing projects, she has recently taught an intensive interdisciplinary Academy course on personal fabrication for care. Enjoy the review of the impressive course results to inspire your creativity during the Accessathon.

16:50 - 17:10

TBA

Creating a World of Accessible Digital Contents

The internet is full of visual content that is only accessible to people with full sight. Alternative texts describing the visual content provide information for people with visual impairments. Unfortunately, a lot of pictures on the web don't have alt texts - not yet, at least.
A team of HSRW students developed a browser plugin that uses a learning algorithm to take blind users' browsing experience to a whole new level: it adds automatically generated descriptions to pictures that lack them and uses input from the sighted community to improve its own accuracy.

17:10 - 17:30

Prof. William Megill

Rhine-Waal University

Something smart you can wear every day

Professor Megill is both an engineer and a biologist. This combination, his all-round experience, his endless curiosity, charisma and dedication to students' projects have made him a highly valuable asset to Rhine-Waal University. Before that, he spent two years developing something very practical and sophisticated. You can expect to be entertained and to learn a lot of interesting things about the world of smart underwear.

17:30 - 18:00

The Accessathon Team

The Hackers' Tools

There is a wide range of software and hardware tools and techniques we can use to create something new or improve something old. A brief introduction to the tools available to you during the weekend.

18:15 - 18:30

Abdul Saboor

BCI Lab
Rhine-Waal University

Hybrid Brain-Computer Interface (BCI) to assist people with disabilities

Abdul Saboor is a researcher at Rhine-Waal University and works in the BCI Lab at the Faculty of Technology and Bionics. Reading signals directly from the brain can help people with severe disabilities to communicate and to control smart environments.

18:30 - 19:00

You!

Pitches

If you have a project idea you want to share, or you are looking for contributors, this is the right time and place to do it.

19:00 - 19:30

Christian Silva

Special Guest

Every Child can be a SuperHero

Christian Silva is a skilled mechatronics engineer from Bogotá, Colombia. With his public benefit startup he has produced and donated dozens of beautiful, empowering and personalized prostheses to children from low-income families. We are absolutely thrilled that he could join the third Accessathon to share his experience.

TBA

Stomach rumble

Pizza and networking

It is hard to think on an empty stomach. Time for a snack and a good chance to meet and greet the other hackers!

19:30 - 23:00

End of day one

23:00

Registration

Register now, capacity is limited - first come, first served.

We will reach out to you to discuss everything in detail.

Location

Rhine-Waal University of Applied Sciences
Friedrich-Heinrich-Allee 25
47475 Kamp-Lintfort

How to get here

Results from the first two Accessathons

Here you can find some of the abstracts of the projects participants worked on during the first Accessathon in June 2017. In the meantime, we're working hard on a platform to publish detailed project documentation.

FeyeND: Assisted Living



Team members:
  • Alexander Duseti
  • Anoshan Indreswaran
  • Arindam Mahanta
  • Sanchay Cholkar
Functionality:
  • Ask the app to “FeyeND” something
  • Point camera in any direction and click an image
  • App tells you if your object is in that direction
  • Guides user to the object (planned)

The idea behind FeyeND is to assist visually challenged people in locating things in their surroundings. It is an interactive, audio-based app that uses natural language processing and computer vision. You tell the app what you are looking for, and it uses the camera to sense the surroundings. It sends the voice command to a language-processing API to identify which object the user is looking for, sends the image to the Microsoft Computer Vision API to detect the objects in the image, checks whether the requested object is among the objects found by the CV API, and speaks out the response. That is the progress so far.
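For illustration only, here is a minimal Python sketch of the pipeline described above. The Azure Computer Vision analyze endpoint, the placeholder key, and the extract_target() helper are assumptions made for this sketch, not the team's actual code; speech input and output are left out.

```python
# Minimal sketch of the FeyeND pipeline: voice command -> target object,
# camera image -> detected objects, then a spoken-style yes/no answer.
# Endpoint, key, and helper functions are illustrative assumptions.
import requests

VISION_URL = "https://<region>.api.cognitive.microsoft.com/vision/v3.2/analyze"
VISION_KEY = "<subscription-key>"  # placeholder


def extract_target(command_text: str) -> str:
    """Naive stand-in for the language-processing API:
    take the last word of a command like 'FeyeND my cup'."""
    return command_text.lower().split()[-1]


def detect_objects(image_bytes: bytes) -> list[str]:
    """Send the camera frame to the Computer Vision API and return tag names."""
    response = requests.post(
        VISION_URL,
        params={"visualFeatures": "Tags"},
        headers={
            "Ocp-Apim-Subscription-Key": VISION_KEY,
            "Content-Type": "application/octet-stream",
        },
        data=image_bytes,
    )
    response.raise_for_status()
    return [tag["name"] for tag in response.json().get("tags", [])]


def feyend(command_text: str, image_bytes: bytes) -> str:
    """Check whether the requested object is among the detected ones."""
    target = extract_target(command_text)
    if target in detect_objects(image_bytes):
        return f"Yes, I can see a {target} in that direction."
    return f"I cannot see a {target} here."


# Example (with a real key and image, the returned text would be read aloud):
# with open("frame.jpg", "rb") as f:
#     print(feyend("FeyeND my cup", f.read()))
```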

Portable Music Player For Blind Users


(A device, not to be confused with a mobile app)

Team members:
  • Jaideep Singh Champawat

Technology has developed to the point where you can do almost anything with your smartphone. However, smartphone interfaces are not very usable for blind and visually impaired people, and the user experience for a blind user is far from enjoyable. Starting from this awareness, I was motivated to develop a music player for blind people. A little research reveals how visually impaired people struggle with current devices, how they need greater independence and freedom, and how there isn't any feature-rich music player for them that supports cloud services like Spotify. To fill this gap, I presented the concept of a portable music player that provides the navigation features essential in a music player by means of tactile buttons on the device.
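The abstract describes the concept rather than the internals, but the core idea - a few tactile buttons covering the essential navigation actions - can be sketched in a hardware-independent way. The button names, the playlist model and the spoken status phrases below are assumptions for illustration, not the actual design.

```python
# Hardware-independent sketch of tactile-button navigation:
# a few physical buttons cover the essential actions of a music player.
# Button names and the playlist model are illustrative assumptions.

class TactilePlayer:
    def __init__(self, playlist: list[str]):
        self.playlist = playlist
        self.index = 0
        self.playing = False

    def handle_button(self, button: str) -> str:
        """Map a physical button press to a player action and return a
        short status phrase that a TTS engine could read aloud."""
        if button == "play_pause":
            self.playing = not self.playing
            state = "Playing" if self.playing else "Paused"
            return f"{state}: {self.playlist[self.index]}"
        if button == "next":
            self.index = (self.index + 1) % len(self.playlist)
            return f"Next track: {self.playlist[self.index]}"
        if button == "previous":
            self.index = (self.index - 1) % len(self.playlist)
            return f"Previous track: {self.playlist[self.index]}"
        return "Unknown button"


if __name__ == "__main__":
    player = TactilePlayer(["Track one", "Track two", "Track three"])
    for press in ["play_pause", "next", "next", "previous"]:
        print(player.handle_button(press))  # on the device: speak via TTS
```

On real hardware, handle_button() would be triggered by GPIO interrupts or a keypad driver instead of the loop above.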

Pushy!: Robotic arm as an accessibility extension



Team members:
  • Deep Bhatt
  • Husam Shakeeb

During the introductory phase of the “Accessathon”, my team, consisting of my colleague Deep Bhatt and me, Husam Shakeeb, was motivated by the presentation given by Mr. Christian Bayerlein. He explained all the accessibility functions and tools he has at his disposal, but also that he is unable to operate physical buttons or carry out simple tasks that are not possible from his chair or smartphone. We worked closely with Mr. Bayerlein and explained the simple concept of using the wireless connection capability already available in his chair to attach a mechanical arm that could be used to push buttons and, at a more advanced stage, to handle objects. He was excited about the idea and even chose the name for the project.

The idea was to build or acquire a mechanical arm to carry out the tasks and to build an interface between the arm and the smartphone, using an Arduino microcontroller and infrared or Bluetooth directly from the phone to send commands. During the “Accessathon”, using the hardware available in the lab, we were able to demonstrate the concept by building the interface with a single motor and sending signals to operate the motor from Mr. Bayerlein’s smartphone. We still hope to receive the necessary hardware for the project to be completed - other students may also pick up where we left off in case we graduate first!
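As a rough illustration of the interface described above, the sketch below shows how the controller side of the single-motor demo might look, assuming the Arduino is reachable over a Bluetooth serial link. The port name, baud rate and one-byte command protocol are assumptions for the sketch; in the actual demo the commands were sent from Mr. Bayerlein's smartphone rather than from a laptop.

```python
# Sketch of the controller side of the single-motor demo: send one-byte
# commands over a Bluetooth serial link to an Arduino that drives the motor.
# Port name, baud rate and command bytes are illustrative assumptions.
import serial  # pyserial

PORT = "/dev/rfcomm0"  # Bluetooth serial port bound to the Arduino's BT module
BAUD = 9600

COMMANDS = {
    "press": b"P",    # extend the arm / push the button
    "release": b"R",  # retract the arm
    "stop": b"S",     # stop the motor immediately
}


def send_command(name: str) -> None:
    """Open the serial link, send a single command byte, and close it again."""
    with serial.Serial(PORT, BAUD, timeout=1) as link:
        link.write(COMMANDS[name])


if __name__ == "__main__":
    send_command("press")  # the Arduino sketch would map b"P" to 'run motor forward'
    send_command("stop")
```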