Data Science Summit

The Annual Data Science Summit, organized by DSCE, takes place in the Frits Philips Muziekgebouw in Eindhoven on November 27, 2018. The goal of the summit is to show that great scientific research can be done in close cooperation with and inspired by industry and other partners.

During this day we will have several excellent speakers who will share the latest insights into various aspects of Data Science, centered on TU/e's research programs. There will also be ample time for discussion and hands-on idea sharing during the poster sessions, where you can see examples of Data Science methods and applications and discuss them with practitioners. The Data Science Summit aims to inspire and to give direction for solving your specific challenges. The day is aimed at a broad audience: from (research) specialists in the field to practitioners and anyone applying Data Science in their work.

For registered participants, the Summit is free of charge. However, we have to charge € 25 for no-shows (to cover the costs of drinks, lunch and snacks that are not consumed).

Please note that a professional photographer will take pictures of the audience during the lectures and the poster sessions. By registering for the Summit you agree to these photos being taken and the possibility of those images being used on our website to give a general impression of the day.

Program

9:00–10:00    Registration, coffee & posters
10:00–10:10   Opening: Frank Baaijens (TU/e)
10:10–11:00   Keynote: Daniel Keim (University of Konstanz), "The power of visual analytics: unlocking the value of big data"
11:00–11:15   Poster pitches: PhD students and postdocs
11:15–11:45   Posters, coffee & snacks
11:45–12:15   Quantified Self: Arno Knobbe (Leiden University) and Aarnout Brombacher (TU/e, ID), "Data Science in elite and recreational sports"
12:15–12:45   Health Analytics: Reinder Haakma (Philips Research) and Paulo Serra (TU/e, M&CS), "Probabilistic modeling of PPG signals with application to premature beat detection"
12:45–13:00   Poster pitches: PhD students and postdocs
13:00–14:30   Lunch & posters
14:30–15:00   Internet of Data: Remco Schoenmakers (Thermo Fisher) and Georgios Exarchakos (TU/e, EE), "Future-proof modeling of systems: optimizing distributed resources in the age of fog computing"
15:00–15:30   Smart Manufacturing & Maintenance: Mark ten Have (Océ Technologies) and Rob Basten (TU/e, IE&IS), "Data Science is the key to proactive maintenance"
15:30–16:00   Customer Journey: Jan Veldsink (Rabobank) and Mykola Pechenizkiy (TU/e, M&CS), "Know your customer: responsible predictive analytics"
16:00–17:30   Networking drinks & posters

Keynote

Title
The power of visual analytics: Unlocking the value of big data

Speaker
•    Daniel Keim (University of Konstanz)

Abstract
Never before in history has data been generated and collected at such high volumes as it is today. For the analysis of large data sets to be effective, it is important to include the human in the data exploration process and to combine the flexibility, creativity, and general knowledge of humans with the enormous storage capacity and computational power of today's computers. Visual Analytics helps to deal with the flood of information by integrating the human in the data analysis process and applying human perceptual abilities to large data sets. Presenting data in an interactive, graphical form provides effective ways to understand and analyze large data sets, allowing novel discoveries and empowering individuals to take control of the analytical process.

The talk presents the potential of visual analytics and discusses the role of automated versus interactive visual techniques in dealing with big data. A variety of application examples, ranging from customer feedback analysis and network security to sports analytics, illustrate the exciting potential of visual analysis techniques, but also their current limitations.

Biography Daniel Keim
Daniel A. Keim is professor and head of the Information Visualization and Data Analysis Research Group in the Computer and Information Science Department of the University of Konstanz, Germany. He has been actively involved in data analysis and information visualization research for more than 25 years and has developed a number of novel visual analysis techniques for large data sets. He has been program co-chair of the IEEE InfoVis and IEEE VAST conferences, as well as of the ACM SIGKDD conference.

Dr. Keim received his Ph.D. and habilitation degrees in computer science from the University of Munich. Before joining the University of Konstanz, he was an associate professor at the University of Halle, Germany, and Senior Technology Consultant at AT&T Shannon Research Labs, NJ, USA.

Quantified Self

Title
Data Science in elite and recreational sports

Speakers
•    Arno Knobbe (Leiden University)
•    Aarnout Brombacher (TU/e, ID)

Abstract
One of the more interesting application areas in the field of “Quantified Self” is sports. While in many application domains the (new) rules and regulations on privacy (such as the recent GDPR) make it more difficult to obtain (bodily) data from real people in real life, this holds to a much lesser extent in sports. Athletes, both recreational and elite, generally do not mind sharing data for research purposes, provided that it improves their performance (especially for elite athletes), their health and/or the health of others. In this lecture, Dr. Knobbe will give an overview of research taking place at Leiden University (LIACS) on data mining and analytics of data obtained from (elite) athletes. Prof. Brombacher will give an overview of the activities in the field of (big) data and sports as they are currently being developed on a national level by the TopTeam Sports and Eindhoven University of Technology.


Biography Arno Knobbe
Dr. Arno Knobbe works as assistant professor at the Leiden Institute of Advanced Computer Science, where he heads a research group that focuses on data science in sports.  His work is aimed at injury prevention and performance optimization by means of detailed sports data captured by athletes, coaches and embedded scientists. Focusing on elite sports, his group has worked with the national men’s volleyball team, cycling and speed skating team Lotto-Jumbo, the Leiden Marathon, and the women’s national soccer team (who happened to win the European Championship). Knobbe also heads the Sport Data Center, a cooperation between research institutes from Leiden, Amsterdam, Delft and Twente.

Biography Aarnout Brombacher
Aarnout Brombacher was appointed full professor at Eindhoven University of Technology on 1 July 1993. He is currently professor of “Design theory and information flow analysis” in the department of Industrial Design of Eindhoven University of Technology. With this chair he is responsible for research and education in the fields of Quality Information Flows and Customer Perceived Quality in highly innovative product design and development processes, especially for products, systems and services that relate to (recreational) sports and vitality. He has authored and co-authored over 100 journal papers and is, at university level, coordinator of the program People, Sports and Vitality and a member of the national TopTeam (an advisory body to the cabinet) on Sports and Vitality. From 1 March 2010 until 1 March 2018 he was also Dean of the department of Industrial Design.

Health Analytics

Title
Probabilistic modeling of PPG signals with application to premature beat detection

Speakers
•    Reinder Haakma (Philips Research)
•    Paulo Serra (TU/e, M&CS)

Abstract
In this work we introduce a Bayesian model for photoplethysmography (PPG) signals. Photoplethysmography is an unobtrusive, inexpensive, optical technique to measure blood volume changes in cardiovascular tissue. PPG signals are acquired by a device typically placed on the wrist or finger. The signal carries information about more than just heart rate and can thus be used for extensive analyses of health status in a broader sense. This makes PPG signals very informative, but also complex. Our approach decomposes the PPG signal by extracting a so-called baseline wandering component. This component is connected with respiration. The remainder of the signal is quasi-periodic and is decomposed into a set of pulses that are modelled using a Bayesian model. The resulting method is self-contained, automated and provides a good compression of the data while keeping the rich information of the PPG signal. We illustrate our approach by showing how it can be used to automatically detect premature beats in patients. This is joint work with M. Regis and E.R. van den Heuvel (Eindhoven University of Technology, Philips), and L.M. Eerikainen (Philips).
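The decomposition described above can be imitated in a few lines. The sketch below only illustrates the general idea (extract a slow baseline, then analyze the quasi-periodic remainder): it uses a simple moving-average baseline on a synthetic signal, not the Bayesian pulse model of the talk, and all signal parameters are invented for illustration.

```python
import numpy as np

def decompose_ppg(signal, fs, baseline_window_s=2.0):
    # Estimate the slow "baseline wandering" component with a moving
    # average, then treat the rest as the quasi-periodic pulse signal.
    win = max(1, int(baseline_window_s * fs))
    kernel = np.ones(win) / win
    padded = np.pad(signal, win // 2, mode="edge")  # reduce edge bias
    smoothed = np.convolve(padded, kernel, mode="same")
    baseline = smoothed[win // 2 : win // 2 + len(signal)]
    return baseline, signal - baseline

# Synthetic stand-in for a PPG recording: a 0.25 Hz respiration-like
# drift plus sharper 1.2 Hz pulse-like oscillations.
fs = 100.0
t = np.arange(0.0, 10.0, 1.0 / fs)
drift = 0.5 * np.sin(2 * np.pi * 0.25 * t)
pulses = np.sin(2 * np.pi * 1.2 * t) ** 3
baseline, remainder = decompose_ppg(drift + pulses, fs)
# baseline tracks the slow drift; remainder keeps the pulse shape
```

A real pipeline would replace the moving average with a model of respiration and fit each pulse probabilistically, which is what gives the compression and interpretability mentioned in the abstract.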


Biography Reinder Haakma
Reinder Haakma (MSc Electrical Engineering, 1985, University of Twente; PhD, 1998, Eindhoven University of Technology) joined Philips in 1987. From 1997 onwards he has been a scientist and team leader of various research activities in personal health, wearable technology and out-patient monitoring. He is involved in a number of public-private partnership collaborations. His expertise includes user-system interaction, embedded user interface architectures, behavioral and physiological modelling and analysis, and unobtrusive sensing technologies.


Biography Paulo Serra
Dr. Paulo Serra is currently an assistant professor in the Stochastics group of the Department of Mathematics and Computer Science at TU/e. He is also involved in the Health Analytics and Smart Manufacturing & Maintenance research programs of DSCE. He was a postdoc at the University of Amsterdam, working as part of the NETWORKS project, funded under NWO's Zwaartekracht grant.

Paulo’s research interests are wide, focusing on (non-parametric) mathematical statistics. Most of what he does involves Markov processes and Markov chains, Markov chain Monte Carlo, and Poisson processes. A lot of his work is theoretical, but he is very interested in the implementation of algorithms and their practical applications, programming, and online (recursive) algorithms.

Internet of Data

Title
Future-proof modeling of systems: optimizing distributed resources in the age of fog computing

Speakers
•    Remco Schoenmakers (Thermo Fisher)
•    Georgios Exarchakos (TU/e, EE)

Abstract
Thermo Fisher Scientific’s electron microscopes, like many other sensors, produce massive amounts of data at an ever-increasing rate. In order to collect the best possible data, immediate analysis of this data, and even feedback to the instrument, is necessary, but compute resources near the microscope are scarce and outdate rapidly. At the same time, software is evolving as well. How can we (re)distribute data processing tasks dynamically in such a way that the total cost (in dollars) is minimized? How can we make this a future-proof system that adapts when new hardware or software becomes available? In essence, network and compute resources should be dynamically allocated to a task given the load, the local and remote capacity, and their attached usage cost. Fog computing is a paradigm that allows IoT systems to achieve stringent non-functional requirements (e.g. latency) at a high price in complexity. While years of research on model-based design have produced powerful solutions, uncertainties in vastly heterogeneous and distributed systems are largely ignored. Evolutionary fog systems are able to integrate usage-, application- and network-driven uncertainties and continuously balance between resource utility, user experience and cost.
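The allocation problem in this abstract can be illustrated with a toy model. The sketch below is a hypothetical example, not Thermo Fisher's or TU/e's actual system: it brute-forces the assignment of three processing tasks to an edge node, a fog node and the cloud, minimizing total usage cost subject to per-task latency bounds and per-node capacity; all numbers are invented.

```python
from itertools import product

# Hypothetical resources: compute capacity, usage cost per unit of load,
# and latency from the instrument to the node.
nodes = {
    "edge":  {"capacity": 4.0,   "cost": 5.0, "latency_ms": 2},
    "fog":   {"capacity": 8.0,   "cost": 2.0, "latency_ms": 10},
    "cloud": {"capacity": 100.0, "cost": 1.0, "latency_ms": 80},
}
tasks = [
    {"load": 3.0,  "max_latency_ms": 5},     # immediate instrument feedback
    {"load": 6.0,  "max_latency_ms": 50},    # near-line analysis
    {"load": 20.0, "max_latency_ms": 1000},  # offline reconstruction
]

def best_placement(nodes, tasks):
    # Exhaustively search task-to-node assignments, keep only those that
    # respect latency bounds and capacities, and return the cheapest one.
    best, best_cost = None, float("inf")
    for assign in product(nodes, repeat=len(tasks)):
        used = {n: 0.0 for n in nodes}
        feasible = True
        for task, n in zip(tasks, assign):
            if nodes[n]["latency_ms"] > task["max_latency_ms"]:
                feasible = False
                break
            used[n] += task["load"]
            if used[n] > nodes[n]["capacity"]:
                feasible = False
                break
        if not feasible:
            continue
        cost = sum(t["load"] * nodes[n]["cost"] for t, n in zip(tasks, assign))
        if cost < best_cost:
            best, best_cost = assign, cost
    return best, best_cost

placement, cost = best_placement(nodes, tasks)
# → the feedback loop stays on the edge, analysis goes to the fog,
#   and the bulk reconstruction goes to the cheap cloud.
```

A dynamic, uncertainty-aware version of this decision, re-made as load and resources evolve, is what the talk's "evolutionary fog systems" address; exhaustive search is only viable at toy scale.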

Biography Remco Schoenmakers
Remco Schoenmakers is Sr Manager R&D in the Advanced Technology group of the electron microscopy department of Thermo Fisher Scientific. He holds a PhD in Astrophysics (1999) from the University of Groningen. He worked at Philips Medical Systems on 3D CT and MRI visualization software, after which he moved to FEI in 2001 to develop 3D electron tomography acquisition software, in particular its image processing components. He developed the widely used Inspect3D electron tomography reconstruction package. He has fulfilled the role of group leader Application Software, and currently leads the Algorithm research group, where he coordinates, among other things, research on artificial intelligence, data management and high-performance computing.

Biography Georgios Exarchakos
Georgios Exarchakos is an assistant professor with the Electro-Optical Communications group at TU/e and holds a PhD in Distributed Computing (2009) from the University of Surrey, U.K. His areas of expertise include low-power networks, network resource allocation, complex networks and swarm intelligence. Georgios leads the Internet of Data research program of the Data Science Center of TU/e and the Internet of Things Communications program of the Center for Wireless Technology of TU/e. He is co-author of the book “Networks for Pervasive Services” (Springer, 2011), co-editor of two scientific books, and co-author of more than 50 peer-reviewed international journal and conference publications.

Smart Manufacturing and Maintenance

Title
Data Science is the key to proactive maintenance

Speakers
•    Mark ten Have (Océ Technologies)
•    Rob Basten (TU/e, IE&IS)

Abstract
Data Science is driving the maintenance world from “Break-Fix” to “Sense & Respond” service concepts. Océ has developed and introduced this new full-service concept in several steps, with TU/e students developing the necessary building blocks. These building blocks were further elaborated and introduced via small field trials. During the presentation we briefly explain the introduction of this new full-service concept, its building blocks, and the next research steps on which two postdocs are working.
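As a toy illustration of the “Sense & Respond” idea (not Océ's actual models), the sketch below fits a linear trend to hypothetical daily wear readings and decides whether a service visit should be scheduled before the forecast crosses a failure threshold; all values, names and thresholds are made up.

```python
import numpy as np

def days_until_threshold(readings, threshold):
    # Fit a straight line to daily wear readings and extrapolate the
    # number of days until the failure threshold would be reached.
    # Returns None if the wear is not increasing.
    days = np.arange(len(readings))
    slope, intercept = np.polyfit(days, readings, 1)
    if slope <= 0:
        return None
    crossing_day = (threshold - intercept) / slope
    remaining = crossing_day - (len(readings) - 1)
    return float(max(0.0, remaining))

wear = [0.10, 0.12, 0.15, 0.16, 0.19, 0.21]  # hypothetical sensor values
lead_time_days = 5                           # time needed to plan a visit
remaining = days_until_threshold(wear, threshold=0.30)
schedule_visit_now = remaining is not None and remaining <= lead_time_days
```

The “Respond” step is triggered just in time: here the linear forecast leaves roughly four days of slack, so with a five-day planning lead time a visit would be scheduled now. Production systems use richer prognostic and condition-based-maintenance (CbM) models, but the decision structure is the same.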

Biography Mark ten Have
Mark ten Have is a Service Product Manager at Océ Technologies B.V., a global leader in digital imaging, industrial printing and collaborative business services. The introduction of a new technology for a new market required a new full-service concept at Océ, which Mark has developed and introduced. Besides technical challenges such as Design for Availability, remote data collection, data processing, prognostics, CbM models, and decision- and planning-tool development, an organizational shift from “Break-Fix” to “Sense & Respond” has also been made. Mark has 28 years of experience in the high-tech industry:
•    11 years in Service Management
•    5 years in Project Management
•    7 years in Manufacturing & Logistics Management
•    5 years in Product Development

Biography Rob Basten
Rob Basten is an Associate Professor at Eindhoven University of Technology (TU/e), where he is primarily occupied with maintenance and service logistics and their interfaces. Most of his research focuses on after-sales services for capital goods: maintenance policies and maintenance optimization, spare parts inventory control, and design of after-sales service supply chains. He is especially interested in using new technologies to improve after-sales services, for example 3D printing of spare parts on location and using improved sensing and communication technology to perform ‘just in time’ maintenance. He is also active in behavioral operations management, trying to understand how people can use decision support systems in such a way that they actually improve decisions and add value. Many of his research projects are interdisciplinary, in order to carry out high-quality research and solve problems in practice. For the latter reason, research is typically performed in cooperation with high-tech industry: companies such as ASML, Océ, Vanderlande and Marel Stork. Experience with industry is also used in teaching, for example in the course Maintenance & Service Logistics.

Customer Journey

Title
Know your customer: responsible predictive analytics

Speakers
•    Jan Veldsink (Rabobank)
•    Mykola Pechenizkiy (TU/e, M&CS)

Abstract
Over the past years, most of banking has turned digital and data-driven. Customer Due Diligence (CDD) and Know Your Customer (KYC) concepts have emerged and are becoming more important from a fraud-detection perspective, driven by more stringent rules and regulations from many different regulators around the globe. The general challenge is to automate, as far as possible, the process of gaining customer insights to address internal and societal needs for preventing and mitigating cybercrime, money laundering and different kinds of fraud. In this talk we will share some of the insights gained through the research collaboration between TU/e and Rabobank in studying how to develop advanced and responsible predictive analytics solutions that can leverage information hidden in heterogeneous, evolving data streams and accommodate the explicit and implicit demands of regulators and the rights of the customer with respect to fairness, accountability and transparency of algorithmic decision making.


Biography Jan Veldsink
Jan Veldsink is a creative, energetic new thinker with a passion for technology and people. He is a speaker, senior advisor, trainer and coach, specialized in AI and intuitive interventions in organizations. His mission is to contribute to a secure and sustainable environment within teams and organizations. He has been working with AI since the last AI winter. In his work with some of the large banks in the Netherlands he applies AI to various compliance- and fraud-related topics. His expertise areas are AI, cyber security, systems thinking, serious gaming and innovation. Jan works as a consultant and teacher on the subjects of cybercrime, AI, digital transformation, organizational development and innovation at Rabobank and Nyenrode Business University.

He is Lead AI and Cognitive Technologies within Global Compliance Surveillance & Integrity Investigations at Rabobank, and Core Teacher of the module AI and Cyber, part of the Modular Executive MBA in Business & IT program at Nyenrode.


Biography Mykola Pechenizkiy
Mykola Pechenizkiy is Professor of Data Mining at the Department of Mathematics and Computer Science, TU/e. His core expertise and research interests are in predictive analytics and its application to real-world problems in industry, medicine and education. At the Data Science Center (DSCE) he leads the Customer Journey and Responsible Data Science interdisciplinary research programs, which aim at developing techniques for informed, accountable and transparent analytics. As principal investigator of several applications-inspired research projects, he aims to develop foundations for next-generation predictive analytics and to demonstrate their ecological validity in practice. Over the past decade he has co-authored more than 100 peer-reviewed publications and served on the program committees of the leading data mining and AI conferences.