Invited Speakers


Robert W. Sumner
Associate Director, Disney Research Zurich

https://graphics.ethz.ch/~sumnerb/


Bio: Dr. Robert Sumner is the Associate Director of Disney Research Zurich and an Adjunct Professor at ETH Zurich. At DRZ, Robert leads the lab’s research on animation and interactive graphics. His research group strives to bypass technical barriers in the animation production pipeline with new algorithms that expand the designer’s creative toolbox in terms of depiction, movement, deformation, stylization, control, and efficiency. Robert received a B.S. (1998) degree in computer science from the Georgia Institute of Technology and his M.S. (2001) and Ph.D. (2005) from the Massachusetts Institute of Technology. He spent three years as a postdoctoral researcher at ETH Zurich before joining Disney. At ETH, Robert teaches a course called the Game Programming Laboratory, in which students work in small teams to design and implement novel video games. In 2015, Robert founded the ETH Game Technology Center, which serves as an umbrella for ETH research, teaching, and outreach in the area of game technology.

Title: Amplifying Creativity in Animation and Games

Abstract: “Art challenges technology, and technology inspires the art.” These are the words John Lasseter used to describe his experience as an artist working with the technology leaders at Pixar three decades ago to pioneer what we know today as computer-generated animation. At the heart of this statement lies the idea that technology and art, when joined together, hold a unique and promising potential to amplify creativity. This very concept forms the central vision of the Animation and Games group at Disney Research Zurich. In this keynote talk, I will share our experiences as researchers working with Disney artists on technology to amplify creativity, including several tough challenges that art has given us, as well as a few successes in which we could inspire the art. Attendees can expect examples of recent research advances in animation, simulation, stylization, and, in Disney style, a little bit of singing.


Jean-Paul Laumond
Directeur de Recherche, LAAS-CNRS, France

http://homepages.laas.fr/jpl/


Bio: Jean-Paul Laumond, IEEE Fellow, is a roboticist. He is Directeur de Recherche at LAAS-CNRS (team Gepetto) in Toulouse, France. His research is devoted to robot motion planning and control. From 2000 to 2002, he created and managed Kineo CAM, a spin-off company from LAAS-CNRS devoted to developing and marketing motion planning technology in the field of virtual prototyping. Siemens acquired Kineo CAM in 2012. In 2006, he launched the research team Gepetto, dedicated to human motion studies along three perspectives: artificial motion for humanoid robots, virtual motion for digital actors and mannequins, and natural motion of human beings. He teaches robotics at Ecole Normale Supérieure in Paris. He publishes in robotics, computer science, automatic control, and recently in neuroscience. He was the 2011-2012 recipient of the Chaire Innovation technologique Liliane Bettencourt at Collège de France in Paris. His current project, Actanthrope (ERC-ADG 340050), is devoted to the computational foundations of anthropomorphic action.

Title: The Yoyo-Man

Abstract: Humans are not walking, they are rolling! The objective of the talk is to make sense of this seemingly obscure statement. Indeed, the wheel may be a plausible model of bipedal walking. We report on preliminary results developed along three perspectives combining biomechanics, neurophysiology, and robotics. From a motion capture database of human walkers, we first identify the center of mass (CoM) as a geometric center from which the motions of the feet are organized. We then show how the rimless wheels that model most passive walkers are better controlled when equipped with a stabilized mass on top of them. The CoM and the head play complementary roles that define what we call the Yoyo-Man.


Nadia Berthouze
Reader in Affective Interaction and Computing, University College London

http://www.ucl.ac.uk/uclic/people/n-berthouze


Bio: Prof Nadia Berthouze leads the Affective Computing group within the University College London Interaction Centre. She pioneered the study of body posture/movement (both kinematics and muscle activity) and touch behaviour as modalities for automatic affective recognition and modulation in technology-mediated scenarios (games, health sector). Her work has gone beyond acted emotions by investigating naturalistic affective expressions such as laughter and pain. She has been invited to write chapters for prestigious handbooks (Oxford Handbooks, APA Psychology series), to give a TEDxStMartin talk, and to deliver keynotes at various academic and industry-led conferences. She has published more than 150 papers in affective computing, HCI, and pattern recognition, and she has been PI and Co-I on various UK, EU, and Japan funded projects.

Title: The body and its actions: how they can be used to manipulate the player’s experience

Abstract: Recent years have seen the emergence of game technology that involves and requires its users to be engaged through their body. In addition, studies in psychology have shown that our body expressions affect our emotional state, our cognitive abilities and our attitude towards the environment around us. This has generated an increased interest in understanding and exploiting this modality to automatically recognize, respond to and regulate users’ affective experience. In the first part of my talk, I will report on our studies aimed at understanding how body expressions and body movement can be used to modulate user experience in games and physical activity in real-life situations such as walking and physical rehabilitation. Then, I’ll discuss how emotional states can be automatically detected from body expressions, including muscle activity and touch behaviour. Examples from games and physical rehabilitation will be presented.


Alex Champandard
Editor in Chief, Technical Director & Entrepreneur at AiGameDev.com

http://aigamedev.com/


Bio: Alex is the co-founder of the nucl.ai Conference, the largest worldwide event dedicated to AI in creative industries. He has worked in the industry as a senior AI programmer for many years, most notably for Rockstar Games, and regularly consults with leading studios in Europe—most recently on the multiplayer bots for Killzone 2-3 at Guerrilla Games. Alex authored the book “AI Game Development” and often speaks about his research and experiments. He’s associate editor for the IEEE Transactions on AI in Games, and serves on the program committee for the AIIDE and CIG conferences.

Title: Modern AI and Its Impact on Animation and Movement

Abstract: In this talk, you’ll learn about the fast progress being made in artificial intelligence in the games industry, and see how it is impacting challenging problems like animation and movement. What exactly is motion matching, and why is it only possible now? How can machine learning help, and why are developers ready for new techniques? Alex will explain all this and more in a forward-looking presentation.