-
Glette, Kyrre
(2020).
Evolutionary algorithms for intelligent robots.
-
Tørresen, Jim; Glette, Kyrre & Ellefsen, Kai Olav
(2019).
Intelligent, Adaptive Robots in Real-World Scenarios.
-
Tørresen, Jim; Glette, Kyrre & Ellefsen, Kai Olav
(2019).
Adaptive Robot Body and Control for Real-World Environments.
-
Becker, Artur; Herrebrøden, Henrik; Gonzalez Sanchez, Victor Evaristo; Nymoen, Kristian; Dal Sasso Freitas, Carla Maria; Tørresen, Jim et al. (7 contributors in total)
(2019).
Functional Data Analysis of Rowing Technique Using Motion Capture Data.
Summary:
We present an approach to analyzing the motion capture data of rowers using bivariate functional principal component analysis (bfPCA). The method has been applied on data from six elite rowers rowing on an ergometer. The analyses of the upper and lower body coordination during the rowing cycle revealed significant differences between the rowers, even though the data was normalized to account for differences in body dimensions. We make an argument for the use of bfPCA and other functional data analysis methods for the quantitative evaluation and description of technique in sports.
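To make the method concrete, here is a minimal sketch of bivariate fPCA on discretized motion-capture curves, assuming each rowing cycle has already been resampled to a common time grid and normalized for body dimensions. All data, shapes, and names below are illustrative placeholders, not the paper's dataset or code.

```python
# Minimal sketch of bivariate functional PCA (bfPCA) on discretized curves.
import numpy as np

rng = np.random.default_rng(0)
n_cycles, n_samples = 60, 100           # e.g. 60 rowing cycles, 100 time points each

# Two coordinated signals per cycle, e.g. an upper- and a lower-body joint angle.
upper = rng.normal(size=(n_cycles, n_samples))
lower = rng.normal(size=(n_cycles, n_samples))

# On a common grid, bivariate fPCA reduces to ordinary PCA on the
# concatenated, mean-centered curves.
X = np.hstack([upper, lower])           # shape: (n_cycles, 2 * n_samples)
X -= X.mean(axis=0)

# SVD gives the discretized bivariate eigenfunctions (rows of Vt) and
# per-cycle scores that can be compared between rowers.
U, S, Vt = np.linalg.svd(X, full_matrices=False)
scores = U * S
explained = S**2 / np.sum(S**2)
print("variance explained by first 3 PCs:", explained[:3].round(3))
```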
-
Ellefsen, Kai Olav
(2019).
Hva Kan Roboter Lære av Biologisk Liv?
(Eng. What Can Robots Learn from Biological Life?).
-
T?rresen, Jim
(2019).
Intelligent Robots and Systems in Real-World Environment.
-
Miura, Jun & Tørresen, Jim
(2019).
Intelligent Robot Technologies for Care and Lifestyle Support.
-
Rohlfing, Katharina J. & Tørresen, Jim
(2019).
Explainability: an interactive view.
-
Comba, Joao Luiz Dihl & Tørresen, Jim
(2019).
Visual Data Analysis of Unstructured and Big Data.
-
Tørresen, Jim
(2019).
Kunstig intelligens – hvem, hva og hvor.
(Eng. Artificial Intelligence – who, what and where).
-
Nygaard, Tønnes Frostad; Nordmoen, Jørgen Halvorsen; Ellefsen, Kai Olav; Martin, Charles Patrick; Tørresen, Jim & Glette, Kyrre
(2019).
Experiences from Real-World Evolution with DyRET: Dynamic Robot for Embodied Testing.
-
Nordmoen, Jørgen Halvorsen; Nygaard, Tønnes Frostad; Ellefsen, Kai Olav & Glette, Kyrre
(2019).
Evolved embodied phase coordination enables robust quadruped robot locomotion.
-
Nordmoen, Jørgen Halvorsen & Fadelli, Ingrid
(2019).
A new method to enable robust locomotion in a quadruped robot.
[Internet].
TechXplore.
-
Nygaard, Tønnes Frostad; Martin, Charles Patrick; Tørresen, Jim & Glette, Kyrre
(2019).
Self-Modifying Morphology Experiments with DyRET: Dynamic Robot for Embodied Testing.
-
Ellefsen, Kai Olav; Huizinga, Joost & Tørresen, Jim
(2019).
Guiding Neuroevolution with Structural Objectives.
-
Ellefsen, Kai Olav & Tørresen, Jim
(2019).
Self-Adapting Goals Allow Transfer of Predictive Models to New Tasks.
-
Teigen, Bjørn Ivar; Ellefsen, Kai Olav & Tørresen, Jim
(2019).
A Categorization of Reinforcement Learning Exploration Techniques Which Facilitates Combination of Different Methods.
-
Glette, Kyrre
(2019).
Kunstig intelligens for tilpasningsdyktige roboter.
(Eng. Artificial Intelligence for Adaptive Robots).
-
Tørresen, Jim
(2019).
Making Robots Adaptive and Preferable to Humans.
-
Martin, Charles Patrick & Tørresen, Jim
(2019).
An Interactive Musical Prediction System with Mixture Density Recurrent Neural Networks.
-
Nygaard, Tønnes Frostad; Nordmoen, Jørgen Halvorsen; Martin, Charles Patrick; Tørresen, Jim & Glette, Kyrre
(2019).
Lessons Learned from Real-World Experiments with DyRET: the Dynamic Robot for Embodied Testing.
-
Næss, Torgrim Rudland; Tørresen, Jim & Martin, Charles Patrick
(2019).
A Physical Intelligent Instrument using Recurrent Neural Networks.
-
Martin, Charles Patrick & Tørresen, Jim
(2019).
An Interactive Music Prediction System with Mixture Density Recurrent Neural Networks.
-
Tørresen, Jim
(2019).
Design and Control of Robots for Real-World Environment.
-
Miseikis, Justinas; Brijacak, Inka; Yahyanejad, Saeed; Glette, Kyrre; Elle, Ole Jacob & Tørresen, Jim
(2019).
Two-Stage Transfer Learning for Heterogeneous Robot Detection and 3D Joint Position Estimation in a 2D Camera Image Using CNN.
-
Glette, Kyrre; Nygaard, Tønnes Frostad & Vogt, Yngve
(2019).
Her er universitetets neste selvlærende robot.
(Eng. Here Is the University's Next Self-Learning Robot).
[Business/trade/industry journal].
Teknisk Ukeblad.
-
Tørresen, Jim
(2019).
Intelligent and Adaptive Robots in Real-World Environment.
-
Faitas, Andrei; Baumann, Synne Engdahl; Tørresen, Jim & Martin, Charles Patrick
(2019).
Generating Convincing Harmony Parts with Simple Long Short-Term Memory Networks.
-
Martin, Charles Patrick; Næss, Torgrim Rudland; Faitas, Andrei & Baumann, Synne Engdahl
(2019).
Session on Musical Prediction and Generation with Deep Learning.
-
Tørresen, Jim
(2019).
Future and Ethical Perspectives of Robotics and AI.
-
Tørresen, Jim
(2019).
Artificial Intelligence and Applications in Health and Care.
-
Tørresen, Jim
(2019).
Hva er kunstig intelligens?
(Eng. What Is Artificial Intelligence?).
-
Tørresen, Jim
(2019).
Sensing Human State with Application in Older People Care and Mental Health Treatment.
-
Ellefsen, Kai Olav & Tørresen, Jim
(2019).
Evolutionary Robotics: Automatic design of robot bodies and control.
-
Tørresen, Jim
(2019).
Supporting Older People with Robots for Independent Living.
-
Tørresen, Jim
(2018).
Kunstig Intelligens – Lærende og tilpasningsdyktig teknologi.
(Eng. Artificial Intelligence – Learning and Adaptive Technology).
-
Tørresen, Jim
(2018).
Remote Lab and Applications for High Performance and Embedded Architectures.
-
Tørresen, Jim
(2018).
Intelligent Systems for Medical and Healthcare Applications.
-
Tørresen, Jim
(2018).
UiO Visit to UFRJ – An overview of research.
-
Tørresen, Jim
(2018).
Når etikk betyr alt.
(Eng. When Ethics Means Everything).
Dagens Næringsliv.
ISSN 0803-9372.
-
Tørresen, Jim
(2018).
Kunstig intelligens – hvem, hva og hvor.
(Eng. Artificial Intelligence – who, what and where).
-
Tørresen, Jim
(2018).
Artificial Intelligence Applied for Real-World Systems.
-
Martin, Charles Patrick
(2018).
MicroJam.
Summary:
MicroJam is a mobile app for sharing tiny touch-screen performances. Mobile applications that streamline creativity and social interaction have enabled a very broad audience to develop their own creative practices. While these apps have been very successful in visual arts (particularly photography), the idea of social music-making has not had such a broad impact. MicroJam includes several novel performance concepts intended to engage the casual music maker and inspired by current trends in social creativity support tools. Touch-screen performances are limited to 5-seconds, instrument settings are posed as sonic "filters", and past performances are arranged as a timeline with replies and layers. These features of MicroJam encourage users not only to perform music more frequently, but to engage with others in impromptu ensemble music making.
-
Martin, Charles Patrick; Glette, Kyrre; Nygaard, Tønnes Frostad & Tørresen, Jim
(2018).
Self-Awareness in a Cyber-Physical Predictive Musical Interface.
Summary:
We introduce a new self-contained and self-aware interface for musical expression where a recurrent neural network (RNN) is integrated into a physical instrument design. The system includes levers for physical input and output, a speaker system, and an integrated single-board computer. The RNN serves as an internal model of the user’s physical input, and predictions can replace or complement direct sonic and physical control by the user. We explore this device in terms of different interaction configurations and learned models according to frameworks of self-aware cyber-physical systems.
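As a hedged illustration of the core loop, the sketch below trains a plain LSTM regressor to predict the next lever position from a window of past positions, so the prediction can replace or complement direct user control. The actual system uses a mixture density RNN; the data and parameters here are made up for illustration.

```python
import numpy as np
import tensorflow as tf

SEQ_LEN = 32

# Simple next-value regressor over a window of recent lever positions.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(SEQ_LEN, 1)),
    tf.keras.layers.LSTM(64),
    tf.keras.layers.Dense(1),            # next lever position in [0, 1]
])
model.compile(optimizer="adam", loss="mse")

# Toy training data: sliding windows over a recorded lever trajectory.
trajectory = (np.sin(np.linspace(0, 50, 2000)) + 1) / 2
X = np.stack([trajectory[i:i + SEQ_LEN] for i in range(len(trajectory) - SEQ_LEN)])
y = trajectory[SEQ_LEN:]
model.fit(X[..., None], y, epochs=2, verbose=0)

# At run time, the model's prediction can drive the physical output
# lever whenever the user stops providing input.
recent = X[:1][..., None]
predicted_position = float(model.predict(recent, verbose=0)[0, 0])
print("predicted next lever position:", predicted_position)
```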
-
Nygaard, Tønnes Frostad; Martin, Charles Patrick; Samuelsen, Eivind; Tørresen, Jim & Glette, Kyrre
(2018).
Real-World Evolution Adapts Robot Morphology and Control to Hardware Limitations.
-
Nygaard, Tønnes Frostad; Søyseth, Vegard Dønnem; Nordmoen, Jørgen Halvorsen & Glette, Kyrre
(2018).
Stand with the DyRET robot.
-
Glette, Kyrre
(2018).
Automatic design of bodies and behaviors for real-world robots.
-
Nygaard, Tønnes Frostad; Martin, Charles Patrick; Tørresen, Jim & Glette, Kyrre
(2018).
Exploring Mechanically Self-Reconfiguring Robots for Autonomous Design.
Summary:
Evolutionary robotics has aimed to optimize robot control and morphology to produce better and more robust robots. Most previous research only addresses optimization of control, and does this only in simulation. We have developed a four-legged mammal-inspired robot that features a self-reconfiguring morphology. In this paper, we discuss the possibilities opened up by being able to efficiently do experiments on a changing morphology in the real world. We discuss present challenges for such a platform and potential experimental designs that could unlock new discoveries. Finally, we place our robot in its context within general developments in the field of evolutionary robotics, and consider what advances the future might hold.
-
Martin, Charles Patrick
(2018).
Deep Predictive Models in Interactive Music.
-
Tørresen, Jim; Garcia Ceja, Enrique Alejandro; Ellefsen, Kai Olav & Martin, Charles Patrick
(2018).
Equipping Systems with Forecasting Capabilities.
-
Garcia Ceja, Enrique Alejandro; Ellefsen, Kai Olav; Martin, Charles Patrick & Tørresen, Jim
(2018).
Prediction, Interaction, and User Behaviour.
Summary:
The goal of this tutorial is to apply predictive machine learning models to human behaviour through a human computer interface. We will introduce participants to the key stages for developing predictive interaction in user-facing technologies: collecting and identifying data, applying machine learning models, and developing predictive interactions. Many of us are aware of recent advances in deep neural networks (DNNs) and other machine learning (ML) techniques; however, it is not always clear how we can apply these techniques in interactive and real-time applications. Apart from well-known examples such as image classification and speech recognition, what else can predictive ML models be used for? How can these computational intelligence techniques be deployed to help users?
In this tutorial, we will show that ML models can be applied to many interactive applications to enhance users’ experience and engagement. We will demonstrate how sensor and user interaction data can be collected and investigated, modelled using classical ML and DNNs, and where predictions of these models can feed back into an interface. We will walk through these processes using live-coded demonstrations with Python code in Jupyter Notebooks so participants will be able to see our investigations live and take the example code home to apply in their own projects.
Our demonstrations will be motivated by examples from our own research in creativity support tools, robotics, and modelling user behaviour. In creativity, we will show how streams of interaction data from a creative musical interface can be modelled with deep recurrent neural networks (RNNs). From this data, we can predict users’ future interactions, or the potential interactions of other users. This enables us to “fill in” parts of a tablet-based musical ensemble when other users are not available, or to continue a user’s composition with potential musical parts. In user behaviour, we will show how smartphone sensor data can be used to infer user contextual information such as physical activities. This contextual information can be used to trigger interactions in smart home or internet of things (IoT) environments, to help tune interactive applications to users’ needs, or to help track health data.
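As a small, hypothetical illustration of the user-behaviour part (not the tutorial's actual notebooks), the sketch below classifies physical activity from windowed accelerometer data using simple summary features and a random forest; the data, labels, and feature choices are placeholders.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n_windows, window_len = 500, 128                 # windows of 3-axis accelerometer data
acc = rng.normal(size=(n_windows, window_len, 3))
labels = rng.integers(0, 3, size=n_windows)      # e.g. 0=still, 1=walking, 2=running

# Simple per-axis summary statistics as features for each window.
features = np.hstack([acc.mean(axis=1), acc.std(axis=1),
                      acc.min(axis=1), acc.max(axis=1)])

X_train, X_test, y_train, y_test = train_test_split(features, labels, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))

# The predicted context (e.g. "walking") could then trigger interactions
# in a smart-home / IoT setting, as described above.
```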
-
Martin, Charles Patrick; Glette, Kyrre & Tørresen, Jim
(2018).
Creative Prediction with Neural Networks.
Summary:
The goal of this tutorial is to apply predictive machine learning models to creative data. The focus of the tutorial will be recurrent neural networks (RNNs), deep learning models that can be used to generate sequential and temporal data. RNNs can be applied to many kinds of creative data including text and music. They can learn the long-range structure from a corpus of data and “create” new sequences by predicting one element at a time. When embedded in a creative interface, they can be used for “predictive interaction” where a human collaborates with, influences, and is influenced by a generative neural network.
We will walk through the fundamental steps for training creative RNNs using live-coded demonstrations with Python code in Jupyter Notebooks. These steps are: collecting and cleaning data, building and training an RNN, and developing predictive interactions. We will also have live demonstrations and interactive live-hacking of our creative RNN systems!
You’re welcome to bring a laptop with Python to the tutorial and load up our code examples, or to follow along with us on the screen!
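For reference, here is a compressed sketch of those steps (assumed details, not the tutorial's actual notebook code): encode a corpus, train a character-level LSTM to predict the next symbol, then "create" new material by sampling one prediction at a time.

```python
import numpy as np
import tensorflow as tf

corpus = "twinkle twinkle little star " * 200    # stand-in for a real corpus
vocab = sorted(set(corpus))
char_to_idx = {c: i for i, c in enumerate(vocab)}
encoded = np.array([char_to_idx[c] for c in corpus])

# Sliding windows of SEQ_LEN symbols, each labelled with the next symbol.
SEQ_LEN = 40
X = np.stack([encoded[i:i + SEQ_LEN] for i in range(len(encoded) - SEQ_LEN)])
y = encoded[SEQ_LEN:]

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(len(vocab), 32),
    tf.keras.layers.LSTM(128),
    tf.keras.layers.Dense(len(vocab), activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
model.fit(X, y, epochs=2, verbose=0)

# Generate by sampling the model's next-symbol prediction and feeding
# it back in, one element at a time.
seed = list(encoded[:SEQ_LEN])
for _ in range(80):
    probs = model.predict(np.array([seed[-SEQ_LEN:]]), verbose=0)[0].astype("float64")
    probs /= probs.sum()
    seed.append(int(np.random.choice(len(vocab), p=probs)))
print("".join(vocab[i] for i in seed))
```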
-
Martin, Charles Patrick
(2018).
Predictive Music Systems for Interactive Performance.
Summary:
Automatic music generation is a compelling task where much recent progress has been made with deep learning models. But how can these models be integrated into interactive music systems, and how can they encourage or enhance the music making of human users?
Musical performance requires prediction, both to operate instruments and to perform in groups. Predictive models can help interactive systems to understand their temporal context and ensemble behaviour. Deep learning allows data-driven models with a long memory of past states.
This process could be termed "predictive musical interaction", where a predictive model is embedded in a musical interface, assisting users by predicting unknown states of musical processes. I’ll discuss a framework for predictive musical interaction including examples from our lab, and consider how this work could be applied more broadly in HCI and robotics. This talk will cover material from this paper: https://arxiv.org/abs/1801.10492
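To make "predictive musical interaction" concrete, here is a toy sketch in which a first-order Markov model (a deliberately simple stand-in for the deep models discussed in the talk) fills in notes whenever the human leaves a gap; the notes and the None-for-silence convention are invented for illustration.

```python
import random
from collections import defaultdict

# Learn note-to-note transitions from a performance history (MIDI notes).
history = [60, 62, 64, 62, 60, 62, 64, 65, 64, 62, 60]
transitions = defaultdict(list)
for a, b in zip(history, history[1:]):
    transitions[a].append(b)

def predict_next(note):
    """Sample a plausible continuation of the last note heard."""
    options = transitions.get(note)
    return random.choice(options) if options else note

# Interaction loop: None marks a beat where the user played nothing,
# so the system plays its prediction instead.
last_note = 60
for user_note in [62, None, 64, None, None, 62]:
    note = user_note if user_note is not None else predict_next(last_note)
    print("play:", note, "(predicted)" if user_note is None else "(user)")
    last_note = note
```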
-
Næss, Torgrim Rudland; Martin, Charles Patrick & Tørresen, Jim
(2019).
A Physical Intelligent Instrument using Recurrent Neural Networks.
Universitetet i Oslo.
-
Fjeld, Matias Hermanrud & Tørresen, Jim
(2018).
3D Spatial Navigation in Octrees with Reinforcement Learning.
Universitetet i Oslo.
-
Wallace, Benedikte & Martin, Charles Patrick
(2018).
Predictive songwriting with concatenative accompaniment.
Universitetet i Oslo.
-
Tørresen, Jim; Teigen, Bjørn Ivar & Ellefsen, Kai Olav
(2018).
An Active Learning Perspective on Exploration in Reinforcement Learning.
Universitetet i Oslo.