
My name is Artem MELNYK; I am a Software Engineer at Qenvi Robotics.

I worked at Amadeus as a C++ Software Development Engineer, and at INRIA (Sophia-Antipolis) as a Research and Development Engineer in the HEPHAÏSTOS project.


I worked at Cergy-Pontoise University as a research and teaching assistant for two years.


I did my Ph.D. in co-tutelle between the Université de Cergy-Pontoise and Donetsk National Technical University (France/Ukraine), defended in 2014, under the supervision of Prof. Patrick Hénaff (Neurocybernetics team of Prof. Philippe Gaussier) and Prof. Volodymyr Borysenko.


My previous research focused on physical human-robot interaction and its quantitative analysis (1), (2), and on tactile perception for a robotic arm (3).


My current research focuses on rehabilitation robotics and its coupling with virtual and immersive reality (4), (5), motion tracking for gait rehabilitation, biologically inspired robot control (6), and tensegrity robots (7).


Keywords: bio-inspired control, physical human-robot interaction, humanoid robot control, measurement, scientific instrumentation.


Please find my Curriculum Vitae or my one-page CV.


My LinkedIn, ResearchGate, and Google Scholar pages.


https://orcid.org/0000-0003-3991-2720



--- Technical and Personal skills ---


Development and simulation: C/C++, Matlab, Python. Basic ability: R, Qt5, git, CMake, Java.
Development of applications for signal processing, data acquisition, and automation/control purposes.
Industry Software: Proficient in: Matlab/Simulink. Basic ability: LabVIEW, DesignSpark, Quartus II.
Mathematical modeling of complex industrial systems for analysis or control.
VR and Immersive technologies: Basic ability: Unity, VTK, Blender.
3-D virtual world creation for immersive environments with gamification.
Internet of Things: Basic ability with: Spark (Photon), Raspberry Pi, Arduino Yún.
Development of basic interactive systems that communicate between sensors and a master program using device APIs.
Micro-controllers: Proficient in: Atmel AVR, ARM and Phidgets. Basic ability with: Feather, Cyclone II.
Realization of several real-world microcontroller applications for human motion measurement and robot control.
Sensors: Proficient in: Inertial Sensors, Force/Tactile Sensors, Distance Sensors. Basic ability: Kinect.
Sensor integration into the control loop and building real-world robotic applications for control or interaction.
Automation and Robotics: Position control, Admittance or Impedance Control, Neural Control.
Analyzing and Reporting data: Exploratory data analysis, Reports, State-of-the-Art, Scientific Writing.
Teamwork: Contributing, Cooperation, Expanding Ideas, Research


--- Previous Employment ---

French Institute for Research in Computer Science and Automation (INRIA), Sophia-Antipolis
R&D Engineer, August 2016 - December 2018
As an engineer in the HEPHAISTOS project, I contributed to the creation of a new robot for rehabilitation purposes and to the development of C/C++/Qt5 modules for motion-analysis software.

Cergy-Pontoise University, Information Processing and Systems Teams, Cergy-Pontoise
Temporary Lecturer and Research Assistant, September 2014 - August 2016
I participated in ETIS laboratory research on neural network control architectures, for example to implement physical interaction between a human being and a robot joint covered with artificial skin.

Donetsk National Technical University, Donetsk (Ukraine)
Senior Researcher / Senior Lecturer / Assistant, August 2003 - April 2014
My research was devoted to the dynamics of thermal and electromagnetic phenomena in complex industrial systems based on asynchronous motors. I was responsible for a group of 3 researchers working on a research project.


--- List of my Professional Development Courses & Training Seminars ---


C++ Software Development: White Belt and Yellow Belt, by Moscow Institute of Physics and Technology & Yandex on Coursera (2019).

Unity Basics (2018)

Exchange and debate during scientific actions (2018)

Linux Administration: installation and system administration (2017)

Deep Learning Spring School (2017)

Object-oriented programming in C++ (2016)

--- Current free-time projects ---

Abstract: 
We present a neuronal architecture to control a compliant robotic model of the human vertebral column for postural balance. The robotic structure is designed using the principle of tensegrity, which makes it lightweight, auto-replicative, flexible, robust to perturbations, and endowed with multiple degrees of freedom. We model the central pattern generators of the spinal cord with a network of nonlinear Kuramoto oscillators coupled internally and externally to the structure and error-driven by a proportional-derivative (PD) controller using an accelerometer for feedback. This coupling between the two controllers is original, and we show that it serves to generate controlled rhythmical patterns. For certain coupling parameters, we observe intervals of synchronization and of resonance of the neural units with the tensile structure that permit smooth control and balance. We show that the top-down PD control of the oscillators flexibly absorbs external shocks proportionally to the perturbation and converges to steady-state behaviors. We then discuss our neural architecture as a model of motor synergies for compliance control, and tensegrity structures for soft robotics. The 3D-printed model is provided, as well as a movie, at https://sites.google.com/site/embodiedai/current-research/tensegrityrobots.
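The control idea in this abstract, Kuramoto phase oscillators receiving a common top-down PD correction from a balance error, can be sketched numerically. This is a minimal illustration only: the gains, natural frequencies, and toy second-order "posture" dynamics below are invented for the example and are not the paper's model.

```python
import math

def simulate_cpg(n=4, k=1.5, kp=2.0, kd=0.5, dt=0.001, steps=20000):
    """Coupled Kuramoto phase oscillators whose common drive is corrected
    by a PD term computed from a toy posture error (accelerometer stand-in)."""
    omega = [2 * math.pi * (1.0 + 0.05 * i) for i in range(n)]  # natural frequencies
    theta = [0.1 * i for i in range(n)]                         # initial phases
    tilt, tilt_rate = 0.5, 0.0                                  # toy posture state
    for _ in range(steps):
        u = kp * (0.0 - tilt) - kd * tilt_rate                  # top-down PD correction
        # synchronous Euler update of all phases (coupling pulls them together,
        # u shifts the common rhythm without breaking phase locking)
        theta = [theta[i] + dt * (omega[i] + u +
                 k * sum(math.sin(theta[j] - theta[i]) for j in range(n)) / n)
                 for i in range(n)]
        # crude second-order posture dynamics driven by the first oscillator
        accel = u + 0.1 * math.sin(theta[0]) - 1.0 * tilt - 0.8 * tilt_rate
        tilt_rate += dt * accel
        tilt += dt * tilt_rate
    return theta, tilt

theta, tilt = simulate_cpg()
```

Because the PD term enters every oscillator identically, it does not disturb the phase differences: the network phase-locks under the coupling while the PD drive damps the posture error toward zero.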

https://youtu.be/8TjRaTYpO6c

Poster here.

Motor trajectories prediction

In this poster, we outline a neural network (NN) model for learning and predicting trajectories online, i.e., learning without knowing the future of the trajectory. Our model is based on the idea that a trajectory can be learned and recognized by chunks.
Each chunk is learned by a prediction unit (PU). Hence, our problem can be divided into two parts: (1) how a PU can learn and recognize a given chunk online as a temporal category; and (2) how a global architecture allows many prediction units to compete to provide the prediction of the whole trajectory through time.

Abstract:
The future coexistence of humanoid robots and human beings presupposes knowledge of the cognitive mechanisms involved in interpersonal human physical and social interactions. The handshake is an example of a physical interpersonal interaction that plays an important social role, because it is based on physical and social couplings that lead to synchronization of motion. Studying the handshake is interesting for robotics because it can expand the behavioral properties of robots so that they interact with human beings in a more natural way. Previous experiments have shown that the phase of physical contact, which includes the tightening of hands, initiates a phenomenon of mutual synchrony. Considering the biologically rhythmic nature of the handshake, this paper proposes a robot controller inspired by the rhythmic structures of the human motor nervous system. The model is based on coupled rhythmic neurons constituting a central pattern generator, with learning mechanisms for the frequency of the movement perceived during the interaction. Experiments with a robot arm show that the robot can learn to synchronize its rhythm with the human rhythm imposed during the handshake.
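One well-known way to implement the frequency-learning mechanism this abstract describes is an adaptive Hopf oscillator (a dynamic Hebbian frequency-adaptation rule from the adaptive-frequency-oscillator literature): the oscillator's intrinsic frequency drifts toward the frequency of a periodic teaching signal standing in for the perceived human rhythm. This is a generic sketch, not necessarily the paper's exact model, and all parameter values are invented.

```python
import math

def learn_rhythm(f_teach=1.5, omega0=4.0, eps=2.0, gamma=8.0, mu=1.0,
                 dt=0.001, t_end=250.0):
    """Adaptive Hopf oscillator: the intrinsic frequency omega drifts toward
    the angular frequency (2*pi*f_teach) of the periodic teaching signal."""
    x, y, omega = 1.0, 0.0, omega0
    for k in range(int(t_end / dt)):
        F = math.cos(2.0 * math.pi * f_teach * k * dt)  # perceived rhythm (stand-in)
        r2 = x * x + y * y
        dx = gamma * (mu - r2) * x - omega * y + eps * F  # forced Hopf dynamics
        dy = gamma * (mu - r2) * y + omega * x
        domega = -eps * F * y / math.sqrt(r2)             # Hebbian frequency adaptation
        x, y, omega = x + dt * dx, y + dt * dy, omega + dt * domega
    return omega

learned = learn_rhythm()  # starts at omega0 = 4.0 rad/s, teaching rhythm is 1.5 Hz
```

After the transient, omega settles near 2*pi*1.5 ≈ 9.42 rad/s: the oscillator has "learned" the imposed rhythm and stays entrained to it, which is the behavior sought in the handshake experiments.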
Abstract:
Touch perception is an important sense to model in humanoid robots so that they can interact physically and socially with humans. We present a neural controller that can adapt the compliance of a robot arm in four directions, using as input the tactile information from an artificial skin and as output the estimated torque used as the admittance control-loop reference. This adaptation is done in a self-organized fashion with a neural system that first learns the topology of the tactile map when it is touched and then associates a torque vector to move the arm in the corresponding direction. The artificial skin is based on a large-area, ungridded piezoresistive tactile device that changes its electrical properties in the presence of contact. Our results show the self-calibration of a robotic arm (2 degrees of freedom), controlled in the four directions and derived combination vectors, by soft touches over the whole tactile surface, even when the torque is not detectable (force applied near the joint). The neural system associates each tactile receptive field with one direction and the correct force. We show that this tactile-motor learning gives better interactive experiments than admittance control of the robotic arm alone. Our method can be used in the future for adaptive humanoid interaction with a human partner.
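The core association step, binding a tactile receptive field to a torque vector for the admittance loop, can be illustrated with a toy winner-take-all scheme. This is far simpler than the paper's neural system (which learns the topology of an ungridded skin): here a 1-D tactile strip is pre-divided into fields, and a delta rule associates each winning field with the torque demonstrated while it is touched. Field count, learning rate, and the demonstration loop are all invented for the sketch.

```python
N_FIELDS = 8                                      # receptive fields on a 1-D strip
weights = [[0.0, 0.0] for _ in range(N_FIELDS)]   # learned torque (2 joints) per field

def winner(pos):
    """Winner-take-all receptive field for a touch at position pos in [0, 1)."""
    return min(int(pos * N_FIELDS), N_FIELDS - 1)

def learn(pos, torque, lr=0.2):
    """Move the winning field's torque vector toward the demonstrated torque."""
    w = weights[winner(pos)]
    w[0] += lr * (torque[0] - w[0])
    w[1] += lr * (torque[1] - w[1])

def recall(pos):
    """Torque reference for the admittance loop when the skin is touched at pos."""
    return tuple(weights[winner(pos)])

# demonstration phase: touches on the left half push the arm in +x,
# touches on the right half in -x (deterministic sweep, repeated)
for _ in range(40):
    for i in range(100):
        p = i / 100.0
        learn(p, (1.0, 0.0) if p < 0.5 else (-1.0, 0.0))
```

After the demonstration sweep, touching the left half recalls a positive-x torque and the right half a negative-x torque, so a soft touch produces motion even where the contact force itself generates no measurable joint torque.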

My hobby is painting (oil or gouache).