Hi, I'm Øyvind Ole B. Roberg

Aliases: Takotzo, Batagames

I am a technical artist from Norway, currently pursuing a Bachelor's degree in Game and Entertainment Technology at Nord University.

Folding Robot

I designed and animated a six-limbed character in Maya, mastering 3D workflows and earning an A-grade for innovative features like a transforming shell.

For a recent school assignment, I worked extensively in Maya to design and develop a unique character featuring six or more limbs. This project encompassed the full 3D production pipeline, highlighting my versatility and proficiency in multiple areas of 3D art.

I began by creating concept art to define the character's appearance and functionality. From there, I modeled the character with attention to proper topology for efficient rigging and animation. Rigging the skeleton came next, where I implemented a robust system to manage the complexities of animating additional limbs.

I also performed UV mapping to prepare the model for texturing and set up a custom controller to streamline the animation process. Finally, I brought the character to life by animating it and assembling all components into a cohesive, polished presentation. This project demonstrates my ability to combine technical expertise and artistic vision to bring creative concepts to fruition.

One aspect I’m particularly proud of is the controller I created, which allows the character to curl into a ball and move its eye independently. This controller significantly simplified the animation process and added versatility to the character’s movements.

From the start, I set out to challenge myself by designing a character that could transform by curling into a ball. This required careful planning and problem-solving, particularly for the shell construction. My solution involved separate plates that functioned as armor when the character was walking and formed a cohesive wheel when curled up. This design not only enhanced the character’s functionality but also pushed me to think creatively about animation and rigging mechanics.

Throughout the semester, we had classes where our teacher introduced new techniques in Maya, walking us step-by-step through the process of creating an animated model. These lessons provided invaluable guidance as we worked on our characters.

Each week, I received constructive feedback from the teacher, which helped me refine and improve my character at every stage. This feedback was instrumental in identifying areas for enhancement and uncovering errors I hadn’t noticed before.

My teacher’s encouragement and positive remarks about my work—especially his comment that he was very happy with my character—were both motivating and affirming.

In the end, the dedication I put into this project, combined with the insights gained through feedback, resulted in an A grade. The experience not only deepened my understanding of the character creation process but also strengthened my skills in problem-solving and attention to detail.

Projection Mapping with MR

Created a unique experience where we combined the technologies of Projection Mapping and Mixed Reality.

For this project I worked on setting up the projectors, aligning the digital world with the real world, and connecting the MR headsets with the projectors.

I worked in Unity and found a package that let us easily connect cameras from the Unity scene to the projectors, so they could show the parts of the scene we wanted to be visible in real life. I then ran some tests locally to check whether it was possible, by building the stage in the scene and playing it in the headset.
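As a rough illustration of the idea only (a minimal sketch using Unity's built-in multi-display API rather than the exact package we used), mapping one camera to each projector output looks something like this:

```csharp
using UnityEngine;

// Minimal sketch (not the actual package we used): routes one Unity camera
// to each connected projector using Unity's built-in multi-display API.
public class ProjectorOutput : MonoBehaviour
{
    // One camera per projector, assigned in the Inspector.
    [SerializeField] private Camera[] projectorCameras;

    private void Start()
    {
        // Display 0 is the main monitor; additional displays are the projectors.
        for (int i = 1; i < Display.displays.Length && i <= projectorCameras.Length; i++)
        {
            Display.displays[i].Activate();            // turn on the extra output
            projectorCameras[i - 1].targetDisplay = i; // send this camera to that projector
        }
    }
}
```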

After the test I started working on connecting multiple headsets to the same computer. The stage setup is multiple projectors controlled by a PC, while the MR headsets connect wirelessly. I added networking packages to Unity and made the PC act as the server host that controls the entire experience. All of the users' headsets are then synced, so everyone sees the same thing, and when a user interacts with an object in the scene the change updates for them all.
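As a minimal sketch of that host/client split, assuming Unity's Netcode for GameObjects (the exact packages we used may differ), the projector PC hosts the session, the headsets join as clients, and shared objects replicate their state to everyone:

```csharp
using Unity.Netcode;
using UnityEngine;

// Minimal sketch, assuming Unity's Netcode for GameObjects.
// The projector PC runs as host; each MR headset joins as a client.
public class ExperienceSession : MonoBehaviour
{
    [SerializeField] private bool isProjectorPc;    // set true on the PC build

    private void Start()
    {
        if (isProjectorPc)
            NetworkManager.Singleton.StartHost();   // PC controls the experience
        else
            NetworkManager.Singleton.StartClient(); // headsets connect wirelessly
    }
}

// Example of a synced object: any client can poke it, the host owns the state,
// and the change replicates to every headset and the projectors.
public class SyncedProp : NetworkBehaviour
{
    private readonly NetworkVariable<bool> isHighlighted = new NetworkVariable<bool>();

    [ServerRpc(RequireOwnership = false)]
    public void ToggleHighlightServerRpc()
    {
        isHighlighted.Value = !isHighlighted.Value; // host-authoritative update
    }
}
```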

Important: The assets seen in the experience were not made by us; they were bought from the asset store.

This is a school project where we were supposed to explore unique ways of using technologies. My group wanted to study the possibilities of combining a virtual stage, or a projection-mapped room, with MR headsets, so that we could bring objects out of the walls and into the room with the observers. Because the headsets keep the real world fully visible, users can see the images shown on the walls as well as any objects that pass through them into the room.

We were working on this following the school schedule until our teacher approached us and asked us to work full time on the project for two weeks so it could be showcased to various stakeholders. The team agreed, and we started producing a scene as a proof of concept to show them. We went through multiple ideas of what we could show and landed on an underwater experience, with a turtle floating back and forth and fish swimming around.

We wanted to add some interaction with the turtle when it was in the space with the users, so we added a feature where, whenever a user touched the turtle, it would stop and look around for a few seconds before continuing on its journey.
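As a minimal sketch of that interaction (names and numbers are illustrative, and the networking is left out), the touch-and-pause behaviour could look like this:

```csharp
using System.Collections;
using UnityEngine;

// Minimal sketch of the turtle interaction: when a user's hand collider
// enters the turtle's trigger, the turtle pauses, "looks around" for a
// few seconds, then resumes its path.
public class TurtleInteraction : MonoBehaviour
{
    [SerializeField] private float pauseSeconds = 3f;
    [SerializeField] private float lookSpeed = 45f;   // degrees per second
    private bool isPaused;

    private void OnTriggerEnter(Collider other)
    {
        // Assumes the headset's hand objects are tagged "PlayerHand".
        if (!isPaused && other.CompareTag("PlayerHand"))
            StartCoroutine(PauseAndLookAround());
    }

    private IEnumerator PauseAndLookAround()
    {
        isPaused = true;   // the movement script checks this flag and stops
        float elapsed = 0f;
        while (elapsed < pauseSeconds)
        {
            // Gently sway left and right as if looking around.
            transform.Rotate(0f, Mathf.Sin(Time.time * 2f) * lookSpeed * Time.deltaTime, 0f);
            elapsed += Time.deltaTime;
            yield return null;
        }
        isPaused = false;  // resume the journey
    }
}
```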

We received a lot of positive feedback from teachers, stakeholders, and fellow students who have tested the experience we created.

This project was a collaborative effort with valuable contributions from the following team members:

Rive to Unity Implementation

For Flashback, I used Rive to create an interactive UI, integrating dynamic animations and events into Unity.

I worked with a UI creation tool called Rive to design and implement an interactive user interface. Rive utilizes inputs to switch between animations, which can also send signals to Unity for further interaction. After creating the UI and configuring all the events within Rive, I integrated it into Unity by developing a script to manage these events and execute the appropriate methods, such as handling resume and exit button actions.
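As an illustration of the pattern (a simplified sketch, not the project's actual script), the Unity side can map Rive event names such as "Resume" and "Exit" to UnityEvents that designers wire up in the Inspector:

```csharp
using System;
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.Events;

// Minimal sketch of the event-handling pattern: Rive reports named events,
// and this component maps each name to a UnityEvent so behaviour can be
// hooked up in the Inspector.
public class RiveEventDispatcher : MonoBehaviour
{
    [Serializable]
    public class NamedEvent
    {
        public string riveEventName;   // e.g. "Resume" or "Exit"
        public UnityEvent onFired;
    }

    [SerializeField] private List<NamedEvent> handlers = new List<NamedEvent>();

    // The script that advances the Rive state machine calls this once per
    // reported event, passing the event's name.
    public void HandleRiveEvent(string eventName)
    {
        foreach (var handler in handlers)
        {
            if (handler.riveEventName == eventName)
                handler.onFired.Invoke();
        }
    }
}
```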

Some UI elements needed to be dynamically updated at runtime, so I designed accessible methods to allow team members to easily modify these elements during development. Additionally, certain elements, such as sliders, required sending their values to inform other components about their current state.
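A minimal sketch of what those accessible methods can look like (illustrative names; the actual call into the Rive runtime is omitted):

```csharp
using UnityEngine;
using UnityEngine.Events;

// Minimal sketch of the "accessible methods" idea: teammates call public
// setters to update UI values at runtime, and slider values are broadcast
// so other systems can react to them.
public class HudBinding : MonoBehaviour
{
    // Raised whenever the Rive slider reports a new value (e.g. volume).
    public UnityEvent<float> onVolumeSliderChanged;

    // Called by gameplay code to push a new value into the UI.
    public void SetHealth(float normalizedHealth)
    {
        // Forward to the Rive state machine's number input here
        // (omitted; depends on the Rive runtime API).
        Debug.Log($"Health UI set to {normalizedHealth:0.00}");
    }

    // Called by the Rive event/input reader when the slider moves.
    public void ReportVolumeSlider(float value)
    {
        onVolumeSliderChanged.Invoke(value);
    }
}
```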

This was my first experience working with Rive, and it provided an excellent opportunity to deepen my understanding of the tool and its integration with Unity. I utilized various resources provided by Rive to learn how its system communicates with Unity, enabling me to effectively extract inputs and events and implement them in a cohesive workflow.

The UI was developed for a game called Flashback, a university project that ran from October to December. In the game, players experience a narrative where they recall events from the past and relive them through engaging minigames. My role focused on designing and implementing an interactive and dynamic user interface, ensuring it enhanced the game's storytelling and gameplay.

The project did not necessarily require Rive implementations and could have been completed using Unity alone. However, it provided a valuable opportunity to explore and experiment with a new tool, broadening my technical skillset.

Working with a team of fellow students, I contributed to the technical development while collaborating closely with designers and programmers to ensure the UI aligned with the game's overall vision. One key challenge was integrating Rive's unique animation system with Unity, which required researching and applying innovative solutions.

This project taught me the value of experimenting with new tools and technologies, as well as the importance of creating accessible systems that empower team members. Looking back, I would like to explore ways to further optimize the Rive-Unity integration for better performance and extend the UI's interactivity for future projects.

Check out Flashback on itch.io!

Have a look at the C# script for displaying the Rive asset in Unity.

Have a look at the C# script that handles events based on the current Rive scene.

Have a look at the C# script that handles the HUD controllers.

This project was a collaborative effort with valuable contributions from the following team members:

Blend Shape Creation

Retopologized a face in Maya and used its blend shape tools to create a set of blend shapes compatible with Apple's ARKit.

First I had to create a face to work on, so with help from a friend I performed a 3D scan of my face and imported it into Maya. With the scanned head, I started retopologizing my face. I studied the muscles of the face, took a photo of myself, and sketched the different muscles over it. I also looked at a lot of examples of what the topology of a face should look like.

When the face was retopologized, I started working on the blend shapes. I followed the guidelines on Apple's ARKit site and used techniques my teacher showed us in class, like mirroring and soft select. Every expression that had a left and a right variant (e.g. left and right blink) was modeled on a single shape but later split into two separate face shapes so that they would fit ARKit's naming convention.

I now have a face that is fully compatible with Apple's ARKit and usable in applications such as Unity and Unreal Engine.
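To illustrate why the ARKit naming matters, here is a minimal Unity sketch (assumed names and setup) that drives the blend shapes by their ARKit names at runtime:

```csharp
using UnityEngine;

// Minimal sketch: once the blend shapes use ARKit's names (eyeBlinkLeft,
// eyeBlinkRight, jawOpen, ...), incoming face-tracking values can be
// applied by name.
public class ArKitBlendShapeDriver : MonoBehaviour
{
    [SerializeField] private SkinnedMeshRenderer face;

    // Called with an ARKit blend shape name and a 0-1 coefficient
    // from whatever face-tracking source is in use.
    public void ApplyCoefficient(string arKitName, float coefficient)
    {
        int index = face.sharedMesh.GetBlendShapeIndex(arKitName);
        if (index >= 0)
            face.SetBlendShapeWeight(index, coefficient * 100f); // Unity weights are 0-100
    }
}
```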

This was a school assignment I had to complete as part of my CG class. As a class, we received thorough guidance through the process of face retopology, studying some of the face's muscles and where to create fitting loops in the mesh. After the retopology, we started working on the different facial expressions, following the naming conventions of Apple's ARKit.

I have not received a grade yet, but I did receive positive feedback from the teachers.

Games

Here are some of the games that I've worked on.

A game created as part of a 24-hour game jam: a platformer where you play as multiple characters in a trench coat, and every time you jump you leave a member behind.

I helped with brainstorming and coming up with the idea, and I programmed parts of the game and the player controller. I also built the levels based on the designer's sketches. The game was created in Unity.
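As a rough sketch of the core mechanic (illustrative, not the actual jam code), each jump pops the bottom member off the stack and leaves it behind at the jump spot:

```csharp
using System.Collections.Generic;
using UnityEngine;

// Minimal sketch: the player is a stack of characters in a trench coat;
// each jump drops the bottom member where you jumped from.
public class TrenchcoatStack : MonoBehaviour
{
    [SerializeField] private List<GameObject> members = new List<GameObject>();
    [SerializeField] private float jumpForce = 6f;
    private Rigidbody2D body;

    private void Awake() => body = GetComponent<Rigidbody2D>();

    private void Update()
    {
        // Ground checks are omitted to keep the sketch short.
        if (Input.GetButtonDown("Jump") && members.Count > 1)
        {
            // Drop the bottom member of the stack at the jump spot.
            GameObject leftBehind = members[members.Count - 1];
            members.RemoveAt(members.Count - 1);
            leftBehind.transform.SetParent(null);
            leftBehind.transform.position = transform.position;

            body.AddForce(Vector2.up * jumpForce, ForceMode2D.Impulse);
        }
    }
}
```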

Try out the game on itch.io!

Created as part of a school project: maneuver around Gearhaven and fight corrupt robot oligarchs as Spring~Boy. Equipped with boxing gloves and spring arms, you can punch enemies from afar, but also grab onto walls and pull yourself towards them at speed in this high-speed, high-action game.

On this project I was the team leader and a 3D artist. I vowed to myself that I would not touch programming that semester, as I wanted to explore other roles in the industry, like artist and leader.

Have a look at the game on itch.io! (Requires VR headset with joystick controllers to play)

Find more games I've made or worked on at itch.io!

Tools & Software

Icons below represent tools I've worked with.

Autodesk Maya

Adobe Photoshop

Adobe Illustrator

Adobe Premiere Pro

DaVinci Resolve

Rive

GitHub

FMOD