
Week 5

In this class, the instructor introduced TouchDesigner and walked us through some of its basics.

TouchDesigner is a visual development platform for real-time interactive content creation. It provides a powerful node-based interface that lets users build complex real-time graphics and interactive applications by connecting data streams and processing nodes. With it, users can process audio, video, 3D graphics, sensor data and more to create rich, varied real-time visual effects. It is widely used in fields such as stage performances, art installations, exhibitions, and interactive events.
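As a small taste of how that node-based workflow can also be scripted, operators can be created and wired from Python inside a Text DAT. The snippet below is only a rough sketch of the idea rather than anything we built in class, and the network path and parameter names are my own assumptions:

    # Rough TouchDesigner Python sketch (run from a Text DAT).
    # Builds a tiny TOP chain: noise -> level -> null.
    base = op('/project1')                      # assumed network location

    noise = base.create(noiseTOP, 'noise_src')
    level = base.create(levelTOP, 'level_adjust')
    out = base.create(nullTOP, 'out_null')

    # Wire the operators together, left to right.
    noise.outputConnectors[0].connect(level)
    level.outputConnectors[0].connect(out)

    # Tweak a couple of parameters (names assumed from the standard TOPs).
    noise.par.period = 2.5
    level.par.opacity = 0.8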


Week 4

In this week’s course, we presented our initial ideas for our artifacts to the instructor, who gave us some useful suggestions that helped us settle on a clearer positioning and development direction for our project.

Above is my initial idea for Project 2; I drew a storyboard for it.


Story Outline

  • On the eve of the deadline, the painter works non-stop to finish the piece. He is exhausted but keeps painting.
  • While painting, he becomes too drowsy and falls asleep without realizing it, his head resting on his drawing board.
  • Suddenly, the painter feels himself sinking, passing through a fantastical tunnel into a strange yet familiar scene.
  • When he looks around, he finds that this scene is actually the world of the painting he is creating.

The teacher suggested that, during the journey through the tunnel, some floating objects could be added around the protagonist, such as his drawing board, scale ruler, computer mouse, keyboard and other items, sometimes drifting in front of him and sometimes behind him.


Week 3

In this week’s course, we learned how to use Unreal VCam to set up a real-time virtual camera and add it to our Unreal project. This way, we can view our scene in real time through a mobile phone, somewhat like using VR.

While using Unreal VCam, I ran into a problem when filling in the IP address: I found that my IP address and the IP address of the classmate next to me were exactly the same, which could cause my phone to fail to connect to my computer. The first thing to check is that the phone and computer are in the same network environment; on a school or company network, machines may share the same IP segment, which is why the addresses matched. In that case, you can try connecting with a different device or reconfiguring the network to get a different IP address.
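Because the connection depends on typing the computer's address into the app correctly, it helps to confirm what that address actually is before troubleshooting anything else. This is just a generic Python sketch for printing the machine's LAN IP, not something specific to Unreal VCam; the 8.8.8.8 address is only used to pick a network route, and no data is actually sent:

    import socket

    def local_ip() -> str:
        """Return the LAN address this machine would use to reach an outside host."""
        s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        try:
            # Connecting a UDP socket sends nothing; it just asks the OS
            # which local interface and address it would route through.
            s.connect(("8.8.8.8", 80))
            return s.getsockname()[0]
        finally:
            s.close()

    if __name__ == "__main__":
        print("Enter this address in the VCam app:", local_ip())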

Unreal VCam is a powerful tool that turns your mobile device into a virtual camera to control and preview scenes in Unreal Engine. This is very useful for filmmaking, virtual production, and real-time animation. Using the Unreal VCam application, you can adjust the camera angle, focal length, and other camera settings in real time as you move around the scene, which greatly improves work efficiency and creative freedom.

Common problems and solutions:

  • Connection failure: check that the IP address and port number are correct and that the firewall is not blocking the connection (see the small reachability check after this list).
  • Lag issues: make sure the network connection is stable, and try 5 GHz Wi-Fi or a wired connection for the computer.
  • App crashes: try restarting the device and the app, and make sure the app is compatible with your Unreal Engine version.
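For the connection-failure case, a quick reachability test from another machine on the same network can help narrow things down. Keep in mind that a check like this only confirms basic TCP reachability, not every protocol the tools use; the host and port below are placeholders to replace with the values from your own setup:

    import socket

    def can_reach(host: str, port: int, timeout: float = 3.0) -> bool:
        """Return True if a TCP connection to host:port succeeds."""
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    # Placeholder values: use the IP and port shown in your own VCam settings.
    print(can_reach("192.168.1.20", 2049))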


Week 2

This week’s course was similar to last week’s and was again split into two groups: one group of students stayed in the classroom for the taught session, while the other group went to the workshop for practice. In this week’s session, we mainly discussed the basic requirements and content of this semester’s Artifact project and helped each student develop a preliminary plan. We also held individual discussions to help students push their Artifact projects forward.

We first clarified the goals and expected results of the Artifact project. Then, based on each student’s interests and professional fields, we helped them identify suitable Artifact topics and provided relevant resources and guidance. We encouraged students to choose a project topic that can show their unique talents and creativity.

We had a detailed discussion with each student to understand their ideas, schedules, and resource needs. We provided some suggestions and advice to help students develop a reasonable schedule and specific task assignments. We also emphasized the importance of good communication and collaboration, and encouraged students to actively communicate and cooperate with mentors and other students, support each other and learn from each other’s experience.

We also summarized and expanded on each student’s project, offering creative ideas and suggestions to help them further improve and extend their work. At the same time, we encouraged them to keep an open mind and to adjust and improve their project plans at any time to adapt to possible changes and challenges.

This week’s course helped students further clarify the requirements and content of the Artifact project and provided them with targeted guidance and support. Through individual discussions and project expansion, students received more inspiration and help, laying a good foundation for their own project planning and completion. We believe that in the following studies, students will be able to give full play to their talents and creativity and successfully complete their own Artifact projects.


Week 1

In the first week of our course, the students in our class were divided into two groups and given different learning content. The first group stayed in the classroom for regular lessons, while the second group, which was my group, was taken to a dedicated motion capture studio to experience and learn the basic operation of the Vicon system.

Vicon is a high-precision motion capture system widely used in film and television, game development, sports science and virtual reality. It accurately records complex dynamic movements by installing a series of reflective markers on the captured object and then using multiple high-speed cameras to simultaneously capture the position and movement trajectory of these markers. This data can be used to drive virtual characters and make animations more realistic.

After entering the motion capture studio, we first learned about the basic components of the Vicon system, including cameras, reflective markers, motion capture suits and data processing software. We also learned how to properly set up and calibrate these devices to ensure that the captured data is accurate.

Next, we were fortunate enough to experience using the motion capture equipment in person. Several students put on special motion capture suits covered with reflective markers, whose positions were carefully chosen so that the cameras could accurately capture their movements from every angle.

Before we officially started the motion capture, the technicians demonstrated how to set up the Vicon system. They explained in detail the placement of the camera, the calibration process, and how to use the dedicated software to monitor and adjust the capture environment. We learned that good calibration and setup are key steps to ensure the accuracy of the captured data.

When all the preparations were complete, the students wearing the motion capture suits began to perform a series of movements. The high-speed cameras in the studio captured every subtle change in movement in sync and transmitted the data to the computer in real time. Through the software interface, we could see a virtual 3D model accurately reproducing the students’ movements. This data can then be exported and used for a variety of applications such as animation production and motion analysis.
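The exported data is essentially a table of per-frame marker positions, which makes it easy to process with ordinary scripts. As a rough illustration (the file name and column layout below are hypothetical, not the exact Vicon export format), a few lines of Python are enough to load one marker's trajectory and measure how far it travelled:

    import csv
    import math

    # Hypothetical export: one row per frame with columns
    # frame, marker, x, y, z (positions in millimetres).
    positions = []
    with open("capture_export.csv", newline="") as f:
        for row in csv.DictReader(f):
            if row["marker"] == "RightHand":
                positions.append((float(row["x"]), float(row["y"]), float(row["z"])))

    # Total distance the marker travelled over the take.
    path_length = sum(math.dist(a, b) for a, b in zip(positions, positions[1:]))
    print(f"RightHand marker travelled {path_length / 1000:.2f} m")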

The whole process not only gave us a deeper understanding of motion capture technology, but also allowed us to experience the complexity and fun of its actual operation. Through this practice, we not only learned theoretical knowledge, but also mastered some basic operating skills, laying a solid foundation for future learning and application.

All in all, this was a very interesting and inspiring course.

Concept of a World:

  • A world is defined by its community, dimension, and actions.
  • It encompasses beliefs, voice, and the ability to advance or collapse.
  • Worlds have mythic figures, members, and rules that might appear arbitrary to outsiders.
  • A world serves the common welfare of its members and provides magic powers and permissions to live uniquely within it.
  • It creates relevance and meaning through agreed actions and undergoes innovations and upheavals to stay active.

A world is a vessel for stories and structures but is always evolving.

These notes also emphasize the importance of understanding the user rather than relying solely on the designer’s intuition, using mixed methodologies such as observational studies, interviews, personas, scenarios, user journeys, prototyping, testing, and data analysis.


1.10: Rendering and editing stage

Finally, Chris and Carlos rendered and output the video.

After the rendering and editing were completed, everyone discussed the results, shared opinions, and we output several different versions. Together we chose the most suitable one, and finally we worked together to complete the project.


1.9: Making the animation in Unreal

This stage went relatively smoothly by comparison. There were some minor problems, such as the character’s face deforming and parts of the mesh clipping through each other, but I found these were caused by a version issue, so I redid the retargeting and solved the problem. I then finished the animation for one character first and brought it to the meeting for everyone to see.

Then I completed the animations for the other characters. Finally, all that remained was to attach each character’s own instrument. Following online tutorials on how to bind weapons to characters, I ran into trouble in the Blueprint, so I asked the teacher for help in class and solved the problem.
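For reference, the same kind of attachment can also be scripted through Unreal's editor Python API instead of Blueprints. The sketch below is only an illustration of the idea, not the setup we actually used; the socket name and the assumption that the character and instrument are the two selected actors are hypothetical:

    import unreal

    # Hypothetical setup: the character and the instrument prop are the
    # first two actors selected in the level.
    actors = unreal.EditorLevelLibrary.get_selected_level_actors()
    character, instrument = actors[0], actors[1]

    snap = unreal.AttachmentRule.SNAP_TO_TARGET

    # Attach the prop to a hand socket on the character.
    # "hand_r_socket" is a placeholder; use the socket defined on your own skeletal mesh.
    instrument.attach_to_actor(
        character,
        "hand_r_socket",
        location_rule=snap,
        rotation_rule=snap,
        scale_rule=unreal.AttachmentRule.KEEP_WORLD,
        weld_simulated_bodies=False,
    )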

At the same time, Chris also completed the special effects and sent us a test.


1.8: Solving the mesh problem in UE

Since time was very tight by this point and the animation had to be completed as soon as possible, Carlos, J and I held a small meeting at school to discuss how to solve the problem.

To solve this problem, we came up with the following plan:

  • I would work on the animation first to see how to fix the bass player’s forearm, then add the instrument to the character and adjust the animation. Once it could be exported, I would check whether rebinding caused any problems.
  • The second option was for the team to send me a MetaHuman; I could modify it as needed and test whether the animation had any problems. If not, we could use the MetaHuman.
  • The third was to check whether the problem persisted if the clothes were swapped while keeping the same bones, and then to try the DazToUnreal plug-in again, since the issue might be with importing bones via FBX.

So I ran some tests on these options.

In Maya, I successfully used ADV for the bone binding, so that even if the DazToUnreal plug-in failed, I could still animate in Maya.
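As a rough illustration of what that manual fallback looks like (this is not the exact ADV workflow, and the joint and mesh names below are placeholders), binding a mesh to a skeleton in Maya comes down to a single skinCluster call:

    # Run inside Maya's Script Editor (Python tab).
    import maya.cmds as cmds

    # Placeholder names: replace with your own root joint and character mesh.
    root_joint = "Root_M"
    character_mesh = "body_geo"

    # Bind the mesh to the whole joint hierarchy with a smooth skin.
    skin_cluster = cmds.skinCluster(
        root_joint,
        character_mesh,
        toSelectedBones=False,   # use the full hierarchy under the root joint
        bindMethod=0,            # closest-distance bind
        normalizeWeights=1,      # interactive weight normalization
        name="body_skinCluster",
    )[0]
    print("Created:", skin_cluster)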

Fortunately, my team members solved the DazToUnreal problem and passed the project files to me, so we finally decided to use the IK Retargeter to create the animations in UE.


1.7: Finding methods to solve problems 02

Since there was a problem installing the plug-in last week, I booked a project clinic this week, but the problem still wasn’t resolved. So we kept looking for solutions, such as whether we could do the binding directly in UE. In the meantime, Chris will work on the special effects production.


1.6: Finding methods to solve problems

This week we held an online meeting. Here are the meeting notes:

  • J has finished 2 characters – will be done Wednesday/Thursday depending on when YouYou gets characters over
  • Yi has Unreal file – going to test animations. Will be done beginning of next week
  • Chris has received and opened Unreal project from Hanyue – is going to start environment/atmosphere VFX. Will be done next Monday then will do character VFX
  • Chris & Carlos will speak offline about scene capture
  • Will speak with Youyou about assets

Here is a showcase of our scene:

Solving the mesh problem in UE:

I looked for some tutorials online to solve this problem and found that importing a Daz model into UE has to go through a plug-in called DazToUnreal. So in class, the teacher helped me find some tutorials on installing and using DazToUnreal.

I went back and followed the tutorial to install it. The plug-in has to be installed into Unreal through Daz, so I installed Daz first and successfully installed the plug-in inside Daz. Daz showed that the plug-in had been installed into UE successfully, but it could not be found in UE’s plug-in manager.
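One quick way to sanity-check this kind of "installed but invisible" situation is to look at the project files directly: a UE project's .uproject file is plain JSON with a Plugins list, and project-level plugins live in a Plugins folder next to it. The sketch below is a general-purpose check along those lines, not part of DazToUnreal itself, and the paths are placeholders:

    import json
    from pathlib import Path

    # Placeholder path: point this at your own project.
    project_file = Path("C:/Projects/BandScene/BandScene.uproject")
    plugin_name = "DazToUnreal"

    # 1) Is the plugin listed (and enabled) in the .uproject?
    data = json.loads(project_file.read_text(encoding="utf-8"))
    entries = [p for p in data.get("Plugins", []) if p.get("Name") == plugin_name]
    print("Listed in .uproject:", entries or "not listed")

    # 2) Does a .uplugin descriptor exist under the project's Plugins folder?
    plugins_dir = project_file.parent / "Plugins"
    found = list(plugins_dir.glob(f"**/{plugin_name}.uplugin"))
    print("Plugin files on disk:", found or "none found")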