- This page documents the process behind two exploratory projects: “No Green No Dream” and “Collision Course: The Shard Struck”.
Group Project: No Green No Dream
Critical Reflection
This group project was really significant for honing my time management and for making the most of the hardware and software resources available to me. The development process was enjoyable, but a few realizations and shortcomings are worth mentioning.
First of all, I faced some hardware and software limitations while learning. Since I do not have Redshift Render installed at home, I had to rely heavily on a remote desktop connection to the university’s machines. This worked, but it came with its own set of problems: laggy responses due to slow internet, and frequent disconnections during the day when other students also request access. Despite all this, I got through the learning phase successfully.
In the shooting phase, I tried to create an HDRi map. My mentor recommended a 360 camera as the best option, so I shot many pictures at different exposures and later stitched them together in Photoshop. The exported HDRi quality was disappointing; getting good results would require more professional hardware and more careful software manipulation.
On the other hand, learning the plant FX segment was great. I was able to pick up several techniques that let me complete my work while avoiding the simulation part in Houdini entirely. I also worked out the VEX quaternion and matrix functions, which I think will help me a lot in my upcoming work.
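To keep a record of the kind of VEX I picked up, here is a minimal sketch (not the exact wrangle from the project) of how quaternion and matrix functions can orient scattered plant copies: it builds a frame from the surface normal, converts it to a quaternion, and adds a random twist that Copy to Points can read through the orient attribute.

```vex
// Point wrangle on the scattered points (illustrative sketch, not the project file).
// Build a frame from the surface normal, convert it to a quaternion, then twist it
// randomly around N so every copied plant gets its own rotation.
// Copy to Points reads p@orient when instancing the plant geometry.

vector up = {0, 1, 0};
// Avoid a degenerate frame when N is (almost) parallel to the up vector.
if (abs(dot(normalize(v@N), up)) > 0.999)
    up = {1, 0, 0};

matrix3 frame  = maketransform(normalize(v@N), up);  // z axis = N, y axis ~ up
vector4 orient = quaternion(frame);                  // matrix3 -> quaternion

// Per-point random twist of up to 360 degrees around the normal.
float   twist = rand(@ptnum) * radians(360.0);
vector4 spin  = quaternion(twist, normalize(v@N));

p@orient = qmultiply(spin, orient);
```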
Overall, this was a useful project for learning how to overcome hardware and software challenges, experiment with new techniques, and pick up advanced functions that will enhance future projects.
Final Video
Story
The advert tells the story of a man who is first looking at his phone for a taxi-booking service. As he ponders the options, a leaf floats past him and catches his attention. His eyes follow the leaf, and it leads him to an E-bike. Having come across this alternative means of getting around, he gives up the idea of taking a taxi and chooses the bike instead.
He approaches and touches the bike, and something magical follows: plants suddenly appear before him, and the scene fills with greenery. At first amazed by this unexpected change, he suddenly feels the urge to take the bike and explore. The next scenes carry him through a tunnel, and when he exits on the other side he finds a much greener world, a striking difference from the environment he came from.
This new world is a testament to the positive effect of his choice. The surroundings are far more beautiful and vibrant, showing how much change a single environmentally conscious decision can bring. He stands in awe, looking out at what a difference opting for sustainable transportation could make in creating a greener, more beautiful world.
Individual efforts
Process Video
For this project, we decided to create a story that revolves around sustainability. We came up with an advertisement in which a person decides to travel on an E-bike.
For the live-action CG part of this project, I came up with the idea of growing plants on the surface of the E-scooter. In the concept, the person comes up and touches the bike, and the plants start to grow from the point of contact. When the person rides the scooter away, the plants grow out of the ground in its wake.
FX Concept
For the concept, I came across the tutorial series “Hellolux – Houdini in Bloom”, which teaches how to build plants procedurally and make them grow without any simulations, using only smart techniques.
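The heart of that approach, as I understood it, is a growth attribute that spreads over the geometry instead of a simulation driving it. Below is a minimal sketch of the idea in a point wrangle, assuming a few hypothetical controls (contact_point, spread_speed, falloff_width, start_frame) rather than the tutorial’s exact setup; downstream nodes can then use f@growth to drive a Carve, pscale, or a blend between growth stages.

```vex
// Point wrangle (illustrative sketch of the no-simulation growth idea).
// f@growth goes from 0 to 1 as a front spreads outward from the touch point;
// the parameters below are assumed controls, not the tutorial's exact names.

vector contact = chv("contact_point");   // where the hand touches the bike
float  speed   = chf("spread_speed");    // growth distance per frame
float  band    = chf("falloff_width");   // softness of the growth front

float dist  = distance(@P, contact);
float front = (@Frame - chf("start_frame")) * speed;  // how far the growth has travelled

// 1 behind the front, 0 ahead of it, with a smooth transition inside the band.
f@growth = 1 - smooth(front - band, front, dist);
```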
References





Group efforts
Week 01 & 02
To kick things off, we started by collecting reference images and sketching out the storyboard. We then went location scouting for filming and experimented with the filming equipment.




During the second week of filming, we began shooting at various locations. For the first shot, we chose to film against the green wall in Elephant and Castle. The character is interrupted by a flying leaf, which leads him to an E-bike, and he takes a ride. The second shot took place in the graffiti tunnel near Waterloo Station. This scene will serve as the ending shot, with the character emerging from the tunnel into a green world.
For filming, we used various gear, such as a gimbal, a GoPro, a tripod, a green screen, and a Blackmagic camera, all sourced from UAL’s orb platform.



Week 03
In the third week, we got together for a meeting and decided our roles in this project; I chose to take on the FX and compositing part.
Our focus then shifted to finalizing the storyboard in preparation for next week’s filming. We had numerous ideas, which posed a challenge when it came to depicting them through drawings. After scouring for references, we decided to start the storyboard from scratch and managed to visualize the entire story. This was crucial for estimating the duration of each scene through the storyboard and animatic.





After finishing the storyboard, Moonju made an animatic video that was initially around 50 seconds long. We felt this was too lengthy for the intended filming, so after a quick discussion, we chose to combine a few scenes at the start, bringing the duration down to approximately 45 seconds.
Week 04
In the fourth week, we completed filming at both locations we discussed previously. We began with the building scenes, and fortunately there weren’t many people around, allowing us to film without disturbing passersby. However, setting up the camera was challenging because we weren’t accustomed to using it. The angle was too narrow even though we were quite far from the actor. We realized this was due to sensor cropping, which the camera applies when switching between 4K and 6K in its settings. We tried to fix it by switching to 6K, but the higher bitrate meant the SD card couldn’t write fast enough, causing the camera to stop recording, so we decided to stick with 4K.
While filming, a masked thief on a bike almost snatched Minghan’s phone. Fortunately, the thief couldn’t steal it because Minghan had a lanyard attached to his phone. This incident caused significant delays as we were all quite shaken, but we eventually managed to finish filming that part quickly.



After finishing at the first location, we headed to the Waterloo tunnel. Unfortunately, the crowd made it difficult to film without interruptions. We tried various setups, including having the actor wear a chest mount, following the actor with the chest mount, and mounting the camera on the back. While they rode and filmed, I reviewed the footage on my phone using the GoPro app. After several more attempts, we finally completed the filming.
Once we have all the necessary footage and additional shots, Moonju will fine-tune the rough cut, including color correction and audio adjustments. Music and sound effects will also be added to enhance the overall atmosphere of the video.
I am confident that with all these elements in place, our project will come together seamlessly and meet our expectations. With the teamwork and creative input from everyone involved, I believe we will produce a fantastic final product that we can all be proud of. I am looking forward to seeing the final result and sharing it with our audience.
For the tracking part, Moonju initially experimented with the footage using point tracking. However, we soon realized there was a problem with the perspective, as there was no rotation or scale data. Tracking was also generally difficult because of the reflections of the environment on the object. We sought help from Gonzalo, who came up with a solution: isolating the bike’s handlebar with roto and then placing track points on top of that. Now I can use that track and a rough 3D mesh to grow the plants on top of it.
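As a note on how the grown geometry can follow that track, here is a minimal sketch of one way to stick the plant points to the moving proxy mesh in a point wrangle. In practice a Point Deform SOP does much the same job, so this is an assumption about the setup rather than the final network; the input order is also just what this sketch assumes.

```vex
// Point wrangle on the plant points (sketch of one way to ride the tracked proxy).
// Input 1: the rough proxy mesh frozen at a rest frame (e.g. via a Time Shift).
// Input 2: the same proxy animated by the 3D track.

int    rest_prim;
vector rest_uv;

// Find where each plant point sits on the rest-frame proxy...
xyzdist(1, @P, rest_prim, rest_uv);

// ...then move it to the matching spot on the tracked, animated proxy.
vector restP = primuv(1, "P", rest_prim, rest_uv);
vector animP = primuv(2, "P", rest_prim, rest_uv);

// Keep any height the plant had above the surface.
// (A full Point Deform would also rotate this offset with the local frame.)
@P = animP + (@P - restP);
```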

Collision Course: The Shard Struck
Critical Reflection
This individual project developed my time management and my effective use of hardware and software resources, and I enjoyed the development process. However, there are a couple of conclusions and shortcomings worth mentioning.
I chose a GoPro for the live-action shoot because it has a wide lens, records at a high frame rate, and is very small. I settled on handheld motion to get a more realistic look, as if somebody were filming with a phone. I only realized my naivety after reviewing the shots back at home: the over-the-top movements were nowhere near as noticeable on the small screen I had been using, and the shake made the footage hard to follow. Fortunately, the very wide-angle lens, combined with 4K and a high frame rate, recorded enough information in the frame to allow stabilization through cropping and a reduction in quality on almost 80% of the footage. This stabilization was important, because it allowed me to track the footage and bring the camera data into Houdini for the FX work.
With the FX, I sometimes ran into hardware limitations and memory problems, but I eventually pulled through and created a meteor simulation after a few days. One of the major challenges was working with RBD (Rigid Body Dynamics) simulations. Since it was my first experience with RBD, the learning curve was pretty steep, and RBD simulations are particularly slow to calculate, so there is a lot of waiting time between iterations. Another problem was the absence of environment geometry for the RBD to interact with physically, such as buildings and roads. After a little thinking, I decided to look for proxy geometry from Google Earth; with some specific software I was able to extract the 3D data, match the geometry to the camera, and create matte renders. The proxy geometry helped the final simulation a lot, but it still doesn’t look great. I will keep working on it until it looks better and then add it to my portfolio.
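One small trick that helped me reason about the RBD behaviour is activating fractured pieces only when the impact arrives, so the building does not crumble early. The wrangle below is an illustrative sketch of that idea rather than my exact setup, with assumed controls (impact_point, activation_radius, impact_frame), and it assumes the attribute is evaluated per frame inside the solve (for example through a SOP solver or the RBD Bullet Solver SOP).

```vex
// Point wrangle on the packed RBD pieces (illustrative sketch, not my exact setup).
// Keep every fractured piece asleep until the meteor's impact gets close enough.

vector impact = chv("impact_point");       // assumed control: where the meteor strikes
float  radius = chf("activation_radius");  // assumed control: area woken up by the hit
int    hitfrm = chi("impact_frame");       // assumed control: frame of the collision

i@active = 0;
if (@Frame >= hitfrm && distance(@P, impact) < radius)
    i@active = 1;
```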
In general, this project made for a very good exercise of the ways in which CG can be integrated with live action. Largely, it taught me to face unpredictable challenges and to traverse my way out of an impasse. Certainly, it will help me in future projects.
Final Video
Story
The story picks up roughly where the first project left off, only this time a big impact on the moon has sent some of its rocks out into space. One of them is caught in Earth’s gravitational pull and enters the atmosphere, where it starts to burn up as it falls. The people below are unaware of what is coming for them and are out in the open, admiring the serene scene around them.
Amid this serenity, the first subtle hints of the coming disaster begin to appear. Reflections on water surfaces, flashes on the buildings, and the foreboding sound of a sonic boom gradually hint at the danger approaching the unsuspecting public. By the time they finally realize what is happening, it is already too late to do anything.
The meteor collides with the Shard in London with great force, breaking off its top, which begins to fall. In all this madness, the man who had been watching stands frozen, immobilized by shock and fear, unable to move away from the danger quickly approaching him. The debris from the shattered building then falls on him and the screen cuts to black, signalling a sudden end.
Process Video
Creating Camera in Nuke



Meteor Creation in Houdini





Motion and Simulation of primary pieces





Motion and Simulation of secondary pieces





Motion and Simulation of tertiary pieces


Overview of the FX setup


Cloud Creation






Proxy geometry from Google Earth


Modeling the Shard and simulating








Rendering with farm
