Under Construction: Please excuse the mess while I continue to build this page. The content shown here covers some, but not all, of my work, and is meant to provide an accessible overview of projects I have completed. Thank you.
Note: All content on this page is used to demonstrate some of my capabilities. Please do not copy or take content without permission.
****************************Work in progress - Below**********************************
TBD
- Unreal Engine Degraded Visual Environment (DVE) Apache Simulation VR
- Screen control project
- Point Cloud Data Experience
- Camera Projection Fundamentals and Theory
- Depth Imaging
- ZED Camera PTX file and point cloud capturing (exported so that other applications can consume the data)
- Unreal Engine Camera Masking work for machine learning
- Unreal Engine 5 Exploration
****************************Work in progress - Above**********************************
Photogrammetry For Simulation Modeling
Photogrammetry is the science and technology of reconstructing three-dimensional models or environments from reliable source data such as still images. Using photogrammetry applications such as Reality Capture, 3DF Zephyr, Meshroom, and Metashape, I created a workflow focused on generating 3D models that can be used in simulation environments. The video below shows a CLI script I wrote that automates the configuration steps needed to create a model from still images. Due to the nature of the content, I can only show the automated process running on publicly available data, using Reality Capture as the photogrammetry application. A single execution of the .bat file automates the procedure: it finds the needed images, then configures and processes alignment, configures and processes the mesh, and configures and processes texturing for the final model. I learned photogrammetry fundamentals and best practices in a very short period of time and was able to produce meaningful content. In addition to photos, I also had the opportunity to display laser scan/point cloud data in photogrammetry applications with the support of various point cloud/laser scan software.
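The automated pipeline described above can be sketched as a small Python driver that assembles and launches the same headless Reality Capture run the .bat file performs. This is a minimal sketch, not my original script: the command names follow Reality Capture's documented CLI, but the executable path, model name, and exact flag set are assumptions that should be verified against your installed version.

```python
# Sketch of the automated Reality Capture pipeline: load images, align,
# mesh, texture, export. Command names follow Reality Capture's documented
# CLI; the install path and "Model 1" name are assumptions for illustration.
import subprocess
from pathlib import Path

RC_EXE = r"C:\Program Files\Capturing Reality\RealityCapture\RealityCapture.exe"  # assumed path

def build_rc_command(image_dir: str, model_out: str) -> list[str]:
    """Build the headless command line for a full images-to-model run."""
    return [
        RC_EXE,
        "-addFolder", image_dir,         # load all still images in the folder
        "-align",                        # camera alignment / sparse reconstruction
        "-setReconstructionRegionAuto",  # bound the reconstruction volume
        "-calculateNormalModel",         # reconstruct a normal-detail mesh
        "-calculateTexture",             # texture the mesh
        "-exportModel", "Model 1", model_out,
        "-quit",
    ]

def run_pipeline(image_dir: str, model_out: str) -> None:
    """Create the output folder and run Reality Capture end to end."""
    Path(model_out).parent.mkdir(parents=True, exist_ok=True)
    subprocess.run(build_rc_command(image_dir, model_out), check=True)
```

Building the argument list in one function keeps the step ordering (align, then mesh, then texture) explicit and easy to adjust per project.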
CLI Script Automatic Model Generation Using Reality Capture
AVEVA Serious Gaming and Training Applications
Here is some of the work I did while at AVEVA. I created a SQLite database and an in-engine display that highlights any part clicked on and shows its information within milliseconds. I also implemented support for displaying basic and complex geometric meshes, vehicles, particle systems, a 3D sound system, animations, navigation, a timeline, lights and the lighting system, a multimedia tool, the user interface, environmental settings, and more. At this company I gained a great deal of experience working with some very talented senior software engineers. I also had the opportunity to work with AVP Player, Model Prep, and the AVEVA AVP application. Below is a link to one of AVEVA's affiliates demonstrating some of the content that I personally had a hand in creating, along with many other videos showing related content.
Video Link: Orinox Aveva Application Demonstration
Animation and Flight Control Test For An Apache Model
The video below demonstrates the testing of a blend animation in Unreal Engine that coordinates the controls of an Apache aircraft with its animations. All animations were created using a combination of Blender and Unreal Engine, and the aircraft was rigged in Blender to identify the parts needed for animation. Additional fine-tuned controls were added in Unreal Engine to allow the aircraft to be flown in the virtual environment. Refinements continue, with the goal of creating a simulation that demonstrates a Degraded Visual Environment (DVE) in virtual reality or on the desktop, letting users experience a DVE scenario first hand and understand the difficulty of controlling an aircraft in that situation.
Video Link: Apache Animation Flight Test
Radar, Animation, and Movement Correlation Testing
The video below demonstrates testing of the correlation between the mini radar and the movement of the aircraft through the Unreal Engine environment, using UI interaction driven by in-world/game objects. It also served as a test of the pulsating dot animation: the larger dot is simply placed on the radar's interface and does not represent a real object in the world, so it never moves; the goal was only to see how the animation looks at that size. The smaller dots represent objects actually being tracked in the world, and as the aircraft moves toward them their relative positions update on the radar's UI. Each ring represents 1 km by default (or any measurement specified by the user), so the radar mini-map reflects an accurate world position of each object relative to the aircraft. (Note: Enlarge the video to see the object better. It is a small white cube.)
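The world-to-radar mapping described above can be sketched in a few lines: take the tracked object's offset from the aircraft, rotate it into the aircraft's frame so the radar stays nose-up, and scale it so each ring spans 1 km (or a user-specified spacing). This is a hypothetical, engine-agnostic illustration; the function and parameter names are mine, not the project's, and the in-game version lives in Unreal Engine.

```python
import math

def world_to_radar(target_xy, aircraft_xy, aircraft_heading_deg,
                   ring_spacing_m=1000.0, rings=3, radar_radius_px=100.0):
    """Map a tracked object's world position onto the circular radar UI.

    Each concentric ring represents ring_spacing_m metres (1 km by default),
    so the outermost ring is rings * ring_spacing_m from the aircraft.
    Returns (x_px, y_px) relative to the radar centre, or None when the
    object is beyond the outermost ring and should not be drawn.
    """
    dx = target_xy[0] - aircraft_xy[0]
    dy = target_xy[1] - aircraft_xy[1]
    # Rotate the offset by -heading into the aircraft's frame,
    # keeping the radar oriented with the aircraft's nose.
    h = math.radians(aircraft_heading_deg)
    rx = dx * math.cos(h) + dy * math.sin(h)
    ry = -dx * math.sin(h) + dy * math.cos(h)
    max_range = rings * ring_spacing_m
    if math.hypot(rx, ry) > max_range:
        return None
    scale = radar_radius_px / max_range  # pixels per metre on the UI
    return (rx * scale, ry * scale)
```

For example, with three 1 km rings on a 90-pixel radar, an object 1 km dead ahead lands exactly on the first ring, a third of the way out.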
180 Degree Custom In House Built Simulator
Here we have a custom-built simulator composed of multiple off-the-shelf products combined into a multi-purpose simulator. I specified the equipment needed to build it, with the exception of the seat hardware, which Hansen Tapaha procured. Hansen and I also built a high-quality custom platform to hold the seat and controls. Once the parts were in, I applied custom configurations to produce a visual image that displayed properly across all five screens. The simulator was also set up to be virtual reality compatible, giving users a more immersive experience. It cost little to construct, yet provided the robustness, efficiency, and flexibility of a much more expensive simulator without being limited to a single purpose. It was also designed with modular, easily movable components for quick disassembly and reassembly, supporting a portable and adaptive setup. This simulator allowed multiple experiments to be run and scales to multiple environments.
Off The Shelf Simulator Build
Video Link: 180 Degree Custom Built Simulator
Preliminary World, Mini-map, and Multiple Camera View Correlation
This project is an example of work in its early stages, used for research studies. The final product would involve redesigning the graphic material to create a more immersive and realistic environment, depending on the needs of the simulation. Its current use was human-performance research that did not require heavy graphics for every environmental variable; the focus was less on graphics and more on the symbols displayed in the environment. Custom event triggers were also designed so that new simulation environments could be created quickly, with the ability to import new meshes, images, and interactions. Interactions included communication between units, threat identification, and threat removal, allowing the final application to serve multiple uses and be redesigned for various research experiments, analyses, and evaluations.
Virtual Reality Simulated Environment
Advanced Concepts and Simulation: My role focused on research and development. I worked as the sole lead developer of a highly detailed virtual reality simulation built with Unreal Engine and the Oculus Rift. The virtual reality training environment was created as part of my work as a primary member of the Research and Development team. Duties included project planning, product design (hardware), leading project development, software architecture design and review, supervising, providing direction for individual tasks and for the project as a whole, technical demonstrations, and overseeing project stability.
This project provided a high-fidelity representation of real-world hardware to promote a more efficient way of training soldiers. The virtual environment promotes a higher degree of user retention through the feeling of presence and the physical replication of the actual movements used to operate a specific device or piece of hardware/software. There are also benefits in cost reduction and scalability, and it builds the situational awareness and readiness that all soldiers require at no additional cost. The images below show some of the content within the virtual simulation: a deeply immersive experience involving visual, auditory, and tactile stimuli.
Power Supply Placement Indicator
User Interaction Area for Old and New Power Supply Replacement
3D Visualization and Simulation Project
This demonstrates functionality I produced, including UI creation and interactions, animations, modeling, camera user controls, weather and environmental conditions, file writing and loading, and the use of Unreal Engine's UMG, Blueprint scripting, and code, all within one simulation project with modular components. The content was used to create custom content for simulations.
Unreal Engine Custom Simulation Built Content 1
Video Link: Unreal Code and Blueprint Breakdown
Unreal Engine Custom Simulation Built Content 2
Video Link: Unreal Engine Demo Helicopter and UAV (Reaper)
Getting Sonic into Unreal Engine Initial Workflow Demo
This is a project that allowed me to introduce my youngest son to art, modeling, programming, and game engines. The goal was to bring Sonic the Hedgehog into a game for us to play. This shows the initial steps taken, minus much of the behind-the-scenes work to plan a project on a short timeline for my son to enjoy. It will allow him to grow in this area and assist in producing more advanced content with me in the future. Note that this same concept can be applied to any geometry, simulation, or training context, including military, academic, and commercial applications.
This is a level/map layout that was suitable for testing our functionality and marks our success in getting Sonic into a playable game. My son and I created it to bring his favorite character into a game. We finished by importing the Sonic model created in Blender, importing our ring model, creating run animations for Sonic, adding rotation to the rings, adding a particle system that activates when Sonic consumes a ring, creating actors to represent Sonic and the rings, placing objects in the virtual environment for user interaction, adding input controls for Sonic, adding sound effects, and adding any remaining logic needed to make the content playable.
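The ring-consumption step above can be sketched as a simple overlap check: when Sonic's actor overlaps a ring, the ring is consumed, the count increments, and the feedback events fire. This is a hypothetical, engine-agnostic Python sketch; the actual project implemented this in Unreal Engine, where the events would be a particle system activation and a sound cue, and all names here are illustrative.

```python
from dataclasses import dataclass, field

@dataclass
class Ring:
    x: float
    y: float
    consumed: bool = False

@dataclass
class Player:
    x: float
    y: float
    rings_collected: int = 0
    events: list = field(default_factory=list)  # stand-ins for particle/sound triggers

PICKUP_RADIUS = 1.0  # illustrative overlap radius in world units

def update_rings(player: Player, rings: list) -> None:
    """Consume any ring the player overlaps and fire the feedback events."""
    for ring in rings:
        if ring.consumed:
            continue
        # Squared-distance overlap test avoids a square root per ring.
        if (player.x - ring.x) ** 2 + (player.y - ring.y) ** 2 <= PICKUP_RADIUS ** 2:
            ring.consumed = True
            player.rings_collected += 1
            player.events.append(("particle", ring.x, ring.y))  # burst at the ring
            player.events.append(("sound", "ring_pickup"))      # pickup chime
```

Marking rings consumed rather than deleting them mid-loop keeps the update safe to run every frame.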
If we continue this at a later date, next steps would include adding different state animations for jumping and running fast, creating a sample jungle-environment level for testing, and adding obstacles and hazards to the environment. We would also need to design and develop various subsystems to reach an MVP (minimum viable product) that could serve as our first demo for capturing user feedback before building further. Please see the sample video demonstrating our process.
Additional Content: Multiple Summaries and Links
Additional Content: Developing Military UI Icons Using a FM 1-02.2 Frame Approach