Introduction

By Roger Smith Rivas

The Live Data in Virtual Reality project, developed in Unity, was created for the class 8810 Eye Tracking II: Gaze Sensing and Interaction in XR at Clemson University in Spring 2023. The project aims to combine the latest virtual reality technology to create an immersive, interactive experience in which users can navigate the environment. Every time the user’s gaze focuses on a specific object, the program fetches a value (a number) transmitted by a Web API representing the current temperature of a water pump.

GitHub Code:

Getting started

Folder Tree
Projects in GitHub:
  1. The “LiveDataInVR-UnityProject” folder contains the Unity project, built with Unity Editor version 2021.3.18f1 (more information).
  2. The “LiveDataTransmiterAPI” folder contains a web API built with ASP.NET Core (more information).
  3. The “docs” folder includes a simple HTML file with all the documentation related to this project.
Cloning the repository from GitHub:
  1. On GitHub.com, navigate to the main page of the repository here.

  2. Above the list of files, click Code.

    "Code" button

  3. Copy the URL for the repository.

    • To clone the repository using HTTPS, under "HTTPS", click the copy icon.

    • To clone the repository using an SSH key, including a certificate issued by your organization's SSH certificate authority, click SSH, then click the copy icon.

    • To clone a repository using GitHub CLI, click GitHub CLI, then click the copy icon.

      Screenshot of the "Code" dropdown menu. To the right of the HTTPS URL for the repository, a copy icon is outlined in dark orange.

  4. Open Git Bash.

  5. Change the current working directory to the location where you want the cloned directory.

  6. Type git clone, and then paste the URL you copied earlier.

    $ git clone
  7. Press Enter to create your local clone.

    $ git clone
    > Cloning into `Spoon-Knife`...
    > remote: Counting objects: 10, done.
    > remote: Compressing objects: 100% (8/8), done.
    > remote: Total 10 (delta 1), reused 10 (delta 1)
    > Unpacking objects: 100% (10/10), done.
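Steps 4–7 above can be condensed into a single shell session. The block below is only a sketch: it uses a throwaway local bare repository as a stand-in for the GitHub URL, and the path /tmp/livedata-demo and the folder names are illustrative, not part of the project. In practice, replace the clone argument with the URL you copied in step 3.

```shell
# Stand-in for the remote repository: a local bare repo instead of a GitHub URL.
mkdir -p /tmp/livedata-demo && cd /tmp/livedata-demo
git init --bare upstream.git

# Same syntax as cloning from GitHub: git clone <URL-you-copied> [target-folder]
git clone upstream.git LiveDataInVR

# The clone now exists as a normal working directory with its own .git folder.
ls -d LiveDataInVR/.git
```

Cloning into a named target folder (here `LiveDataInVR`) is optional; without it, git derives the folder name from the repository name.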


Prerequisites:
  1. You will need at least 15 GB of free space on your computer to install the Unity Engine and to download the project.
  2. Install the Unity Hub application on your computer with the editor version “2021.3.18f1”. How to download and install Unity.
  3. Install Visual Studio 2022. How to install Visual Studio 2022.
  4. Install GitHub Desktop (optional). How to install GitHub Desktop.
  5. An HTC VIVE PRO EYE device with all its dependencies (required if you want to see the eye interaction). Setting up VIVE Pro Eye for the first time. You can use your laptop keyboard to interact with the game if you do not have the HTC VIVE PRO EYE device.


Open Unity project:
  1. First, open Unity Hub on your computer. If you haven't installed it yet, you can download it from the official Unity website.
  2. Once Unity Hub is open, click on the "Projects" tab located on the left side of the window.
  3. Click on the "Add" button located on the top right corner of the window.
  4. A pop-up window will appear. Select "Browse" to find the project folder you want to open.
  5. Select the project folder and click on "Select Folder".
  6. After selecting the project folder, it will appear in the list of projects in Unity Hub.
  7. Select the project from the list, and then click on the "Open" button located at the bottom right corner of the window.
  8. The Unity Editor will launch and open your project.

That's it! Now you can start working on your project in Unity. You will see a Unity window like the following image:

Research Log

The following section conveys the record of various activities and processes involved in conducting this research. It provides documentation, observations, and decisions throughout the research process. This section aims to help researchers keep track of the work, ensure consistency, and provide a clear record of the methods and results.

Project goals:
  1. Create a VR environment in Unity
  2. Compatible with the HTC Vive Pro Eye
  3. Dynamic loading of assets
  4. Fetch live data when the user looks at an object
  5. Publish the project and methods on GitHub
  6. Budget friendly (preferably)
  7. Low cognitive load (preferably)

Software and Hardware

Why Unity?

Unity Engine software was chosen for this project compared to other software for the following reasons:

  1. Previous knowledge: The developer working on the project has more experience programming in C# language (since 2016) than other programming languages and has already had some experience working in Unity Engine (since 2022).
  2. Easy to use: Unity has a user-friendly interface and a gentle learning curve, making it accessible to developers of all skill levels.
  3. Multi-platform support: Unity supports multiple platforms, including Oculus Rift, HTC Vive, PlayStation VR, and Google Cardboard, among others.
  4. Community support: Unity has a large and active community of developers who share their experiences and provide support through online resources.
  5. Built-in VR support: Unity has built-in support for VR, which means that developers can easily create VR applications and experiences without needing to write additional code.
  6. Extensive asset store: Unity's asset store contains a wide range of pre-built assets, including models, animations, and scripts, which can save developers time and effort. This variety of assets is essential to expand the project or simulate specific scenarios.
  7. Performance optimization: Unity offers built-in tools for optimizing VR performance, critical for delivering a smooth and immersive VR experience.
  8. Continuous development: Unity is constantly evolving, with regular updates and new features added, ensuring it remains a cutting-edge engine for VR development.

Why HTC Vive Pro Eye?

This device was chosen for this project compared to other devices for the following reasons:

  1. Device Availability: The device is available for use free of charge in the Clemson University laboratory where the project was carried out.
  2. Eye-tracking technology: The Vive Pro Eye is the first VR headset to feature eye-tracking technology, which allows a more natural interaction, immersive experiences, and the potential for new forms of exchange, such as eye-based navigation and selection.
  3. High resolution: The Vive Pro Eye has a high resolution of 1440 x 1600 pixels per eye, which provides a more detailed and precise VR experience (convenient when selling the software).
  4. Developer support: HTC is firmly committed to developer support, offering resources such as the Viveport SDK and the Vive Developer Community to help developers create VR experiences with the Vive Pro Eye.
  5. Unity integration: The Vive Pro Eye is fully integrated with Unity, so developers can easily create VR experiences for the Vive Pro Eye using the Unity engine.
  6. OpenVR support: The Vive Pro Eye is compatible with OpenVR, an open standard for VR development that supports multiple VR devices and engines. Developers can use the Vive Pro Eye with other VR engines and devices if desired.

Data interaction and gaze tracking

The program has two principal actions:

1. Obtaining the user’s gaze

For the program to identify when the user looks at the object, it was necessary to implement the function “GazeFocusChanged” from the Tobii SDK for Unity. Tobii SDK (Software Development Kit) is a suite of software tools and APIs (Application Programming Interfaces) provided by Tobii AB, which develops eye-tracking technology. The Tobii SDK allows developers to integrate eye-tracking functionality into their applications and games and use eye-gaze data as an input method, among other features.

This approach was discussed in the 2022 paper by Andrew T. Duchowski et al., 3D gaze in virtual reality: vergence, calibration, event detection. The paper discusses the challenges and opportunities of 3D gaze-tracking technology in virtual reality (VR) environments and describes techniques for calibrating 3D gaze tracking, such as using natural features as reference points.

The following content is the code implemented in the project:

public void GazeFocusChanged(bool ObjectHasFocus)
{
    if (ObjectHasFocus)
    {
        // The object is being looked at: start reading data and highlight it.
        _readData = true;
        _targetColor = highlightColor;
    }
    else
    {
        // Gaze left the object: stop reading and restore its original color.
        _readData = false;
        _targetColor = _originalColor;
    }
}
2. Obtaining live data

When the user’s gaze rests on the object, the program executes an HTTP request to a Web API, which returns a JSON response containing a number. The request is made with the UnityWebRequest class, a class in the Unity game engine that allows developers to send HTTP requests and receive responses from web servers. One crucial point is that the program executes the HTTP request on every frame (the execution is controlled using Coroutines).


To improve the speed of the request-response cycle, the URL is fetched over plain HTTP, without SSL (Secure Sockets Layer). You can encrypt the transmitted data manually if you need a secure transmission.
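To see what the client-side fetch amounts to outside Unity, the sketch below stands up a throwaway local HTTP server (plain HTTP, no SSL, matching the note above) and fetches a number from it with curl. The port, file name, and temperature value are invented for illustration; the real API URL is not reproduced here.

```shell
# Serve a single number over plain HTTP as a stand-in for the real Web API.
mkdir -p /tmp/livedata-api && echo '21.5' > /tmp/livedata-api/pump-temp.txt
python3 -m http.server 8765 --directory /tmp/livedata-api >/dev/null 2>&1 &
SERVER_PID=$!
sleep 1

# The Unity client does the equivalent of this GET on every frame.
curl -s http://localhost:8765/pump-temp.txt

# Clean up the throwaway server.
kill $SERVER_PID
```

Because the transfer is a bare number over unencrypted HTTP, the per-request overhead is minimal, which is what makes a per-frame request pattern feasible.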

At the beginning of this project, the idea of fetching live data was inspired by the paper 3DRepo4Unity: dynamic loading of version controlled 3D assets into the unity game engine by Friston, Sebastian, et al. It describes a plugin for the Unity game engine that allows developers to access and use 3D models and data stored in the 3DRepo platform directly within Unity. This integration makes it easier for developers to access and use 3D data in their projects, streamlining the development process and enabling more complex and sophisticated 3D applications. Later, however, I switched from fetching entire graph assets to showing simple text from an HTTP request because of the high hardware and network-transfer cost of using 3DRepo4Unity.


The data is transmitted by a Web API created in ASP.NET Core Web API. The server used to install this project was provided for free by Sistemas RSA.

API URL:

The following content is the code implemented in the project:

IEnumerator GetRequest(string uri)
{
    using (UnityWebRequest webRequest = UnityWebRequest.Get(uri))
    {
        // Suspend the coroutine until the HTTP request completes.
        yield return webRequest.SendWebRequest();

        if (webRequest.result == UnityWebRequest.Result.ConnectionError || webRequest.result == UnityWebRequest.Result.DataProcessingError)
        {
            Debug.Log("Error: " + webRequest.error);
        }
        else
        {
            // Note: the generic type argument of GetComponent was lost in the
            // original listing; TextMesh is shown here as a plausible choice.
            _TextEditor.GetComponent<TextMesh>().text = "Heat: " + webRequest.downloadHandler.text + " Celsius";
            var courses = webRequest.downloadHandler.text;
        }
    }
}

Development log

This section covers the schedule and progress of the project.

Progress 85%:
January 2023 (completed)

Definition of goals, schedule, and design.

February 2023 (completed)

Create the environment in virtual reality using Unity.

March 2023 (completed)
  • Research and read related papers on the topics covered in this project.
  • Present the project proposal in class and provide a paper proposal.
  • Test and integrate dynamic loading of 3D Assets and gaze interaction with an object on the code.
April 2023 (80% completed)
  • Implement the SDK and plugins in the Unity project and test the new code.
  • Create the web API project to improve data-transmission performance.
  • Create the website with the research log content.
  • Present in class the research log.
  • Finish visual details in the Unity project.
May 2023
  • Create the standalone executable version for Windows and macOS.
  • Final presentation in class.

Demo video


The author thanks Dr. Andrew T. Duchowski from Clemson University for the helpful discussions about this work and for providing guidelines on the elaboration of this project, and Sistemas RSA for providing all the resources to implement the web API. The information provided on this website is the responsibility of the author.