Publications – Angelos Barmpoutis
https://abarmpou.github.io/angelos
Professor of Digital Arts and Sciences

Integrated Telehealth and Extended Reality to Enhance Home Exercise Adherence Following Total Hip and Knee Arthroplasty
https://abarmpou.github.io/angelos/page/integrated-telehealth-and-extended-reality-to-enhance-home-exercise-adherence-following-total-hip-and-knee-arthroplasty/
Wed, 19 Feb 2025 18:57:01 +0000

Nearly one million total hip and knee arthroplasties (THA/TKA) are performed annually in the United States, with most patients discharged home and prescribed home exercise programs (HEPs) to enhance lower extremity function. Traditional paper-based HEPs, while accessible and low-cost, often lack engagement and real-time feedback, both of which are critical for adherence and performance optimization. Extended reality (XR) and telehealth (TH) systems offer promising solutions by combining engagement and feedback, though each has its own limitations. To address these gaps, we designed and executed a pilot study that compared exercise performance in individuals with THA/TKA using a conventional paper-based HEP versus a proof-of-concept system, dubbed Tele-PhyT, that embodies the ideal characteristics of a future XR technology for seamless HEP-TH integration: robust markerless full-body tracking, real-time visual feedback, and performance quantification. The pilot study used a randomized cross-over design and targeted two types of users: therapists and patients. Participants favored Tele-PhyT for its real-time feedback and ease of use, and noted its potential to improve HEP adherence and exercise accuracy.
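The abstract does not detail how Tele-PhyT quantifies performance, but a markerless full-body tracking pipeline typically yields 3D joint positions from which exercise metrics such as joint angles can be derived. The following minimal sketch, using hypothetical joint data unrelated to the actual system, illustrates one such metric.

```python
import numpy as np

def joint_angle(a, b, c):
    """Angle (degrees) at joint b formed by segments b->a and b->c."""
    a, b, c = map(np.asarray, (a, b, c))
    u, v = a - b, c - b
    cos_theta = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0)))

# Hypothetical tracked 3D positions (meters) for hip, knee, and ankle.
hip, knee, ankle = [0.0, 1.0, 0.0], [0.0, 0.5, 0.05], [0.0, 0.1, 0.3]
print(f"Knee flexion angle: {joint_angle(hip, knee, ankle):.1f} deg")
```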

Saving lives with coding: the global impact of an undergraduate project
https://abarmpou.github.io/angelos/page/saving-lives-with-coding-the-global-impact-of-an-undergraduate-project/
Tue, 19 Nov 2024 18:37:00 +0000

Every smartphone app you use, every video game you play and every website you visit is built on a scaffolding of code that dictates how it works. However, coding is not just about writing computer programmes that work properly; it is also about ensuring that the end users of these programmes find them engaging and easy to use.

User interface and user experience (UI/UX) design is often a central focus of software development courses. However, developing these skills in an educational environment can be tricky as students rarely get the chance to create programmes that will actually be used by real people. To tackle this issue, Professor Angelos Barmpoutis, a computer scientist at the University of Florida (UF), has developed a course-based undergraduate research experience (CURE) in which students develop educational apps that are then used by thousands of school students and educators across the US and further afield.

This article is an open-access resource produced by Futurum Careers for K-12 students, offering a glimpse into the process of software development.

Assessing the Influence of Passive Haptics on User Perception of Physical Properties in Virtual Reality
https://abarmpou.github.io/angelos/page/assessing-the-influence-of-passive-haptics-on-user-perception-of-physical-properties-in-virtual-reality/
Wed, 14 Feb 2024 18:49:54 +0000

This paper presents a pilot study that explores the role of low-cost passive haptics in how users perceive physical properties, such as the size and weight of objects, within virtual reality environments. An A/B-type study was conducted as an air hockey simulation in which participants experienced two versions: one adhered to conventional VR settings, while the other incorporated a tangible surface, a real table. Statistical analysis of the data collected from post-study questionnaires indicated a shift in the perception of size and weight when participants were exposed to the haptic-enhanced simulation, with virtual objects perceived as larger or heavier. The observed shift in user perception was also stronger when the simulation with the tangible surface was experienced first. The paper details the implementation of the air hockey simulation, the setup of the testing environment, and the statistical analysis performed on the collected data, offering practical recommendations for future applications.
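The abstract reports statistical analysis of paired post-study questionnaire data without naming a specific test; one plausible approach for within-subject Likert ratings in a cross-over design is the Wilcoxon signed-rank test, sketched below with purely hypothetical ratings.

```python
from scipy.stats import wilcoxon

# Hypothetical paired Likert ratings (1-7) of perceived puck weight
# from the same participants under the two conditions.
baseline_vr = [3, 4, 3, 5, 4, 3, 4, 4, 3, 5]
haptic_vr   = [5, 5, 4, 6, 5, 4, 6, 5, 4, 6]

stat, p = wilcoxon(baseline_vr, haptic_vr)
print(f"Wilcoxon W={stat:.1f}, p={p:.3f}")
```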

Investigating how interaction with physical objects within virtual environments affects knowledge acquisition and recall
https://abarmpou.github.io/angelos/page/investigating-how-interaction-with-physical-objects-within-virtual-environments-affects-knowledge-acquisition-and-recall/
Tue, 13 Feb 2024 19:04:33 +0000

The use of passive haptics in virtual reality environments has been shown to improve procedural learning across various application domains, such as first responder training and kayaking (Calandra et al. 2023, Barmpoutis et al. 2020). In this paper we go one step further and quantify the effect of passive haptics on knowledge acquisition and recall. We developed a specialized virtual reality application for learning various chemical compounds and their components. Participants engaged in activities that involved precise mixing and proportioning of chemical components to form targeted compounds (see Figure 1). Employing an A-B test framework, participants were randomly assigned to one of two identical virtual reality environments, differing only in the substitution of the VR controller with a physical jar.

Post-study surveys were administered to gauge user perceptions regarding interaction accuracy and realism, as well as their ability to recall acquired knowledge (specifically, the list of ingredients) from their virtual experience. This pilot study, conducted at the University of Florida Reality Lab, involved 12 subjects. Rigorous statistical analyses, including chi-square tests, were performed on the collected data, with detailed results outlined in this paper.

Two key findings emerged from the study: (a) the presence of the physical jar significantly heightened perceived interaction accuracy, particularly in precise liquid pouring tasks, and (b) users exhibited a remarkable 33% improvement in knowledge recall when utilizing the physical jar as opposed to a conventional VR controller. These results establish a compelling, statistically significant correlation between the integration of passive haptic objects in VR and knowledge acquisition and recall. Furthermore, this study lays the groundwork for a larger-scale study in the future.
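The abstract mentions chi-square tests on the recall data. As an illustration only, the sketch below applies a chi-square test of independence to a hypothetical 2x2 table of recalled versus forgotten ingredients by condition; the counts are invented and are not the study's data.

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical contingency table: rows = condition (VR controller, physical jar),
# columns = (ingredient recalled, ingredient not recalled).
table = np.array([[18, 18],
                  [24, 12]])

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2={chi2:.2f}, dof={dof}, p={p:.3f}")
```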

Enhancing Museum Experience with Virtual Reality: Situating 3D Museum Collections in Context
https://abarmpou.github.io/angelos/page/enhancing-museum-experience-with-virtual-reality-situating-3d-museum-collections-in-context/
Mon, 12 Feb 2024 16:37:06 +0000

Recent advancements in photogrammetry and 3D LiDAR scanners have led to a noticeable increase in the creation of 3D scans of museum artifacts. This trend has opened up new possibilities for museums to create interactive and immersive experiences using virtual reality (VR). In collaboration with the Florida Natural History Museum, we have developed an interactive VR game that utilizes the museum's extensive 3D digital collection. The goal is to enhance the traditional museum experience by immersing visitors in dynamic VR environments that showcase 3D museum collections in context. Specifically, our VR game highlights 3D scans of endangered underwater species in an ocean environment. Children can swim alongside these sea creatures while an AI conversational agent provides scientific insights. The VR game was showcased to the public at the Florida Natural History Museum during an outreach event, attracting visitors from three K-12 school trips. We conducted field observations to evaluate children's interactions with the VR game and held semi-structured interviews with children's guardians as well as museum staff. The findings from our observations emphasized the importance of shared experiences among visitors, which could be facilitated by projecting the VR gameplay on a large screen to mitigate the isolating nature of the HMD VR experience. Museum staff also emphasized the significance of considering visitor traffic when designing VR experiences in museum settings. The findings further highlighted a preference for seated experiences over standing ones, due to safety concerns about children colliding with other visitors or museum artifacts. This paper provides an overview of our design process and the challenges of implementing HMD VR in museum settings, offering valuable insights for future efforts to design public VR educational experiences targeting children.

Reinscribing the 3rd dimension in epigraphic studies and transcending disciplinary boundaries
https://abarmpou.github.io/angelos/page/reinscribing-the-3rd-dimension-in-epigraphic-studies-and-transcending-disciplinary-boundaries/
Sun, 31 Dec 2023 20:26:10 +0000

Over the past decade, archaeology and epigraphy have been reconsidering their modus operandi. Prompted and facilitated by technological advances, motivated by new research questions, and challenged by growing calls to engage with contemporary audiences, they have been experimenting with new methodological approaches and interdisciplinary collaborations. Within this context, the Digital Epigraphy and Archaeology project (DEA) has been developing 3D digitization techniques that accommodate various types of artifacts, incorporating multidisciplinary approaches to achieve a more holistic stance towards the objects of study, and focusing on the reproducibility and accessibility of both its techniques and the resulting 3D models.

This paper presents the DEA’s introspective and re-embodied ways of preserving and studying the past by reconsidering historical artifacts and their digital re-materialization. The following sections discuss the project’s approach to copies and digital copies, its 3D digitization and enhanced visualization processes, comprehensive cloud services, and 3D printing, outlining the steps the DEA has taken toward facilitating and advancing archaeology and epigraphy. Through such approaches, which combine traditional rigor with technological novelty and affordances, the team’s vision is to popularize archaeology and epigraphy within and beyond academia and to underscore the significance of the world’s heritage for new generations of students and the public.

Prostate Capsule Segmentation in Micro-Ultrasound Images Using Deep Neural Networks
https://abarmpou.github.io/angelos/page/prostate-capsule-segmentation-in-micro-ultrasound-images-using-deep-neural-networks/
Tue, 18 Apr 2023 23:36:13 +0000

Prostate cancer is the most common internal malignancy among males. Micro-Ultrasound is a promising imaging modality for cancer identification and computer-assisted visualization. Identifying the prostate capsule area is essential in active surveillance monitoring and treatment planning. In this paper, we present a pilot study that assesses prostate capsule segmentation using the U-Net deep neural network framework. To the best of our knowledge, this is the first study on prostate capsule segmentation in Micro-Ultrasound images. For our study, we collected multi-frame volumes of Micro-Ultrasound images, and expert prostate cancer surgeons then annotated the capsule border manually. The lack of clear boundaries and the variation of shapes between patients make the task challenging, especially for novice Micro-Ultrasound operators. In total, 2,099 images were collected from 8 subjects; 1,296 of these were manually annotated and were split into a training set (1,008), a validation set (112), and a test set from a different subject (176). The performance of the model was evaluated by calculating the Intersection over Union (IoU) between the manually annotated area of the capsule and the segmentation mask computed from the trained deep neural network. The results demonstrate high IoU values for the training set (95.05%), the validation set (93.18%), and the test set from a separate subject (85.14%). In 10-fold cross-validation, IoU was 94.25% and accuracy was 99%, validating the robustness of the model. Our pilot study demonstrates that deep neural networks can produce reliable segmentation of the prostate capsule in Micro-Ultrasound images and paves the way for the segmentation of other anatomical structures within the capsule, which will be the subject of our future studies.
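The evaluation metric described here, Intersection over Union between the manually annotated capsule and the predicted mask, is typically computed on binary masks as in the short sketch below (an illustrative implementation with toy masks, not the authors' code).

```python
import numpy as np

def iou(mask_pred, mask_true):
    """Intersection over Union of two binary masks of equal shape."""
    mask_pred = mask_pred.astype(bool)
    mask_true = mask_true.astype(bool)
    intersection = np.logical_and(mask_pred, mask_true).sum()
    union = np.logical_or(mask_pred, mask_true).sum()
    return intersection / union if union > 0 else 1.0

# Toy example with two overlapping rectangular masks.
pred = np.zeros((128, 128)); pred[20:90, 30:100] = 1
true = np.zeros((128, 128)); true[25:95, 35:105] = 1
print(f"IoU = {iou(pred, true):.3f}")
```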

Developing Mini VR Game Engines as an Engaging Learning Method for Digital Arts & Sciences
https://abarmpou.github.io/angelos/page/developing-mini-vr-game-engines-as-an-engaging-learning-method-for-digital-arts-sciences/
Sat, 11 Mar 2023 23:42:04 +0000

Digital Arts and Sciences curricula are known for combining emerging technologies with artistic creativity in the professional preparation of future technical artists and other creative media professionals. One of the key challenges in such an interdisciplinary curriculum is the instruction of complex technical concepts to an audience that lacks prior computer science background. This paper discusses how developing small custom virtual and augmented reality game engines can become an effective and engaging method for teaching fundamental technical topics from Digital Arts and Sciences curricula. Based on empirical evidence, we demonstrate examples that integrate concepts from geometry, linear algebra, and computer programming into 3D modeling, animation, and procedural art. The paper also introduces an open-source framework for implementing such a curriculum on Quest VR headsets, and we provide examples of small-scale focused exercises and learning activities.
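As an example of the kind of small, focused exercise such a mini-engine curriculum might include (a hypothetical illustration, not drawn from the paper's open-source framework), the sketch below builds a rotation matrix and applies it to mesh vertices, tying linear algebra directly to 3D modeling.

```python
import numpy as np

def rotation_y(theta):
    """3x3 rotation matrix about the y-axis (theta in radians)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[  c, 0.0,   s],
                     [0.0, 1.0, 0.0],
                     [ -s, 0.0,   c]])

# Vertices of a simple triangle, rotated 45 degrees about the y-axis,
# as a student might do before handing the mesh to the renderer.
vertices = np.array([[0.0, 0.0, 0.0],
                     [1.0, 0.0, 0.0],
                     [0.0, 1.0, 0.0]])
rotated = vertices @ rotation_y(np.radians(45)).T
print(rotated.round(3))
```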

AI-driven Human Motion Classification and Analysis using Laban Movement System
https://abarmpou.github.io/angelos/page/ai-driven-human-motion-classification-and-analysis-using-laban-movement-system/
Thu, 14 Jul 2022 18:30:46 +0000

Human movement classification and analysis are important in health sciences and arts research. Laban movement analysis is an effective method for annotating human movement in dance that describes communication and expression. Technology-supported human movement analysis employs motion sensors, infrared cameras, and other wearable devices to capture critical joints of the human skeleton and facial key points. However, these technologies are not mainstream, and the most popular form of motion capture remains conventional video recording, usually from a single stationary camera. Such video recordings can be used to evaluate human movement or dance performance, so any method that can systematically analyze and annotate this raw video footage would be of great value to the field. Therefore, this research offers an analysis and comparison of AI-based computer vision methods that can annotate human movement automatically. This study trained and compared four machine learning algorithms (random forest, k-nearest neighbors, neural network, and decision tree) through supervised learning on existing video datasets of dance performances. The developed system was able to automatically produce annotations in the four dimensions of Laban movement analysis (effort, space, shape, body). The results demonstrate that the automatically produced annotations are accurate when compared against manually entered ground-truth Laban annotations.
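The abstract names random forest among the compared classifiers but does not specify the feature representation; the sketch below shows how pose-derived features could be fed to a random forest in scikit-learn, using randomly generated stand-in data and a hypothetical four-class Effort labeling.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Hypothetical feature matrix: each row is one movement segment described by
# pose-derived features (joint angles, velocities, etc.); labels are one
# Laban Effort class per segment. Real features would come from video pose estimation.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 32))
y = rng.integers(0, 4, size=200)  # e.g., 4 Effort classes

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
print(f"Held-out accuracy: {clf.score(X_test, y_test):.2f}")
```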

Virtual Kayaking: A study on the effect of low-cost passive haptics on the user experience while exercising
https://abarmpou.github.io/angelos/page/virtual-kayaking-a-study-on-the-effect-of-low-cost-passive-haptics-on-the-user-experience-while-exercising/
Thu, 16 Jul 2020 14:44:20 +0000

This paper presents the results of a pilot study that assesses the effect of passive haptics on the user experience in virtual reality simulations of recreation and sports activities. A virtual reality kayaking environment with realistic physics simulation and water rendering was developed that allows users to steer the kayak using natural motions. Within this environment the users experienced two different ways of paddling: a) using a pair of typical virtual reality controllers, and b) using a custom-made “smart paddle” that provided the passive haptic feedback of a real paddle. The results of this pilot study indicate that users learned how to steer the kayak faster with the paddle, which they found more intuitive to use and more appropriate for this application. The results also demonstrate an increase in the perceived level of enjoyment and realism of the virtual experience.

Kayaking is an outdoor activity that can be enjoyed with easy motions and minimal skill, and it can be performed on equal terms by people who are physically able and those with disabilities [1]. For this reason, it is an ideal exercise for physical therapy, and its efficacy as a rehabilitation tool has been demonstrated in several studies [1-6]. Kayaking simulations offer a minimal-risk environment, which, in addition to rehabilitation, can be used in training and recreational applications [5]. The mechanics of boat simulation in general have been well studied and have led to the design of high-fidelity simulation systems over the past decades [3,7]. These simulators immerse the users by rendering a virtual environment on a projector [1,4,6] or a computer screen mounted on the simulator system [2,8]. Furthermore, users can control the simulation by imitating kayaking motions with remote controls equipped with accelerometers (such as Wii controllers) [5] or by performing the same motions in front of a kinesthetic sensor (such as a Kinect sensor) [4,6].

Recent advances in virtual reality technologies, in particular the availability of head-mounted displays as self-contained, low-cost consumer devices, have led to the development of virtual experiences that are far more immersive than conventional virtual reality setups with wall projectors and computer displays. Kayaking simulations have been published as commercial game titles on these virtual reality platforms [13]. However, the use of head-mounted displays in intensive physical therapy exercises carries the risk of serious injury due to the lack of user contact with the real environment. These risks could potentially be reduced if users maintained continuous contact with surrounding objects, such as the simulator hardware, the paddle(s), and the floor of the room, through the use of passive haptics. Additionally, the overall user experience can be improved through sensory-rich interaction with the key components of the simulated environment.

This paper assesses the role of passive haptics in virtual kayaking applications. Passive haptics can be implemented in virtual reality systems by tracking objects of interest in real-time and aligning them with identically shaped virtual objects, which results in a sensory-rich experience [9,10]. This alignment between real and virtual objects allows users to hold and feel the main objects of interaction including hand-held objects, tables, walls, and various tools [11,12].
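The alignment step described above, placing an identically shaped virtual object at the tracked pose of the real one, reduces to applying the tracker's position and orientation to the virtual model each frame. The sketch below shows that transform in isolation with NumPy, using a hypothetical tracker reading; the actual system would perform this step inside its game engine.

```python
import numpy as np

def quat_to_matrix(q):
    """Convert a unit quaternion (w, x, y, z) to a 3x3 rotation matrix."""
    w, x, y, z = q
    return np.array([
        [1 - 2*(y*y + z*z),     2*(x*y - w*z),     2*(x*z + w*y)],
        [    2*(x*y + w*z), 1 - 2*(x*x + z*z),     2*(y*z - w*x)],
        [    2*(x*z - w*y),     2*(y*z + w*x), 1 - 2*(x*x + y*y)],
    ])

def align_virtual_object(local_vertices, tracker_position, tracker_rotation):
    """Place the virtual object's vertices at the tracked real object's pose."""
    R = quat_to_matrix(tracker_rotation)
    return local_vertices @ R.T + tracker_position

# Hypothetical per-frame tracker reading for the paddle.
paddle_vertices = np.array([[0.0, -0.6, 0.0], [0.0, 0.6, 0.0]])  # simplified shaft
position = np.array([0.2, 1.1, -0.4])
rotation = np.array([0.924, 0.0, 0.383, 0.0])  # roughly 45 degrees about y
print(align_virtual_object(paddle_vertices, position, rotation).round(3))
```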

In this paper we present a novel virtual reality kayaking application with passive haptic feedback on the key objects of interaction, namely the paddle and the kayak seat. These objects are tracked in real time with commercially available tracking sensors that are firmly attached to them. Although the users’ real-world view is occluded by the head-mounted display, they can see the virtual representation of these objects and naturally feel, hold, and interact with them. Subsequently, users can perform natural maneuvers during the virtual kayaking experience by interacting with our “smart” paddle using the same range of motions as in real kayaking.

The proposed system was assessed with a pilot user study (n=10) that tested the following hypotheses: a) The use of passive haptics helps users learn kayaking faster and operate the simulation better compared to the conventional controller-based interaction. b) The use of passive haptics improves the level of immersion while kayaking in virtual reality.

The study was undertaken at the Realities Lab of the Digital Worlds Institute at the University of Florida. The volunteers who participated in this experiment were randomly assigned to the study group or the control group and experienced the proposed virtual kayaking system with or without the use of passive haptics, respectively. Data collection was performed with pre- and post-test surveys. In addition, the progress of each individual user during kayaking was recorded and the collected timestamps were analyzed.

The results from this study are presented in detail and indicate that the use of passive haptics in this application has a statistically significant impact on the user experience, affecting enjoyment, learning progress, and the perceived level of realism of the virtual reality simulation.
