Dukas Bildagentur
request@dukas.ch
+41 44 298 50 00

Your search:

152 result(s) in 0.11 s

  • Mobile World Congress Barcelona 2025
    DUKAS_185136321_NUR
    Mobile World Congress Barcelona 2025
    The HTC Vive, a virtual reality, augmented reality, and mixed reality headset by the Chinese company powered by Snapdragon, is exhibited in black at the Qualcomm pavilion during the Mobile World Congress 2025 in Barcelona, Spain, on March 5, 2025. (Photo by Joan Cros/NurPhoto)

     

  • Mobile World Congress Barcelona 2025
    DUKAS_185136336_NUR
    Mobile World Congress Barcelona 2025
    The HTC Vive, a virtual reality, augmented reality, and mixed reality headset by the Chinese company powered by Snapdragon, is exhibited in black at the Qualcomm pavilion during the Mobile World Congress 2025 in Barcelona, Spain, on March 5, 2025. (Photo by Joan Cros/NurPhoto)

     

  • Technology Trade Show
    DUKAS_185113075_NUR
    Technology Trade Show
    The HTC Vive, a virtual reality, augmented reality, and mixed reality headset by the Chinese company powered by Snapdragon, is exhibited in black at the Qualcomm pavilion during the Mobile World Congress 2025 in Barcelona, Spain, on March 5, 2025. (Photo by Joan Cros/NurPhoto)

     

  • Technology Trade Show
    DUKAS_185113069_NUR
    Technology Trade Show
    The HTC Vive, a virtual reality, augmented reality, and mixed reality headset by the Chinese company powered by Snapdragon, is exhibited in black at the Qualcomm pavilion during the Mobile World Congress 2025 in Barcelona, Spain, on March 5, 2025. (Photo by Joan Cros/NurPhoto)

     

  • Public speaking stress buster uses virtual reality audience
    DUKAS_183785271_NUR
    Public speaking stress buster uses virtual reality audience
    Ferrari Press Agency
    Avatars 1
    Ref 16751
    21/04/2025
    See Ferrari text
    Pictures must credit: Carnegie Mellon University
    People nervous about public speaking or interacting with strangers at parties are being offered help — with a virtual world where they can practice their social skills.
    A team has developed virtual and augmented reality situations that simulate stressful situations and help people practice stress-relief strategies.
    Users put on a pair of VR or AR glasses and practice what they want to say with a digital audience.
    A team at the USA’s Carnegie Mellon University tested the stress simulation technology on a group of 19 volunteers, the majority of whom overwhelmingly supported it.
    A spokesperson said: “Everyday situations can sometimes feel like big stressors, whether it's delivering an important work presentation, attending a party full of strangers or confronting a partner.
    “Talking to a friend or a therapist can help but so can practice.”
    The team focused on three scenarios that cause people the most stress and anxiety in their daily lives, according to research — public speaking, crowded social events and interpersonal conflict.

    OPS: Anna Fang, a graduate student in the School of Computer Science's Human-Computer Interaction Institute at Carnegie Mellon, uses a VR headset to practice stress relief strategies.

    Picture supplied by Ferrari (FOTO: DUKAS/FERRARI PRESS)

     

  • Public speaking stress buster uses virtual reality audience
    DUKAS_183785270_FER
    Public speaking stress buster uses virtual reality audience
    Ferrari Press Agency
    Avatars 1
    Ref 16751
    21/04/2025
    See Ferrari text
    Pictures must credit: Carnegie Mellon University
    People nervous about public speaking or interacting with strangers at parties are being offered help — with a virtual world where they can practice their social skills.
    A team has developed virtual and augmented reality situations that simulate stressful situations and help people practice stress-relief strategies.
    Users put on a pair of VR or AR glasses and practice what they want to say with a digital audience.
    A team at the USA’s Carnegie Mellon University tested the stress simulation technology on a group of 19 volunteers, the majority of whom overwhelmingly supported it.
    A spokesperson said: “Everyday situations can sometimes feel like big stressors, whether it's delivering an important work presentation, attending a party full of strangers or confronting a partner.
    “Talking to a friend or a therapist can help but so can practice.”
    The team focused on three scenarios that cause people the most stress and anxiety in their daily lives, according to research — public speaking, crowded social events and interpersonal conflict.

    OPS: Anna Fang, a graduate student in the School of Computer Science's Human-Computer Interaction Institute at Carnegie Mellon, uses a VR headset to practice stress relief strategies.

    Picture supplied by Ferrari (FOTO: DUKAS/FERRARI PRESS)

     

  • Public speaking stress buster uses virtual reality audience
    DUKAS_183785269_FER
    Public speaking stress buster uses virtual reality audience
    Ferrari Press Agency
    Avatars 1
    Ref 16751
    21/04/2025
    See Ferrari text
    Pictures must credit: Carnegie Mellon University
    People nervous about public speaking or interacting with strangers at parties are being offered help — with a virtual world where they can practice their social skills.
    A team has developed virtual and augmented reality situations that simulate stressful situations and help people practice stress-relief strategies.
    Users put on a pair of VR or AR glasses and practice what they want to say with a digital audience.
    A team at the USA’s Carnegie Mellon University tested the stress simulation technology on a group of 19 volunteers, the majority of whom overwhelmingly supported it.
    A spokesperson said: “Everyday situations can sometimes feel like big stressors, whether it's delivering an important work presentation, attending a party full of strangers or confronting a partner.
    “Talking to a friend or a therapist can help but so can practice.”
    The team focused on three scenarios that cause people the most stress and anxiety in their daily lives, according to research — public speaking, crowded social events and interpersonal conflict.

    OPS: A screenshot from the VR experience showing stress relief strategies for interacting at a party.

    Picture supplied by Ferrari (FOTO: DUKAS/FERRARI PRESS)

     

  • Public speaking stress buster uses virtual reality audience
    DUKAS_183785268_FER
    Public speaking stress buster uses virtual reality audience
    Ferrari Press Agency
    Avatars 1
    Ref 16751
    21/04/2025
    See Ferrari text
    Pictures must credit: Carnegie Mellon University
    People nervous about public speaking or interacting with strangers at parties are being offered help — with a virtual world where they can practice their social skills.
    A team has developed virtual and augmented reality situations that simulate stressful situations and help people practice stress-relief strategies.
    Users put on a pair of VR or AR glasses and practice what they want to say with a digital audience.
    A team at the USA’s Carnegie Mellon University tested the stress simulation technology on a group of 19 volunteers, the majority of whom overwhelmingly supported it.
    A spokesperson said: “Everyday situations can sometimes feel like big stressors, whether it's delivering an important work presentation, attending a party full of strangers or confronting a partner.
    “Talking to a friend or a therapist can help but so can practice.”
    The team focused on three scenarios that cause people the most stress and anxiety in their daily lives, according to research — public speaking, crowded social events and interpersonal conflict.

    OPS: A screenshot from the VR experience showing stress-relief strategies for conflict with a roommate.

    Picture supplied by Ferrari (FOTO: DUKAS/FERRARI PRESS)

     

  • Public speaking stress buster uses virtual reality audience
    DUKAS_183785267_FER
    Public speaking stress buster uses virtual reality audience
    Ferrari Press Agency
    Avatars 1
    Ref 16751
    21/04/2025
    See Ferrari text
    Pictures must credit: Carnegie Mellon University
    People nervous about public speaking or interacting with strangers at parties are being offered help — with a virtual world where they can practice their social skills.
    A team has developed virtual and augmented reality situations that simulate stressful situations and help people practice stress-relief strategies.
    Users put on a pair of VR or AR glasses and practice what they want to say with a digital audience.
    A team at the USA’s Carnegie Mellon University tested the stress simulation technology on a group of 19 volunteers, the majority of whom overwhelmingly supported it.
    A spokesperson said: “Everyday situations can sometimes feel like big stressors, whether it's delivering an important work presentation, attending a party full of strangers or confronting a partner.
    “Talking to a friend or a therapist can help but so can practice.”
    The team focused on three scenarios that cause people the most stress and anxiety in their daily lives, according to research — public speaking, crowded social events and interpersonal conflict.

    OPS: A screenshot from the VR experience showing stress-relief strategies for public speaking. This example allows the virtual audience to ask questions.

    Picture supplied by Ferrari (FOTO: DUKAS/FERRARI PRESS)

     

  • Public speaking stress buster uses virtual reality audience
    DUKAS_183785266_FER
    Public speaking stress buster uses virtual reality audience
    Ferrari Press Agency
    Avatars 1
    Ref 16751
    21/04/2025
    See Ferrari text
    Pictures must credit: Carnegie Mellon University
    People nervous about public speaking or interacting with strangers at parties are being offered help — with a virtual world where they can practice their social skills.
    A team has developed virtual and augmented reality situations that simulate stressful situations and help people practice stress-relief strategies.
    Users put on a pair of VR or AR glasses and practice what they want to say with a digital audience.
    A team at the USA’s Carnegie Mellon University tested the stress simulation technology on a group of 19 volunteers, the majority of whom overwhelmingly supported it.
    A spokesperson said: “Everyday situations can sometimes feel like big stressors, whether it's delivering an important work presentation, attending a party full of strangers or confronting a partner.
    “Talking to a friend or a therapist can help but so can practice.”
    The team focused on three scenarios that cause people the most stress and anxiety in their daily lives, according to research — public speaking, crowded social events and interpersonal conflict.

    OPS: A screenshot from the VR experience showing stress-relief strategies for public speaking. This example does not allow the virtual audience to ask questions.

    Picture supplied by Ferrari (FOTO: DUKAS/FERRARI PRESS)

     

  • Using foot movements to order coffee from an app
    DUKAS_179852873_FER
    Using foot movements to order coffee from an app
    Ferrari Press Agency
    Feet 1
    Ref 16475
    17/01/2025
    See Ferrari text
    Picture MUST credit: University of Waterloo
    Researchers have developed a way to control smartphone apps — with their feet.
    The idea is that different length strides, dragging a heel or tapping a toe for example can all help navigate options on a smartphone.
    The study, from the University of Waterloo in Ontario, Canada, was the idea of computer science professor Daniel Vogel.
    He was frustrated by having to stop and use his phone with cold fingers while walking to get coffee.
    That got him wondering if there could be a way to place orders without pausing.
    This led to a study where volunteers tested 22 different foot motions, rating them on ease of movement, compatibility with walking, and social acceptability.
    The researchers used an augmented reality headset to detect specific gait patterns.
    The idea is that you can navigate apps by altering your footfall, turning your foot one way or the other as you walk.
    Prof Vogel said extreme movements like dance steps or a jump would likely be easy for a system to recognise.
    But he added that these might be harder to perform and would deviate too far from normal walking for people to feel comfortable doing them in public.

    OPS: A researcher in an augmented reality headset uses different foot movements to open an Uber Eats smartphone app, select a drink and then pay for it.

    Picture supplied by Ferrari (FOTO: DUKAS/FERRARI PRESS)

     

  • Using foot movements to order coffee from an app
    DUKAS_179852872_FER
    Using foot movements to order coffee from an app
    Ferrari Press Agency
    Feet 1
    Ref 16475
    17/01/2025
    See Ferrari text
    Picture MUST credit: University of Waterloo
    Researchers have developed a way to control smartphone apps — with their feet.
    The idea is that different length strides, dragging a heel or tapping a toe for example can all help navigate options on a smartphone.
    The study, from the University of Waterloo in Ontario, Canada, was the idea of computer science professor Daniel Vogel.
    He was frustrated by having to stop and use his phone with cold fingers while walking to get coffee.
    That got him wondering if there could be a way to place orders without pausing.
    This led to a study where volunteers tested 22 different foot motions, rating them on ease of movement, compatibility with walking, and social acceptability.
    The researchers used an augmented reality headset to detect specific gait patterns.
    The idea is that you can navigate apps by altering your footfall, turning your foot one way or the other as you walk.
    Prof Vogel said extreme movements like dance steps or a jump would likely be easy for a system to recognise.
    But he added that these might be harder to perform and would deviate too far from normal walking for people to feel comfortable doing them in public.

    OPS: A researcher in an augmented reality headset uses different foot movements to open an Uber Eats smartphone app, select a drink and then pay for it.

    Picture supplied by Ferrari (FOTO: DUKAS/FERRARI PRESS)

     

  • Using foot movements to order coffee from an app
    DUKAS_179852869_FER
    Using foot movements to order coffee from an app
    Ferrari Press Agency
    Feet 1
    Ref 16475
    17/01/2025
    See Ferrari text
    Picture MUST credit: University of Waterloo
    Researchers have developed a way to control smartphone apps — with their feet.
    The idea is that different length strides, dragging a heel or tapping a toe for example can all help navigate options on a smartphone.
    The study, from the University of Waterloo in Ontario, Canada, was the idea of computer science professor Daniel Vogel.
    He was frustrated by having to stop and use his phone with cold fingers while walking to get coffee.
    That got him wondering if there could be a way to place orders without pausing.
    This led to a study where volunteers tested 22 different foot motions, rating them on ease of movement, compatibility with walking, and social acceptability.
    The researchers used an augmented reality headset to detect specific gait patterns.
    The idea is that you can navigate apps by altering your footfall, turning your foot one way or the other as you walk.
    Prof Vogel said extreme movements like dance steps or a jump would likely be easy for a system to recognise.
    But he added that these might be harder to perform and would deviate too far from normal walking for people to feel comfortable doing them in public.

    OPS: A researcher in an augmented reality headset uses different foot movements to control the volume on their phone.

    Picture supplied by Ferrari (FOTO: DUKAS/FERRARI PRESS)

     

  • Using foot movements to order coffee from an app
    DUKAS_179852868_FER
    Using foot movements to order coffee from an app
    Ferrari Press Agency
    Feet 1
    Ref 16475
    17/01/2025
    See Ferrari text
    Picture MUST credit: University of Waterloo
    Researchers have developed a way to control smartphone apps — with their feet.
    The idea is that different length strides, dragging a heel or tapping a toe for example can all help navigate options on a smartphone.
    The study, from the University of Waterloo in Ontario, Canada, was the idea of computer science professor Daniel Vogel.
    He was frustrated by having to stop and use his phone with cold fingers while walking to get coffee.
    That got him wondering if there could be a way to place orders without pausing.
    This led to a study where volunteers tested 22 different foot motions, rating them on ease of movement, compatibility with walking, and social acceptability.
    The researchers used an augmented reality headset to detect specific gait patterns.
    The idea is that you can navigate apps by altering your footfall, turning your foot one way or the other as you walk.
    Prof Vogel said extreme movements like dance steps or a jump would likely be easy for a system to recognise.
    But he added that these might be harder to perform and would deviate too far from normal walking for people to feel comfortable doing them in public.

    OPS: A researcher in an augmented reality headset uses different foot movements to open an Uber Eats smartphone app, select a drink and then pay for it.

    Picture supplied by Ferrari (FOTO: DUKAS/FERRARI PRESS)

     

  • Using foot movements to order coffee from an app
    DUKAS_179852864_FER
    Using foot movements to order coffee from an app
    Ferrari Press Agency
    Feet 1
    Ref 16475
    17/01/2025
    See Ferrari text
    Picture MUST credit: University of Waterloo
    Researchers have developed a way to control smartphone apps — with their feet.
    The idea is that different length strides, dragging a heel or tapping a toe for example can all help navigate options on a smartphone.
    The study, from the University of Waterloo in Ontario, Canada, was the idea of computer science professor Daniel Vogel.
    He was frustrated by having to stop and use his phone with cold fingers while walking to get coffee.
    That got him wondering if there could be a way to place orders without pausing.
    This led to a study where volunteers tested 22 different foot motions, rating them on ease of movement, compatibility with walking, and social acceptability.
    The researchers used an augmented reality headset to detect specific gait patterns.
    The idea is that you can navigate apps by altering your footfall, turning your foot one way or the other as you walk.
    Prof Vogel said extreme movements like dance steps or a jump would likely be easy for a system to recognise.
    But he added that these might be harder to perform and would deviate too far from normal walking for people to feel comfortable doing them in public.

    OPS: A researcher in an augmented reality headset uses different foot movements to open an Uber Eats smartphone app, select a drink and then pay for it.

    Picture supplied by Ferrari (FOTO: DUKAS/FERRARI PRESS)

     

  • Using foot movements to order coffee from an app
    DUKAS_179852863_FER
    Using foot movements to order coffee from an app
    Ferrari Press Agency
    Feet 1
    Ref 16475
    17/01/2025
    See Ferrari text
    Picture MUST credit: University of Waterloo
    Researchers have developed a way to control smartphone apps — with their feet.
    The idea is that different length strides, dragging a heel or tapping a toe for example can all help navigate options on a smartphone.
    The study, from the University of Waterloo in Ontario, Canada, was the idea of computer science professor Daniel Vogel.
    He was frustrated by having to stop and use his phone with cold fingers while walking to get coffee.
    That got him wondering if there could be a way to place orders without pausing.
    This led to a study where volunteers tested 22 different foot motions, rating them on ease of movement, compatibility with walking, and social acceptability.
    The researchers used an augmented reality headset to detect specific gait patterns.
    The idea is that you can navigate apps by altering your footfall, turning your foot one way or the other as you walk.
    Prof Vogel said extreme movements like dance steps or a jump would likely be easy for a system to recognise.
    But he added that these might be harder to perform and would deviate too far from normal walking for people to feel comfortable doing them in public.

    OPS: A researcher in an augmented reality headset uses different foot movements to open an Uber Eats smartphone app, select a drink and then pay for it.

    Picture supplied by Ferrari (FOTO: DUKAS/FERRARI PRESS)

     

  • Using foot movements to order coffee from an app
    DUKAS_179852857_FER
    Using foot movements to order coffee from an app
    Ferrari Press Agency
    Feet 1
    Ref 16475
    17/01/2025
    See Ferrari text
    Picture MUST credit: University of Waterloo
    Researchers have developed a way to control smartphone apps — with their feet.
    The idea is that different length strides, dragging a heel or tapping a toe for example can all help navigate options on a smartphone.
    The study, from the University of Waterloo in Ontario, Canada, was the idea of computer science professor Daniel Vogel.
    He was frustrated by having to stop and use his phone with cold fingers while walking to get coffee.
    That got him wondering if there could be a way to place orders without pausing.
    This led to a study where volunteers tested 22 different foot motions, rating them on ease of movement, compatibility with walking, and social acceptability.
    The researchers used an augmented reality headset to detect specific gait patterns.
    The idea is that you can navigate apps by altering your footfall, turning your foot one way or the other as you walk.
    Prof Vogel said extreme movements like dance steps or a jump would likely be easy for a system to recognise.
    But he added that these might be harder to perform and would deviate too far from normal walking for people to feel comfortable doing them in public.

    OPS: A researcher in an augmented reality headset uses different foot movements to control the volume on their phone.

    Picture supplied by Ferrari (FOTO: DUKAS/FERRARI PRESS)

     

  • Using foot movements to order coffee from an app
    DUKAS_179852855_FER
    Using foot movements to order coffee from an app
    Ferrari Press Agency
    Feet 1
    Ref 16475
    17/01/2025
    See Ferrari text
    Picture MUST credit: University of Waterloo
    Researchers have developed a way to control smartphone apps — with their feet.
    The idea is that different length strides, dragging a heel or tapping a toe for example can all help navigate options on a smartphone.
    The study, from the University of Waterloo in Ontario, Canada, was the idea of computer science professor Daniel Vogel.
    He was frustrated by having to stop and use his phone with cold fingers while walking to get coffee.
    That got him wondering if there could be a way to place orders without pausing.
    This led to a study where volunteers tested 22 different foot motions, rating them on ease of movement, compatibility with walking, and social acceptability.
    The researchers used an augmented reality headset to detect specific gait patterns.
    The idea is that you can navigate apps by altering your footfall, turning your foot one way or the other as you walk.
    Prof Vogel said extreme movements like dance steps or a jump would likely be easy for a system to recognise.
    But he added that these might be harder to perform and would deviate too far from normal walking for people to feel comfortable doing them in public.

    OPS: A researcher in an augmented reality headset uses different foot movements to open an Uber Eats smartphone app, select a drink and then pay for it.

    Picture supplied by Ferrari (FOTO: DUKAS/FERRARI PRESS)

     

  • Using foot movements to order coffee from an app
    DUKAS_179852852_FER
    Using foot movements to order coffee from an app
    Ferrari Press Agency
    Feet 1
    Ref 16475
    17/01/2025
    See Ferrari text
    Picture MUST credit: University of Waterloo
    Researchers have developed a way to control smartphone apps — with their feet.
    The idea is that different length strides, dragging a heel or tapping a toe for example can all help navigate options on a smartphone.
    The study, from the University of Waterloo in Ontario, Canada, was the idea of computer science professor Daniel Vogel.
    He was frustrated by having to stop and use his phone with cold fingers while walking to get coffee.
    That got him wondering if there could be a way to place orders without pausing.
    This led to a study where volunteers tested 22 different foot motions, rating them on ease of movement, compatibility with walking, and social acceptability.
    The researchers used an augmented reality headset to detect specific gait patterns.
    The idea is that you can navigate apps by altering your footfall, turning your foot one way or the other as you walk.
    Prof Vogel said extreme movements like dance steps or a jump would likely be easy for a system to recognise.
    But he added that these might be harder to perform and would deviate too far from normal walking for people to feel comfortable doing them in public.

    OPS: A researcher in an augmented reality headset uses different foot movements to open an Uber Eats smartphone app, select a drink and then pay for it.

    Picture supplied by Ferrari (FOTO: DUKAS/FERRARI PRESS)

     

  • Using foot movements to order coffee from an app
    DUKAS_179852850_FER
    Using foot movements to order coffee from an app
    Ferrari Press Agency
    Feet 1
    Ref 16475
    17/01/2025
    See Ferrari text
    Picture MUST credit: University of Waterloo
    Researchers have developed a way to control smartphone apps — with their feet.
    The idea is that different length strides, dragging a heel or tapping a toe for example can all help navigate options on a smartphone.
    The study, from the University of Waterloo in Ontario, Canada, was the idea of computer science professor Daniel Vogel.
    He was frustrated by having to stop and use his phone with cold fingers while walking to get coffee.
    That got him wondering if there could be a way to place orders without pausing.
    This led to a study where volunteers tested 22 different foot motions, rating them on ease of movement, compatibility with walking, and social acceptability.
    The researchers used an augmented reality headset to detect specific gait patterns.
    The idea is that you can navigate apps by altering your footfall, turning your foot one way or the other as you walk.
    Prof Vogel said extreme movements like dance steps or a jump would likely be easy for a system to recognise.
    But he added that these might be harder to perform and would deviate too far from normal walking for people to feel comfortable doing them in public.

    OPS: A researcher in an augmented reality headset uses different foot movements to open an Uber Eats smartphone app, select a drink and then pay for it. First they scroll through the apps to find the correct one.

    Picture supplied by Ferrari (FOTO: DUKAS/FERRARI PRESS)

     

  • AI-powered augmented reality golf coach
    DUKAS_179839829_FER
    AI-powered augmented reality golf coach
    Ferrari Press Agency
    Caddie 1
    Ref 16472
    17/01/2025
    See Ferrari text
    Picture MUST credit: Caddie Vision
    Here’s a new AI-powered device designed to get amateur golfers swinging like a pro.
    It’s an augmented reality coach said to offer real-time analysis of every swing.
    It also has AI-powered analytics to track swing metrics, course strategy, and real-time visual and audio feedback.
    There is also a lost ball function for finding wild shots.
    And it measures distances to greens to help players select the correct club.
    The system, called Caddie Vision, takes the form of a pair of AR glasses that connect to a mobile app.
    All the information needed is displayed on the inside of the lenses.

    OPS: Prototype Caddie Vision app view with the AR glasses.

    Picture supplied by Ferrari
    (FOTO: DUKAS/FERRARI PRESS)

     

  • AI-powered augmented reality golf coach
    DUKAS_179839828_FER
    AI-powered augmented reality golf coach
    Ferrari Press Agency
    Caddie 1
    Ref 16472
    17/01/2025
    See Ferrari text
    Picture MUST credit: Caddie Vision
    Here’s a new AI-powered device designed to get amateur golfers swinging like a pro.
    It’s an augmented reality coach said to offer real-time analysis of every swing.
    It also has AI-powered analytics to track swing metrics, course strategy, and real-time visual and audio feedback.
    There is also a lost ball function for finding wild shots.
    And it measures distances to greens to help players select the correct club.
    The system, called Caddie Vision, takes the form of a pair of AR glasses that connect to a mobile app.
    All the information needed is displayed on the inside of the lenses.

    OPS: Prototype Caddie Vision app view with the AR glasses.

    Picture supplied by Ferrari
    (FOTO: DUKAS/FERRARI PRESS)

     

  • AI-powered augmented reality golf coach
    DUKAS_179839825_FER
    AI-powered augmented reality golf coach
    Ferrari Press Agency
    Caddie 1
    Ref 16472
    17/01/2025
    See Ferrari text
    Picture MUST credit: Caddie Vision
    Here’s a new AI-powered device designed to get amateur golfers swinging like a pro.
    It’s an augmented reality coach said to offer real-time analysis of every swing.
    It also has AI-powered analytics to track swing metrics, course strategy, and real-time visual and audio feedback.
    There is also a lost ball function for finding wild shots.
    And it measures distances to greens to help players select the correct club.
    The system, called Caddie Vision, takes the form of a pair of AR glasses that connect to a mobile app.
    All the information needed is displayed on the inside of the lenses.

    OPS: Prototype Caddie Vision app view with the AR glasses.

    Picture supplied by Ferrari
    (FOTO: DUKAS/FERRARI PRESS)

     

  • AI-powered augmented reality golf coach
    DUKAS_179839823_FER
    AI-powered augmented reality golf coach
    Ferrari Press Agency
    Caddie 1
    Ref 16472
    17/01/2025
    See Ferrari text
    Picture MUST credit: Caddie Vision
    Here’s a new AI-powered device designed to get amateur golfers swinging like a pro.
    It’s an augmented reality coach said to offer real-time analysis of every swing.
    It also has AI-powered analytics to track swing metrics, course strategy, and real-time visual and audio feedback.
    There is also a lost ball function for finding wild shots.
    And it measures distances to greens to help players select the correct club.
    The system, called Caddie Vision, takes the form of a pair of AR glasses that connect to a mobile app.
    All the information needed is displayed on the inside of the lenses.

    OPS: Prototype Caddie Vision app view with the AR glasses.

    Picture supplied by Ferrari
    (FOTO: DUKAS/FERRARI PRESS)

     

  • AI-powered augmented reality golf coach
    DUKAS_179839821_FER
    AI-powered augmented reality golf coach
    Ferrari Press Agency
    Caddie 1
    Ref 16472
    17/01/2025
    See Ferrari text
    Picture MUST credit: Caddie Vision

    OPS: Prototype Caddie Vision app view with the AR glasses.

    Picture supplied by Ferrari
    (FOTO: DUKAS/FERRARI PRESS)

     

  • AI-powered augmented reality golf coach
    DUKAS_179839818_FER
    AI-powered augmented reality golf coach
    Ferrari Press Agency
    Caddie 1
    Ref 16472
    17/01/2025
    See Ferrari text
    Picture MUST credit: Caddie Vision

    OPS: Prototype Caddie Vision app view with the AR glasses.

    Picture supplied by Ferrari
    (FOTO: DUKAS/FERRARI PRESS)

     

  • AI-powered augmented reality golf coach
    DUKAS_179839817_FER
    AI-powered augmented reality golf coach
    Ferrari Press Agency
    Caddie 1
    Ref 16472
    17/01/2025
    See Ferrari text
    Picture MUST credit: Caddie Vision

    OPS: Prototype Caddie Vision app view with the AR glasses.

    Picture supplied by Ferrari
    (FOTO: DUKAS/FERRARI PRESS)

     

  • Smart glasses open a window on lifestyle
    DUKAS_176683703_FER
    Smart glasses open a window on lifestyle
    Ferrari Press Agency
    Sense 1
    Ref 16253
    24/10/2024
    See Ferrari text
    Picture MUST credit: Emteq Labs
    A pair of high-tech glasses equipped with artificial intelligence has been developed, said to be the world’s first emotion-sensing eyewear.
    Using built-in cameras, they monitor posture, facial expressions to gauge mood, eating and dietary habits, as well as walking and exercise, to provide AI-powered analysis.
    Other uses include gaming with player insights, training and rehabilitation by displaying instructions, and helping with mental health.
    Research demonstrated that the glasses, called OCOsense, could reliably distinguish between individuals with and without depression based on their emotional and facial behaviours.
    This opens up possibilities for remote diagnostics and continuous monitoring of mental health conditions like anxiety and depression, and even neurological conditions like autism spectrum disorder.
    They can also be used to flip the pages of augmented reality books with the wink of an eye.
    The spectacles are designed to change how people understand and improve their health in real time.
    They are the work of UK-based Emteq Labs, which is said to be a pioneer of AI-powered emotional and behavioural analysis.

    OPS: The OCOsense smart glasses. Monitoring of facial expressions, which could reliably distinguish between individuals with and without depression based on their emotional and facial behaviours.

    Picture supplied by Ferrari (FOTO: DUKAS/FERRARI PRESS)

     

  • Smart glasses open a window on lifestyle
    DUKAS_176683702_FER
    Smart glasses open a window on lifestyle
    Ferrari Press Agency
    Sense 1
    Ref 16253
    24/10/2024
    See Ferrari text
    Picture MUST credit: Emteq Labs

    OPS: The OCOsense smart glasses give a readout to the wearer at meal time. They analyse the food and also monitor eating habits, such as ensuring food is chewed correctly.

    Picture supplied by Ferrari (FOTO: DUKAS/FERRARI PRESS)

     

  • Smart glasses open a window on lifestyle
    DUKAS_176683701_FER
    Smart glasses open a window on lifestyle
    Ferrari Press Agency
    Sense 1
    Ref 16253
    24/10/2024
    See Ferrari text
    Picture MUST credit: Emteq Labs

    OPS: The OCOsense smart glasses.

    Picture supplied by Ferrari (FOTO: DUKAS/FERRARI PRESS)

     

  • Smart glasses open a window on lifestyle
    DUKAS_176683700_FER
    Smart glasses open a window on lifestyle
    Ferrari Press Agency
    Sense 1
    Ref 16253
    24/10/2024
    See Ferrari text
    Picture MUST credit: Emteq Labs

    OPS: The OCOsense smart glasses give a readout to the wearer about posture.

    Picture supplied by Ferrari (FOTO: DUKAS/FERRARI PRESS)

     

  • Smart glasses open a window on lifestyle
    DUKAS_176683699_FER
    Smart glasses open a window on lifestyle
    Ferrari Press Agency
    Sense 1
    Ref 16253
    24/10/2024
    See Ferrari text
    Picture MUST credit: Emteq Labs

    OPS: The OCOsense smart glasses give a readout to the wearer at meal time. They analyse the food and also monitor eating habits, such as ensuring food is chewed correctly.

    Picture supplied by Ferrari (FOTO: DUKAS/FERRARI PRESS)

     

  • Smart glasses open a window on lifestyle
    DUKAS_176683698_FER
    Smart glasses open a window on lifestyle
    Ferrari Press Agency
    Sense 1
    Ref 16253
    24/10/2024
    See Ferrari text
    Picture MUST credit: Emteq Labs

    OPS: The OCOsense smart glasses. This cutaway shows the placement of the tiny cameras, which here monitor facial expression.

    Picture supplied by Ferrari (FOTO: DUKAS/FERRARI PRESS)

     

  • Smart glasses open a window on lifestyle
    DUKAS_176683697_FER
    Smart glasses open a window on lifestyle
    Ferrari Press Agency
    Sense 1
    Ref 16253
    24/10/2024
    See Ferrari text
    Picture MUST credit: Emteq Labs

    OPS: The OCOsense smart glasses. Monitoring of facial expressions, which could reliably distinguish between individuals with and without depression based on their emotional and facial behaviours.

    Picture supplied by Ferrari (FOTO: DUKAS/FERRARI PRESS)

     

  • Smart glasses open a window on lifestyle
    DUKAS_176683696_FER
    Smart glasses open a window on lifestyle
    Ferrari Press Agency
    Sense 1
    Ref 16253
    24/10/2024
    See Ferrari text
    Picture MUST credit: Emteq Labs

    OPS: The OCOsense smart glasses give a readout to the wearer about activity.

    Picture supplied by Ferrari (FOTO: DUKAS/FERRARI PRESS)

     

  • Smart glasses open a window on lifestyle
    DUKAS_176683695_FER
    Smart glasses open a window on lifestyle
    Ferrari Press Agency
    Sense 1
    Ref 16253
    24/10/2024
    See Ferrari text
    Picture MUST credit: Emteq Labs

    OPS: The OCOsense smart glasses. This cutaway shows the placement of the tiny cameras, which here monitor facial expression.

    Picture supplied by Ferrari (FOTO: DUKAS/FERRARI PRESS)

     

  • Smart glasses open a window on lifestyle
    DUKAS_176683694_FER
    Smart glasses open a window on lifestyle
    Ferrari Press Agency
    Sense 1
    Ref 16253
    24/10/2024
    See Ferrari text
    Picture MUST credit: Emteq Labs

    OPS: The OCOsense smart glasses give a readout to the wearer at meal time. They analyse the food and also monitor eating habits, such as ensuring food is chewed correctly.

    Picture supplied by Ferrari (FOTO: DUKAS/FERRARI PRESS)

     

  • Smart glasses open a window on lifestyle
    DUKAS_176683693_FER
    Smart glasses open a window on lifestyle
    Ferrari Press Agency
    Sense 1
    Ref 16253
    24/10/2024
    See Ferrari text
    Picture MUST credit: Emteq Labs

    OPS: The OCOsense smart glasses. This cutaway shows the placement of the tiny cameras, which feed an AI brain.

    Picture supplied by Ferrari (FOTO: DUKAS/FERRARI PRESS)

     

  • Smart glasses open a window on lifestyle
    DUKAS_176683691_FER
    Smart glasses open a window on lifestyle
    Ferrari Press Agency
    Sense 1
    Ref 16253
    24/10/2024
    See Ferrari text
    Picture MUST credit: Emteq Labs

    OPS: The OCOsense smart glasses give a readout to the wearer about posture.

    Picture supplied by Ferrari (FOTO: DUKAS/FERRARI PRESS)

     

  • Smart glasses open a window on lifestyle
    DUKAS_176683690_FER
    Smart glasses open a window on lifestyle
    Ferrari Press Agency
    Sense 1
    Ref 16253
    24/10/2024
    See Ferrari text
    Picture MUST credit: Emteq Labs

    OPS: The OCOsense smart glasses. Flipping the pages of an augmented reality book by simply winking.

    Picture supplied by Ferrari (FOTO: DUKAS/FERRARI PRESS)

     

  • Smart glasses open a window on lifestyle
    DUKAS_176683686_FER
    Smart glasses open a window on lifestyle
    Ferrari Press Agency
    Sense 1
    Ref 16253
    24/10/2024
    See Ferrari text
    Picture MUST credit: Emteq Labs

    OPS: The OCOsense smart glasses give a readout to the wearer at meal time. They analyse the food and also monitor eating habits, such as ensuring food is chewed correctly.

    Picture supplied by Ferrari (FOTO: DUKAS/FERRARI PRESS)

     

  • Smart glasses open a window on lifestyle
    DUKAS_176683685_FER
    Smart glasses open a window on lifestyle
    Ferrari Press Agency
    Sense 1
    Ref 16253
    24/10/2024
    See Ferrari text
    Picture MUST credit: Emteq Labs

    OPS: The OCOsense smart glasses. Monitoring of facial expressions, which could reliably distinguish between individuals with and without depression based on their emotional and facial behaviours.

    Picture supplied by Ferrari (FOTO: DUKAS/FERRARI PRESS)

     

  • Smart glasses open a window on lifestyle
    DUKAS_176683682_FER
    Smart glasses open a window on lifestyle
    Ferrari Press Agency
    Sense 1
    Ref 16253
    24/10/2024
    See Ferrari text
    Picture MUST credit: Emteq Labs

    OPS: The OCOsense smart glasses. Using just the eyes to flip the pages of an augmented reality book.

    Picture supplied by Ferrari (FOTO: DUKAS/FERRARI PRESS)

     

  • Smart glasses open a window on lifestyle
    DUKAS_176683679_FER
    Smart glasses open a window on lifestyle
    Ferrari Press Agency
    Sense 1
    Ref 16253
    24/10/2024
    See Ferrari text
    Picture MUST credit: Emteq Labs

    OPS: The OCOsense smart glasses give a readout to the wearer at meal time. They analyse the food and also monitor eating habits, such as ensuring food is chewed correctly.

    Picture supplied by Ferrari (FOTO: DUKAS/FERRARI PRESS)

     

  • Smart glasses open a window on lifestyle
    DUKAS_176683676_FER
    Smart glasses open a window on lifestyle
    Ferrari Press Agency
    Sense 1
    Ref 16253
    24/10/2024
    See Ferrari text
    Picture MUST credit: Emteq Labs

    OPS: The OCOsense smart glasses. This cutaway shows the placement of the tiny cameras, which here monitor facial expression.

    Picture supplied by Ferrari (FOTO: DUKAS/FERRARI PRESS)

     

  • Smart glasses open a window on lifestyle
    DUKAS_176683675_FER
    Smart glasses open a window on lifestyle
    Ferrari Press Agency
    Sense 1
    Ref 16253
    24/10/2024
    See Ferrari text
    Picture MUST credit: Emteq Labs

    OPS: The OCOsense smart glasses. The monitoring of facial expressions could reliably distinguish between individuals with and without depression based on their emotional and facial behaviours.

    Picture supplied by Ferrari (FOTO: DUKAS/FERRARI PRESS)

     

  • Meta Orion augmented reality glasses with AI
    DUKAS_175507037_BES
    Meta Orion augmented reality glasses with AI
    Pictures must credit: Meta Tech giant Meta has unveiled its new augmented reality glasses, which could one day even compete with smartphones. They can take a hands-free video call to catch up with friends and family in real time, and let users stay connected on WhatsApp and Messenger to view and send messages. The spectacles are not the same as the company’s Ray-Ban Meta AI glasses, which have just received updates. The new glasses, called Orion, are still under development and no release date has yet been announced. They are powered by artificial intelligence in the form of Meta AI, a smart assistant developed in-house. Wearers get to experience large holographic screens that float before their eyes. The device is claimed to understand and interpret what users may need in the moment, in real time. The miniaturised cameras and sensors sit on the sides of the frame. © Meta via JLPPA/Bestimage
    JLPPA / Bestimage

     

  • Meta Orion augmented reality glasses with AI
    DUKAS_175507035_BES
    Meta Orion augmented reality glasses with AI
    Pictures must credit: Meta Tech giant Meta has unveiled its new augmented reality glasses, which could one day even compete with smartphones. They can take a hands-free video call to catch up with friends and family in real time, and let users stay connected on WhatsApp and Messenger to view and send messages. The spectacles are not the same as the company’s Ray-Ban Meta AI glasses, which have just received updates. The new glasses, called Orion, are still under development and no release date has yet been announced. They are powered by artificial intelligence in the form of Meta AI, a smart assistant developed in-house. Wearers get to experience large holographic screens that float before their eyes. The device is claimed to understand and interpret what users may need in the moment, in real time. The miniaturised cameras and sensors sit on the sides of the frame. © Meta via JLPPA/Bestimage
    JLPPA / Bestimage

     

  • Meta Orion augmented reality glasses with AI
    DUKAS_175507034_BES
    Meta Orion augmented reality glasses with AI
    Pictures must credit: Meta Tech giant Meta has unveiled its new augmented reality glasses, which could one day even compete with smartphones. They can take a hands-free video call to catch up with friends and family in real time, and let users stay connected on WhatsApp and Messenger to view and send messages. The spectacles are not the same as the company’s Ray-Ban Meta AI glasses, which have just received updates. The new glasses, called Orion, are still under development and no release date has yet been announced. They are powered by artificial intelligence in the form of Meta AI, a smart assistant developed in-house. Wearers get to experience large holographic screens that float before their eyes. The device is claimed to understand and interpret what users may need in the moment, in real time. The miniaturised cameras and sensors sit on the sides of the frame. © Meta via JLPPA/Bestimage
    JLPPA / Bestimage

     

  • Meta Orion augmented reality glasses with AI
    DUKAS_175507031_BES
    Meta Orion augmented reality glasses with AI
    Pictures must credit: Meta Tech giant Meta has unveiled its new augmented reality glasses, which could one day even compete with smartphones. They can take a hands-free video call to catch up with friends and family in real time, and let users stay connected on WhatsApp and Messenger to view and send messages. The spectacles are not the same as the company’s Ray-Ban Meta AI glasses, which have just received updates. The new glasses, called Orion, are still under development and no release date has yet been announced. They are powered by artificial intelligence in the form of Meta AI, a smart assistant developed in-house. Wearers get to experience large holographic screens that float before their eyes. The device is claimed to understand and interpret what users may need in the moment, in real time. The miniaturised cameras and sensors sit on the sides of the frame. © Meta via JLPPA/Bestimage
    JLPPA / Bestimage

     

  • Next page