Art


This photo got 3rd in an AI art contest—then its human photographer came forward

Say cheese —

Humans pretending to be machines isn’t exactly a victory for the creative spirit.

To be fair, I wouldn’t put it past an AI model to forget the flamingo’s head.

A juried photography contest has disqualified one of the images that was originally picked as a top three finisher in its new AI art category. The reason for the disqualification? The photo was actually taken by a human and not generated by an AI model.

The 1839 Awards launched last year as a way to “honor photography as an art form,” with a panel of experienced judges who work with photos at The New York Times, Christie’s, and Getty Images, among others. The contest rules placed AI images in their own category to separate the work of increasingly impressive image generators from “those who use the camera as their artistic medium,” as the 1839 Awards site puts it.

For the non-AI categories, the 1839 Awards rules note that they “reserve the right to request proof of the image not being generated by AI as well as for proof of ownership of the original files.” Apparently, though, the awards did not request any corresponding proof that submissions in the AI category were generated by AI.

The 1839 Awards winners page for the “AI” category, before Astray’s photo was disqualified.

Because of this, the photographer, who goes by the pen name Miles Astray, was able to enter his photo “F L A M I N G O N E” into that AI-generated category, where it was shortlisted and then picked for third place over plenty of other entries that were not made by a human holding a camera. The photo also won the People’s Choice Award for the AI category after Astray repeatedly lobbied his social media followers to vote for it.

Making a statement

On his website, Astray tells the story of a 5 am photo shoot in Aruba where he captured the photo of a flamingo that appears to have lost its head. Astray said he entered the photo in the AI category “to prove that human-made content has not lost its relevance, that Mother Nature and her human interpreters can still beat the machine, and that creativity and emotion are more than just a string of digits.”

That’s not a completely baseless concern. Last year, German artist Boris Eldagsen made headlines after his AI-generated picture “The Electrician” won first prize in the Creative category of the World Photography Organization’s Sony World Photography Award. Eldagsen ended up refusing the prize, writing that he had entered “as a cheeky monkey, to find out if the competitions are prepared for AI images to enter. They are not.”

In a statement provided to press outlets after Astray revealed his deception, the 1839 Awards organizers noted that Astray’s entry was disqualified because it “did not meet the requirements for the AI-generated image category. We understand that was the point, but we don’t want to prevent other artists from their shot at winning in the AI category. We hope this will bring awareness (and a message of hope) to other photographers worried about AI.”

For his part, Astray says his disqualification from the 1839 Awards was “a completely justified and right decision that I expected and support fully.” But he also writes that the work’s initial success at the awards “was not just a win for me but for many creatives out there.”

Even a mediocre human-written comedy special might seem impressive if you thought an AI wrote it.

I’m not sure I buy that interpretation, though. Art isn’t like chess, where the brute force of machine-learning efficiency has made even the best human players relatively helpless. Instead, as conceptual artist Danielle Baskin told Ars when talking about the DALL-E image generator, “all modern AI art has converged on kind of looking like a similar style, [so] my optimistic speculation is that people are hiring way more human artists now.”

The whole situation brings to mind the ostensibly AI-generated George Carlin-style comedy special released earlier this year, which the creators later admitted was written entirely by a human. At the time, I noted how our views of works of art are immediately colored as soon as the “AI generated” label is applied. Maybe you grade the work on a bit of a curve (“Well, it’s not bad for a machine”), or maybe you judge it more harshly for its artificial creation (“It obviously doesn’t have the human touch”).

In any case, reactions to AI artwork are “a reflection of all the fear and promise inherent in computers continuing to encroach on areas we recently thought were exclusively ‘human,’ as well as the economic and philosophical impacts of that trend,” as I wrote when talking about the fake AI Carlin. And those human-centric biases mean we can’t help but use a different eye to judge works of art presented as AI creations.

Entering a human photograph into an AI-generated photo contest says more about how we can exploit those biases than it does about the inherent superiority of man or machine in a field as subjective as art. This isn’t John Henry bravely standing up to a steam engine; it’s Homer Simpson winning a nuclear plant design contest that was not intended for him.



Frozen embryos are “children,” according to Alabama’s Supreme Court

frozen cell balls —

IVF often produces more embryos than are needed or used.

January 17, 2024, Berlin: In the cell laboratory at the Fertility Center Berlin, a microscope is used during the fertilization of an egg cell.

The Alabama Supreme Court on Friday ruled that frozen embryos are “children,” entitled to full personhood rights, and anyone who destroys them could be liable in a wrongful death case.

The first-of-its-kind ruling throws into question the future use of assisted reproductive technology (ART) involving in vitro fertilization for patients in Alabama—and beyond. With this technology, people who want children but face challenges conceiving can create embryos in clinical settings, which may or may not go on to be implanted in a uterus.

In the Alabama case, a hospital patient wandered through an unlocked door and removed frozen embryos from subzero storage; suffering an ice burn from the extreme cold, the patient dropped the embryos, destroying them. Affected IVF patients filed wrongful-death lawsuits against the IVF clinic under the state’s Wrongful Death of a Minor Act. The case was initially dismissed in a lower court, which ruled that the embryos did not meet the definition of a child, but the Alabama Supreme Court reversed, ruling that the Act “applies to all children, born and unborn, without limitation.” In a concurring opinion, Chief Justice Tom Parker cited his religious beliefs and quoted the Bible to support the stance.

“Human life cannot be wrongfully destroyed without incurring the wrath of a holy God, who views the destruction of His image as an affront to Himself,” Parker wrote. “Even before birth, all human beings bear the image of God, and their lives cannot be destroyed without effacing his glory.”

In 2020, the US Department of Health and Human Services estimated that there were over 600,000 embryos frozen in storage around the country, a significant percentage of which will likely never result in a live birth.

The process of IVF generally goes like this: First, egg production is overstimulated with hormone treatments. Then, doctors harvest the eggs as well as sperm. The number of eggs harvested can vary, but doctors sometimes try to retrieve as many as possible, ranging from a handful to several dozen, depending on fertility factors. The harvested eggs are fertilized in a clinic, sometimes by combining them with sperm in an incubator or by the more delicate process of directly injecting sperm into a mature egg (intracytoplasmic sperm injection). Any resulting fertilized eggs may then go through additional preparations, including “assisted hatching,” which prepares the embryo’s membrane for attaching to the lining of the uterus, or genetic screening to ensure the embryo is healthy and viable.

Feared reality

This process sometimes yields several embryos, which is typically considered good because each round of IVF can have significant failure rates. According to national ART data collected by the Centers for Disease Control and Prevention, the percentage of egg retrievals that fail to result in a live birth ranges from 46 percent to 91 percent, depending on the patient’s age. The percentage of fertilized egg or embryo transfers that fail to result in a live birth ranges from 51 percent to 76 percent, depending on age. Many patients go through multiple rounds of egg retrievals and embryo transfers.

The whole IVF process often creates numerous embryos but leads to far fewer live births. In 2021, nearly 240,000 patients in the US had over 400,000 ART cycles, resulting in 97,000 live-born infants, according to the CDC.

People who have extra embryos from IVF can currently choose what to do with them, including freezing them for more cycles or future conception attempts, donating them to others wanting to conceive, donating them to research, or having them discarded.

But if, as Alabama’s Supreme Court ruled, embryos are considered “children,” then any embryos that are destroyed or discarded in the process of IVF or afterward could become the subject of wrongful death lawsuits. The ruling creates potentially paralyzing liability for ART clinics and the patients who use them. Doctors may choose to create embryos only one at a time to avoid the liability attached to creating extras, or they may decline to provide IVF altogether to avoid liability when embryos do not survive the process. This could exacerbate the already financially draining and emotionally exhausting process of IVF, potentially putting it entirely out of reach for those who want to use the technology and putting clinics out of business.

Barbara Collura, CEO of RESOLVE: The National Infertility Association, told USA Today that the ruling would likely halt most IVF work in Alabama. “This is exactly what we have been fearful of and worried about where it was heading,” Collura said. “We are extremely concerned that this is now going to happen in other states.”

But the hypothetical risks don’t end there. Health advocates worry that the idea of personhood for an embryonic ball of a few cells could extend to pregnancy outcomes, such as miscarriages or the use of contraceptives.



X-ray imaging of The Night Watch reveals previously unknown lead layer

The latest from Operation Night Watch —

Rembrandt may have used lead-rich oil to prep his canvas and protect it from humidity.

Rembrandt’s The Night Watch underwent many chemical and mechanical alterations over the last 400 years.

Public domain

Rembrandt’s The Night Watch, painted in 1642, is the Dutch master’s largest surviving painting, known particularly for its exquisite use of light and shadow. A new X-ray imaging analysis of the masterpiece has revealed an unexpected lead layer, perhaps applied as a protective measure while preparing the canvas, according to a new paper published in the journal Science Advances. The work was part of the Rijksmuseum’s ongoing Operation Night Watch, the largest multidisciplinary research and conservation project for Rembrandt’s famous painting, devoted to its long-term preservation.

The famous scene depicted in The Night Watch—officially called Militia Company of District II under the Command of Captain Frans Banninck Cocq—was not meant to have taken place at night. Rather, the dark appearance is the result of the accumulation of dirt and varnish over four centuries, as the painting was subject to various kinds of chemical and mechanical alterations.

For instance, in 1715, The Night Watch was moved to Amsterdam’s City Hall (now the Royal Palace on Dam Square). It was too large for the new location, so the painting was trimmed on all four sides, and the trimmed pieces were never found (although in 2021, AI was used to re-create the original full painting). The objective of Operation Night Watch is to employ a wide variety of imaging and analytical techniques to better understand the materials Rembrandt used to create his masterpiece and how those materials have changed over time.

As previously reported, past analyses of Rembrandt’s paintings identified many pigments the Dutch master used in his work, including lead white, multiple ochres, bone black, vermilion, madder lake, azurite, ultramarine, yellow lake, and lead-tin yellow, among others. The artist rarely used pure blue or green pigments, with Belshazzar’s Feast being a notable exception. (The Rembrandt Database is the best resource for a comprehensive chronicling of the many different investigative reports.)

Earlier this year, the researchers at Operation Night Watch found rare traces of a compound called lead formate in the painting. They scanned about half a square meter of the painting’s surface with X-ray powder diffraction mapping (among other methods) and analyzed tiny fragments from the painting with synchrotron micro X-ray probes. This revealed the presence of the lead formates—surprising in itself, but the team also identified those formates in areas where there was no lead pigment, white or yellow. It’s possible that lead formates disappear fairly quickly, which could explain why they have not been detected in paintings by the Dutch Masters until now. But if that is the case, why didn’t the lead formate disappear in The Night Watch? And where did it come from in the first place?

Hoping to answer these questions, the team whipped up a model of “cooked oils” from a 17th-century recipe, which called for mixing and heating linseed oil and lead oxide, then adding hot water to the reacting mixture. They analyzed those model oils with synchrotron radiation. The results supported their hypothesis that the oil used for light parts of the painting was treated with an alkaline lead drier. The fact that The Night Watch was revarnished with an oil-based varnish in the 18th century complicates matters, as this may have provided a fresh source of formic acid, such that different regions of the painting rich in lead formates may have formed at different times in the painting’s history.

This latest paper sheds more light on the painting by focusing on the preparatory layers applied to the canvas. It’s known that Rembrandt used a quartz-clay ground for The Night Watch—the first time he had done so, perhaps because the colossal size of the painting “motivated him to look for a cheaper, less heavy and more flexible alternative for the ground layer” than the red earth, lead white, and cerussite he was known to use on earlier paintings, the authors suggested.

A previously unknown lead-containing impregnation layer was discovered in Rembrandt’s The Night Watch via correlated synchrotron-based X-ray fluorescence and ptychographic tomography of a paint sample, supported by a macroscale X-ray fluorescence scan of the whole painting.

Fréderique Broers

Per the authors, this is the first time such 3D X-ray imaging techniques have been used: X-ray fluorescence and X-ray ptychographic nano-tomography applied to an embedded paint fragment consisting of only the quartz-clay ground. The authors note that microscale analysis of historical paintings usually relies on 2D imaging techniques (e.g., light microscopy, scanning electron microscopy, synchrotron radiation spectroscopy), which yield only partial information about the size, shape, and distribution of pigment particles below the visible surface.

The 3D methods capture more detail by comparison, revealing the presence of an unknown (and unexpected) lead-containing layer located just underneath the ground layer. The authors suggest that this could be due to a lead compound added as a drier to the oil used to prepare the canvas—perhaps to protect the painting from the damaging effects of humidity. (Usually a glue sizing was applied before the ground layer.)

The Night Watch originally hung in the “great hall” of a musketeer shooting range in Amsterdam and faced windows. The authors note that since the Middle Ages, red lead in oil has been used to preserve stone, wood, and metal against humidity, and one contemporary source mentions using lead-rich oil instead of the typical glue to keep the canvas from separating after years of exposure in humid environments. And that newly discovered lead layer could be the reason for the unusual lead protrusions in areas of The Night Watch with no other lead-containing compounds in the paint. It’s possible that lead migrated into the painting’s ground layer from that lead-oil preparatory layer below.

DOI: Science Advances, 2023. 10.1126/sciadv.adj9394  (About DOIs).



Exploring the World of Live XR Theater

The last three years may feel as though they’ve gone by pretty quickly. A few short years ago, we were seeing an explosion of interest and production in XR theater and live virtual entertainment. The pandemic left a lot of theaters empty, creating a strong need among audiences and entertainers alike for new ways to connect.

Now it’s 2023. Theaters are open again. But that doesn’t mean XR theater has gone anywhere. Far from being a temporary fix to get us through an isolated event, live VR entertainment is stronger than ever. It remains a way to explore new avenues of storytelling and even bring new audiences into traditional entertainment venues.

Understanding Immersive Theater

Before we dive in, a quick clarifying note may be required. While some readers will hopefully come from a theater background, most readers are likely more familiar with XR terminology, so one particular term might be confusing.

When ARPost describes an experience as “immersive,” we’re usually talking about a 3D virtual environment that is spatially explored either by physical movement in augmented or mixed reality, or through spatial navigation in virtual reality. However, XR does not have a monopoly on the word.

“Immersive theater” is a term from the live entertainment world that far predates XR and XR theater. In this form of immersive theater, participants converse with actors, manipulate props, and physically move through sets that might take up an entire building. While the pandemic played a part in the growth of XR theater, its roots are in immersive theater.

“Due to our familiarity with the genre of immersive theatre, and some of our team members had prior experience performing in and being audience members in VR theatre shows, the transition from in real life (IRL) to VR was very natural,” Ferryman Collective founding member Stephen Butchko told ARPost.

Ferryman Collective, one of the premier production companies in XR theater, was founded during the pandemic, but its founding members had already been performing immersive theater in live venues for years. In fact, one of Ferryman Collective’s first major productions, The Severance Theory: Welcome to Respite, began life as an in-person immersive theater production.

From Gaming to XR Theater

The Under Presents, released in 2019, might be the first major piece of XR theater. Tender Claws, the development studio behind the production, had been exploring innovative digital productions and engagements for four years already, but The Under Presents is where our story begins.

The experience, built as a game that sometimes featured live actors, introduced countless viewers to live XR theater. It also inspired other artists at a time when the theater community was in dire need of something new and different.

“Born out of the Pandemic”

“Ferryman Collective was born out of the pandemic and brought together by the magic of The Under Presents, or ‘TUP’, as we affectionately call it,” Ferryman Collective founding member Deirdre Lyons told ARPost. “The pandemic shut everything down in 2020 except for TUP, as people performed and participated in it from home.”

In 2019, Lyons was one of Tender Claws’ VR actors – a job that she still holds while also producing, directing, and acting in productions by Ferryman Collective. A number of Ferryman Collective’s members met while working on TUP.

The live show was only supposed to run for three months, but the run was extended due to its popularity. The live component of the app and game was eventually closed, leaving the actors free to work on other projects, with Tender Claws’ second major XR theater production, Tempest, coming out the following year.

Ferryman Collective’s first production, PARA, a horror story about a dubious AI startup, came out in the autumn of 2020. The show was written by Braden Roy, and was directed by Roy and Brian Tull, who had also met working on TUP. Roy also wrote Ferryman Collective’s second production, Krampusnacht, directed by Roy, Tull, and Lyons in the winter of 2020-2021.

XR Theater Meets Immersive Theater

Ferryman Collective learned a lot from PARA and Krampusnacht. The latter got the collective their first award nomination, with a run that was extended four times to keep up with interest. However, the collective’s breakout production was The Severance Theory: Welcome to Respite – an XR adaptation of a pre-pandemic live immersive theater production.

“Having experienced quiet moments of contemplation with other audience members within my experience as an actor on TUP, I knew that this medium had the potential for a profound connection,” said Lyons. “Having done some voiceover work on The Severance Theory: Welcome to Respite […] I felt this piece could be that kind of powerful experience in VR.”

Lyons reached out to the play’s creator, Lyndsie Scoggin, who had also been sidelined by the pandemic. Scoggin went from not owning a headset to writing and directing the XR theater adaptation, which took on a life of its own.

“The IRL version of [Welcome to Respite] was performed for one audience member who plays a seven-year-old kid named Alex,” Butchko told ARPost. “In the VR version, we are able to include up to nine additional audience members who are put into invisible avatars and play the alternate aspects of Alex’s personality, the Alter Egos.”

Bringing in Participants

Ferryman Collective’s approach to Welcome to Respite brings in more participants per show, but it also allows the participants to shape the story as a group, as each one gets a vote on the actions taken by the singular Alex over the course of the play.
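
The mechanics of that group decision are simple enough to sketch. The snippet below is a hypothetical illustration of this kind of majority vote, not Ferryman Collective’s actual show logic; the choice labels, function names, and tie-breaking rule are all assumptions made for the example.

```typescript
// Hypothetical sketch: audience members (the "Alter Egos") each vote on what
// Alex should do next, and the majority choice becomes the action in the show.
type ActionChoice = string; // e.g. "open the door", "hide"

function tallyVotes(votes: ActionChoice[]): ActionChoice | null {
  if (votes.length === 0) return null;
  const counts = new Map<ActionChoice, number>();
  for (const v of votes) counts.set(v, (counts.get(v) ?? 0) + 1);

  // Pick the choice with the most votes; ties resolve to the first counted choice.
  let winner: ActionChoice = votes[0];
  let best = 0;
  for (const [choice, n] of counts) {
    if (n > best) {
      best = n;
      winner = choice;
    }
  }
  return winner;
}

// Example: nine Alter Egos vote on Alex's next action.
const nextAction = tallyVotes([
  "open the door", "open the door", "hide", "open the door",
  "hide", "open the door", "hide", "open the door", "hide",
]);
console.log(nextAction); // "open the door"
```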

Expanding the scale of XR theater audiences is one of the pioneering pursuits of “scrappy storyteller” Brandan Bradley. Bradley has been exploring XR since 2017 but really dove into it during the pandemic. During this time he has launched his own projects and XR theater productions and has also acted in productions by Ferryman Collective.

“The pandemic brought about this collision of my two loves: interactive media and fine arts,” Bradley told ARPost in a 2020 interview.

NON-PLAYER CHARACTER - a VR Musical - Brandan Bradley

Bradley’s current production, NPC, brings in a group decision dynamic similar to Welcome to Respite. Bradley plays a side character in a video game who, after the game’s main character dies, turns to the audience for guidance. The audience consists of four “on-stage” participants who interact with him directly and a larger “seated audience” that watches the action unfold.

Expanding the Audience

Splitting the audience like this does a number of things for Bradley. Traditional immersive theater experiences might only have the participating audience – and most XR theater still works that way. From a strictly box office perspective, bringing in the “seated audience” allows Bradley to sell significantly more tickets per performance.

There’s also an audience accommodation aspect. While the “seated audience” might be interested in seeing a story that is shaped by the audience, shaping the story themselves might not be their cup of tea. Further, the “seated audience” can join on more widely affordable and available devices – including a web browser.

“There is a large contingency of the audience that enjoys a more passive role – like a Twitch chat come to life,” Bradley told me over coffee at AWE. “My mom, who will never put on goggles, is willing to join on the keyboard.”

Bradley’s OnBoardXR – a sort of workshop and venue for XR entertainers to begin developing and testing live performances – uses a similar ticketing model. In a lobby, audience members choose different avatars to signal to the actors the degree to which they feel comfortable participating.

NPC and OnBoardXR take place in the browser and can be joined in a headset, on a desktop, or even on a mobile phone. Ferryman Collective performs in VRChat, which offers similar options. This is a departure from Tender Claws’ VR-only productions.

“All of us would love to be The Under Presents […] but the price point is outrageous and the timetable is untenable for someone who just wants to keep producing […] we’re kind of ‘Off Broadway,’” said Bradley. “This is the balance that we’re all doing. There are things we would all love to do with more robust tools […] right now it’s more important to have more participants.”

Exploring Affordances

Anytime anything is brought into virtual reality, there are benefits and barriers. Live theater is no different. Set and prop design, construction, and storage can be a lot easier, to the point that no XR production ever needs to be permanently ended. A show can be revived at any time because everything exists as files rather than as physical objects that must be stored.

However, physicality and expression can be a trade-off. A character may be fantastically designed for VR, but controlling it and expressing through it isn’t always easy – even with special avatars with controller-activated expressions.

“Emotions within the scene must be conveyed through the actor’s voice and sometimes stylized gestures[…],” said Butchko. “Things that we cannot do easily or convincingly are eat, drink, and lay down. Those were all found in the IRL production of [Welcome to Respite], but could not be used in the VR version due to technical limitations.”

Further, if you’re still comparing XR theater with a typical play instead of immersive theater, there are a few more details that you might have missed. Some in-person immersive theater involves physical contact between actors and participants, or at least involves participants physically interacting with sets and props.

“Not all immersive shows have physical actor-audience contact but there’s still the physicality of the structure and props that can’t be replicated without building a physical space,” Tull told ARPost. “Smell and taste are noticed less, though the potpourri of an old mansion or a gin and tonic at a seedy speakeasy go a long way in completing the illusion.”

Tull further commented that, even when “physical actor-audience contact” is involved, “the visual immersion of virtual reality can almost replicate the intimacy of actual touch.” I certainly found this to be the case.

Exploring Emotion

As a participant in Ferryman Collective’s Gumball Dreams, an actor reached out and virtually put his hand on my chest. If an actor had physically done this in an IRL production, I dare say that this would have made me immensely uncomfortable in the worst way. But, in VR, this came across as intended – a moving intimate gesture between characters in a story.

Gumball Dreams has an amusing name and a brightly colored and stylized virtual world. However, the actual story is an incredibly moving exploration of mortality and consciousness. Similar themes exist in NPC, while Welcome to Respite explores the experience of psychological disorders. What makes XR theater so conducive to these heavy topics?

“At a media level, when you’re inviting the kind of immersion that VR affords, you want to do more than just comedy,” said Bradley. “There is an emotional intimacy that we experience in VR that we haven’t experienced anywhere else and don’t have words for and that’s the next degree of the storytelling experience.”

In this year’s AWE panel discussion on “XR Entertainment: The Next Generation of Movie Makers and Performers,” Ferryman Collective performer and producer Whitton Frank gave a description of XR theater that also explains its draw for audiences as well as entertainers.

“You are given a character and you are a part of the play […] you’re having emotional experiences with another human being which is why, I think, people get excited about this,” said Frank. “That is the way forward – to show people the world in a way that they haven’t seen it before.”

Find an XR Theater Experience

So, how do you know when and which XR theater experiences are available? It’s still a pretty niche field, but it’s close-knit. Start out by following groups like Tender Claws, OnBoardXR, and Ferryman Collective. Then (before or after the show), talk to the other audience members. Some will likely be new to it themselves, but others will be able to point you in the right direction.



ROSE and Mastercard® Augment the Miami Design District in a New Immersive Experience

Mastercard cardholders can see Miami’s Design District in a whole new light, thanks to an immersive experience from ROSE. Follow along for a look at the #Priceless experience and exclusive insight from ROSE on how it came together.

Experience Miami’s Design District In AR

The newest experience on Mastercard’s “Priceless” platform is an AR tour of Miami’s Design District, led by CEO, entrepreneur, and art collector Craig Robins. Some of the artworks at the tour’s seven stops are currently in Robins’ private collection.

ROSE and Mastercard immersive experience - Miami Design District AR tour

“This iconic destination provides enthusiasts with special access to the vibrant arts scene in Miami, as well as the multi-sensory dimensions of the Mastercard brand through our sonic music and immersive visual branding,” said Mastercard Executive VP of Consumer Marketing and Revenue, Monica Biagiotti. “To ensure the experience captures everything the Design District represents, we’re thrilled to partner with the ultimate insider, Craig Robins who introduces a special, curated tour for our guests.”

Each of the stops is marked by an orb that appears in the viewer’s camera feed on their connected smart device. Viewers can select experiences by tapping the orb or by moving toward it. Each bubble transports viewers to its own immersive experience, wherein they can further navigate around the artworks and architecture presented in the tour.
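
As a rough illustration of that interaction model, an AR hotspot like this can be activated either by a tap hit-test or by the tracked camera simply entering a trigger radius around the orb. The sketch below is hypothetical, not ROSE’s actual code; the Orb type, radius value, and callback names are assumptions made for the example.

```typescript
// Hypothetical sketch: select an AR "orb" hotspot by tap or by proximity.
interface Vec3 { x: number; y: number; z: number; }

interface Orb {
  id: string;
  position: Vec3;        // world-space position of the orb anchor
  triggerRadius: number; // meters within which proximity selects the orb (assumed value)
}

function distance(a: Vec3, b: Vec3): number {
  const dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
  return Math.sqrt(dx * dx + dy * dy + dz * dz);
}

// Called every frame with the viewer's tracked camera position.
function checkProximity(orbs: Orb[], viewer: Vec3, onSelect: (o: Orb) => void): void {
  for (const orb of orbs) {
    if (distance(orb.position, viewer) < orb.triggerRadius) {
      onSelect(orb); // walking into the orb launches its immersive scene
      return;
    }
  }
}

// Tap handling: a hit test (e.g. a raycast from the screen tap) resolves to an orb id.
function onTap(hitOrbId: string | null, orbs: Orb[], onSelect: (o: Orb) => void): void {
  const orb = orbs.find((o) => o.id === hitOrbId);
  if (orb) onSelect(orb);
}
```

Either path calls the same selection handler, which is what lets a stop respond equally to a deliberate tap or to a viewer who simply walks into the bubble.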


“We’ve been dipping our toes in the AR space to better understand how people want to engage with this new technology,” said Biagiotti. “AR and other emerging data-driven technologies provide us with an opportunity to connect more consumers with their passions – like travel – in innovative ways so they can learn or experience something new and enriching.”

The stops along the tour include:

  • Sol LeWitt’s Wall Drawing #1138;
  • Virgil Abloh’s Dollar a Gallon III;
  • Buckminster Fuller’s Fly’s Eye Dome;
  • Criola’s Interdimensional Portal;
  • John Baldessari’s Fun Mural (Part 1);
  • Urs Fischer’s Standing Julian;
  • Jana Euler’s Two Brides.

ROSE and Mastercard - Priceless immersive experience

“Miami is becoming a destination for art and technology, this experience really marries the two as a way for Mastercard cardholders everywhere to experience iconic and large-scale art pieces from anywhere using immersive technology,” ROSE Associate Creative Director Nicole Riemer said in an emailed response.

How the Immersive Experience Came Together

This immersive experience is the second XR experience created by ROSE and Mastercard. The first was a Snapchat Lens for Mastercard’s Stand Up To Cancer campaign that launched last summer.

“Working with Mastercard™ has been great!” said Riemer. “For both projects, the Mastercard™ teams have been super excited about using augmented reality and have trusted us as experts in this space. That has allowed us to have a very collaborative relationship.”

Another familiar piece of the puzzle was 8th Wall. ROSE had been working with the company since before it was purchased by Niantic, and the relationship shows no signs of souring.

“Their support is always great. Their team is very available to work through how to implement our ideas on their platform, and being open to feedback about features,” said Riemer. “During the development of this project, their team checked in often to see if they could offer support, as well as planning for using this as a case study for their platform.”

How to Experience Miami From Anywhere

Mastercard cardholders can virtually visit Miami’s Design District here. The immersive experience runs through June 1, 2023.

(Terms apply. Quantities limited.)



Artebinaria Open-Air Museum: Imaginary Museums Without Walls in Augmented Reality

Sponsored content

Sponsored by Artebinaria

Artebinaria Open-Air Museum is a brand-new technological platform – designed, developed, and curated by Artebinaria – for setting up and opening to the public a collection of imaginary museums in augmented reality. The museums can be geo-located anywhere in the world and visited through the Open-Air Museum app, available for iPhone and iPad.

A series of open-air exhibition spaces comes to life in augmented reality, offering an unparalleled experience for art enthusiasts, with exclusive installations that evolve over time as new exhibitions are added.

Meet Artebinaria

Artebinaria is a tech company based in Florence, Italy, founded by senior software engineer Alessandro Bemporad, operating internationally since 2019 in the field of augmented reality applied to the world of art. The company develops its own solutions for Apple devices and cloud computing platforms.

The team is composed of software engineers and art historians, all with many years of experience in their respective professional fields.

Artebinaria believes in augmented imagination – the symbiosis of creativity, knowledge, and technology.

Artebinaria Open-Air Museum in Florence, London, and Paris

The first three imaginary museums are already open at Piazzale Michelangelo in Florence, at Primrose Hill (The Regent’s Park) in London, and at Place Vendôme in Paris, and they offer a selection of 100 masterpieces of painting from the 13th to the 20th century.



The selection of the artworks and the original content of the information sheets illustrating them have been curated by an art historian, Artebinaria’s Art Director, Maddalena Grazzini.

How to Visit Artebinaria Open-Air Museum

A visit to one of Artebinaria’s Open-Air Museums takes place via the Open-Air Museum app by Artebinaria, which allows visitors who are at one of the geo-located museum sites to explore the artworks in augmented reality.
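
In practice, a “geo-located” visit comes down to a proximity check against each museum’s anchor coordinates. The sketch below is a hypothetical illustration of that idea, not Artebinaria’s actual implementation (which ships as an iOS app): the coordinates are approximate and the 250-meter activation radius is an assumption; TypeScript is used here purely for readability.

```typescript
// Hypothetical sketch: a museum "opens" in the app only when the visitor is
// within some radius of its anchor coordinates.
interface MuseumSite {
  name: string;
  lat: number;     // degrees (approximate)
  lon: number;     // degrees (approximate)
  radiusM: number; // assumed activation radius in meters
}

const SITES: MuseumSite[] = [
  { name: "Piazzale Michelangelo, Florence", lat: 43.7629, lon: 11.2650, radiusM: 250 },
  { name: "Primrose Hill, London", lat: 51.5387, lon: -0.1607, radiusM: 250 },
  { name: "Place Vendôme, Paris", lat: 48.8670, lon: 2.3294, radiusM: 250 },
];

// Haversine great-circle distance in meters.
function haversineM(lat1: number, lon1: number, lat2: number, lon2: number): number {
  const R = 6371000;
  const toRad = (d: number) => (d * Math.PI) / 180;
  const dLat = toRad(lat2 - lat1);
  const dLon = toRad(lon2 - lon1);
  const a =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(lat1)) * Math.cos(toRad(lat2)) * Math.sin(dLon / 2) ** 2;
  return 2 * R * Math.asin(Math.sqrt(a));
}

// Returns the museum site the visitor is currently standing in, if any.
function activeSite(lat: number, lon: number): MuseumSite | undefined {
  return SITES.find((s) => haversineM(lat, lon, s.lat, s.lon) <= s.radiusM);
}
```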

Visits to the Artebinaria Open-Air Museums are free of charge and are offered by selected sponsors, whose logos are visible in 3D directly within the augmented reality scenes.

artebinaria open-air museum augmented reality florence

Each of Artebinaria’s Open-Air Museums is arranged in a series of ‘Pavilions’ and ‘Exhibition Rooms,’ which over time will house new thematic exhibitions.

Inside the virtual rooms, visitors can admire the artworks, displayed at life size, as if they were hanging on invisible walls that do not conceal their surroundings.

Moving within each imaginary room, visitors can admire the artworks from any perspective. In particular, it is possible to get close to an artwork to discover all its details or to touch an artwork in space to view its information sheet.

100 Masterpieces in Augmented Reality

In this first edition, located in London, Paris, and Florence, the Pavilions of Artebinaria Open-Air Museum are dedicated to the themes of everyday life, portraits, mythology, and sacred art. The exhibition rooms display 100 paintings by more than 60 great masters, including Giotto, Vermeer, Rubens, Van Gogh, Leonardo da Vinci, Rembrandt, Gauguin, Renoir, Degas, and Monet.

artebinaria open-air museum augmented reality london phone

Artebinaria Open-Air Museum in Your City

With the opening of the first museums without walls in Florence, London, and Paris, Artebinaria invites all art history enthusiasts to visit them, and also to propose new locations for the opening of fantastic new Open-Air Museums in augmented reality all over the world.

Why not propose that Artebinaria open an Open-Air Museum in your city, too?
