From Digital Age to Nano Age. WorldWide.

Tag: Don't

Robotic Automations

VCs and the military are fueling self-driving startups that don't need roads | TechCrunch


A new crop of early-stage startups — along with some recent VC investments — illustrates a niche emerging in the autonomous vehicle technology sector. Unlike the companies bringing robotaxis to city streets, these startups are taking their tech off-road.  Two recent entrants — Seattle-based Overland AI and New Brunswick-based Potential — are poised to get […]

© 2024 TechCrunch. All rights reserved. For personal use only.


Software Development in Sri Lanka


Photo-sharing community EyeEm will license users' photos to train AI if they don't delete them | TechCrunch


EyeEm, the Berlin-based photo-sharing community that was sold to Spanish company Freepik last year after going bankrupt, is now licensing its users’ photos to train AI models. Earlier this month, the company informed users via email that it was adding a new clause to its Terms & Conditions granting it the right to upload users’ content to “train, develop, and improve software, algorithms, and machine-learning models.” Users were given 30 days to opt out by removing all their content from EyeEm’s platform; otherwise, they were consenting to this use of their work.

At the time of its 2023 acquisition, EyeEm’s photo library included 160 million images and nearly 150,000 users. The company said it would merge its community with Freepik’s over time.

Once thought of as a possible challenger to Instagram — or at least “Europe’s Instagram” — EyeEm had dwindled to a staff of three before selling to Freepik, TechCrunch’s Ingrid Lunden previously reported. Joaquin Cuenca Abela, CEO of Freepik, hinted at the company’s possible plans for EyeEm, saying it would explore how to bring more AI into the equation for creators on the platform.

As it turns out, that meant selling users’ work to train AI models.

Now, EyeEm’s updated Terms & Conditions reads as follows:

8.1 Grant of Rights – EyeEm Community

By uploading Content to EyeEm Community, you grant us regarding your Content the non-exclusive, worldwide, transferable and sublicensable right to reproduce, distribute, publicly display, transform, adapt, make derivative works of, communicate to the public and/or promote such Content.

This specifically includes the sublicensable and transferable right to use your Content for the training, development and improvement of software, algorithms and machine learning models. In case you do not agree to this, you should not add your Content to EyeEm Community.

The rights granted in this section 8.1 regarding your Content remains valid until complete deletion from EyeEm Community and partner platforms according to section 13. You can request the deletion of your Content at any time. The conditions for this can be found in section 13.

Section 13 details a complicated deletion process that begins with deleting photos directly, which, the company notes, would not affect content previously shared to EyeEm Magazine or social media. To delete content from EyeEm Market (where photographers sold their photos) or other content platforms, users would have to submit a request to [email protected] with the Content ID numbers of the photos they wanted to delete, specifying whether the photos should be removed from their account as well or from EyeEm Market only.

Of note, the notice says these deletions from EyeEm Market and partner platforms could take up to 180 days. Yes, that’s right: requested deletions take up to 180 days, yet users had only 30 days to opt out, so a deletion request filed immediately still might not complete in time. That left manually deleting photos one by one as the only reliable option.

Worse still, the company adds that:

You hereby acknowledge and agree that your authorization for EyeEm to market and license your Content according to sections 8 and 10 will remain valid until the Content is deleted from EyeEm and all partner platforms within the time frame indicated above. All license agreements entered into before complete deletion and the rights of use granted thereby remain unaffected by the request for deletion or the deletion.

Section 8 is where licensing rights to train AI are detailed. In Section 10, EyeEm informs users they will forgo their right to any payouts for their work if they delete their account — something users may think to do to avoid having their data fed to AI models. Gotcha!

EyeEm’s move is an example of how AI models are being trained on the back of users’ content, sometimes without their explicit consent. Though EyeEm did offer an opt-out procedure of sorts, any photographer who missed the announcement would have lost the right to dictate how their photos were to be used going forward. Given that EyeEm’s status as a popular Instagram alternative had significantly declined over the years, many photographers may have forgotten they had ever used it in the first place. They certainly may have ignored the email, if it wasn’t already in a spam folder somewhere.

Those who did notice the changes were upset that they were given only 30 days’ notice and no option to bulk delete their contributions, which made opting out all the more painful.

EyeEm did not immediately respond to requests for comment, but given the 30-day countdown, we opted to publish before hearing back.

This sort of dishonest behavior is why users today are considering a move to the open social web. Pixelfed, a federated platform that runs on the same ActivityPub protocol powering Mastodon, is capitalizing on the EyeEm situation to attract users.
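For context on what federation means in practice: ActivityPub servers exchange posts as plain JSON activities, so no single operator controls the content. Below is a minimal sketch of the kind of "Create" activity a Pixelfed-style server might emit for an image post. Every URL, ID, and field value here is invented for illustration and does not come from Pixelfed's actual output.

```python
import json

# Hypothetical ActivityPub "Create" activity for an image post.
# All identifiers below are made up for illustration.
activity = {
    "@context": "https://www.w3.org/ns/activitystreams",
    "id": "https://pixelfed.example/p/alice/1234/activity",
    "type": "Create",
    "actor": "https://pixelfed.example/users/alice",
    "to": ["https://www.w3.org/ns/activitystreams#Public"],
    "object": {
        "id": "https://pixelfed.example/p/alice/1234",
        "type": "Note",
        "attributedTo": "https://pixelfed.example/users/alice",
        "content": "Sunset over the harbor",
        "attachment": [{
            "type": "Image",
            "mediaType": "image/jpeg",
            "url": "https://pixelfed.example/storage/1234.jpg",
        }],
    },
}

# Servers POST this JSON to follower inboxes; any compliant server can render it.
print(json.dumps(activity, indent=2))
```

Because posts federate as self-describing activities like this, a user's photos live on whichever server they choose, rather than in one company's database that can be relicensed out from under them.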

In a post on its official account, Pixelfed announced “We will never use your images to help train AI models. Privacy First, Pixels Forever.”





Metaview's tool records interview notes so that hiring managers don't have to | TechCrunch


Siadhal Magos and Shahriar Tajbakhsh were working at Uber and Palantir, respectively, when they both came to the realization that hiring — particularly the process of interviewing — was becoming unwieldy for many corporate HR departments.

“It was clear to us that the most important part of the hiring process is the interviews, but also the most opaque and unreliable part,” Magos told TechCrunch. “On top of this, there’s a bunch of toil associated with taking notes and writing up feedback that many interviewers and hiring managers do everything they can to avoid.”

Magos and Tajbakhsh thought that the hiring process was ripe for disruption, but they wanted to avoid abstracting away too much of the human element. So they launched Metaview, an AI-powered note-taking app for recruiters and hiring managers that records, analyzes and summarizes job interviews.

“Metaview is an AI note-taker built specifically for the hiring process,” Magos said. “It helps recruiters and hiring managers focus more on getting to know candidates and less on extracting data from the conversations. As a consequence, recruiters and hiring managers save a ton of time writing up notes and are more present during interviews because they’re not having to multitask.”

Metaview integrates with apps, phone systems, videoconferencing platforms and tools like Calendly and GoodTime to automatically capture the content of interviews. Magos says the platform “accounts for the nuances of recruiting conversations” and “enriches itself with data from other sources,” such as applicant tracking systems, to highlight the most relevant moments.

“Zoom, Microsoft Teams and Google Meet all have transcription built in, which is a possible alternative to Metaview,” Magos said. “But the information that Metaview’s AI pulls out from interviews is far more relevant to the recruiting use case than generic alternatives, and we also assist users with the next steps in their recruiting workflows in and around these conversations.”

Image Credits: Metaview

Certainly, there’s plenty wrong with traditional job interviewing, and a note-taking and conversation-analyzing app like Metaview could help, at least in theory. As a piece in Psychology Today notes, the human brain is rife with biases that hinder our judgment and decision-making: for example, a tendency to rely too heavily on the first piece of information offered, and to interpret new information in ways that confirm our preexisting beliefs.

The question is, does Metaview work — and, more importantly, work equally well for all users?

Even the best AI-powered speech dictation systems suffer from their own biases. A Stanford study showed that error rates for Black speakers on speech-to-text services from Amazon, Apple, Google, IBM and Microsoft are nearly double those for white speakers. Another, more recent study published in the journal Computer Speech and Language found statistically significant differences in the way two leading speech recognition models treated speakers of different genders, ages and accents.
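The disparities those studies describe are typically quantified with word error rate (WER): the number of word substitutions, deletions, and insertions needed to turn a system's transcript into the reference transcript, divided by the reference length. As a minimal pure-Python sketch (the transcripts below are invented for illustration, not drawn from any study):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level edit distance divided by reference word count."""
    ref, hyp = reference.split(), hypothesis.split()
    # Dynamic-programming table for Levenshtein distance over words.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i  # cost of deleting all i reference words
    for j in range(len(hyp) + 1):
        d[0][j] = j  # cost of inserting all j hypothesis words
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            sub = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,        # deletion
                          d[i][j - 1] + 1,        # insertion
                          d[i - 1][j - 1] + sub)  # match or substitution
    return d[-1][-1] / len(ref)

# Illustrative only: one dropped word and one wrong word over an 8-word reference.
reference = "please schedule the follow up interview for thursday"
hypothesis = "please schedule follow up interviews for thursday"
print(wer(reference, hypothesis))  # 2 errors / 8 words = 0.25
```

The Stanford finding amounts to this number being nearly twice as high when the reference audio comes from Black speakers, which is why a tool that feeds transcripts into hiring decisions needs to report WER broken down by speaker group, not just in aggregate.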

There’s also hallucination to consider. AI models make mistakes when summarizing, and meeting summaries are no exception. In a recent story, The Wall Street Journal cited an instance in which Microsoft’s AI Copilot tool, used by an early adopter to summarize meetings, invented attendees and implied that calls were about subjects that were never discussed.

When asked what steps Metaview has taken, if any, to mitigate bias and other algorithmic issues, Magos claimed that Metaview’s training data is diverse enough to yield models that “surpass human performance” on recruitment workflows and perform well on popular benchmarks for bias.

I’m skeptical and a bit wary, too, of Metaview’s approach to how it handles speech data. Magos says that Metaview stores conversation data for two years by default unless users request that the data be deleted. That seems like an exceptionally long time.

But none of this appears to have affected Metaview’s ability to get funding or customers.

Metaview this month raised $7 million from investors including Plural, Coelius Capital and Vertex Ventures, bringing the London-based startup’s total raised to $14 million. Metaview’s client count stands at 500 companies, Magos says, including Brex, Quora, Pleo and Improbable — and it’s grown 2,000% year-over-year.

“The money will be used to grow the product and engineering team primarily, and give more fuel to our sales and marketing efforts,” Magos said. “We will triple the product and engineering team, further fine-tune our conversation synthesis engine so our AI is automatically extracting exactly the right information our customers need and develop systems to proactively detect issues like inconsistencies in the interview process and candidates that appear to be losing interest.”



Don't miss out on savings! Only 48 hours left to claim your early-bird ticket | TechCrunch


The clock has almost run out! With just 48 hours left, this is your final opportunity to secure early-bird savings for TechCrunch Early Stage 2024. Don’t miss out on this chance to join industry leaders, experts, and fellow founders for a day packed with invaluable insights, networking opportunities, and actionable advice to fuel your startup journey.

Here’s a glimpse of some of our esteemed speakers you’ll get to hear from:

  • Alex Kayyal, partner at Lightspeed Venture Partners
  • Rudina Seseri, co-founder and managing partner at Glasswing Ventures
  • Emily Knight, president at The Engine Accelerator
  • Jess Lee, partner at Sequoia
  • James Currier, general partner at NFX
  • Rebecca Lee Whiting, founder and fractional general counsel at Epigram Legal P.C.
  • Tom Blomfield, group partner at Y Combinator

Don’t wait until it’s too late! Secure your ticket now and take advantage of this limited-time offer. Join us on April 25 in Boston for TechCrunch Early Stage 2024 and take your startup to new heights. Grab your early-bird ticket before the clock runs out at 11:59pm PT tomorrow, March 29.

Is your company interested in sponsoring or exhibiting at TechCrunch Early Stage 2024? Reach out to our sponsorship sales team by completing this form.

