Friday, June 30, 2023

The open-source AI boom is built on Big Tech’s handouts. How long will it last?

Greater access to the code behind generative models is fueling innovation. But if top companies get spooked, they could close up shop.
MIT TECHNOLOGY REVIEW
May 12, 2023


Last week a leaked memo reported to have been written by Luke Sernau, a senior engineer at Google, said out loud what many in Silicon Valley must have been whispering for weeks: an open-source free-for-all is threatening Big Tech’s grip on AI.

New open-source large language models—alternatives to Google’s Bard or OpenAI’s ChatGPT that researchers and app developers can study, build on, and modify—are dropping like candy from a piñata. These are smaller, cheaper versions of the best-in-class AI models created by the big firms that (almost) match them in performance—and they’re shared for free.

Companies like Google—which revealed at its annual product showcase this week that it is throwing generative AI at everything it has, from Gmail to Photos to Maps—were too busy looking over their shoulders to see the real competition coming, writes Sernau: “While we’ve been squabbling, a third faction has been quietly eating our lunch.”

In many ways, that’s a good thing. Greater access to these models has helped drive innovation—it can also help catch their flaws. AI won't thrive if just a few mega-rich companies get to gatekeep this technology or decide how it is used.

But this open-source boom is precarious. Most open-source releases still stand on the shoulders of giant models put out by big firms with deep pockets. If OpenAI and Meta decide they’re closing up shop, a boomtown could become a backwater.

For example, many of these models are built on top of LLaMA, an open-source large language model released by Meta AI. Others use a massive public data set called the Pile, which was put together by the open-source nonprofit EleutherAI. But EleutherAI exists only because OpenAI’s openness meant that a bunch of coders were able to reverse-engineer how GPT-3 was made, and then create their own in their free time.

“Meta AI has done a really great job training and releasing models to the research community,” says Stella Biderman, who divides her time between EleutherAI, where she is executive director and head of research, and the consulting firm Booz Allen Hamilton. Sernau, too, highlights Meta AI’s crucial role in his Google memo. (Google confirmed to MIT Technology Review that the memo was written by one of its employees but notes that it is not an official strategy document.)

All that could change. OpenAI is already reversing its previous open policy because of competition fears. And Meta may start wanting to curb the risk that upstarts will do unpleasant things with its open-source code. “I honestly feel it’s the right thing to do right now,” says Joelle Pineau, Meta AI’s managing director, of opening the code to outsiders. “Is this the same strategy that we’ll adopt for the next five years? I don’t know, because AI is moving so quickly.”


If the trend toward closing down access continues, then not only will the open-source crowd be cut adrift—but the next generation of AI breakthroughs will be entirely back in the hands of the biggest, richest AI labs in the world.

The future of how AI is made and used is at a crossroads.

Open-source bonanza

Open-source software has been around for decades. It’s what the internet runs on. But the cost of building powerful models meant that open-source AI didn’t take off until a year or so ago. It has fast become a bonanza.

Just look at the last few weeks. On April 25, Hugging Face, a startup that champions free and open access to AI, unveiled the first open-source alternative to ChatGPT, the viral chatbot released by OpenAI in November.

Hugging Face’s chatbot, HuggingChat, is built on top of an open-source large language model fine-tuned for conversation, called Open Assistant, that was trained with the help of around 13,000 volunteers and released a month ago. But Open Assistant itself is built on Meta’s LLaMA.

And then there’s StableLM, an open-source large language model released on April 19 by Stability AI, the company behind the hit text-to-image model Stable Diffusion. A week later, on April 28, Stability AI released StableVicuna, a version of StableLM that—like Open Assistant or HuggingChat—is optimized for conversation. (Think of StableLM as Stability’s answer to GPT-4 and StableVicuna its answer to ChatGPT.)

These new open-source models join a string of others released in the last few months, including Alpaca (from a team at Stanford University), Dolly (from the software firm Databricks), and Cerebras-GPT (from AI firm Cerebras). Most of these models are built on LLaMA or datasets and models from EleutherAI; Cerebras-GPT follows a template set by DeepMind. You can bet more will come.

For some, open-source is a matter of principle. “This is a global community effort to bring the power of conversational AI to everyone … to get it out of the hands of a few big corporations,” says AI researcher and YouTuber Yannic Kilcher in a video introducing Open Assistant.

“We will never give up the fight for open source AI,” tweeted Julien Chaumond, cofounder of Hugging Face, last month.


For others, it is a matter of profit. Stability AI hopes to repeat the same trick with chatbots that it pulled with images: fuel and then benefit from a burst of innovation among developers that use its products. The company plans to take the best of that innovation and roll it back into custom-built products for a wide range of clients. “We stoke the innovation, and then we pick and choose,” says Emad Mostaque, CEO of Stability AI. “It’s the best business model in the world.”

Either way, the bumper crop of free and open large language models puts this technology into the hands of millions of people around the world, inspiring many to create new tools and explore how they work. “There’s a lot more access to this technology than there really ever has been before,” says Biderman.

“The incredible number of ways people have been using this technology is frankly mind-blowing,” says Amir Ghavi, a lawyer at the firm Fried Frank who represents a number of generative AI companies, including Stability AI. “I think that's a testament to human creativity, which is the whole point of open-source.”
Melting GPUs

But training large language models from scratch—rather than building on or modifying them—is hard. “It's still beyond the reach of the vast majority of people,” says Mostaque. “We melted a bunch of GPUs building StableLM.”

Stability AI’s first release, the text-to-image model Stable Diffusion, worked as well as—if not better than—closed equivalents such as Google’s Imagen and OpenAI’s DALL-E. Not only was it free to use, but it also ran on a good home computer. Stable Diffusion did more than any other model to spark the explosion of open-source development around image-making AI last year.



This time, though, Mostaque wants to manage expectations: StableLM does not come close to matching GPT-4. “There’s still a lot of work that needs to be done,” he says. “It’s not like Stable Diffusion, where immediately you have something that’s super usable. Language models are harder to train.”

Another issue is that models are harder to train the bigger they get. That’s not just down to the cost of computing power. The training process breaks down more often with bigger models and needs to be restarted, making those models even more expensive to build.

In practice there is an upper limit to the number of parameters that most groups can afford to train, says Biderman. This is because large models must be trained across multiple different GPUs, and wiring all that hardware together is complicated. “Successfully training models at that scale is a very new field of high-performance computing research,” she says.


The exact number changes as the tech advances, but right now Biderman puts that ceiling roughly in the range of 6 to 10 billion parameters. (In comparison, GPT-3 has 175 billion parameters; LLaMA has 65 billion.) It’s not an exact correlation, but in general, larger models tend to perform much better.

Biderman expects the flurry of activity around open-source large language models to continue. But it will be centered on extending or adapting a few existing pretrained models rather than pushing the fundamental technology forward. “There’s only a handful of organizations that have pretrained these models, and I anticipate it staying that way for the near future,” she says.


That’s why many open-source models are built on top of LLaMA, which was trained from scratch by Meta AI, or releases from EleutherAI, a nonprofit that is unique in its contribution to open-source technology. Biderman says she knows of only one other group like it—and that’s in China.

EleutherAI got its start thanks to OpenAI. Rewind to 2020 and the San Francisco–based firm had just put out a hot new model. “GPT-3 was a big change for a lot of people in how they thought about large-scale AI,” says Biderman. “It’s often credited as an intellectual paradigm shift in terms of what people expect of these models.”


Excited by the potential of this new technology, Biderman and a handful of other researchers wanted to play with the model to get a better understanding of how it worked. They decided to replicate it.

OpenAI had not released GPT-3, but it did share enough information about how it was built for Biderman and her colleagues to figure it out. Nobody outside of OpenAI had ever trained a model like it before, but it was the middle of the pandemic, and the team had little else to do. “I was doing my job and playing board games with my wife when I got involved,” says Biderman. “So it was relatively easy to dedicate 10 or 20 hours a week to it.”

Their first step was to put together a massive new data set, containing billions of passages of text, to rival what OpenAI had used to train GPT-3. EleutherAI called its dataset the Pile and released it for free at the end of 2020.

EleutherAI then used this data set to train its first open-source model. The largest model EleutherAI has built took three and a half months to train and was sponsored by a cloud computing company. “If we’d paid for it out of pocket, it would have cost us about $400,000,” says Biderman. “That’s a lot to ask for a university research group.”

Helping hand

Because of these costs, it's far easier to build on top of existing models. Meta AI’s LLaMA has fast become the go-to starting point for many new open-source projects. Meta AI has leaned into open-source development since it was set up by Yann LeCun a decade ago. That mindset is part of the culture, says Pineau: “It’s very much a free-market, ‘move fast, build things’ kind of approach.”

Pineau is clear on the benefits. “It really diversifies the number of people who can contribute to developing the technology,” she says. “That means that not just researchers or entrepreneurs but civil governments and so on can have visibility into these models.”

Like the wider open-source community, Pineau and her colleagues believe that transparency should be the norm. “One thing I push my researchers to do is start a project thinking that you want to open-source,” she says. “Because when you do that, it sets a much higher bar in terms of what data you use and how you build the model.”

But there are serious risks, too. Large language models spew misinformation, prejudice, and hate speech. They can be used to mass-produce propaganda or power malware factories. “You have to make a trade-off between transparency and safety,” says Pineau.

For Meta AI, that trade-off might mean some models do not get released at all. For example, if Pineau’s team has trained a model on Facebook user data, then it will stay in house, because the risk of private information leaking out is too great. Otherwise, the team might release the model with a click-through license that specifies it must be used only for research purposes.

This is the approach it took for LLaMA. But within days of its release, someone posted the full model and instructions for running it on the internet forum 4chan. “I still think it was the right trade-off for this particular model,” says Pineau. “But I’m disappointed that people will do this, because it makes it harder to do these releases.”


“We’ve always had strong support from company leadership all the way to Mark [Zuckerberg] for this approach, but it doesn’t come easily,” she says.

The stakes for Meta AI are high. “The potential liability of doing something crazy is a lot lower when you’re a very small startup than when you’re a very large company,” she says. “Right now we release these models to thousands of individuals, but if it becomes more problematic or we feel the safety risks are greater, we’ll close down the circle and we’ll release only to known academic partners who have very strong credentials—under confidentiality agreements or NDAs that prevent them from building anything with the model, even for research purposes.”

If that happens, then many darlings of the open-source ecosystem could find that their license to build on whatever Meta AI puts out next has been revoked. Without LLaMA, open-source models such as Alpaca, Open Assistant, or HuggingChat would not be nearly as good. And the next generation of open-source innovators won’t get the leg up the current batch have had.
In the balance

Others are weighing up the risks and rewards of this open-source free-for-all as well.

Around the same time that Meta AI released LLaMA, Hugging Face rolled out a gating mechanism so that people must request access—and be approved—before downloading many of the models on the company’s platform. The idea is to restrict access to people who have a legitimate reason—as determined by Hugging Face—to get their hands on the model.

“I’m not an open-source evangelist,” says Margaret Mitchell, chief ethics scientist at Hugging Face. “I do see reasons why being closed makes a lot of sense.”

Mitchell points to nonconsensual pornography as one example of the downside to making powerful models widely accessible. It’s one of the main uses of image-making AI, she says.

Mitchell, who previously worked at Google and cofounded its Ethical AI team, understands the tensions at play. She favors what she calls “responsible democratization”—an approach similar to Meta AI’s, where models are released in a controlled way according to their potential risk of causing harm or being misused. “I really appreciate open-source ideals, but I think it’s useful to have in place some sort of mechanisms for accountability,” she says.

OpenAI is also shutting off the spigot. Last month when it announced GPT-4, the company’s new version of the large language model that powers ChatGPT, there was a striking sentence in the technical report: “Given both the competitive landscape and the safety implications of large-scale models like GPT-4, this report contains no further details about the architecture (including model size), hardware, training compute, dataset construction, training method, or similar.”

These new restrictions are partly driven by the fact that OpenAI is now a profit-driven company competing with the likes of Google. But they also reflect a change of heart. Cofounder and chief scientist Ilya Sutskever has said in an interview with The Verge that his company’s openness in the past was a mistake.

OpenAI has definitely shifted strategies when it comes to what is and isn’t safe to make public, says Sandhini Agarwal, a policy researcher at OpenAI: “Previously, if something was open-source maybe a small group of tinkerers might care. Now, the whole environment has changed. Open-source can really accelerate development and lead to a race to the bottom.”

But it wasn’t always like this. If OpenAI had felt this way three years ago when it published details about GPT-3, there would be no EleutherAI. 


Today, EleutherAI plays a pivotal role in the open-source ecosystem. It has since built several large language models, and the Pile has been used to train numerous open-source projects, including Stability AI’s StableLM (Mostaque is on EleutherAI’s board).

None of this would have been possible if OpenAI had shared less information. Like Meta AI, EleutherAI enables a great deal of open-source innovation.

But with GPT-4—and 5 and 6—locked down, the open-source crowd could be left to tinker in the wake of a few large companies again. They might produce wild new versions—maybe even threaten some of Google's products. But they will be stuck with last-generation models. The real progress, the next leaps forward, will happen behind closed doors.

Does this matter? How you view the impact of big tech firms’ shutting down access, and what that will mean for open source, depends a lot on what you think about how AI should be made and who should make it.

“AI is likely to be a driver of how society organizes itself in the coming decades,” says Ghavi. “I think having a broader system of checks and transparency is better than concentrating power in the hands of a few.”

Biderman agrees: “I definitely don’t think that there is some kind of moral necessity that everyone do open source," she says. "But at the end of the day, it's pretty important to have people developing and doing research on this technology who are not financially invested in its commercial success."

Yet she is clear about the stakes. “I’m actually a lot less critical of OpenAI than a lot of people assume that I would be," she says. "There are real risks involved in publicly releasing this technology, real potential dangers."

OpenAI claims it is just playing it safe. “It’s not that we think transparency is not good,” says Dave Willner, head of OpenAI’s trust and safety teams. “It’s more that we’re trying to figure out how to reconcile transparency with safety. And as these technologies get more powerful, there is some amount of tension between those things in practice.”

“A lot of norms and thinking in AI have been formed by academic research communities, which value collaboration and transparency so that people can build on each other’s work,” says Willner. “Maybe that needs to change a little bit as this technology develops.”


by Will Douglas Heaven

 

Guillermo del Toro on leaving live-action films behind and calling out the film industry

Celebrated Mexican director Guillermo del Toro during the Annecy Animation Film Festival
By David Mouriquand

Disclaimer: This article contains Oscar-winning director Guillermo del Toro dropping a fair few f-bombs.

This year’s Annecy Animation Film Festival, which runs until tomorrow evening, has elected to put the spotlight on Mexican animation within its exceptional and diverse programme.

To mark the occasion, award-winning The Book of Life director and animator Jorge R. Gutierrez was asked to design the official poster for the event, and Mexican animation has been the subject of several masterclasses over the week, which we had the pleasure of attending.

And who better to lead several of these masterclasses than the celebrated director Guillermo del Toro, the man behind acclaimed films like Hellboy, Pan’s Labyrinth and Nightmare Alley, and the Oscar-winning films The Shape of Water and Pinocchio?

Animation was del Toro’s first love, as he highlighted in Annecy while recounting his early beginnings in the genre, using his father’s Super 8 camera. And considering what he told us during two stellar masterclasses, it will be his last.

Indeed, after his Best Animated Feature Oscar win for Pinocchio, the director said that he only has a few live-action feature films left in him, affirming that he prefers to focus on animation.

“There are a couple more live-action movies I want to do but not many,” said del Toro. “After that, I only want to do animation. That’s the plan.”


Guillermo del Toro with Euronews Culture's David Mouriquand - "No David, I don't want to hear about your pitch for Hellboy III!"

The 58-year-old director’s pivot to full-time animation is one to be celebrated, and it clearly means the world to him. Listening to del Toro speak about his beginnings in Guadalajara, sharing numerous anecdotes and communicating his love for the animation medium, is truly addictive. And often very funny.

He shared that cash-strapped beginnings meant cutting costs, including on food (“We found out that the cheapest food was dog food – it’s full of calcium, so we had dog food and all the money we saved we used to buy a lens or a new light”), his undying respect for his special effects tutor and "make-up teacher” Dick Smith (The Godfather, The Exorcist), and how his then soon-to-be father-in-law told him that he believed that people in the movie business were “very indecent” - to which del Toro replied: “I am a decent man, I will make my puppets out of clay – there’s very little room for cocaine!”

He also recounted how his first movie was supposed to be a stop-motion film, and how that plan went down the tubes when vandals burglarized the studio and destroyed the hundreds of puppets and sets, thereby putting an end to his original dream and making the young filmmaker switch to live-action – his debut film Cronos.

He’s now getting his second shot.


Guillermo del Toro speaking at the Annecy Animation Film Festival. Photo: David Mouriquand

The “sandwich of shit”

Even if you’re a two-time Oscar winner and considered a filmmaking legend, don’t think for a single moment that your hands aren’t tied.

Del Toro called out the film industry for being “geared toward grinding out shit and destroying your art”, revealing that five of his projects were turned down by studios in the last two months.

“In the last two months, they said no to five of my projects,” he shared with the Annecy audience. “So it doesn’t go away. Making movies is eating a sandwich of shit.”

“There’s always shit, just sometimes you get a little more bread with yours,” he said, painting quite the visual culinary image.

He added: “The rate of productivity against your efforts will remain frustratingly difficult, and frustratingly long. And you will always encounter assholes. But have faith in the stories you want to tell and wait until someone wants to buy them.”


As candid as he is about the industry gatekeepers and moneymen, it all comes back to passion with del Toro. During his talks in Annecy, del Toro stressed how stop-motion is “the most beautiful form of animation because it’s the most intimate.”

“There is always a strong connection between the animator and the physical model,” he continued, adding: “Animation is for the undomesticated spirits. Animation is saying fuck you to the world as it was presented to you as a kid. Don’t stop saying fuck you to the world. Keep saying it until you crash. That’s the important thing about monsters. I love them because they represent a corporal fuck you to the world.”

Del Toro also emphasized how animation is a collaborative medium, and the importance of passing on knowledge so that future generations can help keep the artform alive.

“If you love animation and don’t just love yourself, then teach it, share it. If you have it, share it. We’re a bunch of crazy fuckers trying to keep an ancient form of magic alive. So share it and pass it on.”

He talks the talk, but also walks the walk. Rare are directors like del Toro who are so full of love, humour and generosity. Throughout the festival, it was a delight to watch him take the time to talk to audience members, greet them and hear about projects from young filmmakers and animators.


In his second seminar this week, he told students and animators in the audience that they are not alone and are connected to the world, emphasizing the need for this big-hearted community to pull together, share resources and appreciate what others are doing in the field.

“This is what’s great about Annecy – you're not alone, you’re part of a community of freaks. It’s fantastic – take advantage of it!”

Heartwarming though that undoubtedly is, if an Oscar winner like del Toro gets his projects nixed, what hope do others have?

A sobering and rather depressing thought.

However, thank the animation gods for festivals like Annecy. It’s an event that not only celebrates the very best of animation but gives a sizeable platform to up-and-coming talents who need the exposure in order to establish their voices on a larger, international scene.

Still, if you are reeling (like we are) at the fact that we’re not getting one or more of his five dismissed projects (we’re still waiting for the Mexican maestro’s long-rumoured adaptation of H.P. Lovecraft’s ‘At the Mountains of Madness’), prepare yourself for more expletives...

Not just for the kids & taking over the asylum

The Mexican director insisted that the animation medium is not just for the younglings.

“I believe you can make an adult fantasy drama with stop-motion and move people emotionally,” he said. “I think stop-motion can be intravenous, it can go straight to your emotions in a way that no other medium can.”

While acknowledging that the recent string of animated box office successes like Spider-Man: Across the Spider-Verse and The Super Mario Bros. Movie – as well as Teenage Mutant Ninja Turtles: Mutant Mayhem, which premiered in Annecy and hits cinemas in August – has helped the medium and given it “a little more latitude”, del Toro stressed that “there are still big fights to be had.”

Indeed, he did not mince his words with regard to Hollywood and the big studios, affirming that animation is the purest form of art and that it has been “kidnapped by a bunch of hoodlums” - seemingly referring to mainstream animation studios which avoid risks with an endless barrage of easily consumable fare.

“We have to rescue it. I think that we can Trojan-horse a lot of good shit into the animation world.”

His Trojan horse analogy is apt, as there is still a long way to go for animation to be considered on par with live-action features. Much like horror has been considered the runt of the genre litter in Hollywood for decades and is experiencing something of a reappraisal by major studios, animation needs to gain further appreciation. Not by the audience, mind you. The Annecy Film Festival attests to the passion and excitement that surrounds animation, with some of the most vocally appreciative and utterly delightful viewers you’re likely to meet and share screenings with. Animation needs to be granted a seat at the big boys’ table by an industry that consistently sets up roadblocks.

“We’re still being kept at the fucking children’s table in the industry,” commented del Toro. “We have to fight to change that – to take over the asylum and then run it! (...) That’s why festivals like Annecy are so important.”

Fear “emoji-style” animation, not AI

The director stated that “emoji-style” animation, where everyone is “happy and sassy and quick”, only goes so far. Rather than commercial animation, he prefers to see “real life in animation” and believes that there’s not a moment to lose.


“I think it’s urgent to see real life in animation. Emotions are codified into a sort of teenage rom-com, almost emoji-style behaviour. (If) I see a character raising his fucking eyebrow, or crossing his arms, having a sassy pose — oh, I hate that shit. Why does everything act as if they’re in a sitcom? I think it’s emotional pornography. All the families are happy and sassy and quick, everyone has a one-liner. Well, my dad was boring. I was boring. Everybody in my family was boring. We had no one-liners. We’re all fucked up. That’s what I want to see animated. I would love to see real life in animation. I actually think it’s urgent.”


Another warning del Toro issued was against corporate stupidity, not the artificial intelligence that so many are currently decrying.

“When people say they’re scared of AI, I say don’t be afraid of any intelligence; be afraid of stupidity,” shared del Toro. “Every intelligence is artificial. Stupidity is natural. Completely, 100 percent natural, organic. Be afraid of stupidity. That’s the real enemy.”

That being said, he stressed how he tries to avoid digital sets and effects as much as possible: “I think that we need things that we know are made by humans to recover the human spirit. I love things that look handmade. I fucking hate perfection.”

He concluded, with reference to studio speak: “I think when somebody calls stories ‘content,’ when somebody says ‘pipeline,’ they’re using sewage language.”

Preach, maestro.


Guillermo del Toro during his second masterclass in Annecy. Photo: David Mouriquand

Regarding those live-action movies still left in him, Guillermo del Toro is set to shoot his long-gestating passion project Frankenstein this summer (all depending on the ongoing writers’ strike). Oscar Isaac, Mia Goth and Andrew Garfield are attached to the project.

He’ll also be returning to Netflix for his next stop-motion adventure, the animated adaptation of Kazuo Ishiguro’s novel, ‘The Buried Giant’. The book follows an elderly couple living in a fictional post-Arthurian England in which no one can retain their long-term memories. Del Toro plans to shoot the film using the same stop-motion technique he used on Pinocchio. And considering the stunning craft on show in that Oscar-winning film, you can bet The Buried Giant will only confirm why we’re lucky del Toro is choosing to focus on animation.

After all, the striking and fantastical worlds of Guillermo del Toro deserve to not be shackled by the real world – especially when animation can do so much more.

He knows it. Annecy understands it. And it’s time everyone else gets it.

Indonesia needs 1.5 billion USD to phase out coal mining and production

Indonesia needs a grant of about 1.5 billion USD from the International Partnership Group (IPG), including the US, Japan, Canada, Denmark, the European Union (EU), France, Germany, Italy, Norway and the UK, to terminate coal mining and production under the Just Energy Transition Partnership (JETP) framework.

VNA Friday, June 30, 2023 10:42
https://link.gov.vn/yvX6CXi0
Coal barges queue to be pulled along the Mahakam River in Samarinda, East Kalimantan province, Indonesia, on Aug 31, 2019. (Photo: REUTERS)

Jakarta (VNA) – Indonesia needs a grant of about 1.5 billion USD from the International Partnership Group (IPG) – comprising the US, Japan, Canada, Denmark, the European Union (EU), France, Germany, Italy, Norway and the UK – to end coal mining and production under the Just Energy Transition Partnership (JETP) framework.

Fabby Tumiwa, executive director of the Indonesian Institute for Essential Services Reform (IESR), said such energy and coal mining termination programmes cannot be financed through debt. If loans are unavoidable, they must come from concessional sources with interest rates below 3%.

Last year, at the G20 Summit in Indonesia, the IPG pledged to raise 10 billion USD in public funds over three to five years as part of a JETP plan to help Indonesia phase out coal production.

Financial institutions in the Glasgow Financial Alliance for Net Zero (GFANZ) have also pledged 10 billion USD, bringing total commitments under the JETP programme to 20 billion USD.

However, the Indonesian Ministry of Energy recently said that the country will receive only about 160 million USD in grant funding from the JETP. This is a modest amount compared with the 20 billion USD committed under the framework.

Meanwhile, about half of the 20 billion USD is expected to come from commercial loans.

Indonesia is currently drafting a comprehensive investment plan for the JETP programme. The document will be released in August.

The document will detail how many coal-fired power plants Indonesia will have to shut down within a specified timeframe.

Italy's right-wing government seeks to shut door on adoption for same-sex couples


By Savin Mattozzi • Updated: 30/06/2023 - 

In Italy, same-sex couples don't have a legal right to adoption or surrogacy. Now the right-wing government wants to close a loophole allowing them to adopt abroad.

Siro and Giuseppe first met in the early days of online dating in 2006. After exchanging phone numbers, they spoke over the phone for six months before deciding to meet in person.

“I don’t know what struck me most about him,” Siro explains. “At first I was infatuated with him as a person, beyond just his physical appearance. Obviously I liked his eyes and how he is physically. But I fell in love with what was inside him.”

They eventually moved in together as they continued to build a life of their own. Siro, 38, and Giuseppe, 36, manage a dance school outside of Naples in addition to working their regular jobs.

Slovenia becomes first post-communist country to legalise same-sex marriage and adoption

 


Then, as happens in the lives of many couples, the two started to discuss the possibility of having children.

“It might have been after a movie, but one night we started talking about this idea of having a child,” Giuseppe says. “And I asked, ‘Who knows if we could even have a kid?’”

Nearly seven years after they first asked themselves that question, they now know how complicated the situation really is.

Rights and responsibilities

Siro and Giuseppe photographed laughing together.
Savin Mattozzi/Euronews

In Italy, same-sex couples do not have a legal right to adoption, stepchild adoption or surrogacy. The only way for both parents to gain legal rights over their child is through a loophole that allows them to go abroad and have a child via surrogacy.

Now, Italy’s right-wing government is looking to close that loophole as well, making it nearly impossible for same-sex couples to have children.

Italy has historically lagged far behind its Western European neighbors in its treatment of LGBTQ people. According to ILGA-Europe, Italy ranked 34th out of 49 European countries in terms of rights granted to the LGBTQ community.

Even if the bill is not passed, LGBTQ people and families in Italy still struggle to be treated equally under the law.

Over the years, the country has been condemned by the European Parliament, Human Rights Watch and the European Court of Human Rights for its treatment of LGBTQ people, without the state taking any corrective action.

This leaves Giuseppe feeling abandoned by the European community.

“Considering that we are a European country and that they have an obligation to defend us, they [European countries] only use their power to say things,” Giuseppe says. “Defend us, actually defend us – not just with words but with concrete actions.”

Setbacks in legislative progress

Participants hold a giant rainbow flag as they take part in the Pride March to show support for members of the LGBT community, in Milan on June 24, 2023.
GABRIEL BOUYS/AFP

Recently, Italian politicians have tried to make concrete changes to laws that would protect the LGBTQ community.

In 2021, a group of politicians attempted to pass a law that would have made it a hate crime to attack someone based on their sexual orientation, gender identity or disability. The bill was swiftly voted down in parliament, with the right-wing coalition cheering its failure.

“Since the passing of civil unions in 2016, we basically haven’t progressed,” explains Benny, a 38-year-old university worker who lives with her fiancĆ©e, Bianca. “Honestly, at this point it looks like we’re going backwards… aside from civil unions, there are no rights for non-heterosexual people. Italy is truly among the most backward countries in Europe.”

The situation in Italy for LGBTQ individuals and couples alike has forced people like Benny and Bianca to face tough decisions about their future, and about whether to stay or leave the country for a better life.

“Unlike Benny, I would consider leaving Italy to bring up a child in a country that recognises that they are the child of two people who love each other,” says Bianca. “Apart from the fact that this proposed law is extremely discriminatory, it is also an injustice, because the mere fortune of being a man and a woman doesn’t necessarily make a couple fit to be parents.”

Couples who decide to stay, however, risk having their parental rights taken away even if they thought they were safe.

How has Pride and LGBTQ culture changed our lives?


By Andrea Carlo • Updated: 30/06/2023 - 

Euronews Culture takes a look at the queer cultural practices and linguistic traditions which have influenced our everyday ways of living, speaking, and thinking.

Following centuries of oppression and marginalisation, Europe’s LGBTQ+ community is finally starting to receive both legal recognition and societal acceptance, with Pride parades and events having become an integral part of our yearly calendars.

As a result of such oppression, queer individuals came to create tight-knit communities with rich and diverse subcultures that would subsequently both influence and be co-opted by mainstream society.

As Pride Month draws to a close, Euronews Culture dives into the queer-rooted cultural practices, traditions and linguistic expressions that have shaped our everyday ways of speaking, dressing, thinking and living.

From "slay" to "camp": our pink-hued lexicon

Ever accused someone of "throwing shade"? Or thought something looked "camp"?

Without realising it, you may have been using terms and expressions that have a queer-specific history.

Our everyday parlance is imbued with expressions, terms and idioms that were developed within LGBTQ+ communities. For starters, a lot of the present-day slang stereotypically associated with Generation Z (teenagers and young adults born from the mid-late 1990s to circa 2010) was inherited from the jargon of New York City’s ballroom and drag communities, which were birthed by queer people of colour. Think of "slay", "snatched", "wig" – interjections typically used to express enthusiasm or approval – which are now adopted in everyday contexts by many individuals who have no connection to the queer community.

Pioneering American drag queen RuPaul's reality show -- RuPaul's Drag Race -- which debuted in 2009 and has become a global franchise, is often credited with having exposed the public to such terminology. The 1990 documentary Paris Is Burning, which delved into NYC's ballroom scene, also contributed to growing public awareness of queer subcultures.

Speaking to Euronews Culture, Ricky Tucker -- a prominent NYC-based culture critic -- remarked on the extent to which English slang has picked up words from the Big Apple's queer subcultures.

"[The lexicon of] ballroom is thriving across the world, but ballroom is still underground," he stated. "The etymology of the words themselves is often hard to figure out" -- Tucker remarked on how certain expressions, namely "throwing shade", may have originated outside of the ball and drag community -- "so you've got to see the context in which they're most commonly used."

RuPaul
NICK HOOVER/AP2006

But “pink language”, so to speak, is not just a by-product of US-based queer subcultures. In the UK, many expressions such as "naff" (bad) and "camp" (kitschy) came from or were associated with Polari, a sociolect that developed in Victorian London’s disenfranchised gay and travelling communities and drew on words of Cockney, Romance, Yiddish and Romani origin (the name 'Polari' itself derives from the Italian parlare, "to speak").

Polari was a queer argot that, while not exclusive to the community, was used to communicate at a time when being gay was tantamount to a social death sentence – and often a criminal one too. It was the popular 1965-1968 BBC radio show Round the Horne that introduced certain Polari expressions to a mainstream UK audience.

Paul Baker, a scholar of Polari's history, noted its spill-over into mainstream British English vocabulary, although he believes its influence is somewhat less wide-ranging than that of the US ballroom lexicon.

"[I]t's only really a very small number of Polari words that made it into the mainstream, and in some cases this was temporary," he told Euronews Culture. "Some of the words which have survived the test of time were only peripherally Polari, but were words more generally associated with the gay community (words like camp or trade)."

"The forms of language used by queer folk in the US have been more influential on the mainstream," he added. "So RuPaul’s Drag Race has helped to make some forms of vocabulary more common."

From pop music to makeup: how queer subcultures changed our cultural landscape

Throughout the centuries, queer individuals -- often shunned from their domestic lives -- forged new families and found greater acceptance within more liberal-minded, bohemian circles.

It comes as little surprise that what we describe as "queer" culture would come to be intimately intertwined with the world of theatre, fashion and the arts, and that it would consequently exert an indelible influence on the mainstream.

Celebrity Kim Kardashian is often credited with having 'popularised' the contouring makeup trend
Ashley Landis/Copyright 2023 The AP. All rights reserved

This influence traces a long history, ranging from the "traditionally" masculine 1920s GarƧonne style associated with lesbian women, to the colourful, free-flowing fashions of the 1970s and the dramatic looks of 1980s New Wave rockers. Even the makeup trends that had become inescapable by the mid-2010s – namely "contouring", or using different foundation hues to dramatically accentuate cheekbones and other facial features – drew direct inspiration from the styles used by drag queens, and were widely popularised by the Kardashian-Jenner sisters and their reality show.

The footprint left by the queer community is not limited to the realm of everyday aesthetics, but extends to sound too. Much of the music that surrounds us and is blasted daily from clubs to shopping centres has its origins in queer subcultures -- most prominently disco and house, the former of which developed in 1970s NYC and the latter in 1980s Chicago.

Since the start of the new decade, and in particular in the midst of the COVID-19 pandemic, disco's glitzy, upbeat sound has witnessed a resurgence, especially as people seek a musical distraction from the hardships of everyday life.

British-Kosovar pop star Dua Lipa's hit 2020 album Future Nostalgia is often seen as the herald of this new trend: a shimmery, unabashedly dance-heavy record released in the midst of a landscape dominated by minimalist indie and trap. Indeed, this year's highly anticipated Barbie movie, directed by Greta Gerwig and starring Margot Robbie and Ryan Gosling, has a soundtrack replete with disco flavours, starting with Lipa's own track "Dance the Night".



'Queerbaiting': influence or appropriation?


The impact which the LGBTQ+ community has had on mainstream pop culture is undeniable. But things may not be so rosy: has queer culture influenced our popular trends, or has it been appropriated?

Part of the trouble boils down to the crux of the question: what is queer culture, and does it exist in a clearly identifiable form?

Indeed, what is read as “queer” may change with time and place. Crossdressing for entertainment purposes, for instance, has long been a part of “straight” working-class pub culture in the UK, while men’s fashions of the 18th century — when aristocrats donned puffy wigs and heavy makeup — would be perceived as “androgynous” or “queer” in a contemporary, Western social context.

The trouble comes when, in a highly politicised environment in which LGBTQ+ rights remain fragile, the adoption of specific components of LGBTQ+ culture takes on a loaded meaning.

British pop star Harry Styles has been accused by some of "queerbaiting"
Vianney Le Caer/2023 Invision

Certain artists such as Harry Styles and Charlie Puth, who have recently adopted more androgynous clothing styles and who have also not come out as LGBTQ+, have been accused by some of “gaybaiting” or “queerbaiting”: that is, appropriating a specific style or set of mannerisms to appeal to and profit off of their LGBTQ+ fanbase, while not enduring the social oppression that comes with being publicly queer.

The issue becomes even more polemical when celebrities such as Styles and Puth get praised for their gender-bending and “rule-breaking” fashions, while drag queens and trans individuals are finding their rights stripped and attacked left, right and centre, especially as many US states pass anti-drag bills.

“Gaybaiting is a thing,” Tucker asserted, noting that the relationship between mainstream pop culture and queer subcultures is “complicated, paradoxical, and complex”.

As for why mainstream culture co-opts "queer" styles? "People who are put upon are generally seen as cool, and hip," he remarked. "And queer — and particularly Black queer — people are the most put upon in the US."

"Anything that's hip and spreads around the world and catches fire," Tucker quipped, "can generally be traced back to a queer man of colour."

Tucker took the example of Madonna’s iconic 1990 "Vogue" music video, which brought the ballroom dance style to the masses. But, as he noted, the song’s memorable bridge — in which she references a slew of Golden Age Hollywood stars — has no real connection to the spirit of ballroom.


Moreover, to make matters worse, many people erroneously see "voguing" as the Queen of Pop's creation, thus divorcing the dancing style from its rich cultural background.

While the public’s often lazy approach to referencing cultural practices from marginalised communities may be considered problematic in today's social milieu, some see the matter as a relative non-issue, arguing that no community has "ownership" over a particular cultural practice, especially given queer culture’s own fluidity.

Professor Matt Cook, recently appointed the first LGBTQ+ history professor at the University of Oxford, considers the talk of “queerbaiting” to be a red — or, rather, pink — herring distracting from the real issue at hand: the ongoing political ambushes gnawing at LGBTQ+ rights and equality.

"I’m resistant to the idea of there being a ‘straight’ way of acting," he told Euronews Culture. "I celebrate anyone who is exploring different ways of expressing themselves."

"Of course we should question people's behaviours, but what is the real problem here?" he added.

And on the issue of the politicisation and criminalisation of drag culture?

"The right is trying to foment a culture war," he stated. "We don’t address that by critiquing people like Harry Styles."
EU taking a stand against pharmaceutical residues in wastewater

In partnership with 
The European Commission

By Euronews Digital • Updated: 29/06/2023 -

In Uppsala, Sweden, an innovative pilot project to rid water of pollutants is making waves.

Every year, Uppsala's water treatment plant processes around 20 million cubic metres of wastewater. A cocktail of potent pollutants - residues from medications for depression, inflammation, anxiety and high blood pressure - has been detected in the water.

"We are talking about nanograms per litre. So it is very low level. But some of our priority substances could have a negative effect on the aquatic environment already at those very low levels," says Anna Maria Sundin, a Development Engineer at Uppsala Vatten.

Uppsala Waste Water Plant, Sweden

Such pharmaceutical residues can severely affect the internal organs and reproductive cycles of fish and amphibians. The Uppsala water treatment plant's pilot project aims to address this issue by removing at least 80% of five of these harmful substances.

To achieve this ambitious target, the project employs advanced technologies including activated carbon and anion exchange. These come with significant challenges, however, because the targeted compounds vary widely in their properties. "The different compounds, or different substances, have different chemical and physical properties," Sundin states. "And that affects the removal efficiency."
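To make the plant's 80% goal concrete: removal efficiency is simply the share of a substance's influent concentration that no longer appears in the effluent. The sketch below illustrates the arithmetic; the compound names and concentration values are hypothetical examples, not measurements from the Uppsala plant.

```python
# Illustrative only: removal efficiency for pharmaceutical residues,
# computed from hypothetical influent/effluent concentrations in ng/L.

def removal_efficiency(influent_ng_l: float, effluent_ng_l: float) -> float:
    """Percentage of a substance removed by treatment."""
    if influent_ng_l <= 0:
        raise ValueError("influent concentration must be positive")
    return 100.0 * (influent_ng_l - effluent_ng_l) / influent_ng_l

# Example (made-up) measurements: compound -> (influent, effluent) in ng/L
samples = {
    "diclofenac": (950.0, 120.0),
    "citalopram": (310.0, 45.0),
    "metoprolol": (780.0, 200.0),
}

TARGET = 80.0  # the pilot project's stated goal: at least 80% removal

for compound, (inflow, outflow) in samples.items():
    eff = removal_efficiency(inflow, outflow)
    status = "meets" if eff >= TARGET else "misses"
    print(f"{compound}: {eff:.1f}% removed ({status} the 80% target)")
```

Note how small the absolute quantities are: even at these nanogram-per-litre levels, a plant falling a few percentage points short of the target can still discharge biologically active residues.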

The urgency of upgraded removal technologies cannot be overstated, particularly given the global scale of the problem. An estimated 4,000 active pharmaceutical ingredients are used in medicines worldwide, and up to 90% of an oral dose can be excreted by the body as active substance.

This persistent issue extends well beyond Sweden. Stefan Berggren, Director at the Swedish Knowledge Centre for Pharmaceuticals in the Environment, explains, "One of the issues is that they are very resistant to breaking down in the body...they will reach the environment as a fairly persistent substance".

This problem has drawn the attention of the European Commission, which has proposed strengthening the environmental risk assessment required of all pharmaceutical companies placing medicines on EU markets. It even plans to refuse market authorisation where companies do not provide complete or sufficiently substantiated assessments.


Swedish Knowledge Centre for Pharmaceuticals in the Environment

Elin Engdahl, Senior Policy Officer at the Swedish Society for Nature Conservation, believes this is a step in the right direction, but calls for further action. "We need to try to green the healthcare sector. Now the pharmaceutical industry is developing more green pharmaceuticals; they are more biological, they can degrade more easily. But they are more expensive as well," she says.

The pilot project in Uppsala will conclude in 2024, but the quest to mitigate the environmental impact of pharmaceutical residues in our water continues. By leveraging cutting-edge technology and driving policies that encourage sustainable practices, we can hope to protect our aquatic ecosystems for future generations.

In its proposal for the revised Urban Waste Water Treatment Directive, the Commission is also working to reduce the environmental impact of medicines by extending a producer responsibility scheme that also applies to pharmaceutical developers.