The government is to give Ofcom additional tools to make tech companies prevent, detect and remove child sexual abuse material.
The powers will be granted through amendments to the Online Safety Bill (OSB), the government's legislation to regulate the internet.
Ofcom will be able to fine tech companies that fail to comply up to £18m or 10% of their global annual turnover, whichever is greater.
But there are growing concerns about how the new powers will actually work in practice, and it is still unclear exactly what additional tools Ofcom, the media regulator, will receive.
One of the biggest concerns is end-to-end encryption (E2EE), the ultra-secure messaging technology used by apps such as Signal, and how it might be compromised.
The government says it supports tools that can detect child sexual abuse imagery within or around E2EE environments while respecting user privacy, a position that will shape the wider debate about balancing user safety and privacy.
Prof Alan Woodward, of the University of Surrey, told the BBC that the only way to detect child abuse imagery and associated text is by examining unencrypted data.
"If the OSB insists that such material be found in encrypted data, it is possible only by inspecting the sending and receiving devices, where the content is available unencrypted for consumption.
"The implication is that there is some universal client-side scanning, which many will find intrusive and liable... to be used to detect other items not related to child safety."
Client-side scanning is a technology that checks message content against a database (for example, of known child sexual abuse images) on the user's device before it is sent to the intended recipient.
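In outline, such a check amounts to hashing each outgoing file and looking it up in a database of known material. A minimal sketch follows; the hash value and function name are illustrative only, and production systems such as Microsoft's PhotoDNA use perceptual hashes, which also match re-encoded or slightly altered copies, rather than the exact cryptographic hash shown here.

```python
import hashlib

# Hypothetical database of SHA-256 hashes of known prohibited files.
# A plain cryptographic hash only matches byte-identical copies.
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def should_block(attachment: bytes) -> bool:
    """Check an outgoing attachment against the database
    before the messaging app encrypts and sends it."""
    digest = hashlib.sha256(attachment).hexdigest()
    return digest in KNOWN_HASHES
```

The privacy debate centres on exactly this step: the check runs on the user's own device, against a database the user cannot inspect.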
Before Ofcom is given the power to impose scanning technology, experts are calling for more information from the government about its technical feasibility, security implications and privacy impact.
They also warn that scanning technology cannot be confined to only "good" purposes.
Prof Woodward said: "The big problem will be that any technology which can be used for looking at what is encrypted could be misused to conduct surveillance."
The fundamental principle of E2EE is that only the sender and intended recipients of a message can read its content. That guarantee is why apps such as Signal and WhatsApp are so popular.
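The guarantee can be illustrated with a one-time pad, the simplest cipher with this property. This is a sketch only: real messengers such as Signal use authenticated encryption and key-agreement protocols, not the toy scheme below.

```python
import secrets

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    # One-time pad: XOR each plaintext byte with a key byte.
    # The key must be random, at least as long as the message,
    # and never reused.
    assert len(key) >= len(plaintext)
    return bytes(p ^ k for p, k in zip(plaintext, key))

decrypt = encrypt  # XOR is its own inverse

# Only parties holding this key can recover the message.
key = secrets.token_bytes(32)
ciphertext = encrypt(key, b"hello")
assert decrypt(key, ciphertext) == b"hello"
```

Anyone without the key, including the platform carrying the message, sees only the ciphertext, which is precisely what makes server-side scanning of E2EE traffic impossible.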
Susie Hargreaves, chief executive of the Internet Watch Foundation (IWF), wants the OSB to include provisions allowing the IWF to be co-designated alongside Ofcom to regulate online child sexual abuse material.
She stated that "our unparalleled expertise in the area would make it strong and effective from the beginning."
"We have strong collaborative relationships with industry and law enforcement, as well as the leading expertise to ensure no child is forgotten and their suffering is not overlooked."
According to the government, it will not be enough for a large tech company to claim that it cannot deploy certain technologies because its platform is not configured in a way that allows them.
If necessary and proportionate, Ofcom will be able to issue a notice requiring companies to show they have taken all reasonable steps to remove images of child sexual abuse.
However, this will depend on how the regulator assesses the risk of child exploitation on each platform.
Prof Woodward said Ofcom has a steep mountain to climb: to assess the technical solutions the OSB requires, it will need to attract rare talent.
"That's not even to mention the skills they'll need to navigate secondary legislation... It is a really huge task ahead."
Ofcom told the BBC it was ready to take on the new role, drawing on skills and expertise from the tech sector as well as child protection and advocacy experts.
Its spokesperson said: "Tackling child sexual abuse online is central in the new online safety laws – and rightly so. Although it's a difficult job, once the OSB has been passed, we will be ready to implement these groundbreaking laws."
According to the National Crime Agency, between 550,000 and 850,000 people in the UK pose a risk of sexual harm to children.
Online access to such content can lead to offenders normalising their own consumption, sharing techniques with one another on how to avoid detection, and eventually escalating to committing child sexual abuse offences themselves.
Digital minister Nadine Dorries said tech companies have a responsibility not to provide safe spaces for horrific images of child abuse, and should not ignore the terrible crimes happening on their platforms.
Maeve Hanna, a partner at law firm Allen & Overy, told the BBC that although the intentions behind the amendment were admirable, it was unclear what a tech company would have to do to comply with an Ofcom notice and avoid a large fine.
She said this lack of clarity would pose real challenges for any Ofcom enforcement action: how would Ofcom demonstrate that a tech company had failed to develop a particular technology that does not yet exist?
Follow Shiona McCallum @shionamc