LLMs

No, LLMs aren’t about to “autonomously” hack your company

Welcome to your latest episode of “is this exciting or is this mild AI exaggeration™”

Meta launches Llama 3: Puts its GPUs to work on Llama 4

The LLM arms race continues: Don't bet against the company spending $33 billion per year on GPUs.

Synthetic focus groups and RAG in the contact centre: Bayer, Verizon, WPP on their AI deployments

"You can create a consumer, a brand strategist, a brand marketer, client, encoded with actual ground truth data, then critique the content that's been generated by the system with agents playing off against each other."

“Scant evidence” – Google’s AI chemistry claims were very misleading

AI discovered 2.2m new "materials", said DeepMind. Chemistry professors investigated and found hallucinations, repetition, and already-known crystals.

DataStax snaps up Langchain-democratiser Langflow

"You have to chain user input, system prompts, and DB data to feed the LLM and then lots of processing to deliver that magic AI agent experience to the user"

Databricks launches its LLM, DBRX

Inference “up to 2x faster than LLaMA2-70B” for new model, trained on 12 trillion tokens.

Musk open-sources 314-billion parameter LLM "Grok" (including weights), but...

Apache 2.0 licence and weights, but no training code/reproducible datasets.

Europe’s AI Act demands extensive "logs" - targets biometrics, bias, black boxes

Emotion recognition banned in workplaces, classrooms.

The key to making Large Language Models work is becoming clearer.

"GenAI with too low a temperature lacks creative spark... Too high a temperature and it will strongly hallucinate" -- Neo4j’s Jim Webber discusses new ways of delivering GenAI value.
