Nue is an open source web framework for UX developers by frontend/UX developer Tero Piirainen. It’s now available in beta and is designed to be a “ridiculously simpler alternative” to Next.js, Gatsby and Astro, writes its creator.
“What used to take a React specialist and an absurd amount of JavaScript can now be done by a UX developer and a small amount of CSS,” Piirainen stated in an announcement Thursday.
It’s designed for UX developers, obviously, but he listed other potential users: beginner web developers, experienced JS developers who are frustrated with the React stack, designers, teachers and parents.
Nue 1.0 includes a global design system, which lets UX developers quickly create different designs by reusing exactly the same markup across projects, according to the documentation.
It also has an improved CSS stack with a CSS theming system that supports hot-reloading, CSS inlining, error reporting and automatic dependency management.
It also enables Lightning CSS by default, adds library folders that can be included on pages with a new include property, and offers an exclude property that strips unneeded assets from the request to lighten the payload.
The Nue site includes a three-step installation guide and a tutorial.
Anthropic’s Claude Introduces Prompt Caching
Anthropic’s AI API now supports prompt caching, which allows developers to cache frequently used context between API calls. This addresses cases where developers have a large amount of prompt context that they want to refer to repeatedly, and over the long term, across requests.
“With prompt caching, customers can provide Claude with more background knowledge and example outputs — all while reducing costs by up to 90% and latency by up to 85% for long prompts,” the company stated.
The Anthropic documentation explains how prompt caching works. When a request is sent with prompt caching enabled, the system first checks if the prompt prefix is already cached from a recent query. If it’s found, then the system uses the cached version; if it’s not, then the full prompt is processed and the prefix cached for future use.
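To make the mechanics concrete, here is a minimal sketch of what a prompt-caching request body might look like for Anthropic’s Messages API. The model name and document text are illustrative, and an actual call would go through Anthropic’s SDK (which, during the beta, also required a prompt-caching beta header); the key idea is the cache_control marker, which tells the system to cache the prompt prefix up to that point.

```python
# Sketch (not an official example): building a Messages API request body
# that marks a long document as a cacheable prompt prefix.

def build_cached_request(document_text: str, question: str) -> dict:
    """Return a request payload whose system prompt prefix is marked cacheable."""
    return {
        "model": "claude-3-5-sonnet-20240620",  # illustrative model name
        "max_tokens": 1024,
        "system": [
            {
                "type": "text",
                "text": "You answer questions about the attached document.",
            },
            {
                "type": "text",
                "text": document_text,  # the large, reused context
                # Everything up to and including this block becomes the
                # cached prefix, so later requests can skip reprocessing it.
                "cache_control": {"type": "ephemeral"},
            },
        ],
        "messages": [{"role": "user", "content": question}],
    }

payload = build_cached_request("Call me Ishmael. ...", "Who is the narrator?")
```

Subsequent requests that reuse the same document prefix would hit the cache, with only the new user question processed fresh.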
Reading from the cache is also billed at a steep discount relative to standard input tokens, which is where the cost savings come from.
As for uses, prompt caching makes it possible to “talk” to books or other long documents, which happens to be my personal favorite AI use case. With prompt caching, developers can bring books into the knowledge base by embedding the document into the prompt and letting users ask questions related to it.
But there are other uses as well, including creating conversational agents, coding assistants, large document processing or when there’s a detailed instruction set.
The blog post includes tables that show the latency improvements early customers saw with prompt caching. It’s available in public beta for Claude 3.5 Sonnet and Claude 3 Haiku.
OpenAI Adds Structured Outputs in the API
This week, OpenAI added Structured Outputs to its API, which is designed to ensure model-generated outputs will exactly match JSON schemas provided by developers, the company stated. It’s now generally available in the API.
This resolves a problem with the JSON mode introduced last year to improve model reliability. JSON mode generated valid JSON outputs but did not guarantee that the models’ response would conform to a particular schema.
“Generating structured data from unstructured inputs is one of the core use cases for AI in today’s applications,” OpenAI noted, but developers have had to use open source tools, prompting and retry requests to work with LLM limitations in this area. “Structured Outputs solves this problem by constraining OpenAI models to match developer-supplied schemas and by training our models to better understand complicated schemas.”
The new model scored a perfect 100% on an evaluation of complex JSON schema following, compared with less than 40% for the GPT model without Structured Outputs, the company stated.
This capability is available in two forms: as a function call, and as a new option for the response_format parameter. The Python and Node SDKs have been updated with native support for Structured Outputs as well.
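Using the response_format form, a request body might look like the following sketch. The model name and schema here are illustrative, not from OpenAI’s announcement; the essential parts are the json_schema type and the strict flag, which constrain the model’s output to the supplied schema.

```python
# Sketch (not an official example): a Chat Completions request body using
# Structured Outputs via the response_format parameter.

def build_structured_request(user_text: str) -> dict:
    """Return a request payload that constrains output to an event schema."""
    return {
        "model": "gpt-4o-2024-08-06",  # illustrative model name
        "messages": [
            {"role": "system", "content": "Extract the event details."},
            {"role": "user", "content": user_text},
        ],
        "response_format": {
            "type": "json_schema",
            "json_schema": {
                "name": "event",
                # strict mode guarantees the output matches the schema exactly.
                "strict": True,
                "schema": {
                    "type": "object",
                    "properties": {
                        "name": {"type": "string"},
                        "date": {"type": "string"},
                    },
                    "required": ["name", "date"],
                    "additionalProperties": False,
                },
            },
        },
    }

req = build_structured_request("Alice and Bob are meeting on Friday.")
```

With strict mode on, the model’s reply is guaranteed to be valid JSON conforming to the event schema, so no retry-and-reparse loop is needed.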
Some use cases for this cited by OpenAI are:
- Dynamically generating user interfaces based on the user’s intent;
- Separating a final answer from supporting reasoning or additional commentary;
- Extracting structured data from unstructured data.
The blog post outlines how OpenAI achieved this and explains alternative approaches to the problem. It also lists the limitations of Structured Outputs, including that it allows only a subset of JSON schema. There’s also documentation for implementing Structured Outputs.
Survey Respondent: AI Like ‘Auto Complete’
While we’re on the topic of AI, the Pragmatic Engineer published the results of an AI survey on how software engineers are using large language models. The site shared the results in its newsletter and blog. The survey was completed by 211 tech professionals, which is not a lot, but the results are fun to read. It turns out developers are not that impressed with AI, with respondents comparing it to autocomplete or being paired with a junior programmer.
One respondent even compared it to an intern.
“It’s like having a really eager, fast intern as an assistant, who’s really good at looking stuff up, but has no wisdom or awareness,” the anonymous respondent wrote.
The post A ‘Nue’ UX Web Framework; Plus Anthropic, OpenAI Boost AI APIs appeared first on The New Stack.