JetBrains announced its 2024.2 IDE updates this week, along with news that its AI Assistant now leverages updated large language models (LLMs), enabling faster code completion for Java, Kotlin and Python.
“The AI chat is now smarter with GPT-4o support and includes chat references for better context,” a company press release stated. “New capabilities include AI-assisted VCS conflict resolution, in-terminal command generation, and customizable prompts for documentation and unit tests.”
Also with this update, the new JetBrains UI — designed to reduce visual complexity and provide easier access to essential features — is now the default for all users. However, for those who dislike change, the classic UI is available as a plugin.
Finally, the Search Everywhere dialog now allows developers to preview the codebase elements they’re searching for. JetBrains IDEs will automatically detect and use system proxy settings configured on the developer’s machine by default.
In addition to this news, a few of the IDE-specific updates include:
- Revamped Jupyter notebooks and new AI cells to help iterate faster on data analysis workloads in PyCharm 2024.2;
- New IDE functionality, such as the “Add Method to Interface and All Its Implementations” refactoring, and support for the latest Go features in GoLand 2024.2. The update also includes performance improvements, fixes for remote development and dev containers, and enhanced support for Go frameworks; and
- WebStorm 2024.2 support for special path resolving for frameworks with file-system-based routing such as Next.js, initial debugging support for Bun, the ability to run and debug TypeScript files directly, version control enhancements and features to improve the user experience.
New OpenAI Feature Ensures Outputs Match JSON Schemas
OpenAI introduced this week Structured Outputs in the API, a feature that ensures model-generated outputs will exactly match JSON Schemas provided by developers.
This is part of an effort that began at last year’s DevDay, when OpenAI introduced JSON mode. JSON mode improved model reliability for generating valid JSON outputs, but didn’t guarantee the model’s response would conform to a particular schema. Structured Outputs closes that gap by guaranteeing exact schema conformance, the company said on its blog.
Generating structured data from unstructured inputs is one of the core use cases for AI in applications, OpenAI explained.
“Developers use the OpenAI API to build powerful assistants that have the ability to fetch data and answer questions via function calling, extract structured data for data entry, and build multistep agentic workflows that allow LLMs to take actions,” it noted.
But developers had to work within the limitations of LLMs by using open source tooling, prompting, and retrying requests to ensure that model outputs match the formats needed to interoperate with their systems, the team noted.
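That pre-Structured-Outputs workaround can be sketched as a validate-and-retry loop like the one below. This is a generic illustration, not OpenAI’s or any library’s actual implementation: `call_model` stands in for any LLM API call, and the required-keys check stands in for a full JSON Schema validation.

```python
import json


def get_structured(call_model, prompt, required_keys, max_retries=3):
    """Ask a model for JSON and retry until the reply both parses and
    contains the expected keys -- the pre-Structured-Outputs pattern."""
    for _ in range(max_retries):
        raw = call_model(prompt)
        try:
            data = json.loads(raw)
        except json.JSONDecodeError:
            continue  # malformed JSON: ask again
        if all(key in data for key in required_keys):
            return data  # parsed and shaped as expected
    raise ValueError(f"no valid response after {max_retries} attempts")


# Stub model: replies with prose once, then with valid JSON,
# mimicking the unreliability the retry loop exists to absorb.
replies = iter(['Sure! Here is the data:', '{"name": "Ada", "role": "engineer"}'])
result = get_structured(lambda p: next(replies), "Extract the person.", ["name", "role"])
print(result["name"])  # Ada
```

The loop burns tokens and latency on every malformed reply, which is exactly the cost Structured Outputs is meant to eliminate.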
“Structured Outputs solves this problem by constraining OpenAI models to match developer-supplied schemas and by training our models to better understand complicated schemas,” it added.
Structured Outputs incorporates two forms in the API:
- “Function calling: Structured Outputs via tools is available by setting strict: true within your function definition,” the blog noted. When Structured Outputs is enabled, model outputs will match the supplied tool definition.
- “A new option for the response_format parameter: Developers can now supply a JSON Schema via json_schema, a new option for the response_format parameter,” the post stated. “This is useful when the model is not calling a tool, but rather, responding to the user in a structured way.”
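For the second form, a request supplies a named schema with strict mode enabled under response_format. The sketch below builds such a payload; the schema, model name, and prompt are illustrative examples rather than values from OpenAI’s post, and the actual client call is shown commented out since it requires an API key.

```python
# Illustrative Structured Outputs request via response_format.
# The schema, model name, and prompt are example values.
person_schema = {
    "type": "object",
    "properties": {
        "name": {"type": "string"},
        "age": {"type": "integer"},
    },
    "required": ["name", "age"],
    "additionalProperties": False,  # strict mode requires closing the schema
}

request = {
    "model": "gpt-4o-2024-08-06",
    "messages": [{"role": "user", "content": "Extract: Ada Lovelace, 36."}],
    "response_format": {
        "type": "json_schema",
        "json_schema": {
            "name": "person",
            "strict": True,  # enforce exact conformance to the schema
            "schema": person_schema,
        },
    },
}

# With the official Python client, this payload would be sent as:
# from openai import OpenAI
# completion = OpenAI().chat.completions.create(**request)
# data = completion.choices[0].message.content  # guaranteed to match person_schema
```

With strict mode on, the returned message content is guaranteed to parse as JSON matching the supplied schema, so the validate-and-retry scaffolding becomes unnecessary.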
The post also outlines some limitations and restrictions: Structured Outputs supports only a subset of JSON Schema, a constraint OpenAI says helps it ensure the best possible performance.
New Open Source Tool Turns Figma Design into Code
Figma has launched a new open source project called Handoff that offers a new way for creators and engineers to automate turning Figma design into code.
Handoff is lightweight, cloud-agnostic, and built for the open source community under the MIT license. It can extract, transform and distribute Figma design decisions as code, bridging the gap between design and development, the company stated. Its code can be tested, improved or deployed from GitHub.
Handoff is built on Figma’s REST API and is available via the Figma Marketplace.
The post JetBrains Improves AI Code Completion, OpenAI Boosts JSON appeared first on The New Stack.