<?xml version="1.0" encoding="UTF-8"?><rss version="2.0"
	xmlns:content="http://purl.org/rss/1.0/modules/content/"
	xmlns:wfw="http://wellformedweb.org/CommentAPI/"
	xmlns:dc="http://purl.org/dc/elements/1.1/"
	xmlns:atom="http://www.w3.org/2005/Atom"
	xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
	xmlns:slash="http://purl.org/rss/1.0/modules/slash/"
	 xmlns:media="http://search.yahoo.com/mrss/" >

<channel>
	<title>PyCharm : The only Python IDE you need. | The JetBrains Blog</title>
	<atom:link href="https://blog.jetbrains.com/pycharm/feed/" rel="self" type="application/rss+xml" />
	<link>https://blog.jetbrains.com</link>
	<description>Developer Tools for Professionals and Teams</description>
	<lastBuildDate>Thu, 09 Apr 2026 12:46:20 +0000</lastBuildDate>
	<language>en-US</language>
	<sy:updatePeriod>
	hourly	</sy:updatePeriod>
	<sy:updateFrequency>
	1	</sy:updateFrequency>
	

<image>
	<url>https://blog.jetbrains.com/wp-content/uploads/2024/01/cropped-mstile-310x310-1-32x32.png</url>
	<title>PyCharm : The only Python IDE you need. | The JetBrains Blog</title>
	<link>https://blog.jetbrains.com</link>
	<width>32</width>
	<height>32</height>
</image> 
	<item>
		<title>How to Train Your First TensorFlow Model in PyCharm</title>
		<link>https://blog.jetbrains.com/pycharm/2026/04/how-to-train-your-first-tensorflow-model/</link>
		
		<dc:creator><![CDATA[Evgenia Verbina]]></dc:creator>
		<pubDate>Tue, 07 Apr 2026 10:36:35 +0000</pubDate>
		<featuredImage>https://blog.jetbrains.com/wp-content/uploads/2026/04/PC-social-BlogSocialShare-1280x720-1-1.png</featuredImage>		<category><![CDATA[data-science]]></category>
		<category><![CDATA[tutorials]]></category>
		<category><![CDATA[tensorflow]]></category>
		<category><![CDATA[tensors]]></category>
		<guid isPermaLink="false">https://blog.jetbrains.com/?post_type=pycharm&#038;p=697464</guid>

					<description><![CDATA[This is a guest post from Iulia Feroli, founder of the Back To Engineering community on YouTube. TensorFlow is a powerful open-source framework for building machine learning and deep learning systems. At its core, it works with tensors (a.k.a multi‑dimensional arrays) and provides high‑level libraries (like Keras) that make it easy to transform raw data [&#8230;]]]></description>
										<content:encoded><![CDATA[
<p><em>This is a guest post from </em><strong><em><a href="https://blog.jetbrains.com/pycharm/2026/04/how-to-train-your-first-tensorflow-model/#author" data-type="link" data-id="https://blog.jetbrains.com/pycharm/2026/04/how-to-train-your-first-tensorflow-model/#author">Iulia Feroli</a></em></strong><em>, founder of the Back To Engineering community on YouTube.</em></p>



<figure class="wp-block-image size-full"><img style="width:100% !important; height:auto !important; max-width:100% !important;" decoding="async" fetchpriority="high" src="https://blog.jetbrains.com/wp-content/uploads/2026/04/PC-social-BlogFeatured-1280x720-1-1.png" alt="How to Train Your First TensorFlow Model in PyCharm" class="wp-image-697465"/></figure>



<p><a href="https://www.tensorflow.org/" target="_blank" rel="noopener">TensorFlow</a> is a powerful open-source framework for building machine learning and deep learning systems. At its core, it works with tensors (a.k.a. multi-dimensional arrays) and provides high-level libraries (like Keras) that make it easy to transform raw data into models you can train, evaluate, and deploy.</p>



<p>TensorFlow helps you handle the full pipeline: loading and preprocessing data, assembling models from layers and activations, training with optimizers and loss functions, and exporting for serving or even running on edge devices (including lightweight TensorFlow Lite models on Raspberry Pi and other microcontrollers).&nbsp;</p>



<p>If you want to build data-driven applications, prototype neural networks, or ship models to production or devices, learning TensorFlow gives you a consistent, well-supported toolkit to go from idea to deployment.</p>



<p>If you’re brand new to TensorFlow, start by watching the <strong><a href="https://www.youtube.com/watch?v=hm07b8ETaso" data-type="link" data-id="https://www.youtube.com/watch?v=hm07b8ETaso" target="_blank" rel="noopener">short overview video</a></strong> where I explain tensors, neural networks, layers, why TensorFlow is great for taking you from data to model to deployment, and how all of this can be explained with a LEGO-style piece-sorting example.&nbsp;</p>



<p>In this blog post, I’ll walk you through a first, stripped-down TensorFlow implementation notebook so we can get started with some practical experience. You can also watch the walkthrough video to follow along.</p>



<figure class="wp-block-embed is-type-video is-provider-youtube wp-block-embed-youtube wp-embed-aspect-16-9 wp-has-aspect-ratio"><div class="wp-block-embed__wrapper">
<iframe title="Build Your First TensorFlow Model in Python (A Step-by-Step Tutorial)" src="https://www.youtube.com/embed/nswGrvOhaOY?feature=oembed" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" referrerpolicy="strict-origin-when-cross-origin" allowfullscreen></iframe>
</div></figure>



<p>We&#8217;ll be exploring a very simple use case today: load the Fashion MNIST dataset, build two very simple Keras models, train and compare them, then dig into visualizations (predictions, confidence bars, confusion matrix). I kept the code minimal and readable so you can focus on the ideas – and you’ll see how <a href="https://www.jetbrains.com/pycharm/data-science/" target="_blank" rel="noopener">PyCharm</a> helps along the way.</p>



<h2 class="wp-block-heading">Training TensorFlow models step by step</h2>



<h3 class="wp-block-heading">Getting started in PyCharm</h3>



<p>We&#8217;ll be leveraging PyCharm&#8217;s native Notebook integration to build out <a href="https://github.com/iuliaferoli/TensorFlow_with_pycharm" target="_blank" rel="noopener">our project</a>. This way, we can inspect each step of the pipeline and use some supporting visualization along the way. We&#8217;ll <a href="https://www.jetbrains.com/help/pycharm/creating-empty-project.html" target="_blank" rel="noopener">create a new project</a> and <a href="https://www.jetbrains.com/help/pycharm/creating-virtual-environment.html" target="_blank" rel="noopener">generate a virtual environment</a> to manage our dependencies.&nbsp;</p>



<p>If you&#8217;re running the code from the attached repo, you can install directly from the requirements file. If you wish to expand this example with additional visualizations or further models, you can easily add more packages to your requirements as you go by using the PyCharm package manager helpers for <a href="https://www.jetbrains.com/guide/python/tips/install-and-import/" target="_blank" rel="noopener">installing</a> and <a href="https://www.jetbrains.com/help/pycharm/installing-uninstalling-and-upgrading-packages.html" target="_blank" rel="noopener">upgrading</a>.</p>



<h3 class="wp-block-heading">Load <code>Fashion MNIST</code> and inspect the data</h3>
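<p>If you&#8217;re following along outside the repo, the dataset can be pulled in directly through Keras. A minimal loading sketch (the variable names <code>x_train</code>, <code>y_train</code>, and <code>class_names</code> are assumed to match the notebook):</p>

<pre class="EnlighterJSRAW" data-enlighter-language="python" data-enlighter-theme="" data-enlighter-highlight="" data-enlighter-linenumbers="" data-enlighter-lineoffset="" data-enlighter-title="" data-enlighter-group="">import tensorflow as tf

# Fashion MNIST ships with Keras: 60,000 training and 10,000 test images
(x_train, y_train), (x_test, y_test) = tf.keras.datasets.fashion_mnist.load_data()

# Scale pixel values from [0, 255] down to [0, 1] for more stable training
x_train, x_test = x_train / 255.0, x_test / 255.0

# Human-readable names for the ten garment classes (labels are integers 0-9)
class_names = ['T-shirt/top', 'Trouser', 'Pullover', 'Dress', 'Coat',
               'Sandal', 'Shirt', 'Sneaker', 'Bag', 'Ankle boot']</pre>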



<p><code>Fashion MNIST</code> is a great starter dataset because the images are small (28×28 pixels), visually meaningful, and easy to interpret. They represent various garment types as pixelated grayscale images and come with labels for a well-contained classification task. We can first take a look at our data sample by printing some of these images with various matplotlib functions:</p>



<figure class="wp-block-image size-full"><img style="width:100% !important; height:auto !important; max-width:100% !important;" decoding="async" src="https://blog.jetbrains.com/wp-content/uploads/2026/04/image1.png" alt="" class="wp-image-699830"/></figure>



<pre class="EnlighterJSRAW" data-enlighter-language="python" data-enlighter-theme="" data-enlighter-highlight="" data-enlighter-linenumbers="" data-enlighter-lineoffset="" data-enlighter-title="" data-enlighter-group="">import matplotlib.pyplot as plt

fig, axes = plt.subplots(2, 5, figsize=(10, 4))
for i, ax in enumerate(axes.flat):
    ax.imshow(x_train[i], cmap='gray')
    ax.set_title(class_names[y_train[i]])
    ax.axis('off')
plt.show()</pre>



<h3 class="wp-block-heading">Two simple models (a quick experiment)</h3>



<pre class="EnlighterJSRAW" data-enlighter-language="python" data-enlighter-theme="" data-enlighter-highlight="" data-enlighter-linenumbers="" data-enlighter-lineoffset="" data-enlighter-title="" data-enlighter-group="">from tensorflow.keras import layers, models

model1 = models.Sequential([
    layers.Flatten(input_shape=(28, 28)),
    layers.Dense(128, activation='relu'),
    layers.Dense(10, activation='softmax')
])
model2 = models.Sequential([
    layers.Flatten(input_shape=(28, 28)),
    layers.Dense(128, activation='relu'),
    layers.Dense(128, activation='relu'),
    layers.Dense(10, activation='softmax')
])</pre>



<h3 class="wp-block-heading">Compile and train your first model</h3>



<p>From here, we can compile and train our first TensorFlow model(s). With PyCharm’s code completion features and documentation access, you can get instant suggestions for building out these simple code blocks.</p>



<p>For a first try at TensorFlow, this allows us to spin up a working model with just a few presses of <em>Tab</em> in our IDE. We&#8217;re using the recommended standard optimizer and loss function, and we&#8217;re tracking accuracy. We can build multiple models by playing around with the number or type of layers, along with the other parameters.&nbsp;</p>



<pre class="EnlighterJSRAW" data-enlighter-language="python" data-enlighter-theme="" data-enlighter-highlight="" data-enlighter-linenumbers="" data-enlighter-lineoffset="" data-enlighter-title="" data-enlighter-group="">model1.compile(
    optimizer='adam',
    loss='sparse_categorical_crossentropy',
    metrics=['accuracy']
)
model1.fit(x_train, y_train, epochs=10)
model2.compile(
    optimizer='adam',
    loss='sparse_categorical_crossentropy',
    metrics=['accuracy']
)
model2.fit(x_train, y_train, epochs=15)</pre>



<h3 class="wp-block-heading">Evaluate and compare your TensorFlow model performance</h3>



<pre class="EnlighterJSRAW" data-enlighter-language="python" data-enlighter-theme="" data-enlighter-highlight="" data-enlighter-linenumbers="" data-enlighter-lineoffset="" data-enlighter-title="" data-enlighter-group="">loss1, accuracy1 = model1.evaluate(x_test, y_test)
print(f'Accuracy of model1: {accuracy1:.2f}')
loss2, accuracy2 = model2.evaluate(x_test, y_test)
print(f'Accuracy of model2: {accuracy2:.2f}')</pre>



<p>Once the models are trained (and you can see the epochs progressing visually as each cell is run), we can immediately evaluate the performance of the models.</p>



<p>In my experiment, <code>model1</code> sits around ~0.88 accuracy, and while <code>model2</code> is a little higher than that, it took 50% longer to train. That’s the kind of trade‑off you should be thinking about: Is a tiny accuracy gain worth the additional compute and complexity?&nbsp;</p>



<p>We can dive further into the results of the model run by generating a DataFrame instance of our new prediction dataset. Here we can also leverage built-in functions like <code>describe()</code> to quickly get some initial statistical impressions:</p>



<figure class="wp-block-image size-full"><img style="width:100% !important; height:auto !important; max-width:100% !important;" decoding="async" loading="lazy" src="https://blog.jetbrains.com/wp-content/uploads/2026/04/image2.png" alt="" class="wp-image-699841"/></figure>



<pre class="EnlighterJSRAW" data-enlighter-language="python" data-enlighter-theme="" data-enlighter-highlight="" data-enlighter-linenumbers="" data-enlighter-lineoffset="" data-enlighter-title="" data-enlighter-group="">import pandas as pd

predictions = model1.predict(x_test)
df_pred = pd.DataFrame(predictions, columns=class_names)
df_pred.describe()</pre>



<p>However, the most useful statistics compare our model&#8217;s predictions with the ground-truth labels of our dataset. We can also break this down by item category:</p>



<pre class="EnlighterJSRAW" data-enlighter-language="python" data-enlighter-theme="" data-enlighter-highlight="" data-enlighter-linenumbers="" data-enlighter-lineoffset="" data-enlighter-title="" data-enlighter-group="">import matplotlib.pyplot as plt
import seaborn as sns
from sklearn.metrics import confusion_matrix, classification_report

y_pred = model1.predict(x_test).argmax(axis=1)
cm = confusion_matrix(y_test, y_pred)
plt.figure(figsize=(8,6))
sns.heatmap(cm, annot=True, fmt='d', cmap='Blues', xticklabels=class_names, yticklabels=class_names)
plt.xlabel('Predicted')
plt.ylabel('True')
plt.title('Confusion Matrix')
plt.show()
print('Classification report:')
print(classification_report(y_test, y_pred, target_names=class_names))</pre>



<p>From here, we notice that accuracy differs quite a bit by garment type. A possible interpretation is that trousers are visually quite distinct from, say, t-shirts and shirts, which are more commonly confused with each other.&nbsp;</p>



<p>This is, of course, the type of nuance that, as humans, we can pick up by looking at the images, but the model only has access to a matrix of pixel values. The data does seem, however, to confirm our intuition. We can further build a more comprehensive visualization to test this hypothesis.&nbsp;</p>



<figure class="wp-block-image size-full"><img style="width:100% !important; height:auto !important; max-width:100% !important;" decoding="async" loading="lazy" src="https://blog.jetbrains.com/wp-content/uploads/2026/04/image-4.png" alt="" class="wp-image-697493"/></figure>



<pre class="EnlighterJSRAW" data-enlighter-language="python" data-enlighter-theme="" data-enlighter-highlight="" data-enlighter-linenumbers="" data-enlighter-lineoffset="" data-enlighter-title="" data-enlighter-group="">import numpy as np
import matplotlib.pyplot as plt
# pick 8 wrong examples
y_pred = predictions.argmax(axis=1)
wrong_idx = np.where(y_pred != y_test)[0][:8]  # first 8 mistakes
n = len(wrong_idx)
fig, axes = plt.subplots(n, 2, figsize=(10, 2.2 * n), constrained_layout=True)
for row, idx in enumerate(wrong_idx):
    p = predictions[idx]
    pred = int(np.argmax(p))
    true = int(y_test[idx])
    axes[row, 0].imshow(x_test[idx], cmap="gray")
    axes[row, 0].axis("off")
    axes[row, 0].set_title(
        f"WRONG  P:{class_names[pred]} ({p[pred]:.2f})  T:{class_names[true]}",
        color="red",
        fontsize=10
    )
    bars = axes[row, 1].bar(range(len(class_names)), p, color="lightgray")
    bars[pred].set_color("red")
    axes[row, 1].set_ylim(0, 1)
    axes[row, 1].set_xticks(range(len(class_names)))
    axes[row, 1].set_xticklabels(class_names, rotation=90, fontsize=8)
    axes[row, 1].set_ylabel("conf", fontsize=9)
plt.show()</pre>



<p>This visualization gives us a view where we can explore the confidence our model had in a prediction: by examining the weight given to each class, we can see where there was doubt (i.e. multiple classes with relatively high weights) versus where the model was certain (a single dominant guess). These examples further confirm our intuition: tops appear to be more commonly confused by the model.&nbsp;</p>



<h2 class="wp-block-heading">Conclusion</h2>



<p>And there we have it! We were able to set up and train our first model and already derive some data science insights from our data and model results. Using some of PyCharm&#8217;s functionality at this point can speed up the experimentation process by providing quick documentation access and applying code completion directly in the cells. We can even use AI Assistant to help generate some of the graphs we&#8217;ll need to further evaluate the TensorFlow model performance and investigate our results.</p>



<p>You can <a href="https://github.com/iuliaferoli/TensorFlow_with_pycharm" target="_blank" rel="noopener">try out this notebook yourself</a>, or better yet, try to generate it with these same tools for a more hands-on learning experience.</p>



<h2 class="wp-block-heading">Where to go next</h2>



<p><a href="https://github.com/iuliaferoli/TensorFlow_with_pycharm" target="_blank" rel="noopener">This notebook</a> is a minimal, teachable starting point. Here are some practical next steps to try afterwards:</p>



<ul>
<li>Replace the dense baseline with a small CNN (Conv2D → MaxPooling → Dense).</li>



<li>Add dropout or batch normalization to reduce overfitting.</li>



<li>Apply data augmentation (random shifts/rotations) to improve generalization.</li>



<li>Use callbacks like <code>EarlyStopping</code> and <code>ModelCheckpoint</code> so training is efficient, and you keep the best weights.</li>



<li>Export a <code>SavedModel</code> for server use or convert to TensorFlow Lite for edge devices (Raspberry Pi, microcontrollers).</li>
</ul>
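<p>As a sketch of the first suggestion, a small convolutional baseline might look like this (layer sizes here are illustrative, not tuned):</p>

<pre class="EnlighterJSRAW" data-enlighter-language="python" data-enlighter-theme="" data-enlighter-highlight="" data-enlighter-linenumbers="" data-enlighter-lineoffset="" data-enlighter-title="" data-enlighter-group="">from tensorflow.keras import layers, models

cnn = models.Sequential([
    layers.Input(shape=(28, 28, 1)),          # Fashion MNIST images plus a channel axis
    layers.Conv2D(32, 3, activation='relu'),  # learn local spatial features
    layers.MaxPooling2D(),                    # downsample the feature maps
    layers.Conv2D(64, 3, activation='relu'),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(128, activation='relu'),
    layers.Dense(10, activation='softmax')    # one probability per class
])
cnn.compile(optimizer='adam',
            loss='sparse_categorical_crossentropy',
            metrics=['accuracy'])</pre>

<p>Trained the same way as the dense models above (with the extra channel axis added via <code>x_train[..., None]</code>), a small CNN like this typically outperforms a dense-only baseline on image data because the convolutions exploit the spatial structure of the pixels.</p>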



<h2 class="wp-block-heading">Frequently asked questions</h2>



<h3 class="wp-block-heading">When should I use TensorFlow?</h3>



<p>TensorFlow is best used when building machine learning or deep learning models that need to scale, go into production, or run across different environments (cloud, mobile, edge devices).&nbsp;</p>



<p>TensorFlow is particularly well suited for large-scale models and neural networks, including scenarios where you need strong deployment support (TensorFlow Serving, TensorFlow Lite). For research prototypes, TensorFlow is viable, but lighter-weight frameworks are often preferred for quick experimentation.</p>



<h3 class="wp-block-heading">Can TensorFlow run on a GPU?</h3>



<p>Yes, TensorFlow can run on GPUs and TPUs. Using a GPU can significantly speed up training, especially for deep learning models with large datasets. Best of all, TensorFlow will automatically use an available GPU if it&#8217;s properly configured.</p>
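<p>You can check which accelerators TensorFlow sees on your machine with a one-liner:</p>

<pre class="EnlighterJSRAW" data-enlighter-language="python" data-enlighter-theme="" data-enlighter-highlight="" data-enlighter-linenumbers="" data-enlighter-lineoffset="" data-enlighter-title="" data-enlighter-group="">import tensorflow as tf

# Lists the GPU devices TensorFlow can use; an empty list means CPU-only
gpus = tf.config.list_physical_devices('GPU')
print(gpus)</pre>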



<h3 class="wp-block-heading">What is loss in TensorFlow?</h3>



<p>Loss (also known as the loss function) is a numerical value that measures how far a model&#8217;s predictions are from the actual target values. A few examples include:&nbsp;</p>



<ul>
<li>MSE (mean squared error), used in regression tasks.</li>



<li>Cross-entropy loss, often used in classification tasks.</li>
</ul>
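<p>As a small illustration (with arbitrary example values), cross-entropy rewards confident, correct predictions with a lower loss:</p>

<pre class="EnlighterJSRAW" data-enlighter-language="python" data-enlighter-theme="" data-enlighter-highlight="" data-enlighter-linenumbers="" data-enlighter-lineoffset="" data-enlighter-title="" data-enlighter-group="">import tensorflow as tf

scce = tf.keras.losses.SparseCategoricalCrossentropy()

# True class is index 1; compare a confident prediction with an unsure one
confident = tf.constant([[0.05, 0.90, 0.05]])
unsure    = tf.constant([[0.40, 0.30, 0.30]])

loss_confident = scce(tf.constant([1]), confident).numpy()  # -ln(0.9), about 0.105
loss_unsure    = scce(tf.constant([1]), unsure).numpy()     # -ln(0.3), about 1.204</pre>

<p>The confident, correct prediction yields the smaller loss, which is exactly the signal the optimizer uses to adjust the model&#8217;s weights during training.</p>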



<h3 class="wp-block-heading">How many epochs should I use?</h3>



<p>There’s no set number of epochs to use, as it depends on your dataset and model. Typical approaches cover:&nbsp;</p>



<ul>
<li>Starting with a conservative number (10–50 epochs).</li>



<li>Monitoring validation loss/accuracy and adjusting based on the results you see.</li>



<li>Using early stopping to halt training once improvement stalls.</li>
</ul>



<p>An epoch is one full pass through your training data. Too few passes lead to underfitting, and too many can cause overfitting. The sweet spot is where your model generalizes best to unseen data.&nbsp;</p>
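<p>A sketch of that early-stopping setup (the parameter values are illustrative):</p>

<pre class="EnlighterJSRAW" data-enlighter-language="python" data-enlighter-theme="" data-enlighter-highlight="" data-enlighter-linenumbers="" data-enlighter-lineoffset="" data-enlighter-title="" data-enlighter-group="">from tensorflow.keras.callbacks import EarlyStopping

# Stop when validation loss hasn't improved for 3 consecutive epochs,
# and roll back to the best weights seen so far
early_stop = EarlyStopping(monitor='val_loss', patience=3,
                           restore_best_weights=True)

# Then pass it to training, e.g.:
# model.fit(x_train, y_train, validation_split=0.1,
#           epochs=50, callbacks=[early_stop])</pre>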



<h2 class="wp-block-heading" id="author">About the author</h2>


    <div class="about-author ">
        <div class="about-author__box">
            <div class="row">
                                                            <div class="about-author__box-img">
                            <img style="width:100% !important; height:auto !important; max-width:100% !important;" decoding="async" src="https://blog.jetbrains.com/wp-content/uploads/2026/04/Iulia-Feroli-e1775558363746.png" alt="" loading="lazy">
                        </div>
                                        <div class="about-author__box-text">
                                                    <h4>Iulia Feroli</h4>
                                                <p><span style="font-weight: 400;">Iulia’s mission is to make tech exciting, understandable, and accessible to the new generation.</span></p>
<p><span style="font-weight: 400;">With a background spanning data science, AI, cloud architecture, and open source, she brings a unique perspective on bridging technical depth with approachability.</span></p>
<p><span style="font-weight: 400;">She’s building her own brand, Back To Engineering, through which she creates a community for tech enthusiasts, engineers, and makers. From YouTube videos on building robots from scratch, to conference talks or keynotes about real, grounded AI, and technical blogs and tutorials </span><span style="font-weight: 400;">–</span><span style="font-weight: 400;"> Iulia shares her message worldwide on how to turn complex concepts into tools developers can use every day.</span></p>
                    </div>
                            </div>
        </div>
    </div>



<p></p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>What’s New in PyCharm 2026.1</title>
		<link>https://blog.jetbrains.com/pycharm/2026/03/what-s-new-in-pycharm-2026-1/</link>
		
		<dc:creator><![CDATA[Ilia Afanasiev]]></dc:creator>
		<pubDate>Mon, 30 Mar 2026 15:31:08 +0000</pubDate>
		<featuredImage>https://blog.jetbrains.com/wp-content/uploads/2026/03/PC-releases-BlogFeatured-1280x720-1.png</featuredImage>		<category><![CDATA[releases]]></category>
		<guid isPermaLink="false">https://blog.jetbrains.com/?post_type=pycharm&#038;p=687025</guid>

					<description><![CDATA[Welcome to PyCharm 2026.1. This release doesn’t just add features – it rethinks how you build, debug, and scale Python projects. From a brand-new debugging engine powered by debugpy to first-class uv support on remote targets and expanded JavaScript support in the free tier, this version is all about removing friction and letting you focus [&#8230;]]]></description>
										<content:encoded><![CDATA[
<p>Welcome to PyCharm 2026.1. This release doesn’t just add features – it rethinks how you build, debug, and scale Python projects. From a brand-new debugging engine powered by debugpy to first-class uv support on remote targets and expanded JavaScript support in the free tier, this version is all about removing friction and letting you focus on your code. Whether you’re working locally, over SSH, or inside Docker, PyCharm now adapts to your setup instead of the other way around.</p>



<p>In this post, we’ll explore the highlights of this update and show you how these improvements can streamline your daily workflow.</p>



<h2 class="wp-block-heading">Standardizing the future of debugging with debugpy</h2>



<p>PyCharm now offers the option to use debugpy as the default debugger backend, providing the industry-standard Debug Adapter Protocol (DAP) that aligns the IDE with the broader Python ecosystem. By replacing complex, legacy socket-waiting logic with a more stable connection model, race conditions and timing edge cases will no longer interfere with your debugging experience.</p>



<h3 class="wp-block-heading">A modern foundation for Python development</h3>



<p>The new engine provides full native support for <a href="https://peps.python.org/pep-0669/" target="_blank" rel="noopener">PEP 669</a>, utilizing Python 3.12’s low-impact monitoring API to significantly reduce debugger overhead compared to the legacy <code>sys.settrace()</code> approach. This ensures that your debugging sessions are faster and less intrusive. Furthermore, the migration introduces comprehensive <code>asyncio</code> support. You can now use the full suite of debugger tools, such as the debug console and expression evaluation, directly within async contexts for modern frameworks like FastAPI and aiohttp.&nbsp;</p>



<h3 class="wp-block-heading">Reliability across environments</h3>



<p>Beyond performance improvements, debugpy simplifies the <em>Attach to Process</em> experience by providing a standardized approach for Docker containers, remote servers on AWS, Azure, or GCP, and local running processes. For specialized workflows, we have introduced a new <em>Attach to DAP</em> run configuration. This allows you to connect to targets using the <code>debugpy.listen()</code> command, eliminating the friction of manual connection management and allowing you to focus on your code instead of debugging infrastructure.</p>
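<p>On the target side, exposing a running process for such a DAP attach typically looks like this (host and port are arbitrary examples):</p>

<pre class="EnlighterJSRAW" data-enlighter-language="python" data-enlighter-theme="" data-enlighter-highlight="" data-enlighter-linenumbers="" data-enlighter-lineoffset="" data-enlighter-title="" data-enlighter-group="">import debugpy

# Open a DAP endpoint on this process; the IDE's Attach to DAP
# run configuration then connects to the same host and port
debugpy.listen(("localhost", 5678))

# Optionally block until a client attaches before continuing:
# debugpy.wait_for_client()</pre>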



<figure class="wp-block-video"><video controls src="https://blog.jetbrains.com/wp-content/uploads/2026/03/debugpy.webm"></video></figure>



<h2 class="wp-block-heading">Support for uv as a remote interpreter</h2>



<p>Many developers work on projects where the code and dependencies live on a remote server – whether via SSH, in WSL, or inside Docker. By connecting PyCharm to a remote machine and using uv as the interpreter, you can keep the environment fully synchronized, ensure package management works as expected, and run projects smoothly – just as if everything were local.</p>



<figure class="wp-block-video"><video controls src="https://blog.jetbrains.com/wp-content/uploads/2026/03/uv_on_wsl.webm"></video></figure>



<h2 class="wp-block-heading">Free professional web development for everyone</h2>



<p>With PyCharm 2026.1, the core IDE experience continues to evolve as we bring a broader set of professional-grade web tools to all users for free. Everyone, from beginners to backend-first developers, now has access to a substantial set of JavaScript, TypeScript, and CSS features, as well as advanced navigation and code intelligence previously available only with a Pro subscription.</p>



<figure class="wp-block-video"><video controls src="https://blog.jetbrains.com/wp-content/uploads/2026/03/Webstorm_Free_JS.webm"></video></figure>



<p>For a complete breakdown of all new features, check out this <a href="https://blog.jetbrains.com/pycharm/2026/03/expanding-our-core-web-development-support-in-pycharm-2026-1/">blog post</a>. </p>



<h2 class="wp-block-heading">Advancements in AI integration</h2>



<p>PyCharm is evolving into an open platform that gives you the freedom to bring the AI tools of your choice directly into your professional development workflow. This release focuses on providing a flexible ecosystem where you can orchestrate the best models and agents available today.</p>



<h3 class="wp-block-heading">The ACP Registry: Your gateway to new agents</h3>



<p>Keeping up with the rapid pace of AI development can be a challenge, with new coding agents appearing almost daily. To help you navigate this dynamic landscape, we’ve launched the <a href="https://blog.jetbrains.com/ai/2026/01/acp-agent-registry/">ACP Registry</a> – a built-in directory of AI coding agents integrated directly into your IDE via the Agent Client Protocol.</p>



<p>Whether you want to experiment with open-source agents like OpenCode or specialized tools like Gemini CLI, you can now discover and install them in just a few clicks. If you have a custom setup or an agent that isn’t listed yet, you can easily add it via the <code>acp.json</code> configuration, giving you the flexibility to use your favorite tools, with no strings attached.</p>



<figure class="wp-block-video"><video controls src="https://blog.jetbrains.com/wp-content/uploads/2026/03/ACP.webm"></video></figure>



<h3 class="wp-block-heading">Native OpenAI Codex integration and BYOK</h3>



<p>OpenAI Codex is now natively integrated into the JetBrains AI chat. This means you can tackle complex development tasks without switching to a browser or copy-pasting code between windows.</p>



<p>We’ve also introduced Bring Your Own Key (BYOK) support. You can now connect your own API keys from OpenAI, Anthropic, or other compatible providers – including local models – directly in the IDE settings. This allows you to choose the setup that fits your workflow and budget best, while keeping all your AI-powered development inside PyCharm.</p>



<h3 class="wp-block-heading">Stay in the flow with next edit suggestions</h3>



<p>Small changes in your code often trigger a cascade of mechanical follow-up edits. Adding a parameter to a function or renaming a symbol can lead to errors popping up across your entire file.</p>



<p>Next edit suggestions (NES) offer a smarter, lightweight alternative to asking an AI agent for a full rewrite. As you modify your code, PyCharm proactively predicts the most likely next changes and suggests them inline.</p>



<ul>
<li><strong>Effortless consistency:</strong> Update all call sites across a file with a simple <em>Tab Tab</em> experience.</li>



<li><strong>Stay in control:</strong> Move step by step through changes rather than reviewing large, automated diffs.</li>



<li><strong>No quota required:</strong> Use NES without consuming AI credits – available without consuming the AI quota of your JetBrains AI Pro subscription.</li>
</ul>



<p>This natural evolution of code completion keeps you in the flow, making those small cascading fixes feel almost effortless.</p>



<figure class="wp-block-video"><video controls src="https://blog.jetbrains.com/wp-content/uploads/2026/03/NES.webm"></video></figure>



<p>All of the updates mentioned above are just a glimpse of what’s new in PyCharm 2026.1.</p>



<p>There is even more under the hood, including performance improvements, stability upgrades, and thoughtful refinements across the IDE that make everyday development smoother and faster.</p>



<p>To explore the full list of updates, check out our <a href="https://www.jetbrains.com/pycharm/whatsnew/" target="_blank" rel="noopener">What’s New</a> page.&nbsp;</p>



<p>As always, we would love to hear your feedback. Your insights help us shape the future of PyCharm – and we cannot wait to see what you build next.</p>
]]></content:encoded>
					
		
		
		                    <language>
                        <code><![CDATA[zh-hans]]></code>
                        <url>https://blog.jetbrains.com/zh-hans/pycharm/2026/03/what-s-new-in-pycharm-2026-1/</url>
                    </language>
                                    <language>
                        <code><![CDATA[pt-br]]></code>
                        <url>https://blog.jetbrains.com/pt-br/pycharm/2026/03/what-s-new-in-pycharm-2026-1/</url>
                    </language>
                                    <language>
                        <code><![CDATA[ko]]></code>
                        <url>https://blog.jetbrains.com/ko/pycharm/2026/03/what-s-new-in-pycharm-2026-1/</url>
                    </language>
                                    <language>
                        <code><![CDATA[ja]]></code>
                        <url>https://blog.jetbrains.com/ja/pycharm/2026/03/what-s-new-in-pycharm-2026-1/</url>
                    </language>
                                    <language>
                        <code><![CDATA[fr]]></code>
                        <url>https://blog.jetbrains.com/fr/pycharm/2026/03/what-s-new-in-pycharm-2026-1/</url>
                    </language>
                                    <language>
                        <code><![CDATA[es]]></code>
                        <url>https://blog.jetbrains.com/es/pycharm/2026/03/what-s-new-in-pycharm-2026-1/</url>
                    </language>
                                    <language>
                        <code><![CDATA[de]]></code>
                        <url>https://blog.jetbrains.com/de/pycharm/2026/03/what-s-new-in-pycharm-2026-1/</url>
                    </language>
                	</item>
		<item>
		<title>Expanding Our Core Web Development Support in PyCharm 2026.1</title>
		<link>https://blog.jetbrains.com/pycharm/2026/03/expanding-our-core-web-development-support-in-pycharm-2026-1/</link>
		
		<dc:creator><![CDATA[Ilia Afanasiev]]></dc:creator>
		<pubDate>Wed, 25 Mar 2026 15:01:54 +0000</pubDate>
		<featuredImage>https://blog.jetbrains.com/wp-content/uploads/2026/03/PC-social-BlogFeatured-1280x720-1.png</featuredImage>		<category><![CDATA[web-development]]></category>
		<guid isPermaLink="false">https://blog.jetbrains.com/?post_type=pycharm&#038;p=687028</guid>

					<description><![CDATA[With PyCharm 2026.1, our core IDE experience continues to evolve as we’re bringing a broader set of professional-grade web tools to all users for free. Everyone, from beginners to backend-first developers, is getting access to a substantial set of JavaScript, TypeScript, and CSS features that were previously only available with a Pro subscription. React, JavaScript, [&#8230;]]]></description>
										<content:encoded><![CDATA[
<p>With PyCharm 2026.1, our core IDE experience continues to evolve as we’re bringing a broader set of professional-grade web tools to all users for free. Everyone, from beginners to backend-first developers, is getting access to a substantial set of JavaScript, TypeScript, and CSS features that were previously only available with a Pro subscription.</p>



<h3 class="wp-block-heading">React, JavaScript, TypeScript, and CSS support</h3>



<p>Leverage a comprehensive set of editing and formatting tools for modern web languages within PyCharm, including:</p>



<ul>
<li><strong>Basic React support </strong>with code completion, component and attribute navigation, and React component and prop rename refactorings.</li>



<li><strong>Advanced import management</strong>:
<ul>
<li>Enjoy automatic JavaScript and TypeScript imports as you work.</li>



<li>Merge or remove unnecessary references via the <em>Optimize imports</em> feature.</li>



<li>Get required imports automatically when you paste code into the editor.</li>
</ul>
</li>
</ul>



<ul>
<li><strong>Enhanced styling</strong>: Access CSS-tailored code completion, inspections, and quick-fixes, and view any changes in real time via the built-in web preview.</li>



<li><strong>Smart editor behavior:</strong> Utilize smart keys, code vision inlay hints, and postfix code completions designed for web development.</li>
</ul>



<h3 class="wp-block-heading">Navigation and code intelligence</h3>



<p>Finding your way around web projects is now even more efficient with tools that allow for:</p>



<ul>
<li><strong>Pro-grade navigation:</strong> Use dedicated gutter icons for <em>Jump to&#8230;</em> actions, recursive calls, and TypeScript source mapping.</li>



<li><strong>Core web refactorings:</strong> Perform essential code changes with reliable <em>Rename</em> refactorings and actions (<em>Introduce variable</em>, <em>Change signature</em>, <em>Move members</em>, and more<em>)</em>.</li>



<li><strong>Quality control</strong>: Maintain high code standards with professional-grade inspections, intentions, and quick-fixes.</li>



<li><strong>Code cleanup</strong>: Identify redundant code blocks through JavaScript and TypeScript duplicate detection.</li>
</ul>



<h3 class="wp-block-heading">Frameworks and integrated tools</h3>



<p>With essential support added for some of the most popular frontend frameworks and tools, you get access to:</p>



<ul>
<li><strong>Project initialization</strong>: Create new web projects quickly using the built-in Vite generator.</li>



<li><strong>Standard tooling</strong>: Standardize code quality with integrated support for Prettier, ESLint, TSLint, and StyleLint.</li>



<li><strong>Script management</strong>: Discover and execute NPM scripts directly from your <code>package.json</code>.</li>



<li><strong>Security</strong>: Check project dependencies for security vulnerabilities.</li>
</ul>



<p>We’re excited to bring these tried-and-true features to the core PyCharm experience for free! We’re certain these tools will help beginners, students, and hobbyists tackle real-world tasks within a single, powerful IDE. Best of all, core PyCharm can be used for both commercial and non-commercial projects, so it will grow with you as you move from learning to professional development.</p>
]]></content:encoded>
					
		
		
		                    <language>
                        <code><![CDATA[zh-hans]]></code>
                        <url>https://blog.jetbrains.com/zh-hans/pycharm/2026/03/expanding-our-core-web-development-support-in-pycharm-2026-1/</url>
                    </language>
                                    <language>
                        <code><![CDATA[pt-br]]></code>
                        <url>https://blog.jetbrains.com/pt-br/pycharm/2026/03/expanding-our-core-web-development-support-in-pycharm-2026-1/</url>
                    </language>
                                    <language>
                        <code><![CDATA[ko]]></code>
                        <url>https://blog.jetbrains.com/ko/pycharm/2026/03/expanding-our-core-web-development-support-in-pycharm-2026-1/</url>
                    </language>
                                    <language>
                        <code><![CDATA[ja]]></code>
                        <url>https://blog.jetbrains.com/ja/pycharm/2026/03/expanding-our-core-web-development-support-in-pycharm-2026-1/</url>
                    </language>
                                    <language>
                        <code><![CDATA[fr]]></code>
                        <url>https://blog.jetbrains.com/fr/pycharm/2026/03/expanding-our-core-web-development-support-in-pycharm-2026-1/</url>
                    </language>
                                    <language>
                        <code><![CDATA[es]]></code>
                        <url>https://blog.jetbrains.com/es/pycharm/2026/03/expanding-our-core-web-development-support-in-pycharm-2026-1/</url>
                    </language>
                	</item>
		<item>
		<title>OpenAI Acquires Astral: What It Means for PyCharm Users</title>
		<link>https://blog.jetbrains.com/pycharm/2026/03/openai-acquires-astral-what-it-means-for-pycharm-users/</link>
		
		<dc:creator><![CDATA[Ilia Afanasiev]]></dc:creator>
		<pubDate>Mon, 23 Mar 2026 16:04:34 +0000</pubDate>
		<featuredImage>https://blog.jetbrains.com/wp-content/uploads/2026/03/PC-social-BlogFeatured-1280x720-1-2.png</featuredImage>		<guid isPermaLink="false">https://blog.jetbrains.com/?post_type=pycharm&#038;p=691821</guid>

					<description><![CDATA[On March 19, OpenAI announced that it would acquire Astral, the company behind uv, Ruff, and ty. The Astral team, led by founder Charlie Marsh, will join OpenAI&#8217;s Codex team. The deal is subject to regulatory approval. First and foremost: congratulations to Charlie Marsh and the entire Astral team. They shipped some of the most [&#8230;]]]></description>
										<content:encoded><![CDATA[
<p>On March 19, OpenAI announced that it would <a href="https://openai.com/index/openai-to-acquire-astral/" target="_blank" rel="noopener">acquire Astral</a>, the company behind uv, Ruff, and ty. The Astral team, led by founder Charlie Marsh, will <a href="https://astral.sh/blog/openai" target="_blank" rel="noopener">join OpenAI&#8217;s Codex team</a>. The deal is subject to regulatory approval.</p>



<p>First and foremost: congratulations to Charlie Marsh and the entire Astral team. They shipped some of the most beloved tools in the Python ecosystem and raised the bar for what developer tooling can be. This acquisition is a reflection of the impact they&#8217;ve had.</p>



<p>This is big news for the Python ecosystem, and it matters to us at JetBrains. Here&#8217;s our perspective.</p>



<h2 class="wp-block-heading">What Astral built</h2>



<p>In just two years, Astral transformed Python tooling. Their tools now see hundreds of millions of downloads every month, and for good reason:</p>



<ul>
<li><strong>uv</strong> is a blazing-fast package and environment manager that unifies functionality from pip, venv, pyenv, pipx, and more into a single tool. With around 124 million monthly downloads, it has quickly become the default choice for many Python developers.</li>



<li><strong>Ruff</strong> is an extremely fast linter and formatter, written in Rust. For many teams it has replaced flake8, isort, and black entirely.</li>



<li><strong>ty</strong> is a new type checker for Python. It&#8217;s still early, but it&#8217;s showing promise, and we&#8217;re already working on PyCharm support for it.</li>
</ul>



<p>This is foundational infrastructure that millions of developers rely on every day. We&#8217;ve integrated both Ruff and uv into PyCharm because they make Python development substantially better.</p>



<h2 class="wp-block-heading">The risks are real, but manageable</h2>



<p>Change always carries risk, and acquisitions are no exception. The main concern here is straightforward: if Astral&#8217;s engineers get reassigned to OpenAI&#8217;s more commercial priorities, these tools could stagnate over time.</p>



<p>The good news is that Astral&#8217;s tools are open-source under permissive licenses. The community can fork them if it ever comes to that. As Armin Ronacher <a href="https://lucumr.pocoo.org/2024/8/21/harvest-season/" target="_blank" rel="noopener">has noted</a>, uv is &#8220;very forkable and maintainable.&#8221; There’s no possible future where these tools go <em>backwards.</em></p>



<p>Both OpenAI and Astral have committed to continued open-source development. We take them at their word, and we hope for the best.</p>



<h2 class="wp-block-heading">Our commitment hasn&#8217;t changed</h2>



<p>JetBrains already has great working relationships with both the Astral and the Codex teams. We&#8217;ve been integrating Ruff and uv into PyCharm, and we will continue to do so. We’ve submitted some upstream improvements to ty. Regardless of who owns these tools, our commitment to supporting the best Python tooling for our users stays the same. We&#8217;ll keep working with whoever maintains them.</p>



<p>The Python ecosystem is stronger because of the work Astral has done. We hope this acquisition amplifies that work, not diminishes it. We&#8217;ll be watching closely, and we&#8217;ll keep building the best possible experience for Python developers in PyCharm.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Python Unplugged on PyTV Recap</title>
		<link>https://blog.jetbrains.com/pycharm/2026/03/python-unplugged-on-pytv-recap/</link>
		
		<dc:creator><![CDATA[Will Vincent]]></dc:creator>
		<pubDate>Fri, 13 Mar 2026 13:05:10 +0000</pubDate>
		<featuredImage>https://blog.jetbrains.com/wp-content/uploads/2026/03/TechRadar_Email_Half-page_600-300-1.png</featuredImage>		<product ><![CDATA[pycharm]]></product>
		<category><![CDATA[livestreams]]></category>
		<category><![CDATA[news]]></category>
		<category><![CDATA[amsterdam]]></category>
		<category><![CDATA[conferences]]></category>
		<category><![CDATA[python]]></category>
		<guid isPermaLink="false">https://blog.jetbrains.com/?post_type=pycharm&#038;p=687433</guid>

					<description><![CDATA[Last week marked the fruition of almost a year of hard work by the entire PyCharm team. On March 4th, 2026, we hosted Python Unplugged on PyTV, our first-ever community conference, a 90s music-inspired online event for the Python community. The PyCharm team is a fixture at Python conferences globally, such as PyCon US [&#8230;]]]></description>
										<content:encoded><![CDATA[
<p>Last week marked the fruition of almost a year of hard work by the entire <a href="https://www.jetbrains.com/pycharm/" target="_blank" rel="noopener">PyCharm</a> team. On March 4th, 2026, we hosted <a href="https://www.youtube.com/live/qKkyBhXIJJU?si=ilEeq1iRXquQhssj" target="_blank" rel="noopener">Python Unplugged on PyTV</a>, our first-ever community conference, a 90s music-inspired online event for the Python community.</p>



<figure class="wp-block-embed is-type-video is-provider-youtube wp-block-embed-youtube wp-embed-aspect-16-9 wp-has-aspect-ratio"><div class="wp-block-embed__wrapper">
<iframe loading="lazy" title="Python Unplugged on PyTV – Free Online Python Conference" src="https://www.youtube.com/embed/qKkyBhXIJJU?feature=oembed" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" referrerpolicy="strict-origin-when-cross-origin" allowfullscreen></iframe>
</div><figcaption class="wp-element-caption"><span style="font-family: Roboto, Arial, sans-serif;font-weight: 700">Python Unplugged on PyTV – Free Online Python Conference</span></figcaption></figure>



<p>The PyCharm team is a fixture at Python conferences globally, such as <a href="https://us.pycon.org/2026/" target="_blank" rel="noopener">PyCon US</a> and <a href="https://ep2026.europython.eu/" target="_blank" rel="noopener">EuroPython</a>, but we recognize that while attending a conference can be life-changing, the costs involved put it out of reach for many Pythonistas.</p>



<p>We wanted to recreate the entire Python conference experience in a digital format, complete with live talks, hallway tracks, and Q&amp;A sessions, so anyone, anywhere in the world, could join in and participate.</p>



<p>And we did it! Superstar speakers from across the Python community joined us in our studio in Amsterdam, the Netherlands &#8211; the country where Python was born. Some of them traveled for over 10 hours, and one even joined with their newborn baby! Travis Oliphant, of NumPy and SciPy fame, was ultimately unable to join us in person, but he kindly pre-recorded a wonderful talk and participated in a live Q&amp;A after it, despite it being very early morning in his time zone.</p>



<figure class="wp-block-image size-full"><img style="width:100% !important; height:auto !important; max-width:100% !important;" decoding="async" loading="lazy" src="https://blog.jetbrains.com/wp-content/uploads/2026/03/IMG_7830-1.jpg" alt="" class="wp-image-687545"/><figcaption class="wp-element-caption"><em>Cheuk Ting Ho, Jodie Burchell, Valerie Andrianova</em></figcaption></figure>



<p>The PyCharm team is extremely grateful for the community&#8217;s support in making this happen.</p>



<h2 class="wp-block-heading">The event</h2>



<p>We <a href="https://www.youtube.com/live/qKkyBhXIJJU?si=fd5uQvLnpEL2P9lU" target="_blank" rel="noopener">livestreamed the entire event</a> from 11am to 6:30pm CET/CEST, almost seven and a half hours of content, featuring 15 speakers, a PyLadies panel, and an ongoing quiz with prizes. Topics covered the future of Python, AI, data science, web development, and more.</p>



<p>Here is the complete list of speakers and timestamped links to their talks:</p>



<ul>
<li><a href="https://www.youtube.com/watch?v=qKkyBhXIJJU&amp;t=24031s" target="_blank" rel="noopener">Carol Willing </a>&#8211; JupyterLab Core Developer</li>



<li><a href="https://www.youtube.com/watch?v=qKkyBhXIJJU&amp;t=673s" target="_blank" rel="noopener">Deb Nicholson</a> &#8211; Executive Director, Python Software Foundation</li>



<li><a href="https://www.youtube.com/watch?v=qKkyBhXIJJU&amp;t=6604s" target="_blank" rel="noopener">Ritchie Vink</a> &#8211; Creator of Polars</li>



<li><a href="https://www.youtube.com/watch?v=qKkyBhXIJJU&amp;t=16430s" target="_blank" rel="noopener">Travis Oliphant</a> &#8211; Creator of NumPy</li>



<li><a href="https://www.youtube.com/watch?v=qKkyBhXIJJU&amp;t=12480s" target="_blank" rel="noopener">Sarah Boyce</a> &#8211; Django Fellow&nbsp;</li>



<li><a href="https://www.youtube.com/watch?v=qKkyBhXIJJU&amp;t=14389s" target="_blank" rel="noopener">Sheena O’Connell</a> &#8211; Python Software Foundation Board Member&nbsp;</li>



<li><a href="https://www.youtube.com/watch?v=qKkyBhXIJJU&amp;t=18360s" target="_blank" rel="noopener">Marlene Mhangami</a> &#8211; Senior Developer Advocate at Microsoft&nbsp;</li>



<li><a href="https://www.youtube.com/watch?v=qKkyBhXIJJU&amp;t=10591s" target="_blank" rel="noopener">Carlton Gibson</a> &#8211; Creator of multiple open-source projects in the Django ecosystem</li>



<li><a href="https://www.youtube.com/watch?v=qKkyBhXIJJU&amp;t=22179s" target="_blank" rel="noopener">Tuana Çelik </a>&#8211; Developer Relations Engineer at LlamaIndex</li>



<li><a href="https://www.youtube.com/watch?v=qKkyBhXIJJU&amp;t=20289s" target="_blank" rel="noopener">Merve Noyan</a> &#8211; Machine Learning Engineer at Hugging Face&nbsp;</li>



<li><a href="https://www.youtube.com/watch?v=qKkyBhXIJJU&amp;t=4750s" target="_blank" rel="noopener">Paul Everitt</a> &#8211; Developer Advocate at JetBrains</li>



<li><a href="https://www.youtube.com/watch?v=qKkyBhXIJJU&amp;t=2568s" target="_blank" rel="noopener">Mark Smith</a> &#8211; Head of Python Ecosystem at JetBrains</li>



<li><a href="https://www.youtube.com/watch?v=qKkyBhXIJJU&amp;t=8630s" target="_blank" rel="noopener">Georgi Ker</a> &#8211; Director and Fellow of the Python Software Foundation</li>



<li><a href="https://www.youtube.com/watch?v=qKkyBhXIJJU&amp;t=8630s" target="_blank" rel="noopener">Una Galyeva</a> &#8211; Head of AI at Geobear Global and PyLadies Amsterdam organizer</li>



<li><a href="https://www.youtube.com/watch?v=qKkyBhXIJJU&amp;t=8630s" target="_blank" rel="noopener">Jessica Greene</a> &#8211; Senior Machine Learning Engineer at Ecosia</li>
</ul>






<figure class="wp-block-image size-full"><img style="width:100% !important; height:auto !important; max-width:100% !important;" decoding="async" loading="lazy" src="https://blog.jetbrains.com/wp-content/uploads/2026/03/IMG_7801.jpg" alt="" class="wp-image-687457"/><figcaption class="wp-element-caption">The studio room with presenter&#8217;s desk and Q&amp;A table.</figcaption></figure>



<figure class="wp-block-image size-full"><img style="width:100% !important; height:auto !important; max-width:100% !important;" decoding="async" loading="lazy" src="https://blog.jetbrains.com/wp-content/uploads/2026/03/IMG_7779.jpg" alt="" class="wp-image-687468"/><figcaption class="wp-element-caption"><em>Production meeting the day before the event</em></figcaption></figure>



<p>We spent the afternoon doing final checks and a run-through with the studio team at <a href="https://vixylive.com/" target="_blank" rel="noopener">Vixy Live</a>. They were very professional and patient with us as we were working in a studio for the first time. With their help, we were confident that the event the next day would go smoothly.</p>






<h2 class="wp-block-heading">Livestream day</h2>



<p>On the day of the livestream, we arrived early to get our makeup done. The makeup artists were absolute pros, and we all looked great on camera. One of our speakers, Carol, joked that she now looks 20 years younger! The hosts, Jodie, Will, and Cheuk, were decked out in ’90s fashion and vibes.</p>



<figure class="wp-block-image size-full"><img style="width:100% !important; height:auto !important; max-width:100% !important;" decoding="async" loading="lazy" src="https://blog.jetbrains.com/wp-content/uploads/2026/03/image5.jpg" alt="" class="wp-image-687479"/><figcaption class="wp-element-caption"><em>Python Team Lead Jodie Burchell bringing the 90s back</em></figcaption></figure>



<p>We also had swag designed by our incredible marketing team, including t-shirts, stickers, posters, and tote bags.</p>



<figure class="wp-block-image size-full"><img style="width:100% !important; height:auto !important; max-width:100% !important;" decoding="async" loading="lazy" src="https://blog.jetbrains.com/wp-content/uploads/2026/03/IMG_7814.jpg" alt="" class="wp-image-687490"/><figcaption class="wp-element-caption"><em>PyTV Stickers for all participants</em></figcaption></figure>



<figure class="wp-block-image size-full"><img style="width:100% !important; height:auto !important; max-width:100% !important;" decoding="async" loading="lazy" src="https://blog.jetbrains.com/wp-content/uploads/2026/03/IMG_7815.jpg" alt="" class="wp-image-687501"/><figcaption class="wp-element-caption"><em>PyTV Totebags</em></figcaption></figure>






<figure class="wp-block-image size-full"><img style="width:100% !important; height:auto !important; max-width:100% !important;" decoding="async" loading="lazy" src="https://blog.jetbrains.com/wp-content/uploads/2026/03/IMG_7820.jpg" alt="" class="wp-image-687512"/><figcaption class="wp-element-caption"><em>PyTV posters</em></figcaption></figure>






<h2 class="wp-block-heading">Python content for everyone</h2>



<p>After a brief opening introducing the conference and the event Discord, we began with a series of talks focused on the community, learning Python, and other hot Python topics. We also had two panels, both absolutely inspiring: one on the role of AI in open source and another featuring prominent members of PyLadies.</p>



<p>Following our first block of speakers, we moved on to web development-focused talks from key people involved with the Django framework. We then had a series of talks from experts across the data science and AI world, including speakers from Microsoft, Hugging Face, and LlamaIndex, who gave us up-to-date insights into open-source AI and agent-based approaches. We ended with a talk by Carol Willing, one of the most respected figures in the Python community.</p>



<p>Throughout the day, we ran a quiz to test the audience&#8217;s knowledge of Python and its community. Since many audience members were still learning Python, we hope the quiz taught them some fun facts along the way.</p>



<figure class="wp-block-image size-full"><img style="width:100% !important; height:auto !important; max-width:100% !important;" decoding="async" loading="lazy" src="https://blog.jetbrains.com/wp-content/uploads/2026/03/image7-2.png" alt="" class="wp-image-687523"/><figcaption class="wp-element-caption"><em>First of 8 questions on the Python ecosystem</em></figcaption></figure>






<figure class="wp-block-image size-full"><img style="width:100% !important; height:auto !important; max-width:100% !important;" decoding="async" loading="lazy" src="https://blog.jetbrains.com/wp-content/uploads/2026/03/image10.jpg" alt="" class="wp-image-687534"/><figcaption class="wp-element-caption"><em>Sarah Boyce, Will Vincent, Sheena O’Connell, Carlton Gibson, Marlene Mhangami</em></figcaption></figure>



<h2 class="wp-block-heading">Next year?</h2>



<p>Looking at the numbers, we had more than 5,500 people join us during the livestream, with most of them watching at least one talk. As of this writing, another 8,000 people have watched the event recording.</p>



<p>We&#8217;d love to do this event again next year. If you have suggestions for speakers, topics, swag, or anything else, please leave them in the comments!</p>


    <div class="buttons">
        <div class="buttons__row">
                                                <a href="https://www.youtube.com/live/qKkyBhXIJJU" class="btn" target="" rel="noopener">Watch now</a>
                                                    </div>
    </div>







]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Cursor Joined the ACP Registry and Is Now Live in Your JetBrains IDE</title>
		<link>https://blog.jetbrains.com/ai/2026/03/cursor-joined-the-acp-registry-and-is-now-live-in-your-jetbrains-ide/</link>
		
		<dc:creator><![CDATA[Jan-Niklas Wortmann]]></dc:creator>
		<pubDate>Wed, 04 Mar 2026 15:28:01 +0000</pubDate>
		<featuredImage>https://blog.jetbrains.com/wp-content/uploads/2026/03/IDEs-social-BlogFeatured-1280x720-1-2.png</featuredImage>		<product ><![CDATA[idea]]></product>
		<product ><![CDATA[pycharm]]></product>
		<product ><![CDATA[rust]]></product>
		<category><![CDATA[ai-assistant]]></category>
		<category><![CDATA[news]]></category>
		<category><![CDATA[acp]]></category>
		<guid isPermaLink="false">https://blog.jetbrains.com/?post_type=ai&#038;p=685326</guid>

					<description><![CDATA[Cursor is now available as an AI agent inside JetBrains IDEs through the Agent Client Protocol. Select it from the agent picker, and it has full access to your project. If you&#8217;ve spent any time in the AI coding space, you already know Cursor. It has been one of the most requested additions to the [&#8230;]]]></description>
										<content:encoded><![CDATA[
<p>Cursor is now available as an AI agent inside JetBrains IDEs through the <a href="https://www.jetbrains.com/acp/" target="_blank" rel="noopener">Agent Client Protocol</a>. Select it from the agent picker, and it has full access to your project.</p>



<p>If you&#8217;ve spent any time in the AI coding space, you already know <a href="https://cursor.com/" target="_blank" rel="noopener">Cursor</a>. It has been one of the most requested additions to the ACP Registry.</p>



<h2 class="wp-block-heading">What you get</h2>



<p>Cursor is known for its AI-native, agentic workflows. JetBrains IDEs are valued for deep code intelligence – refactoring, debugging, code quality checks, and the tooling professionals rely on at scale. ACP brings the two together.</p>



<p>You can now use Cursor&#8217;s agentic capabilities directly inside your JetBrains IDE – within the workflows and features you already use.&nbsp;</p>



<h2 class="wp-block-heading">A growing open ecosystem</h2>



<p>Cursor joins a growing list of agents available through ACP in JetBrains IDEs. Every new addition to the ACP Registry means you have more choice – while still working inside the IDE you already rely on. You get access to frontier models from major providers, including OpenAI, Anthropic, Google, and now also Cursor.</p>



<p>This is part of our open ecosystem strategy. Plug in the agents you want and work in the IDE you love – without getting locked into a single solution.</p>



<blockquote class="wp-block-quote">
<p>Cursor is focused on building the best way to build software with AI. By integrating Cursor with JetBrains IDEs, we&#8217;re excited to provide teams with powerful agentic capabilities in the environments where they&#8217;re already working.</p>
<cite>– Jordan Topoleski, COO at Cursor</cite></blockquote>



<h2 class="wp-block-heading">Get started</h2>



<p>You need version 2025.3.2 or later of your JetBrains IDE with the <em>AI Assistant</em> plugin enabled. From there, open the agent selector, select <em>Install from ACP Registry…</em>, install Cursor, and start working. You <strong>don’t need a JetBrains AI subscription</strong> to use Cursor as an AI agent.</p>



<figure class="wp-block-embed is-type-video is-provider-youtube wp-block-embed-youtube wp-embed-aspect-16-9 wp-has-aspect-ratio"><div class="wp-block-embed__wrapper">
<iframe loading="lazy" title="Cursor is now live in your JetBrains IDE through ACP" src="https://www.youtube.com/embed/-AFODqVoe8s?feature=oembed" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" referrerpolicy="strict-origin-when-cross-origin" allowfullscreen></iframe>
</div></figure>



<p>The ACP Registry keeps growing, and many agents have already joined it – with more on the way. Try it today with Cursor and experience agent-driven development inside your JetBrains IDE. For more information about the Agent Client Protocol, see <a href="https://blog.jetbrains.com/ai/2025/12/bring-your-own-ai-agent-to-jetbrains-ides/">our original announcement</a> and the <a href="https://blog.jetbrains.com/ai/2026/01/acp-agent-registry/">blog post on the ACP Agent Registry support</a>.</p>
]]></content:encoded>
					
		
		
		                    <language>
                        <code><![CDATA[ko]]></code>
                        <url>https://blog.jetbrains.com/ko/ai/2026/03/cursor-joined-the-acp-registry-and-is-now-live-in-your-jetbrains-ide/</url>
                    </language>
                                    <language>
                        <code><![CDATA[fr]]></code>
                        <url>https://blog.jetbrains.com/fr/ai/2026/03/cursor-joined-the-acp-registry-and-is-now-live-in-your-jetbrains-ide/</url>
                    </language>
                	</item>
		<item>
		<title>LangChain Python Tutorial: A Complete Guide for 2026</title>
		<link>https://blog.jetbrains.com/pycharm/2026/02/langchain-tutorial-2026/</link>
		
		<dc:creator><![CDATA[Cheuk Ting Ho]]></dc:creator>
		<pubDate>Thu, 19 Feb 2026 10:40:15 +0000</pubDate>
		<featuredImage>https://blog.jetbrains.com/wp-content/uploads/2026/02/PC-social-BlogFeatured-1280x720-1.png</featuredImage>		<category><![CDATA[data-science]]></category>
		<category><![CDATA[tutorials]]></category>
		<category><![CDATA[ai]]></category>
		<category><![CDATA[ai-agents]]></category>
		<category><![CDATA[chatbots]]></category>
		<category><![CDATA[langchain]]></category>
		<guid isPermaLink="false">https://blog.jetbrains.com/?post_type=pycharm&#038;p=681664</guid>

					<description><![CDATA[If you’ve read the blog post How to Build Chatbots With LangChain, you may want to know more about LangChain. This blog post will dive deeper into what LangChain offers and guide you through a few more real-world use cases. And even if you haven’t read the first post, you might still find the info [&#8230;]]]></description>
										<content:encoded><![CDATA[
<figure class="wp-block-image size-full"><img style="width:100% !important; height:auto !important; max-width:100% !important;" decoding="async" loading="lazy" src="https://blog.jetbrains.com/wp-content/uploads/2026/02/PC-social-BlogFeatured-1280x720-1.png" alt="LangChain Python Tutorial" class="wp-image-682317"/></figure>



<p>If you’ve read the blog post <a href="https://blog.jetbrains.com/pycharm/2024/08/how-to-build-chatbots-with-langchain/"><em>How to Build Chatbots With LangChain</em></a>, you may want to know more about LangChain. This blog post will dive deeper into what LangChain offers and guide you through a few more real-world use cases. And even if you haven’t read the first post, you might still find the info in this one helpful for building your next AI agent.</p>



<h2 class="wp-block-heading">LangChain fundamentals</h2>



<p>Let’s have a look at what LangChain is. LangChain provides a standard framework for building AI agents powered by LLMs, such as those offered by OpenAI, Anthropic, and Google, which makes it one of the easiest ways to get started. It supports most of the commonly used LLMs on the market today.</p>



<p>LangChain is a high-level tool built on LangGraph, which provides a low-level framework for orchestrating the agent and its runtime and is aimed at more advanced users. Beginners, and anyone who only needs a simple agent, are better off with LangChain.</p>



<p>We’ll start by taking a look at several important components in a LangChain agent build.</p>



<h3 class="wp-block-heading">Agents</h3>



<p>Agents are what we are building. They combine LLMs with tools to create systems that can reason about tasks, decide which tools to use for which steps, analyze intermediate results, and work toward solutions iteratively.</p>



<figure class="wp-block-image size-full"><img style="width:100% !important; height:auto !important; max-width:100% !important;" decoding="async" loading="lazy" src="https://blog.jetbrains.com/wp-content/uploads/2026/02/image-27.png" alt="" class="wp-image-681665"/></figure>



<p>Creating an agent is as simple as calling the <code>create_agent</code> function with a few parameters:</p>



<pre class="EnlighterJSRAW" data-enlighter-language="python" data-enlighter-theme="" data-enlighter-highlight="" data-enlighter-linenumbers="" data-enlighter-lineoffset="" data-enlighter-title="" data-enlighter-group="">from langchain.agents import create_agent

agent = create_agent(
    "gpt-5",
    tools=tools,
)</pre>



<p>In this example, the LLM used is GPT-5 by OpenAI. In most cases, the provider of the LLM can be inferred. To see a list of all supported providers, head over <a href="https://reference.langchain.com/python/langchain/models/#langchain.chat_models.init_chat_model(model)" target="_blank" rel="noopener">here</a>.</p>
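<p>You can also name the provider explicitly with a <code>"provider:model"</code> string, e.g. <code>init_chat_model("openai:gpt-5")</code>. The sketch below is purely illustrative and independent of LangChain – the helper name and prefix table are made up – but it shows the idea behind this kind of provider inference:</p>

```python
# Illustrative only: a toy version of "infer the provider from the model name".
# The helper and prefix table are hypothetical, not LangChain internals.
KNOWN_PREFIXES = {
    "gpt": "openai",
    "claude": "anthropic",
    "gemini": "google_genai",
}

def resolve_provider(model: str) -> tuple[str, str]:
    """Return (provider, model_name) for a model identifier."""
    if ":" in model:
        # An explicit "provider:model" string needs no inference.
        provider, name = model.split(":", 1)
        return provider, name
    for prefix, provider in KNOWN_PREFIXES.items():
        if model.startswith(prefix):
            return provider, model
    raise ValueError(f"Cannot infer a provider for {model!r}")

print(resolve_provider("gpt-5"))  # ('openai', 'gpt-5')
print(resolve_provider("anthropic:claude-3-5-sonnet-20241022"))
```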



<h3 class="wp-block-heading">LangChain Models: Static and Dynamic</h3>



<p>There are two types of agent models that you can build: static and dynamic. Static models, as the name suggests, are straightforward and more common. The agent is configured in advance during creation and remains unchanged during execution.</p>



<pre class="EnlighterJSRAW" data-enlighter-language="python" data-enlighter-theme="" data-enlighter-highlight="" data-enlighter-linenumbers="" data-enlighter-lineoffset="" data-enlighter-title="" data-enlighter-group="">import os

from langchain.chat_models import init_chat_model

os.environ["OPENAI_API_KEY"] = "sk-..."

model = init_chat_model("gpt-5")
print(model.invoke("What is PyCharm?"))</pre>



<p>Dynamic models allow you to build an agent that can switch models during runtime based on custom logic. Different models can then be picked based on the current state and context. For example, we can use ModelFallbackMiddleware (described in the <em>Middleware</em> section below) to fall back to a backup model in case the default one fails.</p>



<pre class="EnlighterJSRAW" data-enlighter-language="python" data-enlighter-theme="" data-enlighter-highlight="" data-enlighter-linenumbers="" data-enlighter-lineoffset="" data-enlighter-title="" data-enlighter-group="">from langchain.agents import create_agent
from langchain.agents.middleware import ModelFallbackMiddleware

agent = create_agent(
    model="gpt-4o",
    tools=[],
    middleware=[
        ModelFallbackMiddleware(
            "gpt-4o-mini",
            "claude-3-5-sonnet-20241022",
        ),
    ],
)</pre>



<h3 class="wp-block-heading">Tools</h3>



<p>Tools are a core part of AI agents: they are what make agents effective at tasks that involve more than just text output, which is a fundamental difference between an agent and a bare LLM. Tools allow agents to interact with external systems – such as APIs, databases, or file systems. Without tools, an agent could only produce text, with no way of performing actions or iteratively working its way toward a result.</p>



<p>LangChain provides decorators for systematically creating tools for your agent, making the whole process more organized and easier to maintain. Here are a couple of examples:</p>



<p>A basic tool:</p>



<pre class="EnlighterJSRAW" data-enlighter-language="python" data-enlighter-theme="" data-enlighter-highlight="" data-enlighter-linenumbers="" data-enlighter-lineoffset="" data-enlighter-title="" data-enlighter-group="">from langchain.tools import tool

@tool
def search_db(query: str, limit: int = 10) -> str:
    """Search the customer database for records matching the query."""
    ...
    return f"Found {limit} results for '{query}'"</pre>



<p>A tool with a custom name:</p>



<pre class="EnlighterJSRAW" data-enlighter-language="python" data-enlighter-theme="" data-enlighter-highlight="" data-enlighter-linenumbers="" data-enlighter-lineoffset="" data-enlighter-title="" data-enlighter-group="">@tool("pycharm_docs_search", return_direct=False)
def pycharm_docs_search(q: str) -> str:
    """Search the local FAISS index of JetBrains PyCharm documentation and return relevant passages."""
    ...
    docs = retriever.get_relevant_documents(q)
    return format_docs(docs)</pre>



<h3 class="wp-block-heading">Middleware</h3>



<p>Middleware provides ways to define the logic of your agent and customize its behavior. For example, there is middleware that can monitor the agent during runtime, assist with prompting and tool selection, or support advanced use cases like guardrails.</p>



<p>Here are a few examples of built-in middleware. For the full list, please refer to the <a href="https://docs.langchain.com/oss/python/langchain/middleware/built-in#provider-agnostic-middleware" target="_blank" rel="noopener">LangChain middleware documentation</a>.</p>



<figure class="wp-block-table"><table><tbody><tr><td><strong>Middleware</strong></td><td><strong>Description</strong></td></tr><tr><td>Summarization</td><td>Automatically summarize the conversation history when approaching token limits.</td></tr><tr><td>Human-in-the-loop</td><td>Pause execution for human approval of tool calls.</td></tr><tr><td>Context editing</td><td>Manage conversation context by trimming or clearing tool uses.</td></tr><tr><td>PII detection</td><td>Detect and handle personally identifiable information (PII).</td></tr></tbody></table></figure>
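<p>Conceptually, middleware wraps the agent run in hooks that can inspect or modify the state before and after each step – and even short-circuit the run entirely. Here is a plain-Python sketch of that idea; the hook functions below are illustrative, not LangChain's API:</p>

<pre class="EnlighterJSRAW" data-enlighter-language="python" data-enlighter-theme="" data-enlighter-highlight="" data-enlighter-linenumbers="" data-enlighter-lineoffset="" data-enlighter-title="" data-enlighter-group="">
# Plain-Python sketch of middleware hooks: each hook inspects the state
# before the agent runs and may short-circuit execution by returning a
# response of its own. Names are illustrative, not LangChain's API.
def redact_pii(state):
    state["input"] = state["input"].replace("123-45-6789", "[SSN]")
    return None  # None means "continue"

def block_empty(state):
    if not state["input"].strip():
        return {"output": "Please provide a question."}
    return None

def run_with_middleware(state, hooks, agent):
    for hook in hooks:
        early = hook(state)
        if early is not None:
            return early  # a hook short-circuited the run
    return agent(state)

agent = lambda state: {"output": f"Answering: {state['input']}"}
result = run_with_middleware({"input": "My SSN is 123-45-6789"},
                             [redact_pii, block_empty], agent)
print(result["output"])
</pre>

<p>LangChain's built-in middleware follows the same pattern, with hooks running before and after the model and tool calls.</p>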



<h2 class="wp-block-heading">Real-world LangChain use cases</h2>



<p>LangChain use cases span a wide range of fields. Common examples include:</p>



<ol>
<li><a href="https://blog.jetbrains.com/pycharm/2026/02/langchain-tutorial-2026/#ai-powered-chatbots" data-type="link" data-id="https://blog.jetbrains.com/pycharm/2026/02/langchain-tutorial-2026/#ai-powered-chatbots">AI-powered chatbots</a></li>



<li><a href="https://blog.jetbrains.com/pycharm/2026/02/langchain-tutorial-2026/#document-question-answering-systems" data-type="link" data-id="https://blog.jetbrains.com/pycharm/2026/02/langchain-tutorial-2026/#document-question-answering-systems">Document question answering systems</a></li>



<li><a href="https://blog.jetbrains.com/pycharm/2026/02/langchain-tutorial-2026/#content-generation-tools" data-type="link" data-id="https://blog.jetbrains.com/pycharm/2026/02/langchain-tutorial-2026/#content-generation-tools">Content generation tools</a></li>
</ol>



<h3 class="wp-block-heading" id="ai-powered-chatbots">AI-powered chatbots</h3>



<p>When we think of AI agents, we often think of chatbots first. If you’ve read the <a href="https://blog.jetbrains.com/pycharm/2024/08/how-to-build-chatbots-with-langchain/"><em>How to Build Chatbots With LangChain</em></a> blog post, then you’re already up to speed about this use case. If not, I highly recommend checking it out.</p>



<h3 class="wp-block-heading" id="document-question-answering-systems">Document question answering systems</h3>



<p>Another real-world use case for LangChain is a document question answering system. For example, companies often have internal documents and manuals that are rather long and unwieldy. A document question answering system provides a quick way for employees to find the info they need within the documents, without having to manually read through each one.</p>
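<p>Under the hood, such a system first splits each document into overlapping chunks, so that every chunk fits the embedding model's input while retaining some surrounding context. Here is a minimal, framework-free sketch of that step – the chunk size and overlap values are arbitrary, and real text splitters (like LangChain's) are more sophisticated:</p>

<pre class="EnlighterJSRAW" data-enlighter-language="python" data-enlighter-theme="" data-enlighter-highlight="" data-enlighter-linenumbers="" data-enlighter-lineoffset="" data-enlighter-title="" data-enlighter-group="">
# Minimal sketch of splitting a document into overlapping chunks before
# indexing. Real text splitters (e.g. LangChain's) are more sophisticated;
# chunk_size and overlap are arbitrary illustrative values.
def chunk_text(text: str, chunk_size: int = 50, overlap: int = 10) -> list:
    words = text.split()
    chunks = []
    step = chunk_size - overlap
    for start in range(0, len(words), step):
        chunks.append(" ".join(words[start:start + chunk_size]))
        if start + chunk_size >= len(words):
            break
    return chunks

doc = " ".join(f"w{i}" for i in range(120))  # a stand-in for a long manual
chunks = chunk_text(doc)
print(len(chunks))  # → 3
</pre>

<p>Each chunk is then embedded and stored in the vector index, which is what the indexing script linked above does for the PyCharm documentation.</p>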



<p>To demonstrate, we’ll create a <a href="https://github.com/Cheukting/langchain-example1/blob/main/src/langchainexample/ingest_pycharm_docs.py" data-type="link" data-id="https://github.com/Cheukting/langchain-example1/blob/main/src/langchainexample/ingest_pycharm_docs.py" target="_blank" rel="noopener">script</a> to index the <a href="https://www.jetbrains.com/help/pycharm/" target="_blank" rel="noopener">PyCharm documentation</a>. Then we’ll create an AI agent that can answer questions based on the documents we indexed. First let’s take a look at our tool:</p>



<pre class="EnlighterJSRAW" data-enlighter-language="python" data-enlighter-theme="" data-enlighter-highlight="" data-enlighter-linenumbers="" data-enlighter-lineoffset="" data-enlighter-title="" data-enlighter-group="">@tool("pycharm_docs_search")
def pycharm_docs_search(q: str) -> str:
    """Search the local FAISS index of JetBrains PyCharm documentation and return relevant passages."""
    # Load vector store and create retriever
    embeddings = OpenAIEmbeddings(
        model=settings.openai_embedding_model, api_key=settings.openai_api_key
    )
    vector_store = FAISS.load_local(
        settings.index_dir, embeddings, allow_dangerous_deserialization=True
    )
    k = 4
    retriever = vector_store.as_retriever(
        search_type="mmr", search_kwargs={"k": k, "fetch_k": max(k * 3, 12)}
    )
    docs = retriever.invoke(q)</pre>



<p>We are using a <a href="https://docs.langchain.com/oss/python/integrations/vectorstores" target="_blank" rel="noopener">vector store</a> to perform a similarity search with embeddings provided by OpenAI. Documents are embedded at indexing time, so when the doc search tool is called, it can embed the query and fetch the most relevant documents by similarity.</p>
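<p>The idea behind the similarity search itself is simple: every document and the query are mapped to vectors, and the documents whose vectors are closest to the query vector (for example, by cosine similarity) are returned. Here is a toy, framework-free illustration with hand-made three-dimensional "embeddings" – real embeddings have hundreds or thousands of dimensions:</p>

<pre class="EnlighterJSRAW" data-enlighter-language="python" data-enlighter-theme="" data-enlighter-highlight="" data-enlighter-linenumbers="" data-enlighter-lineoffset="" data-enlighter-title="" data-enlighter-group="">
import math

# Toy illustration of embedding similarity search: the vectors are
# hand-made 3-D stand-ins for real embeddings.
def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

docs = {
    "Debugger tutorial": [0.9, 0.1, 0.2],
    "Refactoring guide": [0.1, 0.9, 0.3],
    "Breakpoint reference": [0.7, 0.3, 0.3],
}
query = [0.85, 0.15, 0.15]  # pretend this embeds "how do I debug?"

ranked = sorted(docs, key=lambda name: cosine(query, docs[name]), reverse=True)
print(ranked[0])  # → Debugger tutorial
</pre>

<p>FAISS does essentially this, only at scale and with optimized index structures.</p>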



<pre class="EnlighterJSRAW" data-enlighter-language="python" data-enlighter-theme="" data-enlighter-highlight="" data-enlighter-linenumbers="" data-enlighter-lineoffset="" data-enlighter-title="" data-enlighter-group="">def main():
    parser = argparse.ArgumentParser(
        description="Ask PyCharm docs via an Agent (FAISS + GPT-5)"
    )
    parser.add_argument("question", type=str, nargs="+", help="Your question")
    parser.add_argument(
        "--k", type=int, default=6, help="Number of documents to retrieve"
    )
    args = parser.parse_args()
    question = " ".join(args.question)

    system_prompt = """You are a helpful assistant that answers questions about JetBrains PyCharm using the provided tools.
    Always consult the 'pycharm_docs_search' tool to find relevant documentation before answering.
    Cite sources by including the 'Source:' lines from the tool output when useful. If information isn't found, say you don't know."""

    agent = create_agent(
        model=settings.openai_chat_model,
        tools=[pycharm_docs_search],
        system_prompt=system_prompt,
        response_format=ToolStrategy(ResponseFormat),
    )
    result = agent.invoke({"messages": [{"role": "user", "content": question}]})
    print(result["structured_response"].content)</pre>






<p>System prompts are provided to the LLM together with the user’s input prompt. We are using OpenAI as the LLM provider in this example, so we’ll need an API key from them. Head to <a href="https://docs.langchain.com/oss/python/integrations/chat/openai" target="_blank" rel="noopener">this page</a> to check out OpenAI’s integration documentation. When creating the agent, we configure the `model`, `tools`, and `system_prompt` parameters.</p>



<p>For the full scripts and project, see <a href="https://github.com/Cheukting/langchain-example1" target="_blank" rel="noopener">here</a>.</p>



<h3 class="wp-block-heading" id="content-generation-tools">Content generation tools</h3>



<p>Another example is an agent that generates text based on content fetched from other sources. For instance, we might use this when we want to generate marketing content with info taken from documentation. In this example, we’ll pretend we’re doing marketing for Python and creating a newsletter for the latest Python release.</p>



<p>In <a href="https://github.com/Cheukting/langchain-example2/blob/main/app/tools.py" target="_blank" rel="noopener">tools.py</a>, a tool is set up to fetch the relevant information, parse it into a structured format, and extract the necessary information.</p>



<pre class="EnlighterJSRAW" data-enlighter-language="python" data-enlighter-theme="" data-enlighter-highlight="" data-enlighter-linenumbers="" data-enlighter-lineoffset="" data-enlighter-title="" data-enlighter-group="">@tool("fetch_python_whatsnew", return_direct=False)
def fetch_python_whatsnew() -> str:
    """
    Fetch the latest "What's New in Python" article and return a concise, cleaned
    text payload including the URL and extracted section highlights.
    The tool ignores the input argument.
    """
    index_html = _fetch(BASE_URL)
    latest = _find_latest_entry(index_html)
    if not latest:
        return "Could not determine latest What's New entry from the index page."
    article_html = _fetch(latest.url)
    highlights = _extract_highlights(article_html)
    return f"URL: {latest.url}\nVERSION: {latest.version}\n\n{highlights}"</pre>
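<p>The `_fetch`, `_find_latest_entry`, and `_extract_highlights` helpers live in the same module of the example repository. To give a rough idea of what a `_find_latest_entry`-style helper has to do, here is a stdlib-only sketch that picks the highest version out of a set of candidate links – the link list and pattern below are illustrative assumptions, not the repository's actual code:</p>

<pre class="EnlighterJSRAW" data-enlighter-language="python" data-enlighter-theme="" data-enlighter-highlight="" data-enlighter-linenumbers="" data-enlighter-lineoffset="" data-enlighter-title="" data-enlighter-group="">
import re

# Simplified sketch of finding the latest "What's New" entry: scan the
# index page's links for version-numbered articles and keep the highest.
# The link list and regex are illustrative, not the repository's code.
hrefs = ["whatsnew/3.11.html", "whatsnew/3.12.html",
         "whatsnew/3.13.html", "tutorial/index.html"]

def find_latest_version(links):
    versions = []
    for href in links:
        match = re.search(r"whatsnew/(\d+)\.(\d+)\.html$", href)
        if match:
            versions.append((int(match.group(1)), int(match.group(2))))
    major, minor = max(versions)  # tuple comparison: 3.13 beats 3.12
    return f"{major}.{minor}"

print(find_latest_version(hrefs))  # → 3.13
</pre>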



<p>The agent itself is defined in <a href="https://github.com/Cheukting/langchain-example2/blob/main/app/agent.py" target="_blank" rel="noopener">agent.py</a>:</p>



<pre class="EnlighterJSRAW" data-enlighter-language="python" data-enlighter-theme="" data-enlighter-highlight="" data-enlighter-linenumbers="" data-enlighter-lineoffset="" data-enlighter-title="" data-enlighter-group="">SYSTEM_PROMPT = (
    "You are a senior Product Marketing Manager at the Python Software Foundation. "
    "Task: Draft a clear, engaging release marketing newsletter for end users and developers, "
    "highlighting the most compelling new features, performance improvements, and quality-of-life "
    "changes in the latest Python release.\n\n"
    "Process: Use the tool to fetch the latest 'What's New in Python' page. Read the highlights and craft "
    "a concise newsletter with: (1) an attention-grabbing subject line, (2) a short intro paragraph, "
    "(3) 4–8 bullet points of key features with user benefits, (4) short code snippets only if they add clarity, "
    "(5) a 'How to upgrade' section, and (6) links to official docs/changelog. Keep it accurate and avoid speculation."
)

...

def run_newsletter() -> str:
    load_dotenv()
    agent = create_agent(
        model=os.getenv("OPENAI_MODEL", "gpt-4o"),
        tools=[fetch_python_whatsnew],
        system_prompt=SYSTEM_PROMPT,
        # response_format=ToolStrategy(ResponseFormat),
    )
...</pre>



<p>As before, we provide a system prompt and the API key for OpenAI to the agent.</p>



<p>For the full scripts and project, see <a href="https://github.com/Cheukting/langchain-example2" target="_blank" rel="noopener">here</a>.</p>



<h2 class="wp-block-heading">Advanced LangChain concepts</h2>



<p>LangChain’s more advanced features can be extremely useful when you’re building a more sophisticated AI agent. Not all AI agents require these extra elements, but they are commonly used in production. Let’s look at some of them.</p>



<h3 class="wp-block-heading">MCP adapter</h3>



<p>The MCP (Model Context Protocol) provides a standard way to add extra tools and functionality to an AI agent, which has made it increasingly popular among AI agent users and enthusiasts alike.</p>



<p>LangChain’s client module provides a <a href="https://reference.langchain.com/python/langchain_mcp_adapters/" target="_blank" rel="noopener">MultiServerMCPClient</a> class that lets an AI agent connect to one or more MCP servers and use the tools they expose. For example:</p>



<pre class="EnlighterJSRAW" data-enlighter-language="python" data-enlighter-theme="" data-enlighter-highlight="" data-enlighter-linenumbers="" data-enlighter-lineoffset="" data-enlighter-title="" data-enlighter-group="">from langchain_mcp_adapters.client import MultiServerMCPClient

client = MultiServerMCPClient(
    {
        "postman-server": {
            "type": "http",
            "url": "https://mcp.eu.postman.com",
            "headers": {
                "Authorization": "Bearer ${input:postman-api-key}"
            }
        }
    }
)

all_tools = await client.get_tools()</pre>



<p>The above connects to the <a href="https://www.postman.com/postman/postman-public-workspace/collection/681dc649440b35935978b8b7" target="_blank" rel="noopener">Postman MCP server in the EU</a> with an API key.</p>



<h3 class="wp-block-heading">Guardrails</h3>



<p>As with many AI technologies, an agent’s logic is not pre-determined, so its behavior is non-deterministic. Guardrails are necessary for managing that behavior and ensuring it remains policy-compliant.</p>



<p>LangChain middleware can be used to set up specific guardrails. For example, you can use PII detection middleware to protect personal information or human-in-the-loop middleware for human verification. You can even create custom middleware for more specific guardrail policies.&nbsp;</p>



<p>For instance, you can use the `<a href="https://docs.langchain.com/oss/python/langchain/guardrails#before-agent-guardrails" target="_blank" rel="noopener">@before_agent</a>` or `<a href="https://docs.langchain.com/oss/python/langchain/guardrails#after-agent-guardrails" target="_blank" rel="noopener">@after_agent</a>` decorators to declare guardrails for the agent’s input or output. Below is an example of a code snippet that checks for banned keywords:</p>



<pre class="EnlighterJSRAW" data-enlighter-language="python" data-enlighter-theme="" data-enlighter-highlight="" data-enlighter-linenumbers="" data-enlighter-lineoffset="" data-enlighter-title="" data-enlighter-group="">from typing import Any

from langchain.agents import create_agent
from langchain.agents.middleware import AgentState, before_agent
from langgraph.runtime import Runtime

banned_keywords = ["kill", "shoot", "genocide", "bomb"]

@before_agent(can_jump_to=["end"])
def content_filter(state: AgentState, runtime: Runtime) -> dict[str, Any] | None:
    """Block requests containing banned keywords."""
    first_message = state["messages"][0]
    content = first_message.content.lower()
    # Check for banned keywords
    for keyword in banned_keywords:
        if keyword in content:
            return {
                "messages": [{
                    "role": "assistant",
                    "content": "I cannot process your requests due to inappropriate content."
                }],
                "jump_to": "end"
            }
    return None

agent = create_agent(
    model="gpt-4o",
    tools=[search_tool],
    middleware=[content_filter],
)

# This request will be blocked
result = agent.invoke({
    "messages": [{"role": "user", "content": "How to make a bomb?"}]
})</pre>



<p>For more details, check out the documentation <a href="https://docs.langchain.com/oss/python/langchain/guardrails" target="_blank" rel="noopener">here</a>.</p>



<h3 class="wp-block-heading">Testing</h3>



<p>Just like in other software development cycles, testing needs to be performed before we can start rolling out AI agent products. LangChain provides testing tools for both unit tests and integration tests.&nbsp;</p>



<h4 class="wp-block-heading">Unit tests</h4>



<p>Just like in other applications, unit tests are used to test each part of the AI agent and make sure it works in isolation. The most helpful tools in unit tests are mock objects and mock responses, which isolate the specific part of the application you’re testing.</p>



<p>LangChain provides <a href="https://python.langchain.com/api_reference/core/language_models/langchain_core.language_models.fake_chat_models.GenericFakeChatModel.html?_gl=1*fwqfa2*_gcl_au*Mzg1NzM1NDUxLjE3NjUyMDk4OTg.*_ga*MTk1ODUyNzE1Ny4xNzY1MjA5ODk4*_ga_47WX3HKKY2*czE3NjYxNTQ5MDkkbzE3JGcxJHQxNzY2MTU1ODM4JGo2MCRsMCRoMA.." target="_blank" rel="noopener">GenericFakeChatModel</a>, which mimics response texts. A response iterator is set in the mock object, and when invoked, it returns the set of responses one by one. For example:</p>



<pre class="EnlighterJSRAW" data-enlighter-language="python" data-enlighter-theme="" data-enlighter-highlight="" data-enlighter-linenumbers="" data-enlighter-lineoffset="" data-enlighter-title="" data-enlighter-group="">from langchain_core.language_models.fake_chat_models import GenericFakeChatModel

# The fake model replays the given responses one by one on each invocation.
model = GenericFakeChatModel(messages=iter(["Hi there!", "Pong.", "Goodbye!"]))

print(model.invoke("Hello").content)  # "Hi there!"
print(model.invoke("Ping").content)  # "Pong."</pre>



<h4 class="wp-block-heading">Integration tests</h4>



<p>Once we’re sure that all parts of the agent work individually, we have to test whether they work together. For an AI agent, this means testing the trajectory of its actions. To do so, LangChain provides another package: <a href="https://github.com/langchain-ai/agentevals" target="_blank" rel="noopener">AgentEvals</a>.</p>



<p>AgentEvals provides two main evaluators to choose from:</p>



<ol>
<li>Trajectory match – A reference trajectory is required and is compared against the trajectory the agent actually produced. For this comparison, you have <a href="https://docs.langchain.com/oss/python/langchain/test#trajectory-match-evaluator" target="_blank" rel="noopener">four different match modes</a> to choose from.</li>



<li>LLM judge – An <a href="https://docs.langchain.com/oss/python/langchain/test#llm-as-judge-evaluator" target="_blank" rel="noopener">LLM judge</a> can be used with or without a reference trajectory. An LLM judge evaluates whether the resulting trajectory is on the right path.</li>
</ol>
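<p>To make the trajectory-match idea concrete, here is a framework-free sketch of the strictest form of comparison: the sequence of tool calls the agent actually made must equal the reference sequence. AgentEvals’ real evaluators support more match modes and compare full messages, not just tool names:</p>

<pre class="EnlighterJSRAW" data-enlighter-language="python" data-enlighter-theme="" data-enlighter-highlight="" data-enlighter-linenumbers="" data-enlighter-lineoffset="" data-enlighter-title="" data-enlighter-group="">
# Framework-free sketch of a strict trajectory match: the agent's actual
# sequence of tool calls must equal the reference sequence. AgentEvals'
# evaluators offer more modes (unordered, subset, superset) and compare
# full messages, not just tool names.
def strict_trajectory_match(actual: list, reference: list) -> bool:
    return actual == reference

reference = ["pycharm_docs_search", "format_answer"]
print(strict_trajectory_match(["pycharm_docs_search", "format_answer"], reference))  # True
print(strict_trajectory_match(["format_answer"], reference))  # False
</pre>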



<h2 class="wp-block-heading">LangChain support in PyCharm</h2>



<p>With LangChain, you can develop an AI agent that suits your needs in no time. However, to be able to effectively use LangChain in your application, you need an effective debugger. In PyCharm, we have the <a href="https://plugins.jetbrains.com/plugin/26921-ai-agents-debugger" target="_blank" rel="noopener">AI Agents Debugger plugin</a>, which allows you to power up your experience with LangChain.</p>



<p>If you don’t yet have PyCharm, <a href="https://www.jetbrains.com/pycharm/download/" target="_blank" rel="noopener">you can download it here</a>.</p>



<p>Using the AI Agents Debugger is very straightforward. Once you install the plugin, it will appear as an icon on the right-hand side of the IDE.</p>



<figure class="wp-block-image size-full"><img style="width:100% !important; height:auto !important; max-width:100% !important;" decoding="async" loading="lazy" src="https://blog.jetbrains.com/wp-content/uploads/2026/02/image-27.png" alt="" class="wp-image-681666"/></figure>



<p>When you click on this icon, a side window will open with text saying that no extra code is needed – just run your agent and traces will be shown automatically.</p>



<p>As an example, we will run the <a href="https://github.com/Cheukting/langchain-example2" target="_blank" rel="noopener">content generation agent</a> that we built above. If you need a custom run configuration, you will have to set it up now by following this guide on <a href="https://www.jetbrains.com/help/pycharm/run-debug-configuration.html" target="_blank" rel="noopener">custom run configurations in PyCharm</a>.</p>



<figure class="wp-block-image size-full"><img style="width:100% !important; height:auto !important; max-width:100% !important;" decoding="async" loading="lazy" src="https://blog.jetbrains.com/wp-content/uploads/2026/02/image-27.png" alt="" class="wp-image-681676"/></figure>



<p>Once the run is done, you can review all the input prompts and output responses at a glance. To inspect the LangGraph, click the <em>Graph</em> button in the top-right corner.</p>



<figure class="wp-block-image size-full"><img style="width:100% !important; height:auto !important; max-width:100% !important;" decoding="async" loading="lazy" src="https://blog.jetbrains.com/wp-content/uploads/2026/02/image-27.png" alt="" class="wp-image-681673"/></figure>



<p>The <em>LangGraph </em>view is especially useful if you have an agent that has complicated steps or a customized workflow.</p>



<h2 class="wp-block-heading">Summing up</h2>



<p>LangChain is a powerful tool for building AI agents that work for many use cases and scenarios. It’s built on <a href="https://docs.langchain.com/oss/python/langgraph/overview" target="_blank" rel="noopener">LangGraph</a>, which provides low-level orchestration and runtime customization, as well as compatibility with a vast variety of LLMs on the market. Together, LangChain and LangGraph set a new industry standard for developing AI agents.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Python Unplugged on PyTV – A Free Online Python Conference for Everyone </title>
		<link>https://blog.jetbrains.com/pycharm/2026/02/python-unplugged-on-pytv/</link>
		
		<dc:creator><![CDATA[Cheuk Ting Ho]]></dc:creator>
		<pubDate>Wed, 11 Feb 2026 16:37:44 +0000</pubDate>
		<featuredImage>https://blog.jetbrains.com/wp-content/uploads/2026/02/Blog-Social-Share-image-1280x720-1.png</featuredImage>		<product ><![CDATA[education]]></product>
		<category><![CDATA[ai]]></category>
		<category><![CDATA[conference]]></category>
		<category><![CDATA[ml]]></category>
		<category><![CDATA[python]]></category>
		<guid isPermaLink="false">https://blog.jetbrains.com/?post_type=pycharm&#038;p=679943</guid>

					<description><![CDATA[The PyCharm team loves being part of the global Python community. From PyCon US to EuroPython to every PyCon in between, we enjoy the atmosphere at conferences, as well as meeting people who are as passionate about Python as we are. This includes everyone: professional Python developers, data scientists, Python hobbyists and students. However, we [&#8230;]]]></description>
										<content:encoded><![CDATA[
<figure class="wp-block-image size-full"><img style="width:100% !important; height:auto !important; max-width:100% !important;" decoding="async" loading="lazy" src="https://blog.jetbrains.com/wp-content/uploads/2026/02/Blog-Featured-1280x720-1.png" alt="" class="wp-image-680011"/></figure>



<p>The <a href="https://www.jetbrains.com/pycharm/" data-type="link" data-id="https://www.jetbrains.com/pycharm/" target="_blank" rel="noopener">PyCharm</a> team loves being part of the global Python community. From PyCon US to EuroPython to every PyCon in between, we enjoy the atmosphere at conferences, as well as meeting people who are as passionate about Python as we are. This includes everyone: professional Python developers, data scientists, Python hobbyists and students.</p>



<figure class="wp-block-image size-full"><img style="width:100% !important; height:auto !important; max-width:100% !important;" decoding="async" loading="lazy" src="https://blog.jetbrains.com/wp-content/uploads/2026/02/image-1.jpeg" alt="" class="wp-image-679944"/></figure>



<p>However, we know that being able to attend a Python conference in person is not something that everyone can do, either because they don’t have a local conference, or cannot travel to one. So within the PyCharm team we started thinking: what if we could bring the five-star experience of Python conferences to everyone? What if everyone could have the experience of learning from professional speakers, accessing great networking opportunities, hearing from various voices from across the community, and &#8211; most importantly &#8211; having fun, no matter where they are in the world?</p>



<h2 class="wp-block-heading">Python is for Everyone &#8211; Announcing Python Unplugged on PyTV!</h2>



<figure class="wp-block-embed is-type-video is-provider-youtube wp-block-embed-youtube wp-embed-aspect-16-9 wp-has-aspect-ratio"><div class="wp-block-embed__wrapper">
<iframe loading="lazy" title="Announcing &quot;Python Unplugged on PyTV&quot; – March 4, 2026" src="https://www.youtube.com/embed/8KblH4leUVA?feature=oembed" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" referrerpolicy="strict-origin-when-cross-origin" allowfullscreen></iframe>
</div></figure>



<p>After almost a year of planning, we’re proud to announce we’ll be hosting the first ever PyTV &#8211; a free online conference for everyone!</p>



<figure class="wp-block-image size-full"><img style="width:100% !important; height:auto !important; max-width:100% !important;" decoding="async" loading="lazy" src="https://blog.jetbrains.com/wp-content/uploads/2026/02/image-8.png" alt="" class="wp-image-679954"/></figure>



<p>Join us on <strong>March 4th 2026</strong>, for an unforgettable, non-stop event, streamed from our studio in Amsterdam. We’ll be joined live by 15 well-known and beloved <a href="https://lp.jetbrains.com/python-unplugged/" target="_blank" rel="noopener">speakers</a> from Python communities around the globe, including Carol Willing, Deb Nicholson, Sheena O’Connell, Paul Everitt, Marlene Mhangami, and Carlton Gibson. They’ll be speaking about topics such as core Python, AI, community, web development and data science.&nbsp;</p>



<figure class="wp-block-image size-full"><img style="width:100% !important; height:auto !important; max-width:100% !important;" decoding="async" loading="lazy" src="https://blog.jetbrains.com/wp-content/uploads/2026/02/image-9.png" alt="" class="wp-image-679964"/></figure>



<p>You can get involved in the fun as well! Throughout the livestream, you can join our chat on Discord, where you can interact with other participants and our speakers. We’ve also prepared games and quizzes, with fabulous prizes up for grabs! You might even be able to get your hands on some of the super cool conference swag that we designed specifically for this event.</p>



<p><strong>What are you waiting for? <a href="https://lp.jetbrains.com/python-unplugged/" target="_blank" rel="noopener">Sign up here.</a>&nbsp;</strong></p>



<p>If you are local to Amsterdam, you can also sign up for the <a href="https://www.meetup.com/pyladiesams/" target="_blank" rel="noopener">PyLadies Amsterdam meetup</a>. It will be held on the same day as the conference, and will give you a chance to meet some of the PyTV speakers in person.</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Google Colab Support Is Now Available in PyCharm 2025.3.2</title>
		<link>https://blog.jetbrains.com/pycharm/2026/01/google-colab-support-is-now-available-in-pycharm-2025-3-2/</link>
		
		<dc:creator><![CDATA[Ilia Afanasiev]]></dc:creator>
		<pubDate>Wed, 28 Jan 2026 09:33:49 +0000</pubDate>
		<featuredImage>https://blog.jetbrains.com/wp-content/uploads/2026/01/Colab-PC-social-BlogFeatured-1280x720-1-1.png</featuredImage>		<product ><![CDATA[jetbrains-for-data]]></product>
		<category><![CDATA[releases]]></category>
		<category><![CDATA[jupyter-notebooks]]></category>
		<guid isPermaLink="false">https://blog.jetbrains.com/?post_type=pycharm&#038;p=677006</guid>

					<description><![CDATA[PyCharm is designed to support the full range of modern Python workflows, from web development to data and ML/AI work, in a single IDE. An essential part of these workflows is Jupyter notebooks, which are widely used for experimentation, data exploration, and prototyping across many roles. PyCharm provides first-class support for Jupyter notebooks, both locally [&#8230;]]]></description>
										<content:encoded><![CDATA[
<p>PyCharm is designed to support the full range of modern Python workflows, from web development to data and ML/AI work, in a single IDE. An essential part of these workflows is Jupyter notebooks, which are widely used for experimentation, data exploration, and prototyping across many roles.</p>



<p>PyCharm provides first-class support for Jupyter notebooks, both locally and when connecting to <a href="https://www.jetbrains.com/help/pycharm/configuring-jupyter-notebook.html#add-external-server" target="_blank" rel="noopener">external Jupyter servers</a>, with IDE features such as refactoring and navigation available directly in notebooks. Meanwhile, Google Colab has become a key tool for running notebook-based experiments in the cloud, especially when local resources are insufficient.</p>



<p>With PyCharm 2025.3.2, we’re bringing local IDE workflows and Colab-hosted notebooks together. Google Colab support is now available for free in PyCharm as a core feature, along with basic Jupyter notebook support. If you already use Google Colab, you can now bring your notebooks into PyCharm and work with them using IDE features designed for larger projects and longer development sessions.</p>


    <div class="buttons">
        <div class="buttons__row">
                                                <a href="https://www.jetbrains.com/pycharm/download" class="btn" target="" rel="noopener">Download PyCharm</a>
                                                    </div>
    </div>







<h3 class="wp-block-heading">Getting started with Google Colab in PyCharm</h3>



<p>Connecting PyCharm to Colab is quick and straightforward:</p>



<ol>
<li>Open a Jupyter notebook in PyCharm.</li>



<li>Select Google Colab (Beta) from the Jupyter server menu in the top-right corner.</li>



<li>Sign in to your Google account.</li>



<li>Create and use a Colab-backed server for the notebook.</li>
</ol>



<p>Once connected, your notebook behaves as usual, with navigation, inline outputs, tables, and visualizations rendered directly in the editor.</p>



<figure class="wp-block-video"><video controls src="https://blog.jetbrains.com/wp-content/uploads/2026/01/pycharm-colab.mp4"></video></figure>



<h3 class="wp-block-heading">Working with data and files&nbsp;</h3>



<p>When your Jupyter notebook depends on files that are not yet available on the Colab machine, PyCharm helps you handle this without interrupting your workflow. If a file is missing, you can upload it directly from your local environment. The remote file structure is also visible in the <em>Project</em> tool window, so you can browse directories and inspect files as you work.</p>



<p>Whether you’re experimenting with data, prototyping models, or working with notebooks that outgrow local resources, this integration makes it easier to move between local work, remote execution, and cloud resources without changing how you work in PyCharm.</p>



<p>If you’d like to try it out:</p>



<ul>
<li><a href="https://www.jetbrains.com/pycharm/download/" target="_blank" rel="noopener">Download PyCharm 2025.3.2</a></li>



<li>Learn more about <a href="https://developers.google.com/colab" target="_blank" rel="noopener">Google Colab</a></li>
</ul>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>Next Edit Suggestions: Now Generally Available</title>
		<link>https://blog.jetbrains.com/ai/2025/12/next-edit-suggestions-now-generally-available/</link>
		
		<dc:creator><![CDATA[Anton Semenkin]]></dc:creator>
		<pubDate>Thu, 18 Dec 2025 16:10:39 +0000</pubDate>
		<featuredImage>https://blog.jetbrains.com/wp-content/uploads/2025/12/AI-social-BlogFeatured-1280x720-1-1.png</featuredImage>		<product ><![CDATA[pycharm]]></product>
		<category><![CDATA[news]]></category>
		<category><![CDATA[ai-assistant]]></category>
		<category><![CDATA[ai-in-ides]]></category>
		<guid isPermaLink="false">https://blog.jetbrains.com/?post_type=ai&#038;p=669678</guid>

					<description><![CDATA[The next edit suggestions feature is now enabled in all JetBrains IDEs for JetBrains AI Pro, AI Ultimate, and AI Enterprise subscribers. Yes, you read that right! JetBrains-native diff suggestions are available right in your editor. Global support for optimized latency. Out-of-the-box IDE actions for reliability. And the best part? It doesn’t consume your AI [&#8230;]]]></description>
										<content:encoded><![CDATA[
<p>The next edit suggestions feature is now enabled in all JetBrains IDEs for JetBrains AI Pro, AI Ultimate, and AI Enterprise subscribers.</p>



<p>Yes, you read that right! JetBrains-native diff suggestions are available right in your editor. Global support for optimized latency. Out-of-the-box IDE actions for reliability. And the best part? <strong>It</strong> <strong>doesn’t consume your AI quota</strong>.</p>



<h2 class="wp-block-heading">What are next edit suggestions?</h2>



<p>Like the suggestions provided by AI code completion, next edit suggestions (NES) appear as you type. The difference is that NES can be proposed beyond the immediate vicinity of your caret, and they can modify existing code instead of exclusively adding new code. This feature is a natural extension of code completion, and together they comprise the in-flow <em>Tab-Tab</em> experience.&nbsp;</p>



<figure class="wp-block-image size-full"><img style="width:100% !important; height:auto !important; max-width:100% !important;" decoding="async" loading="lazy" src="https://blog.jetbrains.com/wp-content/uploads/2025/12/image-42.png" alt="" class="wp-image-669756"/></figure>



<figure class="wp-block-image size-full"><img style="width:100% !important; height:auto !important; max-width:100% !important;" decoding="async" loading="lazy" src="https://blog.jetbrains.com/wp-content/uploads/2025/12/image-43.png" alt="" class="wp-image-669767"/></figure>



<p>The NES feature runs silently in the background, generating suggestions as you modify your code. It then gives you the option to review and decide whether to accept them in a small in-editor diff view (the NES UI). The feature adapts how it presents the suggestions, showing them to you in the least intrusive way to avoid interfering with your work. Large changes appear in a dedicated diff view, while smaller suggestions are shown in a larger popup.</p>



<figure class="wp-block-image size-full"><img style="width:100% !important; height:auto !important; max-width:100% !important;" decoding="async" loading="lazy" src="https://blog.jetbrains.com/wp-content/uploads/2025/12/2.-C_fields.gif" alt="" class="wp-image-669789"/></figure>



<p>Overall, NES provide a smart code editing experience. Let’s agree to share responsibilities as follows: you can simply type and continue development as you used to, and we suggest small digestible diffs that help you do your job faster. Deal?</p>



<h2 class="wp-block-heading">Who can use NES?&nbsp;</h2>



<p>With the latest AI Assistant update, next edit suggestions are enabled by default for all users with AI Pro, AI Ultimate, or AI Enterprise subscriptions. Unlike AI code completion, the next edit suggestions feature is currently unavailable for AI Free license holders. Stay tuned, though – we are actively working on bringing it to a wider audience!</p>



<p>You can always learn more about which AI features are available in different pricing tiers on our <a href="https://www.jetbrains.com/ai-ides/buy/?section=personal&amp;billing=yearly" target="_blank" rel="noopener">official page</a>.</p>



<h2 class="wp-block-heading">How do NES work?</h2>



<p>Trust us, there is a lot we could say about the internals, but we’ll try to keep things simple here.</p>



<p>Long story short, next edit suggestions are where AI meets 🤝 the intelligence of JetBrains IDEs. Under the hood, the feature calls our cloud-based custom AI model and leverages deterministic IDE actions where possible.&nbsp;</p>



<h3 class="wp-block-heading">AI model</h3>



<p>Currently, at their core, NES rely mostly on suggestions provided by a model fine-tuned specifically for this task.&nbsp;</p>



<p>Much like <a href="https://huggingface.co/collections/JetBrains/mellum" target="_blank" rel="noopener">Mellum</a>, the model is a small language model (SLM) that leverages cloud GPU infrastructure to provide the best possible latency all around the world. Unlike Mellum, however, the underlying model is bigger and leverages a different type of context: the history of your recent changes as opposed to the current file and RAG.</p>



<p>Bigger does not always mean slower! Our inference pipelines differ for code completion and next edit suggestions generation. NES employ several inference tricks that keep latency under 200 ms for the majority of requests, even at the busiest times of the day 💪. If you ever thought that completion in JetBrains IDEs was slow, it’s time to reconsider!</p>



<h3 class="wp-block-heading">IDE actions (code insights)</h3>



<p>Developers love our IDEs because of their reliability, and next edit suggestions put that aspect at your fingertips.</p>



<p>As part of their pipeline, when invoked, NES look for available code insights provided by the IDE and show them in the NES UI if they are appropriate. One of the easiest ways to see this interaction at work would be to look at a suggestion that renames an identifier in a file. The next edit suggestion will activate the IDE’s <em>Rename </em>refactoring, and usages will be conveniently updated. This even works with multi-file search!</p>



<figure class="wp-block-image size-full"><img style="width:100% !important; height:auto !important; max-width:100% !important;" decoding="async" loading="lazy" src="https://blog.jetbrains.com/wp-content/uploads/2025/12/dark-multi-file-rename-1.gif" alt="" class="wp-image-669800"/></figure>



<p>The integration between next edit suggestions and IDE code insights is not yet fully complete. Because even frontier models struggle with out-of-distribution tools, or even just having a large number of tools in general, we are intentionally adding new IDE actions to NES slowly. We are prioritizing the ones that are useful the most often, as well as the ones the models can use most effectively. Let us know in the comments which IDE actions you would find useful in NES!</p>



<h3 class="wp-block-heading">Summary</h3>



<p>Next edit suggestions don’t replace the existing forms of code completion, but complement them, ensuring the best speed and relevance. Where code completion provides suggestions for new material, the next edit suggetions model works in the field of, well, <em>edits</em>. It is optimised to propose changes to existing code, but sometimes the best edit is simply to add something new. In those cases, the suggestions will look like completions because they are presented the same way – as inline gray text.&nbsp;</p>



<p>The simple scheme below explains which suggestion provider can be handled by which UI.</p>



<figure class="wp-block-image size-full"><img style="width:100% !important; height:auto !important; max-width:100% !important;" decoding="async" loading="lazy" src="https://blog.jetbrains.com/wp-content/uploads/2025/12/TAB_-models-to-UX-relation.jpg" alt="" class="wp-image-669811"/></figure>



<h2 class="wp-block-heading">Settings panel update</h2>



<p>In addition to enabling this new feature, we are redesigning the settings for AI code completion and next edit suggestions. Shortly after the start of the new year, the settings for these features will be simplified. Instead of having to navigate multiple views, you will be able to view everything on a single screen, with all the most important options available.</p>



<p>Here’s a sneak peek of the new design:</p>



<figure class="wp-block-image size-full"><img style="width:100% !important; height:auto !important; max-width:100% !important;" decoding="async" loading="lazy" src="https://blog.jetbrains.com/wp-content/uploads/2025/12/image-44.png" alt="" class="wp-image-670028"/></figure>



<p>As you can see, the settings for local completion, cloud-based completion, and next edit suggestions are all combined on a single page where you can decide what you want and what you don’t.&nbsp;</p>



<figure class="wp-block-image size-full"><img style="width:100% !important; height:auto !important; max-width:100% !important;" decoding="async" loading="lazy" src="https://blog.jetbrains.com/wp-content/uploads/2025/12/image-45.png" alt="" class="wp-image-670045"/></figure>



<h2 class="wp-block-heading">AI code completion and NES cheat sheet</h2>



<p>Deciding which types of suggestions to enable may feel a bit overwhelming, so we’ve put together a short cheat sheet to help clarify which settings to enable in the new settings panel, depending on your preferred workflow.</p>



<h3 class="wp-block-heading">Case 1: You don’t want AI in your editor</h3>



<p>Simply turn off inline completion and next edit suggestions on this panel. We’ll make sure you don’t see any results of matrix multiplications.</p>



<h3 class="wp-block-heading">Case 2: You don’t want cloud-based suggestions</h3>



<p>Just turn on inline completion with local models. Those models are already bundled into your IDE and work without an internet connection. Good ol’ <a href="https://blog.jetbrains.com/blog/2024/04/04/full-line-code-completion-in-jetbrains-ides-all-you-need-to-know/">full line code completion</a> will have your back.</p>



<p>If you want your own local solution, you can plug any open-source model into the IDE via LM Studio or Ollama. This option is available on the <em>AI Assistant</em> | <em>Models</em> settings page. Note that, currently, this option only works for code completion. We will closely monitor the level of quality that is possible with local inference for NES, with the aim of eventually including it as well.</p>



<h3 class="wp-block-heading">Case 3: You like completion but NES seem off</h3>



<p>In this case, the best solution is to turn on inline completion with the <em>Cloud and local</em> models option and make sure that next edit suggestions are turned off. You will get the best from the Mellum model, and the IDE will automatically fall back to local models if your internet connection is unstable.&nbsp;</p>



<h3 class="wp-block-heading">Case 4: You like full-blown in-editor AI assistance</h3>



<p>Turn on both cloud models for inline completion and next edit suggestions to get code snippet suggestions as you modify your source code.</p>



<h2 class="wp-block-heading">What’s next for NES?</h2>



<p>Here is a quick look at some of the improvements we’re already working on:</p>



<ul>
<li>Smarter and more precise suggestions</li>



<li>More IDE actions for NES to use</li>



<li>Longer tab sequences</li>
</ul>



<p>Many other developments are on our radar, and we’ll keep you updated as they come closer to fruition.</p>



<h2 class="wp-block-heading">Thank-you note</h2>



<p>While you update your AI Assistant and GPUs go brrr, we would like to thank everyone who participated in the <a href="https://blog.jetbrains.com/ai/2025/08/introducing-next-edit-suggestions-in-jetbrains-ai-assistant/">open Beta test for the next edit suggestions</a> feature this fall.</p>



<p>Over the last few months, the feature has been available to JetBrains AI subscribers who were willing to try it and share anonymous usage statistics. With your help, we were able to make sure the feature was ready and properly prepare the cloud infrastructure for a full-scale release. Thank you so much! ❤️</p>



<p><em>Tab</em>&#8211;<em>Tab</em>,</p>



<p>Your AI completion team</p>
]]></content:encoded>
					
		
		
			</item>
		<item>
		<title>PyCharm 2025.3 – Unified IDE, Jupyter notebooks in remote development, uv as default, and more</title>
		<link>https://blog.jetbrains.com/pycharm/2025/12/pycharm-2025-3-unified-ide-jupyter-notebooks-in-remote-development-uv-as-default-and-more/</link>
		
		<dc:creator><![CDATA[Ilia Afanasiev]]></dc:creator>
		<pubDate>Mon, 08 Dec 2025 16:57:36 +0000</pubDate>
		<featuredImage>https://blog.jetbrains.com/wp-content/uploads/2025/12/PC-releases-BlogFeatured-1280x720-1.png</featuredImage>		<product ><![CDATA[jetbrains-for-data]]></product>
		<category><![CDATA[releases]]></category>
		<guid isPermaLink="false">https://blog.jetbrains.com/?post_type=pycharm&#038;p=666528</guid>

					<description><![CDATA[We’re excited to announce that PyCharm 2025.3 is here! This release continues our mission to make PyCharm the most powerful Python IDE for web, data, and AI/ML development. It marks the migration of Community users to the unified PyCharm and brings full support for Jupyter notebooks in remote development, uv as the default environment manager, [&#8230;]]]></description>
										<content:encoded><![CDATA[
<p>We’re excited to announce that PyCharm 2025.3 is here! This release continues our mission to make PyCharm the most powerful Python IDE for web, data, and AI/ML development.</p>



<p>It marks the migration of Community users to the unified PyCharm and brings full support for Jupyter notebooks in remote development, uv as the default environment manager, proactive data exploration, new LSP tools support, the introduction of Claude Agent, and over 300 bug fixes.</p>


    <div class="buttons">
        <div class="buttons__row">
                                                <a href="https://www.jetbrains.com/pycharm/download/" class="btn" target="" rel="noopener">Download now</a>
                                                    </div>
    </div>







<h2 class="wp-block-heading">Community user migration to the unified PyCharm</h2>



<p>As announced earlier, PyCharm 2025.2 was the last major release of the Community Edition. With PyCharm 2025.3, we’re introducing a smooth migration path for Community users to the unified PyCharm.</p>



<p>The unified version brings everything together in a single product – Community users can continue using PyCharm for free and now also benefit from built-in Jupyter support.</p>



<p>With a one-click option to start a free Pro trial, it’s easier than ever to explore PyCharm’s advanced features for data science, AI/ML, and web development.</p>



<p><a href="https://www.jetbrains.com/pycharm/whatsnew" target="_blank" rel="noopener">Learn more in the full What’s New post →</a></p>



<figure class="wp-block-image size-full"><img style="width:100% !important; height:auto !important; max-width:100% !important;" decoding="async" loading="lazy" src="https://blog.jetbrains.com/wp-content/uploads/2025/12/image-13.png" alt="" class="wp-image-666529"/></figure>



<h2 class="wp-block-heading">Jupyter notebooks</h2>



<p>Jupyter notebooks are now fully supported in remote development. You can open, edit, and run notebooks directly on a remote machine without copying them to your local environment.</p>



<p>The <a href="https://www.jetbrains.com/help/pycharm/jupyter-notebook-support.html#jupyter-variables" target="_blank" rel="noopener"><em>Variables</em> tool window</a> also received sorting options, letting you organize notebook variables by name or type for easier data exploration.</p>



<p><a href="https://www.jetbrains.com/pycharm/whatsnew/#page__content-jupyter-notebooks" target="_blank" rel="noopener">Read more about Jupyter improvements →</a></p>



<figure class="wp-block-image size-full"><img style="width:100% !important; height:auto !important; max-width:100% !important;" decoding="async" loading="lazy" src="https://blog.jetbrains.com/wp-content/uploads/2025/12/remote_jupyter.png" alt="" class="wp-image-666540"/></figure>



<h2 class="wp-block-heading">uv now the default for new projects</h2>



<p>When uv is detected on your system, PyCharm now automatically suggests it as the default environment manager in the <em>New Project</em> wizard.</p>



<p>For projects managed by uv, uv run is also used as the default command for your run configurations.</p>



<figure class="wp-block-image size-full"><img style="width:100% !important; height:auto !important; max-width:100% !important;" decoding="async" loading="lazy" src="https://blog.jetbrains.com/wp-content/uploads/2025/12/UV.png" alt="" class="wp-image-666551"/></figure>



<h2 class="wp-block-heading">Proactive data exploration <sup><mark style="background-color:rgba(0, 0, 0, 0)" class="has-inline-color has-vivid-green-cyan-color">Pro</mark></sup></h2>



<p>PyCharm now automatically analyzes your pandas DataFrames to detect the most common data quality issues. If any are found, you can review them and use Fix with AI to generate cleanup code automatically.</p>



<p>The analysis runs quietly in the background to keep your workflow smooth and uninterrupted.</p>



<figure class="wp-block-image size-full"><img style="width:100% !important; height:auto !important; max-width:100% !important;" decoding="async" loading="lazy" src="https://blog.jetbrains.com/wp-content/uploads/2025/12/data_exploration.png" alt="" class="wp-image-666562"/></figure>



<h2 class="wp-block-heading">Support for new LSP tools</h2>



<p>PyCharm 2025.3 expands its LSP integration with support for Ruff, ty, Pyright, and Pyrefly.</p>



<p>These bring advanced formatting, type checking, and inline type hints directly into your workflow.</p>



<p>More on <a href="https://www.jetbrains.com/help/pycharm/lsp-tools.html" target="_blank" rel="noopener">LSP tools</a>.</p>



<figure class="wp-block-image size-full"><img style="width:100% !important; height:auto !important; max-width:100% !important;" decoding="async" loading="lazy" src="https://blog.jetbrains.com/wp-content/uploads/2025/12/lsp_tools.png" alt="" class="wp-image-666573"/></figure>



<h2 class="wp-block-heading">AI features</h2>



<h3 class="wp-block-heading">Multi-agent experience: Junie and Claude Agent</h3>



<p>Work with your preferred AI agent from a single chat: Junie by JetBrains and Claude Agent can now be used directly in the AI interface.&nbsp;</p>



<p>Claude Agent is the <a href="https://blog.jetbrains.com/ai/2025/09/introducing-claude-agent-in-jetbrains-ides/">first third-party AI agent</a> natively integrated into JetBrains IDEs.</p>



<figure class="wp-block-image size-full"><img style="width:100% !important; height:auto !important; max-width:100% !important;" decoding="async" loading="lazy" src="https://blog.jetbrains.com/wp-content/uploads/2025/12/AI_agents.png" alt="" class="wp-image-666584"/></figure>



<h3 class="wp-block-heading">Bring Your Own Key (BYOK) is coming soon to JetBrains AI</h3>



<p>BYOK will let you connect your own API keys from OpenAI, Anthropic, or any OpenAI API-compatible local model, giving you more flexibility and control over how you use AI in JetBrains IDEs.</p>



<p><a href="https://blog.jetbrains.com/ai/2025/11/bring-your-own-key-byok-is-coming-soon-to-jetbrains-ai/">Read more</a></p>



<h3 class="wp-block-heading">Transparent in-IDE AI quota tracking&nbsp;</h3>



<p>Monitoring and managing your AI resources just got a lot easier, as you can now view your remaining AI Credits, renewal date, and top-up balance directly inside PyCharm.</p>



<figure class="wp-block-image size-full"><img style="width:100% !important; height:auto !important; max-width:100% !important;" decoding="async" loading="lazy" src="https://blog.jetbrains.com/wp-content/uploads/2025/12/ai_quota-1.png" alt="" class="wp-image-666595"/></figure>



<h2 class="wp-block-heading">UIX changes</h2>



<h3 class="wp-block-heading"><em>Islands</em> theme</h3>



<p>The new <em>Islands</em> theme is now the default for all users, offering improved contrast, balanced layouts, and a softer look in both dark and light modes.</p>



<figure class="wp-block-image size-full"><img style="width:100% !important; height:auto !important; max-width:100% !important;" decoding="async" loading="lazy" src="https://blog.jetbrains.com/wp-content/uploads/2025/12/islands.webp" alt="" class="wp-image-666606"/></figure>



<h3 class="wp-block-heading">New <em>Welcome</em> screen</h3>



<p>We’ve introduced a new non-modal <em>Welcome</em> screen that keeps your most common actions within reach and provides a smoother start to your workflow.</p>



<figure class="wp-block-image size-full"><img style="width:100% !important; height:auto !important; max-width:100% !important;" decoding="async" loading="lazy" src="https://blog.jetbrains.com/wp-content/uploads/2025/12/SM_Welcome_Screen.png" alt="" class="wp-image-666617"/></figure>



<h3 class="wp-block-heading">Looking for more?</h3>



<ul>
<li>Visit our <a href="https://www.jetbrains.com/pycharm/whatsnew" target="_blank" rel="noopener">What’s New page</a> to learn about all 2025.3 features and bug fixes.</li>



<li>Read the <a href="https://youtrack.jetbrains.com/articles/PY-A-233538495/PyCharm-2025.3-253.28294.336-build-Release-Notes" target="_blank" rel="noopener">release notes</a> for the full breakdown of the changes.</li>



<li>If you encounter any problems, please report them via our <a href="https://youtrack.jetbrains.com/issues/PY" target="_blank" rel="noopener">issue tracker</a> so we can address them promptly.</li>
</ul>



<p>We’d love to hear your feedback on PyCharm 2025.3 – leave your comments below or connect&nbsp;with us on <a href="https://x.com/pycharm" target="_blank">X</a> and <a href="https://bsky.app/profile/pycharm.dev" target="_blank" rel="noopener">BlueSky</a>.</p>
]]></content:encoded>
					
		
		
		                    <language>
                        <code><![CDATA[zh-hans]]></code>
                        <url>https://blog.jetbrains.com/zh-hans/pycharm/2025/12/pycharm-2025-3-unified-ide-jupyter-notebooks-in-remote-development-uv-as-default-and-more/</url>
                    </language>
                                    <language>
                        <code><![CDATA[pt-br]]></code>
                        <url>https://blog.jetbrains.com/pt-br/pycharm/2025/12/pycharm-2025-3-unified-ide-jupyter-notebooks-in-remote-development-uv-as-default-and-more/</url>
                    </language>
                                    <language>
                        <code><![CDATA[ko]]></code>
                        <url>https://blog.jetbrains.com/ko/pycharm/2025/12/pycharm-2025-3-unified-ide-jupyter-notebooks-in-remote-development-uv-as-default-and-more/</url>
                    </language>
                                    <language>
                        <code><![CDATA[ja]]></code>
                        <url>https://blog.jetbrains.com/ja/pycharm/2025/12/pycharm-2025-3-unified-ide-jupyter-notebooks-in-remote-development-uv-as-default-and-more/</url>
                    </language>
                                    <language>
                        <code><![CDATA[fr]]></code>
                        <url>https://blog.jetbrains.com/fr/pycharm/2025/12/pycharm-2025-3-unified-ide-jupyter-notebooks-in-remote-development-uv-as-default-and-more/</url>
                    </language>
                                    <language>
                        <code><![CDATA[es]]></code>
                        <url>https://blog.jetbrains.com/es/pycharm/2025/12/pycharm-2025-3-unified-ide-jupyter-notebooks-in-remote-development-uv-as-default-and-more/</url>
                    </language>
                                    <language>
                        <code><![CDATA[de]]></code>
                        <url>https://blog.jetbrains.com/de/pycharm/2025/12/pycharm-2025-3-unified-ide-jupyter-notebooks-in-remote-development-uv-as-default-and-more/</url>
                    </language>
                	</item>
		<item>
		<title>Meet the Islands Theme – The New Default Look for JetBrains IDEs</title>
		<link>https://blog.jetbrains.com/platform/2025/12/meet-the-islands-theme-the-new-default-look-for-jetbrains-ides/</link>
		
		<dc:creator><![CDATA[Olga Berdnikova]]></dc:creator>
		<pubDate>Mon, 08 Dec 2025 09:53:16 +0000</pubDate>
		<featuredImage>https://blog.jetbrains.com/wp-content/uploads/2025/12/JB-social-BlogFeatured-1280x720-1-2.png</featuredImage>		<product ><![CDATA[dotnet]]></product>
		<product ><![CDATA[idea]]></product>
		<product ><![CDATA[pycharm]]></product>
		<category><![CDATA[intellij]]></category>
		<category><![CDATA[idea]]></category>
		<category><![CDATA[news]]></category>
		<category><![CDATA[intellij-platform]]></category>
		<category><![CDATA[jetbrains-ides]]></category>
		<category><![CDATA[ui]]></category>
		<guid isPermaLink="false">https://blog.jetbrains.com/?post_type=platform&#038;p=664735</guid>

					<description><![CDATA[The&#160;Islands&#160;theme&#160;is now the default look across JetBrains IDEs starting with version 2025.3.This update is more than a visual refresh. It’s our commitment to creating a soft, balanced environment designed to support focus and comfort throughout your workflow. We began introducing the new theme&#160;earlier this year, gathering feedback, conducting research, and testing it hands-on with developers [&#8230;]]]></description>
										<content:encoded><![CDATA[
<p>The&nbsp;<strong><em>Islands</em>&nbsp;theme</strong>&nbsp;is now the default look across JetBrains IDEs starting with version 2025.3.<br>This update is more than a visual refresh. It’s our commitment to creating a soft, balanced environment designed to support focus and comfort throughout your workflow.</p>



<p>We began introducing the new theme&nbsp;<a href="https://blog.jetbrains.com/platform/2025/09/islands-theme-the-new-look-coming-to-jetbrains-ides/" target="_blank" rel="noreferrer noopener">earlier this year</a>, gathering feedback, conducting research, and testing it hands-on with developers who use our IDEs every day.</p>



<p>The result is a modern, refined design shaped by real workflows and real feedback. It’s still the IDE you know, just softer, lighter, and more cohesive.&nbsp;</p>



<p>Let’s take a closer look. Literally.</p>


                        <div class="indent-img">
                <img style="width:100% !important; height:auto !important; max-width:100% !important;" decoding="async" src="https://blog.jetbrains.com/wp-content/uploads/2025/12/mainNewSize-1.png" alt="">
            </div>
            






<h2 class="wp-block-heading" id="softer,-clearer,-and-easier-on-the-eyes">Softer, clearer, and easier on the eyes</h2>



<p>The&nbsp;<em>Islands</em>&nbsp;theme introduces a clean, uncluttered layout with rounded corners and balanced spacing, making the UI feel softer and easier on the eyes. We’ve also made tool window borders more distinct, making it easier to resize elements and adjust the workspace to your liking.</p>


            <figure class="media-with-caption">
                            <figcaption>
                    <blockquote>
                        <p>“It’s a modern feel. The radius on the borders and more distinctive layers bring a fresh feeling to the UI.&#8221;</p>
                    </blockquote>
                </figcaption>
                                        <img style="width:100% !important; height:auto !important; max-width:100% !important;" decoding="async" src="https://blog.jetbrains.com/wp-content/uploads/2025/12/softer.gif" alt="">
                    </figure>
    






<h2 class="wp-block-heading" id="instant-tab-recognition">Instant tab recognition</h2>



<p>When working with multiple files, finding your active tab should never slow you down. The&nbsp;<em>Islands</em>&nbsp;theme improves tab recognition, making the active one clearly visible and easier to spot at a glance.&nbsp;</p>


            <figure class="media-with-caption">
                            <figcaption>
                    <blockquote>
                        <p>“The active tab is very obvious, which is really nice”</p>
                    </blockquote>
                </figcaption>
                                        <img style="width:100% !important; height:auto !important; max-width:100% !important;" decoding="async" src="https://blog.jetbrains.com/wp-content/uploads/2025/12/tab.gif" alt="">
                    </figure>
    






<h2 class="wp-block-heading" id="organized-spaces-for-focus-support">Organized spaces for focus support</h2>



<p>The new design introduces a clear separation between working areas, giving each part of the IDE – the editor, tool windows, and panels – its own visual space. This layout feels more organized and easier to navigate, helping you move around the IDE without losing focus or pace.</p>



<figure class="wp-block-image size-full"><img style="width:100% !important; height:auto !important; max-width:100% !important;" decoding="async" loading="lazy" src="https://blog.jetbrains.com/wp-content/uploads/2025/12/organized.gif" alt="" class="wp-image-664784"/></figure>



<p>If you want even clearer visual emphasis on the editor, you can enable the&nbsp;<em>Different tool window background</em>&nbsp;option in&nbsp;<em>Settings | Appearance</em>&nbsp;under the&nbsp;<em>Islands</em>&nbsp;theme settings.</p>



<figure class="wp-block-image size-full"><img style="width:100% !important; height:auto !important; max-width:100% !important;" decoding="async" loading="lazy" src="https://blog.jetbrains.com/wp-content/uploads/2025/12/last-1.webp" alt="" class="wp-image-664796"/></figure>



<p>This is what we wanted to share about the new <em>Islands</em> theme, now the default look across all JetBrains IDEs. This thoughtful visual update shaped by feedback from daily users and aligned with the latest design directions in macOS and Windows 11 offers a softer, clearer, and more comfortable environment. And we believe this helps you stay productive and focused on what matters most – your code.</p>
]]></content:encoded>
					
		
		
			</item>
	</channel>
</rss>
