My understanding of their comment is that the source is made available immediately; it's just that you need to pay for a license to use it for the first couple of years.
Pay-for closed-source is profitable for about a week. Then some asshole offers a shittier but free facsimile, and community contributions render it good enough to kill the original.
It doesn't mean your code is NOT being uploaded somewhere. They could add an easy switch to use the editor 'offline'; not that they have to. I'll go back to Helix.
I would also cast my vote for Sublime Text. The performance is amazing, the defaults are great, and the extensions cover a lot of the use cases.
A GUI that uses native controls and platform UI conventions, with the native behavior expected on the given platform, or a near-indistinguishable equivalent of that.
> Extensions can add the following capabilities to Zed: Languages, Themes, Slash Commands

This is a great start, but it's far from what most would accept as "programmable" or richly extensible.
What were the Zed features that made you switch? I feel like, with today's ecosystem, it's easier to complete the Neovim experience with plugins than to wait for the Zed devs to catch up.
Codeium works well, but I really like the Copilot Chat plugin as well: it generally does a good job of explaining highlighted code, fixing errors, and other code interactions.
Yeah, that caught my eye too. It looks to me like speculative editing (they mentioned that it's faster to echo its input) plus prompt caching; it would literally build on all the tech they have.
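For readers unfamiliar with the idea: speculative editing exploits the fact that an edited file mostly echoes its input, so the original text can serve as a cheap "draft" that is verified against the model's output and accepted token by token. A minimal, hypothetical Python sketch of that acceptance rule (not Zed's or Cursor's actual implementation):

```python
def speculative_accept(original_tokens, target_tokens):
    """Count how many leading draft tokens (the original text) match
    the target output; only tokens past this point need an expensive
    model decode step."""
    accepted = 0
    for orig, tgt in zip(original_tokens, target_tokens):
        if orig != tgt:
            break
        accepted += 1  # draft token verified: accepted "for free"
    return accepted

# Most of an edited file is unchanged, so most tokens are accepted cheaply.
original = "def add(a, b):\n    return a + b".split()
edited   = "def add(a, b):\n    return a - b".split()
print(speculative_accept(original, edited))  # -> 5
```

Real speculative decoding verifies draft tokens against the target model's distribution in a single batched forward pass; this toy only illustrates why edits, specifically, make such good drafts.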
I find the only "AI" I need is really just IntelliSense. Just auto-complete repetitive lines or symbol names intelligently, and that doesn't even require an AI model.
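Model-free symbol completion of the kind described here is essentially just harvesting identifiers from open buffers and matching prefixes. A minimal sketch (function and variable names are invented for illustration):

```python
import re

def harvest_symbols(source: str) -> set[str]:
    """Collect identifiers from a buffer -- the classic, model-free way
    editors power symbol completion."""
    return set(re.findall(r"[A-Za-z_][A-Za-z0-9_]*", source))

def complete(prefix: str, symbols: set[str]) -> list[str]:
    """Return all known symbols extending the typed prefix."""
    return sorted(s for s in symbols if s.startswith(prefix) and s != prefix)

src = "def parse_config(path):\n    parsed = parse_headers(path)\n    return parsed"
print(complete("pars", harvest_symbols(src)))
# -> ['parse_config', 'parse_headers', 'parsed']
```

Production editors refine this with fuzzy matching and language-server data, but the core loop is exactly this cheap.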
I hope that in a few years we look back at this era of "prompt an LLM for a shell command and instantly run whatever it spits out by pressing enter" with collective embarrassment.
The macOS reference is irrelevant to Java here.

The charset defines the encoding, which first and foremost governs I/O behavior: how an otherwise untyped stream of bytes is converted to or from the (UTF-16) text that Java stores internally. https://openjdk.org/jeps/400 is yesterday's news, and something .NET has been doing for a long time (UTF-8 has been the unconditional default since .NET Core 1.0 (2017)).

Whether Win32 APIs take UTF-8 or something else (it's usually ANSI or UTF-16) is something for binding libraries or similar abstraction packages for a language of choice to deal with, and it has a rather minor impact on the overall flamegraph if you profile a sample application.

I find it strange to have to defend this, but the UTF-8 vs. UTF-16 argument really has no place in 2024: dealing with popular encodings is as solved a problem as it gets in any language with an adequate standard library.
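The encoding point above is language-agnostic; a minimal Python sketch (not tied to the JVM or .NET APIs under discussion) of why the declared charset, not the bytes themselves, determines the text you get back:

```python
# The same byte stream decodes to different text depending on the
# charset you declare -- which is why a predictable default (UTF-8)
# for untyped I/O matters.
data = "héllo".encode("utf-8")

print(data.decode("utf-8"))    # correct round-trip: héllo
print(data.decode("latin-1"))  # same bytes, wrong charset: mojibake
```

Declaring the encoding explicitly at every I/O boundary sidesteps platform-default surprises entirely, which is what JEP 400 makes the default behavior.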
* Both Linux and macOS are unices, so there is less effort.
* The framework they use supports X11 and Wayland out of the box; it wasn't as much effort as you'd think.
* They accept contributions.
I like the idea of Zed, and I recently went editor hopping. I installed Zed but was immediately hit with a "your GPU is not supported / will run incredibly slow" message. Gah...
In my case, this pointed out a problem with my NVIDIA drivers that I didn't know about. Once I fixed that issue, my whole KDE system ran much faster and allowed Zed to run.
I think Zed is starting with a more transparent, elegant foundation, and then they'll build in more optional magic from there. For example, they're working on automatic codebase RAG.
Anthropic working with Paul Gauthier and Zed being Aider-aware would be phenomenal. He's been working on this for a while: https://news.ycombinator.com/item?id=35947073

When you're familiar with Aider, it feels as if this Zed.ai post is chasing Paul's remarkably pragmatic ideas for making LLMs adept at codebases, without yet hitting the same depth of repo understanding or bringing automated smart loops to the process. Watching Aider's "wait, you got that wrong" prompt chains kick in before handing the code back to you is a taste of "AI".

If your IDE is git-savvy, then working with Aider in a REPL terminal session, with frequent /commits that update your IDE, is like pair programming with a junior dev who happens to have read all the man pages, doc wikis, and Stack Overflow answers for your project.
Missing from this announcement is any language around privacy. Cursor, for example, has a Privacy Mode that promises not to store code, and this seems like a critical feature for any AI-enhanced dev tools.
Let me be very direct: what's the strength over the competition, e.g. Cody? The fact that it's its own text editor? I'm seeing the assistant emphasized, but that just looks like Cody to me.
Has any long-term Emacs user delved into Zed and ported the cool features yet?

Don't take it as sarcasm; I am genuinely interested. I think Emacs' malleability is what still keeps it alive.
Is all the overhead required to use the AI features easily disabled with a feature flag, such that zero CPU cost and zero network transmission occurs?
I had hoped Zed would be a good editor for junior developers, but that ship has apparently sailed, and its destination isn't where we need to go.
Are you talking specifically about the general UX? On initial glance, it does look like there is a bit more of a learning curve to navigate.
I wonder if there's already a solution that allows me to ask questions about local codebases, e.g. "How does this subsystem work?"
This feature does exactly that. You can open up the chat panel, run "/tab name-of-tab" or "/file path-to-file", and then start asking questions about the code.
The Zed configuration panel includes tools for adding an Anthropic API key, a Google Gemini API key, or an OpenAI API key, or for connecting to a local instance of Ollama.
First the unsolicited package installation controversy, now they've jumped onto the AI bandwagon. Is this a speedrun attempt at crashing a newly created company?

What's next? Web3 integration? Blockchain?
I'm a Cursor main. I don't really have any burning pains that make me want to change tools, but I'm open to what I don't know.

Zed vs. Cursor review, anyone?
Two areas where I think Zed might fall behind: Cursor Tab is REALLY good, and probably requires some finetuning/ML chops and some boutique training data.

For Composer, there's going to be more use of the "shadow workspace" https://www.cursor.com/blog/shadow-workspace to create an agentic feedback loop / objective function for codegen, along with an ability to navigate the language server, look up definitions, and just generally have full context like an engineer. Are there plans for the same in Zed?

Also, Cursor has a model-agnostic apply model, whereas you all are leaning on Claude.
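The "shadow workspace" idea linked above can be sketched in a few lines: apply a candidate edit in an isolated copy of the project, run a check there, and feed the pass/fail result back to the model. This is an assumed illustration of the general technique, not Cursor's implementation; all names here are invented.

```python
import pathlib
import shutil
import subprocess
import tempfile

def try_edit_in_shadow(repo: pathlib.Path, rel_path: str, new_text: str) -> bool:
    """Apply an edit in a throwaway copy of the workspace and report
    whether it passes a check -- the signal an agentic loop would use
    to accept, retry, or refine the edit."""
    with tempfile.TemporaryDirectory() as shadow:
        shadow_repo = pathlib.Path(shadow) / "repo"
        shutil.copytree(repo, shadow_repo)          # isolated copy; user's files untouched
        (shadow_repo / rel_path).write_text(new_text)
        # A stand-in "objective function": does the edited file still compile?
        result = subprocess.run(
            ["python", "-m", "py_compile", str(shadow_repo / rel_path)],
            capture_output=True,
        )
        return result.returncode == 0
```

A real system would also run the language server inside the shadow copy to surface diagnostics and go-to-definition results, but the isolation trick is the core of it.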
Cursor is Electron/VS Code based. Zed uses a custom-built Rust UI and editor model that gives 120fps rendering (or was it 60fps?).

It is really smooth on a Mac with ProMotion.
Hey! I'm Nate from Zed. There are a lot of questions about this; here are some quick thoughts...

Cursor is great. We explored an alternate approach to our assistant similar to theirs as well, but in the end we found we wanted to lean into what we think our superpower is: transforming text. So we leaned into it heavily.

Zed's assistant is completely designed around retrieving, editing, and managing text to create a "context"[0]. That context can be used to have conversations, similar to any assistant chatbot, but it can also be used to power transformations right in your code[1], in your terminal, or when writing prompts in the Prompt Library...

The goal is for context to be highly hackable. You can use the /prompt command to create nested prompts, use globs in the /file command to dynamically import files into a context or prompt... We even expose the underlying prompt templates that power things like the inline assistant so you can override them[2].

This approach doesn't give us the _simplest_ or most approachable assistant, but we think it gives us and everyone else the tools to create the assistant experience that is actually useful to them. We try to build the things we want, then share them with everyone else.

TL;DR: Everything is text, because text is familiar and it puts you in control.

[0]: https://zed.dev/docs/assistant/contexts.html
[1]: https://zed.dev/docs/assistant/inline-assistant
[2]: https://zed.dev/docs/assistant/prompting#overriding-template...
Hey! I really see the power in Zed, and the extensibility and simplicity. Great approach.

I posted this above, but want you to see it. Two areas where I think Zed might fall behind: Cursor Tab is REALLY good, and probably requires some finetuning/ML chops and some boutique training data. For Composer, there's going to be more use of the "shadow workspace" https://www.cursor.com/blog/shadow-workspace to create an agentic feedback loop / objective function for codegen, along with an ability to navigate the language server, look up definitions, and just generally have full context like an engineer.

Also, Cursor has a model-agnostic apply model, whereas you all are leaning on Claude. Any plans to address this from the core team, or is it more of a community thing? I think some of this might be a heavy lift.

I really like the shared context idea, and the transparency and building primitives for an ecosystem.
I mainly use PyCharm, and I found the auto-complete to be good. It doesn't always kick in when I expect, but some of the suggestions have been surprisingly complex and correct.
Maybe it's because there's literally no point in using a local LLM for code completion. You'd be spending 90% of your time correcting it. It's barely worth it to use Copilot.
What it shows is that it can be done, in a limited way. Other people might not like those limits and choose to go a different way. I am not sure what's worth lamenting here.
Hmm. I was excited about Zed, but it now seems painfully clear they're headed in a completely different direction than I'm interested in. Back to neovim, I guess…
However, personally, I prefer to have it configured to talk directly to Anthropic, to limit the number of intermediaries seeing my code, but in general I can see myself using this in the future.
More importantly, I’m happy that they might be closing in on a good revenue stream. I don’t yet see the viability of the collaboration feature as a business model, and I was worried they’re gonna have trouble finding a way to sensibly monetize Zed and quit it at some point. This looks like a very sensible way, one that doesn’t cannibalize the open-source offering, and one that I can imagine working.
Fingers crossed, and good luck to them!
[0]: https://news.ycombinator.com/item?id=41286612