mirror of
https://github.com/roc-lang/roc.git
synced 2025-12-23 08:48:03 +00:00
Add document storage and didOpen/didChange to LSP (#8453)
* Add documents storage: currently using a StringHashMap to store the different project documents. Might investigate caching later down the line.
* Integrate document store and notifications: add the document store to the Server struct. Also separate the handlers into requests and notifications, as their data structures and expected responses differ.
* Separate value assignments into different files: to remove the need to modify `initialize.zig` and `protocol.zig` whenever a new version or new server capability is implemented, the version was moved to the Server struct and the capabilities to a separate file.
* Add support for incremental change: the document store now supports incremental changes to files, which means it's possible to change part of a file without re-inserting the whole file each time.
* Add notifications for didOpen and didChange: currently just stores the opened buffer into a StringHashMap, which will allow parsing it later on.
* Fix enum conversion: the text document sync kind was being sent as a string instead of its integer equivalent.
* Add tests for document store
* Ensure test wiring
* Add more documentation to README.md

---------

Co-authored-by: Anton-4 <17049058+Anton-4@users.noreply.github.com>
This commit is contained in:
parent
34e38bc336
commit
0426fe4a93
12 changed files with 407 additions and 12 deletions
@@ -4,14 +4,24 @@ written in Zig as part of the Rust to Zig rewrite.
## Current state

It doesn't provide any features yet, but it does connect to your editor, detect file changes,
and store the buffer in memory.

The following requests are handled:

- `initialize`
- `shutdown`

The following notifications are handled:

- `initialized`
- `exit`
- `didOpen` (stores the buffer into a `StringHashMap`, but doesn't do anything with it yet)
- `didChange` (same as `didOpen`, but also supports incremental changes)
## How to implement new LSP capabilities

The core functionality of the LSP has been implemented so that `transport.zig` and
`protocol.zig` shouldn't have to be modified as more capabilities are added. When handling a new
LSP method, like `textDocument/completion` for example, the handler should be added in the `handlers`
directory and its call should be added either in `request` (if it expects a response) or `notification`
(if it doesn't expect a response). `textDocument/completion`, for example, would go here:

```zig
const request_handlers = std.StaticStringMap(HandlerPtr).initComptime(.{
    .{ "initialize", &InitializeHandler.call },
    // ...
    .{ "textDocument/completion", &CompletionHandler.call },
});
```
When adding a new capability, if the server is ready to support it, you need to add it to
the `capabilities.zig` file so that the `initialize` response tells the client the capability is available:

```zig
pub fn buildCapabilities() ServerCapabilities {
    return .{
        .textDocumentSync = .{
            .openClose = true,
            .change = @intFromEnum(ServerCapabilities.TextDocumentSyncKind.incremental),
        },
    };
}
```

Here we tell the client that `textDocumentSync` is available, in accordance with the LSP specification's data
structure. The `Server` struct holds the state, meaning it has knowledge of the project files, the
documentation, the type inference, the syntax, etc. Every handler has access to it. These points of knowledge
are ideally separated into different fields of the server. For example, the opened buffers and other desired files
are stored in a `DocumentStore`, a struct containing a `StringHashMap`, accessible through the `Server`.
## Starting the server

Build the Roc toolchain and run:

```bash
roc experimental-lsp --debug-transport
```

Passing the `--debug-transport` flag will create a log file in your OS tmp folder (`/tmp` on Unix
systems). A mirror of the raw JSON-RPC traffic will be appended to the log file. Watching the file
will allow a user to see incoming and outgoing messages between the server and the editor:

```bash
tail -f /tmp/roc-lsp-debug.log
```

---
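Each message mirrored to this log follows the LSP base protocol: a `Content-Length` header giving the byte length of the body, a blank line (`\r\n\r\n`), then the JSON-RPC body. The exact log layout is an assumption; the `didOpen` payload below is taken from this commit's own tests:

```
Content-Length: 136

{"jsonrpc":"2.0","method":"textDocument/didOpen","params":{"textDocument":{"uri":"file:///test.roc","version":1,"text":"app main = 0"}}}
```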
src/lsp/capabilities.zig (28 lines, new file)
@@ -0,0 +1,28 @@
```zig
const std = @import("std");

/// Aggregates all server capabilities supported by the Roc LSP.
pub const ServerCapabilities = struct {
    positionEncoding: []const u8 = "utf-16",
    textDocumentSync: ?TextDocumentSyncOptions = null,

    pub const TextDocumentSyncOptions = struct {
        openClose: bool = false,
        change: u32 = @intFromEnum(TextDocumentSyncKind.none),
    };

    pub const TextDocumentSyncKind = enum(u32) {
        none = 0,
        full = 1,
        incremental = 2,
    };
};

/// Returns the server capabilities currently implemented.
pub fn buildCapabilities() ServerCapabilities {
    return .{
        .textDocumentSync = .{
            .openClose = true,
            .change = @intFromEnum(ServerCapabilities.TextDocumentSyncKind.incremental),
        },
    };
}
```
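Serialized into the `initialize` response, the capabilities built above would produce JSON roughly like this (the `change` value `2` is `TextDocumentSyncKind.incremental`):

```json
{
  "capabilities": {
    "positionEncoding": "utf-16",
    "textDocumentSync": { "openClose": true, "change": 2 }
  }
}
```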
src/lsp/document_store.zig (108 lines, new file)
@@ -0,0 +1,108 @@
```zig
const std = @import("std");

/// Stores the latest contents of each open text document.
pub const DocumentStore = struct {
    allocator: std.mem.Allocator,
    entries: std.StringHashMap(Document),

    /// Snapshot of a document's contents and version.
    pub const Document = struct {
        text: []u8,
        version: i64,
    };

    pub const Range = struct {
        start_line: usize,
        start_character: usize,
        end_line: usize,
        end_character: usize,
    };

    /// Creates an empty store backed by the provided allocator.
    pub fn init(allocator: std.mem.Allocator) DocumentStore {
        return .{ .allocator = allocator, .entries = std.StringHashMap(Document).init(allocator) };
    }

    /// Releases all tracked documents and frees associated memory.
    pub fn deinit(self: *DocumentStore) void {
        var it = self.entries.iterator();
        while (it.next()) |entry| {
            self.allocator.free(entry.key_ptr.*);
            self.allocator.free(entry.value_ptr.text);
        }
        self.entries.deinit();
        self.* = undefined;
    }

    /// Inserts or replaces the document at `uri` with the given text and version.
    pub fn upsert(self: *DocumentStore, uri: []const u8, version: i64, text: []const u8) !void {
        const gop = try self.entries.getOrPut(uri);
        if (!gop.found_existing) {
            gop.key_ptr.* = try self.allocator.dupe(u8, uri);
        } else {
            self.allocator.free(gop.value_ptr.text);
        }

        gop.value_ptr.* = .{
            .text = try self.allocator.dupe(u8, text),
            .version = version,
        };
    }

    /// Removes a document from the store, if present.
    pub fn remove(self: *DocumentStore, uri: []const u8) void {
        if (self.entries.fetchRemove(uri)) |removed| {
            self.allocator.free(removed.key);
            self.allocator.free(removed.value.text);
        }
    }

    /// Returns the stored document (if any). The returned slice references memory owned by the store.
    pub fn get(self: *DocumentStore, uri: []const u8) ?Document {
        if (self.entries.get(uri)) |doc| {
            return doc;
        }
        return null;
    }

    /// Applies a range replacement to an existing document using UTF-16 positions.
    pub fn applyRangeReplacement(self: *DocumentStore, uri: []const u8, version: i64, range: Range, new_text: []const u8) !void {
        const entry = self.entries.getPtr(uri) orelse return error.DocumentNotFound;
        const start_offset = try positionToOffset(entry.text, range.start_line, range.start_character);
        const end_offset = try positionToOffset(entry.text, range.end_line, range.end_character);
        if (start_offset > end_offset or end_offset > entry.text.len) return error.InvalidRange;

        const replaced = end_offset - start_offset;
        const new_len = entry.text.len - replaced + new_text.len;
        var buffer = try self.allocator.alloc(u8, new_len);
        errdefer self.allocator.free(buffer);

        @memcpy(buffer[0..start_offset], entry.text[0..start_offset]);
        @memcpy(buffer[start_offset .. start_offset + new_text.len], new_text);
        @memcpy(buffer[start_offset + new_text.len ..], entry.text[end_offset..]);

        self.allocator.free(entry.text);
        entry.text = buffer;
        entry.version = version;
    }

    fn positionToOffset(text: []const u8, line: usize, character_utf16: usize) !usize {
        var current_line: usize = 0;
        var index: usize = 0;
        while (current_line < line) : (current_line += 1) {
            const newline_index = std.mem.indexOfScalarPos(u8, text, index, '\n') orelse return error.InvalidPosition;
            index = newline_index + 1;
        }

        var utf16_units: usize = 0;
        var it = std.unicode.Utf8Iterator{ .bytes = text[index..], .i = 0 };
        while (utf16_units < character_utf16) {
            const slice = it.nextCodepointSlice() orelse return error.InvalidPosition;
            const cp = std.unicode.utf8Decode(slice) catch return error.InvalidPosition;
            utf16_units += if (cp <= 0xFFFF) 1 else 2;
        }

        if (utf16_units != character_utf16) return error.InvalidPosition;
        return index + it.i;
    }
};
```
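The `positionToOffset` logic above maps an LSP `(line, character)` position, where `character` counts UTF-16 code units, onto a byte offset into the UTF-8 buffer, and `applyRangeReplacement` splices the new text between two such offsets. A rough Python sketch of the same logic (hypothetical helper names, not part of the repo):

```python
def position_to_offset(text: bytes, line: int, character_utf16: int) -> int:
    """Map an LSP (line, UTF-16 column) position to a byte offset in UTF-8 text."""
    # Skip `line` newlines to find the start of the target line.
    index = 0
    for _ in range(line):
        index = text.index(b"\n", index) + 1  # ValueError if the line doesn't exist
    # Walk codepoints, counting UTF-16 code units (astral codepoints count as 2).
    units = 0
    consumed = 0
    for ch in text[index:].decode("utf-8"):
        if units >= character_utf16:
            break
        units += 2 if ord(ch) > 0xFFFF else 1
        consumed += len(ch.encode("utf-8"))
    if units != character_utf16:
        raise ValueError("invalid position")  # column past EOL or inside a surrogate pair
    return index + consumed


def apply_range_replacement(text: bytes, start: tuple, end: tuple, new_text: bytes) -> bytes:
    """Replace the bytes between two (line, character) positions."""
    s = position_to_offset(text, *start)
    e = position_to_offset(text, *end)
    if s > e or e > len(text):
        raise ValueError("invalid range")
    return text[:s] + new_text + text[e:]


# Columns 6..11 of "hello world" replaced by "roc" gives "hello roc".
print(apply_range_replacement(b"hello world", (0, 6), (0, 11), b"roc"))
```

Note the design choice shared with the Zig code: a column that lands between the two UTF-16 units of an astral codepoint is rejected as invalid rather than rounded.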
src/lsp/handlers/did_change.zig (95 lines, new file)
@@ -0,0 +1,95 @@
```zig
const std = @import("std");
const DocumentStore = @import("../document_store.zig").DocumentStore;

/// Handler for `textDocument/didChange` notifications (supports incremental edits).
pub fn handler(comptime ServerType: type) type {
    return struct {
        pub fn call(self: *ServerType, params_value: ?std.json.Value) !void {
            const params = params_value orelse return;
            const obj = switch (params) {
                .object => |o| o,
                else => return,
            };

            const text_doc_value = obj.get("textDocument") orelse return;
            const text_doc = switch (text_doc_value) {
                .object => |o| o,
                else => return,
            };

            const uri_value = text_doc.get("uri") orelse return;
            const uri = switch (uri_value) {
                .string => |s| s,
                else => return,
            };

            const version_value = text_doc.get("version") orelse std.json.Value{ .integer = 0 };
            const version: i64 = switch (version_value) {
                .integer => |v| v,
                .float => |f| @intFromFloat(f),
                else => 0,
            };

            const changes_value = obj.get("contentChanges") orelse return;
            const changes = switch (changes_value) {
                .array => |arr| arr,
                else => return,
            };
            if (changes.items.len == 0) return;

            const last_change = changes.items[changes.items.len - 1];
            const change_obj = switch (last_change) {
                .object => |o| o,
                else => return,
            };
            const text_value = change_obj.get("text") orelse return;
            const text = switch (text_value) {
                .string => |s| s,
                else => return,
            };
            if (change_obj.get("range")) |range_value| {
                const range = parseRange(range_value) catch |err| {
                    std.log.err("invalid range for {s}: {s}", .{ uri, @errorName(err) });
                    return;
                };
                self.doc_store.applyRangeReplacement(uri, version, range, text) catch |err| {
                    std.log.err("failed to apply incremental change for {s}: {s}", .{ uri, @errorName(err) });
                };
            } else {
                self.doc_store.upsert(uri, version, text) catch |err| {
                    std.log.err("failed to apply full change for {s}: {s}", .{ uri, @errorName(err) });
                };
            }
        }

        fn parseRange(value: std.json.Value) !DocumentStore.Range {
            const range_obj = switch (value) {
                .object => |o| o,
                else => return error.InvalidRange,
            };
            const start_obj = switch (range_obj.get("start") orelse return error.InvalidRange) {
                .object => |o| o,
                else => return error.InvalidRange,
            };
            const end_obj = switch (range_obj.get("end") orelse return error.InvalidRange) {
                .object => |o| o,
                else => return error.InvalidRange,
            };
            return DocumentStore.Range{
                .start_line = parseIndex(start_obj, "line") catch return error.InvalidRange,
                .start_character = parseIndex(start_obj, "character") catch return error.InvalidRange,
                .end_line = parseIndex(end_obj, "line") catch return error.InvalidRange,
                .end_character = parseIndex(end_obj, "character") catch return error.InvalidRange,
            };
        }

        fn parseIndex(obj: std.json.ObjectMap, field: []const u8) !usize {
            const value = obj.get(field) orelse return error.MissingField;
            return switch (value) {
                .integer => |v| if (v < 0) error.InvalidField else @intCast(v),
                .float => |f| if (f < 0) error.InvalidField else @intFromFloat(f),
                else => return error.InvalidField,
            };
        }
    };
}
```
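For reference, an incremental `textDocument/didChange` notification this handler would accept looks like the following (shape per the LSP specification; values taken from this commit's tests):

```json
{
  "jsonrpc": "2.0",
  "method": "textDocument/didChange",
  "params": {
    "textDocument": { "uri": "file:///test.roc", "version": 2 },
    "contentChanges": [
      {
        "range": {
          "start": { "line": 0, "character": 6 },
          "end": { "line": 0, "character": 11 }
        },
        "text": "roc"
      }
    ]
  }
}
```

Note that the handler above only applies the last element of `contentChanges`; a change without a `range` field is treated as a full-document replacement.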
src/lsp/handlers/did_open.zig (43 lines, new file)
@@ -0,0 +1,43 @@
```zig
const std = @import("std");

/// Handler for `textDocument/didOpen` notifications.
pub fn handler(comptime ServerType: type) type {
    return struct {
        pub fn call(self: *ServerType, params_value: ?std.json.Value) !void {
            const params = params_value orelse return;
            const obj = switch (params) {
                .object => |o| o,
                else => return,
            };

            const text_doc_value = obj.get("textDocument") orelse return;
            const text_doc = switch (text_doc_value) {
                .object => |o| o,
                else => return,
            };

            const uri_value = text_doc.get("uri") orelse return;
            const uri = switch (uri_value) {
                .string => |s| s,
                else => return,
            };

            const text_value = text_doc.get("text") orelse return;
            const text = switch (text_value) {
                .string => |s| s,
                else => return,
            };

            const version_value = text_doc.get("version") orelse std.json.Value{ .integer = 0 };
            const version: i64 = switch (version_value) {
                .integer => |v| v,
                .float => |f| @intFromFloat(f),
                else => 0,
            };

            self.doc_store.upsert(uri, version, text) catch |err| {
                std.log.err("failed to open {s}: {s}", .{ uri, @errorName(err) });
            };
        }
    };
}
```
```diff
@@ -1,5 +1,6 @@
 const std = @import("std");
 const protocol = @import("../protocol.zig");
+const capabilities = @import("../capabilities.zig");

 /// Returns the `initialize` method handler for the LSP.
 pub fn handler(comptime ServerType: type) type {
```
```diff
@@ -20,10 +21,10 @@ pub fn handler(comptime ServerType: type) type {
             self.state = .waiting_for_initialized;

             const response = protocol.InitializeResult{
-                .capabilities = .{},
+                .capabilities = capabilities.buildCapabilities(),
                 .serverInfo = .{
                     .name = ServerType.server_name,
-                    .version = "0.1",
+                    .version = ServerType.version,
                 },
             };
```
```diff
@@ -12,4 +12,5 @@ test "lsp tests" {
     std.testing.refAllDecls(@import("test/protocol_test.zig"));
     std.testing.refAllDecls(@import("test/server_test.zig"));
     std.testing.refAllDecls(@import("test/transport_test.zig"));
+    std.testing.refAllDecls(@import("test/document_store_test.zig"));
 }
```
```diff
@@ -191,9 +191,7 @@ pub const ServerInfo = struct {
 };

 /// Capabilities advertised back to the editor.
-pub const ServerCapabilities = struct {
-    positionEncoding: []const u8 = "utf-16",
-};
+pub const ServerCapabilities = @import("capabilities.zig").ServerCapabilities;

 /// Response body returned after a successful initialization.
 pub const InitializeResult = struct {
```
```diff
@@ -2,8 +2,11 @@ const std = @import("std");
 const builtin = @import("builtin");
 const protocol = @import("protocol.zig");
 const makeTransport = @import("transport.zig").Transport;
+const DocumentStore = @import("document_store.zig").DocumentStore;
 const initialize_handler_mod = @import("handlers/initialize.zig");
 const shutdown_handler_mod = @import("handlers/shutdown.zig");
+const did_open_handler_mod = @import("handlers/did_open.zig");
+const did_change_handler_mod = @import("handlers/did_change.zig");

 const log = std.log.scoped(.roc_lsp_server);
```
```diff
@@ -14,19 +17,29 @@ pub fn Server(comptime ReaderType: type, comptime WriterType: type) type {
         const TransportType = makeTransport(ReaderType, WriterType);
         const HandlerFn = fn (*Self, *protocol.JsonId, ?std.json.Value) anyerror!void;
         const HandlerPtr = *const HandlerFn;
+        const NotificationFn = fn (*Self, ?std.json.Value) anyerror!void;
+        const NotificationPtr = *const NotificationFn;
         const InitializeHandler = initialize_handler_mod.handler(Self);
         const ShutdownHandler = shutdown_handler_mod.handler(Self);
         const request_handlers = std.StaticStringMap(HandlerPtr).initComptime(.{
             .{ "initialize", &InitializeHandler.call },
             .{ "shutdown", &ShutdownHandler.call },
         });
+        const DidOpenHandler = did_open_handler_mod.handler(Self);
+        const DidChangeHandler = did_change_handler_mod.handler(Self);
+        const notification_handlers = std.StaticStringMap(NotificationPtr).initComptime(.{
+            .{ "textDocument/didOpen", &DidOpenHandler.call },
+            .{ "textDocument/didChange", &DidChangeHandler.call },
+        });

         allocator: std.mem.Allocator,
         transport: TransportType,
         client: protocol.ClientState = .{},
         state: State = .waiting_for_initialize,
+        doc_store: DocumentStore,

         pub const server_name = "roc-lsp";
+        pub const version = "0.1";

         pub const State = enum {
             waiting_for_initialize,
```
```diff
@@ -41,12 +54,14 @@ pub fn Server(comptime ReaderType: type, comptime WriterType: type) type {
             return .{
                 .allocator = allocator,
                 .transport = TransportType.init(allocator, reader, writer, log_file),
+                .doc_store = DocumentStore.init(allocator),
             };
         }

         pub fn deinit(self: *Self) void {
             self.client.deinit(self.allocator);
             self.transport.deinit();
+            self.doc_store.deinit();
         }

         pub fn run(self: *Self) !void {
```
```diff
@@ -112,7 +127,7 @@ pub fn Server(comptime ReaderType: type, comptime WriterType: type) type {
             try self.sendError(id, .method_not_found, "method not implemented");
         }

-        fn handleNotification(self: *Self, method: []const u8, _: ?std.json.Value) !void {
+        fn handleNotification(self: *Self, method: []const u8, params: ?std.json.Value) !void {
             if (std.mem.eql(u8, method, "initialized")) {
                 if (self.state == .waiting_for_initialized) {
                     self.state = .running;
```
```diff
@@ -125,6 +140,13 @@ pub fn Server(comptime ReaderType: type, comptime WriterType: type) type {
                 return;
             }

+            if (notification_handlers.get(method)) |handler| {
+                handler(self, params) catch |err| {
+                    log.err("notification handler {s} failed: {s}", .{ method, @errorName(err) });
+                };
+                return;
+            }
+
             // Other notifications are ignored until server capabilities are implemented.
         }
```
```diff
@@ -166,6 +188,12 @@ pub fn Server(comptime ReaderType: type, comptime WriterType: type) type {
                 .result = result,
             });
         }
+
+        /// Returns the stored document (testing helper; returns null outside tests).
+        pub fn getDocumentForTesting(self: *Self, uri: []const u8) ?DocumentStore.Document {
+            if (!builtin.is_test) return null;
+            return self.doc_store.get(uri);
+        }
     };
 }
```
```diff
@@ -4,4 +4,5 @@ comptime {
     _ = @import("test/transport_test.zig");
     _ = @import("test/server_test.zig");
     _ = @import("test/protocol_test.zig");
+    _ = @import("test/document_store_test.zig");
 }
```
src/lsp/test/document_store_test.zig (31 lines, new file)
@@ -0,0 +1,31 @@
```zig
const std = @import("std");
const DocumentStore = @import("../document_store.zig").DocumentStore;

test "document store upserts and retrieves documents" {
    const allocator = std.testing.allocator;
    var store = DocumentStore.init(allocator);
    defer store.deinit();

    try store.upsert("file:///test", 1, "hello");
    const doc = store.get("file:///test") orelse return error.MissingDocument;
    try std.testing.expectEqual(@as(i64, 1), doc.version);
    try std.testing.expectEqualStrings("hello", doc.text);
}

test "document store applies incremental changes" {
    const allocator = std.testing.allocator;
    var store = DocumentStore.init(allocator);
    defer store.deinit();

    try store.upsert("file:///test", 1, "hello world");
    try store.applyRangeReplacement(
        "file:///test",
        2,
        .{ .start_line = 0, .start_character = 6, .end_line = 0, .end_character = 11 },
        "roc",
    );

    const doc = store.get("file:///test") orelse return error.MissingDocument;
    try std.testing.expectEqual(@as(i64, 2), doc.version);
    try std.testing.expectEqualStrings("hello roc", doc.text);
}
```
```diff
@@ -150,3 +150,39 @@ test "server rejects re-initialization requests" {
     const error_obj = parsed_error.value.object.get("error") orelse return error.ExpectedError;
     try std.testing.expect(error_obj.object.get("code").?.integer == @intFromEnum(protocol.ErrorCode.invalid_request));
 }
+
+test "server tracks documents on didOpen/didChange" {
+    const allocator = std.testing.allocator;
+    const open_msg = try frame(allocator,
+        \\{"jsonrpc":"2.0","method":"textDocument/didOpen","params":{"textDocument":{"uri":"file:///test.roc","version":1,"text":"app main = 0"}}}
+    );
+    defer allocator.free(open_msg);
+    const change_msg = try frame(allocator,
+        \\{"jsonrpc":"2.0","method":"textDocument/didChange","params":{"textDocument":{"uri":"file:///test.roc","version":2},"contentChanges":[{"text":"app main = 42","range":{"start":{"line":0,"character":0},"end":{"line":0,"character":12}}}]}}
+    );
+    defer allocator.free(change_msg);
+
+    var builder = std.ArrayList(u8){};
+    defer builder.deinit(allocator);
+    try builder.ensureTotalCapacity(allocator, open_msg.len + change_msg.len);
+    try builder.appendSlice(allocator, open_msg);
+    try builder.appendSlice(allocator, change_msg);
+    const combined = try builder.toOwnedSlice(allocator);
+    defer allocator.free(combined);
+
+    var reader_stream = std.io.fixedBufferStream(combined);
+    var writer_buffer: [32]u8 = undefined;
+    var writer_stream = std.io.fixedBufferStream(&writer_buffer);
+
+    const ReaderType = @TypeOf(reader_stream.reader());
+    const WriterType = @TypeOf(writer_stream.writer());
+    var server = try server_module.Server(ReaderType, WriterType).init(allocator, reader_stream.reader(), writer_stream.writer(), null);
+    defer server.deinit();
+    try server.run();
+
+    const maybe_doc = server.getDocumentForTesting("file:///test.roc");
+    try std.testing.expect(maybe_doc != null);
+    const doc = maybe_doc.?;
+    try std.testing.expectEqualStrings("app main = 42", doc.text);
+    try std.testing.expectEqual(@as(i64, 2), doc.version);
+}
```
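The test above builds its input with a `frame` helper that is defined elsewhere in the test file and not shown in this diff. Assuming it applies standard LSP base-protocol framing, its behavior can be sketched in Python:

```python
def frame(payload: str) -> bytes:
    """Wrap a JSON-RPC payload in LSP base-protocol framing (assumed behavior)."""
    body = payload.encode("utf-8")
    # A Content-Length header giving the body's byte length, a blank line, then the body.
    return b"Content-Length: " + str(len(body)).encode("ascii") + b"\r\n\r\n" + body


print(frame('{"jsonrpc":"2.0"}'))  # b'Content-Length: 17\r\n\r\n{"jsonrpc":"2.0"}'
```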