Compare commits

...

61 commits

Author SHA1 Message Date
Noah Santschi-Cooney
85cbb6d81e
release 0.9.9 2023-02-12 16:32:38 +00:00
Noah S-C
fb12c9b144
Merge pull request #47 from GeForceLegend/rust-rewrite
Compute shader with suffixes support and some small fix
2023-02-12 17:00:24 +01:00
GeForceLegend
4dd5542355 Fixed icons 2023-02-05 10:53:28 +08:00
GeForceLegend
78b6a6ef1d Added highlight support for csh with suffixes
Compute shader with _a to _z suffix can get highlighted now
2023-02-05 09:56:56 +08:00
GeForceLegend
b6da5c97fb Update to latest rust nightly
I'm not sure if these 2 packages are same since I can't find any document direct to std::lazy::OnceCell
2023-02-05 09:54:56 +08:00
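[Editor's note on the commit above: the unstable `std::lazy::OnceCell` that the message refers to was later moved to `std::cell::OnceCell` and stabilized in Rust 1.70 — the two are indeed the same single-initialization cell. A minimal sketch of its behavior:]

```rust
// std::cell::OnceCell is the stabilized home of the old unstable
// std::lazy::OnceCell: a cell that can be written to at most once.
use std::cell::OnceCell;

fn main() {
    let cell: OnceCell<u32> = OnceCell::new();
    assert!(cell.get().is_none());

    // First initialization wins...
    assert_eq!(*cell.get_or_init(|| 42), 42);

    // ...and later writes are rejected, returning the rejected value.
    assert_eq!(cell.set(7), Err(7));
    assert_eq!(cell.get(), Some(&42));
}
```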
Noah S-C
d1d1e2377b
Merge pull request #43 from TLATER/tlater/fix-eglot-crash
Fix crashes when run with eglot as a client
2022-08-18 00:52:05 -07:00
Tristan Daniël Maat
05e52fc8d0
Fix crashes when run with eglot as a client
This just handles ConfigurationDidChange events in which the "mcglsl"
key wasn't set without crashing, by ignoring them; if the
configuration changed, but none of our configuration changed, there's
nothing we need to update.

Obviously there is still a lot that can go wrong here, but this allows
running the language server with https://github.com/joaotavora/eglot,
as that language server sends a ConfigurationDidChange on startup if
any configuration is set (including for other language servers).
2022-08-18 04:14:36 +01:00
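[Editor's note: the guard this commit describes — ignore a configuration-change event whose payload lacks the `"mcglsl"` key — can be sketched roughly as below. The types are illustrative stand-ins, not the server's actual serde structures.]

```rust
use std::collections::HashMap;

// Illustrative model of a workspace/didChangeConfiguration payload:
// a map of per-server settings sections.
type Settings = HashMap<String, HashMap<String, String>>;

/// Return our settings section if present; `None` means "nothing of
/// ours changed, nothing to update" rather than a crash.
fn handle_config_change(settings: &Settings) -> Option<&HashMap<String, String>> {
    // eglot sends this notification on startup for *any* configured
    // server, so the "mcglsl" key may simply be absent.
    settings.get("mcglsl")
}

fn main() {
    // Event carrying only some other server's configuration.
    let mut foreign: Settings = HashMap::new();
    foreign.insert("gopls".into(), HashMap::new());
    assert!(handle_config_change(&foreign).is_none()); // ignored, no crash

    // Event carrying our configuration.
    let mut ours: Settings = HashMap::new();
    ours.insert("mcglsl".into(), HashMap::from([("logLevel".into(), "info".into())]));
    assert_eq!(handle_config_change(&ours).unwrap()["logLevel"], "info");
}
```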
Noah S-C
83c86aeff2
Merge pull request #42 from BalintCsala/top_level_check
Fixes top level check to account for mod-added dimensions
2022-04-29 20:11:06 +01:00
Bálint
0768abb122 Replaced slash conversion with lib function 2022-04-29 19:02:15 +02:00
Bálint
2c2dbfb3e3 Merge branch 'top_level_check' of https://github.com/BalintCsala/mcshader-lsp into top_level_check 2022-04-29 08:08:20 +02:00
Bálint
f8cc2eed22 Fixed issue with modded worlds 2022-04-29 08:07:52 +02:00
Bálint
c737409fde Doesn't detect vsh files yet 2022-04-29 08:04:34 +02:00
Noah Santschi-Cooney
d8d77ac600
release 0.9.8 2022-04-24 22:38:20 +01:00
Noah Santschi-Cooney
941822c5c7
hardcode top-level file set
avoids unimported non-toplevel fsh/vsh/etc files being treated as toplevel
2022-04-24 22:17:20 +01:00
Noah Santschi-Cooney
3b568ea087
fix dfs iterator when same file imported multiple times into another 2022-04-24 22:16:45 +01:00
Noah Santschi-Cooney
27d1d7b34e
fix nvidia diagnostics line offset, once and for all (hopefully) 2022-04-24 21:53:21 +01:00
Noah Santschi-Cooney
3c58af95fa
fix logo images because gitattributes treated them as text 2022-04-24 00:57:37 +01:00
Noah Santschi-Cooney
f45e1a4b87
v0.9.7: the real 0.9.6 2022-04-24 00:05:13 +01:00
Noah Santschi-Cooney
d43bfec582
v0.9.6: now with fixed CI 2022-04-23 23:15:45 +01:00
Noah Santschi-Cooney
a7cbaa198b
v0.9.6 2022-04-23 22:33:21 +01:00
Noah Santschi-Cooney
fecc41168a
fix include merging for when a file imports another file more than once directly 2022-04-18 01:08:53 +01:00
Noah Santschi-Cooney
1529460a5c
support document symbols request 2022-04-16 00:21:10 +01:00
Noah Santschi-Cooney
f66f56603a
implemented file-local go-to-def for variables 2022-04-10 20:54:23 +01:00
Noah Santschi-Cooney
d8cb0465ef
implemented file-local find-refs for functions 2022-04-10 00:28:12 +01:00
Noah Santschi-Cooney
3b865dfda2
updated AMD diagnostics regex 2022-04-09 18:11:50 +01:00
Noah Santschi-Cooney
d3365c3bff
big heckin reworkerino to use the standard #line directive as per spec 2022-04-03 21:31:18 +01:00
Noah Santschi-Cooney
cb7c9b8b49
(probably for AMD) enable 'GL_GOOGLE_cpp_style_line_directive' in preamble 2022-03-25 01:30:48 +00:00
Noah Santschi-Cooney
b4a0636d43
replace (NodeIndex, Option<NodeIndex>) with FilialTuple globally 2022-03-25 01:29:01 +00:00
Noah Santschi-Cooney
9a499d581b
move tests to their respective files 2022-03-24 22:33:51 +00:00
Noah Santschi-Cooney
3957eaed17
go-to-def for same-file functions from function calls working 2022-03-21 21:58:11 +00:00
Noah Santschi-Cooney
616b7cef74
revert bumping vscode-languageclient, broke restarting 2022-03-20 15:17:48 +00:00
Noah Santschi-Cooney
d5b0dcffb2
wip 2022-03-20 01:28:56 +00:00
Noah Santschi-Cooney
f8dd31ca81
extract out diagnostics parser. pls submit AMD lint output in github issues 2022-03-19 23:27:33 +00:00
Noah Santschi-Cooney
d3c0869288
tree-sitter fun 👀 2022-03-19 22:58:01 +00:00
Noah Santschi-Cooney
86100aa008
chore: reorganizing lsp commands 2022-03-19 20:26:30 +00:00
Noah Santschi-Cooney
e001b4a8b1
fixup client imports 2022-03-18 19:22:17 +00:00
Noah Santschi-Cooney
9a9ed21f13
allow configurable log level 2022-03-18 19:11:48 +00:00
Noah Santschi-Cooney
ebab8c899a
formatting and adding a buncha log points 2022-03-18 19:00:42 +00:00
Noah Santschi-Cooney
7cf009ee61
added base logging infrastructure, more to come SOON 2022-03-15 01:01:50 +00:00
Noah Santschi-Cooney
b775bd2cd5
cargo: bump dependency versions 2022-03-14 23:58:49 +00:00
Noah Santschi-Cooney
a8f00fe927
npm: bump dependency versions 2022-03-14 23:12:40 +00:00
Noah Santschi-Cooney
79e107b748
add rustfmt configuration 2022-03-14 23:06:58 +00:00
Noah Santschi-Cooney
cccb5e25f7
corporate shilling: add LSIF upload action 2022-03-14 23:04:57 +00:00
Noah Santschi-Cooney
554777d0da
allow skipping bootstrap step and expose github actions extension build in CI on every push 2022-01-02 23:20:37 +00:00
Noah Santschi-Cooney
5747a9d9b1
attempt #4 for M1 Mac build: correct artifact path and skip M1 tests because not running on M1 machine 2022-01-02 22:30:32 +00:00
Noah Santschi-Cooney
57f4b7924b
attempt #3 for M1 Mac build: rust toolchains target edition 2022-01-02 22:15:21 +00:00
Noah Santschi-Cooney
5fd02b06f4
attempt #2 for M1 Mac build: correct targets edition 2022-01-02 22:06:28 +00:00
Noah Santschi-Cooney
65422c863f
fix failed compile in 3884126697 2022-01-02 21:59:44 +00:00
Noah Santschi-Cooney
734f0b014b
attempt #1 for M1 Mac build 2022-01-02 21:57:07 +00:00
Noah Santschi-Cooney
c7d8b02ee3
fixed all lint warnings 2021-03-05 17:48:55 +00:00
Noah Santschi-Cooney
db5e5afb26
compute shader support apparently? not tested pls no bulli 2021-03-05 17:26:51 +00:00
Noah Santschi-Cooney
b649aeb1f6
v0.9.5 2021-02-19 01:23:17 +00:00
Noah Santschi-Cooney
248afcd988
adds custom file association support to filesystem event watcher 2021-02-19 01:11:36 +00:00
Noah Santschi-Cooney
c854093a96
changelog 2021-02-19 01:07:57 +00:00
Noah Santschi-Cooney
551380a6ed
fixes merger for when file is merged twice
This happens in projects where an #include is normally guarded by #ifdef.
File A conditionally includes either B or C based on an #ifdef.
Both B and C include D, so D gets merged twice.
Previously, the offset to which a file view has been created is stored globally.
This means that the offset would be reused on a fresh include in the above situation.
Solved by storing offsets keyed by the NodeIndex of the parent file that is including and the NodeIndex of the file being included.
2021-02-19 01:02:09 +00:00
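[Editor's note: the keying change this message describes — storing merge offsets per (including file, included file) pair instead of globally — can be sketched as below. The types are hypothetical stand-ins for the server's include graph.]

```rust
use std::collections::HashMap;

// Stand-in for petgraph's node handle.
type NodeIndex = usize;

#[derive(Default)]
struct Merger {
    // Offset consumed so far, keyed by the (parent, child) include edge
    // rather than by the included file alone.
    offsets: HashMap<(NodeIndex, NodeIndex), usize>,
}

impl Merger {
    /// Record `bytes` merged from `child` into `parent`, returning the
    /// offset at which this merge starts.
    fn advance(&mut self, parent: NodeIndex, child: NodeIndex, bytes: usize) -> usize {
        let off = self.offsets.entry((parent, child)).or_insert(0);
        let start = *off;
        *off += bytes;
        start
    }
}

fn main() {
    let (b, c, d): (NodeIndex, NodeIndex, NodeIndex) = (1, 2, 3);
    let mut m = Merger::default();
    // D is merged once under B and once under C: each edge starts fresh.
    assert_eq!(m.advance(b, d, 10), 0);
    assert_eq!(m.advance(c, d, 10), 0); // would wrongly be 10 with a global offset
    // The same edge seen again does accumulate, as intended.
    assert_eq!(m.advance(b, d, 10), 10);
}
```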
Noah Santschi-Cooney
9a770f69a4
v0.9.4 2021-02-13 18:39:29 +00:00
Noah Santschi-Cooney
22deb53ecd
dont add to graph or lint files out-of-tree 2021-02-13 18:28:36 +00:00
Noah Santschi-Cooney
3bfa7a2cc4
stops bootstraping of server when in debug mode 2021-02-13 18:28:09 +00:00
Noah Santschi-Cooney
72ea905413
fixes merger for mixed CRLF/LF projects 2021-02-13 18:27:42 +00:00
Noah Santschi-Cooney
fabbc68fd7
v0.9.3 2021-02-12 13:49:57 +00:00
Noah Santschi-Cooney
30bd0dd7f4
v0.9.2 2021-02-12 03:27:18 +00:00
83 changed files with 10993 additions and 5288 deletions

.gitattributes (vendored, 3 lines changed)

@@ -1 +1,2 @@
* text eol=lf
* text eol=lf
*.png binary

.github/workflows/extension.yml (new file, 21 lines)

@@ -0,0 +1,21 @@
name: Build Extension
on:
push:
branches: [ rust-rewrite ]
pull_request:
branches: [ rust-rewrite ]
jobs:
build-vscode-extension:
runs-on: ubuntu-20.04
steps:
- uses: actions/checkout@v2
- run: npm i
- uses: HaaLeo/publish-vscode-extension@v0
id: vsce_build
with:
pat: 'sample text'
dryRun: true
- uses: actions/upload-artifact@v2
with:
name: vscode-mc-shader.vsix
path: ${{ steps.vsce_build.outputs.vsixPath }}

.github/workflows/lsif.yml (new file, 14 lines)

@@ -0,0 +1,14 @@
name: LSIF
on:
- push
jobs:
index:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- name: Generate LSIF data
uses: sourcegraph/lsif-rust-action@main
- name: Upload LSIF data
uses: sourcegraph/lsif-upload-action@master
with:
github_token: ${{ secrets.GITHUB_TOKEN }}


@@ -35,12 +35,19 @@ jobs:
- os: ubuntu-18.04
target: x86_64-unknown-linux-gnu
dir: server/mcshader-lsp
artifact: x86_64-unknown-linux-gnu
- os: windows-latest
target: x86_64-windows-msvc.exe
target: x86_64-pc-windows-msvc
dir: server/mcshader-lsp.exe
- os: macos-10.15
artifact: x86_64-windows-msvc.exe
- os: macos-11
target: x86_64-apple-darwin
dir: server/mcshader-lsp
artifact: x86_64-apple-darwin
- os: macos-11
target: aarch64-apple-darwin
dir: server/mcshader-lsp
artifact: aarch64-apple-darwin
steps:
- uses: actions/checkout@v2
- name: Install latest nightly
@@ -48,8 +55,10 @@ jobs:
with:
toolchain: nightly
default: true
target: ${{ matrix.platforms.target }}
override: true
- name: Build server
run: cargo build --release --out-dir . -Z unstable-options
run: cargo build --release --target ${{ matrix.platforms.target }} --out-dir . -Z unstable-options
- name: Upload release file
uses: actions/upload-release-asset@v1
env:
@@ -57,7 +66,7 @@ jobs:
with:
upload_url: ${{ needs.empty-release.outputs.upload_url }}
asset_path: ${{ matrix.platforms.dir }}
asset_name: mcshader-lsp-${{ matrix.platforms.target }}
asset_name: mcshader-lsp-${{ matrix.platforms.artifact }}
asset_content_type: application/octet-stream
release-vscode-extension:
runs-on: ubuntu-20.04


@@ -8,16 +8,44 @@ env:
CARGO_TERM_COLOR: always
jobs:
build-and-test:
runs-on: ${{ matrix.os }}
runs-on: ${{ matrix.platforms.os }}
defaults:
run:
working-directory: server
strategy:
matrix:
os: [ ubuntu-20.04, windows-latest, macos-10.15 ]
platforms:
- os: ubuntu-18.04
target: x86_64-unknown-linux-gnu
dir: server/mcshader-lsp
artifact: x86_64-unknown-linux-gnu
- os: windows-latest
target: x86_64-pc-windows-msvc
dir: server/mcshader-lsp.exe
artifact: x86_64-windows-msvc.exe
- os: macos-11
target: x86_64-apple-darwin
dir: server/mcshader-lsp
artifact: x86_64-apple-darwin
- os: macos-11
target: aarch64-apple-darwin
dir: server/mcshader-lsp
artifact: aarch64-apple-darwin
steps:
- uses: actions/checkout@v2
- name: Install latest nightly
uses: actions-rs/toolchain@v1
with:
toolchain: nightly
default: true
target: ${{ matrix.platforms.target }}
override: true
- name: Build server
run: cargo build
run: cargo build --target ${{ matrix.platforms.target }} --out-dir . -Z unstable-options
- uses: actions/upload-artifact@v2
with:
name: mcshader-lsp-${{ matrix.platforms.artifact }}
path: ${{ matrix.platforms.dir }}
- name: Run tests
run: cargo test
run: cargo test --target ${{ matrix.platforms.target }}
if: ${{ matrix.platforms.target != 'aarch64-apple-darwin' }}

.rustfmt.toml (new file, 3 lines)

@@ -0,0 +1,3 @@
edition = "2021"
fn_args_layout = "compressed"
max_width = 140


@@ -4,6 +4,106 @@ All notable changes to the "vscode-mc-shader" extension will be documented in th
The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/)
## [0.9.9]
### Added
- Support for mod world folders, outside the standard world{-1,0,1}.
- Support for compute shader files ending in \_a to \_z.
### Fixed
- Crash when running with eglot as LSP client.
- Extension icon client not displaying (encoding issue).
## [0.9.8]
### Fixed
- NVIDIA diagnostics line offset off-by-one due to confusion with erroneous (non-proper) GLSL files resulting in both -1 and -2 offsets appearing to be valid when only the former is.
- Non-toplevel files being treated as toplevel files when they have .fsh/.vsh/etc and not imported into a valid toplevel file.
- Fix issue in the depth-first-search iterator when a file is imported twice into another file with a different include in between.
## [0.9.7]
### Fixed
- Fixed bad release tag format
- Fixed extension silently failing on activation
## [0.9.6]
### Added
- MacOS M1 binary releases
- AMD OpenGL driver diagnostics output support. AMD linting is a-go 🚀
- Tree-sitter based go-to-definition/find-references/document symbols. Currently disabled until stabilized
### Fixed
- Another `#include` merging bug when a file is imported twice into another file at different lines
## [0.9.5]
### Added
- Filesystem watcher reads custom defined file associations
### Fixed
- Fixed `#include` merging for when file is merged twice that would normally be `#ifdef` guarded. Please see commit message of [551380a](https://github.com/Strum355/mcshader-lsp/commit/551380a6ed00709287460b7d8c88e7803956052c) for detailed explanation
## [0.9.4]
### Fixed
- `#include` merging when project consists of files with both CRLF and LF files
- Out-of-tree shader files are not linted or added to the dependency graph
- Client no longer attempts to bootstrap server when `MCSHADER_DEBUG=true`
## [0.9.3]
### Fixed
- Language server download for windows
## [0.9.2]
### Changed
- VSCode extension activation predicate to only when `shaders` folder exists at top level
### Added
- Additional client-side logging
## [0.9.1]
### Fixed
- Windows support in client not adding `.exe` to language server path
- Binary release CI
## [0.9.0]
### Changed
- Replaced in-process Typescript language server with Rust based language server
### Fixed
- Due to the above, `#include` directive handling is vastly improved
### Added
- Command to view read-only document representing a top-level file with all includes merged
- Command to generate a DOT graph file of the entire project
- Command to restart language server
### Removed
- `glslangValidatorPath` and `shaderpacksPath` config settings
## [0.8.5]
### Fixed
@@ -33,4 +133,4 @@ The format is based on [Keep a Changelog](http://keepachangelog.com/en/1.0.0/)
- Support for #include directives
- Basic linting with highlighting, with error propagation to all known parents of an include.
- Support for .fsh, .vsh, .glsl and .gsh files.
- Incomplete completion items
- Incomplete completion items


@@ -1,7 +1,7 @@
# Minecraft GLSL Shaders Language Server
## mcshader-lsp
[![Marketplace Version](https://vsmarketplacebadge.apphb.com/version/strum355.vscode-mc-shader.svg)](https://marketplace.visualstudio.com/items?itemName=strum355.vscode-mc-shader) [![Installs](https://vsmarketplacebadge.apphb.com/installs/strum355.vscode-mc-shader.svg)](https://marketplace.visualstudio.com/items?itemName=strum355.vscode-mc-shader)
[![Marketplace Version](https://img.shields.io/visual-studio-marketplace/v/strum355.vscode-mc-shader.svg)](https://marketplace.visualstudio.com/items?itemName=strum355.vscode-mc-shader) [![Installs](https://img.shields.io/visual-studio-marketplace/i/strum355.vscode-mc-shader.svg)](https://marketplace.visualstudio.com/items?itemName=strum355.vscode-mc-shader)
[![license](https://img.shields.io/github/license/Strum355/vscode-mc-shader.svg)](https://github.com/Strum355/mcshader-lsp)
[![Issues](https://img.shields.io/github/issues-raw/Strum355/mcshader-lsp.svg)](https://github.com/Strum355/mcshader-lsp/issues)
[![Build Status](https://img.shields.io/drone/build/Strum355/mcshader-lsp)](https://cloud.drone.io/Strum355/mcshader-lsp)
@@ -12,7 +12,7 @@ Currently supported editors:
- [Visual Studio Code](https://code.visualstudio.com/) with `vscode-mc-shader`
<img src="https://github.com/Strum355/mcshader-lsp/raw/master/logo.png" width="20%" height="20%">
<img src="https://github.com/Strum355/mcshader-lsp/raw/rust-rewrite/logo.png" width="20%" height="20%">
## Features

client/package-lock.json (generated, 1312 lines changed)

File diff suppressed because it is too large.


@@ -5,16 +5,18 @@
"rollup": "rollup -c"
},
"dependencies": {
"adm-zip": "^0.4.14",
"node-fetch": "^2.6.0",
"vscode-languageclient": "^6.1.3"
"@rollup/plugin-json": "^4.1.0",
"adm-zip": "^0.5.9",
"encoding": "^0.1.13",
"node-fetch": "^2.6.7",
"vscode-languageclient": "^6.1.4"
},
"devDependencies": {
"rollup": "^2.38.1",
"@rollup/plugin-commonjs": "^17.1.0",
"@rollup/plugin-node-resolve": "^11.1.1",
"@types/vscode": "^1.47.0",
"@types/adm-zip": "^0.4.32",
"@types/node-fetch": "^2.5.4"
"@rollup/plugin-commonjs": "^21.0.2",
"@rollup/plugin-node-resolve": "^13.1.3",
"@types/adm-zip": "^0.4.34",
"@types/node-fetch": "^2.6.1",
"@types/vscode": "^1.65.0",
"rollup": "^2.70.1"
}
}


@@ -1,11 +1,13 @@
import resolve from '@rollup/plugin-node-resolve';
import commonjs from '@rollup/plugin-commonjs';
import json from '@rollup/plugin-json';
import nodeBuiltins from 'builtin-modules';
/** @type { import('rollup').RollupOptions } */
export default {
input: 'out/extension.js',
plugins: [
json(),
resolve({
preferBuiltins: true
}),


@ -1,3 +1,4 @@
import path = require('path')
import * as vscode from 'vscode'
import * as lsp from 'vscode-languageclient'
import { Extension } from './extension'
@@ -30,7 +31,7 @@ export function virtualMergedDocument(e: Extension): Command {
command: 'virtualMerge',
arguments: [path]
})
} catch(e) {}
} catch (e) { }
return content
}
@@ -40,17 +41,67 @@ export function virtualMergedDocument(e: Extension): Command {
onDidChange = this.onDidChangeEmitter.event
provideTextDocumentContent(uri: vscode.Uri, __: vscode.CancellationToken): vscode.ProviderResult<string> {
return getVirtualDocument(uri.path)
return getVirtualDocument(uri.path.replace('.flattened' + path.extname(uri.path), path.extname(uri.path)))
}
}
e.context.subscriptions.push(vscode.workspace.registerTextDocumentContentProvider('mcglsl', docProvider))
return async () => {
const uri = vscode.window.activeTextEditor.document.uri
const path = vscode.Uri.parse('mcglsl:' + uri.path)
if (vscode.window.activeTextEditor.document.languageId != 'glsl') return
const uri = vscode.window.activeTextEditor.document.uri.path
.substring(0, vscode.window.activeTextEditor.document.uri.path.lastIndexOf('.'))
+ '.flattened.'
+ vscode.window.activeTextEditor.document.uri.path
.slice(vscode.window.activeTextEditor.document.uri.path.lastIndexOf('.') + 1)
const path = vscode.Uri.parse(`mcglsl:${uri}`)
const doc = await vscode.workspace.openTextDocument(path)
docProvider.onDidChangeEmitter.fire(path)
await vscode.window.showTextDocument(doc, {preview: true})
await vscode.window.showTextDocument(doc, {
viewColumn: vscode.ViewColumn.Two,
preview: true
})
}
}
export function parseTree(e: Extension): Command {
const getVirtualDocument = async (path: string): Promise<string | null> => {
let content: string = ''
try {
content = await e.lspClient.sendRequest<string>(lsp.ExecuteCommandRequest.type.method, {
command: 'parseTree',
arguments: [path]
})
} catch (e) { }
return content
}
const docProvider = new class implements vscode.TextDocumentContentProvider {
onDidChangeEmitter = new vscode.EventEmitter<vscode.Uri>()
onDidChange = this.onDidChangeEmitter.event
provideTextDocumentContent(uri: vscode.Uri, _: vscode.CancellationToken): vscode.ProviderResult<string> {
if (uri.path.includes('.flattened.')) return ''
return getVirtualDocument(uri.path.substring(0, uri.path.lastIndexOf('.')))
}
}
e.context.subscriptions.push(vscode.workspace.registerTextDocumentContentProvider('mcglsl', docProvider))
return async () => {
if (vscode.window.activeTextEditor.document.languageId != 'glsl') return
const uri = vscode.window.activeTextEditor.document.uri
const path = vscode.Uri.parse(`mcglsl:${uri.path}.ast`)
const doc = await vscode.workspace.openTextDocument(path)
docProvider.onDidChangeEmitter.fire(path)
await vscode.window.showTextDocument(doc, {
viewColumn: vscode.ViewColumn.Two,
preview: true
})
}
}


@@ -6,12 +6,13 @@ import { log } from './log'
import { LanguageClient } from './lspClient'
import { download, getReleaseInfo } from './net'
import { PersistentState } from './persistent_state'
import * as path from 'path'
import * as path from 'path'
const platforms: { [key: string]: string } = {
'x64 win32': 'x86_64-pc-windows-msvc',
'x64 win32': 'x86_64-windows-msvc',
'x64 linux': 'x86_64-unknown-linux-gnu',
'x64 darwin': 'x86_64-apple-darwin',
'arm64 darwin': 'aarch64-apple-darwin'
}
export class Extension {
@@ -24,8 +25,8 @@ export class Extension {
readonly package: {
version: string
} = vscode.extensions.getExtension(this.extensionID)!.packageJSON;
} = vscode.extensions.getExtension(this.extensionID)!.packageJSON
public get context(): vscode.ExtensionContext {
return this.extensionContext
}
@@ -33,36 +34,67 @@ export class Extension {
public get lspClient(): lsp.LanguageClient {
return this.client
}
public activate = async (context: vscode.ExtensionContext) => {
this.extensionContext = context
this.state = new PersistentState(context.globalState)
await this.bootstrap()
if (!process.env['MCSHADER_DEBUG'] && !(vscode.workspace.getConfiguration('mcglsl').get('skipBootstrap') as boolean)) {
await this.bootstrap()
} else {
log.info('skipping language server bootstrap')
}
this.registerCommand('graphDot', commands.generateGraphDot)
this.registerCommand('restart', commands.restartExtension)
this.registerCommand('virtualMerge', commands.virtualMergedDocument)
this.registerCommand('parseTree', commands.parseTree)
log.info('starting language server...')
this.client = await new LanguageClient(this).startServer()
const lspBinary = process.env['MCSHADER_DEBUG'] ?
this.context.asAbsolutePath(path.join('server', 'target', 'debug', 'mcshader-lsp')) +
(process.platform === 'win32' ? '.exe' : '') :
path.join(this.context.globalStorageUri.fsPath, 'mcshader-lsp')
const filewatcherGlob = this.fileAssociationsToGlob(this.getGLSLFileAssociations())
this.client = await new LanguageClient(this, lspBinary, filewatcherGlob).startServer()
log.info('language server started!')
}
fileAssociationsToGlob = (associations: string[]): string => {
return '**/*.{'.concat(
associations.map(s => s.substring(s.indexOf('.'))).join(',')
) + '}'
}
getGLSLFileAssociations = (): string[] => {
const exts = ['.fsh', '.vsh', '.gsh', '.glsl']
const associations = vscode.workspace.getConfiguration('files').get('associations') as { [key: string]: string }
Object.keys(associations).forEach((key) => {
if (associations[key] === 'glsl') {
exts.push(key.substring(key.indexOf('*') + 1))
}
})
return exts
}
registerCommand = (name: string, f: (e: Extension) => commands.Command) => {
const cmd = f(this)
this.context.subscriptions.push(vscode.commands.registerCommand('mcglsl.'+name, cmd))
this.context.subscriptions.push(vscode.commands.registerCommand('mcglsl.' + name, cmd))
}
deactivate = async () => {
deactivate = async () => {
await this.lspClient.stop()
while(this.context.subscriptions.length > 0) {
while (this.context.subscriptions.length > 0) {
this.context.subscriptions.pop()?.dispose()
}
}
public updateStatus = (icon: string, text: string) => {
this.statusBarItem?.dispose()
this.statusBarItem = vscode.window.createStatusBarItem(vscode.StatusBarAlignment.Left)
@@ -70,42 +102,59 @@ export class Extension {
this.statusBarItem.show()
this.context.subscriptions.push(this.statusBarItem)
}
public clearStatus = () => {
this.statusBarItem?.dispose()
}
private bootstrap = async () => {
mkdirSync(this.extensionContext.globalStoragePath, { recursive: true })
const dest = path.join(this.extensionContext.globalStoragePath, 'mcshader-lsp' + (process.platform === 'win32' ? '.exe' : ''))
const exists = await fs.stat(dest).then(() => true, () => false)
if (!exists) await this.state.updateServerVersion(undefined)
const release = await getReleaseInfo(this.package.version)
log.info('got release info from Github:\n\t', JSON.stringify(release))
const platform = platforms[`${process.arch} ${process.platform}`]
if (platform === undefined) {
vscode.window.showErrorMessage('Unfortunately we don\'t ship binaries for your platform yet.')
log.warn(`incompatible architecture/platform:\n\t${process.arch} ${process.platform}`)
return
}
if (release.tag_name === this.state.serverVersion) {
log.info('server version is same as extension:\n\t', this.state.serverVersion)
return
}
if (release.tag_name === this.state.serverVersion) return
const artifact = release.assets.find(artifact => artifact.name === `mcshader-lsp-${platform}${(process.platform === 'win32' ? '.exe' : '')}`)
log.info(`artifact with url ${artifact.browser_download_url} found`)
const userResponse = await vscode.window.showInformationMessage(
this.state.serverVersion == undefined ?
`Language server version ${this.package.version} is not installed.` :
`An update is available. Upgrade from ${this.state.serverVersion} to ${release.tag_name}?`,
`Language server version ${this.package.version} is not installed.` :
`An update is available. Upgrade from ${this.state.serverVersion} to ${release.tag_name}?`,
'Download now'
)
if (userResponse !== 'Download now') return
if (userResponse !== 'Download now') {
log.info('user chose not to download server...')
return
}
await download(artifact.browser_download_url, dest)
this.state.updateServerVersion(release.tag_name)
}
}
export const activate = new Extension().activate
export const activate = async (context: vscode.ExtensionContext) => {
try {
new Extension().activate(context)
} catch (e) {
log.error(`failed to activate extension: ${e}`)
throw(e)
}
}


@@ -9,33 +9,33 @@ export const log = new class {
// Hint: the type [T, ...T[]] means a non-empty array
debug(...msg: [unknown, ...unknown[]]): void {
log.write('DEBUG', ...msg)
log.write('DEBUG', ...msg)
}
info(...msg: [unknown, ...unknown[]]): void {
log.write('INFO ', ...msg)
log.write('INFO ', ...msg)
}
warn(...msg: [unknown, ...unknown[]]): void {
log.write('WARN ', ...msg)
log.write('WARN ', ...msg)
}
error(...msg: [unknown, ...unknown[]]): void {
log.write('ERROR', ...msg)
log.write('ERROR', ...msg)
}
write(label: string, ...messageParts: unknown[]): void {
const message = messageParts.map(log.stringify).join(' ')
const dateTime = new Date().toLocaleString()
log.output.appendLine(`${label} [${dateTime}]: ${message}`)
const message = messageParts.map(log.stringify).join(' ')
const dateTime = new Date().toLocaleString()
log.output.appendLine(`${label} [${dateTime}]: ${message}`)
}
private stringify(val: unknown): string {
if (typeof val === 'string') return val
if (typeof val === 'string') return val
return inspect(val, {
colors: false,
depth: 6, // heuristic
})
colors: false,
depth: 6, // heuristic
})
}
}


@@ -1,38 +1,37 @@
import * as path from 'path'
import { ConfigurationTarget, workspace } from 'vscode'
import * as lsp from 'vscode-languageclient'
import { Extension } from './extension'
import { lspOutputChannel } from './log'
import { log, lspOutputChannel } from './log'
import { ConfigUpdateParams, statusMethod, StatusParams, updateConfigMethod } from './lspExt'
export class LanguageClient extends lsp.LanguageClient {
private extension: Extension
constructor(ext: Extension) {
constructor(ext: Extension, lspBinary: string, filewatcherGlob: string) {
super('vscode-mc-shader', 'VSCode MC Shader', {
command: process.env['MCSHADER_DEBUG'] ?
ext.context.asAbsolutePath(path.join('server', 'target', 'debug', 'mcshader-lsp')) +
(process.platform === 'win32' ? '.exe' : '') :
path.join(ext.context.globalStoragePath, 'mcshader-lsp')
command: lspBinary
}, {
documentSelector: [{scheme: 'file', language: 'glsl'}],
documentSelector: [{ scheme: 'file', language: 'glsl' }],
outputChannel: lspOutputChannel,
synchronize: {
configurationSection: 'mcglsl',
fileEvents: workspace.createFileSystemWatcher('**/*.{fsh,gsh,vsh,glsl,inc}')
fileEvents: workspace.createFileSystemWatcher(filewatcherGlob)
},
})
this.extension = ext
log.info('server receiving events for file glob:\n\t', filewatcherGlob)
log.info('running with binary at path:\n\t', lspBinary)
}
public startServer = async (): Promise<LanguageClient> => {
this.extension.context.subscriptions.push(this.start())
await this.onReady()
this.onNotification(updateConfigMethod, this.onUpdateConfig)
this.onNotification(statusMethod, this.onStatusChange)
return this
}


@@ -12,5 +12,5 @@ export const status = new lsp.NotificationType<StatusParams>(statusMethod)
export const updateConfigMethod = 'mc-glsl/updateConfig'
export type ConfigUpdateParams = {
kv: {key: string, value: string}[]
kv: { key: string, value: string }[]
}


@@ -16,20 +16,21 @@ interface GithubRelease {
}
export async function getReleaseInfo(releaseTag: string): Promise<GithubRelease> {
log.info('fetching release info for tag', releaseTag)
const response = await fetch(`https://api.github.com/repos/strum355/mcshader-lsp/releases/tags/${releaseTag}`, {
headers: {Accept: 'application/vnd.github.v3+json'}
headers: { Accept: 'application/vnd.github.v3+json' }
})
const isRelease = (obj: unknown): obj is GithubRelease => {
return obj != null && typeof obj === 'object'
return obj != null && typeof obj === 'object'
&& typeof (obj as GithubRelease).tag_name === 'string'
&& Array.isArray((obj as GithubRelease).assets)
&& (obj as GithubRelease).assets.every((a) => typeof a.name === 'string' && typeof a.browser_download_url === 'string')
}
const json = await response.json()
if(!isRelease(json)) {
throw new TypeError('Received malformed request from Github Release API')
if (!isRelease(json)) {
throw new TypeError(`Received malformed request from Github Release API ${JSON.stringify(json)}`)
}
return json
}
@@ -50,14 +51,14 @@ export async function download(url: string, downloadDest: string) {
message: `${newPercentage.toFixed(0)}%`,
increment: newPercentage - lastPercentage
})
lastPercentage = newPercentage
}
})
}
)
}
async function downloadFile(
url: string,
destFilePath: fs.PathLike,
@@ -69,21 +70,21 @@ async function downloadFile(
log.error({ body: await res.text(), headers: res.headers })
throw new Error(`Got response ${res.status} when trying to download ${url}.`)
}
const totalBytes = Number(res.headers.get('content-length'))
log.debug('downloading file of', totalBytes, 'bytes size from', url, 'to', destFilePath)
log.debug('downloading file with', totalBytes, 'bytes size from', url, 'to', destFilePath)
let readBytes = 0
res.body.on('data', (chunk: Buffer) => {
readBytes += chunk.length
onProgress(readBytes, totalBytes)
})
const destFileStream = fs.createWriteStream(destFilePath, { mode: 0o755 })
await pipeline(res.body, destFileStream)
// Don't apply the workaround in fixed versions of nodejs, since the process
// freezes on them, the process waits for no-longer emitted `close` event.
// The fix was applied in commit 7eed9d6bcc in v13.11.0
@@ -91,7 +92,7 @@ async function downloadFile(
// https://github.com/nodejs/node/blob/master/doc/changelogs/CHANGELOG_V13.md
const [, major, minor] = /v(\d+)\.(\d+)\.(\d+)/.exec(process.version)!
if (+major > 13 || (+major === 13 && +minor >= 11)) return
await new Promise<void>(resolve => {
destFileStream.on('close', resolve)
destFileStream.destroy()


@@ -12,6 +12,6 @@ export class PersistentState {
}
async updateServerVersion(value: string | undefined) {
await this.state.update('serverVersion', value)
await this.state.update('serverVersion', value)
}
}

logo-min.png (new binary file, not shown)

After: 24 KiB

logo-mini.png (new binary file, not shown)

logo.png (binary file changed, not shown)

Before: 1.1 MiB
After: 1.1 MiB


package-lock.json (generated, 6766 lines changed)

File diff suppressed because it is too large.


@@ -2,16 +2,16 @@
"name": "vscode-mc-shader",
"displayName": "Minecraft GLSL Shaders",
"description": "A Visual Studio Code extension for linting/etc Minecraft GLSL Shaders",
"version": "0.9.1",
"version": "0.9.9",
"publisher": "Strum355",
"author": "Noah Santschi-Cooney (Strum355)",
"license": "MIT",
"icon": "logo.png",
"icon": "logo-min.png",
"repository": {
"url": "https://github.com/Strum355/vscode-mc-shader"
"url": "https://github.com/Strum355/mcshader-lsp"
},
"engines": {
"vscode": "^1.43.0"
"vscode": "^1.53.0"
},
"categories": [
"Linters",
@@ -19,9 +19,7 @@
],
"activationEvents": [
"onLanguage:glsl",
"workspaceContains:**/*.fsh",
"workspaceContains:**/*.vsh",
"workspaceContains:**/*.gsh"
"workspaceContains:shaders/"
],
"extensionDependencies": [
"slevesque.shader"
@@ -43,6 +41,11 @@
"command": "mcglsl.virtualMerge",
"title": "Show flattened file",
"category": "Minecraft Shader"
},
{
"command": "mcglsl.parseTree",
"title": "Show parse tree for file",
"category": "Minecraft Shader"
}
],
"languages": [
@@ -61,7 +64,19 @@
],
"configuration": {
"title": "Minecraft GLSL Shaders",
"properties": {}
"properties": {
"mcglsl.skipBootstrap": {
"type": "boolean",
"default": false,
"description": "[DEBUG] Enable to skip bootstrapping the language server binary from Github. Set this to use a manually provided language server binary."
},
"mcglsl.logLevel": {
"type": "string",
"default": "info",
"enum": ["trace", "debug", "info", "warn", "error"],
"description": "Change the log level of the language server. This change happens live and does not require a restart."
}
}
}
},
"scripts": {
@@ -74,22 +89,28 @@
"fix": "eslint 'client/**/*.ts' --fix"
},
"devDependencies": {
"@types/node": "^10.14.15",
"@typescript-eslint/parser": "^3.6.1",
"concurrently": "^5.1.0",
"eslint": "^7.4.0",
"typescript": "^3.9.7",
"vsce": "^1.77.0"
"@types/node": "^17.0.21",
"@typescript-eslint/parser": "^5.15.0",
"concurrently": "^7.0.0",
"eslint": "^8.11.0",
"typescript": "^4.6.2",
"vsce": "^2.7.0"
},
"eslintConfig": {
"parser": "@typescript-eslint/parser",
"parserOptions": {
"ecmaVersion": 2020,
"sourceType": "module"
"ecmaVersion": 2020,
"sourceType": "module"
},
"rules": {
"semi": ["warn", "never"],
"quotes": ["warn", "single"]
"semi": [
"warn",
"never"
],
"quotes": [
"warn",
"single"
]
}
}
}

server/Cargo.lock (generated; 1057 lines changed; diff suppressed because it is too large)


@@ -1,31 +1,6 @@
[package]
name = "mcshader-lsp"
version = "0.1.0"
authors = ["Noah Santschi-Cooney <noah@santschi-cooney.ch>"]
edition = "2018"
[dependencies]
rust_lsp = { git = "https://github.com/Strum355/RustLSP", branch = "master" }
serde_json = "1.0.61"
serde = "1.0.123"
walkdir = "2.3.1"
petgraph = "0.5.1"
lazy_static = "1.4.0"
regex = "1.4.3"
chan = "0.1.23"
url = "2.2.0"
percent-encoding = "2.1.0"
anyhow = "1.0.38"
bit-set = "0.5.2"
thiserror = "1.0.23"
glutin = "0.26.0"
gl = "0.14.0"
ctor = "0.1.18"
mockall = "0.9.0"
path-slash = "0.1.4"
[dev-dependencies]
tempdir = "0.3.7"
fs_extra = "1.2.0"
hamcrest2 = "*"
pretty_assertions = "0.6.1"
[workspace]
members = [
"main",
"logging",
"logging_macro"
]


@@ -4,7 +4,7 @@ watchtest:
RUST_BACKTRACE=0 cargo watch -x test -i Makefile
test:
RUST_LIB_BACKTRACE=0 RUST_BACKTRACE=0 cargo test
RUST_LIB_BACKTRACE=0 RUST_BACKTRACE=0 cargo test -- --nocapture --color always
build:
cargo build

server/logging/Cargo.toml (new file; 13 lines)

@@ -0,0 +1,13 @@
[package]
name = "logging"
version = "0.9.9"
authors = ["Noah Santschi-Cooney <noah@santschi-cooney.ch>"]
edition = "2021"
[dependencies]
slog = { version = "2.7", features = [ "max_level_trace", "release_max_level_trace" ] }
slog-term = "2.9"
slog-scope = "4.4"
slog-atomic = "3.1"
rand = "0.8"
lazy_static = "1.4"

server/logging/src/lib.rs (new file; 43 lines)

@@ -0,0 +1,43 @@
use rand::{rngs, Rng};
use slog::slog_o;
use slog_scope::GlobalLoggerGuard;
use slog_term::{FullFormat, PlainSyncDecorator};
use std::{cell::RefCell, sync::Arc};
use std::io::Stderr;
use lazy_static::lazy_static;
use slog::*;
use slog_atomic::*;
fn new_trace_id() -> String {
let rng = CURRENT_RNG.with(|rng| rng.borrow_mut().gen::<[u8; 4]>());
return format!("{:04x}", u32::from_be_bytes(rng));
}
pub fn slog_with_trace_id<F: FnOnce()>(f: F) {
slog_scope::scope(&slog_scope::logger().new(slog_o!("trace" => new_trace_id())), f)
}
pub fn set_logger_with_level(level: Level) -> GlobalLoggerGuard {
let drain = Arc::new(logger_base(level).fuse());
DRAIN_SWITCH.ctrl().set(drain.clone());
slog_scope::set_global_logger(Logger::root(drain, o!()))
}
fn logger_base(level: Level) -> LevelFilter<Fuse<FullFormat<PlainSyncDecorator<Stderr>>>> {
let plain = slog_term::PlainSyncDecorator::new(std::io::stderr());
let drain = slog_term::FullFormat::new(plain).build().fuse();
drain.filter_level(level)
}
thread_local! {
static CURRENT_RNG: RefCell<rngs::ThreadRng> = RefCell::new(rngs::ThreadRng::default());
}
lazy_static! {
static ref DRAIN_SWITCH: AtomicSwitch<()> = {
let logger = logger_base(Level::Info).fuse();
AtomicSwitch::new(logger)
};
}
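A std-only sketch of what `new_trace_id` above produces, assuming only that a trace id is a handful of random bytes rendered as hex. The real code draws from a thread-local `rand::rngs::ThreadRng`; `RandomState` stands in here as an entropy source, and the `{:08x}` padding is a simplification of the `{:04x}` formatting above:

```rust
use std::collections::hash_map::RandomState;
use std::hash::{BuildHasher, Hasher};

/// Generate a short hex trace id without external crates. `RandomState`
/// is randomly seeded per construction, so each call yields a fresh value.
fn new_trace_id() -> String {
    let mut hasher = RandomState::new().build_hasher();
    hasher.write_u64(0);
    // Truncate the 64-bit hash to 32 bits and render as 8 hex digits.
    format!("{:08x}", hasher.finish() as u32)
}
```

Each log statement inside a `slog_scope::scope` opened with such an id then carries a `trace` key, which is what groups the lines of one request together.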


@@ -0,0 +1,12 @@
[package]
name = "logging_macro"
version = "0.9.9"
authors = ["Noah Santschi-Cooney <noah@santschi-cooney.ch>"]
edition = "2021"
[lib]
proc-macro = true
[dependencies]
quote = "1.0"
syn = { version = "1.0", features = [ "full" ] }


@@ -0,0 +1,24 @@
use proc_macro::TokenStream;
use quote::quote;
use syn::{parse_macro_input, parse_quote, ItemFn};
#[proc_macro_attribute]
pub fn log_scope(_args: TokenStream, function: TokenStream) -> TokenStream {
let mut function = parse_macro_input!(function as ItemFn);
let function_name = function.sig.ident.to_string();
let stmts = function.block.stmts;
function.block = Box::new(parse_quote!({
use slog::{slog_o, FnValue, Level};
use std::thread::current;
let _guard = logging::set_logger_with_level(Level::Trace);
slog_scope::scope(&slog_scope::logger().new(slog_o!("test_name" => #function_name, "thread_num" => FnValue(|_| format!("{:?}", current().id())))), || {
#(#stmts)*
});
}));
TokenStream::from(quote!(#function))
}

server/main/Cargo.toml (new file; 35 lines)

@@ -0,0 +1,35 @@
[package]
name = "mcshader-lsp"
version = "0.9.9"
authors = ["Noah Santschi-Cooney <noah@santschi-cooney.ch>"]
edition = "2021"
[dependencies]
rust_lsp = { git = "https://github.com/Strum355/RustLSP", branch = "master" }
serde_json = "1.0"
serde = "1.0"
walkdir = "2.3"
petgraph = "0.6"
lazy_static = "1.4"
regex = "1.4"
url = "2.2"
percent-encoding = "2.1"
anyhow = "1.0"
thiserror = "1.0"
glutin = "0.28"
gl = "0.14"
mockall = "0.11"
path-slash = "0.1"
slog = { version = "2.7", features = [ "max_level_trace", "release_max_level_trace" ] }
slog-scope = "4.4"
once_cell = "1.7"
tree-sitter = "0.20.6"
tree-sitter-glsl = "0.1.2"
logging = { path = "../logging" }
logging_macro = { path = "../logging_macro" }
[dev-dependencies]
tempdir = "0.3"
fs_extra = "1.2"
hamcrest2 = "*"
pretty_assertions = "1.1"


@@ -0,0 +1,52 @@
use std::cell::RefCell;
use std::fs::OpenOptions;
use std::io::prelude::*;
use std::path::Path;
use std::rc::Rc;
use petgraph::dot::Config;
use serde_json::Value;
use petgraph::dot;
use anyhow::{format_err, Result};
use slog_scope::info;
use crate::graph::CachedStableGraph;
use super::Invokeable;
pub struct GraphDotCommand {
pub graph: Rc<RefCell<CachedStableGraph>>,
}
impl Invokeable for GraphDotCommand {
fn run_command(&self, root: &Path, _: &[Value]) -> Result<Value> {
let filepath = root.join("graph.dot");
info!("generating dot file"; "path" => filepath.as_os_str().to_str());
let mut file = OpenOptions::new().truncate(true).write(true).create(true).open(filepath).unwrap();
let mut write_data_closure = || -> Result<(), std::io::Error> {
let graph = self.graph.as_ref();
file.seek(std::io::SeekFrom::Start(0))?;
file.write_all("digraph {\n\tgraph [splines=ortho]\n\tnode [shape=box]\n".as_bytes())?;
file.write_all(
dot::Dot::with_config(&graph.borrow().graph, &[Config::GraphContentOnly])
.to_string()
.as_bytes(),
)?;
file.write_all("\n}".as_bytes())?;
file.flush()?;
file.seek(std::io::SeekFrom::Start(0))?;
Ok(())
};
match write_data_closure() {
Err(err) => Err(format_err!("error generating graphviz data: {}", err)),
_ => Ok(Value::Null),
}
}
}
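The DOT text `GraphDotCommand` writes can be sketched without petgraph or file I/O. This stand-in uses a plain edge list of (parent, child) names and keeps the same `splines=ortho` / `shape=box` header the command emits; `to_dot` is an illustrative name, not from the codebase:

```rust
/// Render an include-graph edge list as a GraphViz DOT digraph, one edge
/// line per inclusion, matching the header written by GraphDotCommand.
fn to_dot(edges: &[(&str, &str)]) -> String {
    let mut out = String::from("digraph {\n\tgraph [splines=ortho]\n\tnode [shape=box]\n");
    for (parent, child) in edges {
        out.push_str(&format!("\t\"{}\" -> \"{}\"\n", parent, child));
    }
    out.push_str("}\n");
    out
}
```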


@@ -0,0 +1,114 @@
use std::cell::RefCell;
use std::rc::Rc;
use std::{
collections::HashMap,
path::{Path, PathBuf},
};
use serde_json::Value;
use petgraph::graph::NodeIndex;
use anyhow::{format_err, Result};
use std::fs;
use crate::dfs;
use crate::merge_views::FilialTuple;
use crate::source_mapper::SourceMapper;
use crate::{graph::CachedStableGraph, merge_views, url_norm::FromJson};
use super::Invokeable;
pub struct VirtualMergedDocument {
pub graph: Rc<RefCell<CachedStableGraph>>,
}
impl VirtualMergedDocument {
// TODO: DUPLICATE CODE
fn get_file_toplevel_ancestors(&self, uri: &Path) -> Result<Option<Vec<petgraph::stable_graph::NodeIndex>>> {
let curr_node = match self.graph.borrow_mut().find_node(uri) {
Some(n) => n,
None => return Err(format_err!("node not found {:?}", uri)),
};
let roots = self.graph.borrow().collect_root_ancestors(curr_node);
if roots.is_empty() {
return Ok(None);
}
Ok(Some(roots))
}
pub fn get_dfs_for_node(&self, root: NodeIndex) -> Result<Vec<FilialTuple>, dfs::error::CycleError> {
let graph_ref = self.graph.borrow();
let dfs = dfs::Dfs::new(&graph_ref, root);
dfs.collect::<Result<Vec<_>, _>>()
}
pub fn load_sources(&self, nodes: &[FilialTuple]) -> Result<HashMap<PathBuf, String>> {
let mut sources = HashMap::new();
for node in nodes {
let graph = self.graph.borrow();
let path = graph.get_node(node.child);
if sources.contains_key(&path) {
continue;
}
let source = match fs::read_to_string(&path) {
Ok(s) => s,
Err(e) => return Err(format_err!("error reading {:?}: {}", path, e)),
};
let source = source.replace("\r\n", "\n");
sources.insert(path.clone(), source);
}
Ok(sources)
}
}
impl Invokeable for VirtualMergedDocument {
fn run_command(&self, root: &Path, arguments: &[Value]) -> Result<Value> {
let path = PathBuf::from_json(arguments.get(0).unwrap())?;
let file_ancestors = match self.get_file_toplevel_ancestors(&path) {
Ok(opt) => match opt {
Some(ancestors) => ancestors,
None => vec![],
},
Err(e) => return Err(e),
};
//info!("ancestors for {}:\n\t{:?}", path, file_ancestors.iter().map(|e| self.graph.borrow().graph.node_weight(*e).unwrap().clone()).collect::<Vec<String>>());
// the set of all filepath->content. TODO: change to Url?
let mut all_sources: HashMap<PathBuf, String> = HashMap::new();
// if we are a top-level file (this has to be one of the set defined by Optifine, right?)
if file_ancestors.is_empty() {
// gather the list of all descendants
let root = self.graph.borrow_mut().find_node(&path).unwrap();
let tree = match self.get_dfs_for_node(root) {
Ok(tree) => tree,
Err(e) => return Err(e.into()),
};
let sources = match self.load_sources(&tree) {
Ok(s) => s,
Err(e) => return Err(e),
};
all_sources.extend(sources);
let mut source_mapper = SourceMapper::new(all_sources.len());
let graph = self.graph.borrow();
let view = merge_views::MergeViewBuilder::new(&tree, &all_sources, &graph, &mut source_mapper).build();
return Ok(serde_json::value::Value::String(view));
}
return Err(format_err!(
"{:?} is not a top-level file aka has ancestors",
path.strip_prefix(root).unwrap()
));
}
}
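The `load_sources` step above reads each file once into a path-to-contents map, normalizing Windows line endings so the later merge works on `\n` only. A std-only sketch with the filesystem replaced by in-memory tuples (the two-pass duplicate skip mirrors the `contains_key` check above; names are otherwise illustrative):

```rust
use std::collections::HashMap;
use std::path::PathBuf;

/// Build the path -> contents map for a DFS result. A child included from
/// several parents appears in several tuples; its contents are only
/// normalized and stored the first time it is seen.
fn load_sources(files: &[(PathBuf, &str)]) -> HashMap<PathBuf, String> {
    let mut sources = HashMap::new();
    for (path, raw) in files {
        sources
            .entry(path.clone())
            .or_insert_with(|| raw.replace("\r\n", "\n"));
    }
    sources
}
```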


@@ -0,0 +1,36 @@
use std::{collections::HashMap, path::Path};
use serde_json::Value;
use anyhow::{format_err, Result};
use slog_scope::info;
pub mod graph_dot;
pub mod merged_includes;
pub mod parse_tree;
pub struct CustomCommandProvider {
commands: HashMap<String, Box<dyn Invokeable>>,
}
impl CustomCommandProvider {
pub fn new(commands: Vec<(&str, Box<dyn Invokeable>)>) -> CustomCommandProvider {
CustomCommandProvider {
commands: commands.into_iter().map(|tup| (tup.0.into(), tup.1)).collect(),
}
}
pub fn execute(&self, command: &str, args: &[Value], root_path: &Path) -> Result<Value> {
if self.commands.contains_key(command) {
info!("running command";
"command" => command,
"args" => format!("[{}]", args.iter().map(|v| serde_json::to_string(v).unwrap()).collect::<Vec<String>>().join(", ")));
return self.commands.get(command).unwrap().run_command(root_path, args);
}
Err(format_err!("command doesn't exist"))
}
}
pub trait Invokeable {
fn run_command(&self, root: &Path, arguments: &[Value]) -> Result<Value>;
}
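The `CustomCommandProvider` pattern above is a map from command name to boxed trait object, with dispatch as a single lookup. A self-contained sketch with the `Invokeable` signature simplified (no workspace root or JSON arguments; `Hello` is a hypothetical command):

```rust
use std::collections::HashMap;

/// Simplified command trait: no root path or arguments.
trait Invokeable {
    fn run_command(&self) -> Result<String, String>;
}

struct Hello;
impl Invokeable for Hello {
    fn run_command(&self) -> Result<String, String> {
        Ok("hello".to_string())
    }
}

struct CommandProvider {
    commands: HashMap<String, Box<dyn Invokeable>>,
}

impl CommandProvider {
    /// Look the command up by name, mirroring CustomCommandProvider::execute.
    fn execute(&self, command: &str) -> Result<String, String> {
        match self.commands.get(command) {
            Some(cmd) => cmd.run_command(),
            None => Err("command doesn't exist".to_string()),
        }
    }
}
```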


@@ -0,0 +1,94 @@
use std::{
cell::RefCell,
fs,
path::{Path, PathBuf},
rc::Rc,
};
use anyhow::{format_err, Result};
use serde_json::Value;
use slog_scope::warn;
use tree_sitter::{Parser, TreeCursor};
use crate::url_norm::FromJson;
use super::Invokeable;
pub struct TreeSitterSExpr {
pub tree_sitter: Rc<RefCell<Parser>>,
}
impl Invokeable for TreeSitterSExpr {
fn run_command(&self, _: &Path, arguments: &[Value]) -> Result<Value> {
let path = PathBuf::from_json(arguments.get(0).unwrap())?;
warn!("parsing"; "path" => path.to_str().unwrap().to_string());
let source = fs::read_to_string(path)?;
let tree = match self.tree_sitter.borrow_mut().parse(source, None) {
Some(tree) => tree,
None => return Err(format_err!("tree-sitter parsing resulted in no parse tree")),
};
let mut cursor = tree.walk();
let rendered = render_parse_tree(&mut cursor);
Ok(serde_json::value::Value::String(rendered))
}
}
fn render_parse_tree(cursor: &mut TreeCursor) -> String {
let mut string = String::new();
let mut indent = 0;
let mut visited_children = false;
loop {
let node = cursor.node();
let display_name = if node.is_missing() {
format!("MISSING {}", node.kind())
} else if node.is_named() {
node.kind().to_string()
} else {
"".to_string()
};
if visited_children {
if cursor.goto_next_sibling() {
visited_children = false;
} else if cursor.goto_parent() {
visited_children = true;
indent -= 1;
} else {
break;
}
} else {
if !display_name.is_empty() {
let start = node.start_position();
let end = node.end_position();
let field_name = match cursor.field_name() {
Some(name) => name.to_string() + ": ",
None => "".to_string(),
};
string += (" ".repeat(indent)
+ format!("{}{} [{}, {}] - [{}, {}]\n", field_name, display_name, start.row, start.column, end.row, end.column)
.trim_start())
.as_str();
}
if cursor.goto_first_child() {
visited_children = false;
indent += 1;
} else {
visited_children = true;
}
}
}
string
}
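The cursor loop above walks the tree iteratively, tracking depth with `indent` and a `visited_children` flag because `TreeCursor` cannot be recursed over by value. The output format it builds is easier to see in a recursive sketch over an owned tree; `Node` here is a stand-in for a tree-sitter node, not the real API:

```rust
/// Minimal stand-in for a parse-tree node: a kind plus (row, column)
/// start/end positions, as rendered by the parse-tree command.
struct Node {
    kind: &'static str,
    start: (usize, usize),
    end: (usize, usize),
    children: Vec<Node>,
}

/// Render `kind [row, col] - [row, col]` lines, indenting one space per
/// depth level, matching the indentation the cursor loop maintains.
fn render(node: &Node, indent: usize, out: &mut String) {
    out.push_str(&" ".repeat(indent));
    out.push_str(&format!(
        "{} [{}, {}] - [{}, {}]\n",
        node.kind, node.start.0, node.start.1, node.end.0, node.end.1
    ));
    for child in &node.children {
        render(child, indent + 1, out);
    }
}
```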


@@ -0,0 +1,12 @@
use std::str::FromStr;
use slog::Level;
use slog_scope::error;
pub fn handle_log_level_change<F: FnOnce(Level)>(log_level: String, callback: F) {
match Level::from_str(log_level.as_str()) {
Ok(level) => callback(level),
Err(_) => error!("got unexpected log level from config"; "level" => log_level),
};
}
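This handler is what backs the live `mcglsl.logLevel` setting: parse the string the client sends and only invoke the callback on success. A self-contained sketch with a local `Level` enum standing in for `slog::Level` (the lowercase spellings match the enum values in package.json above):

```rust
use std::str::FromStr;

/// Local stand-in for slog::Level, restricted to the levels the
/// extension's configuration offers.
#[derive(Debug, PartialEq)]
enum Level { Trace, Debug, Info, Warn, Error }

impl FromStr for Level {
    type Err = ();
    fn from_str(s: &str) -> Result<Self, Self::Err> {
        match s {
            "trace" => Ok(Level::Trace),
            "debug" => Ok(Level::Debug),
            "info" => Ok(Level::Info),
            "warn" => Ok(Level::Warn),
            "error" => Ok(Level::Error),
            _ => Err(()),
        }
    }
}

fn handle_log_level_change<F: FnOnce(Level)>(log_level: &str, callback: F) {
    match Level::from_str(log_level) {
        Ok(level) => callback(level),
        // The real function logs an error through slog here.
        Err(_) => eprintln!("got unexpected log level from config: {}", log_level),
    }
}
```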

server/main/src/dfs.rs (new file; 335 lines)

@@ -0,0 +1,335 @@
use petgraph::stable_graph::NodeIndex;
use crate::{graph::CachedStableGraph, merge_views::FilialTuple};
use anyhow::Result;
struct VisitCount {
node: NodeIndex,
touch: usize,
children: usize,
}
/// Performs a depth-first search with duplicates
pub struct Dfs<'a> {
stack: Vec<NodeIndex>,
graph: &'a CachedStableGraph,
cycle: Vec<VisitCount>,
}
impl<'a> Dfs<'a> {
pub fn new(graph: &'a CachedStableGraph, start: NodeIndex) -> Self {
Dfs {
stack: vec![start],
graph,
cycle: Vec::new(),
}
}
fn reset_path_to_branch(&mut self) {
while let Some(par) = self.cycle.last_mut() {
par.touch += 1;
if par.touch > par.children {
self.cycle.pop();
} else {
break;
}
}
}
fn check_for_cycle(&self, children: &[NodeIndex]) -> Result<(), error::CycleError> {
for prev in &self.cycle {
for child in children {
if prev.node == *child {
let cycle_nodes: Vec<NodeIndex> = self.cycle.iter().map(|n| n.node).collect();
return Err(error::CycleError::new(&cycle_nodes, *child, self.graph));
}
}
}
Ok(())
}
}
impl<'a> Iterator for Dfs<'a> {
type Item = Result<FilialTuple, error::CycleError>;
fn next(&mut self) -> Option<Result<FilialTuple, error::CycleError>> {
let parent = self.cycle.last().map(|p| p.node);
if let Some(child) = self.stack.pop() {
self.cycle.push(VisitCount {
node: child,
children: self.graph.graph.edges(child).count(),
touch: 1,
});
let mut children: Vec<_> = self
.graph
.get_all_child_positions(child)
.collect();
children.reverse();
if !children.is_empty() {
let child_indexes: Vec<_> = children.iter().map(|c| c.0).collect();
match self.check_for_cycle(&child_indexes) {
Ok(_) => {}
Err(e) => return Some(Err(e)),
};
for child in children {
self.stack.push(child.0);
}
} else {
self.reset_path_to_branch();
}
return Some(Ok(FilialTuple { child, parent }));
}
None
}
}
pub mod error {
use petgraph::stable_graph::NodeIndex;
use std::{
error::Error as StdError,
fmt::{Debug, Display},
path::PathBuf,
};
use crate::{consts, graph::CachedStableGraph};
use rust_lsp::lsp_types::{Diagnostic, DiagnosticSeverity, Position, Range};
#[derive(Debug)]
pub struct CycleError(Vec<PathBuf>);
impl StdError for CycleError {}
impl CycleError {
pub fn new(nodes: &[NodeIndex], current_node: NodeIndex, graph: &CachedStableGraph) -> Self {
let mut resolved_nodes: Vec<PathBuf> = nodes.iter().map(|i| graph.get_node(*i)).collect();
resolved_nodes.push(graph.get_node(current_node));
CycleError(resolved_nodes)
}
}
impl Display for CycleError {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
let mut disp = String::new();
disp.push_str(format!("Include cycle detected:\n{:?} imports ", self.0[0]).as_str());
for p in &self.0[1..self.0.len() - 1] {
disp.push_str(format!("\n{:?}, which imports ", *p).as_str());
}
disp.push_str(format!("\n{:?}", self.0[self.0.len() - 1]).as_str());
f.write_str(disp.as_str())
}
}
impl From<CycleError> for Diagnostic {
fn from(e: CycleError) -> Diagnostic {
Diagnostic {
severity: Some(DiagnosticSeverity::ERROR),
range: Range::new(Position::new(0, 0), Position::new(0, 500)),
source: Some(consts::SOURCE.into()),
message: e.into(),
code: None,
tags: None,
related_information: None,
code_description: Option::None,
data: Option::None,
}
}
}
impl From<CycleError> for String {
fn from(e: CycleError) -> String {
format!("{}", e)
}
}
}
#[cfg(test)]
mod dfs_test {
use std::path::PathBuf;
use hamcrest2::prelude::*;
use hamcrest2::{assert_that, ok};
use petgraph::{algo::is_cyclic_directed, graph::NodeIndex};
use crate::graph::CachedStableGraph;
use crate::{dfs, IncludePosition};
#[test]
#[logging_macro::log_scope]
fn test_graph_dfs() {
{
let mut graph = CachedStableGraph::new();
let idx0 = graph.add_node(&PathBuf::from("0"));
let idx1 = graph.add_node(&PathBuf::from("1"));
let idx2 = graph.add_node(&PathBuf::from("2"));
let idx3 = graph.add_node(&PathBuf::from("3"));
graph.add_edge(idx0, idx1, IncludePosition { line: 2, start: 0, end: 0 });
graph.add_edge(idx0, idx2, IncludePosition { line: 3, start: 0, end: 0 });
graph.add_edge(idx1, idx3, IncludePosition { line: 5, start: 0, end: 0 });
let dfs = dfs::Dfs::new(&graph, idx0);
let mut collection = Vec::new();
for i in dfs {
assert_that!(&i, ok());
collection.push(i.unwrap());
}
let nodes: Vec<NodeIndex> = collection.iter().map(|n| n.child).collect();
let parents: Vec<Option<NodeIndex>> = collection.iter().map(|n| n.parent).collect();
// 0
// / \
// 1 2
// /
// 3
let expected_nodes = vec![idx0, idx1, idx3, idx2];
assert_eq!(expected_nodes, nodes);
let expected_parents = vec![None, Some(idx0), Some(idx1), Some(idx0)];
assert_eq!(expected_parents, parents);
assert!(!is_cyclic_directed(&graph.graph));
}
{
let mut graph = CachedStableGraph::new();
let idx0 = graph.add_node(&PathBuf::from("0"));
let idx1 = graph.add_node(&PathBuf::from("1"));
let idx2 = graph.add_node(&PathBuf::from("2"));
let idx3 = graph.add_node(&PathBuf::from("3"));
let idx4 = graph.add_node(&PathBuf::from("4"));
let idx5 = graph.add_node(&PathBuf::from("5"));
let idx6 = graph.add_node(&PathBuf::from("6"));
let idx7 = graph.add_node(&PathBuf::from("7"));
graph.add_edge(idx0, idx1, IncludePosition { line: 2, start: 0, end: 0 });
graph.add_edge(idx0, idx2, IncludePosition { line: 3, start: 0, end: 0 });
graph.add_edge(idx1, idx3, IncludePosition { line: 5, start: 0, end: 0 });
graph.add_edge(idx1, idx4, IncludePosition { line: 6, start: 0, end: 0 });
graph.add_edge(idx2, idx4, IncludePosition { line: 5, start: 0, end: 0 });
graph.add_edge(idx2, idx5, IncludePosition { line: 4, start: 0, end: 0 });
graph.add_edge(idx3, idx6, IncludePosition { line: 4, start: 0, end: 0 });
graph.add_edge(idx4, idx6, IncludePosition { line: 4, start: 0, end: 0 });
graph.add_edge(idx6, idx7, IncludePosition { line: 4, start: 0, end: 0 });
let dfs = dfs::Dfs::new(&graph, idx0);
let mut collection = Vec::new();
for i in dfs {
assert_that!(&i, ok());
collection.push(i.unwrap());
}
let nodes: Vec<NodeIndex> = collection.iter().map(|n| n.child).collect();
let parents: Vec<Option<NodeIndex>> = collection.iter().map(|n| n.parent).collect();
// 0
// / \
// 1 2
// / \ / \
// 3 4 5
// \ /
// 6 - 7
let expected_nodes = vec![idx0, idx1, idx3, idx6, idx7, idx4, idx6, idx7, idx2, idx5, idx4, idx6, idx7];
assert_eq!(expected_nodes, nodes);
let expected_parents = vec![
None,
Some(idx0),
Some(idx1),
Some(idx3),
Some(idx6),
Some(idx1),
Some(idx4),
Some(idx6),
Some(idx0),
Some(idx2),
Some(idx2),
Some(idx4),
Some(idx6),
];
assert_eq!(expected_parents, parents);
assert!(!is_cyclic_directed(&graph.graph));
}
}
#[test]
#[logging_macro::log_scope]
fn test_graph_dfs_cycle() {
{
let mut graph = CachedStableGraph::new();
let idx0 = graph.add_node(&PathBuf::from("0"));
let idx1 = graph.add_node(&PathBuf::from("1"));
let idx2 = graph.add_node(&PathBuf::from("2"));
let idx3 = graph.add_node(&PathBuf::from("3"));
let idx4 = graph.add_node(&PathBuf::from("4"));
let idx5 = graph.add_node(&PathBuf::from("5"));
let idx6 = graph.add_node(&PathBuf::from("6"));
let idx7 = graph.add_node(&PathBuf::from("7"));
graph.add_edge(idx0, idx1, IncludePosition { line: 2, start: 0, end: 0 });
graph.add_edge(idx0, idx2, IncludePosition { line: 3, start: 0, end: 0 });
graph.add_edge(idx1, idx3, IncludePosition { line: 5, start: 0, end: 0 });
graph.add_edge(idx1, idx4, IncludePosition { line: 6, start: 0, end: 0 });
graph.add_edge(idx2, idx4, IncludePosition { line: 5, start: 0, end: 0 });
graph.add_edge(idx2, idx5, IncludePosition { line: 4, start: 0, end: 0 });
graph.add_edge(idx3, idx6, IncludePosition { line: 4, start: 0, end: 0 });
graph.add_edge(idx4, idx6, IncludePosition { line: 4, start: 0, end: 0 });
graph.add_edge(idx6, idx7, IncludePosition { line: 4, start: 0, end: 0 });
graph.add_edge(idx7, idx4, IncludePosition { line: 4, start: 0, end: 0 });
let mut dfs = dfs::Dfs::new(&graph, idx0);
for _ in 0..5 {
if let Some(i) = dfs.next() {
assert_that!(&i, ok());
}
}
// 0
// / \
// 1 2
// / \ / \
// 3 4 5
// \ / \
// 6 - 7
let next = dfs.next().unwrap();
assert_that!(next, err());
assert!(is_cyclic_directed(&graph.graph));
}
{
let mut graph = CachedStableGraph::new();
let idx0 = graph.add_node(&PathBuf::from("0"));
let idx1 = graph.add_node(&PathBuf::from("1"));
graph.add_edge(idx0, idx1, IncludePosition { line: 2, start: 0, end: 0 });
graph.add_edge(idx1, idx0, IncludePosition { line: 2, start: 0, end: 0 });
let mut dfs = dfs::Dfs::new(&graph, idx1);
println!("{:?}", dfs.next());
println!("{:?}", dfs.next());
println!("{:?}", dfs.next());
}
}
}
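The key property the tests above exercise is that this DFS deliberately yields duplicates: a file reachable through several parents is visited once per path, because each `#include` pastes it again. A std-only sketch over a plain adjacency list shows just that traversal order; it omits the cycle detection and parent tracking of the real iterator, so it must only be run on acyclic graphs:

```rust
/// Depth-first traversal that revisits shared nodes once per path.
/// Nodes are indices into `adj`; the input must be acyclic.
fn dfs_with_duplicates(adj: &[Vec<usize>], root: usize) -> Vec<usize> {
    let mut order = Vec::new();
    let mut stack = vec![root];
    while let Some(node) = stack.pop() {
        order.push(node);
        // Push children reversed so they pop in declaration order,
        // mirroring the `children.reverse()` in the iterator above.
        for &child in adj[node].iter().rev() {
            stack.push(child);
        }
    }
    order
}
```

With the diamond 0 -> {1, 2}, 1 -> 3, 2 -> 3, node 3 is emitted twice, matching the duplicated `idx6`/`idx7` entries in the expected node list of the second test above.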


@@ -0,0 +1,194 @@
use std::{collections::HashMap, cell::OnceCell, path::Path};
use regex::Regex;
use rust_lsp::lsp_types::{Diagnostic, DiagnosticSeverity, Position, Range};
use slog_scope::debug;
use url::Url;
use crate::{
consts,
graph::CachedStableGraph,
opengl,
source_mapper::{SourceMapper, SourceNum},
};
pub struct DiagnosticsParser<'a, T: opengl::ShaderValidator + ?Sized> {
line_offset: OnceCell<u32>,
line_regex: OnceCell<Regex>,
vendor_querier: &'a T,
}
impl<'a, T: opengl::ShaderValidator + ?Sized> DiagnosticsParser<'a, T> {
pub fn new(vendor_querier: &'a T) -> Self {
DiagnosticsParser {
line_offset: OnceCell::new(),
line_regex: OnceCell::new(),
vendor_querier,
}
}
fn get_line_regex(&self) -> &Regex {
self.line_regex.get_or_init(|| match self.vendor_querier.vendor().as_str() {
"NVIDIA Corporation" => {
Regex::new(r#"^(?P<filepath>\d+)\((?P<linenum>\d+)\) : (?P<severity>error|warning) [A-C]\d+: (?P<output>.+)"#).unwrap()
}
_ => Regex::new(r#"^(?P<severity>ERROR|WARNING): (?P<filepath>[^?<>*|"\n]+):(?P<linenum>\d+): (?:'.*' :|[a-z]+\(#\d+\)) +(?P<output>.+)$"#)
.unwrap(),
})
}
fn get_line_offset(&self) -> u32 {
*self.line_offset.get_or_init(|| match self.vendor_querier.vendor().as_str() {
"ATI Technologies" => 0,
_ => 1,
})
}
pub fn parse_diagnostics_output(
&self, output: String, uri: &Path, source_mapper: &SourceMapper, graph: &CachedStableGraph,
) -> HashMap<Url, Vec<Diagnostic>> {
let output_lines = output.split('\n').collect::<Vec<&str>>();
let mut diagnostics: HashMap<Url, Vec<Diagnostic>> = HashMap::with_capacity(output_lines.len());
debug!("diagnostics regex selected"; "regex" => self.get_line_regex() .as_str());
for line in output_lines {
let diagnostic_capture = match self.get_line_regex().captures(line) {
Some(d) => d,
None => continue,
};
debug!("found match for output line"; "line" => line, "capture" => format!("{:?}", diagnostic_capture));
let msg = diagnostic_capture.name("output").unwrap().as_str();
let line = match diagnostic_capture.name("linenum") {
Some(c) => c.as_str().parse::<u32>().unwrap_or(0),
None => 0,
} - self.get_line_offset();
// TODO: line matching maybe
/* let line_text = source_lines[line as usize];
let leading_whitespace = line_text.len() - line_text.trim_start().len(); */
let severity = match diagnostic_capture.name("severity") {
Some(c) => match c.as_str().to_lowercase().as_str() {
"error" => DiagnosticSeverity::ERROR,
"warning" => DiagnosticSeverity::WARNING,
_ => DiagnosticSeverity::INFORMATION,
},
_ => DiagnosticSeverity::INFORMATION,
};
let origin = match diagnostic_capture.name("filepath") {
Some(o) => {
let source_num: SourceNum = o.as_str().parse::<usize>().unwrap().into();
let graph_node = source_mapper.get_node(source_num);
graph.get_node(graph_node).to_str().unwrap().to_string()
}
None => uri.to_str().unwrap().to_string(),
};
let diagnostic = Diagnostic {
range: Range::new(
/* Position::new(line, leading_whitespace as u64),
Position::new(line, line_text.len() as u64) */
Position::new(line, 0),
Position::new(line, 1000),
),
code: None,
severity: Some(severity),
source: Some(consts::SOURCE.into()),
message: msg.trim().into(),
related_information: None,
tags: None,
code_description: Option::None,
data: Option::None,
};
let origin_url = Url::from_file_path(origin).unwrap();
match diagnostics.get_mut(&origin_url) {
Some(d) => d.push(diagnostic),
None => {
diagnostics.insert(origin_url, vec![diagnostic]);
}
};
}
diagnostics
}
}
#[cfg(test)]
mod diagnostics_test {
use std::path::PathBuf;
use slog::slog_o;
use url::Url;
use crate::{
diagnostics_parser::DiagnosticsParser, opengl::MockShaderValidator, source_mapper::SourceMapper, test::new_temp_server,
};
#[test]
#[logging_macro::log_scope]
fn test_nvidia_diagnostics() {
slog_scope::scope(&slog_scope::logger().new(slog_o!("driver" => "nvidia")), || {
let mut mockgl = MockShaderValidator::new();
mockgl.expect_vendor().returning(|| "NVIDIA Corporation".into());
let server = new_temp_server(Some(Box::new(mockgl)));
let output = "0(9) : error C0000: syntax error, unexpected '}', expecting ',' or ';' at token \"}\"";
#[cfg(target_family = "unix")]
let path: PathBuf = "/home/noah/.minecraft/shaderpacks/test/shaders/final.fsh".into();
#[cfg(target_family = "windows")]
let path: PathBuf = "c:\\home\\noah\\.minecraft\\shaderpacks\\test\\shaders\\final.fsh".into();
let mut source_mapper = SourceMapper::new(0);
source_mapper.get_num(server.graph.borrow_mut().add_node(&path));
let parser = DiagnosticsParser::new(server.opengl_context.as_ref());
let results =
parser.parse_diagnostics_output(output.to_string(), path.parent().unwrap(), &source_mapper, &server.graph.borrow());
assert_eq!(results.len(), 1);
let first = results.into_iter().next().unwrap();
assert_eq!(first.0, Url::from_file_path(path).unwrap());
server.endpoint.request_shutdown();
});
}
#[test]
#[logging_macro::log_scope]
fn test_amd_diagnostics() {
slog_scope::scope(&slog_scope::logger().new(slog_o!("driver" => "amd")), || {
let mut mockgl = MockShaderValidator::new();
mockgl.expect_vendor().returning(|| "ATI Technologies".into());
let server = new_temp_server(Some(Box::new(mockgl)));
let output = "ERROR: 0:1: '' : syntax error: #line
ERROR: 0:10: '' : syntax error: #line
ERROR: 0:15: 'varying' : syntax error: syntax error
";
#[cfg(target_family = "unix")]
let path: PathBuf = "/home/noah/.minecraft/shaderpacks/test/shaders/final.fsh".into();
#[cfg(target_family = "windows")]
let path: PathBuf = "c:\\home\\noah\\.minecraft\\shaderpacks\\test\\shaders\\final.fsh".into();
let mut source_mapper = SourceMapper::new(0);
source_mapper.get_num(server.graph.borrow_mut().add_node(&path));
let parser = DiagnosticsParser::new(server.opengl_context.as_ref());
let results =
parser.parse_diagnostics_output(output.to_string(), path.parent().unwrap(), &source_mapper, &server.graph.borrow());
assert_eq!(results.len(), 1);
let first = results.into_iter().next().unwrap();
assert_eq!(first.1.len(), 3);
server.endpoint.request_shutdown();
});
}
}
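The parser above selects a vendor-specific regex and pulls out severity, source number, line, and message. The AMD-style shape (`ERROR: <source>:<line>: <message>`, as in the test fixture above) can be parsed with a std-only split as a sketch; the regex path is what the real code uses, and `ParsedDiagnostic` is an illustrative type:

```rust
/// Fields extracted from one AMD/ATI-style compiler output line.
struct ParsedDiagnostic {
    severity: String,
    source_num: usize,
    line: usize,
    message: String,
}

/// Hand-rolled parse of `ERROR: 0:15: '<token>' : <message>` lines.
/// Lines that do not match the shape yield None, like a failed regex match.
fn parse_amd_line(line: &str) -> Option<ParsedDiagnostic> {
    let (severity, rest) = line.split_once(": ")?;
    if severity != "ERROR" && severity != "WARNING" {
        return None;
    }
    let (source_num, rest) = rest.split_once(':')?;
    let (linenum, message) = rest.split_once(':')?;
    Some(ParsedDiagnostic {
        severity: severity.to_string(),
        source_num: source_num.parse().ok()?,
        line: linenum.parse().ok()?,
        message: message.trim().to_string(),
    })
}
```

The extracted source number is then what `SourceMapper::get_node` translates back into a file path when building the per-file diagnostics map.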

server/main/src/graph.rs (new file; 374 lines)

@@ -0,0 +1,374 @@
use petgraph::stable_graph::EdgeIndex;
use petgraph::stable_graph::NodeIndex;
use petgraph::stable_graph::StableDiGraph;
use petgraph::visit::EdgeRef;
use petgraph::Direction;
use std::{
collections::{HashMap, HashSet},
path::{Path, PathBuf},
str::FromStr,
};
use super::IncludePosition;
/// Wraps a `StableDiGraph` with caching behaviour for node search by maintaining
/// an index for node value to node index and a reverse index.
/// This allows for **O(1)** lookup for a value if it exists, else **O(n)**.
pub struct CachedStableGraph {
// StableDiGraph is used as it allows for String node values, essential for
// generating the GraphViz DOT render.
pub graph: StableDiGraph<String, IncludePosition>,
cache: HashMap<PathBuf, NodeIndex>,
// Maps a node index to its abstracted string representation.
// Mainly used as the graph is based on NodeIndex.
reverse_index: HashMap<NodeIndex, PathBuf>,
}
impl CachedStableGraph {
#[allow(clippy::new_without_default)]
pub fn new() -> CachedStableGraph {
CachedStableGraph {
graph: StableDiGraph::new(),
cache: HashMap::new(),
reverse_index: HashMap::new(),
}
}
/// Returns the `NodeIndex` for a given graph node with the value of `name`
/// and caches the result in the `HashMap`. Complexity is **O(1)** if the value
/// is cached (which should always be the case), else **O(n)** where **n** is
/// the number of node indices, as an exhaustive search must be done.
pub fn find_node(&mut self, name: &Path) -> Option<NodeIndex> {
match self.cache.get(name) {
Some(n) => Some(*n),
None => {
// If the string is not in cache, O(n) search the graph (i know...) and then cache the NodeIndex
// for later
let n = self.graph.node_indices().find(|n| self.graph[*n] == name.to_str().unwrap());
if let Some(n) = n {
self.cache.insert(name.into(), n);
}
n
}
}
}
// Returns the `PathBuf` for a given `NodeIndex`
pub fn get_node(&self, node: NodeIndex) -> PathBuf {
PathBuf::from_str(&self.graph[node]).unwrap()
}
/// Returns an iterator over all the `IncludePosition`'s between a parent and its child for all the positions
/// that the child may be imported into the parent, in order of import.
pub fn get_child_positions(&self, parent: NodeIndex, child: NodeIndex) -> impl Iterator<Item = IncludePosition> + '_ {
let mut edges = self
.graph
.edges(parent)
.filter_map(move |edge| {
let target = self.graph.edge_endpoints(edge.id()).unwrap().1;
if target != child {
return None;
}
Some(self.graph[edge.id()])
})
.collect::<Vec<IncludePosition>>();
edges.sort_by(|x, y| x.line.cmp(&y.line));
edges.into_iter()
}
/// Returns an iterator over all the `(NodeIndex, IncludePosition)` tuples between a node and all its children, in order
/// of import.
pub fn get_all_child_positions(&self, node: NodeIndex) -> impl Iterator<Item = (NodeIndex, IncludePosition)> + '_ {
let mut edges = self.graph.edges(node).map(|edge| {
let child = self.graph.edge_endpoints(edge.id()).unwrap().1;
(child, self.graph[edge.id()])
})
.collect::<Vec<_>>();
edges.sort_by(|x, y| x.1.line.cmp(&y.1.line));
edges.into_iter()
}
pub fn add_node(&mut self, name: &Path) -> NodeIndex {
if let Some(idx) = self.cache.get(name) {
return *idx;
}
let idx = self.graph.add_node(name.to_str().unwrap().to_string());
self.cache.insert(name.to_owned(), idx);
self.reverse_index.insert(idx, name.to_owned());
idx
}
pub fn add_edge(&mut self, parent: NodeIndex, child: NodeIndex, meta: IncludePosition) -> EdgeIndex {
self.graph.add_edge(parent, child, meta)
}
pub fn remove_edge(&mut self, parent: NodeIndex, child: NodeIndex, position: IncludePosition) {
self.graph
.edges(parent)
.find(|edge| self.graph.edge_endpoints(edge.id()).unwrap().1 == child && *edge.weight() == position)
.map(|edge| edge.id())
.and_then(|edge| self.graph.remove_edge(edge));
}
pub fn child_node_indexes(&self, node: NodeIndex) -> impl Iterator<Item = NodeIndex> + '_ {
self.graph.neighbors(node)
}
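/// Collects the top-level ancestors (nodes with no incoming edges) reachable
/// from `node`, excluding `node` itself; paths cycling back to `node` are cut off.
///
/// A hedged sketch, using the diamond `0 -> 1`, `0 -> 2`, `1 -> 3` from the
/// tests below:
///
/// ```ignore
/// assert_eq!(graph.collect_root_ancestors(idx3), vec![idx0]);
/// assert_eq!(graph.collect_root_ancestors(idx0), vec![]);
/// ```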
pub fn collect_root_ancestors(&self, node: NodeIndex) -> Vec<NodeIndex> {
let mut visited = HashSet::new();
self.get_root_ancestors(node, node, &mut visited)
}
// TODO: impl Iterator
fn parent_node_indexes(&self, node: NodeIndex) -> Vec<NodeIndex> {
self.graph.neighbors_directed(node, Direction::Incoming).collect()
}
fn get_root_ancestors(&self, initial: NodeIndex, node: NodeIndex, visited: &mut HashSet<NodeIndex>) -> Vec<NodeIndex> {
if node == initial && !visited.is_empty() {
return vec![];
}
let parents = self.parent_node_indexes(node);
let mut collection = Vec::with_capacity(parents.len());
for ancestor in &parents {
visited.insert(*ancestor);
}
for ancestor in &parents {
let ancestors = self.parent_node_indexes(*ancestor);
if !ancestors.is_empty() {
collection.extend(self.get_root_ancestors(initial, *ancestor, visited));
} else {
collection.push(*ancestor);
}
}
collection
}
}
#[cfg(test)]
impl CachedStableGraph {
fn parent_node_names(&self, node: NodeIndex) -> Vec<PathBuf> {
self.graph
.neighbors_directed(node, Direction::Incoming)
.map(|n| self.reverse_index.get(&n).unwrap().clone())
.collect()
}
fn child_node_names(&self, node: NodeIndex) -> Vec<PathBuf> {
self.graph
.neighbors(node)
.map(|n| self.reverse_index.get(&n).unwrap().clone())
.collect()
}
fn remove_node(&mut self, name: &Path) {
let idx = self.cache.remove(name);
if let Some(idx) = idx {
self.graph.remove_node(idx);
}
}
}
#[cfg(test)]
mod graph_test {
use std::path::PathBuf;
use petgraph::graph::NodeIndex;
use crate::{graph::CachedStableGraph, IncludePosition};
#[test]
#[logging_macro::log_scope]
fn test_graph_two_connected_nodes() {
let mut graph = CachedStableGraph::new();
let idx1 = graph.add_node(&PathBuf::from("sample"));
let idx2 = graph.add_node(&PathBuf::from("banana"));
graph.add_edge(idx1, idx2, IncludePosition { line: 3, start: 0, end: 0 });
let children = graph.child_node_names(idx1);
assert_eq!(children.len(), 1);
assert_eq!(children[0], Into::<PathBuf>::into("banana".to_string()));
let children: Vec<NodeIndex> = graph.child_node_indexes(idx1).collect();
assert_eq!(children.len(), 1);
assert_eq!(children[0], idx2);
let parents = graph.parent_node_names(idx1);
assert_eq!(parents.len(), 0);
let parents = graph.parent_node_names(idx2);
assert_eq!(parents.len(), 1);
assert_eq!(parents[0], Into::<PathBuf>::into("sample".to_string()));
let parents = graph.parent_node_indexes(idx2);
assert_eq!(parents.len(), 1);
assert_eq!(parents[0], idx1);
let ancestors = graph.collect_root_ancestors(idx2);
assert_eq!(ancestors.len(), 1);
assert_eq!(ancestors[0], idx1);
let ancestors = graph.collect_root_ancestors(idx1);
assert_eq!(ancestors.len(), 0);
graph.remove_node(&PathBuf::from("sample"));
assert_eq!(graph.graph.node_count(), 1);
assert!(graph.find_node(&PathBuf::from("sample")).is_none());
let neighbors = graph.child_node_names(idx2);
assert_eq!(neighbors.len(), 0);
}
#[test]
#[logging_macro::log_scope]
fn test_double_import() {
let mut graph = CachedStableGraph::new();
let idx0 = graph.add_node(&PathBuf::from("0"));
let idx1 = graph.add_node(&PathBuf::from("1"));
graph.add_edge(idx0, idx1, IncludePosition { line: 2, start: 0, end: 0 });
graph.add_edge(idx0, idx1, IncludePosition { line: 4, start: 0, end: 0 });
// 0
// / \
// 1 1
assert_eq!(2, graph.get_child_positions(idx0, idx1).count());
let mut edge_metas = graph.get_child_positions(idx0, idx1);
assert_eq!(Some(IncludePosition { line: 2, start: 0, end: 0 }), edge_metas.next());
assert_eq!(Some(IncludePosition { line: 4, start: 0, end: 0 }), edge_metas.next());
}
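// A hedged sketch of `remove_edge` semantics, mirroring the scaffolding above:
// removing one of two duplicate includes (matched by its `IncludePosition`)
// should leave only the other position behind.
#[test]
#[logging_macro::log_scope]
fn test_remove_edge() {
let mut graph = CachedStableGraph::new();
let idx0 = graph.add_node(&PathBuf::from("0"));
let idx1 = graph.add_node(&PathBuf::from("1"));
graph.add_edge(idx0, idx1, IncludePosition { line: 2, start: 0, end: 0 });
graph.add_edge(idx0, idx1, IncludePosition { line: 4, start: 0, end: 0 });
graph.remove_edge(idx0, idx1, IncludePosition { line: 2, start: 0, end: 0 });
let mut positions = graph.get_child_positions(idx0, idx1);
assert_eq!(Some(IncludePosition { line: 4, start: 0, end: 0 }), positions.next());
assert_eq!(None, positions.next());
}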
#[test]
#[logging_macro::log_scope]
fn test_collect_root_ancestors() {
{
let mut graph = CachedStableGraph::new();
let idx0 = graph.add_node(&PathBuf::from("0"));
let idx1 = graph.add_node(&PathBuf::from("1"));
let idx2 = graph.add_node(&PathBuf::from("2"));
let idx3 = graph.add_node(&PathBuf::from("3"));
graph.add_edge(idx0, idx1, IncludePosition { line: 2, start: 0, end: 0 });
graph.add_edge(idx1, idx2, IncludePosition { line: 3, start: 0, end: 0 });
graph.add_edge(idx3, idx1, IncludePosition { line: 4, start: 0, end: 0 });
// 0 3
// |/
// 1
// |
// 2
let roots = graph.collect_root_ancestors(idx2);
assert_eq!(roots, vec![idx3, idx0]);
let roots = graph.collect_root_ancestors(idx1);
assert_eq!(roots, vec![idx3, idx0]);
let roots = graph.collect_root_ancestors(idx0);
assert_eq!(roots, vec![]);
let roots = graph.collect_root_ancestors(idx3);
assert_eq!(roots, vec![]);
}
{
let mut graph = CachedStableGraph::new();
let idx0 = graph.add_node(&PathBuf::from("0"));
let idx1 = graph.add_node(&PathBuf::from("1"));
let idx2 = graph.add_node(&PathBuf::from("2"));
let idx3 = graph.add_node(&PathBuf::from("3"));
graph.add_edge(idx0, idx1, IncludePosition { line: 2, start: 0, end: 0 });
graph.add_edge(idx0, idx2, IncludePosition { line: 3, start: 0, end: 0 });
graph.add_edge(idx1, idx3, IncludePosition { line: 5, start: 0, end: 0 });
// 0
// / \
// 1 2
// /
// 3
let roots = graph.collect_root_ancestors(idx3);
assert_eq!(roots, vec![idx0]);
let roots = graph.collect_root_ancestors(idx2);
assert_eq!(roots, vec![idx0]);
let roots = graph.collect_root_ancestors(idx1);
assert_eq!(roots, vec![idx0]);
let roots = graph.collect_root_ancestors(idx0);
assert_eq!(roots, vec![]);
}
{
let mut graph = CachedStableGraph::new();
let idx0 = graph.add_node(&PathBuf::from("0"));
let idx1 = graph.add_node(&PathBuf::from("1"));
let idx2 = graph.add_node(&PathBuf::from("2"));
let idx3 = graph.add_node(&PathBuf::from("3"));
graph.add_edge(idx0, idx1, IncludePosition { line: 2, start: 0, end: 0 });
graph.add_edge(idx2, idx3, IncludePosition { line: 3, start: 0, end: 0 });
graph.add_edge(idx1, idx3, IncludePosition { line: 5, start: 0, end: 0 });
// 0
// \
// 2 1
// \ /
// 3
let roots = graph.collect_root_ancestors(idx3);
assert_eq!(roots, vec![idx0, idx2]);
let roots = graph.collect_root_ancestors(idx2);
assert_eq!(roots, vec![]);
let roots = graph.collect_root_ancestors(idx1);
assert_eq!(roots, vec![idx0]);
let roots = graph.collect_root_ancestors(idx0);
assert_eq!(roots, vec![]);
}
{
let mut graph = CachedStableGraph::new();
let idx0 = graph.add_node(&PathBuf::from("0"));
let idx1 = graph.add_node(&PathBuf::from("1"));
let idx2 = graph.add_node(&PathBuf::from("2"));
let idx3 = graph.add_node(&PathBuf::from("3"));
graph.add_edge(idx0, idx1, IncludePosition { line: 2, start: 0, end: 0 });
graph.add_edge(idx1, idx2, IncludePosition { line: 4, start: 0, end: 0 });
graph.add_edge(idx1, idx3, IncludePosition { line: 6, start: 0, end: 0 });
// 0
// |
// 1
// / \
// 2 3
let roots = graph.collect_root_ancestors(idx3);
assert_eq!(roots, vec![idx0]);
let roots = graph.collect_root_ancestors(idx2);
assert_eq!(roots, vec![idx0]);
let roots = graph.collect_root_ancestors(idx1);
assert_eq!(roots, vec![idx0]);
let roots = graph.collect_root_ancestors(idx0);
assert_eq!(roots, vec![]);
}
}
}


@ -0,0 +1,80 @@
use rust_lsp::lsp_types::Position;
pub struct LineMap {
positions: Vec<usize>,
}
impl LineMap {
pub fn new(source: &str) -> Self {
let mut positions = vec![0];
for (i, char) in source.char_indices() {
if char == '\n' {
positions.push(i + 1);
}
}
LineMap { positions }
}
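/// Converts an LSP `Position` into a byte offset into the source.
///
/// A hedged sketch of the contract (this assumes `position.character` can be
/// treated as a byte count on that line, which holds for ASCII content):
///
/// ```ignore
/// let map = LineMap::new("sample\ntext");
/// assert_eq!(map.offset_for_position(Position { line: 1, character: 2 }), 9);
/// ```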
pub fn offset_for_position(&self, position: Position) -> usize {
self.positions[position.line as usize] + (position.character as usize)
}
}
#[cfg(test)]
mod test {
use rust_lsp::lsp_types::Position;
use crate::linemap::LineMap;
#[test]
#[logging_macro::log_scope]
fn test_linemap() {
struct Test {
string: &'static str,
pos: Position,
offset: usize,
}
let cases = vec![
Test {
string: "sample\ntext",
pos: Position { line: 1, character: 2 },
offset: 9,
},
Test {
string: "banana",
pos: Position { line: 0, character: 0 },
offset: 0,
},
Test {
string: "banana",
pos: Position { line: 0, character: 1 },
offset: 1,
},
Test {
string: "sample\ntext",
pos: Position { line: 1, character: 0 },
offset: 7,
},
Test {
string: "sample\n\ttext",
pos: Position { line: 1, character: 2 },
offset: 9,
},
Test {
string: "sample\r\ntext",
pos: Position { line: 1, character: 0 },
offset: 8,
},
];
for case in cases {
let linemap = LineMap::new(case.string);
let offset = linemap.offset_for_position(case.pos);
assert_eq!(offset, case.offset, "{:?}", case.string);
}
}
}


@ -13,4 +13,4 @@ pub struct StatusParams {
pub status: String,
pub message: Option<String>,
pub icon: Option<String>,
}

server/main/src/main.rs Normal file

@ -0,0 +1,959 @@
#![feature(once_cell)]
#![feature(option_get_or_insert_default)]
use merge_views::FilialTuple;
use rust_lsp::jsonrpc::{method_types::*, *};
use rust_lsp::lsp::*;
use rust_lsp::lsp_types::{notification::*, *};
use petgraph::stable_graph::NodeIndex;
use path_slash::PathExt;
use serde::Deserialize;
use serde_json::{from_value, Value};
use tree_sitter::Parser;
use url_norm::FromUrl;
use walkdir::WalkDir;
use std::collections::{HashMap, HashSet};
use std::convert::TryFrom;
use std::fmt::{Debug, Display, Formatter};
use std::fs;
use std::io::{stdin, stdout, BufRead, BufReader};
use std::iter::{Extend, FromIterator};
use std::rc::Rc;
use std::str::FromStr;
use std::{
cell::RefCell,
path::{Path, PathBuf},
};
use slog::Level;
use slog_scope::{debug, error, info, warn};
use path_slash::PathBufExt;
use anyhow::{anyhow, Result};
use regex::Regex;
use lazy_static::lazy_static;
mod commands;
mod configuration;
mod consts;
mod dfs;
mod diagnostics_parser;
mod graph;
mod linemap;
mod lsp_ext;
mod merge_views;
mod navigation;
mod opengl;
mod source_mapper;
mod url_norm;
#[cfg(test)]
mod test;
pub fn is_top_level(path: &Path) -> bool {
let path = path.to_slash().unwrap();
if !RE_WORLD_FOLDER.is_match(&path) {
return false;
}
let parts: Vec<&str> = path.split('/').collect();
let len = parts.len();
(len == 3 || len == 2) && TOPLEVEL_FILES.contains(parts[len - 1])
}
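// Hedged sanity checks for `is_top_level`; assumes, as `lint` does below, that
// paths are given relative to the shader pack root.
#[cfg(test)]
mod toplevel_test {
use super::is_top_level;
use std::path::Path;
#[test]
fn test_is_top_level() {
assert!(is_top_level(Path::new("shaders/composite.fsh")));
assert!(is_top_level(Path::new("shaders/world-1/deferred_a.csh")));
assert!(!is_top_level(Path::new("shaders/lib/util.glsl")));
assert!(!is_top_level(Path::new("shaders/world0/lib/common.glsl")));
}
}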
lazy_static! {
static ref RE_INCLUDE: Regex = Regex::new(r#"^(?:\s)*?(?:#include) "(.+)"\r?"#).unwrap();
static ref RE_WORLD_FOLDER: Regex = Regex::new(r#"^shaders(/world-?\d+)?"#).unwrap();
static ref TOPLEVEL_FILES: HashSet<String> = {
let mut set = HashSet::with_capacity(1716);
for ext in ["fsh", "vsh", "gsh", "csh"] {
set.insert(format!("composite.{}", ext));
set.insert(format!("deferred.{}", ext));
set.insert(format!("prepare.{}", ext));
set.insert(format!("shadowcomp.{}", ext));
for i in 1..=99 {
set.insert(format!("composite{}.{}", i, ext));
set.insert(format!("deferred{}.{}", i, ext));
set.insert(format!("prepare{}.{}", i, ext));
set.insert(format!("shadowcomp{}.{}", i, ext));
}
set.insert(format!("composite_pre.{}", ext));
set.insert(format!("deferred_pre.{}", ext));
set.insert(format!("final.{}", ext));
set.insert(format!("gbuffers_armor_glint.{}", ext));
set.insert(format!("gbuffers_basic.{}", ext));
set.insert(format!("gbuffers_beaconbeam.{}", ext));
set.insert(format!("gbuffers_block.{}", ext));
set.insert(format!("gbuffers_clouds.{}", ext));
set.insert(format!("gbuffers_damagedblock.{}", ext));
set.insert(format!("gbuffers_entities.{}", ext));
set.insert(format!("gbuffers_entities_glowing.{}", ext));
set.insert(format!("gbuffers_hand.{}", ext));
set.insert(format!("gbuffers_hand_water.{}", ext));
set.insert(format!("gbuffers_item.{}", ext));
set.insert(format!("gbuffers_line.{}", ext));
set.insert(format!("gbuffers_skybasic.{}", ext));
set.insert(format!("gbuffers_skytextured.{}", ext));
set.insert(format!("gbuffers_spidereyes.{}", ext));
set.insert(format!("gbuffers_terrain.{}", ext));
set.insert(format!("gbuffers_terrain_cutout.{}", ext));
set.insert(format!("gbuffers_terrain_cutout_mip.{}", ext));
set.insert(format!("gbuffers_terrain_solid.{}", ext));
set.insert(format!("gbuffers_textured.{}", ext));
set.insert(format!("gbuffers_textured_lit.{}", ext));
set.insert(format!("gbuffers_water.{}", ext));
set.insert(format!("gbuffers_weather.{}", ext));
set.insert(format!("shadow.{}", ext));
set.insert(format!("shadow_cutout.{}", ext));
set.insert(format!("shadow_solid.{}", ext));
}
for suffix_char in 'a'..='z' {
set.insert(format!("composite_{}.csh", suffix_char));
set.insert(format!("deferred_{}.csh", suffix_char));
set.insert(format!("prepare_{}.csh", suffix_char));
set.insert(format!("shadowcomp_{}.csh", suffix_char));
for i in 1..=99 {
let total_suffix = format!("{}_{}", i, suffix_char);
set.insert(format!("composite{}.csh", total_suffix));
set.insert(format!("deferred{}.csh", total_suffix));
set.insert(format!("prepare{}.csh", total_suffix));
set.insert(format!("shadowcomp{}.csh", total_suffix));
}
}
set
};
}
fn main() {
let guard = logging::set_logger_with_level(Level::Info);
let endpoint_output = LSPEndpoint::create_lsp_output_with_output_stream(stdout);
let cache_graph = graph::CachedStableGraph::new();
let mut parser = Parser::new();
parser.set_language(tree_sitter_glsl::language()).unwrap();
let mut langserver = MinecraftShaderLanguageServer {
endpoint: endpoint_output.clone(),
graph: Rc::new(RefCell::new(cache_graph)),
root: "".into(),
command_provider: None,
opengl_context: Rc::new(opengl::OpenGlContext::new()),
tree_sitter: Rc::new(RefCell::new(parser)),
log_guard: Some(guard),
};
langserver.command_provider = Some(commands::CustomCommandProvider::new(vec![
(
"graphDot",
Box::new(commands::graph_dot::GraphDotCommand {
graph: langserver.graph.clone(),
}),
),
(
"virtualMerge",
Box::new(commands::merged_includes::VirtualMergedDocument {
graph: langserver.graph.clone(),
}),
),
(
"parseTree",
Box::new(commands::parse_tree::TreeSitterSExpr {
tree_sitter: langserver.tree_sitter.clone(),
}),
),
]));
LSPEndpoint::run_server_from_input(&mut stdin().lock(), endpoint_output, langserver);
}
pub struct MinecraftShaderLanguageServer {
endpoint: Endpoint,
graph: Rc<RefCell<graph::CachedStableGraph>>,
root: PathBuf,
command_provider: Option<commands::CustomCommandProvider>,
opengl_context: Rc<dyn opengl::ShaderValidator>,
tree_sitter: Rc<RefCell<Parser>>,
log_guard: Option<slog_scope::GlobalLoggerGuard>,
}
#[derive(Clone, Copy, PartialEq, Eq, Hash)]
pub struct IncludePosition {
/// The 0-indexed line on which the include lives.
line: usize,
/// The 0-indexed char offset defining the start of the include path string.
start: usize,
/// The 0-indexed char offset defining the end of the include path string.
end: usize,
}
impl Debug for IncludePosition {
fn fmt(&self, f: &mut Formatter<'_>) -> std::fmt::Result {
write!(f, "{{line: {}}}", self.line)
}
}
impl Display for IncludePosition {
fn fmt(&self, f: &mut Formatter<'_>) -> Result<(), std::fmt::Error> {
write!(f, "{{line: {}}}", self.line)
}
}
#[derive(Debug)]
pub enum TreeType {
Fragment,
Vertex,
Geometry,
Compute,
}
impl MinecraftShaderLanguageServer {
pub fn error_not_available<DATA>(data: DATA) -> MethodError<DATA> {
let msg = "Functionality not implemented.".to_string();
MethodError::<DATA> {
code: 1,
message: msg,
data,
}
}
fn build_initial_graph(&self) {
info!("generating graph for current root"; "root" => self.root.to_str().unwrap());
// filter out directories and any files not ending in one of the supported extensions
WalkDir::new(&self.root)
.into_iter()
.filter_map(|entry| {
let entry = entry.ok()?;
let path = entry.path();
if path.is_dir() {
return None;
}
let ext = path.extension()?;
// TODO: include user added extensions with a set
if ext != "vsh" && ext != "fsh" && ext != "csh" && ext != "gsh" && ext != "glsl" && ext != "inc" {
return None;
}
Some(entry.into_path())
})
.for_each(|path| {
// iterate all valid found files, search for includes, add a node into the graph for each
// file and add a file->includes KV into the map
self.add_file_and_includes_to_graph(&path);
});
info!("finished building project include graph");
}
fn add_file_and_includes_to_graph(&self, path: &Path) {
let includes = self.find_includes(path);
let idx = self.graph.borrow_mut().add_node(path);
debug!("adding includes for new file"; "file" => path.to_str().unwrap(), "includes" => format!("{:?}", includes));
for include in includes {
self.add_include(include, idx);
}
}
fn add_include(&self, include: (PathBuf, IncludePosition), node: NodeIndex) {
let child = self.graph.borrow_mut().add_node(&include.0);
self.graph.borrow_mut().add_edge(node, child, include.1);
}
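/// Scans `file` line-by-line for `#include` directives. Absolute-style paths
/// (leading `/`) resolve against `<root>/shaders`; relative paths resolve
/// against the including file's directory.
///
/// A hedged sketch with a hypothetical file layout:
///
/// ```ignore
/// // shaders/composite.fsh containing `#include "/lib/common.glsl"` yields
/// // (<root>/shaders/lib/common.glsl, IncludePosition { line: 0, .. })
/// ```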
pub fn find_includes(&self, file: &Path) -> Vec<(PathBuf, IncludePosition)> {
let mut includes = Vec::default();
let buf = BufReader::new(std::fs::File::open(file).unwrap());
buf.lines()
.enumerate()
.filter_map(|line| match line.1 {
Ok(t) => Some((line.0, t)),
Err(_e) => None,
})
.filter(|line| RE_INCLUDE.is_match(line.1.as_str()))
.for_each(|line| {
let cap = RE_INCLUDE.captures(line.1.as_str()).unwrap().get(1).unwrap();
let start = cap.start();
let end = cap.end();
let mut path: String = cap.as_str().into();
let full_include = if path.starts_with('/') {
path = path.strip_prefix('/').unwrap().to_string();
self.root.join("shaders").join(PathBuf::from_slash(&path))
} else {
file.parent().unwrap().join(PathBuf::from_slash(&path))
};
includes.push((full_include, IncludePosition { line: line.0, start, end }));
});
includes
}
fn update_includes(&self, file: &Path) {
let includes = self.find_includes(file);
info!("includes found for file"; "file" => file.to_str().unwrap(), "includes" => format!("{:?}", includes));
let idx = match self.graph.borrow_mut().find_node(file) {
None => return,
Some(n) => n,
};
let prev_children: HashSet<_> = HashSet::from_iter(self.graph.borrow().get_all_child_positions(idx).map(|tup| {
(self.graph.borrow().get_node(tup.0), tup.1)
}));
let new_children: HashSet<_> = includes.iter().cloned().collect();
let to_be_added = new_children.difference(&prev_children);
let to_be_removed = prev_children.difference(&new_children);
debug!(
"include sets diff'd";
"for removal" => format!("{:?}", to_be_removed),
"for addition" => format!("{:?}", to_be_added)
);
for removal in to_be_removed {
let child = self.graph.borrow_mut().find_node(&removal.0).unwrap();
self.graph.borrow_mut().remove_edge(idx, child, removal.1);
}
for insertion in to_be_added {
self.add_include(includes.iter().find(|f| f.0 == *insertion.0).unwrap().clone(), idx);
}
}
pub fn lint(&self, uri: &Path) -> Result<HashMap<Url, Vec<Diagnostic>>> {
// get all top level ancestors of this file
let file_ancestors = self.get_file_toplevel_ancestors(uri)?.unwrap_or_default();
info!(
"top-level file ancestors found";
"uri" => uri.to_str().unwrap(),
"ancestors" => format!("{:?}", file_ancestors
.iter()
.map(|e| PathBuf::from_str(
&self.graph.borrow().graph[*e].clone()
)
.unwrap())
.collect::<Vec<PathBuf>>())
);
// the set of all filepath->content.
let mut all_sources: HashMap<PathBuf, String> = HashMap::new();
// the set of filepath->list of diagnostics to report
let mut diagnostics: HashMap<Url, Vec<Diagnostic>> = HashMap::new();
// we want to backfill the diagnostics map with all linked sources
let back_fill = |all_sources: &HashMap<PathBuf, String>, diagnostics: &mut HashMap<Url, Vec<Diagnostic>>| {
for path in all_sources.keys() {
diagnostics.entry(Url::from_file_path(path).unwrap()).or_default();
}
};
// if we are a top-level file (i.e. one of the program filenames defined by OptiFine)
if file_ancestors.is_empty() {
// gather the list of all descendants
let root = self.graph.borrow_mut().find_node(uri).unwrap();
let tree = match self.get_dfs_for_node(root) {
Ok(tree) => tree,
Err(e) => {
diagnostics.insert(Url::from_file_path(uri).unwrap(), vec![e.into()]);
return Ok(diagnostics);
}
};
all_sources.extend(self.load_sources(&tree)?);
let mut source_mapper = source_mapper::SourceMapper::new(all_sources.len());
let view = {
let graph = self.graph.borrow();
merge_views::MergeViewBuilder::new(&tree, &all_sources, &graph, &mut source_mapper).build()
};
let root_path = self.graph.borrow().get_node(root);
let ext = match root_path.extension() {
Some(ext) => ext.to_str().unwrap(),
None => {
back_fill(&all_sources, &mut diagnostics);
return Ok(diagnostics);
}
};
if !is_top_level(root_path.strip_prefix(&self.root).unwrap()) {
warn!("got a non-valid toplevel file"; "root_ancestor" => root_path.to_str().unwrap(), "stripped" => root_path.strip_prefix(&self.root).unwrap().to_str().unwrap());
back_fill(&all_sources, &mut diagnostics);
return Ok(diagnostics);
}
let tree_type = match ext {
"fsh" => TreeType::Fragment,
"vsh" => TreeType::Vertex,
"gsh" => TreeType::Geometry,
"csh" => TreeType::Compute,
// is_top_level() above guarantees one of these four extensions
_ => unreachable!(),
};
let stdout = match self.compile_shader_source(&view, tree_type, &root_path) {
Some(s) => s,
None => {
back_fill(&all_sources, &mut diagnostics);
return Ok(diagnostics);
}
};
let diagnostics_parser = diagnostics_parser::DiagnosticsParser::new(self.opengl_context.as_ref());
diagnostics.extend(diagnostics_parser.parse_diagnostics_output(stdout, uri, &source_mapper, &self.graph.borrow()));
} else {
let mut all_trees: Vec<(TreeType, Vec<FilialTuple>)> = Vec::new();
for root in &file_ancestors {
let nodes = match self.get_dfs_for_node(*root) {
Ok(nodes) => nodes,
Err(e) => {
diagnostics.insert(Url::from_file_path(uri).unwrap(), vec![e.into()]);
back_fill(&all_sources, &mut diagnostics); // TODO: confirm
return Ok(diagnostics);
}
};
let root_path = self.graph.borrow().get_node(*root).clone();
let ext = match root_path.extension() {
Some(ext) => ext.to_str().unwrap(),
None => continue,
};
if !is_top_level(root_path.strip_prefix(&self.root).unwrap()) {
warn!("got a non-valid toplevel file"; "root_ancestor" => root_path.to_str().unwrap(), "stripped" => root_path.strip_prefix(&self.root).unwrap().to_str().unwrap());
continue;
}
let tree_type = match ext {
"fsh" => TreeType::Fragment,
"vsh" => TreeType::Vertex,
"gsh" => TreeType::Geometry,
"csh" => TreeType::Compute,
// is_top_level() above guarantees one of these four extensions
_ => unreachable!(),
};
let sources = self.load_sources(&nodes)?;
all_trees.push((tree_type, nodes));
all_sources.extend(sources);
}
for tree in all_trees {
// bit over-zealous in allocation but better than having to resize
let mut source_mapper = source_mapper::SourceMapper::new(all_sources.len());
let view = {
let graph = self.graph.borrow();
merge_views::MergeViewBuilder::new(&tree.1, &all_sources, &graph, &mut source_mapper).build()
};
let root_path = self.graph.borrow().get_node(tree.1.first().unwrap().child);
let stdout = match self.compile_shader_source(&view, tree.0, &root_path) {
Some(s) => s,
None => continue,
};
let diagnostics_parser = diagnostics_parser::DiagnosticsParser::new(self.opengl_context.as_ref());
diagnostics.extend(diagnostics_parser.parse_diagnostics_output(stdout, uri, &source_mapper, &self.graph.borrow()));
}
};
back_fill(&all_sources, &mut diagnostics);
Ok(diagnostics)
}
fn compile_shader_source(&self, source: &str, tree_type: TreeType, path: &Path) -> Option<String> {
let result = self.opengl_context.validate(tree_type, source);
match &result {
Some(output) => {
info!("compilation errors reported"; "errors" => format!("`{}`", output.replace('\n', "\\n")), "tree_root" => path.to_str().unwrap())
}
None => info!("compilation reported no errors"; "tree_root" => path.to_str().unwrap()),
};
result
}
pub fn get_dfs_for_node(&self, root: NodeIndex) -> Result<Vec<FilialTuple>, dfs::error::CycleError> {
let graph_ref = self.graph.borrow();
let dfs = dfs::Dfs::new(&graph_ref, root);
dfs.collect::<Result<_, _>>()
}
pub fn load_sources(&self, nodes: &[FilialTuple]) -> Result<HashMap<PathBuf, String>> {
let mut sources = HashMap::new();
for node in nodes {
let graph = self.graph.borrow();
let path = graph.get_node(node.child);
if sources.contains_key(&path) {
continue;
}
let source = match fs::read_to_string(&path) {
Ok(s) => s,
Err(e) => return Err(anyhow!("error reading {:?}: {}", path, e)),
};
let source = source.replace("\r\n", "\n");
sources.insert(path.clone(), source);
}
Ok(sources)
}
fn get_file_toplevel_ancestors(&self, uri: &Path) -> Result<Option<Vec<petgraph::stable_graph::NodeIndex>>> {
let curr_node = match self.graph.borrow_mut().find_node(uri) {
Some(n) => n,
None => return Err(anyhow!("node not found {:?}", uri)),
};
let roots = self.graph.borrow().collect_root_ancestors(curr_node);
if roots.is_empty() {
return Ok(None);
}
Ok(Some(roots))
}
pub fn publish_diagnostic(&self, diagnostics: HashMap<Url, Vec<Diagnostic>>, document_version: Option<i32>) {
// info!("DIAGNOSTICS:\n{:?}", diagnostics);
for (uri, diagnostics) in diagnostics {
self.endpoint
.send_notification(
PublishDiagnostics::METHOD,
PublishDiagnosticsParams {
uri,
diagnostics,
version: document_version,
},
)
.expect("failed to publish diagnostics");
}
}
fn set_status(&self, status: impl Into<String>, message: impl Into<String>, icon: impl Into<String>) {
self.endpoint
.send_notification(
lsp_ext::Status::METHOD,
lsp_ext::StatusParams {
status: status.into(),
message: Some(message.into()),
icon: Some(icon.into()),
},
)
.unwrap_or(());
}
}
impl LanguageServerHandling for MinecraftShaderLanguageServer {
fn initialize(&mut self, params: InitializeParams, completable: MethodCompletable<InitializeResult, InitializeError>) {
logging::slog_with_trace_id(|| {
info!("starting server...");
let capabilities = ServerCapabilities {
definition_provider: Some(OneOf::Left(true)),
references_provider: Some(OneOf::Left(true)),
document_symbol_provider: Some(OneOf::Left(true)),
document_link_provider: Some(DocumentLinkOptions {
resolve_provider: None,
work_done_progress_options: WorkDoneProgressOptions { work_done_progress: None },
}),
execute_command_provider: Some(ExecuteCommandOptions {
commands: vec!["graphDot".into()],
work_done_progress_options: WorkDoneProgressOptions { work_done_progress: None },
}),
text_document_sync: Some(TextDocumentSyncCapability::Options(TextDocumentSyncOptions {
open_close: Some(true),
will_save: None,
will_save_wait_until: None,
change: Some(TextDocumentSyncKind::FULL),
save: Some(TextDocumentSyncSaveOptions::SaveOptions(SaveOptions { include_text: Some(true) })),
})),
..ServerCapabilities::default()
};
let root = match params.root_uri {
Some(uri) => PathBuf::from_url(uri),
None => {
completable.complete(Err(MethodError {
code: 42069,
message: "Must be in workspace".into(),
data: InitializeError { retry: false },
}));
return;
}
};
completable.complete(Ok(InitializeResult {
capabilities,
server_info: None,
}));
self.set_status("loading", "Building dependency graph...", "$(loading~spin)");
self.root = root;
self.build_initial_graph();
self.set_status("ready", "Project initialized", "$(check)");
});
}
fn shutdown(&mut self, _: (), completable: LSCompletable<()>) {
warn!("shutting down language server...");
completable.complete(Ok(()));
}
fn exit(&mut self, _: ()) {
self.endpoint.request_shutdown();
}
fn workspace_change_configuration(&mut self, params: DidChangeConfigurationParams) {
logging::slog_with_trace_id(|| {
#[derive(Deserialize)]
struct Configuration {
#[serde(alias = "logLevel")]
log_level: String,
}
if let Some(settings) = params.settings.as_object().unwrap().get("mcglsl") {
let config: Configuration = from_value(settings.to_owned()).unwrap();
info!("got updated configuration"; "config" => settings.to_string());
configuration::handle_log_level_change(config.log_level, |level| {
self.log_guard = None; // set to None so Drop is invoked
self.log_guard = Some(logging::set_logger_with_level(level));
})
}
});
}
fn did_open_text_document(&mut self, params: DidOpenTextDocumentParams) {
logging::slog_with_trace_id(|| {
//info!("opened doc {}", params.text_document.uri);
let path = PathBuf::from_url(params.text_document.uri);
if !path.starts_with(&self.root) {
return;
}
if self.graph.borrow_mut().find_node(&path).is_none() {
self.add_file_and_includes_to_graph(&path);
}
match self.lint(&path) {
Ok(diagnostics) => self.publish_diagnostic(diagnostics, None),
Err(e) => error!("error linting"; "error" => format!("{:?}", e), "path" => path.to_str().unwrap()),
}
});
}
fn did_change_text_document(&mut self, _: DidChangeTextDocumentParams) {}
fn did_close_text_document(&mut self, _: DidCloseTextDocumentParams) {}
fn did_save_text_document(&mut self, params: DidSaveTextDocumentParams) {
logging::slog_with_trace_id(|| {
let path = PathBuf::from_url(params.text_document.uri);
if !path.starts_with(&self.root) {
return;
}
self.update_includes(&path);
match self.lint(&path) {
Ok(diagnostics) => self.publish_diagnostic(diagnostics, None),
Err(e) => error!("error linting"; "error" => format!("{:?}", e), "path" => path.to_str().unwrap()),
}
});
}
fn did_change_watched_files(&mut self, _: DidChangeWatchedFilesParams) {}
fn completion(&mut self, _: TextDocumentPositionParams, completable: LSCompletable<CompletionList>) {
completable.complete(Err(Self::error_not_available(())));
}
fn resolve_completion_item(&mut self, _: CompletionItem, completable: LSCompletable<CompletionItem>) {
completable.complete(Err(Self::error_not_available(())));
}
fn hover(&mut self, _: TextDocumentPositionParams, _: LSCompletable<Hover>) {
/* completable.complete(Ok(Hover{
contents: HoverContents::Markup(MarkupContent{
kind: MarkupKind::Markdown,
value: String::from("# Hello World"),
}),
range: None,
})); */
}
fn execute_command(&mut self, params: ExecuteCommandParams, completable: LSCompletable<Option<Value>>) {
logging::slog_with_trace_id(|| {
match self
.command_provider
.as_ref()
.unwrap()
.execute(&params.command, &params.arguments, &self.root)
{
Ok(resp) => {
info!("executed command successfully"; "command" => params.command.clone());
self.endpoint
.send_notification(
ShowMessage::METHOD,
ShowMessageParams {
typ: MessageType::INFO,
message: format!("Command {} executed successfully.", params.command),
},
)
.expect("failed to send popup/show message notification");
completable.complete(Ok(Some(resp)))
}
Err(err) => {
error!("failed to execute command"; "command" => params.command.clone(), "error" => format!("{:?}", err));
self.endpoint
.send_notification(
ShowMessage::METHOD,
ShowMessageParams {
typ: MessageType::ERROR,
message: format!("Failed to execute `{}`. Reason: {}", params.command, err),
},
)
.expect("failed to send popup/show message notification");
completable.complete(Err(MethodError::new(32420, err.to_string(), ())))
}
}
});
}
fn signature_help(&mut self, _: TextDocumentPositionParams, completable: LSCompletable<SignatureHelp>) {
completable.complete(Err(Self::error_not_available(())));
}
fn goto_definition(&mut self, params: TextDocumentPositionParams, completable: LSCompletable<Vec<Location>>) {
logging::slog_with_trace_id(|| {
let path = PathBuf::from_url(params.text_document.uri);
if !path.starts_with(&self.root) {
return;
}
let parser = &mut self.tree_sitter.borrow_mut();
let parser_ctx = match navigation::ParserContext::new(parser, &path) {
Ok(ctx) => ctx,
Err(e) => {
return completable.complete(Err(MethodError {
code: 42069,
message: format!("error building parser context: error={}, path={:?}", e, path),
data: (),
}))
}
};
match parser_ctx.find_definitions(&path, params.position) {
Ok(locations) => completable.complete(Ok(locations.unwrap_or_default())),
Err(e) => completable.complete(Err(MethodError {
code: 42069,
message: format!("error finding definitions: error={}, path={:?}", e, path),
data: (),
})),
}
});
}
fn references(&mut self, params: ReferenceParams, completable: LSCompletable<Vec<Location>>) {
logging::slog_with_trace_id(|| {
let path = PathBuf::from_url(params.text_document_position.text_document.uri);
if !path.starts_with(&self.root) {
return;
}
let parser = &mut self.tree_sitter.borrow_mut();
let parser_ctx = match navigation::ParserContext::new(parser, &path) {
Ok(ctx) => ctx,
Err(e) => {
return completable.complete(Err(MethodError {
code: 42069,
message: format!("error building parser context: error={}, path={:?}", e, path),
data: (),
}))
}
};
match parser_ctx.find_references(&path, params.text_document_position.position) {
Ok(locations) => completable.complete(Ok(locations.unwrap_or_default())),
Err(e) => completable.complete(Err(MethodError {
code: 42069,
message: format!("error finding references: error={}, path={:?}", e, path),
data: (),
})),
}
});
}
fn document_highlight(&mut self, _: TextDocumentPositionParams, completable: LSCompletable<Vec<DocumentHighlight>>) {
completable.complete(Err(Self::error_not_available(())));
}
fn document_symbols(&mut self, params: DocumentSymbolParams, completable: LSCompletable<DocumentSymbolResponse>) {
logging::slog_with_trace_id(|| {
let path = PathBuf::from_url(params.text_document.uri);
if !path.starts_with(&self.root) {
return;
}
let parser = &mut self.tree_sitter.borrow_mut();
let parser_ctx = match navigation::ParserContext::new(parser, &path) {
Ok(ctx) => ctx,
Err(e) => {
return completable.complete(Err(MethodError {
code: 42069,
message: format!("error building parser context: error={}, path={:?}", e, path),
data: (),
}))
}
};
match parser_ctx.list_symbols(&path) {
Ok(symbols) => completable.complete(Ok(DocumentSymbolResponse::from(symbols.unwrap_or_default()))),
Err(e) => {
return completable.complete(Err(MethodError {
code: 42069,
message: format!("error listing symbols: error={}, path={:?}", e, path),
data: (),
}))
}
}
});
}
fn workspace_symbols(&mut self, _: WorkspaceSymbolParams, completable: LSCompletable<DocumentSymbolResponse>) {
completable.complete(Err(Self::error_not_available(())));
}
fn code_action(&mut self, _: CodeActionParams, completable: LSCompletable<Vec<Command>>) {
completable.complete(Err(Self::error_not_available(())));
}
fn code_lens(&mut self, _: CodeLensParams, completable: LSCompletable<Vec<CodeLens>>) {
completable.complete(Err(Self::error_not_available(())));
}
fn code_lens_resolve(&mut self, _: CodeLens, completable: LSCompletable<CodeLens>) {
completable.complete(Err(Self::error_not_available(())));
}
fn document_link(&mut self, params: DocumentLinkParams, completable: LSCompletable<Vec<DocumentLink>>) {
logging::slog_with_trace_id(|| {
// node for current document
let curr_doc = PathBuf::from_url(params.text_document.uri);
let node = match self.graph.borrow_mut().find_node(&curr_doc) {
Some(n) => n,
None => {
warn!("document not found in graph"; "path" => curr_doc.to_str().unwrap());
completable.complete(Ok(vec![]));
return;
}
};
let edges: Vec<DocumentLink> = self
.graph
.borrow()
.child_node_indexes(node)
.filter_map::<Vec<DocumentLink>, _>(|child| {
let graph = self.graph.borrow();
graph.get_child_positions(node, child).map(|value| {
let path = graph.get_node(child);
let url = match Url::from_file_path(&path) {
Ok(url) => url,
Err(e) => {
error!("error converting into url"; "path" => path.to_str().unwrap(), "error" => format!("{:?}", e));
return None;
}
};
Some(DocumentLink {
range: Range::new(
Position::new(u32::try_from(value.line).unwrap(), u32::try_from(value.start).unwrap()),
Position::new(u32::try_from(value.line).unwrap(), u32::try_from(value.end).unwrap()),
),
target: Some(url.clone()),
tooltip: Some(url.path().to_string()),
data: None,
})
}).collect()
})
.flatten()
.collect();
debug!("document link results";
"links" => format!("{:?}", edges.iter().map(|e| (e.range, e.target.as_ref().unwrap().path())).collect::<Vec<_>>()),
"path" => curr_doc.to_str().unwrap(),
);
completable.complete(Ok(edges));
});
}
fn document_link_resolve(&mut self, _: DocumentLink, completable: LSCompletable<DocumentLink>) {
completable.complete(Err(Self::error_not_available(())));
}
fn formatting(&mut self, _: DocumentFormattingParams, completable: LSCompletable<Vec<TextEdit>>) {
completable.complete(Err(Self::error_not_available(())));
}
fn range_formatting(&mut self, _: DocumentRangeFormattingParams, completable: LSCompletable<Vec<TextEdit>>) {
completable.complete(Err(Self::error_not_available(())));
}
fn on_type_formatting(&mut self, _: DocumentOnTypeFormattingParams, completable: LSCompletable<Vec<TextEdit>>) {
completable.complete(Err(Self::error_not_available(())));
}
fn rename(&mut self, _: RenameParams, completable: LSCompletable<WorkspaceEdit>) {
completable.complete(Err(Self::error_not_available(())));
}
}


@@ -0,0 +1,645 @@
use std::cmp::min;
use std::iter::Peekable;
use std::{
collections::{HashMap, LinkedList, VecDeque},
path::{Path, PathBuf},
};
use core::slice::Iter;
use petgraph::stable_graph::NodeIndex;
use slog_scope::debug;
use crate::graph::CachedStableGraph;
use crate::source_mapper::SourceMapper;
use crate::IncludePosition;
/// FilialTuple pairs a child node with any of its legitimate parents. The
/// parent is `None` when the child is a top-level node in the tree.
#[derive(Hash, PartialEq, Eq, Debug, Clone, Copy)]
pub struct FilialTuple {
pub child: NodeIndex,
pub parent: Option<NodeIndex>,
}
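Because the parent is part of the key's identity, the same child under different parents (or under no parent) maps to independent entries, which is exactly what `last_offset_set` below relies on. A minimal standalone sketch of the same keying pattern, using plain `u32` indices in place of petgraph's `NodeIndex`:

```rust
use std::collections::HashMap;

// Simplified stand-in for FilialTuple: plain u32 instead of petgraph::NodeIndex.
#[derive(Hash, PartialEq, Eq, Debug, Clone, Copy)]
struct Pair {
    child: u32,
    parent: Option<u32>,
}

fn main() {
    let mut offsets: HashMap<Pair, usize> = HashMap::new();
    offsets.insert(Pair { child: 0, parent: None }, 0); // root file: no parent
    offsets.insert(Pair { child: 0, parent: Some(1) }, 42); // same child under a parent
    // The two entries are distinct keys: the parent is part of the identity.
    assert_eq!(offsets.len(), 2);
    assert_eq!(offsets[&Pair { child: 0, parent: Some(1) }], 42);
}
```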
/// Merges the source strings according to the nodes comprising a tree of imports into a GLSL source string
/// that can be handed off to the GLSL compiler.
pub struct MergeViewBuilder<'a> {
nodes: &'a [FilialTuple],
nodes_peeker: Peekable<Iter<'a, FilialTuple>>,
sources: &'a HashMap<PathBuf, String>,
graph: &'a CachedStableGraph,
source_mapper: &'a mut SourceMapper,
// holds the offset into the child's source that has already been added to the merge
// list on behalf of a given parent. A child can have multiple parents in a tree, and
// can be included multiple times by the same parent, so the offset is tracked per
// (child, parent) tuple rather than per child alone.
last_offset_set: HashMap<FilialTuple, usize>,
// holds, for any given filial tuple, the iterator yielding all the positions at which the child
// is included into the parent in line-sorted order. This is necessary for files that are imported
// more than once into the same parent, so we can easily get the next include position.
parent_child_edge_iterator: HashMap<FilialTuple, Box<(dyn Iterator<Item = IncludePosition> + 'a)>>,
}
impl<'a> MergeViewBuilder<'a> {
pub fn new(
nodes: &'a [FilialTuple], sources: &'a HashMap<PathBuf, String>, graph: &'a CachedStableGraph, source_mapper: &'a mut SourceMapper,
) -> Self {
MergeViewBuilder {
nodes,
nodes_peeker: nodes.iter().peekable(),
sources,
graph,
source_mapper,
last_offset_set: HashMap::new(),
parent_child_edge_iterator: HashMap::new(),
}
}
pub fn build(&mut self) -> String {
// contains additionally inserted lines such as #line and other directives, preamble defines etc
let mut extra_lines: Vec<String> = Vec::with_capacity((self.nodes.len() * 2) + 2);
// list of source code views onto the below sources
let mut merge_list: LinkedList<&'a str> = LinkedList::new();
// invariant: nodes_peeker always has _at least_ one element. Can't save a not-file :B
let first = self.nodes_peeker.next().unwrap().child;
let first_path = self.graph.get_node(first);
let first_source = self.sources.get(&first_path).unwrap();
// seed source_mapper with top-level file
self.source_mapper.get_num(first);
let version_line_offset = self.find_version_offset(first_source);
let _version_char_offsets = self.char_offset_for_line(version_line_offset, first_source);
// add_preamble(
// version_line_offset,
// version_char_offsets.1,
// &first_path,
// first,
// first_source,
// &mut merge_list,
// &mut extra_lines,
// source_mapper,
// );
// last_offset_set.insert((first, None), version_char_offsets.1);
self.set_last_offset_for_tuple(None, first, 0);
// stack to keep track of the depth first traversal
let mut stack = VecDeque::<NodeIndex>::new();
self.create_merge_views(&mut merge_list, &mut extra_lines, &mut stack);
// now we add a view of the remainder of the root file
let offset = self.get_last_offset_for_tuple(None, first).unwrap();
let len = first_source.len();
merge_list.push_back(&first_source[min(offset, len)..]);
let total_len = merge_list.iter().fold(0, |a, b| a + b.len());
let mut merged = String::with_capacity(total_len);
merged.extend(merge_list);
merged
}
fn create_merge_views(&mut self, merge_list: &mut LinkedList<&'a str>, extra_lines: &mut Vec<String>, stack: &mut VecDeque<NodeIndex>) {
loop {
let n = match self.nodes_peeker.next() {
Some(n) => n,
None => return,
};
// invariant: never None as only the first element in `nodes` should have a None, which is popped off in the calling function
let (parent, child) = (n.parent.unwrap(), n.child);
// gets the next include position for the filial tuple, seeding if this is the first time querying this tuple
let edge = self
.parent_child_edge_iterator
.entry(*n)
.or_insert_with(|| {
let child_positions = self.graph.get_child_positions(parent, child);
Box::new(child_positions)
})
.next()
.unwrap();
let parent_path = self.graph.get_node(parent).clone();
let child_path = self.graph.get_node(child).clone();
let parent_source = self.sources.get(&parent_path).unwrap();
let (char_for_line, char_following_line) = self.char_offset_for_line(edge.line, parent_source);
let offset = *self
.set_last_offset_for_tuple(stack.back().copied(), parent, char_following_line)
.get_or_insert(0);
debug!("creating view to start child file";
"parent" => parent_path.to_str().unwrap(), "child" => child_path.to_str().unwrap(),
"grandparent" => stack.back().copied().map(|g| self.graph.get_node(g).to_str().unwrap().to_string()),
"last_parent_offset" => offset, "line" => edge.line, "char_for_line" => char_for_line,
"char_following_line" => char_following_line,
);
merge_list.push_back(&parent_source[offset..char_for_line]);
self.add_opening_line_directive(&child_path, child, merge_list, extra_lines);
match self.nodes_peeker.peek() {
Some(next) => {
let next = *next;
// if the next pair's parent is not a child of the current pair, we dump the rest of this child's source
if next.parent.unwrap() != child {
let child_source = self.sources.get(&child_path).unwrap();
// if ends in \n\n, we want to exclude the last \n for some reason. Ask optilad
let offset = {
match child_source.ends_with('\n') {
true => child_source.len() - 1,
false => child_source.len(),
}
};
merge_list.push_back(&child_source[..offset]);
self.set_last_offset_for_tuple(Some(parent), child, 0);
// +2 because edge.line is 0 indexed but #line is 1 indexed and references the *following* line
self.add_closing_line_directive(edge.line + 2, &parent_path, parent, merge_list, extra_lines);
// if the next pair's parent is not the current pair's parent, we need to bubble up
if stack.contains(&next.parent.unwrap()) {
return;
}
continue;
}
stack.push_back(parent);
self.create_merge_views(merge_list, extra_lines, stack);
stack.pop_back();
let offset = self.get_last_offset_for_tuple(Some(parent), child).unwrap();
let child_source = self.sources.get(&child_path).unwrap();
// the check below fails once the file contents have been exhausted, i.e. offset == child_source.len() + 1
let end_offset = match child_source.ends_with('\n') {
true => 1,
false => 0,
};
if offset < child_source.len() - end_offset {
// if ends in \n\n, we want to exclude the last \n for some reason. Ask optilad
merge_list.push_back(&child_source[offset..child_source.len() - end_offset]);
self.set_last_offset_for_tuple(Some(parent), child, 0);
}
// +2 because edge.line is 0 indexed but #line is 1 indexed and references the *following* line
self.add_closing_line_directive(edge.line + 2, &parent_path, parent, merge_list, extra_lines);
// we need to check the next item at the point of original return further down the callstack
if self.nodes_peeker.peek().is_some() && stack.contains(&self.nodes_peeker.peek().unwrap().parent.unwrap()) {
return;
}
}
None => {
let child_source = self.sources.get(&child_path).unwrap();
// if ends in \n\n, we want to exclude the last \n for some reason. Ask optilad
let offset = match child_source.ends_with('\n') {
true => child_source.len() - 1,
false => child_source.len(),
};
merge_list.push_back(&child_source[..offset]);
self.set_last_offset_for_tuple(Some(parent), child, 0);
// +2 because edge.line is 0 indexed but #line is 1 indexed and references the *following* line
self.add_closing_line_directive(edge.line + 2, &parent_path, parent, merge_list, extra_lines);
}
}
}
}
fn set_last_offset_for_tuple(&mut self, parent: Option<NodeIndex>, child: NodeIndex, offset: usize) -> Option<usize> {
debug!("inserting last offset";
"parent" => parent.map(|p| self.graph.get_node(p).to_str().unwrap().to_string()),
"child" => self.graph.get_node(child).to_str().unwrap().to_string(),
"offset" => offset);
self.last_offset_set.insert(FilialTuple { child, parent }, offset)
}
fn get_last_offset_for_tuple(&self, parent: Option<NodeIndex>, child: NodeIndex) -> Option<usize> {
self.last_offset_set.get(&FilialTuple { child, parent }).copied()
}
// returns the character offset of the start of line `line_num`, and the character
// offset of the start of the following line (i.e. one past the end of line
// `line_num`, including its newline)
fn char_offset_for_line(&self, line_num: usize, source: &str) -> (usize, usize) {
let mut char_for_line: usize = 0;
let mut char_following_line: usize = 0;
for (n, line) in source.lines().enumerate() {
if n == line_num {
char_following_line += line.len() + 1;
break;
}
char_for_line += line.len() + 1;
char_following_line = char_for_line;
}
(char_for_line, char_following_line)
}
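The offset arithmetic can be sanity-checked on a tiny input; here is the same loop as a standalone free function (an illustrative copy, not part of the crate), with the expected pair for a three-line source:

```rust
// Standalone copy of char_offset_for_line: returns (start of line `line_num`,
// start of the following line), counting each line's trailing newline.
fn char_offset_for_line(line_num: usize, source: &str) -> (usize, usize) {
    let mut char_for_line: usize = 0;
    let mut char_following_line: usize = 0;
    for (n, line) in source.lines().enumerate() {
        if n == line_num {
            char_following_line += line.len() + 1;
            break;
        }
        char_for_line += line.len() + 1;
        char_following_line = char_for_line;
    }
    (char_for_line, char_following_line)
}

fn main() {
    let src = "abc\ndefg\nhi\n";
    // Line 1 ("defg") starts at offset 4; the line after it starts at offset 9.
    assert_eq!(char_offset_for_line(1, src), (4, 9));
}
```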
fn find_version_offset(&self, source: &str) -> usize {
source
.lines()
.enumerate()
.find(|(_, line)| line.starts_with("#version "))
.map_or(0, |(i, _)| i)
}
// fn add_preamble<'a>(
// version_line_offset: usize, version_char_offset: usize, path: &Path, node: NodeIndex, source: &'a str,
// merge_list: &mut LinkedList<&'a str>, extra_lines: &mut Vec<String>, source_mapper: &mut SourceMapper,
// ) {
// // TODO: Optifine #define preamble
// merge_list.push_back(&source[..version_char_offset]);
// let google_line_directive = format!(
// "#extension GL_GOOGLE_cpp_style_line_directive : enable\n#line {} {} // {}\n",
// // +2 because 0 indexed but #line is 1 indexed and references the *following* line
// version_line_offset + 2,
// source_mapper.get_num(node),
// path.to_str().unwrap().replace('\\', "\\\\"),
// );
// extra_lines.push(google_line_directive);
// unsafe_get_and_insert(merge_list, extra_lines);
// }
fn add_opening_line_directive(
&mut self, path: &Path, node: NodeIndex, merge_list: &mut LinkedList<&str>, extra_lines: &mut Vec<String>,
) {
let line_directive = format!(
"#line 1 {} // {}\n",
self.source_mapper.get_num(node),
path.to_str().unwrap().replace('\\', "\\\\")
);
extra_lines.push(line_directive);
self.unsafe_get_and_insert(merge_list, extra_lines);
}
fn add_closing_line_directive(
&mut self, line: usize, path: &Path, node: NodeIndex, merge_list: &mut LinkedList<&str>, extra_lines: &mut Vec<String>,
) {
// Optifine doesn't seem to add a leading newline if the previous line was a #line directive
let line_directive = if let Some(l) = merge_list.back() {
if l.trim().starts_with("#line") {
format!(
"#line {} {} // {}\n",
line,
self.source_mapper.get_num(node),
path.to_str().unwrap().replace('\\', "\\\\")
)
} else {
format!(
"\n#line {} {} // {}\n",
line,
self.source_mapper.get_num(node),
path.to_str().unwrap().replace('\\', "\\\\")
)
}
} else {
format!(
"\n#line {} {} // {}\n",
line,
self.source_mapper.get_num(node),
path.to_str().unwrap().replace('\\', "\\\\")
)
};
extra_lines.push(line_directive);
self.unsafe_get_and_insert(merge_list, extra_lines);
}
fn unsafe_get_and_insert(&self, merge_list: &mut LinkedList<&str>, extra_lines: &[String]) {
// SAFETY: relies on `build()` having reserved enough capacity up-front that
// `extra_lines` never reallocates, so the reference to its last element stays valid. :^)
unsafe {
let vec_ptr_offset = extra_lines.as_ptr().add(extra_lines.len() - 1);
merge_list.push_back(&vec_ptr_offset.as_ref().unwrap()[..]);
}
}
}
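The "+2" passed to `add_closing_line_directive` encodes two off-by-one shifts at once: `edge.line` is 0-indexed, while the GLSL `#line N` directive is 1-indexed *and* names the line that follows it. A minimal sketch of just that arithmetic (hypothetical helper, simplified from the directive formatting above):

```rust
// edge_line is 0-indexed; `#line` is 1-indexed and refers to the *next* line,
// hence the +2 when resuming the parent after an include.
fn closing_line_directive(edge_line: usize, source_num: usize, path: &str) -> String {
    format!("#line {} {} // {}\n", edge_line + 2, source_num, path)
}

fn main() {
    // An include on the third line of a file (index 2) resumes at 1-indexed line 4.
    assert_eq!(closing_line_directive(2, 1, "a.glsl"), "#line 4 1 // a.glsl\n");
}
```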
#[cfg(test)]
mod merge_view_test {
use std::fs;
use std::path::PathBuf;
use crate::merge_views::MergeViewBuilder;
use crate::source_mapper::SourceMapper;
use crate::test::{copy_to_and_set_root, new_temp_server};
use crate::IncludePosition;
#[test]
#[logging_macro::log_scope]
fn test_generate_merge_list_01() {
let mut server = new_temp_server(None);
let (_tmp_dir, tmp_path) = copy_to_and_set_root("./testdata/01", &mut server);
server.endpoint.request_shutdown();
let final_idx = server.graph.borrow_mut().add_node(&tmp_path.join("shaders").join("final.fsh"));
let common_idx = server.graph.borrow_mut().add_node(&tmp_path.join("shaders").join("common.glsl"));
server
.graph
.borrow_mut()
.add_edge(final_idx, common_idx, IncludePosition { line: 2, start: 0, end: 0 });
let nodes = server.get_dfs_for_node(final_idx).unwrap();
let sources = server.load_sources(&nodes).unwrap();
let graph_borrow = server.graph.borrow();
let mut source_mapper = SourceMapper::new(0);
let result = MergeViewBuilder::new(&nodes, &sources, &graph_borrow, &mut source_mapper).build();
let merge_file = tmp_path.join("shaders").join("final.fsh.merge");
let mut truth = fs::read_to_string(merge_file).unwrap();
// truth = truth.replacen(
// "!!",
// &tmp_path.join("shaders").join("final.fsh").to_str().unwrap().replace('\\', "\\\\"),
// 1,
// );
truth = truth.replacen(
"!!",
&tmp_path.join("shaders").join("common.glsl").to_str().unwrap().replace('\\', "\\\\"),
1,
);
truth = truth.replace(
"!!",
&tmp_path.join("shaders").join("final.fsh").to_str().unwrap().replace('\\', "\\\\"),
);
assert_eq!(result, truth);
}
#[test]
#[logging_macro::log_scope]
fn test_generate_merge_list_02() {
let mut server = new_temp_server(None);
let (_tmp_dir, tmp_path) = copy_to_and_set_root("./testdata/02", &mut server);
server.endpoint.request_shutdown();
let final_idx = server.graph.borrow_mut().add_node(&tmp_path.join("shaders").join("final.fsh"));
let test_idx = server
.graph
.borrow_mut()
.add_node(&tmp_path.join("shaders").join("utils").join("test.glsl"));
let burger_idx = server
.graph
.borrow_mut()
.add_node(&tmp_path.join("shaders").join("utils").join("burger.glsl"));
let sample_idx = server
.graph
.borrow_mut()
.add_node(&tmp_path.join("shaders").join("utils").join("sample.glsl"));
server
.graph
.borrow_mut()
.add_edge(final_idx, sample_idx, IncludePosition { line: 2, start: 0, end: 0 });
server
.graph
.borrow_mut()
.add_edge(sample_idx, burger_idx, IncludePosition { line: 4, start: 0, end: 0 });
server
.graph
.borrow_mut()
.add_edge(sample_idx, test_idx, IncludePosition { line: 6, start: 0, end: 0 });
let nodes = server.get_dfs_for_node(final_idx).unwrap();
let sources = server.load_sources(&nodes).unwrap();
let graph_borrow = server.graph.borrow();
let mut source_mapper = SourceMapper::new(0);
let result = MergeViewBuilder::new(&nodes, &sources, &graph_borrow, &mut source_mapper).build();
let merge_file = tmp_path.join("shaders").join("final.fsh.merge");
let mut truth = fs::read_to_string(merge_file).unwrap();
// truth = truth.replacen(
// "!!",
// &tmp_path.join("shaders").join("final.fsh").to_str().unwrap().replace('\\', "\\\\"),
// 1,
// );
for file in &["sample.glsl", "burger.glsl", "sample.glsl", "test.glsl", "sample.glsl"] {
let path = tmp_path.clone();
truth = truth.replacen(
"!!",
&path
.join("shaders")
.join("utils")
.join(file)
.to_str()
.unwrap()
.replace('\\', "\\\\"),
1,
);
}
truth = truth.replacen(
"!!",
&tmp_path.join("shaders").join("final.fsh").to_str().unwrap().replace('\\', "\\\\"),
1,
);
assert_eq!(result, truth);
}
#[test]
#[logging_macro::log_scope]
fn test_generate_merge_list_03() {
let mut server = new_temp_server(None);
let (_tmp_dir, tmp_path) = copy_to_and_set_root("./testdata/03", &mut server);
server.endpoint.request_shutdown();
let final_idx = server.graph.borrow_mut().add_node(&tmp_path.join("shaders").join("final.fsh"));
let test_idx = server
.graph
.borrow_mut()
.add_node(&tmp_path.join("shaders").join("utils").join("test.glsl"));
let burger_idx = server
.graph
.borrow_mut()
.add_node(&tmp_path.join("shaders").join("utils").join("burger.glsl"));
let sample_idx = server
.graph
.borrow_mut()
.add_node(&tmp_path.join("shaders").join("utils").join("sample.glsl"));
server
.graph
.borrow_mut()
.add_edge(final_idx, sample_idx, IncludePosition { line: 2, start: 0, end: 0 });
server
.graph
.borrow_mut()
.add_edge(sample_idx, burger_idx, IncludePosition { line: 4, start: 0, end: 0 });
server
.graph
.borrow_mut()
.add_edge(sample_idx, test_idx, IncludePosition { line: 6, start: 0, end: 0 });
let nodes = server.get_dfs_for_node(final_idx).unwrap();
let sources = server.load_sources(&nodes).unwrap();
let graph_borrow = server.graph.borrow();
let mut source_mapper = SourceMapper::new(0);
let result = MergeViewBuilder::new(&nodes, &sources, &graph_borrow, &mut source_mapper).build();
let merge_file = tmp_path.join("shaders").join("final.fsh.merge");
let mut truth = fs::read_to_string(merge_file).unwrap();
// truth = truth.replacen(
// "!!",
// &tmp_path.join("shaders").join("final.fsh").to_str().unwrap().replace('\\', "\\\\"),
// 1,
// );
for file in &["sample.glsl", "burger.glsl", "sample.glsl", "test.glsl", "sample.glsl"] {
let path = tmp_path.clone();
truth = truth.replacen(
"!!",
&path
.join("shaders")
.join("utils")
.join(file)
.to_str()
.unwrap()
.replace('\\', "\\\\"),
1,
);
}
truth = truth.replacen(
"!!",
&tmp_path.join("shaders").join("final.fsh").to_str().unwrap().replace('\\', "\\\\"),
1,
);
assert_eq!(result, truth);
}
#[test]
#[logging_macro::log_scope]
fn test_generate_merge_list_04() {
let mut server = new_temp_server(None);
let (_tmp_dir, tmp_path) = copy_to_and_set_root("./testdata/04", &mut server);
server.endpoint.request_shutdown();
let final_idx = server.graph.borrow_mut().add_node(&tmp_path.join("shaders").join("final.fsh"));
let utilities_idx = server
.graph
.borrow_mut()
.add_node(&tmp_path.join("shaders").join("utils").join("utilities.glsl"));
let stuff1_idx = server
.graph
.borrow_mut()
.add_node(&tmp_path.join("shaders").join("utils").join("stuff1.glsl"));
let stuff2_idx = server
.graph
.borrow_mut()
.add_node(&tmp_path.join("shaders").join("utils").join("stuff2.glsl"));
let matrices_idx = server
.graph
.borrow_mut()
.add_node(&tmp_path.join("shaders").join("lib").join("matrices.glsl"));
server
.graph
.borrow_mut()
.add_edge(final_idx, utilities_idx, IncludePosition { line: 2, start: 0, end: 0 });
server
.graph
.borrow_mut()
.add_edge(utilities_idx, stuff1_idx, IncludePosition { line: 0, start: 0, end: 0 });
server
.graph
.borrow_mut()
.add_edge(utilities_idx, stuff2_idx, IncludePosition { line: 1, start: 0, end: 0 });
server
.graph
.borrow_mut()
.add_edge(final_idx, matrices_idx, IncludePosition { line: 3, start: 0, end: 0 });
let nodes = server.get_dfs_for_node(final_idx).unwrap();
let sources = server.load_sources(&nodes).unwrap();
let graph_borrow = server.graph.borrow();
let mut source_mapper = SourceMapper::new(0);
let result = MergeViewBuilder::new(&nodes, &sources, &graph_borrow, &mut source_mapper).build();
let merge_file = tmp_path.join("shaders").join("final.fsh.merge");
let mut truth = fs::read_to_string(merge_file).unwrap();
for file in &[
// PathBuf::new().join("final.fsh").to_str().unwrap(),
PathBuf::new().join("utils").join("utilities.glsl").to_str().unwrap(),
PathBuf::new().join("utils").join("stuff1.glsl").to_str().unwrap(),
PathBuf::new().join("utils").join("utilities.glsl").to_str().unwrap(),
PathBuf::new().join("utils").join("stuff2.glsl").to_str().unwrap(),
PathBuf::new().join("utils").join("utilities.glsl").to_str().unwrap(),
PathBuf::new().join("final.fsh").to_str().unwrap(),
PathBuf::new().join("lib").join("matrices.glsl").to_str().unwrap(),
PathBuf::new().join("final.fsh").to_str().unwrap(),
] {
let path = tmp_path.clone();
truth = truth.replacen("!!", &path.join("shaders").join(file).to_str().unwrap().replace('\\', "\\\\"), 1);
}
assert_eq!(result, truth);
}
#[test]
#[logging_macro::log_scope]
fn test_generate_merge_list_06() {
let mut server = new_temp_server(None);
let (_tmp_dir, tmp_path) = copy_to_and_set_root("./testdata/06", &mut server);
server.endpoint.request_shutdown();
let final_idx = server.graph.borrow_mut().add_node(&tmp_path.join("shaders").join("final.fsh"));
let test_idx = server.graph.borrow_mut().add_node(&tmp_path.join("shaders").join("test.glsl"));
server
.graph
.borrow_mut()
.add_edge(final_idx, test_idx, IncludePosition { line: 3, start: 0, end: 0 });
server
.graph
.borrow_mut()
.add_edge(final_idx, test_idx, IncludePosition { line: 5, start: 0, end: 0 });
let nodes = server.get_dfs_for_node(final_idx).unwrap();
let sources = server.load_sources(&nodes).unwrap();
let graph_borrow = server.graph.borrow();
let mut source_mapper = SourceMapper::new(0);
let result = MergeViewBuilder::new(&nodes, &sources, &graph_borrow, &mut source_mapper).build();
let merge_file = tmp_path.join("shaders").join("final.fsh.merge");
let mut truth = fs::read_to_string(merge_file).unwrap();
for file in &[
// PathBuf::new().join("final.fsh").to_str().unwrap(),
PathBuf::new().join("test.glsl").to_str().unwrap(),
PathBuf::new().join("final.fsh").to_str().unwrap(),
PathBuf::new().join("test.glsl").to_str().unwrap(),
PathBuf::new().join("final.fsh").to_str().unwrap(),
] {
let path = tmp_path.clone();
truth = truth.replacen("!!", &path.join("shaders").join(file).to_str().unwrap().replace('\\', "\\\\"), 1);
}
assert_eq!(result, truth);
}
}


@@ -0,0 +1,429 @@
use std::{collections::HashMap, fs::read_to_string, path::Path, vec};
use anyhow::Result;
use rust_lsp::lsp_types::{DocumentSymbol, Location, Position, Range, SymbolKind};
use slog_scope::{debug, info, trace};
use tree_sitter::{Node, Parser, Point, Query, QueryCursor, Tree};
use url::Url;
use crate::linemap::LineMap;
#[derive(Clone, Debug, Hash, PartialEq, Eq, Default)]
struct SymbolName(String);
impl SymbolName {
// construct a new SymbolName from a node and its node ID for overload disambiguation.
fn new(node: &Node, source: &str, node_id: usize) -> Self {
let mut fqname = vec![format!("{}[{}]", node.utf8_text(source.as_bytes()).unwrap(), node_id)];
// first node will always have a parent
let mut prev = *node;
let mut node = node.parent().unwrap();
loop {
match (node.kind(), prev.kind()) {
("function_definition", "compound_statement") => {
let func_ident = node.child_by_field_name("declarator").unwrap().child(0).unwrap();
fqname.push(format!("{}[{}]", func_ident.utf8_text(source.as_bytes()).unwrap(), func_ident.id()));
}
("struct_specifier", "field_declaration_list") => {
let struct_ident = node.child_by_field_name("name").unwrap();
fqname.push(format!(
"{}[{}]",
struct_ident.utf8_text(source.as_bytes()).unwrap(),
struct_ident.id()
));
}
_ => (),
}
prev = node;
node = match node.parent() {
Some(n) => n,
None => break,
};
}
fqname.reverse();
SymbolName(fqname.join("/"))
}
fn parent(&self) -> Option<Self> {
self.0.rsplit_once('/').map(|(left, _)| SymbolName(left.to_string()))
}
}
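A `SymbolName` is a path of `"ident[node_id]"` segments joined by `/`, so `parent()` reduces to dropping the last segment with `rsplit_once`. The same operation on plain strings (illustrative names, not from the crate):

```rust
// parent() of a '/'-joined fully-qualified name: drop the last segment.
fn parent(fqname: &str) -> Option<String> {
    fqname.rsplit_once('/').map(|(left, _)| left.to_string())
}

fn main() {
    // A struct field's fqname nests under the struct's segment.
    let field = "Light[7]/intensity[9]";
    assert_eq!(parent(field).as_deref(), Some("Light[7]"));
    // A top-level symbol has no parent.
    assert_eq!(parent("main[3]"), None);
}
```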
impl slog::Value for SymbolName {
fn serialize(&self, record: &slog::Record, key: slog::Key, serializer: &mut dyn slog::Serializer) -> slog::Result {
self.0.serialize(record, key, serializer)
}
}
macro_rules! find_function_def_str {
() => {
r#"
(
(function_declarator
(identifier) @function)
(#match? @function "^{}$")
)
"#
};
}
macro_rules! find_function_refs_str {
() => {
r#"
(
(call_expression
(identifier) @call)
(#match? @call "^{}$")
)
"#
};
}
macro_rules! find_variable_def_str {
() => {
r#"
[
(init_declarator
(identifier) @variable)
(parameter_declaration
(identifier) @variable)
(declaration
(identifier) @variable)
(#match? @variable "^{}$")
]
"#
};
}
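These queries are defined as `macro_rules!` macros rather than `const` strings because `format!` requires its template to be a literal; a macro expanding to a string literal satisfies that, letting the `{}` inside the query be filled with the identifier at lookup time (as in `format!(find_function_def_str!(), …)` further down). A self-contained sketch of the trick, with a hypothetical single-capture query:

```rust
// A macro expanding to a raw string literal can serve as a format! template;
// the embedded "{}" becomes a format placeholder for the identifier.
macro_rules! find_ident_str {
    () => {
        r#"((identifier) @id (#match? @id "^{}$"))"#
    };
}

fn main() {
    let query = format!(find_ident_str!(), "lightColor");
    // The identifier is spliced into the #match? regex.
    assert!(query.contains(r#""^lightColor$""#));
}
```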
const LIST_SYMBOLS_STR: &str = r#"
; global consts
(declaration
(type_qualifier) @const_qualifier
(init_declarator
(identifier) @const_ident))
(#match? @const_qualifier "^const")
; global uniforms, varyings, struct variables etc
(translation_unit
(declaration
(identifier) @ident))
; #defines
(preproc_def
(identifier) @define_ident)
; function definitions
(function_declarator
(identifier) @func_ident)
; struct definitions
(struct_specifier
(type_identifier) @struct_ident)
; struct fields
(struct_specifier
(field_declaration_list
(field_declaration
[
(field_identifier) @field_ident
(array_declarator
(field_identifier) @field_ident)
])) @field_list)
"#;
pub struct ParserContext<'a> {
source: String,
tree: Tree,
linemap: LineMap,
parser: &'a mut Parser,
}
impl<'a> ParserContext<'a> {
pub fn new(parser: &'a mut Parser, path: &Path) -> Result<Self> {
let source = read_to_string(path)?;
let tree = parser.parse(&source, None).unwrap();
let linemap = LineMap::new(&source);
Ok(ParserContext {
source,
tree,
linemap,
parser,
})
}
pub fn list_symbols(&self, _path: &Path) -> Result<Option<Vec<DocumentSymbol>>> {
let query = Query::new(tree_sitter_glsl::language(), LIST_SYMBOLS_STR)?;
let mut query_cursor = QueryCursor::new();
let mut parent_child_vec: Vec<(Option<SymbolName>, DocumentSymbol)> = vec![];
let mut fqname_to_index: HashMap<SymbolName, usize> = HashMap::new();
for (m, _) in query_cursor.captures(&query, self.root_node(), self.source.as_bytes()) {
if m.captures.is_empty() {
continue;
}
let mut capture_iter = m.captures.iter();
let capture = capture_iter.next().unwrap();
let capture_name = query.capture_names()[capture.index as usize].as_str();
trace!("next capture name"; "name" => capture_name, "capture" => format!("{:?}", capture));
let (kind, node) = match capture_name {
"const_qualifier" => (SymbolKind::CONSTANT, capture_iter.next().unwrap().node),
"ident" => (SymbolKind::VARIABLE, capture.node),
"func_ident" => (SymbolKind::FUNCTION, capture.node),
"define_ident" => (SymbolKind::STRING, capture.node),
"struct_ident" => (SymbolKind::STRUCT, capture.node),
"field_list" => (SymbolKind::FIELD, capture_iter.next().unwrap().node),
_ => (SymbolKind::NULL, capture.node),
};
let range = Range {
start: Position {
line: node.start_position().row as u32,
character: node.start_position().column as u32,
},
end: Position {
line: node.end_position().row as u32,
character: node.end_position().column as u32,
},
};
let name = node.utf8_text(self.source.as_bytes()).unwrap().to_string();
let fqname = SymbolName::new(&node, self.source.as_str(), node.id());
debug!("found symbol"; "node_name" => &name, "kind" => format!("{:?}", kind), "fqname" => &fqname);
let child_symbol = DocumentSymbol {
name,
detail: None,
kind,
tags: None,
deprecated: None,
range,
selection_range: range,
children: None,
};
parent_child_vec.push((fqname.parent(), child_symbol));
trace!("inserting fqname"; "fqname" => &fqname, "index" => parent_child_vec.len() - 1);
fqname_to_index.insert(fqname, parent_child_vec.len() - 1);
}
// let mut symbols = vec![];
for i in 1..parent_child_vec.len() {
let (left, right) = parent_child_vec.split_at_mut(i);
let parent = &right[0].0;
let child = &right[0].1;
if let Some(parent) = parent {
trace!("finding parent"; "parent_symbol_name" => &parent, "child" => format!("{:?}", child), "split_point" => i, "left_len" => left.len(), "right_len" => right.len());
let parent_index = fqname_to_index.get(parent).unwrap();
let parent_sym = &mut left[*parent_index];
parent_sym.1.children.get_or_insert_default().push(right[0].1.clone())
}
}
let symbols = parent_child_vec
.iter()
.filter(|tuple| tuple.0.is_none())
.map(|tuple| tuple.1.clone())
.collect();
Ok(Some(symbols))
}
pub fn find_definitions(&self, path: &Path, point: Position) -> Result<Option<Vec<Location>>> {
let current_node = match self.find_node_at_point(point) {
Some(node) => node,
None => return Ok(None),
};
let parent = match current_node.parent() {
Some(parent) => parent,
None => return Ok(None),
};
debug!("matching location lookup method for parent-child tuple"; "parent" => parent.kind(), "child" => current_node.kind());
let locations = match (current_node.kind(), parent.kind()) {
(_, "call_expression") => {
let query_str = format!(find_function_def_str!(), current_node.utf8_text(self.source.as_bytes())?);
self.simple_global_search(path, &query_str)?
}
("identifier", "argument_list")
| ("identifier", "field_expression")
| ("identifier", "binary_expression")
| ("identifier", "assignment_expression") => self.tree_climbing_search(path, current_node)?,
_ => return Ok(None),
};
info!("finished searching for definitions"; "count" => locations.len(), "definitions" => format!("{:?}", locations));
Ok(Some(locations))
}
pub fn find_references(&self, path: &Path, point: Position) -> Result<Option<Vec<Location>>> {
let current_node = match self.find_node_at_point(point) {
Some(node) => node,
None => return Ok(None),
};
let parent = match current_node.parent() {
Some(parent) => parent,
None => return Ok(None),
};
let locations = match (current_node.kind(), parent.kind()) {
(_, "function_declarator") => {
let query_str = format!(find_function_refs_str!(), current_node.utf8_text(self.source.as_bytes())?);
self.simple_global_search(path, &query_str)?
}
_ => return Ok(None),
};
info!("finished searching for references"; "count" => locations.len(), "references" => format!("{:?}", locations));
Ok(Some(locations))
}
fn tree_climbing_search(&self, path: &Path, start_node: Node) -> Result<Vec<Location>> {
let mut locations = vec![];
let node_text = start_node.utf8_text(self.source.as_bytes())?;
let query_str = format!(find_variable_def_str!(), node_text);
debug!("built query string"; "query" => &query_str);
let mut parent = start_node.parent();
loop {
if parent.is_none() {
trace!("no more parent left, found nothing");
break;
}
let query = Query::new(tree_sitter_glsl::language(), &query_str)?;
let mut query_cursor = QueryCursor::new();
trace!("running tree-sitter query for node"; "node" => format!("{:?}", parent.unwrap()), "node_text" => parent.unwrap().utf8_text(self.source.as_bytes()).unwrap());
for m in query_cursor.matches(&query, parent.unwrap(), self.source.as_bytes()) {
for capture in m.captures {
let start = capture.node.start_position();
let end = capture.node.end_position();
locations.push(Location {
uri: Url::from_file_path(path).unwrap(),
range: Range {
start: Position {
line: start.row as u32,
character: start.column as u32,
},
end: Position {
line: end.row as u32,
character: end.column as u32,
},
},
});
}
}
if !locations.is_empty() {
break;
}
parent = parent.unwrap().parent();
}
Ok(locations)
}
fn simple_global_search(&self, path: &Path, query_str: &str) -> Result<Vec<Location>> {
let query = Query::new(tree_sitter_glsl::language(), query_str)?;
let mut query_cursor = QueryCursor::new();
let mut locations = vec![];
for m in query_cursor.matches(&query, self.root_node(), self.source.as_bytes()) {
for capture in m.captures {
let start = capture.node.start_position();
let end = capture.node.end_position();
locations.push(Location {
uri: Url::from_file_path(path).unwrap(),
range: Range {
start: Position {
line: start.row as u32,
character: start.column as u32,
},
end: Position {
line: end.row as u32,
character: end.column as u32,
},
},
});
}
}
Ok(locations)
}
fn root_node(&self) -> Node {
self.tree.root_node()
}
fn find_node_at_point(&self, pos: Position) -> Option<Node> {
// if we're at the end of an ident, we need to look _back_ one char instead
// for tree-sitter to find the right node.
let look_behind = {
let offset = self.linemap.offset_for_position(pos);
let char_at = self.source.as_bytes()[offset];
trace!("looking for non-alpha for point adjustment";
"offset" => offset,
"char" => char_at as char,
"point" => format!("{:?}", pos),
"look_behind" => !char_at.is_ascii_alphabetic());
!char_at.is_ascii_alphabetic()
};
let mut start = Point {
row: pos.line as usize,
column: pos.character as usize,
};
let mut end = Point {
row: pos.line as usize,
column: pos.character as usize,
};
if look_behind {
start.column -= 1;
} else {
end.column += 1;
}
match self.root_node().named_descendant_for_point_range(start, end) {
Some(node) => {
debug!("found a node";
"node" => format!("{:?}", node),
"text" => node.utf8_text(self.source.as_bytes()).unwrap(),
"start" => format!("{}", start),
"end" => format!("{}", end));
Some(node)
}
None => None,
}
}
}
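The look-behind adjustment in `find_node_at_point` can be illustrated in isolation. This is a minimal sketch operating on a single line of text rather than the server's linemap offsets; `adjust_columns` is a hypothetical helper written for this example, not part of the codebase:

```rust
// Sketch of the look-behind adjustment: when the cursor sits just past
// the last character of an identifier (or on a non-alphabetic char),
// shift the query range one column left so a tree-sitter node lookup
// still lands on the identifier; otherwise widen the range to the right.
fn adjust_columns(line: &str, column: usize) -> (usize, usize) {
    let look_behind = line
        .as_bytes()
        .get(column)
        .map_or(true, |b| !b.is_ascii_alphabetic());
    if look_behind {
        (column.saturating_sub(1), column)
    } else {
        (column, column + 1)
    }
}
```

With the cursor just after `foo` in `foo bar`, the range shifts back onto the identifier; with the cursor inside a word, it extends forward instead.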


@@ -1,42 +1,48 @@
use std::ffi::{CStr, CString};
use std::ptr;
use std::ffi::{CString, CStr};
use slog_scope::info;
#[cfg(test)]
use mockall::automock;
#[cfg_attr(test, automock)]
pub trait ShaderValidator {
fn validate(&self, tree_type: super::TreeType, source: String) -> Option<String>;
fn validate(&self, tree_type: super::TreeType, source: &str) -> Option<String>;
fn vendor(&self) -> String;
}
pub struct OpenGLContext {
_ctx: glutin::Context<glutin::PossiblyCurrent>
pub struct OpenGlContext {
_ctx: glutin::Context<glutin::PossiblyCurrent>,
}
impl OpenGLContext {
pub fn new() -> OpenGLContext {
impl OpenGlContext {
pub fn new() -> OpenGlContext {
let events_loop = glutin::event_loop::EventLoop::new();
let gl_window = glutin::ContextBuilder::new().build_headless(&*events_loop, glutin::dpi::PhysicalSize::new(1, 1)).unwrap();
let gl_window = glutin::ContextBuilder::new()
.build_headless(&*events_loop, glutin::dpi::PhysicalSize::new(1, 1))
.unwrap();
let gl_window = unsafe {
let gl_window = gl_window.make_current().unwrap();
gl::load_with(|symbol| gl_window.get_proc_address(symbol) as *const _);
gl_window
};
let gl_ctx = OpenGlContext { _ctx: gl_window };
unsafe {
eprintln!(
"Using OpenGL device {} {} {}",
String::from_utf8(CStr::from_ptr(gl::GetString(gl::VENDOR) as *const _).to_bytes().to_vec()).unwrap(),
String::from_utf8(CStr::from_ptr(gl::GetString(gl::VERSION) as *const _).to_bytes().to_vec()).unwrap(),
String::from_utf8(CStr::from_ptr(gl::GetString(gl::RENDERER) as *const _).to_bytes().to_vec()).unwrap()
info!(
"OpenGL device";
"vendor" => gl_ctx.vendor(),
"version" => String::from_utf8(CStr::from_ptr(gl::GetString(gl::VERSION) as *const _).to_bytes().to_vec()).unwrap(),
"renderer" => String::from_utf8(CStr::from_ptr(gl::GetString(gl::RENDERER) as *const _).to_bytes().to_vec()).unwrap()
);
}
OpenGLContext{
_ctx: gl_window,
}
gl_ctx
}
unsafe fn compile_and_get_shader_log(&self, shader: gl::types::GLuint, source: String) -> Option<String> {
unsafe fn compile_and_get_shader_log(&self, shader: gl::types::GLuint, source: &str) -> Option<String> {
let mut success = i32::from(gl::FALSE);
let c_str_frag = CString::new(source).unwrap();
gl::ShaderSource(shader, 1, &c_str_frag.as_ptr(), ptr::null());
@@ -48,7 +54,12 @@ impl OpenGLContext {
let mut info_len: gl::types::GLint = 0;
gl::GetShaderiv(shader, gl::INFO_LOG_LENGTH, &mut info_len);
let mut info = vec![0u8; info_len as usize];
gl::GetShaderInfoLog(shader, info_len as gl::types::GLsizei, ptr::null_mut(), info.as_mut_ptr() as *mut gl::types::GLchar);
gl::GetShaderInfoLog(
shader,
info_len as gl::types::GLsizei,
ptr::null_mut(),
info.as_mut_ptr() as *mut gl::types::GLchar,
);
info.set_len((info_len - 1) as usize); // ignore null for str::from_utf8
Some(String::from_utf8(info).unwrap())
} else {
@@ -59,8 +70,8 @@ impl OpenGLContext {
}
}
impl ShaderValidator for OpenGLContext {
fn validate(&self, tree_type: super::TreeType, source: String) -> Option<String> {
impl ShaderValidator for OpenGlContext {
fn validate(&self, tree_type: super::TreeType, source: &str) -> Option<String> {
unsafe {
match tree_type {
crate::TreeType::Fragment => {
@@ -78,7 +89,16 @@ impl ShaderValidator for OpenGLContext {
let geometry_shader = gl::CreateShader(gl::GEOMETRY_SHADER);
self.compile_and_get_shader_log(geometry_shader, source)
}
crate::TreeType::Compute => {
// Compute shader
let compute_shader = gl::CreateShader(gl::COMPUTE_SHADER);
self.compile_and_get_shader_log(compute_shader, source)
}
}
}
}
fn vendor(&self) -> String {
unsafe { String::from_utf8(CStr::from_ptr(gl::GetString(gl::VENDOR) as *const _).to_bytes().to_vec()).unwrap() }
}
}


@@ -0,0 +1,52 @@
use std::{collections::HashMap, fmt::Display};
use petgraph::graph::NodeIndex;
#[derive(Clone, Copy, PartialEq, Eq, Hash)]
pub struct SourceNum(usize);
impl Display for SourceNum {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
f.write_str(format!("{}", self.0).as_str())
}
}
impl From<usize> for SourceNum {
fn from(val: usize) -> Self {
SourceNum(val)
}
}
// Maps from a graph node index to a virtual OpenGL
// source number (for when building the merged source view),
// and in reverse (for when mapping from GLSL error source numbers to their source path).
// What is a source number: https://community.khronos.org/t/what-is-source-string-number/70976
pub struct SourceMapper {
next: SourceNum,
mapping: HashMap<NodeIndex, SourceNum>,
reverse_mapping: Vec<NodeIndex>,
}
impl SourceMapper {
pub fn new(capacity: usize) -> Self {
SourceMapper {
next: SourceNum(0),
mapping: HashMap::with_capacity(capacity),
reverse_mapping: Vec::with_capacity(capacity),
}
}
pub fn get_num(&mut self, node: NodeIndex) -> SourceNum {
let num = &*self.mapping.entry(node).or_insert_with(|| {
let next = self.next;
self.next.0 += 1;
self.reverse_mapping.push(node);
next
});
*num
}
pub fn get_node(&self, num: SourceNum) -> NodeIndex {
self.reverse_mapping[num.0]
}
}
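The bookkeeping in `SourceMapper` can be sketched independently of petgraph. This is a simplified stand-in using plain `usize` node ids instead of `NodeIndex`; `MiniSourceMapper` is hypothetical and exists only to show the forward/reverse mapping invariant:

```rust
use std::collections::HashMap;

// Minimal sketch of SourceMapper's bookkeeping: each node gets a stable,
// monotonically increasing source number on first lookup, and the number
// maps back to the node via the reverse index.
pub struct MiniSourceMapper {
    next: usize,
    mapping: HashMap<usize, usize>, // node id -> source number
    reverse_mapping: Vec<usize>,    // source number -> node id
}

impl MiniSourceMapper {
    pub fn new() -> Self {
        MiniSourceMapper {
            next: 0,
            mapping: HashMap::new(),
            reverse_mapping: Vec::new(),
        }
    }

    // Returns the existing source number for a node, or assigns the next one.
    pub fn get_num(&mut self, node: usize) -> usize {
        if let Some(&num) = self.mapping.get(&node) {
            return num;
        }
        let num = self.next;
        self.next += 1;
        self.mapping.insert(node, num);
        self.reverse_mapping.push(node);
        num
    }

    pub fn get_node(&self, num: usize) -> usize {
        self.reverse_mapping[num]
    }
}
```

Repeated lookups for the same node return the same number, which is what lets GLSL error source numbers be resolved back to file paths.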

281 server/main/src/test.rs Normal file

@@ -0,0 +1,281 @@
use super::*;
use std::fs;
use std::io;
use std::io::Result;
use pretty_assertions::assert_eq;
use tempdir::TempDir;
use fs_extra::{copy_items, dir};
use jsonrpc_common::*;
use jsonrpc_response::*;
struct StdoutNewline {
s: Box<dyn io::Write>,
}
impl io::Write for StdoutNewline {
fn write(&mut self, buf: &[u8]) -> Result<usize> {
let res = self.s.write(buf);
if buf[buf.len() - 1] == b"}"[0] {
#[allow(unused_variables)]
let res = self.s.write(b"\n\n");
}
res
}
fn flush(&mut self) -> Result<()> {
self.s.flush()
}
}
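The newline-framing idea behind `StdoutNewline` can be sketched generically over any writer. `NewlineAfterJson` is a hypothetical stand-in for this example, not the test helper itself, and it guards the empty-buffer case with `buf.last()` rather than indexing `buf[buf.len() - 1]`:

```rust
use std::io::{self, Write};

// Wraps a writer and appends a blank line after any write ending in '}',
// so consecutive JSON-RPC messages are visually separated in the output.
struct NewlineAfterJson<W: Write> {
    pub inner: W,
}

impl<W: Write> Write for NewlineAfterJson<W> {
    fn write(&mut self, buf: &[u8]) -> io::Result<usize> {
        let res = self.inner.write(buf);
        if buf.last() == Some(&b'}') {
            // Separator only; its result is intentionally ignored.
            let _ = self.inner.write(b"\n\n");
        }
        res
    }

    fn flush(&mut self) -> io::Result<()> {
        self.inner.flush()
    }
}
```

Writes that do not end a JSON object pass through untouched, so partial writes are not broken up.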
pub fn new_temp_server(opengl_context: Option<Box<dyn opengl::ShaderValidator>>) -> MinecraftShaderLanguageServer {
let endpoint = LSPEndpoint::create_lsp_output_with_output_stream(|| StdoutNewline { s: Box::new(io::sink()) });
let context = opengl_context.unwrap_or_else(|| Box::new(opengl::MockShaderValidator::new()));
MinecraftShaderLanguageServer {
endpoint,
graph: Rc::new(RefCell::new(graph::CachedStableGraph::new())),
root: "".into(),
command_provider: None,
opengl_context: context.into(),
log_guard: None,
tree_sitter: Rc::new(RefCell::new(Parser::new())),
}
}
fn copy_files(files: &str, dest: &TempDir) {
let opts = &dir::CopyOptions::new();
let files = fs::read_dir(files)
.unwrap()
.map(|e| String::from(e.unwrap().path().to_str().unwrap()))
.collect::<Vec<String>>();
copy_items(&files, dest.path().join("shaders"), opts).unwrap();
}
pub fn copy_to_and_set_root(test_path: &str, server: &mut MinecraftShaderLanguageServer) -> (Rc<TempDir>, PathBuf) {
let (_tmp_dir, tmp_path) = copy_to_tmp_dir(test_path);
server.root = tmp_path.clone(); //format!("{}{}", "file://", tmp_path);
(_tmp_dir, tmp_path)
}
fn copy_to_tmp_dir(test_path: &str) -> (Rc<TempDir>, PathBuf) {
let tmp_dir = Rc::new(TempDir::new("mcshader").unwrap());
fs::create_dir(tmp_dir.path().join("shaders")).unwrap();
copy_files(test_path, &tmp_dir);
let tmp_clone = tmp_dir.clone();
let tmp_path = tmp_clone.path().to_str().unwrap();
(tmp_dir, tmp_path.into())
}
#[allow(deprecated)]
#[test]
#[logging_macro::log_scope]
fn test_empty_initialize() {
let mut server = new_temp_server(None);
let tmp_dir = TempDir::new("mcshader").unwrap();
let tmp_path = tmp_dir.path();
let initialize_params = InitializeParams {
process_id: None,
root_path: None,
root_uri: Some(Url::from_directory_path(tmp_path).unwrap()),
client_info: None,
initialization_options: None,
capabilities: ClientCapabilities {
workspace: None,
text_document: None,
experimental: None,
window: None,
general: Option::None,
},
trace: None,
workspace_folders: None,
locale: Option::None,
};
let on_response = |resp: Option<Response>| {
assert!(resp.is_some());
let respu = resp.unwrap();
match respu.result_or_error {
ResponseResult::Result(_) => {}
ResponseResult::Error(e) => {
panic!("expected ResponseResult::Result(..), got {:?}", e)
}
}
};
let completable = MethodCompletable::new(ResponseCompletable::new(Some(Id::Number(1)), Box::new(on_response)));
server.initialize(initialize_params, completable);
assert_eq!(server.root, tmp_path);
assert_eq!(server.graph.borrow().graph.edge_count(), 0);
assert_eq!(server.graph.borrow().graph.node_count(), 0);
server.endpoint.request_shutdown();
}
#[allow(deprecated)]
#[test]
#[logging_macro::log_scope]
fn test_01_initialize() {
let mut server = new_temp_server(None);
let (_tmp_dir, tmp_path) = copy_to_tmp_dir("./testdata/01");
let initialize_params = InitializeParams {
process_id: None,
root_path: None,
root_uri: Some(Url::from_directory_path(tmp_path.clone()).unwrap()),
client_info: None,
initialization_options: None,
capabilities: ClientCapabilities {
workspace: None,
text_document: None,
experimental: None,
window: None,
general: Option::None,
},
trace: None,
workspace_folders: None,
locale: Option::None,
};
let on_response = |resp: Option<Response>| {
assert!(resp.is_some());
let respu = resp.unwrap();
match respu.result_or_error {
ResponseResult::Result(_) => {}
ResponseResult::Error(e) => {
panic!("expected ResponseResult::Result(..), got {:?}", e)
}
}
};
let completable = MethodCompletable::new(ResponseCompletable::new(Some(Id::Number(1)), Box::new(on_response)));
server.initialize(initialize_params, completable);
server.endpoint.request_shutdown();
// Assert there is one edge between two nodes
assert_eq!(server.graph.borrow().graph.edge_count(), 1);
let edge = server.graph.borrow().graph.edge_indices().next().unwrap();
let (node1, node2) = server.graph.borrow().graph.edge_endpoints(edge).unwrap();
// Assert the values of the two nodes in the tree
assert_eq!(
server.graph.borrow().graph[node1],
//format!("{:?}/{}/{}", tmp_path, "shaders", "final.fsh")
tmp_path.join("shaders").join("final.fsh").to_str().unwrap().to_string()
);
assert_eq!(
server.graph.borrow().graph[node2],
//format!("{:?}/{}/{}", tmp_path, "shaders", "common.glsl")
tmp_path.join("shaders").join("common.glsl").to_str().unwrap().to_string()
);
assert_eq!(server.graph.borrow().graph.edge_weight(edge).unwrap().line, 2);
}
#[allow(deprecated)]
#[test]
#[logging_macro::log_scope]
fn test_05_initialize() {
let mut server = new_temp_server(None);
let (_tmp_dir, tmp_path) = copy_to_tmp_dir("./testdata/05");
let initialize_params = InitializeParams {
process_id: None,
root_path: None,
root_uri: Some(Url::from_directory_path(tmp_path.clone()).unwrap()),
client_info: None,
initialization_options: None,
capabilities: ClientCapabilities {
workspace: None,
text_document: None,
experimental: None,
window: None,
general: Option::None,
},
trace: None,
workspace_folders: None,
locale: Option::None,
};
let on_response = |resp: Option<Response>| {
assert!(resp.is_some());
let respu = resp.unwrap();
match respu.result_or_error {
ResponseResult::Result(_) => {}
ResponseResult::Error(e) => {
panic!("expected ResponseResult::Result(..), got {:?}", e)
}
}
};
let completable = MethodCompletable::new(ResponseCompletable::new(Some(Id::Number(1)), Box::new(on_response)));
server.initialize(initialize_params, completable);
server.endpoint.request_shutdown();
// Assert there is one edge between two nodes
assert_eq!(server.graph.borrow().graph.edge_count(), 3);
assert_eq!(server.graph.borrow().graph.node_count(), 4);
let pairs: HashSet<(PathBuf, PathBuf)> = vec![
(
tmp_path.join("shaders").join("final.fsh").to_str().unwrap().to_string().into(),
tmp_path.join("shaders").join("common.glsl").to_str().unwrap().to_string().into(),
),
(
tmp_path.join("shaders").join("final.fsh").to_str().unwrap().to_string().into(),
tmp_path
.join("shaders")
.join("test")
.join("banana.glsl")
.to_str()
.unwrap()
.to_string()
.into(),
),
(
tmp_path
.join("shaders")
.join("test")
.join("banana.glsl")
.to_str()
.unwrap()
.to_string()
.into(),
tmp_path
.join("shaders")
.join("test")
.join("burger.glsl")
.to_str()
.unwrap()
.to_string()
.into(),
),
]
.into_iter()
.collect();
for edge in server.graph.borrow().graph.edge_indices() {
let endpoints = server.graph.borrow().graph.edge_endpoints(edge).unwrap();
let first = server.graph.borrow().get_node(endpoints.0);
let second = server.graph.borrow().get_node(endpoints.1);
let contains = pairs.contains(&(first.clone(), second.clone()));
assert!(contains, "doesn't contain ({:?}, {:?})", first, second);
}
}


@@ -1,55 +1,73 @@
use std::path::PathBuf;
use slog_scope::trace;
use anyhow::Result;
use path_slash::PathBufExt;
use url::Url;
use anyhow::Result;
pub trait FromUrl {
fn from_url(u: Url) -> Self;
}
pub trait FromJSON {
fn from_json(v: &serde_json::value::Value) -> Result<Self> where Self: Sized;
pub trait FromJson {
fn from_json(v: &serde_json::value::Value) -> Result<Self>
where
Self: Sized;
}
impl FromUrl for PathBuf {
#[cfg(target_family = "windows")]
fn from_url(u: Url) -> Self {
let path = percent_encoding::percent_decode_str(u.path().strip_prefix("/").unwrap()).decode_utf8().unwrap();
let path = percent_encoding::percent_decode_str(u.path().strip_prefix('/').unwrap())
.decode_utf8()
.unwrap();
trace!("converted win path from url"; "old" => u.as_str(), "new" => path.to_string());
PathBuf::from_slash(path)
}
#[cfg(target_family = "unix")]
fn from_url(u: Url) -> Self {
let path = percent_encoding::percent_decode_str(u.path()).decode_utf8().unwrap();
trace!("converted unix path from url"; "old" => u.as_str(), "new" => path.to_string());
PathBuf::from_slash(path)
}
}
impl FromJSON for PathBuf {
impl FromJson for PathBuf {
#[cfg(target_family = "windows")]
fn from_json(v: &serde_json::value::Value) -> Result<Self>
where Self: Sized {
where
Self: Sized,
{
if !v.is_string() {
return Err(anyhow::format_err!("cannot convert {:?} to PathBuf", v));
}
let path = v.to_string();
let path = percent_encoding::percent_decode_str(
path.trim_start_matches('"').trim_end_matches('"').strip_prefix("/").unwrap()
).decode_utf8()?;
let path = percent_encoding::percent_decode_str(path.trim_start_matches('"').trim_end_matches('"').strip_prefix('/').unwrap())
.decode_utf8()?;
trace!("converted win path from json"; "old" => v.to_string(), "new" => path.to_string());
Ok(PathBuf::from_slash(path))
}
#[cfg(target_family = "unix")]
fn from_json(v: &serde_json::value::Value) -> Result<Self>
where Self: Sized {
where
Self: Sized,
{
if !v.is_string() {
return Err(anyhow::format_err!("cannot convert {:?} to PathBuf", v));
}
let path = v.to_string();
let path = percent_encoding::percent_decode_str(
path.trim_start_matches('"').trim_end_matches('"')
).decode_utf8()?;
let path = percent_encoding::percent_decode_str(path.trim_start_matches('"').trim_end_matches('"')).decode_utf8()?;
trace!("converted unix path from json"; "old" => v.to_string(), "new" => path.to_string());
Ok(PathBuf::from_slash(path))
}
}
}


@@ -1,10 +1,10 @@
#version 120
#line 1 "!!"
#line 1 1 // !!
float test() {
return 0.5;
}
#line 4 "!!"
#line 4 0 // !!
void main() {
gl_FragColor[0] = vec4(0.0);


@@ -1,26 +1,26 @@
#version 120
#line 1 "!!"
#line 1 1 // !!
int sample() {
return 5;
}
#line 1 "!!"
#line 1 2 // !!
void burger() {
// sample text
}
#line 6 "!!"
#line 6 1 // !!
#line 1 "!!"
#line 1 3 // !!
float test() {
return 3.0;
}
#line 8 "!!"
#line 8 1 // !!
int sample_more() {
return 5;
}
#line 4 "!!"
#line 4 0 // !!
void main() {
gl_FragColor = vec4(1.0, 1.0, 1.0, 1.0);


@@ -1,22 +1,22 @@
#version 120
#line 1 "!!"
#line 1 1 // !!
int sample() {
return 5;
}
#line 1 "!!"
#line 1 2 // !!
void burger() {
// sample text
}
#line 6 "!!"
#line 6 1 // !!
#line 1 "!!"
#line 1 3 // !!
float test() {
return 3.0;
}
#line 8 "!!"
#line 4 "!!"
#line 8 1 // !!
#line 4 0 // !!
void main() {
gl_FragColor = vec4(1.0, 1.0, 1.0, 1.0);

23 server/main/testdata/04/final.fsh.merge vendored Normal file

@@ -0,0 +1,23 @@
#version 120
#line 1 1 // !!
#line 1 2 // !!
void stuff1() {
}
#line 2 1 // !!
#line 1 3 // !!
void stuff2() {
}
#line 3 1 // !!
#line 4 0 // !!
#line 1 4 // !!
void matrix() {
}
#line 5 0 // !!
void main() {
}


@@ -1,5 +1,7 @@
#version 120
#line 2 "!!"
#line 1 "!!"
float test() {
return 0.5;

9 server/main/testdata/06/final.fsh vendored Normal file

@@ -0,0 +1,9 @@
#version 120
#ifdef BANANA
#include "test.glsl"
#else
#include "test.glsl"
#endif
void main() {}

17 server/main/testdata/06/final.fsh.merge vendored Normal file

@@ -0,0 +1,17 @@
#version 120
#ifdef BANANA
#line 1 1 // !!
int test() {
return 1;
}
#line 5 0 // !!
#else
#line 1 1 // !!
int test() {
return 1;
}
#line 7 0 // !!
#endif
void main() {}

3 server/main/testdata/06/test.glsl vendored Normal file

@@ -0,0 +1,3 @@
int test() {
return 1;
}


@@ -1,161 +0,0 @@
use std::{collections::HashMap, path::PathBuf};
use std::rc::Rc;
use std::cell::RefCell;
use std::fs::OpenOptions;
use std::io::prelude::*;
use serde_json::Value;
use petgraph::{dot, graph::NodeIndex};
use anyhow::{Result, format_err};
use std::fs;
use crate::{graph::CachedStableGraph, merge_views, url_norm::FromJSON};
use crate::dfs;
pub struct CustomCommandProvider {
commands: HashMap<String, Box<dyn Invokeable>>
}
impl CustomCommandProvider {
pub fn new(commands: Vec<(&str, Box<dyn Invokeable>)>) -> CustomCommandProvider {
CustomCommandProvider{
commands: commands.into_iter().map(|tup| {
(tup.0.into(), tup.1)
}).collect(),
}
}
pub fn execute(&self, command: &str, args: Vec<Value>, root_path: &PathBuf) -> Result<Value> {
if self.commands.contains_key(command) {
return self.commands.get(command).unwrap().run_command(root_path, args);
}
Err(format_err!("command doesn't exist"))
}
}
pub trait Invokeable {
fn run_command(&self, root: &PathBuf, arguments: Vec<Value>) -> Result<Value>;
}
pub struct GraphDotCommand {
pub graph: Rc<RefCell<CachedStableGraph>>
}
impl Invokeable for GraphDotCommand {
fn run_command(&self, root: &PathBuf, _: Vec<Value>) -> Result<Value> {
let filepath = root.join("graph.dot");
eprintln!("generating dot file at {:?}", filepath);
let mut file = OpenOptions::new()
.truncate(true)
.write(true)
.create(true)
.open(filepath)
.unwrap();
let mut write_data_closure = || -> Result<(), std::io::Error> {
let graph = self.graph.as_ref();
file.seek(std::io::SeekFrom::Start(0))?;
file.write_all(dot::Dot::new(&graph.borrow().graph).to_string().as_bytes())?;
file.flush()?;
file.seek(std::io::SeekFrom::Start(0))?;
Ok(())
};
match write_data_closure() {
Err(err) => Err(format_err!("Error generating graphviz data: {}", err)),
_ => Ok(Value::Null)
}
}
}
pub struct VirtualMergedDocument {
pub graph: Rc<RefCell<CachedStableGraph>>
}
impl VirtualMergedDocument {
// TODO: DUPLICATE CODE
fn get_file_toplevel_ancestors(&self, uri: &PathBuf) -> Result<Option<Vec<petgraph::stable_graph::NodeIndex>>> {
let curr_node = match self.graph.borrow_mut().find_node(uri) {
Some(n) => n,
None => return Err(format_err!("node not found {:?}", uri)),
};
let roots = self.graph.borrow().collect_root_ancestors(curr_node);
if roots.is_empty() {
return Ok(None);
}
Ok(Some(roots))
}
pub fn get_dfs_for_node(&self, root: NodeIndex) -> Result<Vec<(NodeIndex, Option<NodeIndex>)>, dfs::error::CycleError> {
let graph_ref = self.graph.borrow();
let dfs = dfs::Dfs::new(&graph_ref, root);
dfs.collect::<Result<Vec<_>, _>>()
}
pub fn load_sources(&self, nodes: &[(NodeIndex, Option<NodeIndex>)]) -> Result<HashMap<PathBuf, String>> {
let mut sources = HashMap::new();
for node in nodes {
let graph = self.graph.borrow();
let path = graph.get_node(node.0);
if sources.contains_key(&path) {
continue;
}
let source = match fs::read_to_string(&path) {
Ok(s) => s,
Err(e) => return Err(format_err!("error reading {:?}: {}", path, e))
};
sources.insert(path.clone(), source);
}
Ok(sources)
}
}
impl Invokeable for VirtualMergedDocument {
fn run_command(&self, root: &PathBuf, arguments: Vec<Value>) -> Result<Value> {
let path = PathBuf::from_json(arguments.get(0).unwrap())?;
let file_ancestors = match self.get_file_toplevel_ancestors(&path) {
Ok(opt) => match opt {
Some(ancestors) => ancestors,
None => vec![],
},
Err(e) => return Err(e),
};
//eprintln!("ancestors for {}:\n\t{:?}", path, file_ancestors.iter().map(|e| self.graph.borrow().graph.node_weight(*e).unwrap().clone()).collect::<Vec<String>>());
// the set of all filepath->content. TODO: change to Url?
let mut all_sources: HashMap<PathBuf, String> = HashMap::new();
// if we are a top-level file (this has to be one of the set defined by Optifine, right?)
if file_ancestors.is_empty() {
// gather the list of all descendants
let root = self.graph.borrow_mut().find_node(&path).unwrap();
let tree = match self.get_dfs_for_node(root) {
Ok(tree) => tree,
Err(e) => return Err(e.into()),
};
let sources = match self.load_sources(&tree) {
Ok(s) => s,
Err(e) => return Err(e)
};
all_sources.extend(sources);
let graph = self.graph.borrow();
let view = merge_views::generate_merge_list(&tree, &all_sources, &graph);
return Ok(serde_json::value::Value::String(view));
}
return Err(format_err!("{:?} is not a top-level file aka has ancestors", path.strip_prefix(root).unwrap()))
}
}


@@ -1,156 +0,0 @@
use petgraph::stable_graph::NodeIndex;
use crate::graph::CachedStableGraph;
use anyhow::Result;
struct VisitCount {
node: NodeIndex,
touch: usize,
children: usize,
}
/// Performs a depth-first search with duplicates
pub struct Dfs<'a> {
stack: Vec<NodeIndex>,
graph: &'a CachedStableGraph,
cycle: Vec<VisitCount>
}
impl <'a> Dfs<'a> {
pub fn new(graph: &'a CachedStableGraph, start: NodeIndex) -> Self {
Dfs {
stack: vec![start],
graph,
cycle: Vec::new()
}
}
fn reset_path_to_branch(&mut self) {
while let Some(par) = self.cycle.last_mut() {
par.touch += 1;
if par.touch > par.children {
self.cycle.pop();
} else {
break;
}
}
}
fn check_for_cycle(&self, children: &[NodeIndex]) -> Result<(), error::CycleError> {
for prev in &self.cycle {
for child in children {
if prev.node == *child {
let cycle_nodes: Vec<NodeIndex> = self.cycle.iter().map(|n| n.node).collect();
return Err(
error::CycleError::new(&cycle_nodes, *child, self.graph)
);
}
}
}
Ok(())
}
}
impl <'a> Iterator for Dfs<'a> {
type Item = Result<(NodeIndex, Option<NodeIndex>), error::CycleError>;
fn next(&mut self) -> Option<Result<(NodeIndex, Option<NodeIndex>), error::CycleError>> {
let parent = match self.cycle.last() {
Some(p) => Some(p.node),
None => None,
};
if let Some(node) = self.stack.pop() {
self.cycle.push(VisitCount{
node,
children: self.graph.graph.edges(node).count(),
touch: 1,
});
let mut children = self.graph.child_node_indexes(node);
if !children.is_empty() {
// sort by line number in parent
children.sort_by(|x, y| {
let graph = &self.graph.graph;
let edge1 = graph.edge_weight(graph.find_edge(node, *x).unwrap()).unwrap();
let edge2 = graph.edge_weight(graph.find_edge(node, *y).unwrap()).unwrap();
edge2.line.cmp(&edge1.line)
});
match self.check_for_cycle(&children) {
Ok(_) => {}
Err(e) => return Some(Err(e)),
};
for child in children {
self.stack.push(child);
}
} else {
self.reset_path_to_branch();
}
return Some(Ok((node, parent)));
}
None
}
}
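The traversal above (duplicates allowed, with a cycle check against the current ancestor path) can be sketched recursively over a plain adjacency map. This is a simplified illustration, not the iterator itself: `usize` ids stand in for `NodeIndex`, a `String` stands in for `CycleError`, and shared includes are deliberately emitted once per include site:

```rust
use std::collections::HashMap;

// Depth-first traversal that revisits shared children (duplicates allowed,
// as an include can appear under multiple parents) and errors out when a
// child already appears on the current ancestor path.
fn dfs_with_duplicates(
    children_of: &HashMap<usize, Vec<usize>>,
    start: usize,
) -> Result<Vec<usize>, String> {
    fn visit(
        children_of: &HashMap<usize, Vec<usize>>,
        node: usize,
        path: &mut Vec<usize>,
        out: &mut Vec<usize>,
    ) -> Result<(), String> {
        if path.contains(&node) {
            return Err(format!("include cycle detected at node {}", node));
        }
        out.push(node);
        path.push(node);
        if let Some(children) = children_of.get(&node) {
            for &c in children {
                visit(children_of, c, path, out)?;
            }
        }
        path.pop();
        Ok(())
    }
    let mut out = Vec::new();
    visit(children_of, start, &mut Vec::new(), &mut out)?;
    Ok(out)
}
```

On a diamond graph (two parents including the same file) the shared node appears twice in the output; a back-edge to an ancestor produces an error instead.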
pub mod error {
use petgraph::stable_graph::NodeIndex;
use std::{fmt::{Debug, Display}, path::PathBuf, error::Error as StdError};
use crate::{graph::CachedStableGraph, consts};
use rust_lsp::lsp_types::{Diagnostic, DiagnosticSeverity, Position, Range};
#[derive(Debug)]
pub struct CycleError(Vec<PathBuf>);
impl StdError for CycleError {}
impl CycleError {
pub fn new(nodes: &[NodeIndex], current_node: NodeIndex, graph: &CachedStableGraph) -> Self {
let mut resolved_nodes: Vec<PathBuf> = nodes.iter().map(|i| graph.get_node(*i).clone()).collect();
resolved_nodes.push(graph.get_node(current_node).clone());
CycleError(resolved_nodes)
}
}
impl Display for CycleError {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
let mut disp = String::new();
disp.push_str(format!("Include cycle detected:\n{:?} imports ", self.0[0]).as_str());
for p in &self.0[1..self.0.len()-1] {
disp.push_str(format!("\n{:?}, which imports ", *p).as_str());
}
disp.push_str(format!("\n{:?}", self.0[self.0.len()-1]).as_str());
f.write_str(disp.as_str())
}
}
impl Into<Diagnostic> for CycleError {
fn into(self) -> Diagnostic {
Diagnostic{
severity: Some(DiagnosticSeverity::Error),
range: Range::new(Position::new(0, 0), Position::new(0, 500)),
source: Some(consts::SOURCE.into()),
message: self.into(),
code: None,
tags: None,
related_information: None,
code_description: Option::None,
data: Option::None,
}
}
}
impl Into<String> for CycleError {
fn into(self) -> String {
format!("{}", self)
}
}
}


@@ -1,150 +0,0 @@
use petgraph::stable_graph::StableDiGraph;
use petgraph::stable_graph::NodeIndex;
use petgraph::Direction;
use petgraph::stable_graph::EdgeIndex;
use std::{collections::{HashMap, HashSet}, path::PathBuf, str::FromStr};
use super::IncludePosition;
/// Wraps a `StableDiGraph` with caching behaviour for node search by maintaining
/// an index for node value to node index and a reverse index.
/// This allows for **O(1)** lookup for a value if it exists, else **O(n)**.
pub struct CachedStableGraph {
// StableDiGraph is used as it allows for String node values, essential for
// generating the GraphViz DOT render.
pub graph: StableDiGraph<String, IncludePosition>,
cache: HashMap<PathBuf, NodeIndex>,
// Maps a node index to its abstracted string representation.
// Mainly used as the graph is based on NodeIndex, and the original path
// needs to be recoverable from an index without searching the graph.
reverse_index: HashMap<NodeIndex, PathBuf>,
}
impl CachedStableGraph {
pub fn new() -> CachedStableGraph {
CachedStableGraph{
graph: StableDiGraph::new(),
cache: HashMap::new(),
reverse_index: HashMap::new(),
}
}
/// Returns the `NodeIndex` for a given graph node with the value of `name`
/// and caches the result in the `HashMap`. Complexity is **O(1)** if the value
/// is cached (which should always be the case), else **O(n)** where **n** is
/// the number of node indices, as an exhaustive search must be done.
pub fn find_node(&mut self, name: &PathBuf) -> Option<NodeIndex> {
match self.cache.get(name) {
Some(n) => Some(*n),
None => {
// If the string is not in cache, O(n) search the graph (i know...) and then cache the NodeIndex
// for later
let n = self.graph.node_indices().find(|n| self.graph[*n] == name.to_str().unwrap().to_string());
if let Some(n) = n {
self.cache.insert(name.into(), n);
}
n
}
}
}
pub fn get_node(&self, node: NodeIndex) -> PathBuf {
PathBuf::from_str(&self.graph[node]).unwrap()
}
pub fn get_edge_meta(&self, parent: NodeIndex, child: NodeIndex) -> &IncludePosition {
self.graph.edge_weight(self.graph.find_edge(parent, child).unwrap()).unwrap()
}
#[allow(dead_code)]
pub fn remove_node(&mut self, name: &PathBuf) {
let idx = self.cache.remove(name);
if let Some(idx) = idx {
self.graph.remove_node(idx);
}
}
pub fn add_node(&mut self, name: &PathBuf) -> NodeIndex {
if let Some(idx) = self.cache.get(name) {
return *idx;
}
let idx = self.graph.add_node(name.to_str().unwrap().to_string());
self.cache.insert(name.clone(), idx);
self.reverse_index.insert(idx, name.clone());
idx
}
pub fn add_edge(&mut self, parent: NodeIndex, child: NodeIndex, meta: IncludePosition) -> EdgeIndex {
self.graph.add_edge(parent, child, meta)
}
pub fn remove_edge(&mut self, parent: NodeIndex, child: NodeIndex) {
let edge = self.graph.find_edge(parent, child).unwrap();
self.graph.remove_edge(edge);
}
#[allow(dead_code)]
pub fn edge_weights(&self, node: NodeIndex) -> Vec<IncludePosition> {
self.graph.edges(node).map(|e| e.weight().clone()).collect()
}
#[allow(dead_code)]
pub fn child_node_names(&self, node: NodeIndex) -> Vec<PathBuf> {
self.graph.neighbors(node).map(|n| self.reverse_index.get(&n).unwrap().clone()).collect()
}
pub fn child_node_meta(&self, node: NodeIndex) -> Vec<(PathBuf, IncludePosition)> {
self.graph.neighbors(node).map(|n| {
let edge = self.graph.find_edge(node, n).unwrap();
let edge_meta = self.graph.edge_weight(edge).unwrap();
return (self.reverse_index.get(&n).unwrap().clone(), edge_meta.clone())
}).collect()
}
pub fn child_node_indexes(&self, node: NodeIndex) -> Vec<NodeIndex> {
self.graph.neighbors(node).collect()
}
#[allow(dead_code)]
pub fn parent_node_names(&self, node: NodeIndex) -> Vec<PathBuf> {
self.graph.neighbors_directed(node, Direction::Incoming).map(|n| self.reverse_index.get(&n).unwrap().clone()).collect()
}
pub fn parent_node_indexes(&self, node: NodeIndex) -> Vec<NodeIndex> {
self.graph.neighbors_directed(node, Direction::Incoming).collect()
}
#[allow(dead_code)]
pub fn get_include_meta(&self, node: NodeIndex) -> Vec<IncludePosition> {
self.graph.edges(node).map(|e| e.weight().clone()).collect()
}
pub fn collect_root_ancestors(&self, node: NodeIndex) -> Vec<NodeIndex> {
let mut visited = HashSet::new();
self.get_root_ancestors(node, node, &mut visited)
}
fn get_root_ancestors(&self, initial: NodeIndex, node: NodeIndex, visited: &mut HashSet<NodeIndex>) -> Vec<NodeIndex> {
if node == initial && !visited.is_empty() {
return vec![];
}
let parents = self.parent_node_indexes(node);
let mut collection = Vec::with_capacity(parents.len());
for ancestor in &parents {
visited.insert(*ancestor);
}
for ancestor in &parents {
let ancestors = self.parent_node_indexes(*ancestor);
if !ancestors.is_empty() {
collection.extend(self.get_root_ancestors(initial, *ancestor, visited));
} else {
collection.push(*ancestor);
}
}
collection
}
}
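The root-ancestor walk above reduces to a reverse-adjacency traversal: follow parent edges until you hit nodes with no parents of their own. A minimal std-only sketch, using a plain `HashMap` of child → parents instead of petgraph (and omitting the cycle-back-to-the-initial-node guard), purely for illustration:

```rust
use std::collections::{HashMap, HashSet};

// Simplified sketch of collect_root_ancestors: `parents` maps each node to the
// nodes that include it. Nodes with no parents are roots (top-level shaders).
fn collect_roots(parents: &HashMap<u32, Vec<u32>>, node: u32) -> Vec<u32> {
    fn walk(parents: &HashMap<u32, Vec<u32>>, node: u32, visited: &mut HashSet<u32>, out: &mut Vec<u32>) {
        for &p in parents.get(&node).map(|v| v.as_slice()).unwrap_or(&[]) {
            if !visited.insert(p) {
                continue; // already seen, avoid re-walking shared ancestors
            }
            if parents.get(&p).map_or(true, |v| v.is_empty()) {
                out.push(p); // no parents of its own: a root ancestor
            } else {
                walk(parents, p, visited, out);
            }
        }
    }
    let mut visited = HashSet::new();
    let mut out = Vec::new();
    walk(parents, node, &mut visited, &mut out);
    out
}
```

In the server, those roots are the top-level `.fsh`/`.vsh`/`.gsh` files that a deep `#include` is ultimately compiled under.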


@@ -1,749 +0,0 @@
use rust_lsp::jsonrpc::{*, method_types::*};
use rust_lsp::lsp::*;
use rust_lsp::lsp_types::{*, notification::*};
use petgraph::stable_graph::NodeIndex;
use serde_json::Value;
use url_norm::FromUrl;
use walkdir::WalkDir;
use std::{cell::RefCell, path::PathBuf, str::FromStr};
use std::collections::{HashMap, HashSet};
use std::collections::hash_map::RandomState;
use std::convert::TryFrom;
use std::fmt::{Display, Formatter, Debug};
use std::io::{stdin, stdout, BufRead, BufReader};
use std::rc::Rc;
use std::fs;
use std::iter::{Extend, FromIterator};
use path_slash::PathBufExt;
use anyhow::{Result, anyhow};
use chan::WaitGroup;
use regex::Regex;
use lazy_static::lazy_static;
mod graph;
mod commands;
mod lsp_ext;
mod dfs;
mod merge_views;
mod consts;
mod opengl;
mod url_norm;
#[cfg(test)]
mod test;
lazy_static! {
static ref RE_DIAGNOSTIC: Regex = Regex::new(r#"^(?P<filepath>[^?<>*|"]+)\((?P<linenum>\d+)\) : (?P<severity>error|warning) [A-C]\d+: (?P<output>.+)"#).unwrap();
static ref RE_VERSION: Regex = Regex::new(r#"#version [\d]{3}"#).unwrap();
static ref RE_INCLUDE: Regex = Regex::new(r#"^(?:\s)*?(?:#include) "(.+)"\r?"#).unwrap();
static ref RE_INCLUDE_EXTENSION: Regex = Regex::new(r#"#extension GL_GOOGLE_include_directive ?: ?require"#).unwrap();
}
fn main() {
let stdin = stdin();
let endpoint_output = LSPEndpoint::create_lsp_output_with_output_stream(stdout);
let cache_graph = graph::CachedStableGraph::new();
let mut langserver = MinecraftShaderLanguageServer {
endpoint: endpoint_output.clone(),
graph: Rc::new(RefCell::new(cache_graph)),
wait: WaitGroup::new(),
root: "".into(),
command_provider: None,
opengl_context: Rc::new(opengl::OpenGLContext::new())
};
langserver.command_provider = Some(commands::CustomCommandProvider::new(vec![
(
"graphDot",
Box::new(commands::GraphDotCommand {
graph: Rc::clone(&langserver.graph),
}),
),
(
"virtualMerge",
Box::new(commands::VirtualMergedDocument{
graph: Rc::clone(&langserver.graph)
})
)
]));
LSPEndpoint::run_server_from_input(&mut stdin.lock(), endpoint_output, langserver);
}
struct MinecraftShaderLanguageServer {
endpoint: Endpoint,
graph: Rc<RefCell<graph::CachedStableGraph>>,
wait: WaitGroup,
root: PathBuf,
command_provider: Option<commands::CustomCommandProvider>,
opengl_context: Rc<dyn opengl::ShaderValidator>
}
#[derive(Clone, PartialEq, Eq, Hash)]
pub struct IncludePosition {
line: usize,
start: usize,
end: usize,
}
impl Debug for IncludePosition {
fn fmt(&self, f: &mut Formatter<'_>) -> std::fmt::Result {
write!(f, "{{line: {}}}", self.line)
}
}
impl Display for IncludePosition {
fn fmt(&self, f: &mut Formatter<'_>) -> Result<(), std::fmt::Error> {
write!(f, "{{line: {}}}", self.line)
}
}
pub enum TreeType {
Fragment, Vertex, Geometry
}
impl MinecraftShaderLanguageServer {
pub fn error_not_available<DATA>(data: DATA) -> MethodError<DATA> {
let msg = "Functionality not implemented.".to_string();
MethodError::<DATA> {
code: 1,
message: msg,
data,
}
}
pub fn gen_initial_graph(&self) {
eprintln!("root of project is {:?}", self.root);
// filter directories and files not ending in any of the 3 extensions
WalkDir::new(&self.root).into_iter().filter_map(|entry| {
if entry.is_err() {
return None;
}
let entry = entry.unwrap();
let path = entry.path();
if path.is_dir() {
return None;
}
let ext = match path.extension() {
Some(e) => e,
None => return None,
};
if ext != "vsh" && ext != "fsh" && ext != "glsl" && ext != "inc" {
return None;
}
Some(entry.into_path())
}).for_each(|path| {
// iterate all valid found files, search for includes, add a node into the graph for each
// file and add a file->includes KV into the map
self.add_file_and_includes_to_graph(&path);
});
eprintln!("finished building project include graph");
}
fn add_file_and_includes_to_graph(&self, path: &PathBuf) {
let includes = self.find_includes(path);
let idx = self.graph.borrow_mut().add_node(&path);
//eprintln!("adding {:?} with {:?}", path, includes);
for include in includes {
self.add_include(include, idx);
}
}
fn add_include(&self, include: (PathBuf, IncludePosition), node: NodeIndex) {
let child = self.graph.borrow_mut().add_node(&include.0);
self.graph.borrow_mut().add_edge(node, child, include.1);
}
pub fn find_includes(&self, file: &PathBuf) -> Vec<(PathBuf, IncludePosition)> {
let mut includes = Vec::default();
let buf = BufReader::new(std::fs::File::open(file).unwrap());
buf.lines()
.enumerate()
.filter_map(|line| match line.1 {
Ok(t) => Some((line.0, t)),
Err(_e) => None,
})
.filter(|line| RE_INCLUDE.is_match(line.1.as_str()))
.for_each(|line| {
let cap = RE_INCLUDE
.captures(line.1.as_str())
.unwrap()
.get(1)
.unwrap();
let start = cap.start();
let end = cap.end();
let mut path: String = cap.as_str().into();
// TODO: difference between / and not
let full_include = if path.starts_with('/') {
path = path.strip_prefix('/').unwrap().to_string();
self.root.join("shaders").join(PathBuf::from_slash(&path))
} else {
file.parent().unwrap().join(PathBuf::from_slash(&path))
};
includes.push((
full_include,
IncludePosition {
line: line.0,
start,
end,
}
));
});
includes
}
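The absolute-vs-relative branch above is the whole resolution rule: a leading `/` anchors the include under `<root>/shaders`, anything else resolves against the including file's directory. A std-only sketch, approximating `PathBuf::from_slash` (from the path_slash crate) with plain `PathBuf::from`, which is equivalent on Unix-style paths:

```rust
use std::path::PathBuf;

// Sketch of the include path resolution used above: a leading '/' anchors the
// include at <root>/shaders, anything else is relative to the including file.
fn resolve_include(root: &PathBuf, includer: &PathBuf, target: &str) -> PathBuf {
    match target.strip_prefix('/') {
        Some(rel) => root.join("shaders").join(PathBuf::from(rel)),
        None => includer.parent().unwrap().join(PathBuf::from(target)),
    }
}
```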
fn update_includes(&self, file: &PathBuf) {
let includes = self.find_includes(file);
eprintln!("updating {:?} with {:?}", file, includes);
let idx = match self.graph.borrow_mut().find_node(&file) {
None => {
return
},
Some(n) => n,
};
let prev_children: HashSet<_, RandomState> = HashSet::from_iter(self.graph.borrow().child_node_meta(idx));
let new_children: HashSet<_, RandomState> = HashSet::from_iter(includes.iter().cloned());
let to_be_added = new_children.difference(&prev_children);
let to_be_removed = prev_children.difference(&new_children);
eprintln!("removing:\n\t{:?}\nadding:\n\t{:?}", to_be_removed, to_be_added);
for removal in to_be_removed {
let child = self.graph.borrow_mut().find_node(&removal.0).unwrap();
self.graph.borrow_mut().remove_edge(idx, child);
}
for insertion in to_be_added {
self.add_include(includes.iter().find(|f| f.0 == *insertion.0).unwrap().clone(), idx);
}
}
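The edge update above boils down to two set differences: edges to add are `new − old`, edges to remove are `old − new`. A minimal sketch with plain strings standing in for the `(PathBuf, IncludePosition)` pairs (sorted only to make the result deterministic):

```rust
use std::collections::HashSet;

// Compute which include edges to add and which to remove, given the previous
// and freshly parsed include sets.
fn diff_edges<'a>(old: &HashSet<&'a str>, new: &HashSet<&'a str>) -> (Vec<&'a str>, Vec<&'a str>) {
    let mut to_add: Vec<_> = new.difference(old).copied().collect();
    let mut to_remove: Vec<_> = old.difference(new).copied().collect();
    to_add.sort();
    to_remove.sort();
    (to_add, to_remove)
}
```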
pub fn lint(&self, uri: &PathBuf) -> Result<HashMap<Url, Vec<Diagnostic>>> {
// get all top level ancestors of this file
let file_ancestors = match self.get_file_toplevel_ancestors(uri) {
Ok(opt) => match opt {
Some(ancestors) => ancestors,
None => vec![],
},
Err(e) => return Err(e),
};
eprintln!("ancestors for {:?}:\n\t{:?}", uri, file_ancestors.iter().map(|e| PathBuf::from_str(&self.graph.borrow().graph.node_weight(*e).unwrap().clone()).unwrap()).collect::<Vec<PathBuf>>());
// the set of all filepath->content. TODO: change to Url?
let mut all_sources: HashMap<PathBuf, String> = HashMap::new();
// the set of filepath->list of diagnostics to report
let mut diagnostics: HashMap<Url, Vec<Diagnostic>> = HashMap::new();
// we want to backfill the diagnostics map with all linked sources
let back_fill = |all_sources, diagnostics: &mut HashMap<Url, Vec<Diagnostic>>| {
for (path, _) in all_sources {
diagnostics.entry(Url::from_file_path(path).unwrap()).or_default();
}
};
// if we are a top-level file (this has to be one of the set defined by Optifine, right?)
if file_ancestors.is_empty() {
// gather the list of all descendants
let root = self.graph.borrow_mut().find_node(&uri).unwrap();
let tree = match self.get_dfs_for_node(root) {
Ok(tree) => tree,
Err(e) => {
diagnostics.insert(Url::from_file_path(uri).unwrap(), vec![e.into()]);
return Ok(diagnostics);
}
};
all_sources.extend(self.load_sources(&tree)?);
let view = {
let graph = self.graph.borrow();
merge_views::generate_merge_list(&tree, &all_sources, &graph)
};
let root_path = self.graph.borrow().get_node(root).clone();
let ext = root_path.extension().unwrap();
let tree_type = if ext == "fsh" {
TreeType::Fragment
} else if ext == "vsh" {
TreeType::Vertex
} else if ext == "gsh" {
TreeType::Geometry
} else {
eprintln!("got a non fsh|vsh|gsh file ({:?}) as a root ancestor: {:?}", ext, root_path);
back_fill(&all_sources, &mut diagnostics);
return Ok(diagnostics)
};
let stdout = match self.opengl_context.clone().validate(tree_type, view) {
Some(s) => s,
None => {
back_fill(&all_sources, &mut diagnostics);
return Ok(diagnostics)
},
};
diagnostics.extend(self.parse_validator_stdout(uri, stdout, ""));
} else {
let mut all_trees: Vec<(TreeType, Vec<(NodeIndex, Option<_>)>)> = Vec::new();
for root in &file_ancestors {
let nodes = match self.get_dfs_for_node(*root) {
Ok(nodes) => nodes,
Err(e) => {
diagnostics.insert(Url::from_file_path(uri).unwrap(), vec![e.into()]);
back_fill(&all_sources, &mut diagnostics); // TODO: confirm
return Ok(diagnostics);
}
};
let root_path = self.graph.borrow().get_node(*root).clone();
let ext = root_path.extension().unwrap();
let tree_type = if ext == "fsh" {
TreeType::Fragment
} else if ext == "vsh" {
TreeType::Vertex
} else if ext == "gsh" {
TreeType::Geometry
} else {
eprintln!("got a non fsh|vsh|gsh file ({:?}) as a root ancestor: {:?}", ext, root_path);
continue;
};
let sources = self.load_sources(&nodes)?;
all_trees.push((tree_type, nodes));
all_sources.extend(sources);
}
for tree in all_trees {
let view = {
let graph = self.graph.borrow();
merge_views::generate_merge_list(&tree.1, &all_sources, &graph)
};
let stdout = match self.opengl_context.clone().validate(tree.0, view) {
Some(s) => s,
None => continue,
};
diagnostics.extend(self.parse_validator_stdout(uri, stdout, ""));
}
};
back_fill(&all_sources, &mut diagnostics);
Ok(diagnostics)
}
fn parse_validator_stdout(&self, uri: &PathBuf, stdout: String, _source: &str) -> HashMap<Url, Vec<Diagnostic>> {
let stdout_lines: Vec<&str> = stdout.split('\n').collect();
let mut diagnostics: HashMap<Url, Vec<Diagnostic>> = HashMap::with_capacity(stdout_lines.len());
for line in stdout_lines {
let diagnostic_capture = match RE_DIAGNOSTIC.captures(line) {
Some(d) => d,
None => continue
};
eprintln!("match {:?}", diagnostic_capture);
let msg = diagnostic_capture.name("output").unwrap().as_str();
let line = match diagnostic_capture.name("linenum") {
Some(c) => c.as_str().parse::<u32>().unwrap_or(0),
None => 0,
}.saturating_sub(2);
// TODO: line matching maybe
/* let line_text = source_lines[line as usize];
let leading_whitespace = line_text.len() - line_text.trim_start().len(); */
let severity = match diagnostic_capture.name("severity") {
Some(c) => match c.as_str() {
"error" => DiagnosticSeverity::Error,
"warning" => DiagnosticSeverity::Warning,
_ => DiagnosticSeverity::Information,
}
_ => DiagnosticSeverity::Information,
};
let origin = match diagnostic_capture.name("filepath") {
Some(o) => {
if o.as_str().to_string() == "0" {
uri.to_str().unwrap().to_string()
} else {
o.as_str().to_string()
}
},
None => uri.to_str().unwrap().to_string(),
};
let diagnostic = Diagnostic {
range: Range::new(
/* Position::new(line, leading_whitespace as u64),
Position::new(line, line_text.len() as u64) */
Position::new(line, 0),
Position::new(line, 1000),
),
code: None,
severity: Some(severity),
source: Some(consts::SOURCE.into()),
message: msg.trim().into(),
related_information: None,
tags: None,
code_description: Option::None,
data: Option::None,
};
let origin_url = Url::from_file_path(origin).unwrap();
match diagnostics.get_mut(&origin_url) {
Some(d) => d.push(diagnostic),
None => {
diagnostics.insert(origin_url, vec![diagnostic]);
},
};
}
diagnostics
}
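The `RE_DIAGNOSTIC` capture groups map onto a simple positional format. As a regex-free illustration of what the pattern extracts (a hypothetical helper, not part of the server), assuming the `<filepath>(<line>) : <severity> <code>: <message>` shape of the GLSL compiler log:

```rust
// Parse one diagnostic line into (filepath, line number, severity, message).
// Returns None for lines that don't fit, mirroring how non-matching lines
// are skipped above.
fn parse_diag(line: &str) -> Option<(String, u32, String, String)> {
    let open = line.find('(')?;
    let close = line[open..].find(')')? + open;
    let linenum: u32 = line[open + 1..close].parse().ok()?;
    let rest = line[close + 1..].strip_prefix(" : ")?;
    let (sev_code, msg) = rest.split_once(": ")?;
    let severity = sev_code.split_whitespace().next()?.to_string();
    if severity != "error" && severity != "warning" {
        return None;
    }
    Some((line[..open].to_string(), linenum, severity, msg.to_string()))
}
```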
pub fn get_dfs_for_node(&self, root: NodeIndex) -> Result<Vec<(NodeIndex, Option<NodeIndex>)>, dfs::error::CycleError> {
let graph_ref = self.graph.borrow();
let dfs = dfs::Dfs::new(&graph_ref, root);
dfs.collect::<Result<Vec<_>, _>>()
}
pub fn load_sources(&self, nodes: &[(NodeIndex, Option<NodeIndex>)]) -> Result<HashMap<PathBuf, String>> {
let mut sources = HashMap::new();
for node in nodes {
let graph = self.graph.borrow();
let path = graph.get_node(node.0);
if sources.contains_key(&path) {
continue;
}
let source = match fs::read_to_string(&path) {
Ok(s) => s,
Err(e) => return Err(anyhow!("error reading {:?}: {}", path, e))
};
sources.insert(path.clone(), source);
}
Ok(sources)
}
fn get_file_toplevel_ancestors(&self, uri: &PathBuf) -> Result<Option<Vec<petgraph::stable_graph::NodeIndex>>> {
let curr_node = match self.graph.borrow_mut().find_node(uri) {
Some(n) => n,
None => return Err(anyhow!("node not found {:?}", uri)),
};
let roots = self.graph.borrow().collect_root_ancestors(curr_node);
if roots.is_empty() {
return Ok(None);
}
Ok(Some(roots))
}
pub fn publish_diagnostic(&self, diagnostics: HashMap<Url, Vec<Diagnostic>>, document_version: Option<i32>) {
eprintln!("DIAGNOSTICS:\n{:?}", diagnostics);
for (uri, diagnostics) in diagnostics {
self.endpoint.send_notification(PublishDiagnostics::METHOD, PublishDiagnosticsParams {
uri,
diagnostics,
version: document_version,
}).expect("failed to publish diagnostics");
}
}
fn set_status(&self, status: impl Into<String>, message: impl Into<String>, icon: impl Into<String>) {
self.endpoint.send_notification(lsp_ext::Status::METHOD, lsp_ext::StatusParams {
status: status.into(),
message: Some(message.into()),
icon: Some(icon.into()),
}).unwrap_or(());
}
}
impl LanguageServerHandling for MinecraftShaderLanguageServer {
fn initialize(&mut self, params: InitializeParams, completable: MethodCompletable<InitializeResult, InitializeError>) {
self.wait.add(1);
let mut capabilities = ServerCapabilities::default();
capabilities.hover_provider = None;
capabilities.document_link_provider = Some(DocumentLinkOptions {
resolve_provider: None,
work_done_progress_options: WorkDoneProgressOptions {
work_done_progress: None,
},
});
capabilities.execute_command_provider = Some(ExecuteCommandOptions {
commands: vec!["graphDot".into()],
work_done_progress_options: WorkDoneProgressOptions {
work_done_progress: None,
},
});
capabilities.text_document_sync = Some(TextDocumentSyncCapability::Options(
TextDocumentSyncOptions {
open_close: Some(true),
will_save: None,
will_save_wait_until: None,
change: Some(TextDocumentSyncKind::Full),
save: Some(TextDocumentSyncSaveOptions::SaveOptions(SaveOptions {
include_text: Some(true),
}))
},
));
let root = match params.root_uri {
Some(uri) => PathBuf::from_url(uri),
None => {
completable.complete(Err(MethodError {
code: 42069,
message: "Must be in workspace".into(),
data: InitializeError {
retry: false,
},
}));
return;
}
};
completable.complete(Ok(InitializeResult {
capabilities,
server_info: None,
}));
self.set_status("loading", "Building dependency graph...", "$(loading~spin)");
self.root = root;
self.gen_initial_graph();
self.set_status("ready", "Project initialized", "$(check)");
}
fn shutdown(&mut self, _: (), completable: LSCompletable<()>) {
eprintln!("shutting down language server...");
completable.complete(Ok(()));
}
fn exit(&mut self, _: ()) {
self.endpoint.request_shutdown();
}
fn workspace_change_configuration(&mut self, params: DidChangeConfigurationParams) {
//let config = params.settings.as_object().unwrap().get("mcglsl").unwrap();
eprintln!("{:?}", params.settings.as_object().unwrap());
self.wait.done();
}
fn did_open_text_document(&mut self, params: DidOpenTextDocumentParams) {
//eprintln!("opened doc {}", params.text_document.uri);
let path = PathBuf::from_url(params.text_document.uri);
if self.graph.borrow_mut().find_node(&path).is_none() {
self.add_file_and_includes_to_graph(&path);
}
match self.lint(&path) {
Ok(diagnostics) => self.publish_diagnostic(diagnostics, None),
Err(e) => eprintln!("error linting: {}", e),
}
}
fn did_change_text_document(&mut self, _: DidChangeTextDocumentParams) {}
fn did_close_text_document(&mut self, _: DidCloseTextDocumentParams) {}
fn did_save_text_document(&mut self, params: DidSaveTextDocumentParams) {
eprintln!("saved doc {}", params.text_document.uri);
let path = PathBuf::from_url(params.text_document.uri);
self.update_includes(&path);
match self.lint(&path) {
Ok(diagnostics) => self.publish_diagnostic(diagnostics, None),
Err(e) => eprintln!("error linting: {}", e),
}
}
fn did_change_watched_files(&mut self, _: DidChangeWatchedFilesParams) {}
fn completion(&mut self, _: TextDocumentPositionParams, completable: LSCompletable<CompletionList>) {
completable.complete(Err(Self::error_not_available(())));
}
fn resolve_completion_item(&mut self, _: CompletionItem, completable: LSCompletable<CompletionItem>) {
completable.complete(Err(Self::error_not_available(())));
}
fn hover(&mut self, _: TextDocumentPositionParams, _: LSCompletable<Hover>) {
self.wait.wait();
/* completable.complete(Ok(Hover{
contents: HoverContents::Markup(MarkupContent{
kind: MarkupKind::Markdown,
value: String::from("# Hello World"),
}),
range: None,
})); */
}
fn execute_command(&mut self, params: ExecuteCommandParams, completable: LSCompletable<Option<Value>>) {
match self.command_provider.as_ref().unwrap().execute(&params.command, params.arguments, &self.root) {
Ok(resp) => {
eprintln!("executed {} successfully", params.command);
self.endpoint.send_notification(ShowMessage::METHOD, ShowMessageParams {
typ: MessageType::Info,
message: format!("Command {} executed successfully.", params.command),
}).expect("failed to send popup/show message notification");
completable.complete(Ok(Some(resp)))
},
Err(err) => {
self.endpoint.send_notification(ShowMessage::METHOD, ShowMessageParams {
typ: MessageType::Error,
message: format!("Failed to execute `{}`. Reason: {}", params.command, err),
}).expect("failed to send popup/show message notification");
eprintln!("failed to execute {}: {}", params.command, err);
completable.complete(Err(MethodError::new(32420, err.to_string(), ())))
},
}
}
fn signature_help(&mut self, _: TextDocumentPositionParams, completable: LSCompletable<SignatureHelp>) {
completable.complete(Err(Self::error_not_available(())));
}
fn goto_definition(&mut self, _: TextDocumentPositionParams, completable: LSCompletable<Vec<Location>>) {
completable.complete(Err(Self::error_not_available(())));
}
fn references(&mut self, _: ReferenceParams, completable: LSCompletable<Vec<Location>>) {
completable.complete(Err(Self::error_not_available(())));
}
fn document_highlight(&mut self, _: TextDocumentPositionParams, completable: LSCompletable<Vec<DocumentHighlight>>) {
completable.complete(Err(Self::error_not_available(())));
}
fn document_symbols(&mut self, _: DocumentSymbolParams, completable: LSCompletable<Vec<SymbolInformation>>) {
completable.complete(Err(Self::error_not_available(())));
}
fn workspace_symbols(&mut self, _: WorkspaceSymbolParams, completable: LSCompletable<Vec<SymbolInformation>>) {
completable.complete(Err(Self::error_not_available(())));
}
fn code_action(&mut self, _: CodeActionParams, completable: LSCompletable<Vec<Command>>) {
completable.complete(Err(Self::error_not_available(())));
}
fn code_lens(&mut self, _: CodeLensParams, completable: LSCompletable<Vec<CodeLens>>) {
completable.complete(Err(Self::error_not_available(())));
}
fn code_lens_resolve(&mut self, _: CodeLens, completable: LSCompletable<CodeLens>) {
completable.complete(Err(Self::error_not_available(())));
}
fn document_link(&mut self, params: DocumentLinkParams, completable: LSCompletable<Vec<DocumentLink>>) {
eprintln!("document link file: {:?}", params.text_document.uri.to_file_path().unwrap());
// node for current document
let curr_doc = params
.text_document
.uri
.to_file_path()
.unwrap();
let node = match self.graph.borrow_mut().find_node(&curr_doc) {
Some(n) => n,
None => {
completable.complete(Ok(vec![]));
return
},
};
let edges: Vec<DocumentLink> = self
.graph
.borrow()
.child_node_indexes(node)
.into_iter()
.filter_map(|child| {
let graph = self.graph.borrow();
let value = graph.get_edge_meta(node, child);
let path = graph.get_node(child);
let url = match Url::from_file_path(&path) {
Ok(url) => url,
Err(e) => {
eprintln!("error converting {:?} into url: {:?}", path, e);
return None;
}
};
Some(DocumentLink {
range: Range::new(
Position::new(
u32::try_from(value.line).unwrap(),
u32::try_from(value.start).unwrap()),
Position::new(
u32::try_from(value.line).unwrap(),
u32::try_from(value.end).unwrap()),
),
target: Some(url),
//tooltip: Some(url.path().to_string().strip_prefix(self.root.clone().unwrap().as_str()).unwrap().to_string()),
tooltip: None,
data: None,
})
}).collect();
eprintln!("links: {:?}", edges);
completable.complete(Ok(edges));
}
fn document_link_resolve(&mut self, _: DocumentLink, completable: LSCompletable<DocumentLink>) {
completable.complete(Err(Self::error_not_available(())));
}
fn formatting(&mut self, _: DocumentFormattingParams, completable: LSCompletable<Vec<TextEdit>>) {
completable.complete(Err(Self::error_not_available(())));
}
fn range_formatting(&mut self, _: DocumentRangeFormattingParams, completable: LSCompletable<Vec<TextEdit>>) {
completable.complete(Err(Self::error_not_available(())));
}
fn on_type_formatting(&mut self, _: DocumentOnTypeFormattingParams, completable: LSCompletable<Vec<TextEdit>>) {
completable.complete(Err(Self::error_not_available(())));
}
fn rename(&mut self, _: RenameParams, completable: LSCompletable<WorkspaceEdit>) {
completable.complete(Err(Self::error_not_available(())));
}
}


@@ -1,208 +0,0 @@
use std::{collections::{HashMap, LinkedList, VecDeque}, path::PathBuf};
use std::iter::Peekable;
use std::cmp::min;
use core::slice::Iter;
use petgraph::stable_graph::NodeIndex;
use crate::graph::CachedStableGraph;
pub fn generate_merge_list<'a>(
nodes: &'a [(NodeIndex, Option<NodeIndex>)],
sources: &'a HashMap<PathBuf, String>,
graph: &'a CachedStableGraph
) -> String {
let mut line_directives: Vec<String> = Vec::new();
// list of source code views onto the below sources
let mut merge_list: LinkedList<&'a str> = LinkedList::new();
line_directives.reserve(nodes.len() * 2);
let mut last_offset_set: HashMap<PathBuf, usize> = HashMap::new();
let mut nodes_iter = nodes.iter().peekable();
let first = nodes_iter.next().unwrap().0;
let first_path = graph.get_node(first).clone();
last_offset_set.insert(first_path.clone(), 0);
let line_ending_offset = if is_crlf(sources.get(&first_path).unwrap()) {
2
} else {
1
};
// stack to keep track of the depth first traversal
let mut stack = VecDeque::<NodeIndex>::new();
create_merge_views(&mut nodes_iter, &mut merge_list, &mut last_offset_set, graph, sources, &mut line_directives, &mut stack, line_ending_offset);
// now we add a view of the remainder of the root file
let offset = *last_offset_set.get(&first_path).unwrap();
let len = sources.get(&first_path).unwrap().len();
merge_list.push_back(&sources.get(&first_path).unwrap()[min(offset, len) ..]);
let total_len = merge_list.iter().fold(0, |a, b| {
a + b.len()
});
let mut merged = String::with_capacity(total_len);
for slice in merge_list {
merged.push_str(slice);
}
merged
}
fn is_crlf(source: &str) -> bool {
source.contains("\r\n")
}
fn create_merge_views<'a>(
nodes: &mut Peekable<Iter<(NodeIndex, Option<NodeIndex>)>>,
merge_list: &mut LinkedList<&'a str>,
last_offset_set: &mut HashMap<PathBuf, usize>,
graph: &'a CachedStableGraph,
sources: &'a HashMap<PathBuf, String>,
line_directives: &mut Vec<String>,
stack: &mut VecDeque<NodeIndex>,
line_ending_offset: usize,
) {
loop {
let n = match nodes.next() {
Some(n) => n,
None => return,
};
let parent = n.1.unwrap();
let child = n.0;
let edge = graph.get_edge_meta(parent, child);
let parent_path = graph.get_node(parent).clone();
let child_path = graph.get_node(child).clone();
let parent_source = sources.get(&parent_path).unwrap();
let (char_for_line, char_following_line) = char_offset_for_line(edge.line, parent_source, line_ending_offset);
let offset = last_offset_set.insert(parent_path.clone(), char_following_line).unwrap_or(0);
merge_list.push_back(&parent_source[offset..char_for_line]);
add_opening_line_directive(&child_path, merge_list, line_directives);
match nodes.peek() {
Some(next) => {
let next = *next;
// if the next pair's parent is not a child of the current pair, we dump the rest of this childs source
if next.1.unwrap() != child {
let child_source = sources.get(&child_path).unwrap();
// if ends in \n\n, we want to exclude the last \n for some reason. Ask optilad
let offset = {
match child_source.ends_with("\n") {
true => child_source.len()-line_ending_offset,
false => child_source.len(),
}
};
merge_list.push_back(&child_source[..offset]);
last_offset_set.insert(child_path.clone(), 0);
// +2 because edge.line is 0 indexed but #line is 1 indexed and references the *following* line
add_closing_line_directive(edge.line+2, &parent_path, merge_list, line_directives);
// if the next pair's parent is not the current pair's parent, we need to bubble up
if stack.contains(&next.1.unwrap()) {
return;
}
continue;
}
stack.push_back(parent);
create_merge_views(nodes, merge_list, last_offset_set, graph, sources, line_directives, stack, line_ending_offset);
stack.pop_back();
let offset = *last_offset_set.get(&child_path).unwrap();
let child_source = sources.get(&child_path).unwrap();
// this evaluates to false once the file contents have been exhausted aka offset = child_source.len() + 1
let end_offset = {
match child_source.ends_with("\n") {
true => line_ending_offset/* child_source.len()-1 */,
false => 0/* child_source.len() */,
}
};
if offset < child_source.len()-end_offset {
// if ends in \n\n, we want to exclude the last \n for some reason. Ask optilad
merge_list.push_back(&child_source[offset../* std::cmp::max( */child_source.len()-end_offset/* , offset) */]);
last_offset_set.insert(child_path.clone(), 0);
}
// +2 because edge.line is 0 indexed but #line is 1 indexed and references the *following* line
add_closing_line_directive(edge.line+2, &parent_path, merge_list, line_directives);
// we need to check the next item at the point of original return further down the callstack
if nodes.peek().is_some() && stack.contains(&nodes.peek().unwrap().1.unwrap()) {
return;
}
},
None => {
let child_source = sources.get(&child_path).unwrap();
// if ends in \n\n, we want to exclude the last \n for some reason. Ask optilad
let offset = {
match child_source.ends_with("\n") {
true => child_source.len()-line_ending_offset,
false => child_source.len(),
}
};
merge_list.push_back(&child_source[..offset]);
last_offset_set.insert(child_path.clone(), 0);
// +2 because edge.line is 0 indexed but #line is 1 indexed and references the *following* line
add_closing_line_directive(edge.line+2, &parent_path, merge_list, line_directives);
}
}
}
}
// returns the byte offset of the start of line number `line_num` and the byte
// offset of the first character after that line (i.e. the start of the following line)
fn char_offset_for_line(line_num: usize, source: &str, line_ending_offset: usize) -> (usize, usize) {
let mut char_for_line: usize = 0;
let mut char_following_line: usize = 0;
for (n, line) in source.lines().enumerate() {
if n == line_num {
char_following_line += line.len()+line_ending_offset;
break;
}
char_for_line += line.len()+line_ending_offset;
char_following_line = char_for_line;
}
(char_for_line, char_following_line)
}
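The helper above can be exercised in isolation. A standalone copy, for illustration, with a quick check of the offsets it produces on LF input (`line_ending_offset = 1`); it assumes every line, including the last, carries the same line ending:

```rust
// Returns the byte offset of the start of `line_num` and the byte offset of
// the start of the line after it.
fn char_offsets(line_num: usize, source: &str, line_ending_offset: usize) -> (usize, usize) {
    let mut start = 0;
    let mut after = 0;
    for (n, line) in source.lines().enumerate() {
        if n == line_num {
            after = start + line.len() + line_ending_offset;
            break;
        }
        start += line.len() + line_ending_offset;
        after = start;
    }
    (start, after)
}
```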
fn add_opening_line_directive(path: &PathBuf, merge_list: &mut LinkedList<&str>, line_directives: &mut Vec<String>) {
let line_directive = format!("#line 1 \"{}\"\n", path.to_str().unwrap().replace("\\", "\\\\"));
line_directives.push(line_directive);
unsafe_get_and_insert(merge_list, line_directives);
}
fn add_closing_line_directive(line: usize, path: &PathBuf, merge_list: &mut LinkedList<&str>, line_directives: &mut Vec<String>) {
// Optifine doesn't seem to add a leading newline if the previous line was a #line directive
let line_directive = if let Some(l) = merge_list.back() {
if l.trim().starts_with("#line") {
format!("#line {} \"{}\"\n", line, path.to_str().unwrap().replace("\\", "\\\\"))
} else {
format!("\n#line {} \"{}\"\n", line, path.to_str().unwrap().replace("\\", "\\\\"))
}
} else {
format!("\n#line {} \"{}\"\n", line, path.to_str().unwrap().replace("\\", "\\\\"))
};
line_directives.push(line_directive);
unsafe_get_and_insert(merge_list, line_directives);
}
fn unsafe_get_and_insert(merge_list: &mut LinkedList<&str>, line_directives: &Vec<String>) {
// :^) — hands the borrow checker a &str into the Vec's last element. This is
// only sound because generate_merge_list reserves the Vec up front; a
// reallocation while the merge list is alive would leave these refs dangling.
unsafe {
let vec_ptr_offset = line_directives.as_ptr().add(line_directives.len()-1);
merge_list.push_back(&vec_ptr_offset.as_ref().unwrap()[..]);
}
}
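The directives emitted above follow GLSL's `#line <line> "<source>"` form, with backslashes doubled so Windows paths survive inside the quoted string. A minimal sketch of just the formatting step:

```rust
// Format a GLSL #line directive, escaping backslashes for Windows paths.
fn line_directive(line: usize, path: &str) -> String {
    format!("#line {} \"{}\"\n", line, path.replace('\\', "\\\\"))
}
```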

File diff suppressed because it is too large


@@ -1,23 +0,0 @@
#version 120
#line 1 "!!"
#line 1 "!!"
void stuff1() {
}
#line 2 "!!"
#line 1 "!!"
void stuff2() {
}
#line 3 "!!"
#line 4 "!!"
#line 1 "!!"
void matrix() {
}
#line 5 "!!"
void main() {
}