Compare commits


44 Commits

Author SHA1 Message Date
Michael Davis
61491af15e Remove unused Result wrapper for Path->Url conversion 2024-12-20 13:33:47 -05:00
Michael Davis
a36806e326 Handle conversion to/from new LSP URL type 2024-12-20 13:33:46 -05:00
Michael Davis
b84c9a893c Replace url::Url with a String wrapper 2024-12-20 13:33:46 -05:00
Michael Davis
652e316925 LSP: Use PathBufs for workspace folders
Internally the LSP client should hold workspace folders as paths. Using
URLs for this type is inconvenient (since we compare it to paths) and
might cause mismatches because of URLs not being normalized. The URLs
must be paths anyways so we can convert these types lazily when we need
to send them to a server.
2024-12-20 13:33:46 -05:00
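The idea, as a rough sketch (names simplified; the real plumbing is in the `helix-lsp` client diff further down): store plain paths, compare plain paths, and only build a `file://` URL at the point a message actually goes out to the server.

```rust
// Illustrative only; `Client` here is a stand-in for helix-lsp's client.
use std::path::{Path, PathBuf};

struct Client {
    workspace_folders: Vec<PathBuf>,
}

impl Client {
    /// Comparing paths directly needs no URL normalization.
    fn has_workspace(&self, root: &Path) -> bool {
        self.workspace_folders.iter().any(|w| w == root)
    }

    /// Convert lazily, only when the server needs a URI.
    fn folders_for_server(&self) -> Vec<String> {
        self.workspace_folders
            .iter()
            // Stand-in for the `lsp::Url::from_directory_path` used in this diff.
            .map(|p| format!("file://{}/", p.display()))
            .collect()
    }
}
```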
Nikita Revenco
ba6e6dc3dd Colors for items in the completion menu (#12299) 2024-12-20 10:16:15 -06:00
cornishon
a91263d604 Odin textobjects (#12302)
Co-authored-by: Adam Zadrożny <zadroznyadam@protonmail.com>
2024-12-20 09:59:28 -06:00
Ian Hobson
06d0f33c94 Add Koto language support (#12307) 2024-12-20 09:56:13 -06:00
Eduardo Rittner Coelho
eaff0c3cd6 Document diagnostic severity levels (#12306) 2024-12-20 09:47:06 -06:00
uncenter
1e9412269a Sync Catppuccin theme changes (#12304) 2024-12-20 09:43:45 -06:00
Nikita Revenco
355e381626 feat: use ui.text.directory for path completion item if its a folder (#12295) 2024-12-19 14:36:54 -06:00
Tobias Hunger
cbc06d1f15 chore: Update slint tree-sitter grammar to version 1.9 (#12297) 2024-12-19 10:16:12 -06:00
Eduardo Rittner Coelho
9e4da4b950 Show parser availability in --health [LANG] (#12228) 2024-12-18 11:21:58 -06:00
Christian Schneider
13e5a2ee5a Outdent array literals for php [] (#12286)
Co-authored-by: Christian Schneider <schneider@search.ch>
2024-12-18 08:52:20 -06:00
David Else
0134bb7063 Update dark_plus theme for inactive text and improve jump label (#12289) 2024-12-18 08:32:41 -06:00
Peter Ingram
ec65cc4913 Adds colored directories to everforest themes (#12287)
Co-authored-by: Peter Ingram <p.ingram@mrx.technology>
2024-12-18 08:31:40 -06:00
Nikita Revenco
91a5d407da Allow theming directory prompt completions (#12205) 2024-12-17 21:13:42 -06:00
Michael Davis
6eb186eb7b helix-lsp-types: use bitflags::bitflags rather than extern crate
This seems to be a historical artifact in `lsp_types` - we can use a
regular `use` statement to pull in the `bitflags!` macro rather than
an external crate definition. This fixes rust-analyzer's ability to find
the macro at least on rust-analyzer 2024-02-26.
2024-12-17 15:42:36 -05:00
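Concretely, the change amounts to swapping a 2015-edition macro import for a plain `use`; the `Example` flags type below is made up for illustration:

```rust
// Before:
// #[macro_use]
// extern crate bitflags;
//
// After: a normal import that rust-analyzer can resolve.
use bitflags::bitflags;

bitflags! {
    /// Made-up flags type, not one from helix-lsp-types.
    pub struct Example: u8 {
        const A = 0b01;
        const B = 0b10;
    }
}
```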
Michael Davis
1980bd5992 helix-lsp-types: Prefer crate::Url to url::Url
This is a cosmetic change to replace all direct `use`s of the `url::Url`
type in the `helix-lsp-types` crate with `use crate::Url;`. The types
are the same type currently: this refactor will make a future
replacement of the Url type less noisy.

Connects https://github.com/helix-editor/helix/pull/11889
2024-12-17 15:42:28 -05:00
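The mechanism is a plain re-export, so the crate's modules never name the external type directly:

```rust
// In helix-lsp-types' lib.rs (as of this commit):
pub use url::Url;

// In any other module of the crate:
use crate::Url;
```

Once every module imports `crate::Url`, replacing the single `pub use` with a local definition (as commit b84c9a893c does) leaves the rest of the crate untouched.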
Tim Sampson
cc3b77b584 dockerfile: bump tree-sitter grammar to gain support for heredocs (#12230) 2024-12-17 13:26:49 -06:00
Christian Schneider
fcded6ce1e Trim trailing colons from paths to allow copy/pasting git grep -n output (#9963)
Co-authored-by: Christian Schneider <schneider@search.ch>
2024-12-17 13:02:06 -06:00
Pascal Kuthe
1badd9e434 implement snippet tabstop support 2024-12-17 13:34:40 -05:00
Pascal Kuthe
66fb1e67c0 add fallback onNextKey
adds a variant of on_next_key callbacks that are only called when no other
mapping matches a key
2024-12-17 13:34:40 -05:00
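A hypothetical sketch of the dispatch rule (names invented here, not helix's actual API): the pending callback carries a flag, and a fallback callback only fires when the keymap produced no match.

```rust
// Invented names for illustration; not helix's real types.
enum OnKeyKind {
    Unconditional,
    Fallback,
}

struct PendingKey {
    kind: OnKeyKind,
    callback: Box<dyn FnOnce(char)>,
}

fn dispatch(pending: Option<PendingKey>, key: char, keymap_matched: bool) {
    if let Some(p) = pending {
        match p.kind {
            // Classic on_next_key: always consumes the key.
            OnKeyKind::Unconditional => (p.callback)(key),
            // New variant: only runs when no other mapping matched.
            OnKeyKind::Fallback if !keymap_matched => (p.callback)(key),
            OnKeyKind::Fallback => {}
        }
    }
}
```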
Pascal Kuthe
609c29bf7e add DocumentFocusLost event 2024-12-17 13:34:40 -05:00
Pascal Kuthe
5537e68b5e add changes and ghost_transaction to DocumentDidChange events 2024-12-17 13:34:40 -05:00
Pascal Kuthe
c8c0d04168 add snippet system to helix core 2024-12-17 13:34:39 -05:00
Pascal Kuthe
db959274d4 Add range type to helix stdx 2024-12-17 13:34:39 -05:00
dependabot[bot]
312c64f0c2 build(deps): bump the rust-dependencies group with 10 updates (#12277)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-12-16 18:48:13 -06:00
André Sá
67535804a5 Fix build from source with Spade tree-sitter grammar (#12276) 2024-12-16 14:44:28 -06:00
Michael Davis
bae6a58c3c Add block-comment-tokens configuration for Java
Ref https://github.com/helix-editor/helix/pull/12266#issuecomment-2546370787
2024-12-16 14:02:35 -05:00
Integral
250d9fa8fe Avoid allocating the --help message (#12243) 2024-12-16 11:16:48 -06:00
Aaalibaba
3b36cf1a15 Expand tildes in :read command (#12271) 2024-12-16 11:10:35 -06:00
Nikita Revenco
99fdbce566 docs: remove mention that - requires special handling (#12250) 2024-12-16 10:01:14 -06:00
David Else
9b14750e56 Add ltex-ls-plus language server (#12251) 2024-12-16 09:37:49 -06:00
TornaxO7
4e5b0644a2 language: add comment token for java files (#12266) 2024-12-16 09:24:04 -06:00
Takumi Matsuura
e14c346ee7 Fix panic in kill_to_end_of_line when handling multibyte characters (#12237) 2024-12-13 14:04:52 -06:00
RoloEdits
617f538d41 feat(highlights): add COMPLIANCE to error (#12244) 2024-12-13 13:26:08 -06:00
Yuki Kobayashi
ce133a2889 languages(v): use vlang/v-analyzer instead of v-analyzer/v-analyzer (#12236)
* use vlang/v-analyzer instead of v-analyzer/v-analyzer

* revert rev, because CI failed (couldn't repro working query-check locally, so not sure if this will work)
2024-12-13 12:09:24 +09:00
TornaxO7
89a7cde2f0 Fix continuing comment token for first line (#12215) 2024-12-10 13:24:34 -06:00
TornaxO7
51ac3e05e0 doc: fix default value in doc for continue-comments (#12235) 2024-12-10 13:19:31 -06:00
TornaxO7
5005c14e99 Add config option for continue commenting (#12213)
Co-authored-by: Michael Davis <mcarsondavis@gmail.com>
2024-12-09 17:31:41 -06:00
Michael Davis
2f74530328 helix-lsp-types: Remove Cargo.lock
This lockfile is unused since this crate was added to the workspace and
can be removed.

Closes #12227
2024-12-09 17:14:38 -05:00
Tshepang Mbambo
a1a5faebef typo (#12224) 2024-12-09 12:23:30 -06:00
Nikita Revenco
db1d84256f fix: report correct amount of files opened and improved error message when Helix can't parse directory as file (#12199)
* feat: improve information on the amount of files loaded

* refactor: naming consistency: Doc and not Buf

* fix: correct name of method

* chore: appease clippy

* feat: more human error information when Helix cannot start

* refactor: use if guard on match arm
2024-12-08 20:14:29 +09:00
Michael Davis
271c32f2e6 Support bindings with the Super (Cmd/Win/Meta) modifier (#6592)
Terminals which support the enhanced keyboard protocol send events for
keys pressed with the Super modifier (Windows/Linux key or the Command
key). The only changes that are needed to support this in Helix are:

* Mapping the modifier from crossterm's KeyModifiers to Helix's
  KeyModifiers.
* Representing and parsing the modifier from the KeyEvent text
  representation.
* Documenting the ability to remap it.

When writing keybindings, use 'Meta-', 'Cmd-' or 'Win-' which are all
synonymous. For example:

    [keys.normal]
    Cmd-s = ":write"

will trigger for the Windows or Linux keys and the Command key plus 's'.
2024-12-08 12:35:14 +09:00
80 changed files with 1624 additions and 1464 deletions

Cargo.lock generated
View File

@@ -68,9 +68,9 @@ dependencies = [
[[package]]
name = "anyhow"
version = "1.0.93"
version = "1.0.94"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "4c95c10ba0b00a02636238b814946408b1322d5ac4760326e6fb8ec956d85775"
checksum = "c1fd03a028ef38ba2276dce7e33fcd6369c158a1bca17946c4b1b701891c1ff7"
[[package]]
name = "arc-swap"
@@ -136,9 +136,9 @@ checksum = "df8670b8c7b9dae1793364eafadf7239c40d669904660c5960d74cfd80b46a53"
[[package]]
name = "cc"
version = "1.2.2"
version = "1.2.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f34d93e62b03caf570cccc334cbc6c2fceca82f39211051345108adcba3eebdc"
checksum = "9157bbaa6b165880c27a4293a474c91cdcf265cc68cc829bf10be0964a391caf"
dependencies = [
"shlex",
]
@@ -162,9 +162,9 @@ dependencies = [
[[package]]
name = "chrono"
version = "0.4.38"
version = "0.4.39"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "a21f936df1771bf62b77f047b726c4625ff2e8aa607c01ec06e5a05bd8463401"
checksum = "7e36cc9d416881d2e24f9a963be5fb1cd90966419ac844274161d10488b3e825"
dependencies = [
"android-tzdata",
"iana-time-zone",
@@ -327,12 +327,12 @@ checksum = "5443807d6dff69373d433ab9ef5378ad8df50ca6298caf15de6e52e24aaf54d5"
[[package]]
name = "errno"
version = "0.3.9"
version = "0.3.10"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "534c5cf6194dfab3db3242765c03bbe257cf92f22b38f6bc0c58d59108a820ba"
checksum = "33d852cb9b869c2a9b3df2f71a3074817f01e1844f839a144f5fcef059a4eb5d"
dependencies = [
"libc",
"windows-sys 0.52.0",
"windows-sys 0.59.0",
]
[[package]]
@@ -369,9 +369,9 @@ checksum = "e8c02a5121d4ea3eb16a80748c74f5549a5665e4c21333c6098f283870fbdea6"
[[package]]
name = "fern"
version = "0.7.0"
version = "0.7.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "69ff9c9d5fb3e6da8ac2f77ab76fe7e8087d512ce095200f8f29ac5b656cf6dc"
checksum = "4316185f709b23713e41e3195f90edef7fb00c3ed4adc79769cf09cc762a3b29"
dependencies = [
"log",
]
@@ -522,7 +522,7 @@ dependencies = [
"gix-worktree",
"once_cell",
"smallvec",
"thiserror 2.0.3",
"thiserror 2.0.7",
]
[[package]]
@@ -535,7 +535,7 @@ dependencies = [
"gix-date",
"gix-utils",
"itoa",
"thiserror 2.0.3",
"thiserror 2.0.7",
"winnow",
]
@@ -552,7 +552,7 @@ dependencies = [
"gix-trace",
"kstring",
"smallvec",
"thiserror 2.0.3",
"thiserror 2.0.7",
"unicode-bom",
]
@@ -562,7 +562,7 @@ version = "0.2.13"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "d48b897b4bbc881aea994b4a5bbb340a04979d7be9089791304e04a9fbc66b53"
dependencies = [
"thiserror 2.0.3",
"thiserror 2.0.7",
]
[[package]]
@@ -571,7 +571,7 @@ version = "0.4.10"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "c6ffbeb3a5c0b8b84c3fe4133a6f8c82fa962f4caefe8d0762eced025d3eb4f7"
dependencies = [
"thiserror 2.0.3",
"thiserror 2.0.7",
]
[[package]]
@@ -597,7 +597,7 @@ dependencies = [
"gix-features",
"gix-hash",
"memmap2",
"thiserror 2.0.3",
"thiserror 2.0.7",
]
[[package]]
@@ -616,7 +616,7 @@ dependencies = [
"memchr",
"once_cell",
"smallvec",
"thiserror 2.0.3",
"thiserror 2.0.7",
"unicode-bom",
"winnow",
]
@@ -631,7 +631,7 @@ dependencies = [
"bstr",
"gix-path",
"libc",
"thiserror 2.0.3",
"thiserror 2.0.7",
]
[[package]]
@@ -643,7 +643,7 @@ dependencies = [
"bstr",
"itoa",
"jiff",
"thiserror 2.0.3",
"thiserror 2.0.7",
]
[[package]]
@@ -664,7 +664,7 @@ dependencies = [
"gix-traverse",
"gix-worktree",
"imara-diff",
"thiserror 2.0.3",
"thiserror 2.0.7",
]
[[package]]
@@ -684,7 +684,7 @@ dependencies = [
"gix-trace",
"gix-utils",
"gix-worktree",
"thiserror 2.0.3",
"thiserror 2.0.7",
]
[[package]]
@@ -700,7 +700,7 @@ dependencies = [
"gix-path",
"gix-ref",
"gix-sec",
"thiserror 2.0.3",
"thiserror 2.0.7",
]
[[package]]
@@ -718,7 +718,7 @@ dependencies = [
"once_cell",
"prodash",
"sha1_smol",
"thiserror 2.0.3",
"thiserror 2.0.7",
"walkdir",
]
@@ -740,7 +740,7 @@ dependencies = [
"gix-trace",
"gix-utils",
"smallvec",
"thiserror 2.0.3",
"thiserror 2.0.7",
]
[[package]]
@@ -773,7 +773,7 @@ source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "0b5eccc17194ed0e67d49285e4853307e4147e95407f91c1c3e4a13ba9f4e4ce"
dependencies = [
"faster-hex",
"thiserror 2.0.3",
"thiserror 2.0.7",
]
[[package]]
@@ -825,7 +825,7 @@ dependencies = [
"memmap2",
"rustix",
"smallvec",
"thiserror 2.0.3",
"thiserror 2.0.7",
]
[[package]]
@@ -855,7 +855,7 @@ dependencies = [
"gix-validate",
"itoa",
"smallvec",
"thiserror 2.0.3",
"thiserror 2.0.7",
"winnow",
]
@@ -877,7 +877,7 @@ dependencies = [
"gix-quote",
"parking_lot",
"tempfile",
"thiserror 2.0.3",
"thiserror 2.0.7",
]
[[package]]
@@ -895,7 +895,7 @@ dependencies = [
"gix-path",
"memmap2",
"smallvec",
"thiserror 2.0.3",
"thiserror 2.0.7",
]
[[package]]
@@ -907,7 +907,7 @@ dependencies = [
"bstr",
"faster-hex",
"gix-trace",
"thiserror 2.0.3",
"thiserror 2.0.7",
]
[[package]]
@@ -920,7 +920,7 @@ dependencies = [
"gix-trace",
"home",
"once_cell",
"thiserror 2.0.3",
"thiserror 2.0.7",
]
[[package]]
@@ -935,7 +935,7 @@ dependencies = [
"gix-config-value",
"gix-glob",
"gix-path",
"thiserror 2.0.3",
"thiserror 2.0.7",
]
[[package]]
@@ -946,7 +946,7 @@ checksum = "64a1e282216ec2ab2816cd57e6ed88f8009e634aec47562883c05ac8a7009a63"
dependencies = [
"bstr",
"gix-utils",
"thiserror 2.0.3",
"thiserror 2.0.7",
]
[[package]]
@@ -966,7 +966,7 @@ dependencies = [
"gix-utils",
"gix-validate",
"memmap2",
"thiserror 2.0.3",
"thiserror 2.0.7",
"winnow",
]
@@ -981,7 +981,7 @@ dependencies = [
"gix-revision",
"gix-validate",
"smallvec",
"thiserror 2.0.3",
"thiserror 2.0.7",
]
[[package]]
@@ -996,7 +996,7 @@ dependencies = [
"gix-hash",
"gix-object",
"gix-revwalk",
"thiserror 2.0.3",
"thiserror 2.0.7",
]
[[package]]
@@ -1011,7 +1011,7 @@ dependencies = [
"gix-hashtable",
"gix-object",
"smallvec",
"thiserror 2.0.3",
"thiserror 2.0.7",
]
[[package]]
@@ -1046,7 +1046,7 @@ dependencies = [
"gix-pathspec",
"gix-worktree",
"portable-atomic",
"thiserror 2.0.3",
"thiserror 2.0.7",
]
[[package]]
@@ -1061,7 +1061,7 @@ dependencies = [
"gix-pathspec",
"gix-refspec",
"gix-url",
"thiserror 2.0.3",
"thiserror 2.0.7",
]
[[package]]
@@ -1098,7 +1098,7 @@ dependencies = [
"gix-object",
"gix-revwalk",
"smallvec",
"thiserror 2.0.3",
"thiserror 2.0.7",
]
[[package]]
@@ -1110,7 +1110,7 @@ dependencies = [
"bstr",
"gix-features",
"gix-path",
"thiserror 2.0.3",
"thiserror 2.0.7",
"url",
]
@@ -1132,7 +1132,7 @@ source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "cd520d09f9f585b34b32aba1d0b36ada89ab7fefb54a8ca3fe37fc482a750937"
dependencies = [
"bstr",
"thiserror 2.0.3",
"thiserror 2.0.7",
]
[[package]]
@@ -1237,6 +1237,7 @@ dependencies = [
"nucleo",
"once_cell",
"parking_lot",
"percent-encoding",
"quickcheck",
"regex",
"regex-cursor",
@@ -1252,7 +1253,6 @@ dependencies = [
"unicode-general-category",
"unicode-segmentation",
"unicode-width",
"url",
]
[[package]]
@@ -1266,7 +1266,7 @@ dependencies = [
"log",
"serde",
"serde_json",
"thiserror 2.0.3",
"thiserror 2.0.7",
"tokio",
]
@@ -1322,7 +1322,7 @@ dependencies = [
"serde",
"serde_json",
"slotmap",
"thiserror 2.0.3",
"thiserror 2.0.7",
"tokio",
"tokio-stream",
]
@@ -1332,10 +1332,10 @@ name = "helix-lsp-types"
version = "0.95.1"
dependencies = [
"bitflags",
"percent-encoding",
"serde",
"serde_json",
"serde_repr",
"url",
]
[[package]]
@@ -1397,7 +1397,7 @@ dependencies = [
"smallvec",
"tempfile",
"termini",
"thiserror 2.0.3",
"thiserror 2.0.7",
"tokio",
"tokio-stream",
"toml",
@@ -1464,11 +1464,10 @@ dependencies = [
"serde_json",
"slotmap",
"tempfile",
"thiserror 2.0.3",
"thiserror 2.0.7",
"tokio",
"tokio-stream",
"toml",
"url",
]
[[package]]
@@ -1760,9 +1759,9 @@ dependencies = [
[[package]]
name = "libc"
version = "0.2.167"
version = "0.2.168"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "09d6582e104315a817dff97f75133544b2e094ee22447d2acf4a74e189ba06fc"
checksum = "5aaeb2981e0606ca11d79718f8bb01164f1d6ed75080182d3abf017e6d244b6d"
[[package]]
name = "libloading"
@@ -2133,15 +2132,15 @@ checksum = "719b953e2095829ee67db738b3bfa9fa368c94900df327b3f07fe6e794d2fe1f"
[[package]]
name = "rustix"
version = "0.38.41"
version = "0.38.42"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "d7f649912bc1495e167a6edee79151c84b1bad49748cb4f1f1167f459f6224f6"
checksum = "f93dc38ecbab2eb790ff964bb77fa94faf256fd3e73285fd7ba0903b76bedb85"
dependencies = [
"bitflags",
"errno",
"libc",
"linux-raw-sys",
"windows-sys 0.52.0",
"windows-sys 0.59.0",
]
[[package]]
@@ -2167,18 +2166,18 @@ checksum = "94143f37725109f92c262ed2cf5e59bce7498c01bcc1502d7b9afe439a4e9f49"
[[package]]
name = "serde"
version = "1.0.215"
version = "1.0.216"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "6513c1ad0b11a9376da888e3e0baa0077f1aed55c17f50e7b2397136129fb88f"
checksum = "0b9781016e935a97e8beecf0c933758c97a5520d32930e460142b4cd80c6338e"
dependencies = [
"serde_derive",
]
[[package]]
name = "serde_derive"
version = "1.0.215"
version = "1.0.216"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ad1e866f866923f252f05c889987993144fb74e722403468a4ebd70c3cd756c0"
checksum = "46f859dbbf73865c6627ed570e78961cd3ac92407a2d117204c49232485da55e"
dependencies = [
"proc-macro2",
"quote",
@@ -2412,11 +2411,11 @@ dependencies = [
[[package]]
name = "thiserror"
version = "2.0.3"
version = "2.0.7"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "c006c85c7651b3cf2ada4584faa36773bd07bac24acfb39f3c431b36d7e667aa"
checksum = "93605438cbd668185516ab499d589afb7ee1859ea3d5fc8f6b0755e1c7443767"
dependencies = [
"thiserror-impl 2.0.3",
"thiserror-impl 2.0.7",
]
[[package]]
@@ -2432,9 +2431,9 @@ dependencies = [
[[package]]
name = "thiserror-impl"
version = "2.0.3"
version = "2.0.7"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f077553d607adc1caf65430528a576c757a71ed73944b66ebb58ef2bbd243568"
checksum = "e1d8749b4531af2117677a5fcd12b1348a3fe2b81e36e61ffeac5c4aa3273e36"
dependencies = [
"proc-macro2",
"quote",
@@ -2477,9 +2476,9 @@ checksum = "1f3ccbac311fea05f86f61904b462b55fb3df8837a366dfc601a0161d0532f20"
[[package]]
name = "tokio"
version = "1.41.1"
version = "1.42.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "22cfb5bee7a6a52939ca9224d6ac897bb669134078daa8735560897f69de4d33"
checksum = "5cec9b21b0450273377fc97bd4c33a8acffc8c996c987a7c5b319a0083707551"
dependencies = [
"backtrace",
"bytes",
@@ -2506,9 +2505,9 @@ dependencies = [
[[package]]
name = "tokio-stream"
version = "0.1.16"
version = "0.1.17"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "4f4e6ce100d0eb49a2734f8c0812bcd324cf357d21810932c5df6b96ef2b86f1"
checksum = "eca58d7bba4a75707817a2c44174253f9236b2d5fbd055602e9d5c07c139a047"
dependencies = [
"futures-core",
"pin-project-lite",
@@ -2622,7 +2621,6 @@ dependencies = [
"form_urlencoded",
"idna",
"percent-encoding",
"serde",
]
[[package]]

View File

@@ -42,6 +42,7 @@ tree-sitter = { version = "0.22" }
nucleo = "0.5.0"
slotmap = "1.0.7"
thiserror = "2.0"
percent-encoding = "2.3"
[workspace.package]
version = "24.7.0"

View File

@@ -31,6 +31,7 @@
| `line-number` | Line number display: `absolute` simply shows each line's number, while `relative` shows the distance from the current line. When unfocused or in insert mode, `relative` will still show absolute line numbers | `absolute` |
| `cursorline` | Highlight all lines with a cursor | `false` |
| `cursorcolumn` | Highlight all columns with a cursor | `false` |
| `continue-comments` | if helix should automatically add a line comment token if you create a new line inside a comment. | `true` |
| `gutters` | Gutters to display: Available are `diagnostics` and `diff` and `line-numbers` and `spacer`, note that `diagnostics` also includes other features like breakpoints, 1-width padding will be inserted if gutters is non-empty | `["diagnostics", "spacer", "line-numbers", "spacer", "diff"]` |
| `auto-completion` | Enable automatic pop up of auto-completion | `true` |
| `path-completion` | Enable filepath completion. Show files and directories if an existing path at the cursor was recognized, either absolute or relative to the current opened document or current working directory (if the buffer is not yet saved). Defaults to true. | `true` |
@@ -58,7 +59,7 @@
### `[editor.clipboard-provider]` Section
Helix can be configured wither to use a builtin clipboard configuration or to use
Helix can be configured either to use a builtin clipboard configuration or to use
a provided command.
For instance, setting it to use OSC 52 termcodes, the configuration would be:
@@ -441,6 +442,8 @@ fn main() {
| `max-wrap` | Equivalent of the `editor.soft-wrap.max-wrap` option for diagnostics. | `20` |
| `max-diagnostics` | Maximum number of diagnostics to render inline for a given line | `10` |
The allowed values for `cursor-line` and `other-lines` are: `error`, `warning`, `info`, `hint`.
The (first) diagnostic with the highest severity that is not shown inline is rendered at the end of the line (as long as its severity is higher than the `end-of-line-diagnostics` config option):
```

View File

@@ -115,6 +115,7 @@
| kdl | ✓ | ✓ | ✓ | |
| koka | ✓ | | ✓ | `koka` |
| kotlin | ✓ | | | `kotlin-language-server` |
| koto | ✓ | ✓ | ✓ | `koto-ls` |
| latex | ✓ | ✓ | | `texlab` |
| ld | ✓ | | ✓ | |
| ldif | ✓ | | | |
@@ -146,7 +147,7 @@
| nunjucks | ✓ | | | |
| ocaml | ✓ | | ✓ | `ocamllsp` |
| ocaml-interface | ✓ | | | `ocamllsp` |
| odin | ✓ | | ✓ | `ols` |
| odin | ✓ | ✓ | ✓ | `ols` |
| ohm | ✓ | ✓ | ✓ | |
| opencl | ✓ | ✓ | ✓ | `clangd` |
| openscad | ✓ | | | `openscad-lsp` |

View File

@@ -60,7 +60,7 @@ These configuration keys are available:
| `shebangs` | The interpreters from the shebang line, for example `["sh", "bash"]` |
| `roots` | A set of marker files to look for when trying to find the workspace root. For example `Cargo.lock`, `yarn.lock` |
| `auto-format` | Whether to autoformat this language when saving |
| `diagnostic-severity` | Minimal severity of diagnostic for it to be displayed. (Allowed values: `Error`, `Warning`, `Info`, `Hint`) |
| `diagnostic-severity` | Minimal severity of diagnostic for it to be displayed. (Allowed values: `error`, `warning`, `info`, `hint`) |
| `comment-tokens` | The tokens to use as a comment token, either a single token `"//"` or an array `["//", "///", "//!"]` (the first token will be used for commenting). Also configurable as `comment-token` for backwards compatibility|
| `block-comment-tokens`| The start and end tokens for a multiline comment either an array or single table of `{ start = "/*", end = "*/"}`. The first set of tokens will be used for commenting, any pairs in the array can be uncommented |
| `indent` | The indent to use. Has sub keys `unit` (the text inserted into the document when indenting; usually set to N spaces or `"\t"` for tabs) and `tab-width` (the number of spaces rendered for a tab) |

View File

@@ -72,15 +72,28 @@ t = ":run-shell-command cargo test"
## Special keys and modifiers
Ctrl, Shift and Alt modifiers are encoded respectively with the prefixes
`C-`, `S-` and `A-`. Special keys are encoded as follows:
Ctrl, Shift and Alt modifiers are encoded respectively with the prefixes `C-`, `S-` and `A-`.
The [Super key](https://en.wikipedia.org/wiki/Super_key_(keyboard_button)) - the Windows/Linux
key or the Command key on Mac keyboards - is also supported when using a terminal emulator that
supports the [enhanced keyboard protocol](https://github.com/helix-editor/helix/wiki/Terminal-Support#enhanced-keyboard-protocol).
The super key is encoded with prefixes `Meta-`, `Cmd-` or `Win-`. These are all synonyms for the
super modifier - binding a key with a `Win-` modifier will mean it can be used with the
Windows/Linux key or the Command key.
```toml
[keys.normal]
C-s = ":write" # Ctrl and 's' to write
Cmd-s = ":write" # Cmd or Win or Meta and 's' to write
```
Special keys are encoded as follows:
| Key name | Representation |
| --- | --- |
| Backspace | `"backspace"` |
| Space | `"space"` |
| Return/Enter | `"ret"` |
| \- | `"minus"` |
| Left | `"left"` |
| Right | `"right"` |
| Up | `"up"` |
@@ -96,3 +109,14 @@ Ctrl, Shift and Alt modifiers are encoded respectively with the prefixes
| Escape | `"esc"` |
Keys can be disabled by binding them to the `no_op` command.
All other keys such as `?`, `!`, `-` etc. can be used literally:
```toml
[keys.normal]
"?" = ":write"
"!" = ":write"
"-" = ":write"
```
Note: `-` can't be used when combined with a modifier, for example `Alt` + `-` should be written as `A-minus`. `A--` is not accepted.

View File

@@ -305,6 +305,7 @@ These scopes are used for theming the editor interface:
| `ui.text.focus` | The currently selected line in the picker |
| `ui.text.inactive` | Same as `ui.text` but when the text is inactive (e.g. suggestions) |
| `ui.text.info` | The key: command text in `ui.popup.info` boxes |
| `ui.text.directory` | Directory names in prompt completion |
| `ui.virtual.ruler` | Ruler columns (see the [`editor.rulers` config][editor-section]) |
| `ui.virtual.whitespace` | Visible whitespace characters |
| `ui.virtual.indent-guide` | Vertical indent width guides |

View File

@@ -40,7 +40,7 @@ bitflags = "2.6"
ahash = "0.8.11"
hashbrown = { version = "0.14.5", features = ["raw"] }
dunce = "1.0"
url = "2.5.4"
percent-encoding.workspace = true
log = "0.4"
anyhow = "1.0"

View File

@@ -1,6 +1,5 @@
use std::borrow::Cow;
use crate::diagnostic::LanguageServerId;
use crate::Transaction;
#[derive(Debug, PartialEq, Clone)]
@@ -10,17 +9,4 @@ pub struct CompletionItem {
pub kind: Cow<'static, str>,
/// Containing Markdown
pub documentation: String,
pub provider: CompletionProvider,
}
#[derive(Debug, PartialEq, Eq, Hash, Clone, Copy)]
pub enum CompletionProvider {
Lsp(LanguageServerId),
PathCompletions,
}
impl From<LanguageServerId> for CompletionProvider {
fn from(id: LanguageServerId) -> Self {
CompletionProvider::Lsp(id)
}
}

View File

@@ -210,8 +210,8 @@ fn whitespace_with_same_width(text: RopeSlice) -> String {
s
}
/// normalizes indentation to tabs/spaces based on user configurtion This
/// function does not change the actual indentaiton width just the character
/// normalizes indentation to tabs/spaces based on user configuration
/// This function does not change the actual indentation width, just the character
/// composition.
pub fn normalize_indentation(
prefix: RopeSlice<'_>,

View File

@@ -13,7 +13,7 @@ use crate::{Assoc, ChangeSet, Selection, Transaction};
pub struct ActiveSnippet {
ranges: Vec<Range>,
active_tabstops: HashSet<TabstopIdx>,
active_tabstop: TabstopIdx,
current_tabstop: TabstopIdx,
tabstops: Vec<Tabstop>,
}
@@ -36,7 +36,7 @@ impl ActiveSnippet {
ranges: snippet.ranges,
tabstops: snippet.tabstops,
active_tabstops: HashSet::new(),
active_tabstop: TabstopIdx(0),
current_tabstop: TabstopIdx(0),
};
(snippet.tabstops.len() != 1).then_some(snippet)
}
@@ -52,14 +52,14 @@ impl ActiveSnippet {
pub fn delete_placeholder(&self, doc: &Rope) -> Transaction {
Transaction::delete(
doc,
self[self.active_tabstop]
self[self.current_tabstop]
.ranges
.iter()
.map(|range| (range.start, range.end)),
)
}
/// maps the active snippets trough a `ChangeSet` updating all tabstop ranges
/// maps the active snippets through a `ChangeSet` updating all tabstop ranges
pub fn map(&mut self, changes: &ChangeSet) -> bool {
let positions_to_map = self.ranges.iter_mut().flat_map(|range| {
[
@@ -112,13 +112,13 @@ impl ActiveSnippet {
if retain {
range.start = range.start.max(snippet_range.start);
range.end = range.end.max(range.start).min(snippet_range.end);
// garunteed by assoc
// guaranteed by assoc
debug_assert!(prev.start <= range.start);
debug_assert!(range.start <= range.end);
if prev.end > range.start {
// not really sure what to do in this case. It shouldn't
// really occur in practice% the below just ensures
// our invriants hold
// really occur in practice, the below just ensures
// our invariants hold
range.start = prev.end;
range.end = range.end.max(range.start)
}
@@ -132,11 +132,11 @@ impl ActiveSnippet {
pub fn next_tabstop(&mut self, current_selection: &Selection) -> (Selection, bool) {
let primary_idx = self.primary_idx(current_selection);
while self.active_tabstop.0 + 1 < self.tabstops.len() {
self.active_tabstop.0 += 1;
while self.current_tabstop.0 + 1 < self.tabstops.len() {
self.current_tabstop.0 += 1;
if self.activate_tabstop() {
let selection = self.tabstop_selection(primary_idx, Direction::Forward);
return (selection, self.active_tabstop.0 + 1 == self.tabstops.len());
return (selection, self.current_tabstop.0 + 1 == self.tabstops.len());
}
}
@@ -148,15 +148,15 @@ impl ActiveSnippet {
pub fn prev_tabstop(&mut self, current_selection: &Selection) -> Option<Selection> {
let primary_idx = self.primary_idx(current_selection);
while self.active_tabstop.0 != 0 {
self.active_tabstop.0 -= 1;
while self.current_tabstop.0 != 0 {
self.current_tabstop.0 -= 1;
if self.activate_tabstop() {
return Some(self.tabstop_selection(primary_idx, Direction::Forward));
}
}
None
}
// computes the primary idx adjust for the number of cursors in the current tabstop
// computes the primary idx adjusted for the number of cursors in the current tabstop
fn primary_idx(&self, current_selection: &Selection) -> usize {
let primary: Range = current_selection.primary().into();
let res = self
@@ -172,19 +172,19 @@ impl ActiveSnippet {
}
fn activate_tabstop(&mut self) -> bool {
let tabstop = &self[self.active_tabstop];
let tabstop = &self[self.current_tabstop];
if tabstop.has_placeholder() && tabstop.ranges.iter().all(|range| range.is_empty()) {
return false;
}
self.active_tabstops.clear();
self.active_tabstops.insert(self.active_tabstop);
let mut parent = self[self.active_tabstop].parent;
self.active_tabstops.insert(self.current_tabstop);
let mut parent = self[self.current_tabstop].parent;
while let Some(tabstop) = parent {
self.active_tabstops.insert(tabstop);
parent = self[tabstop].parent;
}
true
// TODO: if the user removes the seleciton(s) in one snippet (but
// TODO: if the user removes the selection(s) in one snippet (but
// there are still other cursors in other snippets) and jumps to the
// next tabstop the selection in that tabstop is restored (at the
// next tabstop). This could be annoying since its not possible to
@@ -192,11 +192,11 @@ impl ActiveSnippet {
// hand it may be useful since the user may just have meant to edit
// a subselection (like with s) of the tabstops and so the selection
// removal was just temporary. Potentially this could have some sort of
// seperate keymap
// separate keymap
}
pub fn tabstop_selection(&self, primary_idx: usize, direction: Direction) -> Selection {
let tabstop = &self[self.active_tabstop];
let tabstop = &self[self.current_tabstop];
tabstop.selection(direction, primary_idx, self.ranges.len())
}
@@ -208,18 +208,18 @@ impl ActiveSnippet {
return ActiveSnippet::new(snippet);
}
let mut cnt = 0;
let parent = self[self.active_tabstop].parent;
let parent = self[self.current_tabstop].parent;
let tabstops = snippet.tabstops.into_iter().map(|mut tabstop| {
cnt += 1;
if let Some(parent) = &mut tabstop.parent {
parent.0 += self.active_tabstop.0;
parent.0 += self.current_tabstop.0;
} else {
tabstop.parent = parent;
}
tabstop
});
self.tabstops
.splice(self.active_tabstop.0..=self.active_tabstop.0, tabstops);
.splice(self.current_tabstop.0..=self.current_tabstop.0, tabstops);
self.activate_tabstop();
Some(self)
}

View File

@@ -1,6 +1,6 @@
/*!
A parser for LSP/VSCode style snippet syntax see
<https://microsoft.github.io/language-server-protocol/specifications/lsp/3.17/specification/#snippet_syntax>
A parser for LSP/VSCode style snippet syntax
See <https://microsoft.github.io/language-server-protocol/specifications/lsp/3.17/specification/#snippet_syntax>.
``` text
any ::= tabstop | placeholder | choice | variable | text
@@ -704,7 +704,7 @@ mod test {
}],
);
// invalid regex TODO: reneable tests once we actually parse this regex flavour
// invalid regex TODO: reneable tests once we actually parse this regex flavor
// assert_text(
// "${foo/([A-Z][a-z])/format/GMI}",
// "${foo/([A-Z][a-z])/format/GMI}",

View File

@@ -94,7 +94,8 @@ impl Snippet {
elaborate::TabstopKind::Choice { choices } => TabstopKind::Choice {
choices: choices.clone(),
},
// start out as empty the first non-empty placeholder will change this to a aplaceholder automatically
// start out as empty: the first non-empty placeholder will change this to
// a placeholder automatically
elaborate::TabstopKind::Empty
| elaborate::TabstopKind::Placeholder { .. } => TabstopKind::Empty,
elaborate::TabstopKind::Transform(transform) => {

View File

@@ -1,6 +1,7 @@
use std::{
fmt,
path::{Path, PathBuf},
str::FromStr,
sync::Arc,
};
@@ -16,14 +17,6 @@ pub enum Uri {
}
impl Uri {
// This clippy allow mirrors url::Url::from_file_path
#[allow(clippy::result_unit_err)]
pub fn to_url(&self) -> Result<url::Url, ()> {
match self {
Uri::File(path) => url::Url::from_file_path(path),
}
}
pub fn as_path(&self) -> Option<&Path> {
match self {
Self::File(path) => Some(path),
@@ -45,81 +38,96 @@ impl fmt::Display for Uri {
}
}
#[derive(Debug)]
pub struct UrlConversionError {
source: url::Url,
kind: UrlConversionErrorKind,
#[derive(Debug, Clone, PartialEq, Eq)]
pub struct UriParseError {
source: String,
kind: UriParseErrorKind,
}
#[derive(Debug)]
pub enum UrlConversionErrorKind {
UnsupportedScheme,
UnableToConvert,
#[derive(Debug, Clone, PartialEq, Eq)]
pub enum UriParseErrorKind {
UnsupportedScheme(String),
MalformedUri,
}
impl fmt::Display for UrlConversionError {
impl fmt::Display for UriParseError {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
match self.kind {
UrlConversionErrorKind::UnsupportedScheme => {
match &self.kind {
UriParseErrorKind::UnsupportedScheme(scheme) => {
write!(f, "unsupported scheme '{scheme}' in URI {}", self.source)
}
UriParseErrorKind::MalformedUri => {
write!(
f,
"unsupported scheme '{}' in URL {}",
self.source.scheme(),
"unable to convert malformed URI to file path: {}",
self.source
)
}
UrlConversionErrorKind::UnableToConvert => {
write!(f, "unable to convert URL to file path: {}", self.source)
}
}
}
}
impl std::error::Error for UrlConversionError {}
impl std::error::Error for UriParseError {}
fn convert_url_to_uri(url: &url::Url) -> Result<Uri, UrlConversionErrorKind> {
if url.scheme() == "file" {
url.to_file_path()
.map(|path| Uri::File(helix_stdx::path::normalize(path).into()))
.map_err(|_| UrlConversionErrorKind::UnableToConvert)
} else {
Err(UrlConversionErrorKind::UnsupportedScheme)
impl FromStr for Uri {
type Err = UriParseError;
fn from_str(s: &str) -> Result<Self, Self::Err> {
use std::ffi::OsStr;
#[cfg(any(unix, target_os = "redox"))]
use std::os::unix::prelude::OsStrExt;
#[cfg(target_os = "wasi")]
use std::os::wasi::prelude::OsStrExt;
let Some((scheme, rest)) = s.split_once("://") else {
return Err(Self::Err {
source: s.to_string(),
kind: UriParseErrorKind::MalformedUri,
});
};
if scheme != "file" {
return Err(Self::Err {
source: s.to_string(),
kind: UriParseErrorKind::UnsupportedScheme(scheme.to_string()),
});
}
// Assert there is no query or fragment in the URI.
if s.find(['?', '#']).is_some() {
return Err(Self::Err {
source: s.to_string(),
kind: UriParseErrorKind::MalformedUri,
});
}
let mut bytes = Vec::new();
bytes.extend(percent_encoding::percent_decode(rest.as_bytes()));
Ok(PathBuf::from(OsStr::from_bytes(&bytes)).into())
}
}
impl TryFrom<url::Url> for Uri {
type Error = UrlConversionError;
impl TryFrom<&str> for Uri {
type Error = UriParseError;
fn try_from(url: url::Url) -> Result<Self, Self::Error> {
convert_url_to_uri(&url).map_err(|kind| Self::Error { source: url, kind })
}
}
impl TryFrom<&url::Url> for Uri {
type Error = UrlConversionError;
fn try_from(url: &url::Url) -> Result<Self, Self::Error> {
convert_url_to_uri(url).map_err(|kind| Self::Error {
source: url.clone(),
kind,
})
fn try_from(s: &str) -> Result<Self, Self::Error> {
s.parse()
}
}
#[cfg(test)]
mod test {
use super::*;
use url::Url;
#[test]
fn unknown_scheme() {
let url = Url::parse("csharp:/metadata/foo/bar/Baz.cs").unwrap();
assert!(matches!(
Uri::try_from(url),
Err(UrlConversionError {
kind: UrlConversionErrorKind::UnsupportedScheme,
..
let uri = "csharp://metadata/foo/barBaz.cs";
assert_eq!(
uri.parse::<Uri>(),
Err(UriParseError {
source: uri.to_string(),
kind: UriParseErrorKind::UnsupportedScheme("csharp".to_string()),
})
));
);
}
}
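A quick usage sketch for the new parser (Unix-style paths assumed; percent-encoded bytes in the path are decoded):

```rust
use std::str::FromStr;

fn demo() -> Result<(), UriParseError> {
    let uri = Uri::from_str("file:///home/user/my%20project/main.rs")?;
    assert_eq!(
        uri.as_path().map(|p| p.display().to_string()).as_deref(),
        Some("/home/user/my project/main.rs")
    );
    // Query and fragment parts are rejected, as the test above shows
    // for unsupported schemes.
    assert!(Uri::from_str("file:///tmp/a.rs?line=1").is_err());
    Ok(())
}
```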

View File

@@ -1,176 +0,0 @@
# This file is automatically @generated by Cargo.
# It is not intended for manual editing.
version = 3
[[package]]
name = "bitflags"
version = "1.3.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "bef38d45163c2f1dde094a7dfd33ccf595c92905c8f8f4fdc18d06fb1037718a"
[[package]]
name = "form_urlencoded"
version = "1.1.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "a9c384f161156f5260c24a097c56119f9be8c798586aecc13afbcbe7b7e26bf8"
dependencies = [
"percent-encoding",
]
[[package]]
name = "idna"
version = "0.3.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "e14ddfc70884202db2244c223200c204c2bda1bc6e0998d11b5e024d657209e6"
dependencies = [
"unicode-bidi",
"unicode-normalization",
]
[[package]]
name = "itoa"
version = "1.0.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "4217ad341ebadf8d8e724e264f13e593e0648f5b3e94b3896a5df283be015ecc"
[[package]]
name = "lsp-types"
version = "0.95.1"
dependencies = [
"bitflags",
"serde",
"serde_json",
"serde_repr",
"url",
]
[[package]]
name = "percent-encoding"
version = "2.2.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "478c572c3d73181ff3c2539045f6eb99e5491218eae919370993b890cdbdd98e"
[[package]]
name = "proc-macro2"
version = "1.0.47"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "5ea3d908b0e36316caf9e9e2c4625cdde190a7e6f440d794667ed17a1855e725"
dependencies = [
"unicode-ident",
]
[[package]]
name = "quote"
version = "1.0.21"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "bbe448f377a7d6961e30f5955f9b8d106c3f5e449d493ee1b125c1d43c2b5179"
dependencies = [
"proc-macro2",
]
[[package]]
name = "ryu"
version = "1.0.11"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "4501abdff3ae82a1c1b477a17252eb69cee9e66eb915c1abaa4f44d873df9f09"
[[package]]
name = "serde"
version = "1.0.145"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "728eb6351430bccb993660dfffc5a72f91ccc1295abaa8ce19b27ebe4f75568b"
dependencies = [
"serde_derive",
]
[[package]]
name = "serde_derive"
version = "1.0.145"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "81fa1584d3d1bcacd84c277a0dfe21f5b0f6accf4a23d04d4c6d61f1af522b4c"
dependencies = [
"proc-macro2",
"quote",
"syn",
]
[[package]]
name = "serde_json"
version = "1.0.86"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "41feea4228a6f1cd09ec7a3593a682276702cd67b5273544757dae23c096f074"
dependencies = [
"itoa",
"ryu",
"serde",
]
[[package]]
name = "serde_repr"
version = "0.1.9"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "1fe39d9fbb0ebf5eb2c7cb7e2a47e4f462fad1379f1166b8ae49ad9eae89a7ca"
dependencies = [
"proc-macro2",
"quote",
"syn",
]
[[package]]
name = "syn"
version = "1.0.102"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "3fcd952facd492f9be3ef0d0b7032a6e442ee9b361d4acc2b1d0c4aaa5f613a1"
dependencies = [
"proc-macro2",
"quote",
"unicode-ident",
]
[[package]]
name = "tinyvec"
version = "1.6.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "87cc5ceb3875bb20c2890005a4e226a4651264a5c75edb2421b52861a0a0cb50"
dependencies = [
"tinyvec_macros",
]
[[package]]
name = "tinyvec_macros"
version = "0.1.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "cda74da7e1a664f795bb1f8a87ec406fb89a02522cf6e50620d016add6dbbf5c"
[[package]]
name = "unicode-bidi"
version = "0.3.8"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "099b7128301d285f79ddd55b9a83d5e6b9e97c92e0ea0daebee7263e932de992"
[[package]]
name = "unicode-ident"
version = "1.0.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "6ceab39d59e4c9499d4e5a8ee0e2735b891bb7308ac83dfb4e80cad195c9f6f3"
[[package]]
name = "unicode-normalization"
version = "0.1.22"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "5c5713f0fc4b5db668a2ac63cdb7bb4469d8c9fed047b1d0292cc7b0ce2ba921"
dependencies = [
"tinyvec",
]
[[package]]
name = "url"
version = "2.3.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "0d68c799ae75762b8c3fe375feb6600ef5602c883c5d21eb51c09f22b83c4643"
dependencies = [
"form_urlencoded",
"idna",
"percent-encoding",
"serde",
]

View File

@@ -22,10 +22,10 @@ license = "MIT"
[dependencies]
bitflags = "2.6.0"
serde = { version = "1.0.215", features = ["derive"] }
serde = { version = "1.0.216", features = ["derive"] }
serde_json = "1.0.133"
serde_repr = "0.1"
url = {version = "2.5.4", features = ["serde"]}
percent-encoding.workspace = true
[features]
default = []

View File

@@ -1,3 +1,5 @@
# Helix's `lsp-types`
This is a fork of the [`lsp-types`](https://crates.io/crates/lsp-types) crate ([`gluon-lang/lsp-types`](https://github.com/gluon-lang/lsp-types)) taken at version v0.95.1 (commit [3e6daee](https://github.com/gluon-lang/lsp-types/commit/3e6daee771d14db4094a554b8d03e29c310dfcbe)). This fork focuses usability improvements that make the types easier to work with for the Helix codebase. For example the URL type - the `uri` crate at this version of `lsp-types` - will be replaced with a wrapper around a string.
This is a fork of the [`lsp-types`](https://crates.io/crates/lsp-types) crate ([`gluon-lang/lsp-types`](https://github.com/gluon-lang/lsp-types)) taken at version v0.95.1 (commit [3e6daee](https://github.com/gluon-lang/lsp-types/commit/3e6daee771d14db4094a554b8d03e29c310dfcbe)). This fork focuses on usability improvements that make the types easier to work with for the Helix codebase.
The URL type has been replaced with a newtype wrapper of a `String`. The `lsp-types` crate at the forked version used [`url::Url`](https://docs.rs/url/2.5.0/url/struct.Url.html) which provides conveniences for using URLs according to [the WHATWG URL spec](https://url.spec.whatwg.org). Helix supports a subset of valid URLs, namely the `file://` scheme, so a wrapper around a normal `String` is sufficient. Plus the LSP spec requires URLs to be in [RFC3986](https://tools.ietf.org/html/rfc3986) format instead.
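Based on the `Url` implementation later in this compare view, the wrapper behaves like this (Unix paths assumed):

```rust
fn demo() {
    let url = Url::from_file_path("/tmp/src/main.rs");
    assert_eq!(url.as_str(), "file:///tmp/src/main.rs");

    // The directory constructor guarantees a trailing slash.
    let dir = Url::from_directory_path("/tmp/src");
    assert_eq!(dir.as_str(), "file:///tmp/src/");
}
```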

View File

@@ -1,10 +1,9 @@
use serde::{Deserialize, Serialize};
use serde_json::Value;
use url::Url;
use crate::{
DynamicRegistrationClientCapabilities, PartialResultParams, Range, SymbolKind, SymbolTag,
TextDocumentPositionParams, WorkDoneProgressOptions, WorkDoneProgressParams,
TextDocumentPositionParams, Url, WorkDoneProgressOptions, WorkDoneProgressParams,
};
pub type CallHierarchyClientCapabilities = DynamicRegistrationClientCapabilities;

View File

@@ -1,11 +1,10 @@
use std::collections::HashMap;
use serde::{Deserialize, Serialize};
use url::Url;
use crate::{
Diagnostic, PartialResultParams, StaticRegistrationOptions, TextDocumentIdentifier,
TextDocumentRegistrationOptions, WorkDoneProgressOptions, WorkDoneProgressParams,
TextDocumentRegistrationOptions, Url, WorkDoneProgressOptions, WorkDoneProgressParams,
};
/// Client capabilities specific to diagnostic pull requests.

View File

@@ -1,10 +1,9 @@
use crate::{
PartialResultParams, Range, TextDocumentIdentifier, WorkDoneProgressOptions,
PartialResultParams, Range, TextDocumentIdentifier, Url, WorkDoneProgressOptions,
WorkDoneProgressParams,
};
use serde::{Deserialize, Serialize};
use serde_json::Value;
use url::Url;
#[derive(Debug, Eq, PartialEq, Clone, Deserialize, Serialize)]
#[serde(rename_all = "camelCase")]

View File

@@ -3,27 +3,151 @@
Language Server Protocol types for Rust.
Based on: <https://microsoft.github.io/language-server-protocol/specification>
This library uses the URL crate for parsing URIs. Note that there is
some confusion on the meaning of URLs vs URIs:
<http://stackoverflow.com/a/28865728/393898>. According to that
information, on the classical sense of "URLs", "URLs" are a subset of
URIs, But on the modern/new meaning of URLs, they are the same as
URIs. The important take-away aspect is that the URL crate should be
able to parse any URI, such as `urn:isbn:0451450523`.
*/
#![allow(non_upper_case_globals)]
#![forbid(unsafe_code)]
#[macro_use]
extern crate bitflags;
use std::{collections::HashMap, fmt::Debug};
use bitflags::bitflags;
use std::{collections::HashMap, fmt::Debug, path::Path};
use serde::{de, de::Error as Error_, Deserialize, Serialize};
use serde_json::Value;
pub use url::Url;
#[derive(Debug, Clone, PartialEq, Eq, PartialOrd, Ord, Hash, Deserialize, Serialize)]
pub struct Url(String);
// <https://datatracker.ietf.org/doc/html/rfc3986#section-2.2>, also see
// <https://github.com/microsoft/vscode-uri/blob/6dec22d7dcc6c63c30343d3a8d56050d0078cb6a/src/uri.ts#L454-L477>
const RESERVED: &percent_encoding::AsciiSet = &percent_encoding::CONTROLS
// GEN_DELIMS
.add(b':')
.add(b'/')
.add(b'?')
.add(b'#')
.add(b'[')
.add(b']')
.add(b'@')
// SUB_DELIMS
.add(b'!')
.add(b'$')
.add(b'&')
.add(b'\'')
.add(b'(')
.add(b')')
.add(b'*')
.add(b'+')
.add(b',')
.add(b';')
.add(b'=');
impl Url {
#[cfg(any(unix, target_os = "redox", target_os = "wasi"))]
pub fn from_file_path<P: AsRef<Path>>(path: P) -> Self {
#[cfg(any(unix, target_os = "redox"))]
use std::os::unix::prelude::OsStrExt;
#[cfg(target_os = "wasi")]
use std::os::wasi::prelude::OsStrExt;
let mut serialization = String::from("file://");
// skip the root component
for component in path.as_ref().components().skip(1) {
serialization.push('/');
serialization.extend(percent_encoding::percent_encode(
component.as_os_str().as_bytes(),
RESERVED,
));
}
if &serialization == "file://" {
// An URL's path must not be empty.
serialization.push('/');
}
Self(serialization)
}
#[cfg(windows)]
pub fn from_file_path<P: AsRef<Path>>(path: P) -> Self {
from_file_path_windows(path.as_ref())
}
#[cfg_attr(not(windows), allow(dead_code))]
fn from_file_path_windows(path: &Path) -> Self {
use std::path::{Component, Prefix};
fn is_windows_drive_letter(segment: &str) -> bool {
segment.len() == 2
&& (segment.as_bytes()[0] as char).is_ascii_alphabetic()
&& matches!(segment.as_bytes()[1], b':' | b'|')
}
assert!(path.is_absolute());
let mut serialization = String::from("file://");
let mut components = path.components();
let host_start = serialization.len() + 1;
match components.next() {
Some(Component::Prefix(ref p)) => match p.kind() {
Prefix::Disk(letter) | Prefix::VerbatimDisk(letter) => {
serialization.push('/');
serialization.push(letter as char);
serialization.push(':');
}
// TODO: Prefix::UNC | Prefix::VerbatimUNC
_ => todo!("support UNC drives"),
},
_ => unreachable!("absolute windows paths must start with a prefix"),
}
let mut path_only_has_prefix = true;
for component in components {
if component == Component::RootDir {
continue;
}
path_only_has_prefix = false;
serialization.push('/');
serialization.extend(percent_encoding::percent_encode(
component.as_os_str().as_encoded_bytes(),
RESERVED,
));
}
if serialization.len() > host_start
&& is_windows_drive_letter(&serialization[host_start..])
&& path_only_has_prefix
{
serialization.push('/');
}
Self(serialization)
}
pub fn from_directory_path<P: AsRef<Path>>(path: P) -> Self {
let Self(mut serialization) = Self::from_file_path(path);
if !serialization.ends_with('/') {
serialization.push('/');
}
Self(serialization)
}
/// Returns the serialized representation of the URL as a `&str`
pub fn as_str(&self) -> &str {
&self.0
}
/// Consumes the URL, converting into a `String`.
/// Note that the string is the serialized representation of the URL.
pub fn into_string(self) -> String {
self.0
}
}
impl From<&str> for Url {
fn from(value: &str) -> Self {
Self(value.to_string())
}
}
// Large enough to contain any enumeration name defined in this crate
type PascalCaseBuf = [u8; 32];
@@ -2843,14 +2967,14 @@ mod tests {
test_serialization(
&WorkspaceEdit {
changes: Some(
vec![(Url::parse("file://test").unwrap(), vec![])]
vec![(Url::from("file://test"), vec![])]
.into_iter()
.collect(),
),
document_changes: None,
..Default::default()
},
r#"{"changes":{"file://test/":[]}}"#,
r#"{"changes":{"file://test":[]}}"#,
);
}

View File

@@ -4,9 +4,7 @@ use serde::{Deserialize, Serialize};
use serde_json::Value;
use url::Url;
use crate::Range;
use crate::{Range, Url};
#[derive(Eq, PartialEq, Clone, Copy, Deserialize, Serialize)]
#[serde(transparent)]

View File

@@ -1,8 +1,7 @@
use serde::{Deserialize, Serialize};
use url::Url;
use crate::{
FullDocumentDiagnosticReport, PartialResultParams, UnchangedDocumentDiagnosticReport,
FullDocumentDiagnosticReport, PartialResultParams, UnchangedDocumentDiagnosticReport, Url,
WorkDoneProgressParams,
};

View File

@@ -1,7 +1,6 @@
use serde::{Deserialize, Serialize};
use url::Url;
use crate::OneOf;
use crate::{OneOf, Url};
#[derive(Debug, Eq, PartialEq, Clone, Default, Deserialize, Serialize)]
#[serde(rename_all = "camelCase")]

View File

@@ -26,8 +26,8 @@ globset = "0.4.15"
log = "0.4"
serde = { version = "1.0", features = ["derive"] }
serde_json = "1.0"
tokio = { version = "1.41", features = ["rt", "rt-multi-thread", "io-util", "io-std", "time", "process", "macros", "fs", "parking_lot", "sync"] }
tokio-stream = "0.1.15"
tokio = { version = "1.42", features = ["rt", "rt-multi-thread", "io-util", "io-std", "time", "process", "macros", "fs", "parking_lot", "sync"] }
tokio-stream = "0.1.17"
parking_lot = "0.12.3"
arc-swap = "1"
slotmap.workspace = true

View File

@@ -32,14 +32,17 @@ use tokio::{
},
};
fn workspace_for_uri(uri: lsp::Url) -> WorkspaceFolder {
fn workspace_for_path(path: &Path) -> WorkspaceFolder {
let name = path
.iter()
.last()
.expect("workspace paths should be non-empty")
.to_string_lossy()
.to_string();
lsp::WorkspaceFolder {
name: uri
.path_segments()
.and_then(|segments| segments.last())
.map(|basename| basename.to_string())
.unwrap_or_default(),
uri,
name,
uri: lsp::Url::from_directory_path(path),
}
}
@@ -55,7 +58,7 @@ pub struct Client {
config: Option<Value>,
root_path: std::path::PathBuf,
root_uri: Option<lsp::Url>,
workspace_folders: Mutex<Vec<lsp::WorkspaceFolder>>,
workspace_folders: Mutex<Vec<PathBuf>>,
initialize_notify: Arc<Notify>,
/// workspace folders added while the server is still initializing
req_timeout: u64,
@@ -80,16 +83,13 @@ impl Client {
&workspace,
workspace_is_cwd,
);
let root_uri = root
.as_ref()
.and_then(|root| lsp::Url::from_file_path(root).ok());
if self.root_path == root.unwrap_or(workspace)
|| root_uri.as_ref().map_or(false, |root_uri| {
if &self.root_path == root.as_ref().unwrap_or(&workspace)
|| root.as_ref().is_some_and(|root| {
self.workspace_folders
.lock()
.iter()
.any(|workspace| &workspace.uri == root_uri)
.any(|workspace| workspace == root)
})
{
// workspace URI is already registered so we can use this client
@@ -113,15 +113,16 @@ impl Client {
// wait and see if anyone ever runs into it.
tokio::spawn(async move {
client.initialize_notify.notified().await;
if let Some(workspace_folders_caps) = client
if let Some((workspace_folders_caps, root)) = client
.capabilities()
.workspace
.as_ref()
.and_then(|cap| cap.workspace_folders.as_ref())
.filter(|cap| cap.supported.unwrap_or(false))
.zip(root)
{
client.add_workspace_folder(
root_uri,
root,
workspace_folders_caps.change_notifications.as_ref(),
);
}
@@ -129,16 +130,14 @@ impl Client {
return true;
};
if let Some(workspace_folders_caps) = capabilities
if let Some((workspace_folders_caps, root)) = capabilities
.workspace
.as_ref()
.and_then(|cap| cap.workspace_folders.as_ref())
.filter(|cap| cap.supported.unwrap_or(false))
.zip(root)
{
self.add_workspace_folder(
root_uri,
workspace_folders_caps.change_notifications.as_ref(),
);
self.add_workspace_folder(root, workspace_folders_caps.change_notifications.as_ref());
true
} else {
// the server doesn't support multi workspaces, we need a new client
@@ -148,29 +147,19 @@ impl Client {
fn add_workspace_folder(
&self,
root_uri: Option<lsp::Url>,
root: PathBuf,
change_notifications: Option<&OneOf<bool, String>>,
) {
// root_uri is None just means that there isn't really any LSP workspace
// associated with this file. For servers that support multiple workspaces
// there is just one server so we can always just use that shared instance.
// No need to add a new workspace root here as there is no logical root for this file
// let the server deal with this
let Some(root_uri) = root_uri else {
return;
};
let workspace = workspace_for_path(&root);
// server supports workspace folders, let's add the new root to the list
self.workspace_folders
.lock()
.push(workspace_for_uri(root_uri.clone()));
self.workspace_folders.lock().push(root);
if Some(&OneOf::Left(false)) == change_notifications {
// server specifically opted out of DidWorkspaceChange notifications
// let's assume the server will request the workspace folders itself
// and that we can therefore reuse the client (but are done now)
return;
}
tokio::spawn(self.did_change_workspace(vec![workspace_for_uri(root_uri)], Vec::new()));
tokio::spawn(self.did_change_workspace(vec![workspace], Vec::new()));
}
#[allow(clippy::type_complexity, clippy::too_many_arguments)]
@@ -179,8 +168,8 @@ impl Client {
args: &[String],
config: Option<Value>,
server_environment: HashMap<String, String>,
root_path: PathBuf,
root_uri: Option<lsp::Url>,
root: Option<PathBuf>,
workspace: PathBuf,
id: LanguageServerId,
name: String,
req_timeout: u64,
@@ -212,10 +201,11 @@ impl Client {
let (server_rx, server_tx, initialize_notify) =
Transport::start(reader, writer, stderr, id, name.clone());
let workspace_folders = root_uri
.clone()
.map(|root| vec![workspace_for_uri(root)])
.unwrap_or_default();
let workspace_folders = root.clone().into_iter().collect();
let root_uri = root.clone().map(lsp::Url::from_file_path);
// `root_uri` and `workspace_folder` can be empty in case there is no workspace
// `root_url` can not, use `workspace` as a fallback
let root_path = root.unwrap_or(workspace);
let client = Self {
id,
@@ -376,10 +366,12 @@ impl Client {
self.config.as_ref()
}
pub async fn workspace_folders(
&self,
) -> parking_lot::MutexGuard<'_, Vec<lsp::WorkspaceFolder>> {
self.workspace_folders.lock()
pub async fn workspace_folders(&self) -> Vec<lsp::WorkspaceFolder> {
self.workspace_folders
.lock()
.iter()
.map(|path| workspace_for_path(path))
.collect()
}
/// Execute a RPC request on the language server.
@@ -426,32 +418,29 @@ impl Client {
let server_tx = self.server_tx.clone();
let id = self.next_request_id();
// it' important this is not part of the future so that it gets
// executed right away so that the request order stays concisents
let rx = serde_json::to_value(params)
.map_err(Error::from)
.and_then(|params| {
let request = jsonrpc::MethodCall {
jsonrpc: Some(jsonrpc::Version::V2),
id: id.clone(),
method: R::METHOD.to_string(),
params: Self::value_into_params(params),
};
let (tx, rx) = channel::<Result<Value>>(1);
server_tx
.send(Payload::Request {
chan: tx,
value: request,
})
.map_err(|e| Error::Other(e.into()))?;
Ok(rx)
});
let params = serde_json::to_value(params);
async move {
use std::time::Duration;
use tokio::time::timeout;
let request = jsonrpc::MethodCall {
jsonrpc: Some(jsonrpc::Version::V2),
id: id.clone(),
method: R::METHOD.to_string(),
params: Self::value_into_params(params?),
};
let (tx, mut rx) = channel::<Result<Value>>(1);
server_tx
.send(Payload::Request {
chan: tx,
value: request,
})
.map_err(|e| Error::Other(e.into()))?;
// TODO: delay other calls until initialize success
timeout(Duration::from_secs(timeout_secs), rx?.recv())
timeout(Duration::from_secs(timeout_secs), rx.recv())
.await
.map_err(|_| Error::Timeout(id))? // return Timeout
.ok_or(Error::StreamClosed)?
@@ -468,25 +457,21 @@ impl Client {
{
let server_tx = self.server_tx.clone();
// it' important this is not part of the future so that it gets
// executed right away so that the request order stays consisents
let res = serde_json::to_value(params)
.map_err(Error::from)
.and_then(|params| {
let params = serde_json::to_value(params)?;
async move {
let params = serde_json::to_value(params)?;
let notification = jsonrpc::Notification {
jsonrpc: Some(jsonrpc::Version::V2),
method: R::METHOD.to_string(),
params: Self::value_into_params(params),
};
server_tx
.send(Payload::Notification(notification))
.map_err(|e| Error::Other(e.into()))
});
// TODO: this function is not async and never should have been
// but turning it into non-async function is a big refactor
async move { res }
let notification = jsonrpc::Notification {
jsonrpc: Some(jsonrpc::Version::V2),
method: R::METHOD.to_string(),
params: Self::value_into_params(params),
};
server_tx
.send(Payload::Notification(notification))
.map_err(|e| Error::Other(e.into()))?;
Ok(())
}
}
/// Reply to a language server RPC call.
@@ -499,27 +484,26 @@ impl Client {
let server_tx = self.server_tx.clone();
let output = match result {
Ok(result) => serde_json::to_value(result).map(|result| {
Output::Success(Success {
async move {
let output = match result {
Ok(result) => Output::Success(Success {
jsonrpc: Some(Version::V2),
id,
result,
})
}),
Err(error) => Ok(Output::Failure(Failure {
jsonrpc: Some(Version::V2),
id,
error,
})),
};
result: serde_json::to_value(result)?,
}),
Err(error) => Output::Failure(Failure {
jsonrpc: Some(Version::V2),
id,
error,
}),
};
let res = output.map_err(Error::from).and_then(|output| {
server_tx
.send(Payload::Response(output))
.map_err(|e| Error::Other(e.into()))
});
async move { res }
.map_err(|e| Error::Other(e.into()))?;
Ok(())
}
}
// -------------------------------------------------------------------------------------------
@@ -534,7 +518,7 @@ impl Client {
#[allow(deprecated)]
let params = lsp::InitializeParams {
process_id: Some(std::process::id()),
workspace_folders: Some(self.workspace_folders.lock().clone()),
workspace_folders: Some(self.workspace_folders().await),
// root_path is obsolete, but some clients like pyright still use it so we specify both.
// clients will prefer _uri if possible
root_path: self.root_path.to_str().map(|path| path.to_owned()),
@@ -756,11 +740,11 @@ impl Client {
} else {
Url::from_file_path(path)
};
Some(url.ok()?.to_string())
url.into_string()
};
let files = vec![lsp::FileRename {
old_uri: url_from_path(old_path)?,
new_uri: url_from_path(new_path)?,
old_uri: url_from_path(old_path),
new_uri: url_from_path(new_path),
}];
let request = self.call_with_timeout::<lsp::request::WillRenameFiles>(
&lsp::RenameFilesParams { files },
@@ -790,12 +774,12 @@ impl Client {
} else {
Url::from_file_path(path)
};
Some(url.ok()?.to_string())
url.into_string()
};
let files = vec![lsp::FileRename {
old_uri: url_from_path(old_path)?,
new_uri: url_from_path(new_path)?,
old_uri: url_from_path(old_path),
new_uri: url_from_path(new_path),
}];
Some(self.notify::<lsp::notification::DidRenameFiles>(lsp::RenameFilesParams { files }))
}

View File
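With the String-backed `Url` from this series, `from_file_path` no longer returns a `Result`, which is why the rename helpers above drop their `Option`/`?` plumbing. A hedged sketch of the wrapper's relevant surface (hypothetical; a real implementation must also percent-encode):

pub struct Url(String);

impl Url {
    pub fn from_file_path(path: impl AsRef<std::path::Path>) -> Self {
        // percent-encoding and Windows drive handling elided for brevity
        Url(format!("file://{}", path.as_ref().display()))
    }

    pub fn into_string(self) -> String {
        self.0
    }
}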

@@ -106,9 +106,7 @@ impl Handler {
log::warn!("LSP client was dropped: {id}");
return false;
};
let Ok(uri) = lsp::Url::from_file_path(&path) else {
return true;
};
let uri = lsp::Url::from_file_path(&path);
log::debug!(
"Sending didChangeWatchedFiles notification to client '{}'",
client.name()

View File

@@ -853,12 +853,8 @@ fn start_client(
workspace_is_cwd,
);
// `root_uri` and `workspace_folders` can be empty in case there is no workspace
// `root_path` can not, so use `workspace` as a fallback
let root_path = root.clone().unwrap_or_else(|| workspace.clone());
let root_uri = root.and_then(|root| lsp::Url::from_file_path(root).ok());
if let Some(globset) = &ls_config.required_root_patterns {
let root_path = root.as_ref().unwrap_or(&workspace);
if !root_path
.read_dir()?
.flatten()
@@ -874,8 +870,8 @@ fn start_client(
&ls_config.args,
ls_config.config.clone(),
ls_config.environment.clone(),
root_path,
root_uri,
root,
workspace,
id,
name,
ls_config.timeout,

View File

@@ -31,8 +31,8 @@ impl<T> RangeBounds<T> for Range<T> {
/// `sub_set.all(|rb| super_set.any(|ra| ra.contains(rb)))` that runs in O(m+n)
/// instead of O(mn) (and in many cases faster).
///
/// Both iterators must uphold a the follwong invariants:
/// * ranges must not overlap (but they can be adjecent)
/// Both iterators must uphold the following invariants:
/// * ranges must not overlap (but they can be adjacent)
/// * ranges must be sorted
pub fn is_subset<const ALLOW_EMPTY: bool>(
mut super_set: impl Iterator<Item = Range>,

View File

@@ -74,7 +74,7 @@ grep-searcher = "0.1.14"
[target.'cfg(not(windows))'.dependencies] # https://github.com/vorner/signal-hook/issues/100
signal-hook-tokio = { version = "0.3", features = ["futures-v0_3"] }
libc = "0.2.167"
libc = "0.2.168"
[target.'cfg(target_os = "macos")'.dependencies]
crossterm = { version = "0.28", features = ["event-stream", "use-dev-tty", "libc"] }

View File

@@ -175,7 +175,7 @@ impl Application {
nr_of_files += 1;
if file.is_dir() {
return Err(anyhow::anyhow!(
"expected a path to file, found a directory. (to open a directory pass it as first argument)"
"expected a path to file, but found a directory: {file:?}. (to open a directory pass it as first argument)"
));
} else {
// If the user passes in either `--vsplit` or
@@ -189,6 +189,7 @@ impl Application {
Some(Layout::Horizontal) => Action::HorizontalSplit,
None => Action::Load,
};
let old_id = editor.document_id_by_path(&file);
let doc_id = match editor.open(&file, action) {
// Ignore irregular files during application init.
Err(DocumentOpenError::IrregularFile) => {
@@ -196,6 +197,11 @@ impl Application {
continue;
}
Err(err) => return Err(anyhow::anyhow!(err)),
// We can't open more than one buffer for one file; in this case we have already opened this file previously
Ok(doc_id) if old_id == Some(doc_id) => {
nr_of_files -= 1;
doc_id
}
Ok(doc_id) => doc_id,
};
// with Action::Load all documents have the same view
@@ -738,7 +744,7 @@ impl Application {
}
}
Notification::PublishDiagnostics(mut params) => {
let uri = match helix_core::Uri::try_from(params.uri) {
let uri = match helix_core::Uri::try_from(params.uri.as_str()) {
Ok(uri) => uri,
Err(err) => {
log::error!("{err}");
@@ -1137,7 +1143,8 @@ impl Application {
..
} = params
{
self.jobs.callback(crate::open_external_url_callback(uri));
self.jobs
.callback(crate::open_external_url_callback(uri.as_str()));
return lsp::ShowDocumentResult { success: true };
};
@@ -1148,7 +1155,7 @@ impl Application {
..
} = params;
let uri = match helix_core::Uri::try_from(uri) {
let uri = match helix_core::Uri::try_from(uri.as_str()) {
Ok(uri) => uri,
Err(err) => {
log::error!("{err}");

View File

@@ -129,7 +129,7 @@ pub(crate) fn parse_file(s: &str) -> (PathBuf, Position) {
///
/// Does not validate if file.rs is a file or directory.
fn split_path_row_col(s: &str) -> Option<(PathBuf, Position)> {
let mut s = s.rsplitn(3, ':');
let mut s = s.trim_end_matches(':').rsplitn(3, ':');
let col: usize = s.next()?.parse().ok()?;
let row: usize = s.next()?.parse().ok()?;
let path = s.next()?.into();
@@ -141,7 +141,7 @@ fn split_path_row_col(s: &str) -> Option<(PathBuf, Position)> {
///
/// Does not validate if file.rs is a file or directory.
fn split_path_row(s: &str) -> Option<(PathBuf, Position)> {
let (path, row) = s.rsplit_once(':')?;
let (path, row) = s.trim_end_matches(':').rsplit_once(':')?;
let row: usize = row.parse().ok()?;
let path = path.into();
let pos = Position::new(row.saturating_sub(1), 0);

View File
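The net effect of the two `trim_end_matches(':')` calls above: locations copied from `git grep -n` output, which end in a colon, now parse cleanly. A self-contained illustration of the row/column case:

fn parse_row_col(s: &str) -> Option<(&str, usize, usize)> {
    // trim the trailing colon, then split off column and row from the right
    let mut parts = s.trim_end_matches(':').rsplitn(3, ':');
    let col: usize = parts.next()?.parse().ok()?;
    let row: usize = parts.next()?.parse().ok()?;
    Some((parts.next()?, row, col))
}

fn main() {
    // "src/main.rs:12:3:" is typical `git grep -n --column` output
    assert_eq!(parse_row_col("src/main.rs:12:3:"), Some(("src/main.rs", 12, 3)));
    assert_eq!(parse_row_col("src/main.rs:12:3"), Some(("src/main.rs", 12, 3)));
}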

@@ -1347,7 +1347,9 @@ fn open_url(cx: &mut Context, url: Url, action: Action) {
.unwrap_or_default();
if url.scheme() != "file" {
return cx.jobs.callback(crate::open_external_url_callback(url));
return cx
.jobs
.callback(crate::open_external_url_callback(url.as_str()));
}
let content_type = std::fs::File::open(url.path()).and_then(|file| {
@@ -1360,9 +1362,9 @@ fn open_url(cx: &mut Context, url: Url, action: Action) {
// we attempt to open binary files - files that can't be open in helix - using external
// program as well, e.g. pdf files or images
match content_type {
Ok(content_inspector::ContentType::BINARY) => {
cx.jobs.callback(crate::open_external_url_callback(url))
}
Ok(content_inspector::ContentType::BINARY) => cx
.jobs
.callback(crate::open_external_url_callback(url.as_str())),
Ok(_) | Err(_) => {
let path = &rel_path.join(url.path());
if path.is_dir() {
@@ -2183,7 +2185,7 @@ fn searcher(cx: &mut Context, direction: Direction) {
completions
.iter()
.filter(|comp| comp.starts_with(input))
.map(|comp| (0.., std::borrow::Cow::Owned(comp.clone())))
.map(|comp| (0.., comp.clone().into()))
.collect()
},
move |cx, regex, event| {
@@ -3477,40 +3479,42 @@ fn open(cx: &mut Context, open: Open) {
let selection = doc.selection(view.id);
let mut ranges = SmallVec::with_capacity(selection.len());
let mut offs = 0;
let mut transaction = Transaction::change_by_selection(contents, selection, |range| {
let cursor_line = text.char_to_line(match open {
// the line number where the cursor is currently located
let curr_line_num = text.char_to_line(match open {
Open::Below => graphemes::prev_grapheme_boundary(text, range.to()),
Open::Above => range.from(),
});
let new_line = match open {
// adjust position to the end of the line (next line - 1)
Open::Below => cursor_line + 1,
// adjust position to the end of the previous line (current line - 1)
Open::Above => cursor_line,
// the next line number, where the cursor will be after finishing the transaction
let next_new_line_num = match open {
Open::Below => curr_line_num + 1,
Open::Above => curr_line_num,
};
let line_num = new_line.saturating_sub(1);
let above_next_new_line_num = next_new_line_num.saturating_sub(1);
let continue_comment_token = if doc.config.load().continue_comments {
doc.language_config()
.and_then(|config| config.comment_tokens.as_ref())
.and_then(|tokens| comment::get_comment_token(text, tokens, curr_line_num))
} else {
None
};
// Index to insert newlines after, as well as the char width
// to use to compensate for those inserted newlines.
let (line_end_index, line_end_offset_width) = if new_line == 0 {
let (above_next_line_end_index, above_next_line_end_width) = if next_new_line_num == 0 {
(0, 0)
} else {
(
line_end_char_index(&text, line_num),
line_end_char_index(&text, above_next_new_line_num),
doc.line_ending.len_chars(),
)
};
let continue_comment_token = doc
.language_config()
.and_then(|config| config.comment_tokens.as_ref())
.and_then(|tokens| comment::get_comment_token(text, tokens, cursor_line));
let line = text.line(cursor_line);
let line = text.line(curr_line_num);
let indent = match line.first_non_whitespace_char() {
Some(pos) if continue_comment_token.is_some() => line.slice(..pos).to_string(),
_ => indent::indent_for_newline(
@@ -3520,26 +3524,36 @@ fn open(cx: &mut Context, open: Open) {
&doc.indent_style,
doc.tab_width(),
text,
line_num,
line_end_index,
cursor_line,
above_next_new_line_num,
above_next_line_end_index,
curr_line_num,
),
};
let indent_len = indent.len();
let mut text = String::with_capacity(1 + indent_len);
text.push_str(doc.line_ending.as_str());
text.push_str(&indent);
if let Some(token) = continue_comment_token {
text.push_str(token);
text.push(' ');
if open == Open::Above && next_new_line_num == 0 {
text.push_str(&indent);
if let Some(token) = continue_comment_token {
text.push_str(token);
text.push(' ');
}
text.push_str(doc.line_ending.as_str());
} else {
text.push_str(doc.line_ending.as_str());
text.push_str(&indent);
if let Some(token) = continue_comment_token {
text.push_str(token);
text.push(' ');
}
}
let text = text.repeat(count);
// calculate new selection ranges
let pos = offs + line_end_index + line_end_offset_width;
let pos = above_next_line_end_index + above_next_line_end_width;
let comment_len = continue_comment_token
.map(|token| token.len() + 1) // `+ 1` for the extra space added
.unwrap_or_default();
@@ -3552,9 +3566,11 @@ fn open(cx: &mut Context, open: Open) {
));
}
offs += text.chars().count();
(line_end_index, line_end_index, Some(text.into()))
(
above_next_line_end_index,
above_next_line_end_index,
Some(text.into()),
)
});
transaction = transaction.with_selection(Selection::new(ranges, selection.primary_index()));
@@ -3988,10 +4004,13 @@ pub mod insert {
let mut new_text = String::new();
let continue_comment_token = doc
.language_config()
.and_then(|config| config.comment_tokens.as_ref())
.and_then(|tokens| comment::get_comment_token(text, tokens, current_line));
let continue_comment_token = if doc.config.load().continue_comments {
doc.language_config()
.and_then(|config| config.comment_tokens.as_ref())
.and_then(|tokens| comment::get_comment_token(text, tokens, current_line))
} else {
None
};
let (from, to, local_offs) = if let Some(idx) =
text.slice(line_start..pos).last_non_whitespace_char()

View File

@@ -69,7 +69,7 @@ struct Location {
}
fn lsp_location_to_location(location: lsp::Location) -> Option<Location> {
let uri = match location.uri.try_into() {
let uri = match location.uri.as_str().try_into() {
Ok(uri) => uri,
Err(err) => {
log::warn!("discarding invalid or unsupported URI: {err}");
@@ -456,7 +456,7 @@ pub fn workspace_symbol_picker(cx: &mut Context) {
.unwrap_or_default()
.into_iter()
.filter_map(|symbol| {
let uri = match Uri::try_from(&symbol.location.uri) {
let uri = match Uri::try_from(symbol.location.uri.as_str()) {
Ok(uri) => uri,
Err(err) => {
log::warn!("discarding symbol with invalid URI: {err}");
@@ -510,7 +510,7 @@ pub fn workspace_symbol_picker(cx: &mut Context) {
.to_string()
.into()
} else {
item.symbol.location.uri.to_string().into()
item.symbol.location.uri.as_str().into()
}
}),
];

View File

@@ -2507,7 +2507,8 @@ fn read(cx: &mut compositor::Context, args: &[Cow<str>], event: PromptEvent) ->
ensure!(args.len() == 1, "only the file name is expected");
let filename = args.first().unwrap();
let path = PathBuf::from(filename.to_string());
let path = helix_stdx::path::expand_tilde(PathBuf::from(filename.to_string()));
ensure!(
path.exists() && path.is_file(),
"path is not a file: {:?}",
@@ -3197,8 +3198,8 @@ pub(super) fn command_mode(cx: &mut Context) {
{
completer(editor, word)
.into_iter()
.map(|(range, file)| {
let file = shellwords::escape(file);
.map(|(range, mut file)| {
file.content = shellwords::escape(file.content);
// offset ranges to input
let offset = input.len() - word_len;

View File
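Two changes above: `:read` now expands a leading tilde before checking the path, and command completion escapes the candidate's `content` field now that completions carry styled spans. A minimal stand-in for what tilde expansion does (the real helper lives in `helix_stdx::path` and is more careful):

use std::path::PathBuf;

fn expand_tilde(path: PathBuf) -> PathBuf {
    if let Ok(rest) = path.strip_prefix("~") {
        if let Some(home) = std::env::var_os("HOME") {
            return PathBuf::from(home).join(rest);
        }
    }
    path
}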

@@ -9,6 +9,7 @@ use crate::handlers::auto_save::AutoSaveHandler;
use crate::handlers::completion::CompletionHandler;
use crate::handlers::signature_help::SignatureHelpHandler;
pub use completion::trigger_auto_completion;
pub use helix_view::handlers::Handlers;
mod auto_save;

View File

@@ -1,86 +1,307 @@
use std::collections::HashMap;
use std::collections::HashSet;
use std::sync::Arc;
use std::time::Duration;
use anyhow::Result;
use arc_swap::ArcSwap;
use futures_util::stream::FuturesUnordered;
use futures_util::FutureExt;
use helix_core::chars::char_is_word;
use helix_core::completion::CompletionProvider;
use helix_core::syntax::LanguageServerFeature;
use helix_event::{register_hook, send_blocking, TaskHandle};
use helix_event::{cancelable_future, register_hook, send_blocking, TaskController, TaskHandle};
use helix_lsp::lsp;
use helix_lsp::util::pos_to_lsp_pos;
use helix_stdx::rope::RopeSliceExt;
use helix_view::document::{Mode, SavePoint};
use helix_view::handlers::lsp::CompletionEvent;
use helix_view::Editor;
use helix_view::{DocumentId, Editor, ViewId};
use path::path_completion;
use tokio::sync::mpsc::Sender;
use tokio::task::JoinSet;
use tokio::time::Instant;
use tokio_stream::StreamExt as _;
use crate::commands;
use crate::compositor::Compositor;
use crate::config::Config;
use crate::events::{OnModeSwitch, PostCommand, PostInsertChar};
use crate::handlers::completion::request::{request_incomplete_completion_list, Trigger};
use crate::job::dispatch;
use crate::job::{dispatch, dispatch_blocking};
use crate::keymap::MappableCommand;
use crate::ui::editor::InsertEvent;
use crate::ui::lsp::SignatureHelp;
use crate::ui::{self, Popup};
use super::Handlers;
pub use item::{CompletionItem, CompletionItems, CompletionResponse, LspCompletionItem};
pub use request::CompletionHandler;
pub use item::{CompletionItem, LspCompletionItem};
pub use resolve::ResolveHandler;
mod item;
mod path;
mod request;
mod resolve;
async fn handle_response(
requests: &mut JoinSet<CompletionResponse>,
incomplete: bool,
) -> Option<CompletionResponse> {
loop {
let response = requests.join_next().await?.unwrap();
if !incomplete && !response.incomplete && response.items.is_empty() {
continue;
#[derive(Debug, PartialEq, Eq, Clone, Copy)]
enum TriggerKind {
Auto,
TriggerChar,
Manual,
}
#[derive(Debug, Clone, Copy)]
struct Trigger {
pos: usize,
view: ViewId,
doc: DocumentId,
kind: TriggerKind,
}
#[derive(Debug)]
pub(super) struct CompletionHandler {
/// currently active trigger which will cause a
/// completion request after the timeout
trigger: Option<Trigger>,
in_flight: Option<Trigger>,
task_controller: TaskController,
config: Arc<ArcSwap<Config>>,
}
impl CompletionHandler {
pub fn new(config: Arc<ArcSwap<Config>>) -> CompletionHandler {
Self {
config,
task_controller: TaskController::new(),
trigger: None,
in_flight: None,
}
return Some(response);
}
}
async fn replace_completions(
handle: TaskHandle,
mut requests: JoinSet<CompletionResponse>,
incomplete: bool,
) {
while let Some(response) = handle_response(&mut requests, incomplete).await {
let handle = handle.clone();
dispatch(move |editor, compositor| {
let editor_view = compositor.find::<ui::EditorView>().unwrap();
let Some(completion) = &mut editor_view.completion else {
return;
impl helix_event::AsyncHook for CompletionHandler {
type Event = CompletionEvent;
fn handle_event(
&mut self,
event: Self::Event,
_old_timeout: Option<Instant>,
) -> Option<Instant> {
if self.in_flight.is_some() && !self.task_controller.is_running() {
self.in_flight = None;
}
match event {
CompletionEvent::AutoTrigger {
cursor: trigger_pos,
doc,
view,
} => {
// technically it shouldn't be possible to switch views/documents in insert mode
// but people may create weird keymaps/use the mouse so let's be extra careful
if self
.trigger
.or(self.in_flight)
.map_or(true, |trigger| trigger.doc != doc || trigger.view != view)
{
self.trigger = Some(Trigger {
pos: trigger_pos,
view,
doc,
kind: TriggerKind::Auto,
});
}
}
CompletionEvent::TriggerChar { cursor, doc, view } => {
// immediately request completions and drop all auto completion requests
self.task_controller.cancel();
self.trigger = Some(Trigger {
pos: cursor,
view,
doc,
kind: TriggerKind::TriggerChar,
});
}
CompletionEvent::ManualTrigger { cursor, doc, view } => {
// immediately request completions and drop all auto completion requests
self.trigger = Some(Trigger {
pos: cursor,
view,
doc,
kind: TriggerKind::Manual,
});
// stop debouncing immediately and request the completion
self.finish_debounce();
return None;
}
CompletionEvent::Cancel => {
self.trigger = None;
self.task_controller.cancel();
}
CompletionEvent::DeleteText { cursor } => {
// if we deleted the original trigger, abort the completion
if matches!(self.trigger.or(self.in_flight), Some(Trigger{ pos, .. }) if cursor < pos)
{
self.trigger = None;
self.task_controller.cancel();
}
}
}
self.trigger.map(|trigger| {
// if the current request was closed forget about it
// otherwise immediately restart the completion request
let timeout = if trigger.kind == TriggerKind::Auto {
self.config.load().editor.completion_timeout
} else {
// we want almost instant completions for trigger chars
// and for restarted completion requests. The small timeout here mainly
// serves to better handle cases where the completion handler may fall
// behind (multiple events queued up in the channel), as happens with macros
Duration::from_millis(5)
};
if handle.is_canceled() {
log::error!("dropping outdated completion response");
return;
}
completion.replace_provider_completions(response);
if completion.is_empty() {
editor_view.clear_completion(editor);
// clearing completions might mean we want to immediately rerequest them (usually
// this occurs if typing a trigger char)
trigger_auto_completion(&editor.handlers.completions, editor, false);
}
Instant::now() + timeout
})
.await;
}
fn finish_debounce(&mut self) {
let trigger = self.trigger.take().expect("debounce always has a trigger");
self.in_flight = Some(trigger);
let handle = self.task_controller.restart();
dispatch_blocking(move |editor, compositor| {
request_completion(trigger, handle, editor, compositor)
});
}
}
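The deadline logic above, isolated. This assumes the `AsyncHook` contract: returning `Some(instant)` re-arms the debounce timer, and `finish_debounce` fires once that instant passes with no newer event:

use std::time::Duration;
use tokio::time::Instant;

fn next_deadline(is_auto_trigger: bool, completion_timeout: Duration) -> Instant {
    let timeout = if is_auto_trigger {
        completion_timeout // the user-configured debounce
    } else {
        Duration::from_millis(5) // near-instant for trigger chars and restarts
    };
    Instant::now() + timeout
}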
fn request_completion(
mut trigger: Trigger,
handle: TaskHandle,
editor: &mut Editor,
compositor: &mut Compositor,
) {
let (view, doc) = current!(editor);
if compositor
.find::<ui::EditorView>()
.unwrap()
.completion
.is_some()
|| editor.mode != Mode::Insert
{
return;
}
let text = doc.text();
let cursor = doc.selection(view.id).primary().cursor(text.slice(..));
if trigger.view != view.id || trigger.doc != doc.id() || cursor < trigger.pos {
return;
}
// this looks odd... Why are we not using the trigger position from
// the `trigger` here? Won't that mean that the trigger char doesn't get
// sent to the LS if we type fast enough? Yes, that is true, but it's
// not actually a problem. The LSP will resolve the completion to the identifier
// anyway (in fact, sending the later position is necessary to get the right results
// from LSPs that provide incomplete completion lists). We rely on the trigger offset
// and primary cursor matching for multi-cursor completions so this is definitely
// necessary from our side too.
trigger.pos = cursor;
let trigger_text = text.slice(..cursor);
let mut seen_language_servers = HashSet::new();
let mut futures: FuturesUnordered<_> = doc
.language_servers_with_feature(LanguageServerFeature::Completion)
.filter(|ls| seen_language_servers.insert(ls.id()))
.map(|ls| {
let language_server_id = ls.id();
let offset_encoding = ls.offset_encoding();
let pos = pos_to_lsp_pos(text, cursor, offset_encoding);
let doc_id = doc.identifier();
let context = if trigger.kind == TriggerKind::Manual {
lsp::CompletionContext {
trigger_kind: lsp::CompletionTriggerKind::INVOKED,
trigger_character: None,
}
} else {
let trigger_char =
ls.capabilities()
.completion_provider
.as_ref()
.and_then(|provider| {
provider
.trigger_characters
.as_deref()?
.iter()
.find(|&trigger| trigger_text.ends_with(trigger))
});
if trigger_char.is_some() {
lsp::CompletionContext {
trigger_kind: lsp::CompletionTriggerKind::TRIGGER_CHARACTER,
trigger_character: trigger_char.cloned(),
}
} else {
lsp::CompletionContext {
trigger_kind: lsp::CompletionTriggerKind::INVOKED,
trigger_character: None,
}
}
};
let completion_response = ls.completion(doc_id, pos, None, context).unwrap();
async move {
let json = completion_response.await?;
let response: Option<lsp::CompletionResponse> = serde_json::from_value(json)?;
let items = match response {
Some(lsp::CompletionResponse::Array(items)) => items,
// TODO: do something with is_incomplete
Some(lsp::CompletionResponse::List(lsp::CompletionList {
is_incomplete: _is_incomplete,
items,
})) => items,
None => Vec::new(),
}
.into_iter()
.map(|item| {
CompletionItem::Lsp(LspCompletionItem {
item,
provider: language_server_id,
resolved: false,
})
})
.collect();
anyhow::Ok(items)
}
.boxed()
})
.chain(path_completion(cursor, text.clone(), doc, handle.clone()))
.collect();
let future = async move {
let mut items = Vec::new();
while let Some(lsp_items) = futures.next().await {
match lsp_items {
Ok(mut lsp_items) => items.append(&mut lsp_items),
Err(err) => {
log::debug!("completion request failed: {err:?}");
}
};
}
items
};
let savepoint = doc.savepoint(view);
let ui = compositor.find::<ui::EditorView>().unwrap();
ui.last_insert.1.push(InsertEvent::RequestCompletion);
tokio::spawn(async move {
let items = cancelable_future(future, &handle).await;
let Some(items) = items.filter(|items| !items.is_empty()) else {
return;
};
dispatch(move |editor, compositor| {
show_completion(editor, compositor, items, trigger, savepoint);
drop(handle)
})
.await
});
}
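The `cancelable_future(future, &handle)` call above is what lets a newer trigger silently drop an in-flight request. An analogous shape built from plain tokio primitives (helix's `TaskController`/`TaskHandle` play the role of the oneshot here):

use tokio::{select, sync::oneshot};

async fn cancelable<F: std::future::Future>(fut: F, mut cancel: oneshot::Receiver<()>) -> Option<F::Output> {
    select! {
        value = fut => Some(value),
        _ = &mut cancel => None, // canceled: the pending work is simply dropped
    }
}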
fn show_completion(
editor: &mut Editor,
compositor: &mut Compositor,
items: Vec<CompletionItem>,
incomplete_completion_lists: HashMap<CompletionProvider, i8>,
trigger: Trigger,
savepoint: Arc<SavePoint>,
) {
@@ -100,14 +321,7 @@ fn show_completion(
return;
}
let completion_area = ui.set_completion(
editor,
savepoint,
items,
incomplete_completion_lists,
trigger.pos,
size,
);
let completion_area = ui.set_completion(editor, savepoint, items, trigger.pos, size);
let signature_help_area = compositor
.find_id::<Popup<SignatureHelp>>(SignatureHelp::ID)
.map(|signature_help| signature_help.area(size, editor));
@@ -181,21 +395,18 @@ pub fn trigger_auto_completion(
}
}
fn update_completion_filter(cx: &mut commands::Context, c: Option<char>) {
fn update_completions(cx: &mut commands::Context, c: Option<char>) {
cx.callback.push(Box::new(move |compositor, cx| {
let editor_view = compositor.find::<ui::EditorView>().unwrap();
if let Some(ui) = &mut editor_view.completion {
ui.update_filter(c);
if ui.is_empty() || c.is_some_and(|c| !char_is_word(c)) {
if let Some(completion) = &mut editor_view.completion {
completion.update_filter(c);
if completion.is_empty() {
editor_view.clear_completion(cx.editor);
// clearing completions might mean we want to immediately rerequest them (usually
// this occurs if typing a trigger char)
if c.is_some() {
trigger_auto_completion(&cx.editor.handlers.completions, cx.editor, false);
}
} else {
let handle = ui.incomplete_list_controller.restart();
request_incomplete_completion_list(cx.editor, ui, handle)
}
}
}))
@@ -211,7 +422,7 @@ fn clear_completions(cx: &mut commands::Context) {
fn completion_post_command_hook(
tx: &Sender<CompletionEvent>,
PostCommand { command, cx }: &mut PostCommand<'_, '_>,
) -> Result<()> {
) -> anyhow::Result<()> {
if cx.editor.mode == Mode::Insert {
if cx.editor.last_completion.is_some() {
match command {
@@ -222,7 +433,7 @@ fn completion_post_command_hook(
MappableCommand::Static {
name: "delete_char_backward",
..
} => update_completion_filter(cx, None),
} => update_completions(cx, None),
_ => clear_completions(cx),
}
} else {
@@ -272,7 +483,7 @@ pub(super) fn register_hooks(handlers: &Handlers) {
let tx = handlers.completions.clone();
register_hook!(move |event: &mut PostInsertChar<'_, '_>| {
if event.cx.editor.last_completion.is_some() {
update_completion_filter(event.cx, Some(event.c))
update_completions(event.cx, Some(event.c))
} else {
trigger_auto_completion(&tx, event.cx.editor, false);
}

View File

@@ -1,69 +1,10 @@
use helix_core::completion::CompletionProvider;
use helix_lsp::{lsp, LanguageServerId};
pub struct CompletionResponse {
pub items: CompletionItems,
pub incomplete: bool,
pub provider: CompletionProvider,
pub priority: i8,
}
pub enum CompletionItems {
Lsp(Vec<lsp::CompletionItem>),
Other(Vec<CompletionItem>),
}
impl CompletionItems {
pub fn is_empty(&self) -> bool {
match self {
CompletionItems::Lsp(items) => items.is_empty(),
CompletionItems::Other(items) => items.is_empty(),
}
}
}
impl CompletionResponse {
pub fn into_items(self, dst: &mut Vec<CompletionItem>) {
match self.items {
CompletionItems::Lsp(items) => dst.extend(items.into_iter().map(|item| {
CompletionItem::Lsp(LspCompletionItem {
item,
provider: match self.provider {
CompletionProvider::Lsp(provider) => provider,
CompletionProvider::PathCompletions => unreachable!(),
},
resolved: false,
provider_priority: self.priority,
})
})),
CompletionItems::Other(items) if dst.is_empty() => *dst = items,
CompletionItems::Other(mut items) => dst.append(&mut items),
}
}
}
#[derive(Debug, PartialEq, Clone)]
pub struct LspCompletionItem {
pub item: lsp::CompletionItem,
pub provider: LanguageServerId,
pub resolved: bool,
// TODO: we should not be filtering and sorting incomplete completion list
// according to the spec but vscode does that anyway and most servers (
// including rust-analyzer) rely on that.. so we can't do that without
// breaking completions.
// pub incomplete_completion_list: bool,
pub provider_priority: i8,
}
impl LspCompletionItem {
#[inline]
pub fn filter_text(&self) -> &str {
self.item
.filter_text
.as_ref()
.unwrap_or(&self.item.label)
.as_str()
}
}
#[derive(Debug, PartialEq, Clone)]
@@ -72,16 +13,6 @@ pub enum CompletionItem {
Other(helix_core::CompletionItem),
}
impl CompletionItem {
#[inline]
pub fn filter_text(&self) -> &str {
match self {
CompletionItem::Lsp(item) => item.filter_text(),
CompletionItem::Other(item) => &item.label,
}
}
}
impl PartialEq<CompletionItem> for LspCompletionItem {
fn eq(&self, other: &CompletionItem) -> bool {
match other {
@@ -101,21 +32,6 @@ impl PartialEq<CompletionItem> for helix_core::CompletionItem {
}
impl CompletionItem {
pub fn provider_priority(&self) -> i8 {
match self {
CompletionItem::Lsp(item) => item.provider_priority,
// sorting path completions after LSP for now
CompletionItem::Other(_) => 1,
}
}
pub fn provider(&self) -> CompletionProvider {
match self {
CompletionItem::Lsp(item) => CompletionProvider::Lsp(item.provider),
CompletionItem::Other(item) => item.provider,
}
}
pub fn preselect(&self) -> bool {
match self {
CompletionItem::Lsp(LspCompletionItem { item, .. }) => item.preselect.unwrap_or(false),

View File

@@ -5,21 +5,22 @@ use std::{
str::FromStr as _,
};
use futures_util::{future::BoxFuture, FutureExt as _};
use helix_core as core;
use helix_core::Transaction;
use helix_core::{self as core, completion::CompletionProvider};
use helix_event::TaskHandle;
use helix_stdx::path::{self, canonicalize, fold_home_dir, get_path_suffix};
use helix_view::Document;
use url::Url;
use crate::handlers::completion::{item::CompletionResponse, CompletionItem, CompletionItems};
use super::item::CompletionItem;
pub(crate) fn path_completion(
cursor: usize,
text: core::Rope,
doc: &Document,
handle: TaskHandle,
) -> Option<impl Fn() -> CompletionResponse> {
) -> Option<BoxFuture<'static, anyhow::Result<Vec<CompletionItem>>>> {
if !doc.path_completion_enabled() {
return None;
}
@@ -66,19 +67,12 @@ pub(crate) fn path_completion(
return None;
}
// TODO: handle properly in the future
const PRIORITY: i8 = 1;
let future = move || {
let future = tokio::task::spawn_blocking(move || {
let Ok(read_dir) = std::fs::read_dir(&dir_path) else {
return CompletionResponse {
items: CompletionItems::Other(Vec::new()),
incomplete: false,
provider: CompletionProvider::PathCompletions,
priority: PRIORITY, // TODO: hand
};
return Vec::new();
};
let res: Vec<_> = read_dir
read_dir
.filter_map(Result::ok)
.filter_map(|dir_entry| {
dir_entry
@@ -109,19 +103,12 @@ pub(crate) fn path_completion(
label: file_name.into(),
transaction,
documentation,
provider: CompletionProvider::PathCompletions,
}))
})
.collect();
CompletionResponse {
items: CompletionItems::Other(res),
incomplete: false,
provider: CompletionProvider::PathCompletions,
priority: PRIORITY, // TODO: hand
}
};
.collect::<Vec<_>>()
});
Some(future)
Some(async move { Ok(future.await?) }.boxed())
}
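After this rewrite, path completion produces the same boxed future type as the LSP requests it is merged with. The blocking-IO-to-future pattern in isolation (illustrative names, not helix APIs):

use futures_util::{future::BoxFuture, FutureExt as _};
use std::path::PathBuf;

fn list_dir(dir: PathBuf) -> BoxFuture<'static, anyhow::Result<Vec<String>>> {
    // read_dir blocks, so run it on tokio's blocking thread pool
    let task = tokio::task::spawn_blocking(move || {
        std::fs::read_dir(&dir)
            .into_iter()
            .flatten()
            .filter_map(|entry| entry.ok()?.file_name().into_string().ok())
            .collect::<Vec<_>>()
    });
    // a JoinError from the pool surfaces as anyhow::Error via `?`
    async move { Ok(task.await?) }.boxed()
}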
#[cfg(unix)]

View File

@@ -1,373 +0,0 @@
use std::collections::{HashMap, HashSet};
use std::sync::Arc;
use std::time::Duration;
use arc_swap::ArcSwap;
use futures_util::Future;
use helix_core::completion::CompletionProvider;
use helix_core::syntax::LanguageServerFeature;
use helix_event::{cancelable_future, TaskController, TaskHandle};
use helix_lsp::lsp;
use helix_lsp::lsp::{CompletionContext, CompletionTriggerKind};
use helix_lsp::util::pos_to_lsp_pos;
use helix_stdx::rope::RopeSliceExt;
use helix_view::document::Mode;
use helix_view::handlers::lsp::CompletionEvent;
use helix_view::{Document, DocumentId, Editor, ViewId};
use tokio::task::JoinSet;
use tokio::time::{timeout_at, Instant};
use crate::compositor::Compositor;
use crate::config::Config;
use crate::handlers::completion::item::CompletionResponse;
use crate::handlers::completion::path::path_completion;
use crate::handlers::completion::{
handle_response, replace_completions, show_completion, CompletionItems,
};
use crate::job::{dispatch, dispatch_blocking};
use crate::ui;
use crate::ui::editor::InsertEvent;
#[derive(Debug, PartialEq, Eq, Clone, Copy)]
pub(super) enum TriggerKind {
Auto,
TriggerChar,
Manual,
}
#[derive(Debug, Clone, Copy)]
pub(super) struct Trigger {
pub(super) pos: usize,
pub(super) view: ViewId,
pub(super) doc: DocumentId,
pub(super) kind: TriggerKind,
}
#[derive(Debug)]
pub struct CompletionHandler {
/// currently active trigger which will cause a
/// completion request after the timeout
trigger: Option<Trigger>,
in_flight: Option<Trigger>,
task_controller: TaskController,
config: Arc<ArcSwap<Config>>,
}
impl CompletionHandler {
pub fn new(config: Arc<ArcSwap<Config>>) -> CompletionHandler {
Self {
config,
task_controller: TaskController::new(),
trigger: None,
in_flight: None,
}
}
}
impl helix_event::AsyncHook for CompletionHandler {
type Event = CompletionEvent;
fn handle_event(
&mut self,
event: Self::Event,
_old_timeout: Option<Instant>,
) -> Option<Instant> {
if self.in_flight.is_some() && !self.task_controller.is_running() {
self.in_flight = None;
}
match event {
CompletionEvent::AutoTrigger {
cursor: trigger_pos,
doc,
view,
} => {
// technically it shouldn't be possible to switch views/documents in insert mode
// but people may create weird keymaps/use the mouse so let's be extra careful
if self
.trigger
.or(self.in_flight)
.map_or(true, |trigger| trigger.doc != doc || trigger.view != view)
{
self.trigger = Some(Trigger {
pos: trigger_pos,
view,
doc,
kind: TriggerKind::Auto,
});
}
}
CompletionEvent::TriggerChar { cursor, doc, view } => {
// immediately request completions and drop all auto completion requests
self.task_controller.cancel();
self.trigger = Some(Trigger {
pos: cursor,
view,
doc,
kind: TriggerKind::TriggerChar,
});
}
CompletionEvent::ManualTrigger { cursor, doc, view } => {
// immediately request completions and drop all auto completion requests
self.trigger = Some(Trigger {
pos: cursor,
view,
doc,
kind: TriggerKind::Manual,
});
// stop debouncing immediately and request the completion
self.finish_debounce();
return None;
}
CompletionEvent::Cancel => {
self.trigger = None;
self.task_controller.cancel();
}
CompletionEvent::DeleteText { cursor } => {
// if we deleted the original trigger, abort the completion
if matches!(self.trigger.or(self.in_flight), Some(Trigger{ pos, .. }) if cursor < pos)
{
self.trigger = None;
self.task_controller.cancel();
}
}
}
self.trigger.map(|trigger| {
// if the current request was closed forget about it
// otherwise immediately restart the completion request
let timeout = if trigger.kind == TriggerKind::Auto {
self.config.load().editor.completion_timeout
} else {
// we want almost instant completions for trigger chars
// and for restarted completion requests. The small timeout here mainly
// serves to better handle cases where the completion handler may fall
// behind (multiple events queued up in the channel), as happens with macros
Duration::from_millis(5)
};
Instant::now() + timeout
})
}
fn finish_debounce(&mut self) {
let trigger = self.trigger.take().expect("debounce always has a trigger");
self.in_flight = Some(trigger);
let handle = self.task_controller.restart();
dispatch_blocking(move |editor, compositor| {
request_completions(trigger, handle, editor, compositor)
});
}
}
fn request_completions(
mut trigger: Trigger,
handle: TaskHandle,
editor: &mut Editor,
compositor: &mut Compositor,
) {
let (view, doc) = current!(editor);
if compositor
.find::<ui::EditorView>()
.unwrap()
.completion
.is_some()
|| editor.mode != Mode::Insert
{
return;
}
let text = doc.text();
let cursor = doc.selection(view.id).primary().cursor(text.slice(..));
if trigger.view != view.id || trigger.doc != doc.id() || cursor < trigger.pos {
return;
}
// this looks odd... Why are we not using the trigger position from
// the `trigger` here? Won't that mean that the trigger char doesn't get
// sent to the LS if we type fast enough? Yes, that is true, but it's
// not actually a problem. The LSP will resolve the completion to the identifier
// anyway (in fact, sending the later position is necessary to get the right results
// from LSPs that provide incomplete completion lists). We rely on the trigger offset
// and primary cursor matching for multi-cursor completions so this is definitely
// necessary from our side too.
trigger.pos = cursor;
let trigger_text = text.slice(..cursor);
let mut seen_language_servers = HashSet::new();
let language_servers: Vec<_> = doc
.language_servers_with_feature(LanguageServerFeature::Completion)
.filter(|ls| seen_language_servers.insert(ls.id()))
.collect();
let mut requests = JoinSet::new();
for (priority, ls) in language_servers.iter().enumerate() {
let context = if trigger.kind == TriggerKind::Manual {
lsp::CompletionContext {
trigger_kind: lsp::CompletionTriggerKind::INVOKED,
trigger_character: None,
}
} else {
let trigger_char =
ls.capabilities()
.completion_provider
.as_ref()
.and_then(|provider| {
provider
.trigger_characters
.as_deref()?
.iter()
.find(|&trigger| trigger_text.ends_with(trigger))
});
if trigger_char.is_some() {
lsp::CompletionContext {
trigger_kind: lsp::CompletionTriggerKind::TRIGGER_CHARACTER,
trigger_character: trigger_char.cloned(),
}
} else {
lsp::CompletionContext {
trigger_kind: lsp::CompletionTriggerKind::INVOKED,
trigger_character: None,
}
}
};
requests.spawn(request_completions_from_language_server(
ls,
doc,
view.id,
context,
-(priority as i8),
));
}
if let Some(path_completion_request) =
path_completion(cursor, text.clone(), doc, handle.clone())
{
requests.spawn_blocking(path_completion_request);
}
let savepoint = doc.savepoint(view);
let ui = compositor.find::<ui::EditorView>().unwrap();
ui.last_insert.1.push(InsertEvent::RequestCompletion);
let handle_ = handle.clone();
let request_completions = async move {
let mut incomplete_completion_lists = HashMap::new();
let Some(response) = handle_response(&mut requests, false).await else {
return;
};
if response.incomplete {
incomplete_completion_lists.insert(response.provider, response.priority);
}
let mut items: Vec<_> = Vec::new();
response.into_items(&mut items);
let deadline = Instant::now() + Duration::from_millis(100);
loop {
let Some(response) = timeout_at(deadline, handle_response(&mut requests, false))
.await
.ok()
.flatten()
else {
break;
};
if response.incomplete {
incomplete_completion_lists.insert(response.provider, response.priority);
}
response.into_items(&mut items);
}
dispatch(move |editor, compositor| {
show_completion(
editor,
compositor,
items,
incomplete_completion_lists,
trigger,
savepoint,
)
})
.await;
if !requests.is_empty() {
replace_completions(handle_, requests, false).await;
}
};
tokio::spawn(cancelable_future(request_completions, handle));
}
fn request_completions_from_language_server(
ls: &helix_lsp::Client,
doc: &Document,
view: ViewId,
context: lsp::CompletionContext,
priority: i8,
) -> impl Future<Output = CompletionResponse> {
let provider = ls.id();
let offset_encoding = ls.offset_encoding();
let text = doc.text();
let cursor = doc.selection(view).primary().cursor(text.slice(..));
let pos = pos_to_lsp_pos(text, cursor, offset_encoding);
let doc_id = doc.identifier();
// it's important that this is before the async block (and that this is not an async function)
// to ensure the request is dispatched right away before any new edit notifications
let completion_response = ls.completion(doc_id, pos, None, context).unwrap();
async move {
let response: Option<lsp::CompletionResponse> = completion_response
.await
.and_then(|json| serde_json::from_value(json).map_err(helix_lsp::Error::Parse))
.inspect_err(|err| log::error!("completion request failed: {err}"))
.ok()
.flatten();
let (mut items, incomplete) = match response {
Some(lsp::CompletionResponse::Array(items)) => (items, false),
Some(lsp::CompletionResponse::List(lsp::CompletionList {
is_incomplete,
items,
})) => (items, is_incomplete),
None => (Vec::new(), false),
};
items.sort_by(|item1, item2| {
let sort_text1 = item1.sort_text.as_deref().unwrap_or(&item1.label);
let sort_text2 = item2.sort_text.as_deref().unwrap_or(&item2.label);
sort_text1.cmp(sort_text2)
});
CompletionResponse {
items: CompletionItems::Lsp(items),
incomplete,
provider: CompletionProvider::Lsp(provider),
priority,
}
}
}
pub fn request_incomplete_completion_list(
editor: &mut Editor,
ui: &mut ui::Completion,
handle: TaskHandle,
) {
if ui.incomplete_completion_lists.is_empty() {
return;
}
let (view, doc) = current_ref!(editor);
let mut requests = JoinSet::new();
log::error!("request incomplete completions");
ui.incomplete_completion_lists
.retain(|&provider, &mut priority| {
let CompletionProvider::Lsp(ls_id) = provider else {
unimplemented!("non-lsp incomplete completion lists")
};
let Some(ls) = editor.language_server_by_id(ls_id) else {
return false;
};
log::error!("request incomplete completions2");
let request = request_completions_from_language_server(
ls,
doc,
view.id,
CompletionContext {
trigger_kind: CompletionTriggerKind::TRIGGER_FOR_INCOMPLETE_COMPLETIONS,
trigger_character: None,
},
priority,
);
requests.spawn(request);
true
});
tokio::spawn(replace_completions(handle, requests, true));
}

View File

@@ -307,6 +307,8 @@ pub fn language(lang_str: String) -> std::io::Result<()> {
.map(|formatter| formatter.command.to_string()),
)?;
probe_parser(lang.grammar.as_ref().unwrap_or(&lang.language_id))?;
for ts_feat in TsFeature::all() {
probe_treesitter_feature(&lang_str, *ts_feat)?
}
@@ -314,6 +316,18 @@ pub fn language(lang_str: String) -> std::io::Result<()> {
Ok(())
}
fn probe_parser(grammar_name: &str) -> std::io::Result<()> {
let stdout = std::io::stdout();
let mut stdout = stdout.lock();
write!(stdout, "Tree-sitter parser: ")?;
match helix_loader::grammar::get_language(grammar_name) {
Ok(_) => writeln!(stdout, "{}", "✓".green()),
Err(_) => writeln!(stdout, "{}", "None".yellow()),
}
}
/// Display diagnostics about multiple LSPs and DAPs.
fn probe_protocols<'a, I: Iterator<Item = &'a str> + 'a>(
protocol_name: &str,

View File

@@ -18,7 +18,6 @@ use futures_util::Future;
mod handlers;
use ignore::DirEntry;
use url::Url;
#[cfg(windows)]
fn true_color() -> bool {
@@ -70,10 +69,10 @@ fn filter_picker_entry(entry: &DirEntry, root: &Path, dedup_symlinks: bool) -> b
}
/// Opens URL in external program.
fn open_external_url_callback(
url: Url,
fn open_external_url_callback<U: AsRef<std::ffi::OsStr>>(
url: U,
) -> impl Future<Output = Result<job::Callback, anyhow::Error>> + Send + 'static {
let commands = open::commands(url.as_str());
let commands = open::commands(url);
async {
for cmd in commands {
let mut command = tokio::process::Command::new(cmd.get_program());

View File
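Making `open_external_url_callback` generic over `AsRef<OsStr>` matches the signature of `open::commands`, so `&str`, `String`, and the new string-backed URL type can all be passed without an extra allocation. The essential shape (a sketch; the real callback also reports failures):

use std::ffi::OsStr;

fn launch<U: AsRef<OsStr>>(url: U) -> bool {
    // try each candidate opener for the current platform in order
    open::commands(url)
        .into_iter()
        .any(|mut cmd| cmd.status().is_ok_and(|status| status.success()))
}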

@@ -40,8 +40,15 @@ fn main() -> Result<()> {
#[tokio::main]
async fn main_impl() -> Result<i32> {
let help = format!(
"\
let mut args = Args::parse_args().context("could not parse arguments")?;
helix_loader::initialize_config_file(args.config_file.clone());
helix_loader::initialize_log_file(args.log_file.clone());
// Help has a higher priority and should be handled separately.
if args.display_help {
print!(
"\
{} {}
{}
{}
@@ -69,21 +76,12 @@ FLAGS:
-w, --working-dir <path> Specify an initial working directory
+N Open the first given file at line number N
",
env!("CARGO_PKG_NAME"),
VERSION_AND_GIT_HASH,
env!("CARGO_PKG_AUTHORS"),
env!("CARGO_PKG_DESCRIPTION"),
helix_loader::default_log_file().display(),
);
let mut args = Args::parse_args().context("could not parse arguments")?;
helix_loader::initialize_config_file(args.config_file.clone());
helix_loader::initialize_log_file(args.log_file.clone());
// Help has a higher priority and should be handled separately.
if args.display_help {
print!("{}", help);
env!("CARGO_PKG_NAME"),
VERSION_AND_GIT_HASH,
env!("CARGO_PKG_AUTHORS"),
env!("CARGO_PKG_DESCRIPTION"),
helix_loader::default_log_file().display(),
);
std::process::exit(0);
}
@@ -154,8 +152,7 @@ FLAGS:
});
// TODO: use the thread local executor to spawn the application task separately from the work pool
let mut app =
Application::new(args, config, lang_loader).context("unable to create new application")?;
let mut app = Application::new(args, config, lang_loader).context("unable to start Helix")?;
let exit_code = app.run(&mut EventStream::new()).await?;

View File

@@ -1,30 +1,26 @@
use crate::{
compositor::{Component, Context, Event, EventResult},
handlers::completion::{
trigger_auto_completion, CompletionItem, CompletionResponse, LspCompletionItem,
ResolveHandler,
handlers::{
completion::{CompletionItem, LspCompletionItem, ResolveHandler},
trigger_auto_completion,
},
};
use helix_event::TaskController;
use helix_view::{
document::SavePoint,
editor::CompleteAction,
handlers::lsp::SignatureHelpInvoked,
theme::{Modifier, Style},
theme::{Color, Modifier, Style},
ViewId,
};
use nucleo::{
pattern::{Atom, AtomKind, CaseMatching, Normalization},
Config, Utf32Str,
use tui::{
buffer::Buffer as Surface,
text::{Span, Spans},
};
use tui::{buffer::Buffer as Surface, text::Span};
use std::{cmp::Reverse, collections::HashMap, sync::Arc};
use std::{borrow::Cow, sync::Arc};
use helix_core::{
self as core, chars,
completion::CompletionProvider,
fuzzy::MATCHER,
snippets::{ActiveSnippet, RenderedSnippet, Snippet},
Change, Transaction,
};
@@ -35,9 +31,25 @@ use crate::ui::{menu, Markdown, Menu, Popup, PromptEvent};
use helix_lsp::{lsp, util, OffsetEncoding};
impl menu::Item for CompletionItem {
type Data = ();
type Data = Style;
fn sort_text(&self, data: &Self::Data) -> Cow<str> {
self.filter_text(data)
}
fn format(&self, _data: &Self::Data) -> menu::Row {
#[inline]
fn filter_text(&self, _data: &Self::Data) -> Cow<str> {
match self {
CompletionItem::Lsp(LspCompletionItem { item, .. }) => item
.filter_text
.as_ref()
.unwrap_or(&item.label)
.as_str()
.into(),
CompletionItem::Other(core::CompletionItem { label, .. }) => label.clone(),
}
}
fn format(&self, dir_style: &Self::Data) -> menu::Row {
let deprecated = match self {
CompletionItem::Lsp(LspCompletionItem { item, .. }) => {
item.deprecated.unwrap_or_default()
@@ -55,51 +67,69 @@ impl menu::Item for CompletionItem {
let kind = match self {
CompletionItem::Lsp(LspCompletionItem { item, .. }) => match item.kind {
Some(lsp::CompletionItemKind::TEXT) => "text",
Some(lsp::CompletionItemKind::METHOD) => "method",
Some(lsp::CompletionItemKind::FUNCTION) => "function",
Some(lsp::CompletionItemKind::CONSTRUCTOR) => "constructor",
Some(lsp::CompletionItemKind::FIELD) => "field",
Some(lsp::CompletionItemKind::VARIABLE) => "variable",
Some(lsp::CompletionItemKind::CLASS) => "class",
Some(lsp::CompletionItemKind::INTERFACE) => "interface",
Some(lsp::CompletionItemKind::MODULE) => "module",
Some(lsp::CompletionItemKind::PROPERTY) => "property",
Some(lsp::CompletionItemKind::UNIT) => "unit",
Some(lsp::CompletionItemKind::VALUE) => "value",
Some(lsp::CompletionItemKind::ENUM) => "enum",
Some(lsp::CompletionItemKind::KEYWORD) => "keyword",
Some(lsp::CompletionItemKind::SNIPPET) => "snippet",
Some(lsp::CompletionItemKind::COLOR) => "color",
Some(lsp::CompletionItemKind::FILE) => "file",
Some(lsp::CompletionItemKind::REFERENCE) => "reference",
Some(lsp::CompletionItemKind::FOLDER) => "folder",
Some(lsp::CompletionItemKind::ENUM_MEMBER) => "enum_member",
Some(lsp::CompletionItemKind::CONSTANT) => "constant",
Some(lsp::CompletionItemKind::STRUCT) => "struct",
Some(lsp::CompletionItemKind::EVENT) => "event",
Some(lsp::CompletionItemKind::OPERATOR) => "operator",
Some(lsp::CompletionItemKind::TYPE_PARAMETER) => "type_param",
Some(lsp::CompletionItemKind::TEXT) => "text".into(),
Some(lsp::CompletionItemKind::METHOD) => "method".into(),
Some(lsp::CompletionItemKind::FUNCTION) => "function".into(),
Some(lsp::CompletionItemKind::CONSTRUCTOR) => "constructor".into(),
Some(lsp::CompletionItemKind::FIELD) => "field".into(),
Some(lsp::CompletionItemKind::VARIABLE) => "variable".into(),
Some(lsp::CompletionItemKind::CLASS) => "class".into(),
Some(lsp::CompletionItemKind::INTERFACE) => "interface".into(),
Some(lsp::CompletionItemKind::MODULE) => "module".into(),
Some(lsp::CompletionItemKind::PROPERTY) => "property".into(),
Some(lsp::CompletionItemKind::UNIT) => "unit".into(),
Some(lsp::CompletionItemKind::VALUE) => "value".into(),
Some(lsp::CompletionItemKind::ENUM) => "enum".into(),
Some(lsp::CompletionItemKind::KEYWORD) => "keyword".into(),
Some(lsp::CompletionItemKind::SNIPPET) => "snippet".into(),
Some(lsp::CompletionItemKind::COLOR) => item
.documentation
.as_ref()
.and_then(|docs| {
let text = match docs {
lsp::Documentation::String(text) => text,
lsp::Documentation::MarkupContent(lsp::MarkupContent {
value, ..
}) => value,
};
Color::from_hex(text)
})
.map_or("color".into(), |color| {
Spans::from(vec![
Span::raw("color "),
Span::styled("", Style::default().fg(color)),
])
}),
Some(lsp::CompletionItemKind::FILE) => "file".into(),
Some(lsp::CompletionItemKind::REFERENCE) => "reference".into(),
Some(lsp::CompletionItemKind::FOLDER) => "folder".into(),
Some(lsp::CompletionItemKind::ENUM_MEMBER) => "enum_member".into(),
Some(lsp::CompletionItemKind::CONSTANT) => "constant".into(),
Some(lsp::CompletionItemKind::STRUCT) => "struct".into(),
Some(lsp::CompletionItemKind::EVENT) => "event".into(),
Some(lsp::CompletionItemKind::OPERATOR) => "operator".into(),
Some(lsp::CompletionItemKind::TYPE_PARAMETER) => "type_param".into(),
Some(kind) => {
log::error!("Received unknown completion item kind: {:?}", kind);
""
"".into()
}
None => "",
None => "".into(),
},
CompletionItem::Other(core::CompletionItem { kind, .. }) => kind,
CompletionItem::Other(core::CompletionItem { kind, .. }) => kind.as_ref().into(),
};
menu::Row::new([
menu::Cell::from(Span::styled(
label,
if deprecated {
Style::default().add_modifier(Modifier::CROSSED_OUT)
} else {
Style::default()
},
)),
menu::Cell::from(kind),
])
let label = Span::styled(
label,
if deprecated {
Style::default().add_modifier(Modifier::CROSSED_OUT)
} else if kind.0[0].content == "folder" {
*dir_style
} else {
Style::default()
},
);
menu::Row::new([menu::Cell::from(label), menu::Cell::from(kind)])
}
}
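About the COLOR arm above: servers put the color value in the item's documentation, and it is parsed into a theme color for the inline swatch. `Color::from_hex` is internal to helix; a standalone equivalent of the parsing step might look like:

fn parse_hex_color(text: &str) -> Option<(u8, u8, u8)> {
    let hex = text.strip_prefix('#')?;
    if hex.len() != 6 || !hex.is_ascii() {
        return None;
    }
    let channel = |i: usize| u8::from_str_radix(&hex[i..i + 2], 16).ok();
    // parse_hex_color("#ff8800") == Some((255, 136, 0))
    Some((channel(0)?, channel(2)?, channel(4)?))
}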
@@ -110,9 +140,6 @@ pub struct Completion {
trigger_offset: usize,
filter: String,
resolve_handler: ResolveHandler,
pub incomplete_completion_lists: HashMap<CompletionProvider, i8>,
// controller for requesting updates for incomplete completion lists
pub incomplete_list_controller: TaskController,
}
impl Completion {
@@ -121,15 +148,18 @@ impl Completion {
pub fn new(
editor: &Editor,
savepoint: Arc<SavePoint>,
items: Vec<CompletionItem>,
incomplete_completion_lists: HashMap<CompletionProvider, i8>,
mut items: Vec<CompletionItem>,
trigger_offset: usize,
) -> Self {
let preview_completion_insert = editor.config().preview_completion_insert;
let replace_mode = editor.config().completion_replace;
// Sort completion items according to their preselect status (given by the LSP server)
items.sort_by_key(|item| !item.preselect());
let dir_style = editor.theme.get("ui.text.directory");
// Then create the menu
let menu = Menu::new(items, (), move |editor: &mut Editor, item, event| {
let menu = Menu::new(items, dir_style, move |editor: &mut Editor, item, event| {
let (view, doc) = current!(editor);
macro_rules! language_server {
@@ -302,77 +332,17 @@ impl Completion {
// and avoid allocation during matching
filter: String::from(fragment),
resolve_handler: ResolveHandler::new(),
incomplete_completion_lists,
incomplete_list_controller: TaskController::new(),
};
// need to recompute immediately in case start_offset != trigger_offset
completion.score(false);
completion
.popup
.contents_mut()
.score(&completion.filter, false);
completion
}
fn score(&mut self, incremental: bool) {
let pattern = &self.filter;
let mut matcher = MATCHER.lock();
matcher.config = Config::DEFAULT;
// slight preference towards prefix matches
matcher.config.prefer_prefix = true;
let pattern = Atom::new(
pattern,
CaseMatching::Ignore,
Normalization::Smart,
AtomKind::Fuzzy,
false,
);
let mut buf = Vec::new();
let (matches, options) = self.popup.contents_mut().update_options();
if incremental {
matches.retain_mut(|(index, score)| {
let option = &options[*index as usize];
let text = option.filter_text();
let new_score = pattern.score(Utf32Str::new(text, &mut buf), &mut matcher);
match new_score {
Some(new_score) => {
*score = new_score as u32 / 2;
true
}
None => false,
}
})
} else {
matches.clear();
matches.extend(options.iter().enumerate().filter_map(|(i, option)| {
let text = option.filter_text();
pattern
.score(Utf32Str::new(text, &mut buf), &mut matcher)
.map(|score| (i as u32, score as u32 / 3))
}));
}
// nucleo is meant as an fzf-like fuzzy matcher and only hides
// matches that are truly impossible (as in the sequence of chars
// just doesn't appear). That doesn't work well for completions
// with multiple LSPs, where all completions of the next LSP are below
// the current one (so you would get good suggestions from the second LSP
// below those of the first). Setting a reasonable cutoff below which to
// move bad completions out of the way helps with that.
//
// The score computation is a heuristic derived from nucleo's internal
// constants and may move upstream in the future. I want to test this out
// here to settle on a good number
let min_score = (7 + pattern.needle_text().len() as u32 * 14) / 3;
matches.sort_unstable_by_key(|&(i, score)| {
let option = &options[i as usize];
(
score <= min_score,
Reverse(option.preselect()),
option.provider_priority(),
Reverse(score),
i,
)
});
}
/// Synchronously resolve the given completion item. This is used when
/// accepting a completion.
fn resolve_completion_item(
@@ -414,28 +384,7 @@ impl Completion {
}
}
}
self.score(c.is_some());
self.popup.contents_mut().reset_cursor();
}
pub fn replace_provider_completions(&mut self, response: CompletionResponse) {
let menu = self.popup.contents_mut();
let (_, options) = menu.update_options();
if self
.incomplete_completion_lists
.remove(&response.provider)
.is_some()
{
options.retain(|item| item.provider() != response.provider)
}
if response.incomplete {
self.incomplete_completion_lists
.insert(response.provider, response.priority);
}
response.into_items(options);
self.score(false);
let menu = self.popup.contents_mut();
menu.ensure_cursor_in_bounds();
menu.score(&self.filter, c.is_some());
}
pub fn is_empty(&self) -> bool {
@@ -657,98 +606,3 @@ fn completion_changes(transaction: &Transaction, trigger_offset: usize) -> Vec<C
.filter(|(start, end, _)| (*start..=*end).contains(&trigger_offset))
.collect()
}
// fn lsp_item_to_transaction(
// doc: &Document,
// view_id: ViewId,
// item: &lsp::CompletionItem,
// offset_encoding: OffsetEncoding,
// trigger_offset: usize,
// include_placeholder: bool,
// replace_mode: bool,
// ) -> Transaction {
// use helix_lsp::snippet;
// let selection = doc.selection(view_id);
// let text = doc.text().slice(..);
// let primary_cursor = selection.primary().cursor(text);
// let (edit_offset, new_text) = if let Some(edit) = &item.text_edit {
// let edit = match edit {
// lsp::CompletionTextEdit::Edit(edit) => edit.clone(),
// lsp::CompletionTextEdit::InsertAndReplace(item) => {
// let range = if replace_mode {
// item.replace
// } else {
// item.insert
// };
// lsp::TextEdit::new(range, item.new_text.clone())
// }
// };
// let Some(range) =
// util::lsp_range_to_range(doc.text(), edit.range, offset_encoding)
// else {
// return Transaction::new(doc.text());
// };
// let start_offset = range.anchor as i128 - primary_cursor as i128;
// let end_offset = range.head as i128 - primary_cursor as i128;
// (Some((start_offset, end_offset)), edit.new_text)
// } else {
// let new_text = item
// .insert_text
// .clone()
// .unwrap_or_else(|| item.label.clone());
// // check that we are still at the correct savepoint
// // we can still generate a transaction regardless but if the
// // document changed (and not just the selection) then we will
// // likely delete the wrong text (same if we applied an edit sent by the LS)
// debug_assert!(primary_cursor == trigger_offset);
// (None, new_text)
// };
// if matches!(item.kind, Some(lsp::CompletionItemKind::SNIPPET))
// || matches!(
// item.insert_text_format,
// Some(lsp::InsertTextFormat::SNIPPET)
// )
// {
// match snippet::parse(&new_text) {
// Ok(snippet) => util::generate_transaction_from_snippet(
// doc.text(),
// selection,
// edit_offset,
// replace_mode,
// snippet,
// doc.line_ending.as_str(),
// include_placeholder,
// doc.tab_width(),
// doc.indent_width(),
// ),
// Err(err) => {
// log::error!(
// "Failed to parse snippet: {:?}, remaining output: {}",
// &new_text,
// err
// );
// Transaction::new(doc.text())
// }
// }
// } else {
// util::generate_transaction_from_completion_edit(
// doc.text(),
// selection,
// edit_offset,
// replace_mode,
// new_text,
// )
// }
// }
// fn completion_changes(transaction: &Transaction, trigger_offset: usize) -> Vec<Change> {
// transaction
// .changes_iter()
// .filter(|(start, end, _)| (*start..=*end).contains(&trigger_offset))
// .collect()
// }

View File

@@ -14,7 +14,6 @@ use crate::{
};
use helix_core::{
completion::CompletionProvider,
diagnostic::NumberOrString,
graphemes::{next_grapheme_boundary, prev_grapheme_boundary},
movement::Direction,
@@ -32,7 +31,7 @@ use helix_view::{
keyboard::{KeyCode, KeyModifiers},
Document, Editor, Theme, View,
};
use std::{collections::HashMap, mem::take, num::NonZeroUsize, path::PathBuf, rc::Rc, sync::Arc};
use std::{mem::take, num::NonZeroUsize, path::PathBuf, rc::Rc, sync::Arc};
use tui::{buffer::Buffer as Surface, text::Span};
@@ -1058,17 +1057,10 @@ impl EditorView {
editor: &mut Editor,
savepoint: Arc<SavePoint>,
items: Vec<CompletionItem>,
incomplete_completion_lists: HashMap<CompletionProvider, i8>,
trigger_offset: usize,
size: Rect,
) -> Option<Rect> {
let mut completion = Completion::new(
editor,
savepoint,
items,
incomplete_completion_lists,
trigger_offset,
);
let mut completion = Completion::new(editor, savepoint, items, trigger_offset);
if completion.is_empty() {
// skip if we got no completion results

View File

@@ -1,7 +1,12 @@
use std::{borrow::Cow, cmp::Reverse};
use crate::{
compositor::{Callback, Component, Compositor, Context, Event, EventResult},
ctrl, key, shift,
};
use helix_core::fuzzy::MATCHER;
use nucleo::pattern::{Atom, AtomKind, CaseMatching, Normalization};
use nucleo::{Config, Utf32Str};
use tui::{buffer::Buffer as Surface, widgets::Table};
pub use tui::widgets::{Cell, Row};
@@ -14,6 +19,16 @@ pub trait Item: Sync + Send + 'static {
type Data: Sync + Send + 'static;
fn format(&self, data: &Self::Data) -> Row;
fn sort_text(&self, data: &Self::Data) -> Cow<str> {
let label: String = self.format(data).cell_text().collect();
label.into()
}
fn filter_text(&self, data: &Self::Data) -> Cow<str> {
let label: String = self.format(data).cell_text().collect();
label.into()
}
}
pub type MenuCallback<T> = Box<dyn Fn(&mut Editor, Option<&T>, MenuEvent)>;
@@ -62,30 +77,49 @@ impl<T: Item> Menu<T> {
}
}
pub fn reset_cursor(&mut self) {
pub fn score(&mut self, pattern: &str, incremental: bool) {
let mut matcher = MATCHER.lock();
matcher.config = Config::DEFAULT;
let pattern = Atom::new(
pattern,
CaseMatching::Ignore,
Normalization::Smart,
AtomKind::Fuzzy,
false,
);
let mut buf = Vec::new();
if incremental {
self.matches.retain_mut(|(index, score)| {
let option = &self.options[*index as usize];
let text = option.filter_text(&self.editor_data);
let new_score = pattern.score(Utf32Str::new(&text, &mut buf), &mut matcher);
match new_score {
Some(new_score) => {
*score = new_score as u32;
true
}
None => false,
}
})
} else {
self.matches.clear();
let matches = self.options.iter().enumerate().filter_map(|(i, option)| {
let text = option.filter_text(&self.editor_data);
pattern
.score(Utf32Str::new(&text, &mut buf), &mut matcher)
.map(|score| (i as u32, score as u32))
});
self.matches.extend(matches);
}
self.matches
.sort_unstable_by_key(|&(i, score)| (Reverse(score), i));
// reset cursor position
self.cursor = None;
self.scroll = 0;
self.recalculate = true;
}
pub fn update_options(&mut self) -> (&mut Vec<(u32, u32)>, &mut Vec<T>) {
self.recalculate = true;
(&mut self.matches, &mut self.options)
}
pub fn ensure_cursor_in_bounds(&mut self) {
if self.matches.is_empty() {
self.cursor = None;
self.scroll = 0;
} else {
self.scroll = 0;
self.recalculate = true;
if let Some(cursor) = &mut self.cursor {
*cursor = (*cursor).min(self.matches.len() - 1)
}
}
}
pub fn clear(&mut self) {
self.matches.clear();
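
The rewritten `score` keeps the ranking logic in the menu: on incremental updates it only re-scores the surviving matches (a fuzzy pattern that grew by a character can never match more options than before), otherwise it rescans every option. A minimal standalone sketch of the full-rescan path, assuming the nucleo API used in the hunk above (`Matcher::new`, `Atom::score`, `Utf32Str::new`); helix itself reuses a single shared `Matcher` behind `MATCHER.lock()`:

use nucleo::pattern::{Atom, AtomKind, CaseMatching, Normalization};
use nucleo::{Config, Matcher, Utf32Str};
use std::cmp::Reverse;

// Score every option against `pattern`, keeping (index, score) pairs for the
// options that match, best score first, index as a stable tie-breaker.
fn score_all(options: &[&str], pattern: &str) -> Vec<(u32, u32)> {
    let mut matcher = Matcher::new(Config::DEFAULT);
    let pattern = Atom::new(
        pattern,
        CaseMatching::Ignore,
        Normalization::Smart,
        AtomKind::Fuzzy,
        false,
    );
    let mut buf = Vec::new();
    let mut matches: Vec<(u32, u32)> = options
        .iter()
        .enumerate()
        .filter_map(|(i, text)| {
            pattern
                .score(Utf32Str::new(text, &mut buf), &mut matcher)
                .map(|score| (i as u32, score as u32))
        })
        .collect();
    matches.sort_unstable_by_key(|&(i, score)| (Reverse(score), i));
    matches
}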

View File

@@ -32,6 +32,17 @@ use helix_view::Editor;
use std::{error::Error, path::PathBuf};
struct Utf8PathBuf {
path: String,
is_dir: bool,
}
impl AsRef<str> for Utf8PathBuf {
fn as_ref(&self) -> &str {
&self.path
}
}
pub fn prompt(
cx: &mut crate::commands::Context,
prompt: std::borrow::Cow<'static, str>,
@@ -266,6 +277,7 @@ pub fn file_picker(root: PathBuf, config: &helix_view::editor::Config) -> FilePi
}
pub mod completers {
use super::Utf8PathBuf;
use crate::ui::prompt::Completion;
use helix_core::fuzzy::fuzzy_match;
use helix_core::syntax::LanguageServerFeature;
@@ -274,6 +286,7 @@ pub mod completers {
use helix_view::{editor::Config, Editor};
use once_cell::sync::Lazy;
use std::borrow::Cow;
use tui::text::Span;
pub type Completer = fn(&Editor, &str) -> Vec<Completion>;
@@ -290,7 +303,7 @@ pub mod completers {
fuzzy_match(input, names, true)
.into_iter()
.map(|(name, _)| ((0..), name))
.map(|(name, _)| ((0..), name.into()))
.collect()
}
@@ -336,7 +349,7 @@ pub mod completers {
fuzzy_match(input, &*KEYS, false)
.into_iter()
.map(|(name, _)| ((0..), name.into()))
.map(|(name, _)| ((0..), Span::raw(name)))
.collect()
}
@@ -424,7 +437,7 @@ pub mod completers {
// TODO: we could return an iter/lazy thing so it can fetch as many as it needs.
fn filename_impl<F>(
_editor: &Editor,
editor: &Editor,
input: &str,
git_ignore: bool,
filter_fn: F,
@@ -482,7 +495,7 @@ pub mod completers {
return None;
}
//let is_dir = entry.file_type().map_or(false, |entry| entry.is_dir());
let is_dir = entry.file_type().is_some_and(|entry| entry.is_dir());
let path = entry.path();
let mut path = if is_tilde {
@@ -501,23 +514,35 @@ pub mod completers {
}
let path = path.into_os_string().into_string().ok()?;
Some(Cow::from(path))
Some(Utf8PathBuf { path, is_dir })
})
}) // TODO: unwrap or skip
.filter(|path| !path.is_empty());
.filter(|path| !path.path.is_empty());
let directory_color = editor.theme.get("ui.text.directory");
let style_from_file = |file: Utf8PathBuf| {
if file.is_dir {
Span::styled(file.path, directory_color)
} else {
Span::raw(file.path)
}
};
// if empty, return a list of dirs and files in current dir
if let Some(file_name) = file_name {
let range = (input.len().saturating_sub(file_name.len()))..;
fuzzy_match(&file_name, files, true)
.into_iter()
.map(|(name, _)| (range.clone(), name))
.map(|(name, _)| (range.clone(), style_from_file(name)))
.collect()
// TODO: complete to longest common match
} else {
let mut files: Vec<_> = files.map(|file| (end.clone(), file)).collect();
files.sort_unstable_by(|(_, path1), (_, path2)| path1.cmp(path2));
let mut files: Vec<_> = files
.map(|file| (end.clone(), style_from_file(file)))
.collect();
files.sort_unstable_by(|(_, path1), (_, path2)| path1.content.cmp(&path2.content));
files
}
}
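
Path completions now carry their directory-ness through `Utf8PathBuf` and come out the other side as styled `Span`s. A self-contained sketch of that shape, assuming helix's tui `Span` and the `helix_view::graphics::Style` returned by `theme.get("ui.text.directory")`:

use helix_view::graphics::Style;
use std::ops::RangeFrom;
use tui::text::Span;

// Build one prompt completion entry: directories get the theme's directory
// style, plain files stay unstyled.
fn styled_completion(
    range: RangeFrom<usize>,
    path: String,
    is_dir: bool,
    directory_style: Style,
) -> (RangeFrom<usize>, Span<'static>) {
    let span = if is_dir {
        Span::styled(path, directory_style)
    } else {
        Span::raw(path)
    };
    (range, span)
}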

View File

@@ -8,6 +8,7 @@ use helix_view::keyboard::KeyCode;
use std::sync::Arc;
use std::{borrow::Cow, ops::RangeFrom};
use tui::buffer::Buffer as Surface;
use tui::text::Span;
use tui::widgets::{Block, Widget};
use helix_core::{
@@ -19,7 +20,8 @@ use helix_view::{
};
type PromptCharHandler = Box<dyn Fn(&mut Prompt, char, &Context)>;
pub type Completion = (RangeFrom<usize>, Cow<'static, str>);
pub type Completion = (RangeFrom<usize>, Span<'static>);
type CompletionFn = Box<dyn FnMut(&Editor, &str) -> Vec<Completion>>;
type CallbackFn = Box<dyn FnMut(&mut Context, &str, PromptEvent)>;
pub type DocFn = Box<dyn Fn(&str) -> Option<Cow<str>>>;
@@ -233,15 +235,7 @@ impl Prompt {
position
}
Movement::StartOfLine => 0,
Movement::EndOfLine => {
let mut cursor =
GraphemeCursor::new(self.line.len().saturating_sub(1), self.line.len(), false);
if let Ok(Some(pos)) = cursor.next_boundary(&self.line, 0) {
pos
} else {
self.cursor
}
}
Movement::EndOfLine => self.line.len(),
Movement::None => self.cursor,
}
}
@@ -382,7 +376,7 @@ impl Prompt {
let (range, item) = &self.completion[index];
self.line.replace_range(range.clone(), item);
self.line.replace_range(range.clone(), &item.content);
self.move_end();
}
@@ -407,7 +401,7 @@ impl Prompt {
let max_len = self
.completion
.iter()
.map(|(_, completion)| completion.len() as u16)
.map(|(_, completion)| completion.content.len() as u16)
.max()
.unwrap_or(BASE_WIDTH)
.max(BASE_WIDTH);
@@ -446,18 +440,22 @@ impl Prompt {
for (i, (_range, completion)) in
self.completion.iter().enumerate().skip(offset).take(items)
{
let color = if Some(i) == self.selection {
selected_color // TODO: just invert bg
let is_selected = Some(i) == self.selection;
let completion_item_style = if is_selected {
selected_color
} else {
completion_color
completion_color.patch(completion.style)
};
surface.set_stringn(
area.x + col * (1 + col_width),
area.y + row,
completion,
&completion.content,
col_width.saturating_sub(1) as usize,
color,
completion_item_style,
);
row += 1;
if row > area.height - 1 {
row = 0;
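
When drawing, the prompt patches each entry's own style onto the base completion style, so a directory keeps the menu colors except where its span overrides them. A hedged illustration of `patch` semantics, assuming the `helix_view::graphics::Style` builder API:

use helix_view::graphics::{Color, Style};

fn main() {
    let base = Style::default().fg(Color::White).bg(Color::Black);
    let directory = Style::default().fg(Color::Blue); // e.g. ui.text.directory
    // The overlay wins wherever it sets a field; the base fills in the rest.
    let effective = base.patch(directory);
    assert_eq!(effective.fg, Some(Color::Blue));
    assert_eq!(effective.bg, Some(Color::Black));
}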

View File

@@ -119,3 +119,128 @@ async fn insert_newline_continue_line_comment() -> anyhow::Result<()> {
Ok(())
}
/// NOTE: Language is set to markdown to check if the indentation is correct for the new line
#[tokio::test(flavor = "multi_thread")]
async fn test_open_above() -> anyhow::Result<()> {
// `O` is pressed in the first line
test((
indoc! {"Helix #[is|]# cool"},
":lang markdown<ret>O",
indoc! {"\
#[\n|]#
Helix is cool
"},
))
.await?;
// `O` is pressed in the first line, but the current line has some indentation
test((
indoc! {"\
··This line has 2 spaces in front of it#[\n|]#
"}
.replace('·', " "),
":lang markdown<ret>Oa",
indoc! {"\
··a#[\n|]#
··This line has 2 spaces in front of it
"}
.replace('·', " "),
))
.await?;
// `O` is pressed but *not* in the first line
test((
indoc! {"\
I use
b#[t|]#w.
"},
":lang markdown<ret>Oarch",
indoc! {"\
I use
arch#[\n|]#
btw.
"},
))
.await?;
// `O` is pressed but *not* in the first line and the line has some indentation
test((
indoc! {"\
I use
····b#[t|]#w.
"}
.replace("·", " "),
":lang markdown<ret>Ohelix",
indoc! {"\
I use
····helix#[\n|]#
····btw.
"}
.replace("·", " "),
))
.await?;
Ok(())
}
/// NOTE: To make the `open_above` comment-aware, we're setting the language for each test to rust.
#[tokio::test(flavor = "multi_thread")]
async fn test_open_above_with_comments() -> anyhow::Result<()> {
// `O` is pressed in the first line inside a line comment
test((
indoc! {"// a commen#[t|]#"},
":lang rust<ret>O",
indoc! {"\
// #[\n|]#
// a comment
"},
))
.await?;
// `O` is pressed in the first line inside a line comment, but with indentation
test((
indoc! {"····// a comm#[e|]#nt"}.replace("·", " "),
":lang rust<ret>O",
indoc! {"\
····// #[\n|]#
····// a comment
"}
.replace("·", " "),
))
.await?;
// `O` is pressed on a line other than the first, inside a line comment
test((
indoc! {"\
fn main() { }
// yeetus deletus#[\n|]#
"},
":lang rust<ret>O",
indoc! {"\
fn main() { }
// #[\n|]#
// yeetus deletus
"},
))
.await?;
// `O` is pressed on a line other than the first, inside an indented line comment
test((
indoc! {"\
fn main() { }
····// yeetus deletus#[\n|]#
"}
.replace("·", " "),
":lang rust<ret>O",
indoc! {"\
fn main() { }
····// #[\n|]#
····// yeetus deletus
"}
.replace("·", " "),
))
.await?;
Ok(())
}

View File

@@ -30,9 +30,7 @@ crossterm = { version = "0.28", optional = true }
tempfile = "3.14"
# Conversion traits
once_cell = "1.20"
url = "2.5.4"
arc-swap = { version = "1.7.1" }
@@ -40,7 +38,8 @@ tokio = { version = "1", features = ["rt", "rt-multi-thread", "io-util", "io-std
tokio-stream = "0.1"
futures-util = { version = "0.3", features = ["std", "async-await"], default-features = false }
slotmap.workspace = true
slotmap = "1"
chardetng = "0.1"
serde = { version = "1.0", features = ["derive"] }

View File

@@ -642,7 +642,6 @@ where
}
use helix_lsp::{lsp, Client, LanguageServerId, LanguageServerName};
use url::Url;
impl Document {
pub fn from(
@@ -1435,12 +1434,16 @@ impl Document {
// TODO: move to hook
// emit lsp notification
for language_server in self.language_servers() {
let _ = language_server.text_document_did_change(
let notify = language_server.text_document_did_change(
self.versioned_identifier(),
&old_doc,
self.text(),
changes,
);
if let Some(notify) = notify {
tokio::spawn(notify);
}
}
}
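
The change notification is now handed back to the caller as an optional future and spawned, instead of being fired and forgotten inside the client. A hedged sketch of that pattern (generic stand-in names; per the hunk above, the method returns `None` when there is nothing to send):

use std::future::Future;

// Detach an optional notification future from the edit hot path. Must be
// called from within a tokio runtime, as the surrounding editor code is.
fn spawn_if_some<F>(notify: Option<F>)
where
    F: Future<Output = ()> + Send + 'static,
{
    if let Some(fut) = notify {
        tokio::spawn(fut);
    }
}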
@@ -1757,25 +1760,6 @@ impl Document {
})
}
pub fn language_servers_with_feature_owned(
&self,
feature: LanguageServerFeature,
) -> impl Iterator<Item = Arc<helix_lsp::Client>> + '_ {
self.language_config().into_iter().flat_map(move |config| {
config.language_servers.iter().filter_map(move |features| {
let ls = self.language_servers.get(&features.name)?.clone();
if ls.is_initialized()
&& ls.supports_feature(feature)
&& features.has_feature(feature)
{
Some(ls)
} else {
None
}
})
})
}
pub fn supports_language_server(&self, id: LanguageServerId) -> bool {
self.language_servers().any(|l| l.id() == id)
}
@@ -1837,8 +1821,8 @@ impl Document {
}
/// File path as a URL.
pub fn url(&self) -> Option<Url> {
Url::from_file_path(self.path()?).ok()
pub fn url(&self) -> Option<lsp::Url> {
self.path().map(lsp::Url::from_file_path)
}
pub fn uri(&self) -> Option<helix_core::Uri> {
@@ -1924,7 +1908,7 @@ impl Document {
pub fn lsp_diagnostic_to_diagnostic(
text: &Rope,
language_config: Option<&LanguageConfiguration>,
diagnostic: &helix_lsp::lsp::Diagnostic,
diagnostic: &lsp::Diagnostic,
language_server_id: LanguageServerId,
offset_encoding: helix_lsp::OffsetEncoding,
) -> Option<Diagnostic> {

View File

@@ -306,6 +306,9 @@ pub struct Config {
/// Whether to instruct the LSP to replace the entire word when applying a completion
/// or to only insert new text
pub completion_replace: bool,
/// `true` if helix should automatically add a line comment token if you're currently in a comment
/// and press `enter`.
pub continue_comments: bool,
/// Whether to display infoboxes. Defaults to true.
pub auto_info: bool,
pub file_picker: FilePickerConfig,
@@ -987,6 +990,7 @@ impl Default for Config {
},
text_width: 80,
completion_replace: false,
continue_comments: true,
workspace_lsp_roots: Vec::new(),
default_line_ending: LineEndingConfig::default(),
insert_final_newline: true,
@@ -1733,10 +1737,14 @@ impl Editor {
Ok(doc_id)
}
pub fn document_id_by_path(&self, path: &Path) -> Option<DocumentId> {
self.document_by_path(path).map(|doc| doc.id)
}
// Note: possibly useful for integration tests.
pub fn open(&mut self, path: &Path, action: Action) -> Result<DocumentId, DocumentOpenError> {
let path = helix_stdx::path::canonicalize(path);
let id = self.document_by_path(&path).map(|doc| doc.id);
let id = self.document_id_by_path(&path);
let id = if let Some(id) = id {
id

View File

@@ -263,6 +263,31 @@ pub enum Color {
Indexed(u8),
}
impl Color {
/// Creates a `Color` from a hex string
///
/// # Examples
///
/// ```rust
/// use helix_view::theme::Color;
///
/// let color1 = Color::from_hex("#c0ffee").unwrap();
/// let color2 = Color::Rgb(192, 255, 238);
///
/// assert_eq!(color1, color2);
/// ```
pub fn from_hex(hex: &str) -> Option<Self> {
if !(hex.starts_with('#') && hex.len() == 7) {
return None;
}
match [1..=2, 3..=4, 5..=6].map(|i| hex.get(i).and_then(|c| u8::from_str_radix(c, 16).ok()))
{
[Some(r), Some(g), Some(b)] => Some(Self::Rgb(r, g, b)),
_ => None,
}
}
}
#[cfg(feature = "term")]
impl From<Color> for crossterm::style::Color {
fn from(color: Color) -> Self {
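
`from_hex` accepts exactly the `#RRGGBB` form and returns `None` for everything else. A few hedged negative cases to complement the doctest above:

use helix_view::theme::Color;

fn main() {
    assert_eq!(Color::from_hex("c0ffee"), None); // missing leading '#'
    assert_eq!(Color::from_hex("#c0ffe"), None); // too short
    assert_eq!(Color::from_hex("#c0ffzz"), None); // non-hex digits
}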

View File

@@ -57,7 +57,7 @@ pub struct ApplyEditError {
pub enum ApplyEditErrorKind {
DocumentChanged,
FileNotFound,
InvalidUrl(helix_core::uri::UrlConversionError),
InvalidUrl(helix_core::uri::UriParseError),
IoError(std::io::Error),
// TODO: check edits before applying and propagate failure
// InvalidEdit,
@@ -69,8 +69,8 @@ impl From<std::io::Error> for ApplyEditErrorKind {
}
}
impl From<helix_core::uri::UrlConversionError> for ApplyEditErrorKind {
fn from(err: helix_core::uri::UrlConversionError) -> Self {
impl From<helix_core::uri::UriParseError> for ApplyEditErrorKind {
fn from(err: helix_core::uri::UriParseError) -> Self {
ApplyEditErrorKind::InvalidUrl(err)
}
}
@@ -94,7 +94,7 @@ impl Editor {
text_edits: Vec<lsp::TextEdit>,
offset_encoding: OffsetEncoding,
) -> Result<(), ApplyEditErrorKind> {
let uri = match Uri::try_from(url) {
let uri = match Uri::try_from(url.as_str()) {
Ok(uri) => uri,
Err(err) => {
log::error!("{err}");
@@ -242,7 +242,7 @@ impl Editor {
// may no longer be valid.
match op {
ResourceOp::Create(op) => {
let uri = Uri::try_from(&op.uri)?;
let uri = Uri::try_from(op.uri.as_str())?;
let path = uri.as_path().expect("URIs are valid paths");
let ignore_if_exists = op.options.as_ref().map_or(false, |options| {
!options.overwrite.unwrap_or(false) && options.ignore_if_exists.unwrap_or(false)
@@ -262,7 +262,7 @@ impl Editor {
}
}
ResourceOp::Delete(op) => {
let uri = Uri::try_from(&op.uri)?;
let uri = Uri::try_from(op.uri.as_str())?;
let path = uri.as_path().expect("URIs are valid paths");
if path.is_dir() {
let recursive = op
@@ -284,9 +284,9 @@ impl Editor {
}
}
ResourceOp::Rename(op) => {
let from_uri = Uri::try_from(&op.old_uri)?;
let from_uri = Uri::try_from(op.old_uri.as_str())?;
let from = from_uri.as_path().expect("URIs are valid paths");
let to_uri = Uri::try_from(&op.new_uri)?;
let to_uri = Uri::try_from(op.new_uri.as_str())?;
let to = to_uri.as_path().expect("URIs are valid paths");
let ignore_if_exists = op.options.as_ref().map_or(false, |options| {
!options.overwrite.unwrap_or(false) && options.ignore_if_exists.unwrap_or(false)
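
With the String-backed URL type, a `Uri` is now parsed directly from `&str`, and failures surface as `UriParseError`. A small sketch, assuming the `TryFrom<&str>` impl used above accepts standard `file://` URIs:

use helix_core::Uri;

fn main() {
    let uri = Uri::try_from("file:///tmp/example.rs").expect("valid file URI");
    // File URIs resolve to local paths; other schemes would be rejected.
    assert!(uri.as_path().is_some());
}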

View File

@@ -162,7 +162,12 @@ pub(crate) mod keys {
impl fmt::Display for KeyEvent {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> std::fmt::Result {
f.write_fmt(format_args!(
"{}{}{}",
"{}{}{}{}",
if self.modifiers.contains(KeyModifiers::SUPER) {
"Meta-"
} else {
""
},
if self.modifiers.contains(KeyModifiers::SHIFT) {
"S-"
} else {
@@ -312,6 +317,10 @@ impl UnicodeWidthStr for KeyEvent {
if self.modifiers.contains(KeyModifiers::CONTROL) {
width += 2;
}
if self.modifiers.contains(KeyModifiers::SUPER) {
// "-Meta"
width += 5;
}
width
}
@@ -413,6 +422,7 @@ impl std::str::FromStr for KeyEvent {
"S" => KeyModifiers::SHIFT,
"A" => KeyModifiers::ALT,
"C" => KeyModifiers::CONTROL,
"Meta" | "Cmd" | "Win" => KeyModifiers::SUPER,
_ => return Err(anyhow!("Invalid key modifier '{}-'", token)),
};
@@ -733,6 +743,28 @@ mod test {
modifiers: KeyModifiers::NONE
}
);
assert_eq!(
str::parse::<KeyEvent>("Meta-c").unwrap(),
KeyEvent {
code: KeyCode::Char('c'),
modifiers: KeyModifiers::SUPER
}
);
assert_eq!(
str::parse::<KeyEvent>("Win-s").unwrap(),
KeyEvent {
code: KeyCode::Char('s'),
modifiers: KeyModifiers::SUPER
}
);
assert_eq!(
str::parse::<KeyEvent>("Cmd-d").unwrap(),
KeyEvent {
code: KeyCode::Char('d'),
modifiers: KeyModifiers::SUPER
}
);
}
#[test]

View File

@@ -7,6 +7,7 @@ bitflags! {
const SHIFT = 0b0000_0001;
const CONTROL = 0b0000_0010;
const ALT = 0b0000_0100;
const SUPER = 0b0000_1000;
const NONE = 0b0000_0000;
}
}
@@ -27,6 +28,9 @@ impl From<KeyModifiers> for crossterm::event::KeyModifiers {
if key_modifiers.contains(KeyModifiers::ALT) {
result.insert(CKeyModifiers::ALT);
}
if key_modifiers.contains(KeyModifiers::SUPER) {
result.insert(CKeyModifiers::SUPER);
}
result
}
@@ -48,6 +52,9 @@ impl From<crossterm::event::KeyModifiers> for KeyModifiers {
if val.contains(CKeyModifiers::ALT) {
result.insert(KeyModifiers::ALT);
}
if val.contains(CKeyModifiers::SUPER) {
result.insert(KeyModifiers::SUPER);
}
result
}
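
Together, the two hunks above thread the new SUPER modifier through parsing, display, width calculation, and the crossterm conversions. A hedged round-trip sketch (assuming `Display` prints a plain character key unchanged, as the tests above suggest):

#[test]
fn super_modifier_round_trip() {
    let key: helix_view::input::KeyEvent = "Cmd-s".parse().unwrap();
    // "Meta-", "Cmd-" and "Win-" all parse to SUPER; Display canonicalizes to "Meta-".
    assert_eq!(key.to_string(), "Meta-s");
}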

View File

@@ -53,9 +53,11 @@ jq-lsp = { command = "jq-lsp" }
jsonnet-language-server = { command = "jsonnet-language-server", args= ["-t", "--lint"] }
julia = { command = "julia", timeout = 60, args = [ "--startup-file=no", "--history-file=no", "--quiet", "-e", "using LanguageServer; runserver()", ] }
koka = { command = "koka", args = ["--language-server", "--lsstdio"] }
koto-ls = { command = "koto-ls" }
kotlin-language-server = { command = "kotlin-language-server" }
lean = { command = "lean", args = [ "--server", "--memory=1024" ] }
ltex-ls = { command = "ltex-ls" }
ltex-ls-plus = { command = "ltex-ls-plus" }
markdoc-ls = { command = "markdoc-ls", args = ["--stdio"] }
markdown-oxide = { command = "markdown-oxide" }
marksman = { command = "marksman", args = ["server"] }
@@ -1183,6 +1185,8 @@ file-types = ["java", "jav", "pde"]
roots = ["pom.xml", "build.gradle", "build.gradle.kts"]
language-servers = [ "jdtls" ]
indent = { tab-width = 2, unit = " " }
comment-tokens = ["//"]
block-comment-tokens = { start = "/*", end = "*/" }
[[grammar]]
name = "java"
@@ -1703,7 +1707,7 @@ language-servers = [ "docker-langserver" ]
[[grammar]]
name = "dockerfile"
source = { git = "https://github.com/camdencheek/tree-sitter-dockerfile", rev = "8ee3a0f7587b2bd8c45c8cb7d28bd414604aec62" }
source = { git = "https://github.com/camdencheek/tree-sitter-dockerfile", rev = "087daa20438a6cc01fa5e6fe6906d77c869d19fe" }
[[language]]
name = "docker-compose"
@@ -2286,7 +2290,7 @@ indent = { tab-width = 4, unit = "\t" }
[[grammar]]
name = "v"
source = {git = "https://github.com/v-analyzer/v-analyzer", subpath = "tree_sitter_v", rev = "e14fdf6e661b10edccc744102e4ccf0b187aa8ad"}
source = {git = "https://github.com/vlang/v-analyzer", subpath = "tree_sitter_v", rev = "e14fdf6e661b10edccc744102e4ccf0b187aa8ad"}
[[language]]
name = "verilog"
@@ -2468,7 +2472,7 @@ language-servers = [ "slint-lsp" ]
[[grammar]]
name = "slint"
source = { git = "https://github.com/slint-ui/tree-sitter-slint", rev = "34ccfd58d3baee7636f62d9326f32092264e8407" }
source = { git = "https://github.com/slint-ui/tree-sitter-slint", rev = "f11da7e62051ba8b9d4faa299c26de8aeedfc1cd" }
[[language]]
name = "task"
@@ -3949,7 +3953,7 @@ indent = { tab-width = 4, unit = " " }
[[grammar]]
name = "spade"
source = { git = "https://gitlab.com/spade-lang/tree-sitter-spade/", rev = "4d5b141017c61fe7e168e0a5c5721ee62b0d9572" }
source = { git = "https://gitlab.com/spade-lang/tree-sitter-spade", rev = "4d5b141017c61fe7e168e0a5c5721ee62b0d9572" }
[[language]]
name = "amber"
@@ -3962,6 +3966,20 @@ indent = { tab-width = 4, unit = " " }
name = "amber"
source = { git = "https://github.com/amber-lang/tree-sitter-amber", rev = "c6df3ec2ec243ed76550c525e7ac3d9a10c6c814" }
[[language]]
name = "koto"
scope = "source.koto"
injection-regex = "koto"
file-types = ["koto"]
comment-token = "#"
block-comment-tokens = ["#-", "-#"]
indent = { tab-width = 2, unit = " " }
language-servers = ["koto-ls"]
[[grammar]]
name = "koto"
source = { git = "https://github.com/koto-lang/tree-sitter-koto", rev = "b420f7922d0d74905fd0d771e5b83be9ee8a8a9a" }
[[language]]
name = "gpr"
scope = "source.gpr"

View File

@@ -21,10 +21,10 @@
; Error level tags
((tag (name) @error)
(#match? @error "^(BUG|FIXME|ISSUE|XXX|FIX|SAFETY|FIXIT|FAILED|DEBUG|INVARIANT)$"))
(#match? @error "^(BUG|FIXME|ISSUE|XXX|FIX|SAFETY|FIXIT|FAILED|DEBUG|INVARIANT|COMPLIANCE)$"))
("text" @error
(#match? @error "^(BUG|FIXME|ISSUE|XXX|FIX|SAFETY|FIXIT|FAILED|DEBUG|INVARIANT)$"))
(#match? @error "^(BUG|FIXME|ISSUE|XXX|FIX|SAFETY|FIXIT|FAILED|DEBUG|INVARIANT|COMPLIANCE)$"))
(tag
(name) @ui.text

View File

@@ -19,6 +19,8 @@
"SHELL"
"MAINTAINER"
"CROSS_BUILD"
(heredoc_marker)
(heredoc_end)
] @keyword
[
@@ -35,7 +37,12 @@
(image_digest
"@" @punctuation.special))
(double_quoted_string) @string
[
(double_quoted_string)
(single_quoted_string)
(json_string)
(heredoc_line)
] @string
(expansion
[

View File

@@ -0,0 +1,9 @@
[
(assign)
(comment)
(function)
(list)
(map)
(tuple)
(string)
] @fold

View File

@@ -0,0 +1,152 @@
[
"="
"+"
"-"
"*"
"/"
"%"
"+="
"-="
"*="
"/="
"%="
"=="
"!="
"<"
">"
"<="
">="
".."
"..="
"->"
(null_check)
] @operator
[
"let"
] @keyword
[
"and"
"not"
"or"
] @keyword.operator
[
"return"
"yield"
] @keyword.control.return
[
"if"
"then"
"else"
"else if"
"match"
"switch"
] @keyword.control.conditional
[
(break)
(continue)
"for"
"in"
"loop"
"until"
"while"
] @keyword.control.repeat
[
"throw"
"try"
"catch"
"finally"
] @keyword.control.exception
[
"export"
"from"
"import"
"as"
] @keyword.control.import
(string (interpolation ("{") @punctuation.special))
(string (interpolation ("}") @punctuation.special))
[
"("
")"
"["
"]"
"{"
"}"
"|"
] @punctuation.bracket
[
";"
":"
","
] @punctuation.delimiter
(import_module
(identifier) @module)
(import_item
(identifier) @module)
(export
(identifier) @module)
(call
function: (identifier) @function.method)
(chain
lookup: (identifier) @variable.other.member)
[
(true)
(false)
] @constant.builtin.boolean
(comment) @comment
(debug) @keyword
(string) @string
(fill_char) @punctuation.delimiter
(alignment) @operator
(escape) @constant.character.escape
(null) @constant.builtin
(number) @constant.numeric
(meta) @keyword.directive
(meta
name: (identifier) @variable.other.member)
(entry_inline
key: (identifier) @variable.other.member)
(entry_block
key: (identifier) @variable.other.member)
(self) @variable.builtin
(variable
type: (identifier) @type)
(arg
(_ (identifier) @variable.parameter))
(ellipsis) @variable.parameter
(function
output_type: (identifier) @type)
(identifier) @variable

View File

@@ -0,0 +1,61 @@
[
(list)
(map)
(tuple)
] @indent
[
(for)
(else_if)
(else)
(match)
(switch)
(until)
(while)
] @indent @extend
(assign
"=" @indent @extend
!rhs
)
(assign
"=" @indent @extend
rhs: (_) @anchor
(#not-same-line? @indent @anchor)
)
(if
condition: (_) @indent @extend
!then
)
(if
condition: (_) @indent @extend
then: (_) @anchor
(#not-same-line? @indent @anchor)
)
(function
(args) @indent @extend
!body
)
(function
(args) @indent @extend
body: (_) @anchor
(#not-same-line? @indent @anchor)
)
(match_arm
"then" @indent @extend
!then
)
(match_arm
"then" @indent @extend
then: (_) @anchor
(#not-same-line? @indent @anchor)
)
[
"}"
"]"
")"
] @outdent

View File

@@ -0,0 +1,2 @@
((comment) @injection.content
(#set! injection.language "comment"))

View File

@@ -0,0 +1,30 @@
; Scopes
(module (_) @local.scope)
(function
body: (_) @local.scope)
; Definitions
(assign
lhs: (identifier) @local.definition)
(variable
(identifier) @local.definition)
(arg
(identifier) @local.definition)
(arg
(variable (identifier)) @local.definition)
(import_item
(identifier) @local.definition)
(entry_block
(identifier) @local.definition)
(entry_inline
(identifier) @local.definition)
; References
(identifier) @local.reference

View File

@@ -0,0 +1,38 @@
(comment) @comment.inside
(comment)+ @comment.around
(function
body: (_) @function.inside) @function.around
(args
((arg) @parameter.inside . ","? @parameter.around) @parameter.around)
(call_args
((call_arg) @parameter.inside . ","? @parameter.around) @parameter.around)
(chain
call: (tuple
((element) @parameter.inside . ","? @parameter.around) @parameter.around))
(map
((entry_inline) @entry.inside . ","? @entry.around) @entry.around)
(map_block
((entry_block) @entry.inside) @entry.around)
(list
((element) @entry.inside . ","? @entry.around) @entry.around)
(tuple
(_) @entry.around)
(assign
(meta (test))
(function body: (_) @test.inside)
) @test.around
(entry_block
key: (meta (test))
value: (function body: (_) @test.inside)
) @test.around

View File

@@ -0,0 +1,22 @@
(procedure_declaration (identifier) (procedure (block) @function.inside)) @function.around
(procedure_declaration (identifier) (procedure (uninitialized) @function.inside)) @function.around
(overloaded_procedure_declaration (identifier) @function.inside) @function.around
(procedure_type (parameters (parameter (identifier) @parameter.inside) @parameter.around))
(procedure (parameters (parameter (identifier) @parameter.inside) @parameter.around))
((procedure_declaration
(attributes (attribute "@" "(" (identifier) @attr_name ")"))
(identifier) (procedure (block) @test.inside)) @test.around
(#match? @attr_name "test"))
(comment) @comment.inside
(comment)+ @comment.around
(block_comment) @comment.inside
(block_comment)+ @comment.around
(struct_declaration (identifier) "::") @class.around
(enum_declaration (identifier) "::") @class.around
(union_declaration (identifier) "::") @class.around
(bit_field_declaration (identifier) "::") @class.around
(const_declaration (identifier) "::" [(array_type) (distinct_type) (bit_set_type) (pointer_type)]) @class.around

View File

@@ -14,4 +14,5 @@
[
"}"
")"
"]"
] @outdent

View File

@@ -21,7 +21,7 @@
"variable" = "text"
"variable.parameter" = { fg = "maroon", modifiers = ["italic"] }
"variable.builtin" = "red"
"variable.other.member" = "teal"
"variable.other.member" = "blue"
"label" = "sapphire" # used for lifetimes
@@ -50,10 +50,10 @@
"markup.heading.5" = "pink"
"markup.heading.6" = "teal"
"markup.list" = "mauve"
"markup.bold" = { modifiers = ["bold"] }
"markup.italic" = { modifiers = ["italic"] }
"markup.list.unchecked" = "overlay2"
"markup.list.checked" = "green"
"markup.bold" = { modifiers = ["bold"] }
"markup.italic" = { modifiers = ["italic"] }
"markup.link.url" = { fg = "blue", modifiers = ["italic", "underlined"] }
"markup.link.text" = "blue"
"markup.raw" = "flamingo"
@@ -86,6 +86,7 @@
"ui.text" = "text"
"ui.text.focus" = { fg = "text", bg = "surface0", modifiers = ["bold"] }
"ui.text.inactive" = { fg = "overlay1" }
"ui.text.directory" = { fg = "blue" }
"ui.virtual" = "overlay0"
"ui.virtual.ruler" = { bg = "surface0" }

View File

@@ -72,11 +72,13 @@
"ui.bufferline.background" = { bg = "background" }
"ui.text" = { fg = "text" }
"ui.text.focus" = { fg = "white" }
"ui.text.directory" = { fg = "blue3" }
"ui.text.inactive" = { fg = "dark_gray" }
"ui.virtual.whitespace" = { fg = "#3e3e3d" }
"ui.virtual.ruler" = { bg = "borders" }
"ui.virtual.indent-guide" = { fg = "dark_gray4" }
"ui.virtual.inlay-hint" = { fg = "dark_gray5"}
"ui.virtual.jump-label" = { fg = "dark_gray", modifiers = ["bold"] }
"ui.virtual.jump-label" = { fg = "yellow", modifiers = ["bold"] }
"ui.highlight.frameline" = { bg = "#4b4b18" }
"ui.debug.active" = { fg = "#ffcc00" }
"ui.debug.breakpoint" = { fg = "#e51400" }

View File

@@ -118,6 +118,7 @@
"ui.statusline.select" = { fg = "black", bg = "cyan", modifiers = ["bold"] }
"ui.text" = { fg = "foreground" }
"ui.text.focus" = { fg = "cyan" }
"ui.text.directory" = { fg = "cyan" }
"ui.virtual.indent-guide" = { fg = "indent" }
"ui.virtual.inlay-hint" = { fg = "cyan" }
"ui.virtual.inlay-hint.parameter" = { fg = "cyan", modifiers = ["italic", "dim"] }

View File

@@ -42,6 +42,7 @@
"ui.statusline.select" = { fg = "background_dark", bg = "purple" }
"ui.text" = { fg = "foreground" }
"ui.text.focus" = { fg = "cyan" }
"ui.text.directory" = { fg = "cyan" }
"ui.window" = { fg = "foreground" }
"ui.virtual.jump-label" = { fg = "pink", modifiers = ["bold"] }
"ui.virtual.ruler" = { bg = "background_dark" }

View File

@@ -94,6 +94,7 @@
"ui.window" = { fg = "bg4", bg = "bg_dim" }
"ui.help" = { fg = "fg", bg = "bg2" }
"ui.text" = "fg"
"ui.text.directory" = { fg = "green" }
"ui.text.focus" = "fg"
"ui.menu" = { fg = "fg", bg = "bg3" }
"ui.menu.selected" = { fg = "bg0", bg = "green" }

View File

@@ -93,6 +93,7 @@
"ui.window" = { fg = "bg4", bg = "bg_dim" }
"ui.help" = { fg = "fg", bg = "bg2" }
"ui.text" = "fg"
"ui.text.directory" = { fg = "green" }
"ui.text.focus" = "fg"
"ui.menu" = { fg = "fg", bg = "bg3" }
"ui.menu.selected" = { fg = "bg0", bg = "green" }

View File

@@ -59,6 +59,7 @@ label = "scale.red.3"
"ui.text" = { fg = "fg.muted" }
"ui.text.focus" = { fg = "fg.default" }
"ui.text.inactive" = "fg.subtle"
"ui.text.directory" = { fg = "scale.blue.2" }
"ui.virtual" = { fg = "scale.gray.6" }
"ui.virtual.ruler" = { bg = "canvas.subtle" }
"ui.virtual.jump-label" = { fg = "scale.red.2", modifiers = ["bold"] }

View File

@@ -59,6 +59,7 @@ label = "scale.red.5"
"ui.text" = { fg = "fg.muted" }
"ui.text.focus" = { fg = "fg.default" }
"ui.text.inactive" = "fg.subtle"
"ui.text.directory" = { fg = "scale.blue.4" }
"ui.virtual" = { fg = "scale.gray.2" }
"ui.virtual.ruler" = { bg = "canvas.subtle" }

View File

@@ -106,6 +106,7 @@
"ui.statusline.select" = { fg = "bg1", bg = "orange1", modifiers = ["bold"] }
"ui.text" = { fg = "fg1" }
"ui.text.directory" = { fg = "blue1" }
"ui.virtual.inlay-hint" = { fg = "gray" }
"ui.virtual.jump-label" = { fg = "purple0", modifiers = ["bold"] }
"ui.virtual.ruler" = { bg = "bg1" }

View File

@@ -60,6 +60,7 @@
"ui.background" = { bg = "bg", fg = "text" }
"ui.text" = { fg = "text" }
"ui.text.directory" = { fg = "blue" }
"ui.statusline" = { bg = "bg", fg = "text" }
"ui.statusline.inactive" = { bg = "bg", fg = "disabled" }

View File

@@ -36,6 +36,7 @@
"ui.text" = { fg = "text" }
"ui.text.focus" = { bg = "overlay" }
"ui.text.info" = { fg = "subtle" }
"ui.text.directory" = { fg = "iris" }
"ui.virtual.jump-label" = { fg = "love", modifiers = ["bold"] }
"ui.virtual.ruler" = { bg = "overlay" }

View File

@@ -89,6 +89,7 @@ hint = { fg = "hint" }
"ui.text.focus" = { bg = "bg-focus" }
"ui.text.inactive" = { fg = "comment", modifiers = ["italic"] }
"ui.text.info" = { bg = "bg-menu", fg = "fg" }
"ui.text.directory" = { fg = "cyan" }
"ui.virtual.ruler" = { bg = "fg-gutter" }
"ui.virtual.whitespace" = { fg = "fg-gutter" }
"ui.virtual.inlay-hint" = { bg = "bg-inlay", fg = "teal" }

View File

@@ -56,6 +56,7 @@ tabstop = { modifiers = ["italic"], bg = "bossanova" }
"ui.text" = { fg = "lavender" }
"ui.text.focus" = { fg = "white" }
"ui.text.inactive" = "sirocco"
"ui.text.directory" = { fg = "lilac" }
"ui.virtual" = { fg = "comet" }
"ui.virtual.ruler" = { bg = "bossanova" }
"ui.virtual.jump-label" = { fg = "apricot", modifiers = ["bold"] }