Compare commits

...

44 Commits

Author SHA1 Message Date
Michael Davis
61491af15e Remove unused Result wrapper for Path->Url conversion 2024-12-20 13:33:47 -05:00
Michael Davis
a36806e326 Handle conversion to/from new LSP URL type 2024-12-20 13:33:46 -05:00
Michael Davis
b84c9a893c Replace url::Url with a String wrapper 2024-12-20 13:33:46 -05:00
Michael Davis
652e316925 LSP: Use PathBufs for workspace folders
Internally the LSP client should hold workspace folders as paths. Using
URLs for this type is inconvenient (since we compare it to paths) and
might cause mismatches because of URLs not being normalized. The URLs
must be paths anyways so we can convert these types lazily when we need
to send them to a server.
2024-12-20 13:33:46 -05:00
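A sketch of that lazy conversion (the client type here is hypothetical; `url::Url::from_file_path` is one way to perform it):

    use std::path::PathBuf;

    struct Client {
        workspace_folders: Vec<PathBuf>,
    }

    impl Client {
        // convert to URLs only at the protocol boundary
        fn workspace_folders_for_server(&self) -> Vec<url::Url> {
            self.workspace_folders
                .iter()
                .filter_map(|path| url::Url::from_file_path(path).ok())
                .collect()
        }
    }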
Nikita Revenco
ba6e6dc3dd Colors for items in the completion menu (#12299) 2024-12-20 10:16:15 -06:00
cornishon
a91263d604 Odin textobjects (#12302)
Co-authored-by: Adam Zadrożny <zadroznyadam@protonmail.com>
2024-12-20 09:59:28 -06:00
Ian Hobson
06d0f33c94 Add Koto language support (#12307) 2024-12-20 09:56:13 -06:00
Eduardo Rittner Coelho
eaff0c3cd6 Document diagnostic severity levels (#12306) 2024-12-20 09:47:06 -06:00
uncenter
1e9412269a Sync Catppuccin theme changes (#12304) 2024-12-20 09:43:45 -06:00
Nikita Revenco
355e381626 feat: use ui.text.directory for path completion item if its a folder (#12295) 2024-12-19 14:36:54 -06:00
Tobias Hunger
cbc06d1f15 chore: Update slint tree-sitter grammar to version 1.9 (#12297) 2024-12-19 10:16:12 -06:00
Eduardo Rittner Coelho
9e4da4b950 Show parser availability in --health [LANG] (#12228) 2024-12-18 11:21:58 -06:00
Christian Schneider
13e5a2ee5a Outdent array literals for php [] (#12286)
Co-authored-by: Christian Schneider <schneider@search.ch>
2024-12-18 08:52:20 -06:00
David Else
0134bb7063 Update dark_plus theme for inactive text and improve jump label (#12289) 2024-12-18 08:32:41 -06:00
Peter Ingram
ec65cc4913 Adds colored directories to everforest themes (#12287)
Co-authored-by: Peter Ingram <p.ingram@mrx.technology>
2024-12-18 08:31:40 -06:00
Nikita Revenco
91a5d407da Allow theming directory prompt completions (#12205) 2024-12-17 21:13:42 -06:00
Michael Davis
6eb186eb7b helix-lsp-types: use bitflags::bitflags rather than extern crate
This seems to be a historical artifact in `lsp_types` - we can use a
regular `use` statement to pull in the `bitflags!` macro rather than
an external crate definition. This fixes rust-analyzer's ability to find
the macro at least on rust-analyzer 2024-02-26.
2024-12-17 15:42:36 -05:00
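Roughly the shape of the change (flag names illustrative):

    // before: 2015-edition style, importing the macro crate-wide
    // #[macro_use]
    // extern crate bitflags;

    // after: an ordinary `use` where the macro is needed
    use bitflags::bitflags;

    bitflags! {
        struct WatchKind: u8 {
            const CREATE = 0b001;
            const CHANGE = 0b010;
            const DELETE = 0b100;
        }
    }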
Michael Davis
1980bd5992 helix-lsp-types: Prefer crate::Url to url::Url
This is a cosmetic change to replace all direct `use`s of the `url::Url`
type in the `helix-lsp-types` crate with `use crate::Url;`. The types
are the same type currently: this refactor will make a future
replacement of the Url type less noisy.

Connects https://github.com/helix-editor/helix/pull/11889
2024-12-17 15:42:28 -05:00
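The pattern is a plain re-export at the crate root, so call sites keep compiling when the underlying type is later swapped out:

    // lib.rs
    pub use url::Url;

    // any other module in the crate
    use crate::Url;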
Tim Sampson
cc3b77b584 dockerfile: bump tree-sitter grammar to gain support for heredocs (#12230) 2024-12-17 13:26:49 -06:00
Christian Schneider
fcded6ce1e Trim trailing colons from paths to allow copy/pasting git grep -n output (#9963)
Co-authored-by: Christian Schneider <schneider@search.ch>
2024-12-17 13:02:06 -06:00
Pascal Kuthe
1badd9e434 implement snippet tabstop support 2024-12-17 13:34:40 -05:00
Pascal Kuthe
66fb1e67c0 add fallback onNextKey
adds a variant of on_next_key callbacks that are only called when no other
mapping matches a key
2024-12-17 13:34:40 -05:00
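A sketch of the distinction (names illustrative, not taken from the diff):

    enum OnKeyCallbackKind {
        // consumes the next key press unconditionally
        PseudoPending,
        // consulted only when the key matched no other mapping
        Fallback,
    }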
Pascal Kuthe
609c29bf7e add DocumentFocusLost event 2024-12-17 13:34:40 -05:00
Pascal Kuthe
5537e68b5e add changes and ghost_transaction to DocumentDidChange events 2024-12-17 13:34:40 -05:00
Pascal Kuthe
c8c0d04168 add snippet system to helix core 2024-12-17 13:34:39 -05:00
Pascal Kuthe
db959274d4 Add range type to helix stdx 2024-12-17 13:34:39 -05:00
dependabot[bot]
312c64f0c2 build(deps): bump the rust-dependencies group with 10 updates (#12277)
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
2024-12-16 18:48:13 -06:00
André Sá
67535804a5 Fix build from source with Spade tree-sitter grammar (#12276) 2024-12-16 14:44:28 -06:00
Michael Davis
bae6a58c3c Add block-comment-tokens configuration for Java
Ref https://github.com/helix-editor/helix/pull/12266#issuecomment-2546370787
2024-12-16 14:02:35 -05:00
Integral
250d9fa8fe Avoid allocating the --help message (#12243) 2024-12-16 11:16:48 -06:00
Aaalibaba
3b36cf1a15 Expand tildes in :read command (#12271) 2024-12-16 11:10:35 -06:00
Nikita Revenco
99fdbce566 docs: remove mention that - requires special handling (#12250) 2024-12-16 10:01:14 -06:00
David Else
9b14750e56 Add ltex-ls-plus language server (#12251) 2024-12-16 09:37:49 -06:00
TornaxO7
4e5b0644a2 language: add comment token for java files (#12266) 2024-12-16 09:24:04 -06:00
Takumi Matsuura
e14c346ee7 Fix panic in kill_to_end_of_line when handling multibyte characters (#12237) 2024-12-13 14:04:52 -06:00
RoloEdits
617f538d41 feat(highlights): add COMPLIANCE to error (#12244) 2024-12-13 13:26:08 -06:00
Yuki Kobayashi
ce133a2889 languages(v): use vlang/v-analyzer instead of v-analyzer/v-analyzer (#12236)
* use vlang/v-analyzer instead of v-analyzer/v-analyzer

* revert rev, because CI failed (couldn't repro working query-check locally, so not sure if this will work)
2024-12-13 12:09:24 +09:00
TornaxO7
89a7cde2f0 Fix continuing comment token for first line (#12215) 2024-12-10 13:24:34 -06:00
TornaxO7
51ac3e05e0 doc: fix default value in doc for continue-comments (#12235) 2024-12-10 13:19:31 -06:00
TornaxO7
5005c14e99 Add config option for continue commenting (#12213)
Co-authored-by: Michael Davis <mcarsondavis@gmail.com>
2024-12-09 17:31:41 -06:00
Michael Davis
2f74530328 helix-lsp-types: Remove Cargo.lock
This lockfile is unused since this crate was added to the workspace and
can be removed.

Closes #12227
2024-12-09 17:14:38 -05:00
Tshepang Mbambo
a1a5faebef typo (#12224) 2024-12-09 12:23:30 -06:00
Nikita Revenco
db1d84256f fix: report correct amount of files opened and improved error message when Helix can't parse directory as file (#12199)
* feat: improve information on the amount of files loaded

* refactor: naming consistency Doc and not Buf

* fix: correct name of method

* chore: appease clippy

* feat: more human error information when Helix cannot start

* refactor: use if guard on match arm
2024-12-08 20:14:29 +09:00
Michael Davis
271c32f2e6 Support bindings with the Super (Cmd/Win/Meta) modifier (#6592)
Terminals which support the enhanced keyboard protocol send events for
keys pressed with the Super modifier (Windows/Linux key or the Command
key). The only changes that are needed to support this in Helix are:

* Mapping the modifier from crossterm's KeyModifiers to Helix's
  KeyModifiers.
* Representing and parsing the modifier from the KeyEvent text
  representation.
* Documenting the ability to remap it.

When writing keybindings, use 'Meta-', 'Cmd-' or 'Win-' which are all
synonymous. For example:

    [keys.normal]
    Cmd-s = ":write"

will trigger for the Windows or Linux keys and the Command key plus 's'.
2024-12-08 12:35:14 +09:00
87 changed files with 3684 additions and 1878 deletions

Cargo.lock (generated)
View File

@@ -68,9 +68,9 @@ dependencies = [
[[package]]
name = "anyhow"
version = "1.0.93"
version = "1.0.94"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "4c95c10ba0b00a02636238b814946408b1322d5ac4760326e6fb8ec956d85775"
checksum = "c1fd03a028ef38ba2276dce7e33fcd6369c158a1bca17946c4b1b701891c1ff7"
[[package]]
name = "arc-swap"
@@ -136,9 +136,9 @@ checksum = "df8670b8c7b9dae1793364eafadf7239c40d669904660c5960d74cfd80b46a53"
[[package]]
name = "cc"
version = "1.2.2"
version = "1.2.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f34d93e62b03caf570cccc334cbc6c2fceca82f39211051345108adcba3eebdc"
checksum = "9157bbaa6b165880c27a4293a474c91cdcf265cc68cc829bf10be0964a391caf"
dependencies = [
"shlex",
]
@@ -162,9 +162,9 @@ dependencies = [
[[package]]
name = "chrono"
version = "0.4.38"
version = "0.4.39"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "a21f936df1771bf62b77f047b726c4625ff2e8aa607c01ec06e5a05bd8463401"
checksum = "7e36cc9d416881d2e24f9a963be5fb1cd90966419ac844274161d10488b3e825"
dependencies = [
"android-tzdata",
"iana-time-zone",
@@ -327,12 +327,12 @@ checksum = "5443807d6dff69373d433ab9ef5378ad8df50ca6298caf15de6e52e24aaf54d5"
[[package]]
name = "errno"
version = "0.3.9"
version = "0.3.10"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "534c5cf6194dfab3db3242765c03bbe257cf92f22b38f6bc0c58d59108a820ba"
checksum = "33d852cb9b869c2a9b3df2f71a3074817f01e1844f839a144f5fcef059a4eb5d"
dependencies = [
"libc",
"windows-sys 0.52.0",
"windows-sys 0.59.0",
]
[[package]]
@@ -369,9 +369,9 @@ checksum = "e8c02a5121d4ea3eb16a80748c74f5549a5665e4c21333c6098f283870fbdea6"
[[package]]
name = "fern"
version = "0.7.0"
version = "0.7.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "69ff9c9d5fb3e6da8ac2f77ab76fe7e8087d512ce095200f8f29ac5b656cf6dc"
checksum = "4316185f709b23713e41e3195f90edef7fb00c3ed4adc79769cf09cc762a3b29"
dependencies = [
"log",
]
@@ -522,7 +522,7 @@ dependencies = [
"gix-worktree",
"once_cell",
"smallvec",
"thiserror 2.0.3",
"thiserror 2.0.7",
]
[[package]]
@@ -535,7 +535,7 @@ dependencies = [
"gix-date",
"gix-utils",
"itoa",
"thiserror 2.0.3",
"thiserror 2.0.7",
"winnow",
]
@@ -552,7 +552,7 @@ dependencies = [
"gix-trace",
"kstring",
"smallvec",
"thiserror 2.0.3",
"thiserror 2.0.7",
"unicode-bom",
]
@@ -562,7 +562,7 @@ version = "0.2.13"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "d48b897b4bbc881aea994b4a5bbb340a04979d7be9089791304e04a9fbc66b53"
dependencies = [
"thiserror 2.0.3",
"thiserror 2.0.7",
]
[[package]]
@@ -571,7 +571,7 @@ version = "0.4.10"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "c6ffbeb3a5c0b8b84c3fe4133a6f8c82fa962f4caefe8d0762eced025d3eb4f7"
dependencies = [
"thiserror 2.0.3",
"thiserror 2.0.7",
]
[[package]]
@@ -597,7 +597,7 @@ dependencies = [
"gix-features",
"gix-hash",
"memmap2",
"thiserror 2.0.3",
"thiserror 2.0.7",
]
[[package]]
@@ -616,7 +616,7 @@ dependencies = [
"memchr",
"once_cell",
"smallvec",
"thiserror 2.0.3",
"thiserror 2.0.7",
"unicode-bom",
"winnow",
]
@@ -631,7 +631,7 @@ dependencies = [
"bstr",
"gix-path",
"libc",
"thiserror 2.0.3",
"thiserror 2.0.7",
]
[[package]]
@@ -643,7 +643,7 @@ dependencies = [
"bstr",
"itoa",
"jiff",
"thiserror 2.0.3",
"thiserror 2.0.7",
]
[[package]]
@@ -664,7 +664,7 @@ dependencies = [
"gix-traverse",
"gix-worktree",
"imara-diff",
"thiserror 2.0.3",
"thiserror 2.0.7",
]
[[package]]
@@ -684,7 +684,7 @@ dependencies = [
"gix-trace",
"gix-utils",
"gix-worktree",
"thiserror 2.0.3",
"thiserror 2.0.7",
]
[[package]]
@@ -700,7 +700,7 @@ dependencies = [
"gix-path",
"gix-ref",
"gix-sec",
"thiserror 2.0.3",
"thiserror 2.0.7",
]
[[package]]
@@ -718,7 +718,7 @@ dependencies = [
"once_cell",
"prodash",
"sha1_smol",
"thiserror 2.0.3",
"thiserror 2.0.7",
"walkdir",
]
@@ -740,7 +740,7 @@ dependencies = [
"gix-trace",
"gix-utils",
"smallvec",
"thiserror 2.0.3",
"thiserror 2.0.7",
]
[[package]]
@@ -773,7 +773,7 @@ source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "0b5eccc17194ed0e67d49285e4853307e4147e95407f91c1c3e4a13ba9f4e4ce"
dependencies = [
"faster-hex",
"thiserror 2.0.3",
"thiserror 2.0.7",
]
[[package]]
@@ -825,7 +825,7 @@ dependencies = [
"memmap2",
"rustix",
"smallvec",
"thiserror 2.0.3",
"thiserror 2.0.7",
]
[[package]]
@@ -855,7 +855,7 @@ dependencies = [
"gix-validate",
"itoa",
"smallvec",
"thiserror 2.0.3",
"thiserror 2.0.7",
"winnow",
]
@@ -877,7 +877,7 @@ dependencies = [
"gix-quote",
"parking_lot",
"tempfile",
"thiserror 2.0.3",
"thiserror 2.0.7",
]
[[package]]
@@ -895,7 +895,7 @@ dependencies = [
"gix-path",
"memmap2",
"smallvec",
"thiserror 2.0.3",
"thiserror 2.0.7",
]
[[package]]
@@ -907,7 +907,7 @@ dependencies = [
"bstr",
"faster-hex",
"gix-trace",
"thiserror 2.0.3",
"thiserror 2.0.7",
]
[[package]]
@@ -920,7 +920,7 @@ dependencies = [
"gix-trace",
"home",
"once_cell",
"thiserror 2.0.3",
"thiserror 2.0.7",
]
[[package]]
@@ -935,7 +935,7 @@ dependencies = [
"gix-config-value",
"gix-glob",
"gix-path",
"thiserror 2.0.3",
"thiserror 2.0.7",
]
[[package]]
@@ -946,7 +946,7 @@ checksum = "64a1e282216ec2ab2816cd57e6ed88f8009e634aec47562883c05ac8a7009a63"
dependencies = [
"bstr",
"gix-utils",
"thiserror 2.0.3",
"thiserror 2.0.7",
]
[[package]]
@@ -966,7 +966,7 @@ dependencies = [
"gix-utils",
"gix-validate",
"memmap2",
"thiserror 2.0.3",
"thiserror 2.0.7",
"winnow",
]
@@ -981,7 +981,7 @@ dependencies = [
"gix-revision",
"gix-validate",
"smallvec",
"thiserror 2.0.3",
"thiserror 2.0.7",
]
[[package]]
@@ -996,7 +996,7 @@ dependencies = [
"gix-hash",
"gix-object",
"gix-revwalk",
"thiserror 2.0.3",
"thiserror 2.0.7",
]
[[package]]
@@ -1011,7 +1011,7 @@ dependencies = [
"gix-hashtable",
"gix-object",
"smallvec",
"thiserror 2.0.3",
"thiserror 2.0.7",
]
[[package]]
@@ -1046,7 +1046,7 @@ dependencies = [
"gix-pathspec",
"gix-worktree",
"portable-atomic",
"thiserror 2.0.3",
"thiserror 2.0.7",
]
[[package]]
@@ -1061,7 +1061,7 @@ dependencies = [
"gix-pathspec",
"gix-refspec",
"gix-url",
"thiserror 2.0.3",
"thiserror 2.0.7",
]
[[package]]
@@ -1098,7 +1098,7 @@ dependencies = [
"gix-object",
"gix-revwalk",
"smallvec",
"thiserror 2.0.3",
"thiserror 2.0.7",
]
[[package]]
@@ -1110,7 +1110,7 @@ dependencies = [
"bstr",
"gix-features",
"gix-path",
"thiserror 2.0.3",
"thiserror 2.0.7",
"url",
]
@@ -1132,7 +1132,7 @@ source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "cd520d09f9f585b34b32aba1d0b36ada89ab7fefb54a8ca3fe37fc482a750937"
dependencies = [
"bstr",
"thiserror 2.0.3",
"thiserror 2.0.7",
]
[[package]]
@@ -1219,6 +1219,7 @@ name = "helix-core"
version = "24.7.0"
dependencies = [
"ahash",
"anyhow",
"arc-swap",
"bitflags",
"chrono",
@@ -1228,6 +1229,7 @@ dependencies = [
"globset",
"hashbrown",
"helix-loader",
"helix-parsec",
"helix-stdx",
"imara-diff",
"indoc",
@@ -1235,8 +1237,10 @@ dependencies = [
"nucleo",
"once_cell",
"parking_lot",
"percent-encoding",
"quickcheck",
"regex",
"regex-cursor",
"ropey",
"serde",
"serde_json",
@@ -1249,7 +1253,6 @@ dependencies = [
"unicode-general-category",
"unicode-segmentation",
"unicode-width",
"url",
]
[[package]]
@@ -1263,7 +1266,7 @@ dependencies = [
"log",
"serde",
"serde_json",
"thiserror 2.0.3",
"thiserror 2.0.7",
"tokio",
]
@@ -1319,7 +1322,7 @@ dependencies = [
"serde",
"serde_json",
"slotmap",
"thiserror 2.0.3",
"thiserror 2.0.7",
"tokio",
"tokio-stream",
]
@@ -1329,10 +1332,10 @@ name = "helix-lsp-types"
version = "0.95.1"
dependencies = [
"bitflags",
"percent-encoding",
"serde",
"serde_json",
"serde_repr",
"url",
]
[[package]]
@@ -1394,7 +1397,7 @@ dependencies = [
"smallvec",
"tempfile",
"termini",
"thiserror 2.0.3",
"thiserror 2.0.7",
"tokio",
"tokio-stream",
"toml",
@@ -1461,11 +1464,10 @@ dependencies = [
"serde_json",
"slotmap",
"tempfile",
"thiserror 2.0.3",
"thiserror 2.0.7",
"tokio",
"tokio-stream",
"toml",
"url",
]
[[package]]
@@ -1757,9 +1759,9 @@ dependencies = [
[[package]]
name = "libc"
version = "0.2.167"
version = "0.2.168"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "09d6582e104315a817dff97f75133544b2e094ee22447d2acf4a74e189ba06fc"
checksum = "5aaeb2981e0606ca11d79718f8bb01164f1d6ed75080182d3abf017e6d244b6d"
[[package]]
name = "libloading"
@@ -2130,15 +2132,15 @@ checksum = "719b953e2095829ee67db738b3bfa9fa368c94900df327b3f07fe6e794d2fe1f"
[[package]]
name = "rustix"
version = "0.38.41"
version = "0.38.42"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "d7f649912bc1495e167a6edee79151c84b1bad49748cb4f1f1167f459f6224f6"
checksum = "f93dc38ecbab2eb790ff964bb77fa94faf256fd3e73285fd7ba0903b76bedb85"
dependencies = [
"bitflags",
"errno",
"libc",
"linux-raw-sys",
"windows-sys 0.52.0",
"windows-sys 0.59.0",
]
[[package]]
@@ -2164,18 +2166,18 @@ checksum = "94143f37725109f92c262ed2cf5e59bce7498c01bcc1502d7b9afe439a4e9f49"
[[package]]
name = "serde"
version = "1.0.215"
version = "1.0.216"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "6513c1ad0b11a9376da888e3e0baa0077f1aed55c17f50e7b2397136129fb88f"
checksum = "0b9781016e935a97e8beecf0c933758c97a5520d32930e460142b4cd80c6338e"
dependencies = [
"serde_derive",
]
[[package]]
name = "serde_derive"
version = "1.0.215"
version = "1.0.216"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ad1e866f866923f252f05c889987993144fb74e722403468a4ebd70c3cd756c0"
checksum = "46f859dbbf73865c6627ed570e78961cd3ac92407a2d117204c49232485da55e"
dependencies = [
"proc-macro2",
"quote",
@@ -2409,11 +2411,11 @@ dependencies = [
[[package]]
name = "thiserror"
version = "2.0.3"
version = "2.0.7"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "c006c85c7651b3cf2ada4584faa36773bd07bac24acfb39f3c431b36d7e667aa"
checksum = "93605438cbd668185516ab499d589afb7ee1859ea3d5fc8f6b0755e1c7443767"
dependencies = [
"thiserror-impl 2.0.3",
"thiserror-impl 2.0.7",
]
[[package]]
@@ -2429,9 +2431,9 @@ dependencies = [
[[package]]
name = "thiserror-impl"
version = "2.0.3"
version = "2.0.7"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "f077553d607adc1caf65430528a576c757a71ed73944b66ebb58ef2bbd243568"
checksum = "e1d8749b4531af2117677a5fcd12b1348a3fe2b81e36e61ffeac5c4aa3273e36"
dependencies = [
"proc-macro2",
"quote",
@@ -2474,9 +2476,9 @@ checksum = "1f3ccbac311fea05f86f61904b462b55fb3df8837a366dfc601a0161d0532f20"
[[package]]
name = "tokio"
version = "1.41.1"
version = "1.42.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "22cfb5bee7a6a52939ca9224d6ac897bb669134078daa8735560897f69de4d33"
checksum = "5cec9b21b0450273377fc97bd4c33a8acffc8c996c987a7c5b319a0083707551"
dependencies = [
"backtrace",
"bytes",
@@ -2503,9 +2505,9 @@ dependencies = [
[[package]]
name = "tokio-stream"
version = "0.1.16"
version = "0.1.17"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "4f4e6ce100d0eb49a2734f8c0812bcd324cf357d21810932c5df6b96ef2b86f1"
checksum = "eca58d7bba4a75707817a2c44174253f9236b2d5fbd055602e9d5c07c139a047"
dependencies = [
"futures-core",
"pin-project-lite",
@@ -2619,7 +2621,6 @@ dependencies = [
"form_urlencoded",
"idna",
"percent-encoding",
"serde",
]
[[package]]

View File

@@ -42,6 +42,7 @@ tree-sitter = { version = "0.22" }
nucleo = "0.5.0"
slotmap = "1.0.7"
thiserror = "2.0"
percent-encoding = "2.3"
[workspace.package]
version = "24.7.0"

View File

@@ -31,6 +31,7 @@
| `line-number` | Line number display: `absolute` simply shows each line's number, while `relative` shows the distance from the current line. When unfocused or in insert mode, `relative` will still show absolute line numbers | `absolute` |
| `cursorline` | Highlight all lines with a cursor | `false` |
| `cursorcolumn` | Highlight all columns with a cursor | `false` |
+| `continue-comments` | Whether Helix should automatically add a line comment token when a new line is created inside a comment | `true` |
| `gutters` | Gutters to display: Available are `diagnostics` and `diff` and `line-numbers` and `spacer`, note that `diagnostics` also includes other features like breakpoints, 1-width padding will be inserted if gutters is non-empty | `["diagnostics", "spacer", "line-numbers", "spacer", "diff"]` |
| `auto-completion` | Enable automatic pop up of auto-completion | `true` |
| `path-completion` | Enable filepath completion. Show files and directories if an existing path at the cursor was recognized, either absolute or relative to the current opened document or current working directory (if the buffer is not yet saved). Defaults to true. | `true` |
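For example, the new option can be turned off in the `[editor]` section of `config.toml`:

```toml
[editor]
continue-comments = false  # don't repeat comment tokens on new lines
```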
@@ -58,7 +59,7 @@
### `[editor.clipboard-provider]` Section
-Helix can be configured wither to use a builtin clipboard configuration or to use
+Helix can be configured either to use a builtin clipboard configuration or to use
a provided command.
For instance, setting it to use OSC 52 termcodes, the configuration would be:
@@ -441,6 +442,8 @@ fn main() {
| `max-wrap` | Equivalent of the `editor.soft-wrap.max-wrap` option for diagnostics. | `20` |
| `max-diagnostics` | Maximum number of diagnostics to render inline for a given line | `10` |
The allowed values for `cursor-line` and `other-lines` are: `error`, `warning`, `info`, `hint`.
The (first) diagnostic with the highest severity that is not shown inline is rendered at the end of the line (as long as its severity is higher than the `end-of-line-diagnostics` config option):
```

View File

@@ -115,6 +115,7 @@
| kdl | ✓ | ✓ | ✓ | |
| koka | ✓ | | ✓ | `koka` |
| kotlin | ✓ | | | `kotlin-language-server` |
+| koto | ✓ | ✓ | ✓ | `koto-ls` |
| latex | ✓ | ✓ | | `texlab` |
| ld | ✓ | | ✓ | |
| ldif | ✓ | | | |
@@ -146,7 +147,7 @@
| nunjucks | ✓ | | | |
| ocaml | ✓ | | ✓ | `ocamllsp` |
| ocaml-interface | ✓ | | | `ocamllsp` |
-| odin | ✓ | | ✓ | `ols` |
+| odin | ✓ | ✓ | ✓ | `ols` |
| ohm | ✓ | ✓ | ✓ | |
| opencl | ✓ | ✓ | ✓ | `clangd` |
| openscad | ✓ | | | `openscad-lsp` |

View File

@@ -292,3 +292,5 @@
| `command_palette` | Open command palette | normal: `` <space>? ``, select: `` <space>? `` |
| `goto_word` | Jump to a two-character label | normal: `` gw `` |
| `extend_to_word` | Extend to a two-character label | select: `` gw `` |
+| `goto_next_tabstop` | Jump to the next snippet placeholder | |
+| `goto_prev_tabstop` | Jump to the previous snippet placeholder | |

View File

@@ -60,7 +60,7 @@ These configuration keys are available:
| `shebangs` | The interpreters from the shebang line, for example `["sh", "bash"]` |
| `roots` | A set of marker files to look for when trying to find the workspace root. For example `Cargo.lock`, `yarn.lock` |
| `auto-format` | Whether to autoformat this language when saving |
-| `diagnostic-severity` | Minimal severity of diagnostic for it to be displayed. (Allowed values: `Error`, `Warning`, `Info`, `Hint`) |
+| `diagnostic-severity` | Minimal severity of diagnostic for it to be displayed. (Allowed values: `error`, `warning`, `info`, `hint`) |
| `comment-tokens` | The tokens to use as a comment token, either a single token `"//"` or an array `["//", "///", "//!"]` (the first token will be used for commenting). Also configurable as `comment-token` for backwards compatibility |
| `block-comment-tokens`| The start and end tokens for a multiline comment either an array or single table of `{ start = "/*", end = "*/"}`. The first set of tokens will be used for commenting, any pairs in the array can be uncommented |
| `indent` | The indent to use. Has sub keys `unit` (the text inserted into the document when indenting; usually set to N spaces or `"\t"` for tabs) and `tab-width` (the number of spaces rendered for a tab) |
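As an illustration, several of these keys combined in a `languages.toml` override (values here are only examples):

```toml
[[language]]
name = "java"
diagnostic-severity = "warning"  # hide info- and hint-level diagnostics
comment-tokens = "//"
block-comment-tokens = { start = "/*", end = "*/" }
```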

View File

@@ -72,15 +72,28 @@ t = ":run-shell-command cargo test"
## Special keys and modifiers
-Ctrl, Shift and Alt modifiers are encoded respectively with the prefixes
-`C-`, `S-` and `A-`. Special keys are encoded as follows:
+Ctrl, Shift and Alt modifiers are encoded respectively with the prefixes `C-`, `S-` and `A-`.
The [Super key](https://en.wikipedia.org/wiki/Super_key_(keyboard_button)) - the Windows/Linux
key or the Command key on Mac keyboards - is also supported when using a terminal emulator that
supports the [enhanced keyboard protocol](https://github.com/helix-editor/helix/wiki/Terminal-Support#enhanced-keyboard-protocol).
The super key is encoded with prefixes `Meta-`, `Cmd-` or `Win-`. These are all synonyms for the
super modifier - binding a key with a `Win-` modifier will mean it can be used with the
Windows/Linux key or the Command key.
```toml
[keys.normal]
C-s = ":write" # Ctrl and 's' to write
Cmd-s = ":write" # Cmd or Win or Meta and 's' to write
```
Special keys are encoded as follows:
| Key name | Representation |
| --- | --- |
| Backspace | `"backspace"` |
| Space | `"space"` |
| Return/Enter | `"ret"` |
| \- | `"minus"` |
| Left | `"left"` |
| Right | `"right"` |
| Up | `"up"` |
@@ -96,3 +109,14 @@ Ctrl, Shift and Alt modifiers are encoded respectively with the prefixes
| Escape | `"esc"` |
Keys can be disabled by binding them to the `no_op` command.
All other keys such as `?`, `!`, `-` etc. can be used literally:
```toml
[keys.normal]
"?" = ":write"
"!" = ":write"
"-" = ":write"
```
Note: `-` can't be used when combined with a modifier, for example `Alt` + `-` should be written as `A-minus`. `A--` is not accepted.

View File

@@ -305,6 +305,7 @@ These scopes are used for theming the editor interface:
| `ui.text.focus` | The currently selected line in the picker |
| `ui.text.inactive` | Same as `ui.text` but when the text is inactive (e.g. suggestions) |
| `ui.text.info` | The key: command text in `ui.popup.info` boxes |
| `ui.text.directory` | Directory names in prompt completion |
| `ui.virtual.ruler` | Ruler columns (see the [`editor.rulers` config][editor-section]) |
| `ui.virtual.whitespace` | Visible whitespace characters |
| `ui.virtual.indent-guide` | Vertical indent width guides |
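A theme can then style the new scope with a single rule, for example:

```toml
"ui.text.directory" = { fg = "blue" }  # color is just an example
```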

View File

@@ -18,6 +18,7 @@ integration = []
[dependencies]
helix-stdx = { path = "../helix-stdx" }
helix-loader = { path = "../helix-loader" }
+helix-parsec = { path = "../helix-parsec" }
ropey = { version = "1.6.1", default-features = false, features = ["simd"] }
smallvec = "1.13"
@@ -39,9 +40,10 @@ bitflags = "2.6"
ahash = "0.8.11"
hashbrown = { version = "0.14.5", features = ["raw"] }
dunce = "1.0"
url = "2.5.4"
percent-encoding.workspace = true
log = "0.4"
anyhow = "1.0"
serde = { version = "1.0", features = ["derive"] }
serde_json = "1.0"
toml = "0.8"
@@ -58,6 +60,7 @@ textwrap = "0.16.1"
nucleo.workspace = true
parking_lot = "0.12"
globset = "0.4.15"
regex-cursor = "0.1.4"
[dev-dependencies]
quickcheck = { version = "1", default-features = false }

View File

@@ -0,0 +1,69 @@
use crate::Tendril;
// todo: should this be grapheme aware?
pub fn to_pascal_case(text: impl Iterator<Item = char>) -> Tendril {
let mut res = Tendril::new();
to_pascal_case_with(text, &mut res);
res
}
pub fn to_pascal_case_with(text: impl Iterator<Item = char>, buf: &mut Tendril) {
let mut at_word_start = true;
for c in text {
// we don't count _ as a word char here so case conversions work well
if !c.is_alphanumeric() {
at_word_start = true;
continue;
}
if at_word_start {
at_word_start = false;
buf.extend(c.to_uppercase());
} else {
buf.push(c)
}
}
}
pub fn to_upper_case_with(text: impl Iterator<Item = char>, buf: &mut Tendril) {
for c in text {
for c in c.to_uppercase() {
buf.push(c)
}
}
}
pub fn to_lower_case_with(text: impl Iterator<Item = char>, buf: &mut Tendril) {
for c in text {
for c in c.to_lowercase() {
buf.push(c)
}
}
}
pub fn to_camel_case(text: impl Iterator<Item = char>) -> Tendril {
let mut res = Tendril::new();
to_camel_case_with(text, &mut res);
res
}
pub fn to_camel_case_with(mut text: impl Iterator<Item = char>, buf: &mut Tendril) {
// lowercase the first alphanumeric char, then treat the remainder like
// pascal case so each following word starts uppercase (without the `break`
// this loop would consume the whole iterator)
for c in &mut text {
if c.is_alphanumeric() {
buf.extend(c.to_lowercase());
break;
}
}
let mut at_word_start = false;
for c in text {
// we don't count _ as a word char here so case conversions work well
if !c.is_alphanumeric() {
at_word_start = true;
continue;
}
if at_word_start {
at_word_start = false;
buf.extend(c.to_uppercase());
} else {
buf.push(c)
}
}
}
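An illustrative use of the helpers above (not part of the diff; `Tendril` dereferences to `str`):

```rust
#[test]
fn case_conversion_examples() {
    assert_eq!(&*to_pascal_case("hello_world".chars()), "HelloWorld");
    assert_eq!(&*to_camel_case("hello_world".chars()), "helloWorld");
}
```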

View File

@@ -1,6 +1,7 @@
//! LSP diagnostic utility types.
use std::fmt;
+pub use helix_stdx::range::Range;
use serde::{Deserialize, Serialize};
/// Describes the severity level of a [`Diagnostic`].
@@ -19,19 +20,6 @@ impl Default for Severity {
}
}
-/// A range of `char`s within the text.
-#[derive(Debug, Clone, Copy, PartialOrd, Ord, PartialEq, Eq)]
-pub struct Range {
-pub start: usize,
-pub end: usize,
-}
-impl Range {
-pub fn contains(self, pos: usize) -> bool {
-(self.start..self.end).contains(&pos)
-}
-}
#[derive(Debug, Eq, Hash, PartialEq, Clone, Deserialize, Serialize)]
pub enum NumberOrString {
Number(i32),

View File

@@ -1,4 +1,4 @@
-use std::{borrow::Cow, collections::HashMap};
+use std::{borrow::Cow, collections::HashMap, iter};
use helix_stdx::rope::RopeSliceExt;
use tree_sitter::{Query, QueryCursor, QueryPredicateArg};
@@ -8,7 +8,7 @@ use crate::{
graphemes::{grapheme_width, tab_width_at},
syntax::{IndentationHeuristic, LanguageConfiguration, RopeProvider, Syntax},
tree_sitter::Node,
-Position, Rope, RopeGraphemes, RopeSlice,
+Position, Rope, RopeGraphemes, RopeSlice, Tendril,
};
/// Enum representing indentation style.
@@ -210,6 +210,36 @@ fn whitespace_with_same_width(text: RopeSlice) -> String {
s
}
/// normalizes indentation to tabs/spaces based on user configuration
/// This function does not change the actual indentation width, just the character
/// composition.
pub fn normalize_indentation(
prefix: RopeSlice<'_>,
line: RopeSlice<'_>,
dst: &mut Tendril,
indent_style: IndentStyle,
tab_width: usize,
) -> usize {
#[allow(deprecated)]
let off = crate::visual_coords_at_pos(prefix, prefix.len_chars(), tab_width).col;
let mut len = 0;
let mut original_len = 0;
for ch in line.chars() {
match ch {
'\t' => len += tab_width_at(len + off, tab_width as u16),
' ' => len += 1,
_ => break,
}
original_len += 1;
}
if indent_style == IndentStyle::Tabs {
dst.extend(iter::repeat('\t').take(len / tab_width));
len %= tab_width;
}
dst.extend(iter::repeat(' ').take(len));
original_len
}
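// Example for `normalize_indentation` above (not part of the diff): with an
// empty prefix and tab_width = 4, a line starting with "\t " is written to
// `dst` as five spaces when the indent style is spaces, and the function
// returns 2, the number of indent chars consumed from `line`.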
fn add_indent_level(
mut base_indent: String,
added_indent_level: isize,

View File

@@ -1,6 +1,7 @@
pub use encoding_rs as encoding;
pub mod auto_pairs;
+pub mod case_conversion;
pub mod chars;
pub mod comment;
pub mod completion;
@@ -22,6 +23,7 @@ mod position;
pub mod search;
pub mod selection;
pub mod shellwords;
+pub mod snippets;
pub mod surround;
pub mod syntax;
pub mod test;

View File

@@ -11,6 +11,7 @@ use crate::{
movement::Direction,
Assoc, ChangeSet, RopeGraphemes, RopeSlice,
};
+use helix_stdx::range::is_subset;
use helix_stdx::rope::{self, RopeSliceExt};
use smallvec::{smallvec, SmallVec};
use std::{borrow::Cow, iter, slice};
@@ -401,6 +402,15 @@ impl From<(usize, usize)> for Range {
}
}
impl From<Range> for helix_stdx::Range {
fn from(range: Range) -> Self {
Self {
start: range.from(),
end: range.to(),
}
}
}
/// A selection consists of one or more selection ranges.
/// invariant: A selection can never be empty (always contains at least primary range).
#[derive(Debug, Clone, PartialEq, Eq)]
@@ -513,6 +523,10 @@ impl Selection {
}
}
pub fn range_bounds(&self) -> impl Iterator<Item = helix_stdx::Range> + '_ {
self.ranges.iter().map(|&range| range.into())
}
pub fn primary_index(&self) -> usize {
self.primary_index
}
@@ -683,32 +697,9 @@ impl Selection {
self.ranges.len()
}
-// returns true if self ⊇ other
+/// returns true if self ⊇ other
pub fn contains(&self, other: &Selection) -> bool {
-let (mut iter_self, mut iter_other) = (self.iter(), other.iter());
-let (mut ele_self, mut ele_other) = (iter_self.next(), iter_other.next());
-loop {
-match (ele_self, ele_other) {
-(Some(ra), Some(rb)) => {
-if !ra.contains_range(rb) {
-// `self` doesn't contain next element from `other`, advance `self`, we need to match all from `other`
-ele_self = iter_self.next();
-} else {
-// matched element from `other`, advance `other`
-ele_other = iter_other.next();
-};
-}
-(None, Some(_)) => {
-// exhausted `self`, we can't match the reminder of `other`
-return false;
-}
-(_, None) => {
-// no elements from `other` left to match, `self` contains `other`
-return true;
-}
-}
-}
+is_subset::<true>(self.range_bounds(), other.range_bounds())
}
}

View File

@@ -0,0 +1,13 @@
mod active;
mod elaborate;
mod parser;
mod render;
#[derive(PartialEq, Eq, Hash, Debug, PartialOrd, Ord, Clone, Copy)]
pub struct TabstopIdx(usize);
pub const LAST_TABSTOP_IDX: TabstopIdx = TabstopIdx(usize::MAX);
pub use active::ActiveSnippet;
pub use elaborate::{Snippet, SnippetElement, Transform};
pub use render::RenderedSnippet;
pub use render::SnippetRenderCtx;

View File

@@ -0,0 +1,255 @@
use std::ops::{Index, IndexMut};
use hashbrown::HashSet;
use helix_stdx::range::{is_exact_subset, is_subset};
use helix_stdx::Range;
use ropey::Rope;
use crate::movement::Direction;
use crate::snippets::render::{RenderedSnippet, Tabstop};
use crate::snippets::TabstopIdx;
use crate::{Assoc, ChangeSet, Selection, Transaction};
pub struct ActiveSnippet {
ranges: Vec<Range>,
active_tabstops: HashSet<TabstopIdx>,
current_tabstop: TabstopIdx,
tabstops: Vec<Tabstop>,
}
impl Index<TabstopIdx> for ActiveSnippet {
type Output = Tabstop;
fn index(&self, index: TabstopIdx) -> &Tabstop {
&self.tabstops[index.0]
}
}
impl IndexMut<TabstopIdx> for ActiveSnippet {
fn index_mut(&mut self, index: TabstopIdx) -> &mut Tabstop {
&mut self.tabstops[index.0]
}
}
impl ActiveSnippet {
pub fn new(snippet: RenderedSnippet) -> Option<Self> {
let snippet = Self {
ranges: snippet.ranges,
tabstops: snippet.tabstops,
active_tabstops: HashSet::new(),
current_tabstop: TabstopIdx(0),
};
(snippet.tabstops.len() != 1).then_some(snippet)
}
pub fn is_valid(&self, new_selection: &Selection) -> bool {
is_subset::<false>(self.ranges.iter().copied(), new_selection.range_bounds())
}
pub fn tabstops(&self) -> impl Iterator<Item = &Tabstop> {
self.tabstops.iter()
}
pub fn delete_placeholder(&self, doc: &Rope) -> Transaction {
Transaction::delete(
doc,
self[self.current_tabstop]
.ranges
.iter()
.map(|range| (range.start, range.end)),
)
}
/// maps the active snippets through a `ChangeSet` updating all tabstop ranges
pub fn map(&mut self, changes: &ChangeSet) -> bool {
let positions_to_map = self.ranges.iter_mut().flat_map(|range| {
[
(&mut range.start, Assoc::After),
(&mut range.end, Assoc::Before),
]
});
changes.update_positions(positions_to_map);
for (i, tabstop) in self.tabstops.iter_mut().enumerate() {
if self.active_tabstops.contains(&TabstopIdx(i)) {
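// active tabstops grow to absorb edits at their boundaries: the start
// maps with Assoc::Before and a non-empty end with Assoc::After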
let positions_to_map = tabstop.ranges.iter_mut().flat_map(|range| {
let end_assoc = if range.start == range.end {
Assoc::Before
} else {
Assoc::After
};
[
(&mut range.start, Assoc::Before),
(&mut range.end, end_assoc),
]
});
changes.update_positions(positions_to_map);
} else {
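// inactive tabstops do the opposite and shrink away from boundary
// edits, so they don't swallow text typed in a neighbouring tabstop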
let positions_to_map = tabstop.ranges.iter_mut().flat_map(|range| {
let end_assoc = if range.start == range.end {
Assoc::After
} else {
Assoc::Before
};
[
(&mut range.start, Assoc::After),
(&mut range.end, end_assoc),
]
});
changes.update_positions(positions_to_map);
}
let mut snippet_ranges = self.ranges.iter();
let mut snippet_range = snippet_ranges.next().unwrap();
let mut tabstop_i = 0;
let mut prev = Range { start: 0, end: 0 };
let num_ranges = tabstop.ranges.len() / self.ranges.len();
tabstop.ranges.retain_mut(|range| {
if tabstop_i == num_ranges {
snippet_range = snippet_ranges.next().unwrap();
tabstop_i = 0;
}
tabstop_i += 1;
let retain = snippet_range.start <= snippet_range.end;
if retain {
range.start = range.start.max(snippet_range.start);
range.end = range.end.max(range.start).min(snippet_range.end);
// guaranteed by assoc
debug_assert!(prev.start <= range.start);
debug_assert!(range.start <= range.end);
if prev.end > range.start {
// not really sure what to do in this case. It shouldn't
// really occur in practice, the below just ensures
// our invariants hold
range.start = prev.end;
range.end = range.end.max(range.start)
}
prev = *range;
}
retain
});
}
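// true once every snippet range has collapsed to be empty, i.e. the
// snippet's text was deleted and the snippet can be dropped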
self.ranges.iter().all(|range| range.end <= range.start)
}
pub fn next_tabstop(&mut self, current_selection: &Selection) -> (Selection, bool) {
let primary_idx = self.primary_idx(current_selection);
while self.current_tabstop.0 + 1 < self.tabstops.len() {
self.current_tabstop.0 += 1;
if self.activate_tabstop() {
let selection = self.tabstop_selection(primary_idx, Direction::Forward);
return (selection, self.current_tabstop.0 + 1 == self.tabstops.len());
}
}
(
self.tabstop_selection(primary_idx, Direction::Forward),
true,
)
}
pub fn prev_tabstop(&mut self, current_selection: &Selection) -> Option<Selection> {
let primary_idx = self.primary_idx(current_selection);
while self.current_tabstop.0 != 0 {
self.current_tabstop.0 -= 1;
if self.activate_tabstop() {
return Some(self.tabstop_selection(primary_idx, Direction::Forward));
}
}
None
}
// computes the primary idx adjusted for the number of cursors in the current tabstop
fn primary_idx(&self, current_selection: &Selection) -> usize {
let primary: Range = current_selection.primary().into();
let res = self
.ranges
.iter()
.position(|&range| range.contains(primary));
res.unwrap_or_else(|| {
unreachable!(
"active snippet must be valid {current_selection:?} {:?}",
self.ranges
)
})
}
fn activate_tabstop(&mut self) -> bool {
let tabstop = &self[self.current_tabstop];
if tabstop.has_placeholder() && tabstop.ranges.iter().all(|range| range.is_empty()) {
return false;
}
self.active_tabstops.clear();
self.active_tabstops.insert(self.current_tabstop);
let mut parent = self[self.current_tabstop].parent;
while let Some(tabstop) = parent {
self.active_tabstops.insert(tabstop);
parent = self[tabstop].parent;
}
true
// TODO: if the user removes the selection(s) in one snippet (but
// there are still other cursors in other snippets) and jumps to the
// next tabstop the selection in that tabstop is restored (at the
// next tabstop). This could be annoying since it's not possible to
// remove a snippet cursor until the snippet is complete. On the other
// hand it may be useful since the user may just have meant to edit
// a subselection (like with s) of the tabstops and so the selection
// removal was just temporary. Potentially this could have some sort of
// separate keymap
}
pub fn tabstop_selection(&self, primary_idx: usize, direction: Direction) -> Selection {
let tabstop = &self[self.current_tabstop];
tabstop.selection(direction, primary_idx, self.ranges.len())
}
pub fn insert_subsnippet(mut self, snippet: RenderedSnippet) -> Option<Self> {
if snippet.ranges.len() % self.ranges.len() != 0
|| !is_exact_subset(self.ranges.iter().copied(), snippet.ranges.iter().copied())
{
log::warn!("number of subsnippets did not match, discarding outer snippet");
return ActiveSnippet::new(snippet);
}
let mut cnt = 0;
let parent = self[self.current_tabstop].parent;
let tabstops = snippet.tabstops.into_iter().map(|mut tabstop| {
cnt += 1;
if let Some(parent) = &mut tabstop.parent {
parent.0 += self.current_tabstop.0;
} else {
tabstop.parent = parent;
}
tabstop
});
self.tabstops
.splice(self.current_tabstop.0..=self.current_tabstop.0, tabstops);
self.activate_tabstop();
Some(self)
}
}
#[cfg(test)]
mod tests {
use std::iter::{self};
use ropey::Rope;
use crate::snippets::{ActiveSnippet, Snippet, SnippetRenderCtx};
use crate::{Selection, Transaction};
#[test]
fn fully_remove() {
let snippet = Snippet::parse("foo(${1:bar})$0").unwrap();
let mut doc = Rope::from("bar.\n");
let (transaction, _, snippet) = snippet.render(
&doc,
&Selection::point(4),
|_| (4, 4),
&mut SnippetRenderCtx::test_ctx(),
);
assert!(transaction.apply(&mut doc));
assert_eq!(doc, "bar.foo(bar)\n");
let mut snippet = ActiveSnippet::new(snippet).unwrap();
let edit = Transaction::change(&doc, iter::once((4, 12, None)));
assert!(edit.apply(&mut doc));
snippet.map(edit.changes());
assert!(!snippet.is_valid(&Selection::point(4)))
}
}

View File

@@ -0,0 +1,378 @@
use std::mem::swap;
use std::ops::Index;
use std::sync::Arc;
use anyhow::{anyhow, Result};
use helix_stdx::rope::RopeSliceExt;
use helix_stdx::Range;
use regex_cursor::engines::meta::Builder as RegexBuilder;
use regex_cursor::engines::meta::Regex;
use regex_cursor::regex_automata::util::syntax::Config as RegexConfig;
use ropey::RopeSlice;
use crate::case_conversion::to_lower_case_with;
use crate::case_conversion::to_upper_case_with;
use crate::case_conversion::{to_camel_case_with, to_pascal_case_with};
use crate::snippets::parser::{self, CaseChange, FormatItem};
use crate::snippets::{TabstopIdx, LAST_TABSTOP_IDX};
use crate::Tendril;
#[derive(Debug)]
pub struct Snippet {
elements: Vec<SnippetElement>,
tabstops: Vec<Tabstop>,
}
impl Snippet {
pub fn parse(snippet: &str) -> Result<Self> {
let parsed_snippet = parser::parse(snippet)
.map_err(|rest| anyhow!("Failed to parse snippet. Remaining input: {}", rest))?;
Ok(Snippet::new(parsed_snippet))
}
pub fn new(elements: Vec<parser::SnippetElement>) -> Snippet {
let mut res = Snippet {
elements: Vec::new(),
tabstops: Vec::new(),
};
res.elements = res.elaborate(elements, None).into();
res.fixup_tabstops();
res.ensure_last_tabstop();
res.renumber_tabstops();
res
}
pub fn elements(&self) -> &[SnippetElement] {
&self.elements
}
pub fn tabstops(&self) -> impl Iterator<Item = &Tabstop> {
self.tabstops.iter()
}
fn renumber_tabstops(&mut self) {
Self::renumber_tabstops_in(&self.tabstops, &mut self.elements);
for i in 0..self.tabstops.len() {
if let Some(parent) = self.tabstops[i].parent {
let parent = self
.tabstops
.binary_search_by_key(&parent, |tabstop| tabstop.idx)
.expect("all tabstops have been resolved");
self.tabstops[i].parent = Some(TabstopIdx(parent));
}
let tabstop = &mut self.tabstops[i];
if let TabstopKind::Placeholder { default } = &tabstop.kind {
let mut default = default.clone();
tabstop.kind = TabstopKind::Empty;
Self::renumber_tabstops_in(&self.tabstops, Arc::get_mut(&mut default).unwrap());
self.tabstops[i].kind = TabstopKind::Placeholder { default };
}
}
}
fn renumber_tabstops_in(tabstops: &[Tabstop], elements: &mut [SnippetElement]) {
for elem in elements {
match elem {
SnippetElement::Tabstop { idx } => {
idx.0 = tabstops
.binary_search_by_key(&*idx, |tabstop| tabstop.idx)
.expect("all tabstops have been resolved")
}
SnippetElement::Variable { default, .. } => {
if let Some(default) = default {
Self::renumber_tabstops_in(tabstops, default);
}
}
SnippetElement::Text(_) => (),
}
}
}
fn fixup_tabstops(&mut self) {
self.tabstops.sort_by_key(|tabstop| tabstop.idx);
self.tabstops.dedup_by(|tabstop1, tabstop2| {
if tabstop1.idx != tabstop2.idx {
return false;
}
// use the first non empty tabstop for all multicursor tabstops
if tabstop2.kind.is_empty() {
swap(tabstop2, tabstop1)
}
true
})
}
fn ensure_last_tabstop(&mut self) {
if matches!(self.tabstops.last(), Some(tabstop) if tabstop.idx == LAST_TABSTOP_IDX) {
return;
}
self.tabstops.push(Tabstop {
idx: LAST_TABSTOP_IDX,
parent: None,
kind: TabstopKind::Empty,
});
self.elements.push(SnippetElement::Tabstop {
idx: LAST_TABSTOP_IDX,
})
}
fn elaborate(
&mut self,
default: Vec<parser::SnippetElement>,
parent: Option<TabstopIdx>,
) -> Box<[SnippetElement]> {
default
.into_iter()
.map(|val| match val {
parser::SnippetElement::Tabstop {
tabstop,
transform: None,
} => SnippetElement::Tabstop {
idx: self.elaborate_placeholder(tabstop, parent, Vec::new()),
},
parser::SnippetElement::Tabstop {
tabstop,
transform: Some(transform),
} => SnippetElement::Tabstop {
idx: self.elaborate_transform(tabstop, parent, transform),
},
parser::SnippetElement::Placeholder { tabstop, value } => SnippetElement::Tabstop {
idx: self.elaborate_placeholder(tabstop, parent, value),
},
parser::SnippetElement::Choice { tabstop, choices } => SnippetElement::Tabstop {
idx: self.elaborate_choice(tabstop, parent, choices),
},
parser::SnippetElement::Variable {
name,
default,
transform,
} => SnippetElement::Variable {
name,
default: default.map(|default| self.elaborate(default, parent)),
// TODO: error for invalid transforms
transform: transform.and_then(Transform::new).map(Box::new),
},
parser::SnippetElement::Text(text) => SnippetElement::Text(text),
})
.collect()
}
fn elaborate_choice(
&mut self,
idx: usize,
parent: Option<TabstopIdx>,
choices: Vec<Tendril>,
) -> TabstopIdx {
let idx = TabstopIdx::elaborate(idx);
self.tabstops.push(Tabstop {
idx,
parent,
kind: TabstopKind::Choice {
choices: choices.into(),
},
});
idx
}
fn elaborate_placeholder(
&mut self,
idx: usize,
parent: Option<TabstopIdx>,
default: Vec<parser::SnippetElement>,
) -> TabstopIdx {
let idx = TabstopIdx::elaborate(idx);
let default = self.elaborate(default, Some(idx));
self.tabstops.push(Tabstop {
idx,
parent,
kind: TabstopKind::Placeholder {
default: default.into(),
},
});
idx
}
fn elaborate_transform(
&mut self,
idx: usize,
parent: Option<TabstopIdx>,
transform: parser::Transform,
) -> TabstopIdx {
let idx = TabstopIdx::elaborate(idx);
if let Some(transform) = Transform::new(transform) {
self.tabstops.push(Tabstop {
idx,
parent,
kind: TabstopKind::Transform(Arc::new(transform)),
})
} else {
// TODO: proper error
self.tabstops.push(Tabstop {
idx,
parent,
kind: TabstopKind::Empty,
})
}
idx
}
}
impl Index<TabstopIdx> for Snippet {
type Output = Tabstop;
fn index(&self, index: TabstopIdx) -> &Tabstop {
&self.tabstops[index.0]
}
}
#[derive(Debug)]
pub enum SnippetElement {
Tabstop {
idx: TabstopIdx,
},
Variable {
name: Tendril,
default: Option<Box<[SnippetElement]>>,
transform: Option<Box<Transform>>,
},
Text(Tendril),
}
#[derive(Debug)]
pub struct Tabstop {
idx: TabstopIdx,
pub parent: Option<TabstopIdx>,
pub kind: TabstopKind,
}
#[derive(Debug)]
pub enum TabstopKind {
Choice { choices: Arc<[Tendril]> },
Placeholder { default: Arc<[SnippetElement]> },
Empty,
Transform(Arc<Transform>),
}
impl TabstopKind {
pub fn is_empty(&self) -> bool {
matches!(self, TabstopKind::Empty)
}
}
#[derive(Debug)]
pub struct Transform {
regex: Regex,
regex_str: Box<str>,
global: bool,
replacement: Box<[FormatItem]>,
}
impl PartialEq for Transform {
fn eq(&self, other: &Self) -> bool {
self.replacement == other.replacement
&& self.global == other.global
// doesn't compare m and i setting but close enough
&& self.regex_str == other.regex_str
}
}
impl Transform {
fn new(transform: parser::Transform) -> Option<Transform> {
let mut config = RegexConfig::new();
let mut global = false;
let mut invalid_config = false;
for c in transform.options.chars() {
match c {
'i' => {
config = config.case_insensitive(true);
}
'm' => {
config = config.multi_line(true);
}
'g' => {
global = true;
}
// we ignore 'u' since we always want to
// do unicode aware matching
_ => invalid_config = true,
}
}
if invalid_config {
log::error!("invalid transform configuration characters {transform:?}");
}
let regex = match RegexBuilder::new().syntax(config).build(&transform.regex) {
Ok(regex) => regex,
Err(err) => {
log::error!("invalid transform {err} {transform:?}");
return None;
}
};
Some(Transform {
regex,
regex_str: transform.regex.as_str().into(),
global,
replacement: transform.replacement.into(),
})
}
pub fn apply(&self, mut doc: RopeSlice<'_>, range: Range) -> Tendril {
let mut buf = Tendril::new();
let it = self
.regex
.captures_iter(doc.regex_input_at(range))
.enumerate();
doc = doc.slice(range);
let mut last_match = 0;
for (_, cap) in it {
// unwrap on 0 is OK because captures only reports matches
let m = cap.get_group(0).unwrap();
buf.extend(doc.byte_slice(last_match..m.start).chunks());
last_match = m.end;
for fmt in &*self.replacement {
match *fmt {
FormatItem::Text(ref text) => {
buf.push_str(text);
}
FormatItem::Capture(i) => {
if let Some(cap) = cap.get_group(i) {
buf.extend(doc.byte_slice(cap.range()).chunks());
}
}
FormatItem::CaseChange(i, change) => {
if let Some(cap) = cap.get_group(i).filter(|i| !i.is_empty()) {
let mut chars = doc.byte_slice(cap.range()).chars();
match change {
CaseChange::Upcase => to_upper_case_with(chars, &mut buf),
CaseChange::Downcase => to_lower_case_with(chars, &mut buf),
CaseChange::Capitalize => {
let first_char = chars.next().unwrap();
buf.extend(first_char.to_uppercase());
buf.extend(chars);
}
CaseChange::PascalCase => to_pascal_case_with(chars, &mut buf),
CaseChange::CamelCase => to_camel_case_with(chars, &mut buf),
}
}
}
FormatItem::Conditional(i, ref if_, ref else_) => {
if cap.get_group(i).map_or(true, |mat| mat.is_empty()) {
buf.push_str(else_)
} else {
buf.push_str(if_)
}
}
}
}
if !self.global {
break;
}
}
buf.extend(doc.byte_slice(last_match..).chunks());
buf
}
}
impl TabstopIdx {
fn elaborate(idx: usize) -> Self {
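// grammar tabstops are 1-based, so $1 becomes index 0; $0 wraps around
// to usize::MAX, which is LAST_TABSTOP_IDX and sorts after all others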
TabstopIdx(idx.wrapping_sub(1))
}
}

View File

@@ -0,0 +1,922 @@
/*!
A parser for LSP/VSCode style snippet syntax
See <https://microsoft.github.io/language-server-protocol/specifications/lsp/3.17/specification/#snippet_syntax>.
``` text
any ::= tabstop | placeholder | choice | variable | text
tabstop ::= '$' int | '${' int '}'
placeholder ::= '${' int ':' any '}'
choice ::= '${' int '|' text (',' text)* '|}'
variable ::= '$' var | '${' var '}'
| '${' var ':' any '}'
| '${' var '/' regex '/' (format | text)+ '/' options '}'
format ::= '$' int | '${' int '}'
| '${' int ':' '/upcase' | '/downcase' | '/capitalize' '}'
| '${' int ':+' if '}'
| '${' int ':?' if ':' else '}'
| '${' int ':-' else '}' | '${' int ':' else '}'
regex ::= Regular Expression value (ctor-string)
options ::= Regular Expression option (ctor-options)
var ::= [_a-zA-Z] [_a-zA-Z0-9]*
int ::= [0-9]+
text ::= .*
if ::= text
else ::= text
```
*/
use crate::Tendril;
use helix_parsec::*;
#[derive(Debug, PartialEq, Eq, Clone, Copy)]
pub enum CaseChange {
Upcase,
Downcase,
Capitalize,
PascalCase,
CamelCase,
}
#[derive(Debug, PartialEq, Eq)]
pub enum FormatItem {
Text(Tendril),
Capture(usize),
CaseChange(usize, CaseChange),
Conditional(usize, Tendril, Tendril),
}
#[derive(Debug, PartialEq, Eq)]
pub struct Transform {
pub regex: Tendril,
pub replacement: Vec<FormatItem>,
pub options: Tendril,
}
#[derive(Debug, PartialEq, Eq)]
pub enum SnippetElement {
Tabstop {
tabstop: usize,
transform: Option<Transform>,
},
Placeholder {
tabstop: usize,
value: Vec<SnippetElement>,
},
Choice {
tabstop: usize,
choices: Vec<Tendril>,
},
Variable {
name: Tendril,
default: Option<Vec<SnippetElement>>,
transform: Option<Transform>,
},
Text(Tendril),
}
pub fn parse(s: &str) -> Result<Vec<SnippetElement>, &str> {
snippet().parse(s).and_then(|(remainder, snippet)| {
if remainder.is_empty() {
Ok(snippet)
} else {
Err(remainder)
}
})
}
fn var<'a>() -> impl Parser<'a, Output = &'a str> {
// var = [_a-zA-Z][_a-zA-Z0-9]*
move |input: &'a str| {
input
.char_indices()
.take_while(|(p, c)| {
*c == '_'
|| if *p == 0 {
c.is_ascii_alphabetic()
} else {
c.is_ascii_alphanumeric()
}
})
.last()
.map(|(index, c)| {
let index = index + c.len_utf8();
(&input[index..], &input[0..index])
})
.ok_or(input)
}
}
const TEXT_ESCAPE_CHARS: &[char] = &['\\', '}', '$'];
const CHOICE_TEXT_ESCAPE_CHARS: &[char] = &['\\', '|', ','];
fn text<'a>(
escape_chars: &'static [char],
term_chars: &'static [char],
) -> impl Parser<'a, Output = Tendril> {
move |input: &'a str| {
let mut chars = input.char_indices().peekable();
let mut res = Tendril::new();
while let Some((i, c)) = chars.next() {
match c {
'\\' => {
if let Some(&(_, c)) = chars.peek() {
if escape_chars.contains(&c) {
chars.next();
res.push(c);
continue;
}
}
res.push('\\');
}
c if term_chars.contains(&c) => return Ok((&input[i..], res)),
c => res.push(c),
}
}
Ok(("", res))
}
}
fn digit<'a>() -> impl Parser<'a, Output = usize> {
filter_map(take_while(|c| c.is_ascii_digit()), |s| s.parse().ok())
}
fn case_change<'a>() -> impl Parser<'a, Output = CaseChange> {
use CaseChange::*;
choice!(
map("upcase", |_| Upcase),
map("downcase", |_| Downcase),
map("capitalize", |_| Capitalize),
map("pascalcase", |_| PascalCase),
map("camelcase", |_| CamelCase),
)
}
fn format<'a>() -> impl Parser<'a, Output = FormatItem> {
use FormatItem::*;
choice!(
// '$' int
map(right("$", digit()), Capture),
// '${' int '}'
map(seq!("${", digit(), "}"), |seq| Capture(seq.1)),
// '${' int ':' '/upcase' | '/downcase' | '/capitalize' '}'
map(seq!("${", digit(), ":/", case_change(), "}"), |seq| {
CaseChange(seq.1, seq.3)
}),
// '${' int ':+' if '}'
map(
seq!("${", digit(), ":+", text(TEXT_ESCAPE_CHARS, &['}']), "}"),
|seq| { Conditional(seq.1, seq.3, Tendril::new()) }
),
// '${' int ':?' if ':' else '}'
map(
seq!(
"${",
digit(),
":?",
text(TEXT_ESCAPE_CHARS, &[':']),
":",
text(TEXT_ESCAPE_CHARS, &['}']),
"}"
),
|seq| { Conditional(seq.1, seq.3, seq.5) }
),
// '${' int ':-' else '}' | '${' int ':' else '}'
map(
seq!(
"${",
digit(),
":",
optional("-"),
text(TEXT_ESCAPE_CHARS, &['}']),
"}"
),
|seq| { Conditional(seq.1, Tendril::new(), seq.4) }
),
)
}
fn regex<'a>() -> impl Parser<'a, Output = Transform> {
map(
seq!(
"/",
// TODO parse as ECMAScript and convert to rust regex
text(&['/'], &['/']),
"/",
zero_or_more(choice!(
format(),
// text doesn't parse $, if format fails we just accept the $ as text
map("$", |_| FormatItem::Text("$".into())),
map(text(&['\\', '/'], &['/', '$']), FormatItem::Text),
)),
"/",
// vscode really doesn't allow escaping } here
// so it's impossible to write a regex escape containing a }
// we can consider deviating here and allowing the escape
text(&[], &['}']),
),
|(_, value, _, replacement, _, options)| Transform {
regex: value,
replacement,
options,
},
)
}
fn tabstop<'a>() -> impl Parser<'a, Output = SnippetElement> {
map(
or(
map(right("$", digit()), |i| (i, None)),
map(
seq!("${", digit(), optional(regex()), "}"),
|(_, i, transform, _)| (i, transform),
),
),
|(tabstop, transform)| SnippetElement::Tabstop { tabstop, transform },
)
}
fn placeholder<'a>() -> impl Parser<'a, Output = SnippetElement> {
map(
seq!(
"${",
digit(),
":",
// according to the grammar there is just a single anything here.
// However in the prose it is explained that placeholders can be nested.
// The example there contains both a placeholder text and a nested placeholder
// which indicates a list. Looking at the VSCode source code, the placeholder
// is indeed parsed as zero_or_more so the grammar is simply incorrect here
zero_or_more(anything(TEXT_ESCAPE_CHARS, true)),
"}"
),
|seq| SnippetElement::Placeholder {
tabstop: seq.1,
value: seq.3,
},
)
}
fn choice<'a>() -> impl Parser<'a, Output = SnippetElement> {
map(
seq!(
"${",
digit(),
"|",
sep(text(CHOICE_TEXT_ESCAPE_CHARS, &['|', ',']), ","),
"|}",
),
|seq| SnippetElement::Choice {
tabstop: seq.1,
choices: seq.3,
},
)
}
fn variable<'a>() -> impl Parser<'a, Output = SnippetElement> {
choice!(
// $var
map(right("$", var()), |name| SnippetElement::Variable {
name: name.into(),
default: None,
transform: None,
}),
// ${var}
map(seq!("${", var(), "}",), |values| SnippetElement::Variable {
name: values.1.into(),
default: None,
transform: None,
}),
// ${var:default}
map(
seq!(
"${",
var(),
":",
zero_or_more(anything(TEXT_ESCAPE_CHARS, true)),
"}",
),
|values| SnippetElement::Variable {
name: values.1.into(),
default: Some(values.3),
transform: None,
}
),
// ${var/value/format/options}
map(seq!("${", var(), regex(), "}"), |values| {
SnippetElement::Variable {
name: values.1.into(),
default: None,
transform: Some(values.2),
}
}),
)
}
fn anything<'a>(
escape_chars: &'static [char],
end_at_brace: bool,
) -> impl Parser<'a, Output = SnippetElement> {
let term_chars: &[_] = if end_at_brace { &['$', '}'] } else { &['$'] };
move |input: &'a str| {
let parser = choice!(
tabstop(),
placeholder(),
choice(),
variable(),
map("$", |_| SnippetElement::Text("$".into())),
map(text(escape_chars, term_chars), SnippetElement::Text),
);
parser.parse(input)
}
}
fn snippet<'a>() -> impl Parser<'a, Output = Vec<SnippetElement>> {
one_or_more(anything(TEXT_ESCAPE_CHARS, false))
}
#[cfg(test)]
mod test {
use crate::snippets::{Snippet, SnippetRenderCtx};
use super::SnippetElement::*;
use super::*;
#[test]
fn empty_string_is_error() {
assert_eq!(Err(""), parse(""));
}
#[test]
fn parse_placeholders_in_function_call() {
assert_eq!(
Ok(vec![
Text("match(".into()),
Placeholder {
tabstop: 1,
value: vec![Text("Arg1".into())],
},
Text(")".into()),
]),
parse("match(${1:Arg1})")
)
}
#[test]
fn unterminated_placeholder() {
assert_eq!(
Ok(vec![
Text("match(".into()),
Text("$".into()),
Text("{1:)".into())
]),
parse("match(${1:)")
)
}
#[test]
fn parse_empty_placeholder() {
assert_eq!(
Ok(vec![
Text("match(".into()),
Placeholder {
tabstop: 1,
value: vec![],
},
Text(")".into()),
]),
parse("match(${1:})")
)
}
#[test]
fn parse_placeholders_in_statement() {
assert_eq!(
Ok(vec![
Text("local ".into()),
Placeholder {
tabstop: 1,
value: vec![Text("var".into())],
},
Text(" = ".into()),
Placeholder {
tabstop: 1,
value: vec![Text("value".into())],
},
]),
parse("local ${1:var} = ${1:value}")
)
}
#[test]
fn parse_tabstop_nested_in_placeholder() {
assert_eq!(
Ok(vec![Placeholder {
tabstop: 1,
value: vec![
Text("var, ".into()),
Tabstop {
tabstop: 2,
transform: None
}
],
}]),
parse("${1:var, $2}")
)
}
#[test]
fn parse_placeholder_nested_in_placeholder() {
assert_eq!(
Ok({
vec![Placeholder {
tabstop: 1,
value: vec![
Text("foo ".into()),
Placeholder {
tabstop: 2,
value: vec![Text("bar".into())],
},
],
}]
}),
parse("${1:foo ${2:bar}}")
)
}
#[test]
fn parse_all() {
assert_eq!(
Ok(vec![
Text("hello ".into()),
Tabstop {
tabstop: 1,
transform: None
},
Tabstop {
tabstop: 2,
transform: None
},
Text(" ".into()),
Choice {
tabstop: 1,
choices: vec!["one".into(), "two".into(), "three".into()],
},
Text(" ".into()),
Variable {
name: "name".into(),
default: Some(vec![Text("foo".into())]),
transform: None,
},
Text(" ".into()),
Variable {
name: "var".into(),
default: None,
transform: None,
},
Text(" ".into()),
Variable {
name: "TM".into(),
default: None,
transform: None,
},
]),
parse("hello $1${2} ${1|one,two,three|} ${name:foo} $var $TM")
);
}
#[test]
fn regex_capture_replace() {
assert_eq!(
Ok({
vec![Variable {
name: "TM_FILENAME".into(),
default: None,
transform: Some(Transform {
regex: "(.*).+$".into(),
replacement: vec![FormatItem::Capture(1), FormatItem::Text("$".into())],
options: Tendril::new(),
}),
}]
}),
parse("${TM_FILENAME/(.*).+$/$1$/}")
);
}
#[test]
fn rust_macro() {
assert_eq!(
Ok({
vec![
Text("macro_rules! ".into()),
Tabstop {
tabstop: 1,
transform: None,
},
Text(" {\n (".into()),
Tabstop {
tabstop: 2,
transform: None,
},
Text(") => {\n ".into()),
Tabstop {
tabstop: 0,
transform: None,
},
Text("\n };\n}".into()),
]
}),
parse("macro_rules! $1 {\n ($2) => {\n $0\n };\n}")
);
}
fn assert_text(snippet: &str, parsed_text: &str) {
let snippet = Snippet::parse(snippet).unwrap();
let mut rendered_snippet = snippet.prepare_render();
let rendered_text = snippet
.render_at(
&mut rendered_snippet,
"".into(),
false,
&mut SnippetRenderCtx::test_ctx(),
0,
)
.0;
assert_eq!(rendered_text, parsed_text)
}
#[test]
fn robust_parsing() {
assert_text("$", "$");
assert_text("\\\\$", "\\$");
assert_text("{", "{");
assert_text("\\}", "}");
assert_text("\\abc", "\\abc");
assert_text("foo${f:\\}}bar", "foo}bar");
assert_text("\\{", "\\{");
assert_text("I need \\\\\\$", "I need \\$");
assert_text("\\", "\\");
assert_text("\\{{", "\\{{");
assert_text("{{", "{{");
assert_text("{{dd", "{{dd");
assert_text("}}", "}}");
assert_text("ff}}", "ff}}");
assert_text("farboo", "farboo");
assert_text("far{{}}boo", "far{{}}boo");
assert_text("far{{123}}boo", "far{{123}}boo");
assert_text("far\\{{123}}boo", "far\\{{123}}boo");
assert_text("far{{id:bern}}boo", "far{{id:bern}}boo");
assert_text("far{{id:bern {{basel}}}}boo", "far{{id:bern {{basel}}}}boo");
assert_text(
"far{{id:bern {{id:basel}}}}boo",
"far{{id:bern {{id:basel}}}}boo",
);
assert_text(
"far{{id:bern {{id2:basel}}}}boo",
"far{{id:bern {{id2:basel}}}}boo",
);
assert_text("${}$\\a\\$\\}\\\\", "${}$\\a$}\\");
assert_text("farboo", "farboo");
assert_text("far{{}}boo", "far{{}}boo");
assert_text("far{{123}}boo", "far{{123}}boo");
assert_text("far\\{{123}}boo", "far\\{{123}}boo");
assert_text("far`123`boo", "far`123`boo");
assert_text("far\\`123\\`boo", "far\\`123\\`boo");
assert_text("\\$far-boo", "$far-boo");
}
fn assert_snippet(snippet: &str, expect: &[SnippetElement]) {
let elements = parse(snippet).unwrap();
assert_eq!(elements, expect.to_owned())
}
#[test]
fn parse_variable() {
use SnippetElement::*;
assert_snippet(
"$far-boo",
&[
Variable {
name: "far".into(),
default: None,
transform: None,
},
Text("-boo".into()),
],
);
assert_snippet(
"far$farboo",
&[
Text("far".into()),
Variable {
name: "farboo".into(),
transform: None,
default: None,
},
],
);
assert_snippet(
"far${farboo}",
&[
Text("far".into()),
Variable {
name: "farboo".into(),
transform: None,
default: None,
},
],
);
assert_snippet(
"$123",
&[Tabstop {
tabstop: 123,
transform: None,
}],
);
assert_snippet(
"$farboo",
&[Variable {
name: "farboo".into(),
transform: None,
default: None,
}],
);
assert_snippet(
"$far12boo",
&[Variable {
name: "far12boo".into(),
transform: None,
default: None,
}],
);
assert_snippet(
"000_${far}_000",
&[
Text("000_".into()),
Variable {
name: "far".into(),
transform: None,
default: None,
},
Text("_000".into()),
],
);
}
#[test]
fn parse_variable_transform() {
assert_snippet(
"${foo///}",
&[Variable {
name: "foo".into(),
transform: Some(Transform {
regex: Tendril::new(),
replacement: Vec::new(),
options: Tendril::new(),
}),
default: None,
}],
);
assert_snippet(
"${foo/regex/format/gmi}",
&[Variable {
name: "foo".into(),
transform: Some(Transform {
regex: "regex".into(),
replacement: vec![FormatItem::Text("format".into())],
options: "gmi".into(),
}),
default: None,
}],
);
assert_snippet(
"${foo/([A-Z][a-z])/format/}",
&[Variable {
name: "foo".into(),
transform: Some(Transform {
regex: "([A-Z][a-z])".into(),
replacement: vec![FormatItem::Text("format".into())],
options: Tendril::new(),
}),
default: None,
}],
);
// invalid regex. TODO: re-enable these tests once we actually parse this regex flavor
// assert_text(
// "${foo/([A-Z][a-z])/format/GMI}",
// "${foo/([A-Z][a-z])/format/GMI}",
// );
// assert_text(
// "${foo/([A-Z][a-z])/format/funky}",
// "${foo/([A-Z][a-z])/format/funky}",
// );
// assert_text("${foo/([A-Z][a-z]/format/}", "${foo/([A-Z][a-z]/format/}");
assert_text(
"${foo/regex\\/format/options}",
"${foo/regex\\/format/options}",
);
// tricky regex
assert_snippet(
"${foo/m\\/atch/$1/i}",
&[Variable {
name: "foo".into(),
transform: Some(Transform {
regex: "m/atch".into(),
replacement: vec![FormatItem::Capture(1)],
options: "i".into(),
}),
default: None,
}],
);
// incomplete
assert_text("${foo///", "${foo///");
assert_text("${foo/regex/format/options", "${foo/regex/format/options");
// format string
assert_snippet(
"${foo/.*/${0:fooo}/i}",
&[Variable {
name: "foo".into(),
transform: Some(Transform {
regex: ".*".into(),
replacement: vec![FormatItem::Conditional(0, Tendril::new(), "fooo".into())],
options: "i".into(),
}),
default: None,
}],
);
assert_snippet(
"${foo/.*/${1}/i}",
&[Variable {
name: "foo".into(),
transform: Some(Transform {
regex: ".*".into(),
replacement: vec![FormatItem::Capture(1)],
options: "i".into(),
}),
default: None,
}],
);
assert_snippet(
"${foo/.*/$1/i}",
&[Variable {
name: "foo".into(),
transform: Some(Transform {
regex: ".*".into(),
replacement: vec![FormatItem::Capture(1)],
options: "i".into(),
}),
default: None,
}],
);
assert_snippet(
"${foo/.*/This-$1-encloses/i}",
&[Variable {
name: "foo".into(),
transform: Some(Transform {
regex: ".*".into(),
replacement: vec![
FormatItem::Text("This-".into()),
FormatItem::Capture(1),
FormatItem::Text("-encloses".into()),
],
options: "i".into(),
}),
default: None,
}],
);
assert_snippet(
"${foo/.*/complex${1:else}/i}",
&[Variable {
name: "foo".into(),
transform: Some(Transform {
regex: ".*".into(),
replacement: vec![
FormatItem::Text("complex".into()),
FormatItem::Conditional(1, Tendril::new(), "else".into()),
],
options: "i".into(),
}),
default: None,
}],
);
assert_snippet(
"${foo/.*/complex${1:-else}/i}",
&[Variable {
name: "foo".into(),
transform: Some(Transform {
regex: ".*".into(),
replacement: vec![
FormatItem::Text("complex".into()),
FormatItem::Conditional(1, Tendril::new(), "else".into()),
],
options: "i".into(),
}),
default: None,
}],
);
assert_snippet(
"${foo/.*/complex${1:+if}/i}",
&[Variable {
name: "foo".into(),
transform: Some(Transform {
regex: ".*".into(),
replacement: vec![
FormatItem::Text("complex".into()),
FormatItem::Conditional(1, "if".into(), Tendril::new()),
],
options: "i".into(),
}),
default: None,
}],
);
assert_snippet(
"${foo/.*/complex${1:?if:else}/i}",
&[Variable {
name: "foo".into(),
transform: Some(Transform {
regex: ".*".into(),
replacement: vec![
FormatItem::Text("complex".into()),
FormatItem::Conditional(1, "if".into(), "else".into()),
],
options: "i".into(),
}),
default: None,
}],
);
assert_snippet(
"${foo/.*/complex${1:/upcase}/i}",
&[Variable {
name: "foo".into(),
transform: Some(Transform {
regex: ".*".into(),
replacement: vec![
FormatItem::Text("complex".into()),
FormatItem::CaseChange(1, CaseChange::Upcase),
],
options: "i".into(),
}),
default: None,
}],
);
assert_snippet(
"${TM_DIRECTORY/src\\//$1/}",
&[Variable {
name: "TM_DIRECTORY".into(),
transform: Some(Transform {
regex: "src/".into(),
replacement: vec![FormatItem::Capture(1)],
options: Tendril::new(),
}),
default: None,
}],
);
assert_snippet(
"${TM_SELECTED_TEXT/a/\\/$1/g}",
&[Variable {
name: "TM_SELECTED_TEXT".into(),
transform: Some(Transform {
regex: "a".into(),
replacement: vec![FormatItem::Text("/".into()), FormatItem::Capture(1)],
options: "g".into(),
}),
default: None,
}],
);
assert_snippet(
"${TM_SELECTED_TEXT/a/in\\/$1ner/g}",
&[Variable {
name: "TM_SELECTED_TEXT".into(),
transform: Some(Transform {
regex: "a".into(),
replacement: vec![
FormatItem::Text("in/".into()),
FormatItem::Capture(1),
FormatItem::Text("ner".into()),
],
options: "g".into(),
}),
default: None,
}],
);
assert_snippet(
"${TM_SELECTED_TEXT/a/end\\//g}",
&[Variable {
name: "TM_SELECTED_TEXT".into(),
transform: Some(Transform {
regex: "a".into(),
replacement: vec![FormatItem::Text("end/".into())],
options: "g".into(),
}),
default: None,
}],
);
}
// TODO port more tests from https://github.com/microsoft/vscode/blob/dce493cb6e36346ef2714e82c42ce14fc461b15c/src/vs/editor/contrib/snippet/test/browser/snippetParser.test.ts
}


@@ -0,0 +1,355 @@
use std::borrow::Cow;
use std::ops::{Index, IndexMut};
use std::sync::Arc;
use helix_stdx::Range;
use ropey::{Rope, RopeSlice};
use smallvec::SmallVec;
use crate::indent::{normalize_indentation, IndentStyle};
use crate::movement::Direction;
use crate::snippets::elaborate;
use crate::snippets::TabstopIdx;
use crate::snippets::{Snippet, SnippetElement, Transform};
use crate::{selection, Selection, Tendril, Transaction};
#[derive(Debug, Clone, PartialEq)]
pub enum TabstopKind {
Choice { choices: Arc<[Tendril]> },
Placeholder,
Empty,
Transform(Arc<Transform>),
}
#[derive(Debug, PartialEq)]
pub struct Tabstop {
pub ranges: SmallVec<[Range; 1]>,
pub parent: Option<TabstopIdx>,
pub kind: TabstopKind,
}
impl Tabstop {
pub fn has_placeholder(&self) -> bool {
matches!(
self.kind,
TabstopKind::Choice { .. } | TabstopKind::Placeholder
)
}
pub fn selection(
&self,
direction: Direction,
primary_idx: usize,
snippet_ranges: usize,
) -> Selection {
Selection::new(
self.ranges
.iter()
.map(|&range| {
let mut range = selection::Range::new(range.start, range.end);
if direction == Direction::Backward {
range = range.flip()
}
range
})
.collect(),
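// each render site of the snippet contributed ranges.len() / snippet_ranges
// copies of this tabstop; select the first copy produced by the primary cursor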
primary_idx * (self.ranges.len() / snippet_ranges),
)
}
}
#[derive(Debug, Default, PartialEq)]
pub struct RenderedSnippet {
pub tabstops: Vec<Tabstop>,
pub ranges: Vec<Range>,
}
impl RenderedSnippet {
pub fn first_selection(&self, direction: Direction, primary_idx: usize) -> Selection {
self.tabstops[0].selection(direction, primary_idx, self.ranges.len())
}
}
impl Index<TabstopIdx> for RenderedSnippet {
type Output = Tabstop;
fn index(&self, index: TabstopIdx) -> &Tabstop {
&self.tabstops[index.0]
}
}
impl IndexMut<TabstopIdx> for RenderedSnippet {
fn index_mut(&mut self, index: TabstopIdx) -> &mut Tabstop {
&mut self.tabstops[index.0]
}
}
impl Snippet {
pub fn prepare_render(&self) -> RenderedSnippet {
let tabstops =
self.tabstops()
.map(|tabstop| Tabstop {
ranges: SmallVec::new(),
parent: tabstop.parent,
kind: match &tabstop.kind {
elaborate::TabstopKind::Choice { choices } => TabstopKind::Choice {
choices: choices.clone(),
},
// start out as empty: the first non-empty placeholder will change this to
// a placeholder automatically
elaborate::TabstopKind::Empty
| elaborate::TabstopKind::Placeholder { .. } => TabstopKind::Empty,
elaborate::TabstopKind::Transform(transform) => {
TabstopKind::Transform(transform.clone())
}
},
})
.collect();
RenderedSnippet {
tabstops,
ranges: Vec::new(),
}
}
pub fn render_at(
&self,
snippet: &mut RenderedSnippet,
indent: RopeSlice<'_>,
at_newline: bool,
ctx: &mut SnippetRenderCtx,
pos: usize,
) -> (Tendril, usize) {
let mut ctx = SnippetRender {
dst: snippet,
src: self,
indent,
text: Tendril::new(),
off: pos,
ctx,
at_newline,
};
ctx.render_elements(self.elements());
let end = ctx.off;
let text = ctx.text;
snippet.ranges.push(Range { start: pos, end });
(text, end - pos)
}
pub fn render(
&self,
doc: &Rope,
selection: &Selection,
change_range: impl FnMut(&selection::Range) -> (usize, usize),
ctx: &mut SnippetRenderCtx,
) -> (Transaction, Selection, RenderedSnippet) {
let mut snippet = self.prepare_render();
let mut off = 0;
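// cumulative signed length delta of the replacements rendered so far; used to
// map positions in the original document onto positions in the edited text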
let (transaction, selection) = Transaction::change_by_selection_ignore_overlapping(
doc,
selection,
change_range,
|replacement_start, replacement_end| {
let line_idx = doc.char_to_line(replacement_start);
let line_start = doc.line_to_char(line_idx);
let prefix = doc.slice(line_start..replacement_start);
let indent_len = prefix.chars().take_while(|c| c.is_whitespace()).count();
let indent = prefix.slice(..indent_len);
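// at_newline: nothing but indentation precedes the replacement on this line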
let at_newline = indent_len == replacement_start - line_start;
let (replacement, replacement_len) = self.render_at(
&mut snippet,
indent,
at_newline,
ctx,
(replacement_start as i128 + off) as usize,
);
off +=
replacement_start as i128 - replacement_end as i128 + replacement_len as i128;
Some(replacement)
},
);
(transaction, selection, snippet)
}
}
pub type VariableResolver = dyn FnMut(&str) -> Option<Cow<str>>;
pub struct SnippetRenderCtx {
pub resolve_var: Box<VariableResolver>,
pub tab_width: usize,
pub indent_style: IndentStyle,
pub line_ending: &'static str,
}
impl SnippetRenderCtx {
#[cfg(test)]
pub(super) fn test_ctx() -> SnippetRenderCtx {
SnippetRenderCtx {
resolve_var: Box::new(|_| None),
tab_width: 4,
indent_style: IndentStyle::Spaces(4),
line_ending: "\n",
}
}
}
struct SnippetRender<'a> {
ctx: &'a mut SnippetRenderCtx,
dst: &'a mut RenderedSnippet,
src: &'a Snippet,
indent: RopeSlice<'a>,
text: Tendril,
off: usize,
at_newline: bool,
}
impl SnippetRender<'_> {
fn render_elements(&mut self, elements: &[SnippetElement]) {
for element in elements {
self.render_element(element)
}
}
fn render_element(&mut self, element: &SnippetElement) {
match *element {
SnippetElement::Tabstop { idx } => self.render_tabstop(idx),
SnippetElement::Variable {
ref name,
ref default,
ref transform,
} => {
// TODO: allow resolve_var access to the doc and make it return rope slice
// so we can access selections and other document content without allocating
if let Some(val) = (self.ctx.resolve_var)(name) {
if let Some(transform) = transform {
self.push_multiline_str(&transform.apply(
(&*val).into(),
Range {
start: 0,
end: val.chars().count(),
},
));
} else {
self.push_multiline_str(&val)
}
} else if let Some(default) = default {
self.render_elements(default)
}
}
SnippetElement::Text(ref text) => self.push_multiline_str(text),
}
}
fn push_multiline_str(&mut self, text: &str) {
let mut lines = text
.split('\n')
.map(|line| line.strip_suffix('\r').unwrap_or(line));
let first_line = lines.next().unwrap();
self.push_str(first_line, self.at_newline);
for line in lines {
self.push_newline();
self.push_str(line, true);
}
}
fn push_str(&mut self, mut text: &str, at_newline: bool) {
if at_newline {
let old_len = self.text.len();
let old_indent_len = normalize_indentation(
self.indent,
text.into(),
&mut self.text,
self.ctx.indent_style,
self.ctx.tab_width,
);
// this is ok because indentation can only be ascii chars (' ' and '\t')
self.off += self.text.len() - old_len;
text = &text[old_indent_len..];
if text.is_empty() {
self.at_newline = true;
return;
}
}
self.text.push_str(text);
self.off += text.chars().count();
}
fn push_newline(&mut self) {
self.off += self.ctx.line_ending.chars().count() + self.indent.len_chars();
self.text.push_str(self.ctx.line_ending);
self.text.extend(self.indent.chunks());
}
fn render_tabstop(&mut self, tabstop: TabstopIdx) {
let start = self.off;
let end = match &self.src[tabstop].kind {
elaborate::TabstopKind::Placeholder { default } if !default.is_empty() => {
self.render_elements(default);
self.dst[tabstop].kind = TabstopKind::Placeholder;
self.off
}
_ => start,
};
self.dst[tabstop].ranges.push(Range { start, end });
}
}
#[cfg(test)]
mod tests {
use helix_stdx::Range;
use crate::snippets::render::Tabstop;
use crate::snippets::{Snippet, SnippetRenderCtx};
use super::TabstopKind;
fn assert_snippet(snippet: &str, expect: &str, tabstops: &[Tabstop]) {
let snippet = Snippet::parse(snippet).unwrap();
let mut rendered_snippet = snippet.prepare_render();
let rendered_text = snippet
.render_at(
&mut rendered_snippet,
"\t".into(),
false,
&mut SnippetRenderCtx::test_ctx(),
0,
)
.0;
assert_eq!(rendered_text, expect);
assert_eq!(&rendered_snippet.tabstops, tabstops);
assert_eq!(
rendered_snippet.ranges.last().unwrap().end,
rendered_text.chars().count()
);
assert_eq!(rendered_snippet.ranges.last().unwrap().start, 0)
}
#[test]
fn rust_macro() {
assert_snippet(
"macro_rules! ${1:name} {\n\t($3) => {\n\t\t$2\n\t};\n}",
"macro_rules! name {\n\t () => {\n\t \n\t };\n\t}",
&[
Tabstop {
ranges: vec![Range { start: 13, end: 17 }].into(),
parent: None,
kind: TabstopKind::Placeholder,
},
Tabstop {
ranges: vec![Range { start: 42, end: 42 }].into(),
parent: None,
kind: TabstopKind::Empty,
},
Tabstop {
ranges: vec![Range { start: 26, end: 26 }].into(),
parent: None,
kind: TabstopKind::Empty,
},
Tabstop {
ranges: vec![Range { start: 53, end: 53 }].into(),
parent: None,
kind: TabstopKind::Empty,
},
],
);
}
}


@@ -1,6 +1,7 @@
use std::{
fmt,
path::{Path, PathBuf},
str::FromStr,
sync::Arc,
};
@@ -16,14 +17,6 @@ pub enum Uri {
}
impl Uri {
// This clippy allow mirrors url::Url::from_file_path
#[allow(clippy::result_unit_err)]
pub fn to_url(&self) -> Result<url::Url, ()> {
match self {
Uri::File(path) => url::Url::from_file_path(path),
}
}
pub fn as_path(&self) -> Option<&Path> {
match self {
Self::File(path) => Some(path),
@@ -45,81 +38,96 @@ impl fmt::Display for Uri {
}
}
#[derive(Debug)]
pub struct UrlConversionError {
source: url::Url,
kind: UrlConversionErrorKind,
#[derive(Debug, Clone, PartialEq, Eq)]
pub struct UriParseError {
source: String,
kind: UriParseErrorKind,
}
#[derive(Debug)]
pub enum UrlConversionErrorKind {
UnsupportedScheme,
UnableToConvert,
#[derive(Debug, Clone, PartialEq, Eq)]
pub enum UriParseErrorKind {
UnsupportedScheme(String),
MalformedUri,
}
impl fmt::Display for UrlConversionError {
impl fmt::Display for UriParseError {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
match self.kind {
UrlConversionErrorKind::UnsupportedScheme => {
match &self.kind {
UriParseErrorKind::UnsupportedScheme(scheme) => {
write!(f, "unsupported scheme '{scheme}' in URI {}", self.source)
}
UriParseErrorKind::MalformedUri => {
write!(
f,
"unsupported scheme '{}' in URL {}",
self.source.scheme(),
"unable to convert malformed URI to file path: {}",
self.source
)
}
UrlConversionErrorKind::UnableToConvert => {
write!(f, "unable to convert URL to file path: {}", self.source)
}
}
}
}
impl std::error::Error for UrlConversionError {}
impl std::error::Error for UriParseError {}
fn convert_url_to_uri(url: &url::Url) -> Result<Uri, UrlConversionErrorKind> {
if url.scheme() == "file" {
url.to_file_path()
.map(|path| Uri::File(helix_stdx::path::normalize(path).into()))
.map_err(|_| UrlConversionErrorKind::UnableToConvert)
} else {
Err(UrlConversionErrorKind::UnsupportedScheme)
impl FromStr for Uri {
type Err = UriParseError;
fn from_str(s: &str) -> Result<Self, Self::Err> {
use std::ffi::OsStr;
#[cfg(any(unix, target_os = "redox"))]
use std::os::unix::prelude::OsStrExt;
#[cfg(target_os = "wasi")]
use std::os::wasi::prelude::OsStrExt;
let Some((scheme, rest)) = s.split_once("://") else {
return Err(Self::Err {
source: s.to_string(),
kind: UriParseErrorKind::MalformedUri,
});
};
if scheme != "file" {
return Err(Self::Err {
source: s.to_string(),
kind: UriParseErrorKind::UnsupportedScheme(scheme.to_string()),
});
}
// Reject URIs that contain a query or fragment.
if s.find(['?', '#']).is_some() {
return Err(Self::Err {
source: s.to_string(),
kind: UriParseErrorKind::MalformedUri,
});
}
let mut bytes = Vec::new();
bytes.extend(percent_encoding::percent_decode(rest.as_bytes()));
Ok(PathBuf::from(OsStr::from_bytes(&bytes)).into())
}
}
impl TryFrom<url::Url> for Uri {
type Error = UrlConversionError;
impl TryFrom<&str> for Uri {
type Error = UriParseError;
fn try_from(url: url::Url) -> Result<Self, Self::Error> {
convert_url_to_uri(&url).map_err(|kind| Self::Error { source: url, kind })
}
}
impl TryFrom<&url::Url> for Uri {
type Error = UrlConversionError;
fn try_from(url: &url::Url) -> Result<Self, Self::Error> {
convert_url_to_uri(url).map_err(|kind| Self::Error {
source: url.clone(),
kind,
})
fn try_from(s: &str) -> Result<Self, Self::Error> {
s.parse()
}
}
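// A minimal usage sketch of the parser above (Unix target assumed; the path
// is hypothetical). Percent-encoded bytes in the path are decoded:
//
//     let uri: Uri = "file:///home/user/src%20dir/main.rs".parse().unwrap();
//     assert_eq!(uri.as_path(), Some(Path::new("/home/user/src dir/main.rs")));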
#[cfg(test)]
mod test {
use super::*;
use url::Url;
#[test]
fn unknown_scheme() {
let url = Url::parse("csharp:/metadata/foo/bar/Baz.cs").unwrap();
assert!(matches!(
Uri::try_from(url),
Err(UrlConversionError {
kind: UrlConversionErrorKind::UnsupportedScheme,
..
let uri = "csharp://metadata/foo/barBaz.cs";
assert_eq!(
uri.parse::<Uri>(),
Err(UriParseError {
source: uri.to_string(),
kind: UriParseErrorKind::UnsupportedScheme("csharp".to_string()),
})
));
);
}
}


@@ -1,176 +0,0 @@
# This file is automatically @generated by Cargo.
# It is not intended for manual editing.
version = 3
[[package]]
name = "bitflags"
version = "1.3.2"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "bef38d45163c2f1dde094a7dfd33ccf595c92905c8f8f4fdc18d06fb1037718a"
[[package]]
name = "form_urlencoded"
version = "1.1.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "a9c384f161156f5260c24a097c56119f9be8c798586aecc13afbcbe7b7e26bf8"
dependencies = [
"percent-encoding",
]
[[package]]
name = "idna"
version = "0.3.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "e14ddfc70884202db2244c223200c204c2bda1bc6e0998d11b5e024d657209e6"
dependencies = [
"unicode-bidi",
"unicode-normalization",
]
[[package]]
name = "itoa"
version = "1.0.4"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "4217ad341ebadf8d8e724e264f13e593e0648f5b3e94b3896a5df283be015ecc"
[[package]]
name = "lsp-types"
version = "0.95.1"
dependencies = [
"bitflags",
"serde",
"serde_json",
"serde_repr",
"url",
]
[[package]]
name = "percent-encoding"
version = "2.2.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "478c572c3d73181ff3c2539045f6eb99e5491218eae919370993b890cdbdd98e"
[[package]]
name = "proc-macro2"
version = "1.0.47"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "5ea3d908b0e36316caf9e9e2c4625cdde190a7e6f440d794667ed17a1855e725"
dependencies = [
"unicode-ident",
]
[[package]]
name = "quote"
version = "1.0.21"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "bbe448f377a7d6961e30f5955f9b8d106c3f5e449d493ee1b125c1d43c2b5179"
dependencies = [
"proc-macro2",
]
[[package]]
name = "ryu"
version = "1.0.11"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "4501abdff3ae82a1c1b477a17252eb69cee9e66eb915c1abaa4f44d873df9f09"
[[package]]
name = "serde"
version = "1.0.145"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "728eb6351430bccb993660dfffc5a72f91ccc1295abaa8ce19b27ebe4f75568b"
dependencies = [
"serde_derive",
]
[[package]]
name = "serde_derive"
version = "1.0.145"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "81fa1584d3d1bcacd84c277a0dfe21f5b0f6accf4a23d04d4c6d61f1af522b4c"
dependencies = [
"proc-macro2",
"quote",
"syn",
]
[[package]]
name = "serde_json"
version = "1.0.86"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "41feea4228a6f1cd09ec7a3593a682276702cd67b5273544757dae23c096f074"
dependencies = [
"itoa",
"ryu",
"serde",
]
[[package]]
name = "serde_repr"
version = "0.1.9"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "1fe39d9fbb0ebf5eb2c7cb7e2a47e4f462fad1379f1166b8ae49ad9eae89a7ca"
dependencies = [
"proc-macro2",
"quote",
"syn",
]
[[package]]
name = "syn"
version = "1.0.102"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "3fcd952facd492f9be3ef0d0b7032a6e442ee9b361d4acc2b1d0c4aaa5f613a1"
dependencies = [
"proc-macro2",
"quote",
"unicode-ident",
]
[[package]]
name = "tinyvec"
version = "1.6.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "87cc5ceb3875bb20c2890005a4e226a4651264a5c75edb2421b52861a0a0cb50"
dependencies = [
"tinyvec_macros",
]
[[package]]
name = "tinyvec_macros"
version = "0.1.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "cda74da7e1a664f795bb1f8a87ec406fb89a02522cf6e50620d016add6dbbf5c"
[[package]]
name = "unicode-bidi"
version = "0.3.8"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "099b7128301d285f79ddd55b9a83d5e6b9e97c92e0ea0daebee7263e932de992"
[[package]]
name = "unicode-ident"
version = "1.0.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "6ceab39d59e4c9499d4e5a8ee0e2735b891bb7308ac83dfb4e80cad195c9f6f3"
[[package]]
name = "unicode-normalization"
version = "0.1.22"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "5c5713f0fc4b5db668a2ac63cdb7bb4469d8c9fed047b1d0292cc7b0ce2ba921"
dependencies = [
"tinyvec",
]
[[package]]
name = "url"
version = "2.3.1"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "0d68c799ae75762b8c3fe375feb6600ef5602c883c5d21eb51c09f22b83c4643"
dependencies = [
"form_urlencoded",
"idna",
"percent-encoding",
"serde",
]


@@ -22,10 +22,10 @@ license = "MIT"
[dependencies]
bitflags = "2.6.0"
serde = { version = "1.0.215", features = ["derive"] }
serde = { version = "1.0.216", features = ["derive"] }
serde_json = "1.0.133"
serde_repr = "0.1"
url = {version = "2.5.4", features = ["serde"]}
percent-encoding.workspace = true
[features]
default = []


@@ -1,3 +1,5 @@
# Helix's `lsp-types`
This is a fork of the [`lsp-types`](https://crates.io/crates/lsp-types) crate ([`gluon-lang/lsp-types`](https://github.com/gluon-lang/lsp-types)) taken at version v0.95.1 (commit [3e6daee](https://github.com/gluon-lang/lsp-types/commit/3e6daee771d14db4094a554b8d03e29c310dfcbe)). This fork focuses usability improvements that make the types easier to work with for the Helix codebase. For example the URL type - the `uri` crate at this version of `lsp-types` - will be replaced with a wrapper around a string.
This is a fork of the [`lsp-types`](https://crates.io/crates/lsp-types) crate ([`gluon-lang/lsp-types`](https://github.com/gluon-lang/lsp-types)) taken at version v0.95.1 (commit [3e6daee](https://github.com/gluon-lang/lsp-types/commit/3e6daee771d14db4094a554b8d03e29c310dfcbe)). This fork focuses on usability improvements that make the types easier to work with for the Helix codebase.
The URL type has been replaced with a newtype wrapper around a `String`. The `lsp-types` crate at the forked version used [`url::Url`](https://docs.rs/url/2.5.0/url/struct.Url.html), which provides conveniences for working with URLs according to [the WHATWG URL spec](https://url.spec.whatwg.org). Helix only needs a subset of valid URLs, namely the `file://` scheme, so a wrapper around a plain `String` is sufficient. Moreover, the LSP spec requires URIs to be in [RFC3986](https://tools.ietf.org/html/rfc3986) format rather than the WHATWG format.


@@ -1,10 +1,9 @@
use serde::{Deserialize, Serialize};
use serde_json::Value;
use url::Url;
use crate::{
DynamicRegistrationClientCapabilities, PartialResultParams, Range, SymbolKind, SymbolTag,
TextDocumentPositionParams, WorkDoneProgressOptions, WorkDoneProgressParams,
TextDocumentPositionParams, Url, WorkDoneProgressOptions, WorkDoneProgressParams,
};
pub type CallHierarchyClientCapabilities = DynamicRegistrationClientCapabilities;


@@ -1,11 +1,10 @@
use std::collections::HashMap;
use serde::{Deserialize, Serialize};
use url::Url;
use crate::{
Diagnostic, PartialResultParams, StaticRegistrationOptions, TextDocumentIdentifier,
TextDocumentRegistrationOptions, WorkDoneProgressOptions, WorkDoneProgressParams,
TextDocumentRegistrationOptions, Url, WorkDoneProgressOptions, WorkDoneProgressParams,
};
/// Client capabilities specific to diagnostic pull requests.


@@ -1,10 +1,9 @@
use crate::{
PartialResultParams, Range, TextDocumentIdentifier, WorkDoneProgressOptions,
PartialResultParams, Range, TextDocumentIdentifier, Url, WorkDoneProgressOptions,
WorkDoneProgressParams,
};
use serde::{Deserialize, Serialize};
use serde_json::Value;
use url::Url;
#[derive(Debug, Eq, PartialEq, Clone, Deserialize, Serialize)]
#[serde(rename_all = "camelCase")]


@@ -3,27 +3,151 @@
Language Server Protocol types for Rust.
Based on: <https://microsoft.github.io/language-server-protocol/specification>
This library uses the URL crate for parsing URIs. Note that there is
some confusion on the meaning of URLs vs URIs:
<http://stackoverflow.com/a/28865728/393898>. According to that
information, on the classical sense of "URLs", "URLs" are a subset of
URIs, but on the modern/new meaning of URLs, they are the same as
URIs. The important take-away aspect is that the URL crate should be
able to parse any URI, such as `urn:isbn:0451450523`.
*/
#![allow(non_upper_case_globals)]
#![forbid(unsafe_code)]
#[macro_use]
extern crate bitflags;
use std::{collections::HashMap, fmt::Debug};
use bitflags::bitflags;
use std::{collections::HashMap, fmt::Debug, path::Path};
use serde::{de, de::Error as Error_, Deserialize, Serialize};
use serde_json::Value;
pub use url::Url;
#[derive(Debug, Clone, PartialEq, Eq, PartialOrd, Ord, Hash, Deserialize, Serialize)]
pub struct Url(String);
// <https://datatracker.ietf.org/doc/html/rfc3986#section-2.2>, also see
// <https://github.com/microsoft/vscode-uri/blob/6dec22d7dcc6c63c30343d3a8d56050d0078cb6a/src/uri.ts#L454-L477>
const RESERVED: &percent_encoding::AsciiSet = &percent_encoding::CONTROLS
// GEN_DELIMS
.add(b':')
.add(b'/')
.add(b'?')
.add(b'#')
.add(b'[')
.add(b']')
.add(b'@')
// SUB_DELIMS
.add(b'!')
.add(b'$')
.add(b'&')
.add(b'\'')
.add(b'(')
.add(b')')
.add(b'*')
.add(b'+')
.add(b',')
.add(b';')
.add(b'=');
impl Url {
#[cfg(any(unix, target_os = "redox", target_os = "wasi"))]
pub fn from_file_path<P: AsRef<Path>>(path: P) -> Self {
#[cfg(any(unix, target_os = "redox"))]
use std::os::unix::prelude::OsStrExt;
#[cfg(target_os = "wasi")]
use std::os::wasi::prelude::OsStrExt;
let mut serialization = String::from("file://");
// skip the root component
for component in path.as_ref().components().skip(1) {
serialization.push('/');
serialization.extend(percent_encoding::percent_encode(
component.as_os_str().as_bytes(),
RESERVED,
));
}
if &serialization == "file://" {
// A URL's path must not be empty.
serialization.push('/');
}
Self(serialization)
}
#[cfg(windows)]
pub fn from_file_path<P: AsRef<Path>>(path: P) -> Self {
from_file_path_windows(path.as_ref())
}
#[cfg_attr(not(windows), allow(dead_code))]
fn from_file_path_windows(path: &Path) -> Self {
use std::path::{Component, Prefix};
fn is_windows_drive_letter(segment: &str) -> bool {
segment.len() == 2
&& (segment.as_bytes()[0] as char).is_ascii_alphabetic()
&& matches!(segment.as_bytes()[1], b':' | b'|')
}
assert!(path.is_absolute());
let mut serialization = String::from("file://");
let mut components = path.components();
let host_start = serialization.len() + 1;
match components.next() {
Some(Component::Prefix(ref p)) => match p.kind() {
Prefix::Disk(letter) | Prefix::VerbatimDisk(letter) => {
serialization.push('/');
serialization.push(letter as char);
serialization.push(':');
}
// TODO: Prefix::UNC | Prefix::VerbatimUNC
_ => todo!("support UNC drives"),
},
_ => unreachable!("absolute windows paths must start with a prefix"),
}
let mut path_only_has_prefix = true;
for component in components {
if component == Component::RootDir {
continue;
}
path_only_has_prefix = false;
serialization.push('/');
serialization.extend(percent_encoding::percent_encode(
component.as_os_str().as_encoded_bytes(),
RESERVED,
));
}
if serialization.len() > host_start
&& is_windows_drive_letter(&serialization[host_start..])
&& path_only_has_prefix
{
serialization.push('/');
}
Self(serialization)
}
pub fn from_directory_path<P: AsRef<Path>>(path: P) -> Self {
let Self(mut serialization) = Self::from_file_path(path);
if !serialization.ends_with('/') {
serialization.push('/');
}
Self(serialization)
}
/// Returns the serialized representation of the URL as a `&str`
pub fn as_str(&self) -> &str {
&self.0
}
/// Consumes the URL, converting it into a `String`.
/// Note that the string is the serialized representation of the URL.
pub fn into_string(self) -> String {
self.0
}
}
impl From<&str> for Url {
fn from(value: &str) -> Self {
Self(value.to_string())
}
}
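// A sketch of how the constructors above serialize paths (Unix target and
// hypothetical paths assumed):
//
//     let url = Url::from_file_path("/home/user/src/main.rs");
//     assert_eq!(url.as_str(), "file:///home/user/src/main.rs");
//     // directory URLs always end with a trailing slash
//     let dir = Url::from_directory_path("/home/user/src");
//     assert_eq!(dir.as_str(), "file:///home/user/src/");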
// Large enough to contain any enumeration name defined in this crate
type PascalCaseBuf = [u8; 32];
@@ -2843,14 +2967,14 @@ mod tests {
test_serialization(
&WorkspaceEdit {
changes: Some(
vec![(Url::parse("file://test").unwrap(), vec![])]
vec![(Url::from("file://test"), vec![])]
.into_iter()
.collect(),
),
document_changes: None,
..Default::default()
},
r#"{"changes":{"file://test/":[]}}"#,
r#"{"changes":{"file://test":[]}}"#,
);
}


@@ -4,9 +4,7 @@ use serde::{Deserialize, Serialize};
use serde_json::Value;
use url::Url;
use crate::Range;
use crate::{Range, Url};
#[derive(Eq, PartialEq, Clone, Copy, Deserialize, Serialize)]
#[serde(transparent)]


@@ -1,8 +1,7 @@
use serde::{Deserialize, Serialize};
use url::Url;
use crate::{
FullDocumentDiagnosticReport, PartialResultParams, UnchangedDocumentDiagnosticReport,
FullDocumentDiagnosticReport, PartialResultParams, UnchangedDocumentDiagnosticReport, Url,
WorkDoneProgressParams,
};


@@ -1,7 +1,6 @@
use serde::{Deserialize, Serialize};
use url::Url;
use crate::OneOf;
use crate::{OneOf, Url};
#[derive(Debug, Eq, PartialEq, Clone, Default, Deserialize, Serialize)]
#[serde(rename_all = "camelCase")]


@@ -26,8 +26,8 @@ globset = "0.4.15"
log = "0.4"
serde = { version = "1.0", features = ["derive"] }
serde_json = "1.0"
tokio = { version = "1.41", features = ["rt", "rt-multi-thread", "io-util", "io-std", "time", "process", "macros", "fs", "parking_lot", "sync"] }
tokio-stream = "0.1.15"
tokio = { version = "1.42", features = ["rt", "rt-multi-thread", "io-util", "io-std", "time", "process", "macros", "fs", "parking_lot", "sync"] }
tokio-stream = "0.1.17"
parking_lot = "0.12.3"
arc-swap = "1"
slotmap.workspace = true


@@ -32,14 +32,17 @@ use tokio::{
},
};
fn workspace_for_uri(uri: lsp::Url) -> WorkspaceFolder {
fn workspace_for_path(path: &Path) -> WorkspaceFolder {
let name = path
.iter()
.last()
.expect("workspace paths should be non-empty")
.to_string_lossy()
.to_string();
lsp::WorkspaceFolder {
name: uri
.path_segments()
.and_then(|segments| segments.last())
.map(|basename| basename.to_string())
.unwrap_or_default(),
uri,
name,
uri: lsp::Url::from_directory_path(path),
}
}
@@ -55,7 +58,7 @@ pub struct Client {
config: Option<Value>,
root_path: std::path::PathBuf,
root_uri: Option<lsp::Url>,
workspace_folders: Mutex<Vec<lsp::WorkspaceFolder>>,
workspace_folders: Mutex<Vec<PathBuf>>,
initialize_notify: Arc<Notify>,
/// workspace folders added while the server is still initializing
req_timeout: u64,
@@ -80,16 +83,13 @@ impl Client {
&workspace,
workspace_is_cwd,
);
let root_uri = root
.as_ref()
.and_then(|root| lsp::Url::from_file_path(root).ok());
if self.root_path == root.unwrap_or(workspace)
|| root_uri.as_ref().map_or(false, |root_uri| {
if &self.root_path == root.as_ref().unwrap_or(&workspace)
|| root.as_ref().is_some_and(|root| {
self.workspace_folders
.lock()
.iter()
.any(|workspace| &workspace.uri == root_uri)
.any(|workspace| workspace == root)
})
{
// workspace URI is already registered so we can use this client
@@ -113,15 +113,16 @@ impl Client {
// wait and see if anyone ever runs into it.
tokio::spawn(async move {
client.initialize_notify.notified().await;
if let Some(workspace_folders_caps) = client
if let Some((workspace_folders_caps, root)) = client
.capabilities()
.workspace
.as_ref()
.and_then(|cap| cap.workspace_folders.as_ref())
.filter(|cap| cap.supported.unwrap_or(false))
.zip(root)
{
client.add_workspace_folder(
root_uri,
root,
workspace_folders_caps.change_notifications.as_ref(),
);
}
@@ -129,16 +130,14 @@ impl Client {
return true;
};
if let Some(workspace_folders_caps) = capabilities
if let Some((workspace_folders_caps, root)) = capabilities
.workspace
.as_ref()
.and_then(|cap| cap.workspace_folders.as_ref())
.filter(|cap| cap.supported.unwrap_or(false))
.zip(root)
{
self.add_workspace_folder(
root_uri,
workspace_folders_caps.change_notifications.as_ref(),
);
self.add_workspace_folder(root, workspace_folders_caps.change_notifications.as_ref());
true
} else {
// the server doesn't support multi workspaces, we need a new client
@@ -148,29 +147,19 @@ impl Client {
fn add_workspace_folder(
&self,
root_uri: Option<lsp::Url>,
root: PathBuf,
change_notifications: Option<&OneOf<bool, String>>,
) {
// root_uri is None just means that there isn't really any LSP workspace
// associated with this file. For servers that support multiple workspaces
// there is just one server so we can always just use that shared instance.
// No need to add a new workspace root here as there is no logical root for this file
// let the server deal with this
let Some(root_uri) = root_uri else {
return;
};
let workspace = workspace_for_path(&root);
// server supports workspace folders, let's add the new root to the list
self.workspace_folders
.lock()
.push(workspace_for_uri(root_uri.clone()));
self.workspace_folders.lock().push(root);
if Some(&OneOf::Left(false)) == change_notifications {
// server specifically opted out of DidWorkspaceChange notifications
// let's assume the server will request the workspace folders itself
// and that we can therefore reuse the client (but are done now)
return;
}
tokio::spawn(self.did_change_workspace(vec![workspace_for_uri(root_uri)], Vec::new()));
tokio::spawn(self.did_change_workspace(vec![workspace], Vec::new()));
}
#[allow(clippy::type_complexity, clippy::too_many_arguments)]
@@ -179,8 +168,8 @@ impl Client {
args: &[String],
config: Option<Value>,
server_environment: HashMap<String, String>,
root_path: PathBuf,
root_uri: Option<lsp::Url>,
root: Option<PathBuf>,
workspace: PathBuf,
id: LanguageServerId,
name: String,
req_timeout: u64,
@@ -212,10 +201,11 @@ impl Client {
let (server_rx, server_tx, initialize_notify) =
Transport::start(reader, writer, stderr, id, name.clone());
let workspace_folders = root_uri
.clone()
.map(|root| vec![workspace_for_uri(root)])
.unwrap_or_default();
let workspace_folders = root.clone().into_iter().collect();
let root_uri = root.clone().map(lsp::Url::from_file_path);
// `root_uri` and `workspace_folders` can be empty in case there is no workspace
// `root_path` can not be empty, use `workspace` as a fallback
let root_path = root.unwrap_or(workspace);
let client = Self {
id,
@@ -376,10 +366,12 @@ impl Client {
self.config.as_ref()
}
pub async fn workspace_folders(
&self,
) -> parking_lot::MutexGuard<'_, Vec<lsp::WorkspaceFolder>> {
self.workspace_folders.lock()
pub async fn workspace_folders(&self) -> Vec<lsp::WorkspaceFolder> {
self.workspace_folders
.lock()
.iter()
.map(|path| workspace_for_path(path))
.collect()
}
/// Execute a RPC request on the language server.
@@ -526,7 +518,7 @@ impl Client {
#[allow(deprecated)]
let params = lsp::InitializeParams {
process_id: Some(std::process::id()),
workspace_folders: Some(self.workspace_folders.lock().clone()),
workspace_folders: Some(self.workspace_folders().await),
// root_path is obsolete, but some clients like pyright still use it so we specify both.
// clients will prefer _uri if possible
root_path: self.root_path.to_str().map(|path| path.to_owned()),
@@ -748,11 +740,11 @@ impl Client {
} else {
Url::from_file_path(path)
};
Some(url.ok()?.to_string())
url.into_string()
};
let files = vec![lsp::FileRename {
old_uri: url_from_path(old_path)?,
new_uri: url_from_path(new_path)?,
old_uri: url_from_path(old_path),
new_uri: url_from_path(new_path),
}];
let request = self.call_with_timeout::<lsp::request::WillRenameFiles>(
&lsp::RenameFilesParams { files },
@@ -782,12 +774,12 @@ impl Client {
} else {
Url::from_file_path(path)
};
Some(url.ok()?.to_string())
url.into_string()
};
let files = vec![lsp::FileRename {
old_uri: url_from_path(old_path)?,
new_uri: url_from_path(new_path)?,
old_uri: url_from_path(old_path),
new_uri: url_from_path(new_path),
}];
Some(self.notify::<lsp::notification::DidRenameFiles>(lsp::RenameFilesParams { files }))
}


@@ -106,9 +106,7 @@ impl Handler {
log::warn!("LSP client was dropped: {id}");
return false;
};
let Ok(uri) = lsp::Url::from_file_path(&path) else {
return true;
};
let uri = lsp::Url::from_file_path(&path);
log::debug!(
"Sending didChangeWatchedFiles notification to client '{}'",
client.name()


@@ -2,7 +2,6 @@ mod client;
pub mod file_event;
mod file_operations;
pub mod jsonrpc;
pub mod snippet;
mod transport;
use arc_swap::ArcSwap;
@@ -67,7 +66,8 @@ pub enum OffsetEncoding {
pub mod util {
use super::*;
use helix_core::line_ending::{line_end_byte_index, line_end_char_index};
use helix_core::{chars, RopeSlice, SmallVec};
use helix_core::snippets::{RenderedSnippet, Snippet, SnippetRenderCtx};
use helix_core::{chars, RopeSlice};
use helix_core::{diagnostic::NumberOrString, Range, Rope, Selection, Tendril, Transaction};
/// Converts a diagnostic in the document to [`lsp::Diagnostic`].
@@ -355,25 +355,17 @@ pub mod util {
transaction.with_selection(selection)
}
/// Creates a [Transaction] from the [snippet::Snippet] in a completion response.
/// Creates a [Transaction] from the [Snippet] in a completion response.
/// The transaction applies the edit to all cursors.
#[allow(clippy::too_many_arguments)]
pub fn generate_transaction_from_snippet(
doc: &Rope,
selection: &Selection,
edit_offset: Option<(i128, i128)>,
replace_mode: bool,
snippet: snippet::Snippet,
line_ending: &str,
include_placeholder: bool,
tab_width: usize,
indent_width: usize,
) -> Transaction {
snippet: Snippet,
cx: &mut SnippetRenderCtx,
) -> (Transaction, RenderedSnippet) {
let text = doc.slice(..);
let mut off = 0i128;
let mut mapped_doc = doc.clone();
let mut selection_tabstops: SmallVec<[_; 1]> = SmallVec::new();
let (removed_start, removed_end) = completion_range(
text,
edit_offset,
@@ -382,8 +374,7 @@ pub mod util {
)
.expect("transaction must be valid for primary selection");
let removed_text = text.slice(removed_start..removed_end);
let (transaction, mut selection) = Transaction::change_by_selection_ignore_overlapping(
let (transaction, mapped_selection, snippet) = snippet.render(
doc,
selection,
|range| {
@@ -392,108 +383,15 @@ pub mod util {
.filter(|(start, end)| text.slice(start..end) == removed_text)
.unwrap_or_else(|| find_completion_range(text, replace_mode, cursor))
},
|replacement_start, replacement_end| {
let mapped_replacement_start = (replacement_start as i128 + off) as usize;
let mapped_replacement_end = (replacement_end as i128 + off) as usize;
let line_idx = mapped_doc.char_to_line(mapped_replacement_start);
let indent_level = helix_core::indent::indent_level_for_line(
mapped_doc.line(line_idx),
tab_width,
indent_width,
) * indent_width;
let newline_with_offset = format!(
"{line_ending}{blank:indent_level$}",
line_ending = line_ending,
blank = ""
);
let (replacement, tabstops) =
snippet::render(&snippet, &newline_with_offset, include_placeholder);
selection_tabstops.push((mapped_replacement_start, tabstops));
mapped_doc.remove(mapped_replacement_start..mapped_replacement_end);
mapped_doc.insert(mapped_replacement_start, &replacement);
off +=
replacement_start as i128 - replacement_end as i128 + replacement.len() as i128;
Some(replacement)
},
cx,
);
let changes = transaction.changes();
if changes.is_empty() {
return transaction;
}
// Don't normalize to avoid merging/reordering selections, which would
// break the association between tabstops and selections. Most ranges
// will be replaced by tabstops anyway and the final selection will be
// normalized later
selection = selection.map_no_normalize(changes);
let mut mapped_selection = SmallVec::with_capacity(selection.len());
let mut mapped_primary_idx = 0;
let primary_range = selection.primary();
for (range, (tabstop_anchor, tabstops)) in selection.into_iter().zip(selection_tabstops) {
if range == primary_range {
mapped_primary_idx = mapped_selection.len()
}
let tabstops = tabstops.first().filter(|tabstops| !tabstops.is_empty());
let Some(tabstops) = tabstops else {
// no tabstop normal mapping
mapped_selection.push(range);
continue;
};
// expand the selection to cover the tabstop to retain the helix selection semantic
// the tabstop closest to the range simply replaces `head` while anchor remains in place
// the remaining tabstops receive their own single-width cursor
if range.head < range.anchor {
let last_idx = tabstops.len() - 1;
let last_tabstop = tabstop_anchor + tabstops[last_idx].0;
// if selection is forward but was moved to the right it is
// contained entirely in the replacement text, just do a point
// selection (fallback below)
if range.anchor > last_tabstop {
let range = Range::new(range.anchor, last_tabstop);
mapped_selection.push(range);
let rem_tabstops = tabstops[..last_idx]
.iter()
.map(|tabstop| Range::point(tabstop_anchor + tabstop.0));
mapped_selection.extend(rem_tabstops);
continue;
}
} else {
let first_tabstop = tabstop_anchor + tabstops[0].0;
// if selection is forward but was moved to the right it is
// contained entirely in the replacement text, just do a point
// selection (fallback below)
if range.anchor < first_tabstop {
// we can't properly compute the next grapheme
// here because the transaction hasn't been applied yet.
// that is not a problem because the range gets grapheme-aligned anyway,
// though, so just adding one will always cause head to be grapheme
// aligned correctly when applied to the document
let range = Range::new(range.anchor, first_tabstop + 1);
mapped_selection.push(range);
let rem_tabstops = tabstops[1..]
.iter()
.map(|tabstop| Range::point(tabstop_anchor + tabstop.0));
mapped_selection.extend(rem_tabstops);
continue;
}
};
let tabstops = tabstops
.iter()
.map(|tabstop| Range::point(tabstop_anchor + tabstop.0));
mapped_selection.extend(tabstops);
}
transaction.with_selection(Selection::new(mapped_selection, mapped_primary_idx))
let transaction = transaction.with_selection(snippet.first_selection(
// we keep the direction of the old primary selection in case it changed during mapping
// but use the primary idx from the mapped selection in case ranges had to be merged
selection.primary().direction(),
mapped_selection.primary_index(),
));
(transaction, snippet)
}
pub fn generate_transaction_from_edits(
@@ -955,12 +853,8 @@ fn start_client(
workspace_is_cwd,
);
// `root_uri` and `workspace_folders` can be empty in case there is no workspace
// `root_path` can not be empty, use `workspace` as a fallback
let root_path = root.clone().unwrap_or_else(|| workspace.clone());
let root_uri = root.and_then(|root| lsp::Url::from_file_path(root).ok());
if let Some(globset) = &ls_config.required_root_patterns {
let root_path = root.as_ref().unwrap_or(&workspace);
if !root_path
.read_dir()?
.flatten()
@@ -976,8 +870,8 @@ fn start_client(
&ls_config.args,
ls_config.config.clone(),
ls_config.environment.clone(),
root_path,
root_uri,
root,
workspace,
id,
name,
ls_config.timeout,

File diff suppressed because it is too large


@@ -1,4 +1,7 @@
pub mod env;
pub mod faccess;
pub mod path;
pub mod range;
pub mod rope;
pub use range::Range;

helix-stdx/src/range.rs (new file)

@@ -0,0 +1,103 @@
use std::ops::{self, RangeBounds};
/// A range of `char`s within the text.
#[derive(Debug, Clone, Copy, PartialOrd, Ord, PartialEq, Eq)]
pub struct Range<T = usize> {
pub start: T,
pub end: T,
}
impl<T: PartialOrd> Range<T> {
pub fn contains(&self, other: Self) -> bool {
self.start <= other.start && other.end <= self.end
}
pub fn is_empty(&self) -> bool {
self.end <= self.start
}
}
impl<T> RangeBounds<T> for Range<T> {
fn start_bound(&self) -> ops::Bound<&T> {
ops::Bound::Included(&self.start)
}
fn end_bound(&self) -> ops::Bound<&T> {
ops::Bound::Excluded(&self.end)
}
}
/// Returns true if all ranges yielded by `sub_set` are contained by
/// `super_set`. This is essentially an optimized implementation of
/// `sub_set.all(|rb| super_set.any(|ra| ra.contains(rb)))` that runs in O(m+n)
/// instead of O(mn) (and in many cases faster).
///
/// Both iterators must uphold the following invariants:
/// * ranges must not overlap (but they can be adjacent)
/// * ranges must be sorted
pub fn is_subset<const ALLOW_EMPTY: bool>(
mut super_set: impl Iterator<Item = Range>,
mut sub_set: impl Iterator<Item = Range>,
) -> bool {
let (mut super_range, mut sub_range) = (super_set.next(), sub_set.next());
loop {
match (super_range, sub_range) {
// skip over irrelevant ranges
(Some(ra), Some(rb))
if ra.end <= rb.start && (ra.start != rb.start || !ALLOW_EMPTY) =>
{
super_range = super_set.next();
}
(Some(ra), Some(rb)) => {
if ra.contains(rb) {
sub_range = sub_set.next();
} else {
return false;
}
}
(None, Some(_)) => {
// exhausted `super_set`, we can't match the remainder of `sub_set`
return false;
}
(_, None) => {
// no elements from `sub_set` left to match, `super_set` contains `sub_set`
return true;
}
}
}
}
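// Example with sorted, non-overlapping ranges (values hypothetical):
//
//     let super_set = [Range { start: 0, end: 5 }, Range { start: 8, end: 12 }];
//     let sub_set = [Range { start: 1, end: 3 }, Range { start: 8, end: 12 }];
//     assert!(is_subset::<false>(super_set.into_iter(), sub_set.into_iter()));
//
// A sub range straddling the gap, e.g. 4..9, would make this return false.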
pub fn is_exact_subset(
mut super_set: impl Iterator<Item = Range>,
mut sub_set: impl Iterator<Item = Range>,
) -> bool {
let (mut super_range, mut sub_range) = (super_set.next(), sub_set.next());
let mut super_range_matched = true;
loop {
match (super_range, sub_range) {
// skip over irrelevant ranges
(Some(ra), Some(rb)) if ra.end <= rb.start && ra.start < rb.start => {
if !super_range_matched {
return false;
}
super_range_matched = false;
super_range = super_set.next();
}
(Some(ra), Some(rb)) => {
if ra.contains(rb) {
super_range_matched = true;
sub_range = sub_set.next();
} else {
return false;
}
}
(None, Some(_)) => {
// exhausted `super_set`, we can't match the remainder of `sub_set`
return false;
}
(_, None) => {
// no elements from `sub_set` left to match, `super_set` contains `sub_set`
return super_set.next().is_none();
}
}
}
}
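// Unlike `is_subset`, the exact variant also returns false when a range of
// `super_set` contains no range of `sub_set` at all:
//
//     let super_set = [Range { start: 0, end: 5 }, Range { start: 8, end: 12 }];
//     let sub_set = [Range { start: 1, end: 3 }];
//     assert!(!is_exact_subset(super_set.into_iter(), sub_set.into_iter()));
//
// because `8..12` matches nothing in `sub_set`.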


@@ -74,7 +74,7 @@ grep-searcher = "0.1.14"
[target.'cfg(not(windows))'.dependencies] # https://github.com/vorner/signal-hook/issues/100
signal-hook-tokio = { version = "0.3", features = ["futures-v0_3"] }
libc = "0.2.167"
libc = "0.2.168"
[target.'cfg(target_os = "macos")'.dependencies]
crossterm = { version = "0.28", features = ["event-stream", "use-dev-tty", "libc"] }


@@ -175,7 +175,7 @@ impl Application {
nr_of_files += 1;
if file.is_dir() {
return Err(anyhow::anyhow!(
"expected a path to file, found a directory. (to open a directory pass it as first argument)"
"expected a path to file, but found a directory: {file:?}. (to open a directory pass it as first argument)"
));
} else {
// If the user passes in either `--vsplit` or
@@ -189,6 +189,7 @@ impl Application {
Some(Layout::Horizontal) => Action::HorizontalSplit,
None => Action::Load,
};
let old_id = editor.document_id_by_path(&file);
let doc_id = match editor.open(&file, action) {
// Ignore irregular files during application init.
Err(DocumentOpenError::IrregularFile) => {
@@ -196,6 +197,11 @@ impl Application {
continue;
}
Err(err) => return Err(anyhow::anyhow!(err)),
// We can't open more than one buffer for the same file; here the file was already opened previously
Ok(doc_id) if old_id == Some(doc_id) => {
nr_of_files -= 1;
doc_id
}
Ok(doc_id) => doc_id,
};
// with Action::Load all documents have the same view
@@ -738,7 +744,7 @@ impl Application {
}
}
Notification::PublishDiagnostics(mut params) => {
let uri = match helix_core::Uri::try_from(params.uri) {
let uri = match helix_core::Uri::try_from(params.uri.as_str()) {
Ok(uri) => uri,
Err(err) => {
log::error!("{err}");
@@ -1137,7 +1143,8 @@ impl Application {
..
} = params
{
self.jobs.callback(crate::open_external_url_callback(uri));
self.jobs
.callback(crate::open_external_url_callback(uri.as_str()));
return lsp::ShowDocumentResult { success: true };
};
@@ -1148,7 +1155,7 @@ impl Application {
..
} = params;
let uri = match helix_core::Uri::try_from(uri) {
let uri = match helix_core::Uri::try_from(uri.as_str()) {
Ok(uri) => uri,
Err(err) => {
log::error!("{err}");


@@ -129,7 +129,7 @@ pub(crate) fn parse_file(s: &str) -> (PathBuf, Position) {
///
/// Does not validate if file.rs is a file or directory.
fn split_path_row_col(s: &str) -> Option<(PathBuf, Position)> {
let mut s = s.rsplitn(3, ':');
let mut s = s.trim_end_matches(':').rsplitn(3, ':');
let col: usize = s.next()?.parse().ok()?;
let row: usize = s.next()?.parse().ok()?;
let path = s.next()?.into();
@@ -141,7 +141,7 @@ fn split_path_row_col(s: &str) -> Option<(PathBuf, Position)> {
///
/// Does not validate if file.rs is a file or directory.
fn split_path_row(s: &str) -> Option<(PathBuf, Position)> {
let (path, row) = s.rsplit_once(':')?;
let (path, row) = s.trim_end_matches(':').rsplit_once(':')?;
let row: usize = row.parse().ok()?;
let path = path.into();
let pos = Position::new(row.saturating_sub(1), 0);
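// With the trailing colons trimmed, output copied verbatim from
// `git grep -n` now parses (path and line number hypothetical):
//
//     assert_eq!(
//         parse_file("src/main.rs:42:"),
//         (PathBuf::from("src/main.rs"), Position::new(41, 0))
//     );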


@@ -87,6 +87,11 @@ use grep_searcher::{sinks, BinaryDetection, SearcherBuilder};
use ignore::{DirEntry, WalkBuilder, WalkState};
pub type OnKeyCallback = Box<dyn FnOnce(&mut Context, KeyEvent)>;
#[derive(PartialEq, Eq, Clone, Copy, Debug)]
pub enum OnKeyCallbackKind {
PseudoPending,
Fallback,
}
pub struct Context<'a> {
pub register: Option<char>,
@@ -94,7 +99,7 @@ pub struct Context<'a> {
pub editor: &'a mut Editor,
pub callback: Vec<crate::compositor::Callback>,
pub on_next_key_callback: Option<OnKeyCallback>,
pub on_next_key_callback: Option<(OnKeyCallback, OnKeyCallbackKind)>,
pub jobs: &'a mut Jobs,
}
@@ -120,7 +125,19 @@ impl Context<'_> {
&mut self,
on_next_key_callback: impl FnOnce(&mut Context, KeyEvent) + 'static,
) {
self.on_next_key_callback = Some(Box::new(on_next_key_callback));
self.on_next_key_callback = Some((
Box::new(on_next_key_callback),
OnKeyCallbackKind::PseudoPending,
));
}
#[inline]
pub fn on_next_key_fallback(
&mut self,
on_next_key_callback: impl FnOnce(&mut Context, KeyEvent) + 'static,
) {
self.on_next_key_callback =
Some((Box::new(on_next_key_callback), OnKeyCallbackKind::Fallback));
}
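// A hedged sketch of the difference between the two kinds: `on_next_key`
// registers a PseudoPending callback that consumes the next key
// unconditionally (like `t`/`f`), while a Fallback callback is only invoked
// when the key matches no other mapping. Hypothetical caller:
//
//     cx.on_next_key_fallback(|cx, event| {
//         // react to an otherwise-unmapped key, e.g. while a snippet is active
//     });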
#[inline]
@@ -568,6 +585,8 @@ impl MappableCommand {
command_palette, "Open command palette",
goto_word, "Jump to a two-character label",
extend_to_word, "Extend to a two-character label",
goto_next_tabstop, "Goto next snippet placeholder",
goto_prev_tabstop, "Goto previous snippet placeholder",
);
}
@@ -1328,7 +1347,9 @@ fn open_url(cx: &mut Context, url: Url, action: Action) {
.unwrap_or_default();
if url.scheme() != "file" {
return cx.jobs.callback(crate::open_external_url_callback(url));
return cx
.jobs
.callback(crate::open_external_url_callback(url.as_str()));
}
let content_type = std::fs::File::open(url.path()).and_then(|file| {
@@ -1341,9 +1362,9 @@ fn open_url(cx: &mut Context, url: Url, action: Action) {
// we attempt to open binary files - files that can't be open in helix - using external
// program as well, e.g. pdf files or images
match content_type {
Ok(content_inspector::ContentType::BINARY) => {
cx.jobs.callback(crate::open_external_url_callback(url))
}
Ok(content_inspector::ContentType::BINARY) => cx
.jobs
.callback(crate::open_external_url_callback(url.as_str())),
Ok(_) | Err(_) => {
let path = &rel_path.join(url.path());
if path.is_dir() {
@@ -2164,7 +2185,7 @@ fn searcher(cx: &mut Context, direction: Direction) {
completions
.iter()
.filter(|comp| comp.starts_with(input))
.map(|comp| (0.., std::borrow::Cow::Owned(comp.clone())))
.map(|comp| (0.., comp.clone().into()))
.collect()
},
move |cx, regex, event| {
@@ -3458,40 +3479,42 @@ fn open(cx: &mut Context, open: Open) {
let selection = doc.selection(view.id);
let mut ranges = SmallVec::with_capacity(selection.len());
let mut offs = 0;
let mut transaction = Transaction::change_by_selection(contents, selection, |range| {
let cursor_line = text.char_to_line(match open {
// the line number where the cursor currently is
let curr_line_num = text.char_to_line(match open {
Open::Below => graphemes::prev_grapheme_boundary(text, range.to()),
Open::Above => range.from(),
});
let new_line = match open {
// adjust position to the end of the line (next line - 1)
Open::Below => cursor_line + 1,
// adjust position to the end of the previous line (current line - 1)
Open::Above => cursor_line,
// the line number where the cursor will be after the transaction is applied
let next_new_line_num = match open {
Open::Below => curr_line_num + 1,
Open::Above => curr_line_num,
};
let line_num = new_line.saturating_sub(1);
let above_next_new_line_num = next_new_line_num.saturating_sub(1);
let continue_comment_token = if doc.config.load().continue_comments {
doc.language_config()
.and_then(|config| config.comment_tokens.as_ref())
.and_then(|tokens| comment::get_comment_token(text, tokens, curr_line_num))
} else {
None
};
// Index to insert newlines after, as well as the char width
// to use to compensate for those inserted newlines.
let (line_end_index, line_end_offset_width) = if new_line == 0 {
let (above_next_line_end_index, above_next_line_end_width) = if next_new_line_num == 0 {
(0, 0)
} else {
(
line_end_char_index(&text, line_num),
line_end_char_index(&text, above_next_new_line_num),
doc.line_ending.len_chars(),
)
};
let continue_comment_token = doc
.language_config()
.and_then(|config| config.comment_tokens.as_ref())
.and_then(|tokens| comment::get_comment_token(text, tokens, cursor_line));
let line = text.line(cursor_line);
let line = text.line(curr_line_num);
let indent = match line.first_non_whitespace_char() {
Some(pos) if continue_comment_token.is_some() => line.slice(..pos).to_string(),
_ => indent::indent_for_newline(
@@ -3501,26 +3524,36 @@ fn open(cx: &mut Context, open: Open) {
&doc.indent_style,
doc.tab_width(),
text,
line_num,
line_end_index,
cursor_line,
above_next_new_line_num,
above_next_line_end_index,
curr_line_num,
),
};
let indent_len = indent.len();
let mut text = String::with_capacity(1 + indent_len);
text.push_str(doc.line_ending.as_str());
text.push_str(&indent);
if let Some(token) = continue_comment_token {
text.push_str(token);
text.push(' ');
if open == Open::Above && next_new_line_num == 0 {
text.push_str(&indent);
if let Some(token) = continue_comment_token {
text.push_str(token);
text.push(' ');
}
text.push_str(doc.line_ending.as_str());
} else {
text.push_str(doc.line_ending.as_str());
text.push_str(&indent);
if let Some(token) = continue_comment_token {
text.push_str(token);
text.push(' ');
}
}
let text = text.repeat(count);
// calculate new selection ranges
let pos = offs + line_end_index + line_end_offset_width;
let pos = above_next_line_end_index + above_next_line_end_width;
let comment_len = continue_comment_token
.map(|token| token.len() + 1) // `+ 1` for the extra space added
.unwrap_or_default();
@@ -3533,9 +3566,11 @@ fn open(cx: &mut Context, open: Open) {
));
}
offs += text.chars().count();
(line_end_index, line_end_index, Some(text.into()))
(
above_next_line_end_index,
above_next_line_end_index,
Some(text.into()),
)
});
transaction = transaction.with_selection(Selection::new(ranges, selection.primary_index()));
@@ -3917,7 +3952,11 @@ pub mod insert {
});
if !cursors_after_whitespace {
move_parent_node_end(cx);
if doc.active_snippet.is_some() {
goto_next_tabstop(cx);
} else {
move_parent_node_end(cx);
}
return;
}
}
@@ -3965,10 +4004,13 @@ pub mod insert {
let mut new_text = String::new();
let continue_comment_token = doc
.language_config()
.and_then(|config| config.comment_tokens.as_ref())
.and_then(|tokens| comment::get_comment_token(text, tokens, current_line));
let continue_comment_token = if doc.config.load().continue_comments {
doc.language_config()
.and_then(|config| config.comment_tokens.as_ref())
.and_then(|tokens| comment::get_comment_token(text, tokens, current_line))
} else {
None
};
let (from, to, local_offs) = if let Some(idx) =
text.slice(line_start..pos).last_non_whitespace_char()
@@ -6153,6 +6195,47 @@ fn increment_impl(cx: &mut Context, increment_direction: IncrementDirection) {
}
}
fn goto_next_tabstop(cx: &mut Context) {
goto_next_tabstop_impl(cx, Direction::Forward)
}
fn goto_prev_tabstop(cx: &mut Context) {
goto_next_tabstop_impl(cx, Direction::Backward)
}
fn goto_next_tabstop_impl(cx: &mut Context, direction: Direction) {
let (view, doc) = current!(cx.editor);
let view_id = view.id;
let Some(mut snippet) = doc.active_snippet.take() else {
cx.editor.set_error("no snippet is currently active");
return;
};
let tabstop = match direction {
Direction::Forward => Some(snippet.next_tabstop(doc.selection(view_id))),
Direction::Backward => snippet
.prev_tabstop(doc.selection(view_id))
.map(|selection| (selection, false)),
};
let Some((selection, last_tabstop)) = tabstop else {
return;
};
doc.set_selection(view_id, selection);
if !last_tabstop {
doc.active_snippet = Some(snippet)
}
if cx.editor.mode() == Mode::Insert {
cx.on_next_key_fallback(|cx, key| {
if let Some(c) = key.char() {
let (view, doc) = current!(cx.editor);
if let Some(snippet) = &doc.active_snippet {
doc.apply(&snippet.delete_placeholder(doc.text()), view.id);
}
insert_char(cx, c);
}
})
}
}
fn record_macro(cx: &mut Context) {
if let Some((reg, mut keys)) = cx.editor.macro_recording.take() {
// Remove the keypress which ends the recording
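
As an aside for readers tracing the new `OnKeyCallbackKind` plumbing through this file: the dispatch rule is simply "store one callback plus its kind; a key event only consumes it when the kinds match". A minimal, self-contained sketch of that rule (with `char` standing in for `KeyEvent` and the real `Context` omitted; names other than `OnKeyCallbackKind` are illustrative):

// Sketch only: mirrors EditorView::on_next_key's take-or-put-back logic.
#[derive(PartialEq, Eq, Clone, Copy)]
enum OnKeyCallbackKind {
    PseudoPending, // consumes the next key unconditionally
    Fallback,      // consumed only when no keymap entry matched
}

type OnKeyCallback = Box<dyn FnOnce(char)>;

#[derive(Default)]
struct Dispatcher {
    on_next_key: Option<(OnKeyCallback, OnKeyCallbackKind)>,
}

impl Dispatcher {
    fn fire(&mut self, kind: OnKeyCallbackKind, key: char) -> bool {
        match self.on_next_key.take() {
            Some((callback, stored)) if stored == kind => {
                callback(key);
                true
            }
            // wrong kind (or no callback): put it back untouched
            other => {
                self.on_next_key = other;
                false
            }
        }
    }
}

fn main() {
    let mut d = Dispatcher::default();
    d.on_next_key = Some((Box::new(|k| println!("fallback saw {k}")), OnKeyCallbackKind::Fallback));
    assert!(!d.fire(OnKeyCallbackKind::PseudoPending, 'x')); // keymap lookup runs first
    assert!(d.fire(OnKeyCallbackKind::Fallback, 'x')); // fires once nothing matched
}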

View File

@@ -69,7 +69,7 @@ struct Location {
}
fn lsp_location_to_location(location: lsp::Location) -> Option<Location> {
let uri = match location.uri.try_into() {
let uri = match location.uri.as_str().try_into() {
Ok(uri) => uri,
Err(err) => {
log::warn!("discarding invalid or unsupported URI: {err}");
@@ -456,7 +456,7 @@ pub fn workspace_symbol_picker(cx: &mut Context) {
.unwrap_or_default()
.into_iter()
.filter_map(|symbol| {
let uri = match Uri::try_from(&symbol.location.uri) {
let uri = match Uri::try_from(symbol.location.uri.as_str()) {
Ok(uri) => uri,
Err(err) => {
log::warn!("discarding symbol with invalid URI: {err}");
@@ -510,7 +510,7 @@ pub fn workspace_symbol_picker(cx: &mut Context) {
.to_string()
.into()
} else {
item.symbol.location.uri.to_string().into()
item.symbol.location.uri.as_str().into()
}
}),
];

View File

@@ -2507,7 +2507,8 @@ fn read(cx: &mut compositor::Context, args: &[Cow<str>], event: PromptEvent) ->
ensure!(args.len() == 1, "only the file name is expected");
let filename = args.first().unwrap();
let path = PathBuf::from(filename.to_string());
let path = helix_stdx::path::expand_tilde(PathBuf::from(filename.to_string()));
ensure!(
path.exists() && path.is_file(),
"path is not a file: {:?}",
@@ -3197,8 +3198,8 @@ pub(super) fn command_mode(cx: &mut Context) {
{
completer(editor, word)
.into_iter()
.map(|(range, file)| {
let file = shellwords::escape(file);
.map(|(range, mut file)| {
file.content = shellwords::escape(file.content);
// offset ranges to input
let offset = input.len() - word_len;

View File

@@ -1,6 +1,8 @@
use helix_event::{events, register_event};
use helix_view::document::Mode;
use helix_view::events::{DiagnosticsDidChange, DocumentDidChange, SelectionDidChange};
use helix_view::events::{
DiagnosticsDidChange, DocumentDidChange, DocumentFocusLost, SelectionDidChange,
};
use crate::commands;
use crate::keymap::MappableCommand;
@@ -16,6 +18,7 @@ pub fn register() {
register_event::<PostInsertChar>();
register_event::<PostCommand>();
register_event::<DocumentDidChange>();
register_event::<DocumentFocusLost>();
register_event::<SelectionDidChange>();
register_event::<DiagnosticsDidChange>();
}

View File

@@ -16,6 +16,7 @@ mod auto_save;
pub mod completion;
mod diagnostics;
mod signature_help;
mod snippet;
pub fn setup(config: Arc<ArcSwap<Config>>) -> Handlers {
events::register();
@@ -34,5 +35,6 @@ pub fn setup(config: Arc<ArcSwap<Config>>) -> Handlers {
signature_help::register_hooks(&handlers);
auto_save::register_hooks(&handlers);
diagnostics::register_hooks(&handlers);
snippet::register_hooks(&handlers);
handlers
}

View File

@@ -353,7 +353,7 @@ pub(super) fn register_hooks(handlers: &Handlers) {
let tx = handlers.signature_hints.clone();
register_hook!(move |event: &mut DocumentDidChange<'_>| {
if event.doc.config.load().lsp.auto_signature_help {
if event.doc.config.load().lsp.auto_signature_help && !event.ghost_transaction {
send_blocking(&tx, SignatureHelpEvent::ReTrigger);
}
Ok(())

View File

@@ -0,0 +1,28 @@
use helix_event::register_hook;
use helix_view::events::{DocumentDidChange, DocumentFocusLost, SelectionDidChange};
use helix_view::handlers::Handlers;
pub(super) fn register_hooks(_handlers: &Handlers) {
register_hook!(move |event: &mut SelectionDidChange<'_>| {
if let Some(snippet) = &event.doc.active_snippet {
if !snippet.is_valid(event.doc.selection(event.view)) {
event.doc.active_snippet = None;
}
}
Ok(())
});
register_hook!(move |event: &mut DocumentDidChange<'_>| {
if let Some(snippet) = &mut event.doc.active_snippet {
let invalid = snippet.map(event.changes);
if invalid {
event.doc.active_snippet = None;
}
}
Ok(())
});
register_hook!(move |event: &mut DocumentFocusLost<'_>| {
let editor = &mut event.editor;
doc_mut!(editor).active_snippet = None;
Ok(())
});
}

View File

@@ -307,6 +307,8 @@ pub fn language(lang_str: String) -> std::io::Result<()> {
.map(|formatter| formatter.command.to_string()),
)?;
probe_parser(lang.grammar.as_ref().unwrap_or(&lang.language_id))?;
for ts_feat in TsFeature::all() {
probe_treesitter_feature(&lang_str, *ts_feat)?
}
@@ -314,6 +316,18 @@ pub fn language(lang_str: String) -> std::io::Result<()> {
Ok(())
}
fn probe_parser(grammar_name: &str) -> std::io::Result<()> {
let stdout = std::io::stdout();
let mut stdout = stdout.lock();
write!(stdout, "Tree-sitter parser: ")?;
match helix_loader::grammar::get_language(grammar_name) {
Ok(_) => writeln!(stdout, "{}", "✓".green()),
Err(_) => writeln!(stdout, "{}", "None".yellow()),
}
}
/// Display diagnostics about multiple LSPs and DAPs.
fn probe_protocols<'a, I: Iterator<Item = &'a str> + 'a>(
protocol_name: &str,

View File

@@ -18,7 +18,6 @@ use futures_util::Future;
mod handlers;
use ignore::DirEntry;
use url::Url;
#[cfg(windows)]
fn true_color() -> bool {
@@ -70,10 +69,10 @@ fn filter_picker_entry(entry: &DirEntry, root: &Path, dedup_symlinks: bool) -> b
}
/// Opens URL in external program.
fn open_external_url_callback(
url: Url,
fn open_external_url_callback<U: AsRef<std::ffi::OsStr>>(
url: U,
) -> impl Future<Output = Result<job::Callback, anyhow::Error>> + Send + 'static {
let commands = open::commands(url.as_str());
let commands = open::commands(url);
async {
for cmd in commands {
let mut command = tokio::process::Command::new(cmd.get_program());
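
The signature change above (a concrete `Url` parameter replaced by `U: AsRef<std::ffi::OsStr>`) is what lets the call sites earlier in this diff pass `url.as_str()` directly, with no extra conversion. A small illustration of the bound, with a stub standing in for `open::commands` (the stub is not the real API):

use std::ffi::{OsStr, OsString};

// Stub with the same bound as the new open_external_url_callback:
// anything OsStr-like (&str, String, PathBuf, ...) is accepted.
fn commands<U: AsRef<OsStr>>(url: U) -> Vec<OsString> {
    vec![OsString::from("xdg-open"), url.as_ref().to_os_string()]
}

fn main() {
    let a = commands("https://example.com"); // &str, as in url.as_str()
    let b = commands(String::from("https://example.com")); // owned String also works
    assert_eq!(a, b);
}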

View File

@@ -40,8 +40,15 @@ fn main() -> Result<()> {
#[tokio::main]
async fn main_impl() -> Result<i32> {
let help = format!(
"\
let mut args = Args::parse_args().context("could not parse arguments")?;
helix_loader::initialize_config_file(args.config_file.clone());
helix_loader::initialize_log_file(args.log_file.clone());
// Help has a higher priority and should be handled separately.
if args.display_help {
print!(
"\
{} {}
{}
{}
@@ -69,21 +76,12 @@ FLAGS:
-w, --working-dir <path> Specify an initial working directory
+N Open the first given file at line number N
",
env!("CARGO_PKG_NAME"),
VERSION_AND_GIT_HASH,
env!("CARGO_PKG_AUTHORS"),
env!("CARGO_PKG_DESCRIPTION"),
helix_loader::default_log_file().display(),
);
let mut args = Args::parse_args().context("could not parse arguments")?;
helix_loader::initialize_config_file(args.config_file.clone());
helix_loader::initialize_log_file(args.log_file.clone());
// Help has a higher priority and should be handled separately.
if args.display_help {
print!("{}", help);
env!("CARGO_PKG_NAME"),
VERSION_AND_GIT_HASH,
env!("CARGO_PKG_AUTHORS"),
env!("CARGO_PKG_DESCRIPTION"),
helix_loader::default_log_file().display(),
);
std::process::exit(0);
}
@@ -154,8 +152,7 @@ FLAGS:
});
// TODO: use the thread local executor to spawn the application task separately from the work pool
let mut app =
Application::new(args, config, lang_loader).context("unable to create new application")?;
let mut app = Application::new(args, config, lang_loader).context("unable to start Helix")?;
let exit_code = app.run(&mut EventStream::new()).await?;

View File

@@ -9,14 +9,21 @@ use helix_view::{
document::SavePoint,
editor::CompleteAction,
handlers::lsp::SignatureHelpInvoked,
theme::{Modifier, Style},
theme::{Color, Modifier, Style},
ViewId,
};
use tui::{buffer::Buffer as Surface, text::Span};
use tui::{
buffer::Buffer as Surface,
text::{Span, Spans},
};
use std::{borrow::Cow, sync::Arc};
use helix_core::{self as core, chars, Change, Transaction};
use helix_core::{
self as core, chars,
snippets::{ActiveSnippet, RenderedSnippet, Snippet},
Change, Transaction,
};
use helix_view::{graphics::Rect, Document, Editor};
use crate::ui::{menu, Markdown, Menu, Popup, PromptEvent};
@@ -24,7 +31,7 @@ use crate::ui::{menu, Markdown, Menu, Popup, PromptEvent};
use helix_lsp::{lsp, util, OffsetEncoding};
impl menu::Item for CompletionItem {
type Data = ();
type Data = Style;
fn sort_text(&self, data: &Self::Data) -> Cow<str> {
self.filter_text(data)
}
@@ -42,7 +49,7 @@ impl menu::Item for CompletionItem {
}
}
fn format(&self, _data: &Self::Data) -> menu::Row {
fn format(&self, dir_style: &Self::Data) -> menu::Row {
let deprecated = match self {
CompletionItem::Lsp(LspCompletionItem { item, .. }) => {
item.deprecated.unwrap_or_default()
@@ -60,51 +67,69 @@ impl menu::Item for CompletionItem {
let kind = match self {
CompletionItem::Lsp(LspCompletionItem { item, .. }) => match item.kind {
Some(lsp::CompletionItemKind::TEXT) => "text",
Some(lsp::CompletionItemKind::METHOD) => "method",
Some(lsp::CompletionItemKind::FUNCTION) => "function",
Some(lsp::CompletionItemKind::CONSTRUCTOR) => "constructor",
Some(lsp::CompletionItemKind::FIELD) => "field",
Some(lsp::CompletionItemKind::VARIABLE) => "variable",
Some(lsp::CompletionItemKind::CLASS) => "class",
Some(lsp::CompletionItemKind::INTERFACE) => "interface",
Some(lsp::CompletionItemKind::MODULE) => "module",
Some(lsp::CompletionItemKind::PROPERTY) => "property",
Some(lsp::CompletionItemKind::UNIT) => "unit",
Some(lsp::CompletionItemKind::VALUE) => "value",
Some(lsp::CompletionItemKind::ENUM) => "enum",
Some(lsp::CompletionItemKind::KEYWORD) => "keyword",
Some(lsp::CompletionItemKind::SNIPPET) => "snippet",
Some(lsp::CompletionItemKind::COLOR) => "color",
Some(lsp::CompletionItemKind::FILE) => "file",
Some(lsp::CompletionItemKind::REFERENCE) => "reference",
Some(lsp::CompletionItemKind::FOLDER) => "folder",
Some(lsp::CompletionItemKind::ENUM_MEMBER) => "enum_member",
Some(lsp::CompletionItemKind::CONSTANT) => "constant",
Some(lsp::CompletionItemKind::STRUCT) => "struct",
Some(lsp::CompletionItemKind::EVENT) => "event",
Some(lsp::CompletionItemKind::OPERATOR) => "operator",
Some(lsp::CompletionItemKind::TYPE_PARAMETER) => "type_param",
Some(lsp::CompletionItemKind::TEXT) => "text".into(),
Some(lsp::CompletionItemKind::METHOD) => "method".into(),
Some(lsp::CompletionItemKind::FUNCTION) => "function".into(),
Some(lsp::CompletionItemKind::CONSTRUCTOR) => "constructor".into(),
Some(lsp::CompletionItemKind::FIELD) => "field".into(),
Some(lsp::CompletionItemKind::VARIABLE) => "variable".into(),
Some(lsp::CompletionItemKind::CLASS) => "class".into(),
Some(lsp::CompletionItemKind::INTERFACE) => "interface".into(),
Some(lsp::CompletionItemKind::MODULE) => "module".into(),
Some(lsp::CompletionItemKind::PROPERTY) => "property".into(),
Some(lsp::CompletionItemKind::UNIT) => "unit".into(),
Some(lsp::CompletionItemKind::VALUE) => "value".into(),
Some(lsp::CompletionItemKind::ENUM) => "enum".into(),
Some(lsp::CompletionItemKind::KEYWORD) => "keyword".into(),
Some(lsp::CompletionItemKind::SNIPPET) => "snippet".into(),
Some(lsp::CompletionItemKind::COLOR) => item
.documentation
.as_ref()
.and_then(|docs| {
let text = match docs {
lsp::Documentation::String(text) => text,
lsp::Documentation::MarkupContent(lsp::MarkupContent {
value, ..
}) => value,
};
Color::from_hex(text)
})
.map_or("color".into(), |color| {
Spans::from(vec![
Span::raw("color "),
Span::styled("", Style::default().fg(color)),
])
}),
Some(lsp::CompletionItemKind::FILE) => "file".into(),
Some(lsp::CompletionItemKind::REFERENCE) => "reference".into(),
Some(lsp::CompletionItemKind::FOLDER) => "folder".into(),
Some(lsp::CompletionItemKind::ENUM_MEMBER) => "enum_member".into(),
Some(lsp::CompletionItemKind::CONSTANT) => "constant".into(),
Some(lsp::CompletionItemKind::STRUCT) => "struct".into(),
Some(lsp::CompletionItemKind::EVENT) => "event".into(),
Some(lsp::CompletionItemKind::OPERATOR) => "operator".into(),
Some(lsp::CompletionItemKind::TYPE_PARAMETER) => "type_param".into(),
Some(kind) => {
log::error!("Received unknown completion item kind: {:?}", kind);
""
"".into()
}
None => "",
None => "".into(),
},
CompletionItem::Other(core::CompletionItem { kind, .. }) => kind,
CompletionItem::Other(core::CompletionItem { kind, .. }) => kind.as_ref().into(),
};
menu::Row::new([
menu::Cell::from(Span::styled(
label,
if deprecated {
Style::default().add_modifier(Modifier::CROSSED_OUT)
} else {
Style::default()
},
)),
menu::Cell::from(kind),
])
let label = Span::styled(
label,
if deprecated {
Style::default().add_modifier(Modifier::CROSSED_OUT)
} else if kind.0[0].content == "folder" {
*dir_style
} else {
Style::default()
},
);
menu::Row::new([menu::Cell::from(label), menu::Cell::from(kind)])
}
}
@@ -131,103 +156,10 @@ impl Completion {
// Sort completion items according to their preselect status (given by the LSP server)
items.sort_by_key(|item| !item.preselect());
let dir_style = editor.theme.get("ui.text.directory");
// Then create the menu
let menu = Menu::new(items, (), move |editor: &mut Editor, item, event| {
fn lsp_item_to_transaction(
doc: &Document,
view_id: ViewId,
item: &lsp::CompletionItem,
offset_encoding: OffsetEncoding,
trigger_offset: usize,
include_placeholder: bool,
replace_mode: bool,
) -> Transaction {
use helix_lsp::snippet;
let selection = doc.selection(view_id);
let text = doc.text().slice(..);
let primary_cursor = selection.primary().cursor(text);
let (edit_offset, new_text) = if let Some(edit) = &item.text_edit {
let edit = match edit {
lsp::CompletionTextEdit::Edit(edit) => edit.clone(),
lsp::CompletionTextEdit::InsertAndReplace(item) => {
let range = if replace_mode {
item.replace
} else {
item.insert
};
lsp::TextEdit::new(range, item.new_text.clone())
}
};
let Some(range) =
util::lsp_range_to_range(doc.text(), edit.range, offset_encoding)
else {
return Transaction::new(doc.text());
};
let start_offset = range.anchor as i128 - primary_cursor as i128;
let end_offset = range.head as i128 - primary_cursor as i128;
(Some((start_offset, end_offset)), edit.new_text)
} else {
let new_text = item
.insert_text
.clone()
.unwrap_or_else(|| item.label.clone());
// check that we are still at the correct savepoint
// we can still generate a transaction regardless but if the
// document changed (and not just the selection) then we will
// likely delete the wrong text (same if we applied an edit sent by the LS)
debug_assert!(primary_cursor == trigger_offset);
(None, new_text)
};
if matches!(item.kind, Some(lsp::CompletionItemKind::SNIPPET))
|| matches!(
item.insert_text_format,
Some(lsp::InsertTextFormat::SNIPPET)
)
{
match snippet::parse(&new_text) {
Ok(snippet) => util::generate_transaction_from_snippet(
doc.text(),
selection,
edit_offset,
replace_mode,
snippet,
doc.line_ending.as_str(),
include_placeholder,
doc.tab_width(),
doc.indent_width(),
),
Err(err) => {
log::error!(
"Failed to parse snippet: {:?}, remaining output: {}",
&new_text,
err
);
Transaction::new(doc.text())
}
}
} else {
util::generate_transaction_from_completion_edit(
doc.text(),
selection,
edit_offset,
replace_mode,
new_text,
)
}
}
fn completion_changes(transaction: &Transaction, trigger_offset: usize) -> Vec<Change> {
transaction
.changes_iter()
.filter(|(start, end, _)| (*start..=*end).contains(&trigger_offset))
.collect()
}
let menu = Menu::new(items, dir_style, move |editor: &mut Editor, item, event| {
let (view, doc) = current!(editor);
macro_rules! language_server {
@@ -272,18 +204,17 @@ impl Completion {
let item = item.unwrap();
match item {
CompletionItem::Lsp(item) => doc.apply_temporary(
&lsp_item_to_transaction(
CompletionItem::Lsp(item) => {
let (transaction, _) = lsp_item_to_transaction(
doc,
view.id,
&item.item,
language_server!(item).offset_encoding(),
trigger_offset,
true,
replace_mode,
),
view.id,
),
);
doc.apply_temporary(&transaction, view.id)
}
CompletionItem::Other(core::CompletionItem { transaction, .. }) => {
doc.apply_temporary(transaction, view.id)
}
@@ -303,7 +234,7 @@ impl Completion {
doc.append_changes_to_history(view);
// item always present here
let (transaction, additional_edits) = match item.unwrap().clone() {
let (transaction, additional_edits, snippet) = match item.unwrap().clone() {
CompletionItem::Lsp(mut item) => {
let language_server = language_server!(item);
@@ -318,29 +249,40 @@ impl Completion {
};
let encoding = language_server.offset_encoding();
let transaction = lsp_item_to_transaction(
let (transaction, snippet) = lsp_item_to_transaction(
doc,
view.id,
&item.item,
encoding,
trigger_offset,
false,
replace_mode,
);
let add_edits = item.item.additional_text_edits;
(transaction, add_edits.map(|edits| (edits, encoding)))
(
transaction,
add_edits.map(|edits| (edits, encoding)),
snippet,
)
}
CompletionItem::Other(core::CompletionItem { transaction, .. }) => {
(transaction, None)
(transaction, None, None)
}
};
doc.apply(&transaction, view.id);
let placeholder = snippet.is_some();
if let Some(snippet) = snippet {
doc.active_snippet = match doc.active_snippet.take() {
Some(active) => active.insert_subsnippet(snippet),
None => ActiveSnippet::new(snippet),
};
}
editor.last_completion = Some(CompleteAction::Applied {
trigger_offset,
changes: completion_changes(&transaction, trigger_offset),
placeholder,
});
// TODO: add additional _edits to completion_changes?
@@ -581,3 +523,86 @@ impl Component for Completion {
markdown_doc.render(doc_area, surface, cx);
}
}
fn lsp_item_to_transaction(
doc: &Document,
view_id: ViewId,
item: &lsp::CompletionItem,
offset_encoding: OffsetEncoding,
trigger_offset: usize,
replace_mode: bool,
) -> (Transaction, Option<RenderedSnippet>) {
let selection = doc.selection(view_id);
let text = doc.text().slice(..);
let primary_cursor = selection.primary().cursor(text);
let (edit_offset, new_text) = if let Some(edit) = &item.text_edit {
let edit = match edit {
lsp::CompletionTextEdit::Edit(edit) => edit.clone(),
lsp::CompletionTextEdit::InsertAndReplace(item) => {
let range = if replace_mode {
item.replace
} else {
item.insert
};
lsp::TextEdit::new(range, item.new_text.clone())
}
};
let Some(range) = util::lsp_range_to_range(doc.text(), edit.range, offset_encoding) else {
return (Transaction::new(doc.text()), None);
};
let start_offset = range.anchor as i128 - primary_cursor as i128;
let end_offset = range.head as i128 - primary_cursor as i128;
(Some((start_offset, end_offset)), edit.new_text)
} else {
let new_text = item
.insert_text
.clone()
.unwrap_or_else(|| item.label.clone());
// check that we are still at the correct savepoint
// we can still generate a transaction regardless but if the
// document changed (and not just the selection) then we will
// likely delete the wrong text (same if we applied an edit sent by the LS)
debug_assert!(primary_cursor == trigger_offset);
(None, new_text)
};
if matches!(item.kind, Some(lsp::CompletionItemKind::SNIPPET))
|| matches!(
item.insert_text_format,
Some(lsp::InsertTextFormat::SNIPPET)
)
{
let Ok(snippet) = Snippet::parse(&new_text) else {
log::error!("Failed to parse snippet: {new_text:?}");
return (Transaction::new(doc.text()), None);
};
let (transaction, snippet) = util::generate_transaction_from_snippet(
doc.text(),
selection,
edit_offset,
replace_mode,
snippet,
&mut doc.snippet_ctx(),
);
(transaction, Some(snippet))
} else {
let transaction = util::generate_transaction_from_completion_edit(
doc.text(),
selection,
edit_offset,
replace_mode,
new_text,
);
(transaction, None)
}
}
fn completion_changes(transaction: &Transaction, trigger_offset: usize) -> Vec<Change> {
transaction
.changes_iter()
.filter(|(start, end, _)| (*start..=*end).contains(&trigger_offset))
.collect()
}
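
Worth spelling out the new contract here: `lsp_item_to_transaction` returns the rendered snippet alongside the transaction, and the caller (not the helper) decides whether that snippet becomes the document's active snippet or nests inside one that is already active. A condensed, runnable sketch of that hand-off, using simplified stand-in types (only the `insert_subsnippet`/`new` calls mirror code that actually appears above):

// Stand-ins; the real types live in helix_core::snippets.
struct RenderedSnippet;
struct ActiveSnippet {
    depth: usize,
}

impl ActiveSnippet {
    fn new(_snippet: RenderedSnippet) -> Option<Self> {
        Some(Self { depth: 1 })
    }
    fn insert_subsnippet(self, _snippet: RenderedSnippet) -> Option<Self> {
        Some(Self { depth: self.depth + 1 })
    }
}

struct Doc {
    active_snippet: Option<ActiveSnippet>,
}

// Mirrors the `match doc.active_snippet.take()` block in the diff above.
fn finish_completion(doc: &mut Doc, snippet: Option<RenderedSnippet>) {
    if let Some(snippet) = snippet {
        doc.active_snippet = match doc.active_snippet.take() {
            Some(active) => active.insert_subsnippet(snippet),
            None => ActiveSnippet::new(snippet),
        };
    }
}

fn main() {
    let mut doc = Doc { active_snippet: None };
    finish_completion(&mut doc, Some(RenderedSnippet)); // first snippet becomes active
    finish_completion(&mut doc, Some(RenderedSnippet)); // second one nests inside it
    assert_eq!(doc.active_snippet.as_ref().map(|s| s.depth), Some(2));
}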

View File

@@ -1,5 +1,5 @@
use crate::{
commands::{self, OnKeyCallback},
commands::{self, OnKeyCallback, OnKeyCallbackKind},
compositor::{Component, Context, Event, EventResult},
events::{OnModeSwitch, PostCommand},
handlers::completion::CompletionItem,
@@ -37,7 +37,7 @@ use tui::{buffer::Buffer as Surface, text::Span};
pub struct EditorView {
pub keymaps: Keymaps,
on_next_key: Option<OnKeyCallback>,
on_next_key: Option<(OnKeyCallback, OnKeyCallbackKind)>,
pseudo_pending: Vec<KeyEvent>,
pub(crate) last_insert: (commands::MappableCommand, Vec<InsertEvent>),
pub(crate) completion: Option<Completion>,
@@ -147,6 +147,9 @@ impl EditorView {
}
if is_focused {
if let Some(tabstops) = Self::tabstop_highlights(doc, theme) {
overlay_highlights = Box::new(syntax::merge(overlay_highlights, tabstops));
}
let highlights = syntax::merge(
overlay_highlights,
Self::doc_selection_highlights(
@@ -592,6 +595,24 @@ impl EditorView {
Vec::new()
}
pub fn tabstop_highlights(
doc: &Document,
theme: &Theme,
) -> Option<Vec<(usize, std::ops::Range<usize>)>> {
let snippet = doc.active_snippet.as_ref()?;
let highlight = theme.find_scope_index_exact("tabstop")?;
let mut highlights = Vec::new();
for tabstop in snippet.tabstops() {
highlights.extend(
tabstop
.ranges
.iter()
.map(|range| (highlight, range.start..range.end)),
);
}
(!highlights.is_empty()).then_some(highlights)
}
/// Render bufferline at the top
pub fn render_bufferline(editor: &Editor, viewport: Rect, surface: &mut Surface) {
let scratch = PathBuf::from(SCRATCH_BUFFER_NAME); // default filename to use for scratch buffer
@@ -918,8 +939,10 @@ impl EditorView {
if let Some(keyresult) = self.handle_keymap_event(Mode::Insert, cx, event) {
match keyresult {
KeymapResult::NotFound => {
if let Some(ch) = event.char() {
commands::insert::insert_char(cx, ch)
if !self.on_next_key(OnKeyCallbackKind::Fallback, cx, event) {
if let Some(ch) = event.char() {
commands::insert::insert_char(cx, ch)
}
}
}
KeymapResult::Cancelled(pending) => {
@@ -1015,7 +1038,10 @@ impl EditorView {
// set the register
cxt.register = cxt.editor.selected_register.take();
self.handle_keymap_event(mode, cxt, event);
let res = self.handle_keymap_event(mode, cxt, event);
if matches!(&res, Some(KeymapResult::NotFound)) {
self.on_next_key(OnKeyCallbackKind::Fallback, cxt, event);
}
if self.keymaps.pending().is_empty() {
cxt.editor.count = None
} else {
@@ -1050,24 +1076,38 @@ impl EditorView {
Some(area)
}
pub fn clear_completion(&mut self, editor: &mut Editor) {
pub fn clear_completion(&mut self, editor: &mut Editor) -> Option<OnKeyCallback> {
self.completion = None;
let mut on_next_key: Option<OnKeyCallback> = None;
if let Some(last_completion) = editor.last_completion.take() {
match last_completion {
CompleteAction::Triggered => (),
CompleteAction::Applied {
trigger_offset,
changes,
} => self.last_insert.1.push(InsertEvent::CompletionApply {
trigger_offset,
changes,
}),
placeholder,
} => {
self.last_insert.1.push(InsertEvent::CompletionApply {
trigger_offset,
changes,
});
on_next_key = placeholder.then_some(Box::new(|cx, key| {
if let Some(c) = key.char() {
let (view, doc) = current!(cx.editor);
if let Some(snippet) = &doc.active_snippet {
doc.apply(&snippet.delete_placeholder(doc.text()), view.id);
}
commands::insert::insert_char(cx, c);
}
}))
}
CompleteAction::Selected { savepoint } => {
let (view, doc) = current!(editor);
doc.restore(view, &savepoint, false);
}
}
}
on_next_key
}
pub fn handle_idle_timeout(&mut self, cx: &mut commands::Context) -> EventResult {
@@ -1091,7 +1131,7 @@ impl EditorView {
modifiers: KeyModifiers::empty(),
};
// dismiss any pending keys
if let Some(on_next_key) = self.on_next_key.take() {
if let Some((on_next_key, _)) = self.on_next_key.take() {
on_next_key(cxt, null_key_event);
}
self.handle_keymap_event(cxt.editor.mode, cxt, null_key_event);
@@ -1314,6 +1354,24 @@ impl EditorView {
_ => EventResult::Ignored(None),
}
}
fn on_next_key(
&mut self,
kind: OnKeyCallbackKind,
ctx: &mut commands::Context,
event: KeyEvent,
) -> bool {
if let Some((on_next_key, kind_)) = self.on_next_key.take() {
if kind == kind_ {
on_next_key(ctx, event);
true
} else {
self.on_next_key = Some((on_next_key, kind_));
false
}
} else {
false
}
}
}
impl Component for EditorView {
@@ -1365,10 +1423,7 @@ impl Component for EditorView {
let mode = cx.editor.mode();
if let Some(on_next_key) = self.on_next_key.take() {
// if there's a command waiting input, do that first
on_next_key(&mut cx, key);
} else {
if !self.on_next_key(OnKeyCallbackKind::PseudoPending, &mut cx, key) {
match mode {
Mode::Insert => {
// let completion swallow the event if necessary
@@ -1399,7 +1454,15 @@ impl Component for EditorView {
if let Some(callback) = res {
if callback.is_some() {
// assume close_fn
self.clear_completion(cx.editor);
if let Some(cb) = self.clear_completion(cx.editor) {
if consumed {
cx.on_next_key_callback =
Some((cb, OnKeyCallbackKind::Fallback))
} else {
self.on_next_key =
Some((cb, OnKeyCallbackKind::Fallback));
}
}
}
}
}
@@ -1418,8 +1481,8 @@ impl Component for EditorView {
self.on_next_key = cx.on_next_key_callback.take();
match self.on_next_key {
Some(_) => self.pseudo_pending.push(key),
None => self.pseudo_pending.clear(),
Some((_, OnKeyCallbackKind::PseudoPending)) => self.pseudo_pending.push(key),
_ => self.pseudo_pending.clear(),
}
// appease borrowck

View File

@@ -32,6 +32,17 @@ use helix_view::Editor;
use std::{error::Error, path::PathBuf};
struct Utf8PathBuf {
path: String,
is_dir: bool,
}
impl AsRef<str> for Utf8PathBuf {
fn as_ref(&self) -> &str {
&self.path
}
}
pub fn prompt(
cx: &mut crate::commands::Context,
prompt: std::borrow::Cow<'static, str>,
@@ -266,6 +277,7 @@ pub fn file_picker(root: PathBuf, config: &helix_view::editor::Config) -> FilePi
}
pub mod completers {
use super::Utf8PathBuf;
use crate::ui::prompt::Completion;
use helix_core::fuzzy::fuzzy_match;
use helix_core::syntax::LanguageServerFeature;
@@ -274,6 +286,7 @@ pub mod completers {
use helix_view::{editor::Config, Editor};
use once_cell::sync::Lazy;
use std::borrow::Cow;
use tui::text::Span;
pub type Completer = fn(&Editor, &str) -> Vec<Completion>;
@@ -290,7 +303,7 @@ pub mod completers {
fuzzy_match(input, names, true)
.into_iter()
.map(|(name, _)| ((0..), name))
.map(|(name, _)| ((0..), name.into()))
.collect()
}
@@ -336,7 +349,7 @@ pub mod completers {
fuzzy_match(input, &*KEYS, false)
.into_iter()
.map(|(name, _)| ((0..), name.into()))
.map(|(name, _)| ((0..), Span::raw(name)))
.collect()
}
@@ -424,7 +437,7 @@ pub mod completers {
// TODO: we could return an iter/lazy thing so it can fetch as many as it needs.
fn filename_impl<F>(
_editor: &Editor,
editor: &Editor,
input: &str,
git_ignore: bool,
filter_fn: F,
@@ -482,7 +495,7 @@ pub mod completers {
return None;
}
//let is_dir = entry.file_type().map_or(false, |entry| entry.is_dir());
let is_dir = entry.file_type().is_some_and(|entry| entry.is_dir());
let path = entry.path();
let mut path = if is_tilde {
@@ -501,23 +514,35 @@ pub mod completers {
}
let path = path.into_os_string().into_string().ok()?;
Some(Cow::from(path))
Some(Utf8PathBuf { path, is_dir })
})
}) // TODO: unwrap or skip
.filter(|path| !path.is_empty());
.filter(|path| !path.path.is_empty());
let directory_color = editor.theme.get("ui.text.directory");
let style_from_file = |file: Utf8PathBuf| {
if file.is_dir {
Span::styled(file.path, directory_color)
} else {
Span::raw(file.path)
}
};
// if empty, return a list of dirs and files in current dir
if let Some(file_name) = file_name {
let range = (input.len().saturating_sub(file_name.len()))..;
fuzzy_match(&file_name, files, true)
.into_iter()
.map(|(name, _)| (range.clone(), name))
.map(|(name, _)| (range.clone(), style_from_file(name)))
.collect()
// TODO: complete to longest common match
} else {
let mut files: Vec<_> = files.map(|file| (end.clone(), file)).collect();
files.sort_unstable_by(|(_, path1), (_, path2)| path1.cmp(path2));
let mut files: Vec<_> = files
.map(|file| (end.clone(), style_from_file(file)))
.collect();
files.sort_unstable_by(|(_, path1), (_, path2)| path1.content.cmp(&path2.content));
files
}
}

View File

@@ -8,6 +8,7 @@ use helix_view::keyboard::KeyCode;
use std::sync::Arc;
use std::{borrow::Cow, ops::RangeFrom};
use tui::buffer::Buffer as Surface;
use tui::text::Span;
use tui::widgets::{Block, Widget};
use helix_core::{
@@ -19,7 +20,8 @@ use helix_view::{
};
type PromptCharHandler = Box<dyn Fn(&mut Prompt, char, &Context)>;
pub type Completion = (RangeFrom<usize>, Cow<'static, str>);
pub type Completion = (RangeFrom<usize>, Span<'static>);
type CompletionFn = Box<dyn FnMut(&Editor, &str) -> Vec<Completion>>;
type CallbackFn = Box<dyn FnMut(&mut Context, &str, PromptEvent)>;
pub type DocFn = Box<dyn Fn(&str) -> Option<Cow<str>>>;
@@ -233,15 +235,7 @@ impl Prompt {
position
}
Movement::StartOfLine => 0,
Movement::EndOfLine => {
let mut cursor =
GraphemeCursor::new(self.line.len().saturating_sub(1), self.line.len(), false);
if let Ok(Some(pos)) = cursor.next_boundary(&self.line, 0) {
pos
} else {
self.cursor
}
}
Movement::EndOfLine => self.line.len(),
Movement::None => self.cursor,
}
}
@@ -382,7 +376,7 @@ impl Prompt {
let (range, item) = &self.completion[index];
self.line.replace_range(range.clone(), item);
self.line.replace_range(range.clone(), &item.content);
self.move_end();
}
@@ -407,7 +401,7 @@ impl Prompt {
let max_len = self
.completion
.iter()
.map(|(_, completion)| completion.len() as u16)
.map(|(_, completion)| completion.content.len() as u16)
.max()
.unwrap_or(BASE_WIDTH)
.max(BASE_WIDTH);
@@ -446,18 +440,22 @@ impl Prompt {
for (i, (_range, completion)) in
self.completion.iter().enumerate().skip(offset).take(items)
{
let color = if Some(i) == self.selection {
selected_color // TODO: just invert bg
let is_selected = Some(i) == self.selection;
let completion_item_style = if is_selected {
selected_color
} else {
completion_color
completion_color.patch(completion.style)
};
surface.set_stringn(
area.x + col * (1 + col_width),
area.y + row,
completion,
&completion.content,
col_width.saturating_sub(1) as usize,
color,
completion_item_style,
);
row += 1;
if row > area.height - 1 {
row = 0;

View File

@@ -119,3 +119,128 @@ async fn insert_newline_continue_line_comment() -> anyhow::Result<()> {
Ok(())
}
/// NOTE: Language is set to markdown to check if the indentation is correct for the new line
#[tokio::test(flavor = "multi_thread")]
async fn test_open_above() -> anyhow::Result<()> {
// `O` is pressed in the first line
test((
indoc! {"Helix #[is|]# cool"},
":lang markdown<ret>O",
indoc! {"\
#[\n|]#
Helix is cool
"},
))
.await?;
// `O` is pressed in the first line, but the current line has some indentation
test((
indoc! {"\
··This line has 2 spaces in front of it#[\n|]#
"}
.replace('·', " "),
":lang markdown<ret>Oa",
indoc! {"\
··a#[\n|]#
··This line has 2 spaces in front of it
"}
.replace('·', " "),
))
.await?;
// `O` is pressed but *not* in the first line
test((
indoc! {"\
I use
b#[t|]#w.
"},
":lang markdown<ret>Oarch",
indoc! {"\
I use
arch#[\n|]#
btw.
"},
))
.await?;
// `O` is pressed but *not* in the first line and the line has some indentation
test((
indoc! {"\
I use
····b#[t|]#w.
"}
.replace("·", " "),
":lang markdown<ret>Ohelix",
indoc! {"\
I use
····helix#[\n|]#
····btw.
"}
.replace("·", " "),
))
.await?;
Ok(())
}
/// NOTE: To make the `open_above` comment-aware, we're setting the language for each test to rust.
#[tokio::test(flavor = "multi_thread")]
async fn test_open_above_with_comments() -> anyhow::Result<()> {
// `O` is pressed in the first line inside a line comment
test((
indoc! {"// a commen#[t|]#"},
":lang rust<ret>O",
indoc! {"\
// #[\n|]#
// a comment
"},
))
.await?;
// `O` is pressed in the first line inside a line comment, but with indentation
test((
indoc! {"····// a comm#[e|]#nt"}.replace("·", " "),
":lang rust<ret>O",
indoc! {"\
····// #[\n|]#
····// a comment
"}
.replace("·", " "),
))
.await?;
// `O` is pressed but not in the first line but inside a line comment
test((
indoc! {"\
fn main() { }
// yeetus deletus#[\n|]#
"},
":lang rust<ret>O",
indoc! {"\
fn main() { }
// #[\n|]#
// yeetus deletus
"},
))
.await?;
// `O` is pressed but not in the first line but inside a line comment and with indentation
test((
indoc! {"\
fn main() { }
····// yeetus deletus#[\n|]#
"}
.replace("·", " "),
":lang rust<ret>O",
indoc! {"\
fn main() { }
····// #[\n|]#
····// yeetus deletus
"}
.replace("·", " "),
))
.await?;
Ok(())
}

View File

@@ -30,9 +30,7 @@ crossterm = { version = "0.28", optional = true }
tempfile = "3.14"
# Conversion traits
once_cell = "1.20"
url = "2.5.4"
arc-swap = { version = "1.7.1" }

View File

@@ -7,6 +7,7 @@ use helix_core::auto_pairs::AutoPairs;
use helix_core::chars::char_is_word;
use helix_core::doc_formatter::TextFormat;
use helix_core::encoding::Encoding;
use helix_core::snippets::{ActiveSnippet, SnippetRenderCtx};
use helix_core::syntax::{Highlight, LanguageServerFeature};
use helix_core::text_annotations::{InlineAnnotation, Overlay};
use helix_lsp::util::lsp_pos_to_pos;
@@ -135,6 +136,7 @@ pub struct Document {
text: Rope,
selections: HashMap<ViewId, Selection>,
view_data: HashMap<ViewId, ViewData>,
pub active_snippet: Option<ActiveSnippet>,
/// Inlay hints annotations for the document, by view.
///
@@ -640,7 +642,6 @@ where
}
use helix_lsp::{lsp, Client, LanguageServerId, LanguageServerName};
use url::Url;
impl Document {
pub fn from(
@@ -655,6 +656,7 @@ impl Document {
Self {
id: DocumentId::default(),
active_snippet: None,
path: None,
encoding,
has_bom,
@@ -1412,6 +1414,8 @@ impl Document {
doc: self,
view: view_id,
old_text: &old_doc,
changes,
ghost_transaction: !emit_lsp_notification,
});
// if specified, the current selection should instead be replaced by transaction.selection
@@ -1817,8 +1821,8 @@ impl Document {
}
/// File path as a URL.
pub fn url(&self) -> Option<Url> {
Url::from_file_path(self.path()?).ok()
pub fn url(&self) -> Option<lsp::Url> {
self.path().map(lsp::Url::from_file_path)
}
pub fn uri(&self) -> Option<helix_core::Uri> {
@@ -1904,7 +1908,7 @@ impl Document {
pub fn lsp_diagnostic_to_diagnostic(
text: &Rope,
language_config: Option<&LanguageConfiguration>,
diagnostic: &helix_lsp::lsp::Diagnostic,
diagnostic: &lsp::Diagnostic,
language_server_id: LanguageServerId,
offset_encoding: helix_lsp::OffsetEncoding,
) -> Option<Diagnostic> {
@@ -2051,6 +2055,16 @@ impl Document {
}
}
pub fn snippet_ctx(&self) -> SnippetRenderCtx {
SnippetRenderCtx {
// TODO snippet variable resolution
resolve_var: Box::new(|_| None),
tab_width: self.tab_width(),
indent_style: self.indent_style,
line_ending: self.line_ending.as_str(),
}
}
pub fn text_format(&self, mut viewport_width: u16, theme: Option<&Theme>) -> TextFormat {
let config = self.config.load();
let text_width = self

View File

@@ -4,6 +4,7 @@ use crate::{
document::{
DocumentOpenError, DocumentSavedEventFuture, DocumentSavedEventResult, Mode, SavePoint,
},
events::DocumentFocusLost,
graphics::{CursorKind, Rect},
handlers::Handlers,
info::Info,
@@ -14,6 +15,7 @@ use crate::{
Document, DocumentId, View, ViewId,
};
use dap::StackFrame;
use helix_event::dispatch;
use helix_vcs::DiffProviderRegistry;
use futures_util::stream::select_all::SelectAll;
@@ -304,6 +306,9 @@ pub struct Config {
/// Whether to instruct the LSP to replace the entire word when applying a completion
/// or to only insert new text
pub completion_replace: bool,
/// `true` if Helix should automatically add a line comment token when you press
/// `enter` while inside a comment.
pub continue_comments: bool,
/// Whether to display infoboxes. Defaults to true.
pub auto_info: bool,
pub file_picker: FilePickerConfig,
@@ -985,6 +990,7 @@ impl Default for Config {
},
text_width: 80,
completion_replace: false,
continue_comments: true,
workspace_lsp_roots: Vec::new(),
default_line_ending: LineEndingConfig::default(),
insert_final_newline: true,
@@ -1131,6 +1137,7 @@ pub enum CompleteAction {
Applied {
trigger_offset: usize,
changes: Vec<Change>,
placeholder: bool,
},
}
@@ -1586,7 +1593,7 @@ impl Editor {
self.enter_normal_mode();
}
match action {
let focus_lost = match action {
Action::Replace => {
let (view, doc) = current_ref!(self);
// If the current view is an empty scratch buffer and is not displayed in any other views, delete it.
@@ -1636,6 +1643,10 @@ impl Editor {
self.replace_document_in_view(view_id, id);
dispatch(DocumentFocusLost {
editor: self,
doc: id,
});
return;
}
Action::Load => {
@@ -1646,6 +1657,7 @@ impl Editor {
return;
}
Action::HorizontalSplit | Action::VerticalSplit => {
let focus_lost = self.tree.try_get(self.tree.focus).map(|view| view.doc);
// copy the current view, unless there is no view yet
let view = self
.tree
@@ -1665,10 +1677,17 @@ impl Editor {
let doc = doc_mut!(self, &id);
doc.ensure_view_init(view_id);
doc.mark_as_focused();
focus_lost
}
}
};
self._refresh();
if let Some(focus_lost) = focus_lost {
dispatch(DocumentFocusLost {
editor: self,
doc: focus_lost,
});
}
}
/// Generate an id for a new document and register it.
@@ -1718,10 +1737,14 @@ impl Editor {
Ok(doc_id)
}
pub fn document_id_by_path(&self, path: &Path) -> Option<DocumentId> {
self.document_by_path(path).map(|doc| doc.id)
}
// possibly useful for integration tests
pub fn open(&mut self, path: &Path, action: Action) -> Result<DocumentId, DocumentOpenError> {
let path = helix_stdx::path::canonicalize(path);
let id = self.document_by_path(&path).map(|doc| doc.id);
let id = self.document_id_by_path(&path);
let id = if let Some(id) = id {
id
@@ -1895,11 +1918,15 @@ impl Editor {
let doc = doc_mut!(self, &view.doc);
view.sync_changes(doc);
}
let view = view!(self, view_id);
let doc = doc_mut!(self, &view.doc);
doc.mark_as_focused();
let focus_lost = self.tree.get(prev_id).doc;
dispatch(DocumentFocusLost {
editor: self,
doc: focus_lost,
});
}
let view = view!(self, view_id);
let doc = doc_mut!(self, &view.doc);
doc.mark_as_focused();
}
pub fn focus_next(&mut self) {

View File

@@ -1,10 +1,18 @@
use helix_core::Rope;
use helix_core::{ChangeSet, Rope};
use helix_event::events;
use crate::{Document, DocumentId, Editor, ViewId};
events! {
DocumentDidChange<'a> { doc: &'a mut Document, view: ViewId, old_text: &'a Rope }
DocumentDidChange<'a> {
doc: &'a mut Document,
view: ViewId,
old_text: &'a Rope,
changes: &'a ChangeSet,
ghost_transaction: bool
}
SelectionDidChange<'a> { doc: &'a mut Document, view: ViewId }
DiagnosticsDidChange<'a> { editor: &'a mut Editor, doc: DocumentId }
// called **after** a document loses focus (but not when it's closed)
DocumentFocusLost<'a> { editor: &'a mut Editor, doc: DocumentId }
}

View File

@@ -263,6 +263,31 @@ pub enum Color {
Indexed(u8),
}
impl Color {
/// Creates a `Color` from a hex string
///
/// # Examples
///
/// ```rust
/// use helix_view::theme::Color;
///
/// let color1 = Color::from_hex("#c0ffee").unwrap();
/// let color2 = Color::Rgb(192, 255, 238);
///
/// assert_eq!(color1, color2);
/// ```
pub fn from_hex(hex: &str) -> Option<Self> {
if !(hex.starts_with('#') && hex.len() == 7) {
return None;
}
match [1..=2, 3..=4, 5..=6].map(|i| hex.get(i).and_then(|c| u8::from_str_radix(c, 16).ok()))
{
[Some(r), Some(g), Some(b)] => Some(Self::Rgb(r, g, b)),
_ => None,
}
}
}
#[cfg(feature = "term")]
impl From<Color> for crossterm::style::Color {
fn from(color: Color) -> Self {
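
Since `from_hex` above is doing slice-index arithmetic, a standalone transcription (local `Color` enum instead of `helix_view::theme::Color`) makes the edge cases easy to poke at:

#[derive(Debug, PartialEq)]
enum Color {
    Rgb(u8, u8, u8),
}

// Same logic as Color::from_hex above, transcribed to run standalone.
fn from_hex(hex: &str) -> Option<Color> {
    if !(hex.starts_with('#') && hex.len() == 7) {
        return None;
    }
    match [1..=2, 3..=4, 5..=6].map(|i| hex.get(i).and_then(|c| u8::from_str_radix(c, 16).ok())) {
        [Some(r), Some(g), Some(b)] => Some(Color::Rgb(r, g, b)),
        _ => None,
    }
}

fn main() {
    assert_eq!(from_hex("#c0ffee"), Some(Color::Rgb(192, 255, 238)));
    assert_eq!(from_hex("c0ffee"), None); // no leading '#'
    assert_eq!(from_hex("#c0ffe"), None); // wrong length
    assert_eq!(from_hex("#zzzzzz"), None); // non-hex digits
}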

View File

@@ -57,7 +57,7 @@ pub struct ApplyEditError {
pub enum ApplyEditErrorKind {
DocumentChanged,
FileNotFound,
InvalidUrl(helix_core::uri::UrlConversionError),
InvalidUrl(helix_core::uri::UriParseError),
IoError(std::io::Error),
// TODO: check edits before applying and propagate failure
// InvalidEdit,
@@ -69,8 +69,8 @@ impl From<std::io::Error> for ApplyEditErrorKind {
}
}
impl From<helix_core::uri::UrlConversionError> for ApplyEditErrorKind {
fn from(err: helix_core::uri::UrlConversionError) -> Self {
impl From<helix_core::uri::UriParseError> for ApplyEditErrorKind {
fn from(err: helix_core::uri::UriParseError) -> Self {
ApplyEditErrorKind::InvalidUrl(err)
}
}
@@ -94,7 +94,7 @@ impl Editor {
text_edits: Vec<lsp::TextEdit>,
offset_encoding: OffsetEncoding,
) -> Result<(), ApplyEditErrorKind> {
let uri = match Uri::try_from(url) {
let uri = match Uri::try_from(url.as_str()) {
Ok(uri) => uri,
Err(err) => {
log::error!("{err}");
@@ -242,7 +242,7 @@ impl Editor {
// may no longer be valid.
match op {
ResourceOp::Create(op) => {
let uri = Uri::try_from(&op.uri)?;
let uri = Uri::try_from(op.uri.as_str())?;
let path = uri.as_path().expect("URIs are valid paths");
let ignore_if_exists = op.options.as_ref().map_or(false, |options| {
!options.overwrite.unwrap_or(false) && options.ignore_if_exists.unwrap_or(false)
@@ -262,7 +262,7 @@ impl Editor {
}
}
ResourceOp::Delete(op) => {
let uri = Uri::try_from(&op.uri)?;
let uri = Uri::try_from(op.uri.as_str())?;
let path = uri.as_path().expect("URIs are valid paths");
if path.is_dir() {
let recursive = op
@@ -284,9 +284,9 @@ impl Editor {
}
}
ResourceOp::Rename(op) => {
let from_uri = Uri::try_from(&op.old_uri)?;
let from_uri = Uri::try_from(op.old_uri.as_str())?;
let from = from_uri.as_path().expect("URIs are valid paths");
let to_uri = Uri::try_from(&op.new_uri)?;
let to_uri = Uri::try_from(op.new_uri.as_str())?;
let to = to_uri.as_path().expect("URIs are valid paths");
let ignore_if_exists = op.options.as_ref().map_or(false, |options| {
!options.overwrite.unwrap_or(false) && options.ignore_if_exists.unwrap_or(false)

View File

@@ -162,7 +162,12 @@ pub(crate) mod keys {
impl fmt::Display for KeyEvent {
fn fmt(&self, f: &mut fmt::Formatter<'_>) -> std::fmt::Result {
f.write_fmt(format_args!(
"{}{}{}",
"{}{}{}{}",
if self.modifiers.contains(KeyModifiers::SUPER) {
"Meta-"
} else {
""
},
if self.modifiers.contains(KeyModifiers::SHIFT) {
"S-"
} else {
@@ -312,6 +317,10 @@ impl UnicodeWidthStr for KeyEvent {
if self.modifiers.contains(KeyModifiers::CONTROL) {
width += 2;
}
if self.modifiers.contains(KeyModifiers::SUPER) {
// "-Meta"
width += 5;
}
width
}
@@ -413,6 +422,7 @@ impl std::str::FromStr for KeyEvent {
"S" => KeyModifiers::SHIFT,
"A" => KeyModifiers::ALT,
"C" => KeyModifiers::CONTROL,
"Meta" | "Cmd" | "Win" => KeyModifiers::SUPER,
_ => return Err(anyhow!("Invalid key modifier '{}-'", token)),
};
@@ -733,6 +743,28 @@ mod test {
modifiers: KeyModifiers::NONE
}
);
assert_eq!(
str::parse::<KeyEvent>("Meta-c").unwrap(),
KeyEvent {
code: KeyCode::Char('c'),
modifiers: KeyModifiers::SUPER
}
);
assert_eq!(
str::parse::<KeyEvent>("Win-s").unwrap(),
KeyEvent {
code: KeyCode::Char('s'),
modifiers: KeyModifiers::SUPER
}
);
assert_eq!(
str::parse::<KeyEvent>("Cmd-d").unwrap(),
KeyEvent {
code: KeyCode::Char('d'),
modifiers: KeyModifiers::SUPER
}
);
}
#[test]

View File

@@ -7,6 +7,7 @@ bitflags! {
const SHIFT = 0b0000_0001;
const CONTROL = 0b0000_0010;
const ALT = 0b0000_0100;
const SUPER = 0b0000_1000;
const NONE = 0b0000_0000;
}
}
@@ -27,6 +28,9 @@ impl From<KeyModifiers> for crossterm::event::KeyModifiers {
if key_modifiers.contains(KeyModifiers::ALT) {
result.insert(CKeyModifiers::ALT);
}
if key_modifiers.contains(KeyModifiers::SUPER) {
result.insert(CKeyModifiers::SUPER);
}
result
}
@@ -48,6 +52,9 @@ impl From<crossterm::event::KeyModifiers> for KeyModifiers {
if val.contains(CKeyModifiers::ALT) {
result.insert(KeyModifiers::ALT);
}
if val.contains(CKeyModifiers::SUPER) {
result.insert(KeyModifiers::SUPER);
}
result
}

View File

@@ -53,9 +53,11 @@ jq-lsp = { command = "jq-lsp" }
jsonnet-language-server = { command = "jsonnet-language-server", args= ["-t", "--lint"] }
julia = { command = "julia", timeout = 60, args = [ "--startup-file=no", "--history-file=no", "--quiet", "-e", "using LanguageServer; runserver()", ] }
koka = { command = "koka", args = ["--language-server", "--lsstdio"] }
koto-ls = { command = "koto-ls" }
kotlin-language-server = { command = "kotlin-language-server" }
lean = { command = "lean", args = [ "--server", "--memory=1024" ] }
ltex-ls = { command = "ltex-ls" }
ltex-ls-plus = { command = "ltex-ls-plus" }
markdoc-ls = { command = "markdoc-ls", args = ["--stdio"] }
markdown-oxide = { command = "markdown-oxide" }
marksman = { command = "marksman", args = ["server"] }
@@ -1183,6 +1185,8 @@ file-types = ["java", "jav", "pde"]
roots = ["pom.xml", "build.gradle", "build.gradle.kts"]
language-servers = [ "jdtls" ]
indent = { tab-width = 2, unit = " " }
comment-tokens = ["//"]
block-comment-tokens = { start = "/*", end = "*/" }
[[grammar]]
name = "java"
@@ -1703,7 +1707,7 @@ language-servers = [ "docker-langserver" ]
[[grammar]]
name = "dockerfile"
source = { git = "https://github.com/camdencheek/tree-sitter-dockerfile", rev = "8ee3a0f7587b2bd8c45c8cb7d28bd414604aec62" }
source = { git = "https://github.com/camdencheek/tree-sitter-dockerfile", rev = "087daa20438a6cc01fa5e6fe6906d77c869d19fe" }
[[language]]
name = "docker-compose"
@@ -2286,7 +2290,7 @@ indent = { tab-width = 4, unit = "\t" }
[[grammar]]
name = "v"
source = {git = "https://github.com/v-analyzer/v-analyzer", subpath = "tree_sitter_v", rev = "e14fdf6e661b10edccc744102e4ccf0b187aa8ad"}
source = {git = "https://github.com/vlang/v-analyzer", subpath = "tree_sitter_v", rev = "e14fdf6e661b10edccc744102e4ccf0b187aa8ad"}
[[language]]
name = "verilog"
@@ -2468,7 +2472,7 @@ language-servers = [ "slint-lsp" ]
[[grammar]]
name = "slint"
source = { git = "https://github.com/slint-ui/tree-sitter-slint", rev = "34ccfd58d3baee7636f62d9326f32092264e8407" }
source = { git = "https://github.com/slint-ui/tree-sitter-slint", rev = "f11da7e62051ba8b9d4faa299c26de8aeedfc1cd" }
[[language]]
name = "task"
@@ -3949,7 +3953,7 @@ indent = { tab-width = 4, unit = " " }
[[grammar]]
name = "spade"
source = { git = "https://gitlab.com/spade-lang/tree-sitter-spade/", rev = "4d5b141017c61fe7e168e0a5c5721ee62b0d9572" }
source = { git = "https://gitlab.com/spade-lang/tree-sitter-spade", rev = "4d5b141017c61fe7e168e0a5c5721ee62b0d9572" }
[[language]]
name = "amber"
@@ -3962,6 +3966,20 @@ indent = { tab-width = 4, unit = " " }
name = "amber"
source = { git = "https://github.com/amber-lang/tree-sitter-amber", rev = "c6df3ec2ec243ed76550c525e7ac3d9a10c6c814" }
[[language]]
name = "koto"
scope = "source.koto"
injection-regex = "koto"
file-types = ["koto"]
comment-token = "#"
block-comment-tokens = { start = "#-", end = "-#" }
indent = { tab-width = 2, unit = " " }
language-servers = ["koto-ls"]
[[grammar]]
name = "koto"
source = { git = "https://github.com/koto-lang/tree-sitter-koto", rev = "b420f7922d0d74905fd0d771e5b83be9ee8a8a9a" }
[[language]]
name = "gpr"
scope = "source.gpr"

View File

@@ -21,10 +21,10 @@
; Error level tags
((tag (name) @error)
(#match? @error "^(BUG|FIXME|ISSUE|XXX|FIX|SAFETY|FIXIT|FAILED|DEBUG|INVARIANT)$"))
(#match? @error "^(BUG|FIXME|ISSUE|XXX|FIX|SAFETY|FIXIT|FAILED|DEBUG|INVARIANT|COMPLIANCE)$"))
("text" @error
(#match? @error "^(BUG|FIXME|ISSUE|XXX|FIX|SAFETY|FIXIT|FAILED|DEBUG|INVARIANT)$"))
(#match? @error "^(BUG|FIXME|ISSUE|XXX|FIX|SAFETY|FIXIT|FAILED|DEBUG|INVARIANT|COMPLIANCE)$"))
(tag
(name) @ui.text

View File

@@ -19,6 +19,8 @@
"SHELL"
"MAINTAINER"
"CROSS_BUILD"
(heredoc_marker)
(heredoc_end)
] @keyword
[
@@ -35,7 +37,12 @@
(image_digest
"@" @punctuation.special))
(double_quoted_string) @string
[
(double_quoted_string)
(single_quoted_string)
(json_string)
(heredoc_line)
] @string
(expansion
[

View File

@@ -0,0 +1,9 @@
[
(assign)
(comment)
(function)
(list)
(map)
(tuple)
(string)
] @fold

View File

@@ -0,0 +1,152 @@
[
"="
"+"
"-"
"*"
"/"
"%"
"+="
"-="
"*="
"/="
"%="
"=="
"!="
"<"
">"
"<="
">="
".."
"..="
"->"
(null_check)
] @operator
[
"let"
] @keyword
[
"and"
"not"
"or"
] @keyword.operator
[
"return"
"yield"
] @keyword.control.return
[
"if"
"then"
"else"
"else if"
"match"
"switch"
] @keyword.control.conditional
[
(break)
(continue)
"for"
"in"
"loop"
"until"
"while"
] @keyword.control.repeat
[
"throw"
"try"
"catch"
"finally"
] @keyword.control.exception
[
"export"
"from"
"import"
"as"
] @keyword.control.import
(string (interpolation ("{") @punctuation.special))
(string (interpolation ("}") @punctuation.special))
[
"("
")"
"["
"]"
"{"
"}"
"|"
] @punctuation.bracket
[
";"
":"
","
] @punctuation.delimiter
(import_module
(identifier) @module)
(import_item
(identifier) @module)
(export
(identifier) @module)
(call
function: (identifier) @function.method)
(chain
lookup: (identifier) @variable.other.member)
[
(true)
(false)
] @constant.builtin.boolean
(comment) @comment
(debug) @keyword
(string) @string
(fill_char) @punctuation.delimiter
(alignment) @operator
(escape) @constant.character.escape
(null) @constant.builtin
(number) @constant.numeric
(meta) @keyword.directive
(meta
name: (identifier) @variable.other.member)
(entry_inline
key: (identifier) @variable.other.member)
(entry_block
key: (identifier) @variable.other.member)
(self) @variable.builtin
(variable
type: (identifier) @type)
(arg
(_ (identifier) @variable.parameter))
(ellipsis) @variable.parameter
(function
output_type: (identifier) @type)
(identifier) @variable

View File

@@ -0,0 +1,61 @@
[
(list)
(map)
(tuple)
] @indent
[
(for)
(else_if)
(else)
(match)
(switch)
(until)
(while)
] @indent @extend
(assign
"=" @indent @extend
!rhs
)
(assign
"=" @indent @extend
rhs: (_) @anchor
(#not-same-line? @indent @anchor)
)
(if
condition: (_) @indent @extend
!then
)
(if
condition: (_) @indent @extend
then: (_) @anchor
(#not-same-line? @indent @anchor)
)
(function
(args) @indent @extend
!body
)
(function
(args) @indent @extend
body: (_) @anchor
(#not-same-line? @indent @anchor)
)
(match_arm
"then" @indent @extend
!then
)
(match_arm
"then" @indent @extend
then: (_) @anchor
(#not-same-line? @indent @anchor)
)
[
"}"
"]"
")"
] @outdent

View File

@@ -0,0 +1,2 @@
((comment) @injection.content
(#set! injection.language "comment"))

View File

@@ -0,0 +1,30 @@
; Scopes
(module (_) @local.scope)
(function
body: (_) @local.scope)
; Definitions
(assign
lhs: (identifier) @local.definition)
(variable
(identifier) @local.definition)
(arg
(identifier) @local.definition)
(arg
(variable (identifier)) @local.definition)
(import_item
(identifier) @local.definition)
(entry_block
(identifier) @local.definition)
(entry_inline
(identifier) @local.definition)
; References
(identifier) @local.reference

View File

@@ -0,0 +1,38 @@
(comment) @comment.inside
(comment)+ @comment.around
(function
body: (_) @function.inside) @function.around
(args
((arg) @parameter.inside . ","? @parameter.around) @parameter.around)
(call_args
((call_arg) @parameter.inside . ","? @parameter.around) @parameter.around)
(chain
call: (tuple
((element) @parameter.inside . ","? @parameter.around) @parameter.around))
(map
((entry_inline) @entry.inside . ","? @entry.around) @entry.around)
(map_block
((entry_block) @entry.inside) @entry.around)
(list
((element) @entry.inside . ","? @entry.around) @entry.around)
(tuple
(_) @entry.around)
(assign
(meta (test))
(function body: (_) @test.inside)
) @test.around
(entry_block
key: (meta (test))
value: (function body: (_) @test.inside)
) @test.around

View File

@@ -0,0 +1,22 @@
(procedure_declaration (identifier) (procedure (block) @function.inside)) @function.around
(procedure_declaration (identifier) (procedure (uninitialized) @function.inside)) @function.around
(overloaded_procedure_declaration (identifier) @function.inside) @function.around
(procedure_type (parameters (parameter (identifier) @parameter.inside) @parameter.around))
(procedure (parameters (parameter (identifier) @parameter.inside) @parameter.around))
((procedure_declaration
(attributes (attribute "@" "(" (identifier) @attr_name ")"))
(identifier) (procedure (block) @test.inside)) @test.around
(#match? @attr_name "test"))
(comment) @comment.inside
(comment)+ @comment.around
(block_comment) @comment.inside
(block_comment)+ @comment.around
(struct_declaration (identifier) "::") @class.around
(enum_declaration (identifier) "::") @class.around
(union_declaration (identifier) "::") @class.around
(bit_field_declaration (identifier) "::") @class.around
(const_declaration (identifier) "::" [(array_type) (distinct_type) (bit_set_type) (pointer_type)]) @class.around

View File

@@ -14,4 +14,5 @@
[
"}"
")"
"]"
] @outdent

View File

@@ -21,7 +21,7 @@
"variable" = "text"
"variable.parameter" = { fg = "maroon", modifiers = ["italic"] }
"variable.builtin" = "red"
"variable.other.member" = "teal"
"variable.other.member" = "blue"
"label" = "sapphire" # used for lifetimes
@@ -50,10 +50,10 @@
"markup.heading.5" = "pink"
"markup.heading.6" = "teal"
"markup.list" = "mauve"
"markup.bold" = { modifiers = ["bold"] }
"markup.italic" = { modifiers = ["italic"] }
"markup.list.unchecked" = "overlay2"
"markup.list.checked" = "green"
"markup.bold" = { modifiers = ["bold"] }
"markup.italic" = { modifiers = ["italic"] }
"markup.link.url" = { fg = "blue", modifiers = ["italic", "underlined"] }
"markup.link.text" = "blue"
"markup.raw" = "flamingo"
@@ -86,6 +86,7 @@
"ui.text" = "text"
"ui.text.focus" = { fg = "text", bg = "surface0", modifiers = ["bold"] }
"ui.text.inactive" = { fg = "overlay1" }
"ui.text.directory" = { fg = "blue" }
"ui.virtual" = "overlay0"
"ui.virtual.ruler" = { bg = "surface0" }

View File

@@ -72,11 +72,13 @@
"ui.bufferline.background" = { bg = "background" }
"ui.text" = { fg = "text" }
"ui.text.focus" = { fg = "white" }
"ui.text.directory" = { fg = "blue3" }
"ui.text.inactive" = { fg = "dark_gray" }
"ui.virtual.whitespace" = { fg = "#3e3e3d" }
"ui.virtual.ruler" = { bg = "borders" }
"ui.virtual.indent-guide" = { fg = "dark_gray4" }
"ui.virtual.inlay-hint" = { fg = "dark_gray5"}
"ui.virtual.jump-label" = { fg = "dark_gray", modifiers = ["bold"] }
"ui.virtual.jump-label" = { fg = "yellow", modifiers = ["bold"] }
"ui.highlight.frameline" = { bg = "#4b4b18" }
"ui.debug.active" = { fg = "#ffcc00" }
"ui.debug.breakpoint" = { fg = "#e51400" }

View File

@@ -118,6 +118,7 @@
"ui.statusline.select" = { fg = "black", bg = "cyan", modifiers = ["bold"] }
"ui.text" = { fg = "foreground" }
"ui.text.focus" = { fg = "cyan" }
"ui.text.directory" = { fg = "cyan" }
"ui.virtual.indent-guide" = { fg = "indent" }
"ui.virtual.inlay-hint" = { fg = "cyan" }
"ui.virtual.inlay-hint.parameter" = { fg = "cyan", modifiers = ["italic", "dim"] }

View File

@@ -42,6 +42,7 @@
"ui.statusline.select" = { fg = "background_dark", bg = "purple" }
"ui.text" = { fg = "foreground" }
"ui.text.focus" = { fg = "cyan" }
"ui.text.directory" = { fg = "cyan" }
"ui.window" = { fg = "foreground" }
"ui.virtual.jump-label" = { fg = "pink", modifiers = ["bold"] }
"ui.virtual.ruler" = { bg = "background_dark" }

View File

@@ -94,6 +94,7 @@
"ui.window" = { fg = "bg4", bg = "bg_dim" }
"ui.help" = { fg = "fg", bg = "bg2" }
"ui.text" = "fg"
"ui.text.directory" = { fg = "green" }
"ui.text.focus" = "fg"
"ui.menu" = { fg = "fg", bg = "bg3" }
"ui.menu.selected" = { fg = "bg0", bg = "green" }

View File

@@ -93,6 +93,7 @@
"ui.window" = { fg = "bg4", bg = "bg_dim" }
"ui.help" = { fg = "fg", bg = "bg2" }
"ui.text" = "fg"
"ui.text.directory" = { fg = "green" }
"ui.text.focus" = "fg"
"ui.menu" = { fg = "fg", bg = "bg3" }
"ui.menu.selected" = { fg = "bg0", bg = "green" }

View File

@@ -59,6 +59,7 @@ label = "scale.red.3"
"ui.text" = { fg = "fg.muted" }
"ui.text.focus" = { fg = "fg.default" }
"ui.text.inactive" = "fg.subtle"
"ui.text.directory" = { fg = "scale.blue.2" }
"ui.virtual" = { fg = "scale.gray.6" }
"ui.virtual.ruler" = { bg = "canvas.subtle" }
"ui.virtual.jump-label" = { fg = "scale.red.2", modifiers = ["bold"] }

View File

@@ -59,6 +59,7 @@ label = "scale.red.5"
"ui.text" = { fg = "fg.muted" }
"ui.text.focus" = { fg = "fg.default" }
"ui.text.inactive" = "fg.subtle"
"ui.text.directory" = { fg = "scale.blue.4" }
"ui.virtual" = { fg = "scale.gray.2" }
"ui.virtual.ruler" = { bg = "canvas.subtle" }

View File

@@ -106,6 +106,7 @@
"ui.statusline.select" = { fg = "bg1", bg = "orange1", modifiers = ["bold"] }
"ui.text" = { fg = "fg1" }
"ui.text.directory" = { fg = "blue1" }
"ui.virtual.inlay-hint" = { fg = "gray" }
"ui.virtual.jump-label" = { fg = "purple0", modifiers = ["bold"] }
"ui.virtual.ruler" = { bg = "bg1" }

View File

@@ -60,6 +60,7 @@
"ui.background" = { bg = "bg", fg = "text" }
"ui.text" = { fg = "text" }
"ui.text.directory" = { fg = "blue" }
"ui.statusline" = { bg = "bg", fg = "text" }
"ui.statusline.inactive" = { bg = "bg", fg = "disabled" }

View File

@@ -36,6 +36,7 @@
"ui.text" = { fg = "text" }
"ui.text.focus" = { bg = "overlay" }
"ui.text.info" = { fg = "subtle" }
"ui.text.directory" = { fg = "iris" }
"ui.virtual.jump-label" = { fg = "love", modifiers = ["bold"] }
"ui.virtual.ruler" = { bg = "overlay" }

View File

@@ -89,6 +89,7 @@ hint = { fg = "hint" }
"ui.text.focus" = { bg = "bg-focus" }
"ui.text.inactive" = { fg = "comment", modifiers = ["italic"] }
"ui.text.info" = { bg = "bg-menu", fg = "fg" }
"ui.text.directory" = { fg = "cyan" }
"ui.virtual.ruler" = { bg = "fg-gutter" }
"ui.virtual.whitespace" = { fg = "fg-gutter" }
"ui.virtual.inlay-hint" = { bg = "bg-inlay", fg = "teal" }

View File

@@ -27,6 +27,7 @@ string = "silver"
"constant.character.escape" = "honey"
# used for lifetimes
label = "honey"
tabstop = { modifiers = ["italic"], bg = "bossanova" }
"markup.heading" = "lilac"
"markup.bold" = { modifiers = ["bold"] }
@@ -55,6 +56,7 @@ label = "honey"
"ui.text" = { fg = "lavender" }
"ui.text.focus" = { fg = "white" }
"ui.text.inactive" = "sirocco"
"ui.text.directory" = { fg = "lilac" }
"ui.virtual" = { fg = "comet" }
"ui.virtual.ruler" = { bg = "bossanova" }
"ui.virtual.jump-label" = { fg = "apricot", modifiers = ["bold"] }