Update upstream source from tag 'upstream/15.5.4+ds2'

Update to upstream version '15.5.4+ds2'
with Debian dir a1d9cffe53
Pirate Praveen 2022-12-03 16:12:37 +05:30
commit 60d6134bd9
2843 changed files with 578254 additions and 0 deletions

@ -0,0 +1,21 @@
---
name: Bug report
about: Create a report to help us improve
title: ''
labels: ''
assignees: ''
---
**Describe the bug**
A clear and concise description of what the bug is. Screenshots are often helpful here.
Do *NOT* just paste a link to other issues on GitHub.
**To Reproduce**
1. Provide copyable *text* (not an image) that reproduces the issue.
2. Provide a `chroma` command-line invocation to reproduce the issue with the above input text, e.g. for Hugo the (rough) equivalent would be `chroma -s monokailight --html --html-lines --html-lines-table --html-inline-styles <source>`.
Do *NOT* provide configuration for another tool (e.g. Hugo). My time is limited, and if you want me to fix your issue, help me help you.

@ -0,0 +1,14 @@
---
name: Feature request
about: Suggest an idea for this project
title: ''
labels: ''
assignees: ''
---
**What problem does this feature solve?**
Please check the Chroma [README](https://github.com/alecthomas/chroma) and command-line tool to ensure this isn't an already solved problem.
**What feature do you propose?**

@ -0,0 +1,19 @@
# Binaries for programs and plugins
*.exe
*.dll
*.so
*.dylib
/cmd/chroma/chroma
# Test binary, build with `go test -c`
*.test
# Output of the go coverage tool, specifically when used with LiteIDE
*.out
# Project-local glide cache, RE: https://github.com/Masterminds/glide/issues/736
.glide/
_models/
_examples/

@ -0,0 +1,55 @@
run:
tests: true
skip-dirs:
- _examples
output:
print-issued-lines: false
linters:
enable-all: true
disable:
- maligned
- megacheck
- lll
- gocyclo
- dupl
- gochecknoglobals
- funlen
- godox
- wsl
- gomnd
- gocognit
linters-settings:
govet:
check-shadowing: true
gocyclo:
min-complexity: 10
dupl:
threshold: 100
goconst:
min-len: 8
min-occurrences: 3
issues:
max-per-linter: 0
max-same: 0
exclude-use-default: false
exclude:
# Captured by errcheck.
- '^(G104|G204):'
# Very commonly not checked.
- 'Error return value of .(.*\.Help|.*\.MarkFlagRequired|(os\.)?std(out|err)\..*|.*Close|.*Flush|os\.Remove(All)?|.*printf?|os\.(Un)?Setenv). is not checked'
- 'exported method (.*\.MarshalJSON|.*\.UnmarshalJSON|.*\.EntityURN|.*\.GoString|.*\.Pos) should have comment or be unexported'
- 'composite literal uses unkeyed fields'
- 'declaration of "err" shadows declaration'
- 'should not use dot imports'
- 'Potential file inclusion via variable'
- 'should have comment or be unexported'
- 'comment on exported var .* should be of the form'
- 'at least one file in a package should have a package comment'
- 'string literal contains the Unicode'
- 'methods on the same type should have the same receiver name'
- '_TokenType_name should be _TokenTypeName'
- '`_TokenType_map` should be `_TokenTypeMap`'

@ -0,0 +1,33 @@
project_name: chroma
release:
github:
owner: alecthomas
name: chroma
brews:
-
install: bin.install "chroma"
builds:
- goos:
- linux
- darwin
- windows
goarch:
- amd64
- "386"
goarm:
- "6"
main: ./cmd/chroma/main.go
ldflags: -s -w -X main.version={{.Version}} -X main.commit={{.Commit}} -X main.date={{.Date}}
binary: chroma
archives:
-
format: tar.gz
name_template: '{{ .Binary }}-{{ .Version }}-{{ .Os }}-{{ .Arch }}{{ if .Arm }}v{{
.Arm }}{{ end }}'
files:
- COPYING
- README*
snapshot:
name_template: SNAPSHOT-{{ .Commit }}
checksum:
name_template: '{{ .ProjectName }}-{{ .Version }}-checksums.txt'

@ -0,0 +1,12 @@
sudo: false
language: go
go:
- "1.13.x"
script:
- go test -v ./...
- curl -sfL https://install.goreleaser.com/github.com/golangci/golangci-lint.sh | bash -s v1.22.2
- ./bin/golangci-lint run
- git clean -fdx .
after_success:
curl -sL https://git.io/goreleaser | bash && goreleaser

@ -0,0 +1,19 @@
Copyright (C) 2017 Alec Thomas
Permission is hereby granted, free of charge, to any person obtaining a copy of
this software and associated documentation files (the "Software"), to deal in
the Software without restriction, including without limitation the rights to
use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies
of the Software, and to permit persons to whom the Software is furnished to do
so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.

@ -0,0 +1,19 @@
.PHONY: chromad upload all
all: README.md tokentype_string.go
README.md: lexers/*/*.go
./table.py
tokentype_string.go: types.go
go generate
chromad:
(cd ./cmd/chromad && go get github.com/GeertJohan/go.rice/rice@master && go install github.com/GeertJohan/go.rice/rice)
rm -f chromad
	(export CGO_ENABLED=0 GOOS=linux ; cd ./cmd/chromad && go build -o ../../chromad .)
rice append -i ./cmd/chromad --exec=./chromad
upload: chromad
scp chromad root@swapoff.org: && \
ssh root@swapoff.org 'install -m755 ./chromad /srv/http/swapoff.org/bin && service chromad restart'

@ -0,0 +1,267 @@
# Chroma — A general purpose syntax highlighter in pure Go [![Golang Documentation](https://godoc.org/github.com/alecthomas/chroma?status.svg)](https://godoc.org/github.com/alecthomas/chroma) [![Build Status](https://travis-ci.org/alecthomas/chroma.svg)](https://travis-ci.org/alecthomas/chroma) [![Gitter chat](https://badges.gitter.im/alecthomas.svg)](https://gitter.im/alecthomas/Lobby)
> **NOTE:** As Chroma has just been released, its API is still in flux. That said, the high-level interface should not change significantly.
Chroma takes source code and other structured text and converts it into syntax
highlighted HTML, ANSI-coloured text, etc.
Chroma is based heavily on [Pygments](http://pygments.org/), and includes
translators for Pygments lexers and styles.
<a id="markdown-table-of-contents" name="table-of-contents"></a>
## Table of Contents
<!-- TOC -->
1. [Table of Contents](#table-of-contents)
2. [Supported languages](#supported-languages)
3. [Try it](#try-it)
4. [Using the library](#using-the-library)
1. [Quick start](#quick-start)
2. [Identifying the language](#identifying-the-language)
3. [Formatting the output](#formatting-the-output)
4. [The HTML formatter](#the-html-formatter)
5. [More detail](#more-detail)
1. [Lexers](#lexers)
2. [Formatters](#formatters)
3. [Styles](#styles)
6. [Command-line interface](#command-line-interface)
7. [What's missing compared to Pygments?](#whats-missing-compared-to-pygments)
<!-- /TOC -->
<a id="markdown-supported-languages" name="supported-languages"></a>
## Supported languages
Prefix | Language
:----: | --------
A | ABAP, ABNF, ActionScript, ActionScript 3, Ada, Angular2, ANTLR, ApacheConf, APL, AppleScript, Arduino, Awk
B | Ballerina, Base Makefile, Bash, Batchfile, BlitzBasic, BNF, Brainfuck
C | C, C#, C++, Cap'n Proto, Cassandra CQL, Ceylon, CFEngine3, cfstatement, ChaiScript, Cheetah, Clojure, CMake, COBOL, CoffeeScript, Common Lisp, Coq, Crystal, CSS, Cython
D | D, Dart, Diff, Django/Jinja, Docker, DTD
E | EBNF, Elixir, Elm, EmacsLisp, Erlang
F | Factor, Fish, Forth, Fortran, FSharp
G | GAS, GDScript, Genshi, Genshi HTML, Genshi Text, GLSL, Gnuplot, Go, Go HTML Template, Go Text Template, GraphQL, Groovy
H | Handlebars, Haskell, Haxe, HCL, Hexdump, HTML, HTTP, Hy
I | Idris, INI, Io
J | J, Java, JavaScript, JSON, Julia, Jungle
K | Kotlin
L | Lighttpd configuration file, LLVM, Lua
M | Mako, markdown, Mason, Mathematica, Matlab, MiniZinc, MLIR, Modula-2, MonkeyC, MorrowindScript, Myghty, MySQL
N | NASM, Newspeak, Nginx configuration file, Nim, Nix
O | Objective-C, OCaml, Octave, OpenSCAD, Org Mode
P | PacmanConf, Perl, PHP, Pig, PkgConfig, PL/pgSQL, plaintext, PostgreSQL SQL dialect, PostScript, POVRay, PowerShell, Prolog, Protocol Buffer, Puppet, Python, Python 3
Q | QBasic
R | R, Racket, Ragel, react, reg, reStructuredText, Rexx, Ruby, Rust
S | Sass, Scala, Scheme, Scilab, SCSS, Smalltalk, Smarty, SML, Snobol, Solidity, SPARQL, SQL, SquidConf, Swift, SYSTEMD, systemverilog
T | TableGen, TASM, Tcl, Tcsh, Termcap, Terminfo, Terraform, TeX, Thrift, TOML, TradingView, Transact-SQL, Turing, Turtle, Twig, TypeScript, TypoScript, TypoScriptCssData, TypoScriptHtmlData
V | VB.net, verilog, VHDL, VimL, vue
W | WDTE
X | XML, Xorg
Y | YAML
_I will attempt to keep this section up to date, but an authoritative list can be
displayed with `chroma --list`._
<a id="markdown-try-it" name="try-it"></a>
## Try it
Try out various languages and styles on the [Chroma Playground](https://swapoff.org/chroma/playground/).
<a id="markdown-using-the-library" name="using-the-library"></a>
## Using the library
Chroma, like Pygments, has the concepts of
[lexers](https://github.com/alecthomas/chroma/tree/master/lexers),
[formatters](https://github.com/alecthomas/chroma/tree/master/formatters) and
[styles](https://github.com/alecthomas/chroma/tree/master/styles).
Lexers convert source text into a stream of tokens, styles specify how token
types are mapped to colours, and formatters convert tokens and styles into
formatted output.
A package exists for each of these, containing a global `Registry` variable
with all of the registered implementations. There are also helper functions
for using the registry in each package, such as looking up lexers by name or
matching filenames, etc.
In all cases, if a lexer, formatter or style cannot be determined, `nil` will
be returned. In this situation you may want to default to the `Fallback`
value in each respective package, which provides sane defaults.
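As a quick, hedged sketch of those registry helpers (the calls below are the same ones used by `cmd/chroma` elsewhere in this repository):
```go
import (
	"fmt"

	"github.com/alecthomas/chroma/formatters"
	"github.com/alecthomas/chroma/lexers"
	"github.com/alecthomas/chroma/styles"
)

func listRegistries() {
	// Enumerate everything that has been registered.
	fmt.Println(lexers.Names(true)) // true also includes aliases
	fmt.Println(styles.Names())
	fmt.Println(formatters.Names())

	// Lookups return nil when nothing matches, so fall back to the defaults.
	lexer := lexers.Get("no-such-language")
	if lexer == nil {
		lexer = lexers.Fallback
	}
	fmt.Println(lexer.Config().Name)
}
```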
<a id="markdown-quick-start" name="quick-start"></a>
### Quick start
A convenience function exists to format source text in a single call:
```go
err := quick.Highlight(os.Stdout, someSourceCode, "go", "html", "monokai")
```
<a id="markdown-identifying-the-language" name="identifying-the-language"></a>
### Identifying the language
To highlight code, you'll first have to identify what language the code is
written in. There are three primary ways to do that:
1. Detect the language from its filename.
```go
lexer := lexers.Match("foo.go")
```
2. Explicitly specify the language by its Chroma syntax ID (a full list is available from `lexers.Names()`).
```go
lexer := lexers.Get("go")
```
3. Detect the language from its content.
```go
lexer := lexers.Analyse("package main\n\nfunc main()\n{\n}\n")
```
In all cases, `nil` will be returned if the language cannot be identified.
```go
if lexer == nil {
lexer = lexers.Fallback
}
```
At this point, it should be noted that some lexers can be extremely chatty. To
mitigate this, you can use the coalescing lexer to coalesce runs of identical
token types into a single token:
```go
lexer = chroma.Coalesce(lexer)
```
<a id="markdown-formatting-the-output" name="formatting-the-output"></a>
### Formatting the output
Once a language is identified, you will need to pick a formatter and a style (theme).
```go
style := styles.Get("swapoff")
if style == nil {
style = styles.Fallback
}
formatter := formatters.Get("html")
if formatter == nil {
formatter = formatters.Fallback
}
```
Then obtain an iterator over the tokens:
```go
contents, err := ioutil.ReadAll(r)
iterator, err := lexer.Tokenise(nil, string(contents))
```
And finally, format the tokens from the iterator:
```go
err := formatter.Format(w, style, iterator)
```
<a id="markdown-the-html-formatter" name="the-html-formatter"></a>
### The HTML formatter
By default, the registered `html` formatter generates standalone HTML with
embedded CSS. More flexibility is available through the `formatters/html` package.
Firstly, the output generated by the formatter can be customised with the
following constructor options:
- `Standalone()` - Generate standalone HTML with embedded CSS.
- `WithClasses()` - Use classes rather than inlined style attributes.
- `ClassPrefix(prefix)` - Prefix each generated CSS class.
- `TabWidth(width)` - Set the rendered tab width, in characters.
- `WithLineNumbers()` - Render line numbers (style with `LineNumbers`).
- `LinkableLineNumbers()` - Make the line numbers linkable.
- `HighlightLines(ranges)` - Highlight lines in these ranges (style with `LineHighlight`).
- `LineNumbersInTable()` - Use a table for formatting line numbers and code, rather than spans.
If `WithClasses(true)` is used, the corresponding CSS can be obtained from the formatter with:
```go
formatter := html.New(html.WithClasses(true))
err := formatter.WriteCSS(w, style)
```
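Several options can be combined in a single formatter. The sketch below follows the argument forms used by `cmd/chroma` in this repository and reuses `w`, `style` and `iterator` from the earlier snippets:
```go
formatter := html.New(
	html.WithLineNumbers(true),
	html.TabWidth(4),
	html.HighlightLines([][2]int{{3, 5}}), // highlight lines 3-5 (style with LineHighlight)
)
err := formatter.Format(w, style, iterator)
```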
<a id="markdown-more-detail" name="more-detail"></a>
## More detail
<a id="markdown-lexers" name="lexers"></a>
### Lexers
See the [Pygments documentation](http://pygments.org/docs/lexerdevelopment/)
for details on implementing lexers. Most concepts apply directly to Chroma,
but see existing lexer implementations for real examples.
In many cases lexers can be automatically converted directly from Pygments by
using the included Python 3 script `pygments2chroma.py`. I use something like
the following:
```sh
python3 ~/Projects/chroma/_tools/pygments2chroma.py \
pygments.lexers.jvm.KotlinLexer \
> ~/Projects/chroma/lexers/kotlin.go \
&& gofmt -s -w ~/Projects/chroma/lexers/*.go
```
See [pygments-lexers.txt](https://github.com/alecthomas/chroma/blob/master/pygments-lexers.txt)
for a list of lexers, and notes on some of the issues importing them.
<a id="markdown-formatters" name="formatters"></a>
### Formatters
Chroma supports HTML output, as well as 8-colour, 256-colour, and true-colour terminal output.
A `noop` formatter is included that outputs the token text only, and a `tokens`
formatter outputs raw tokens. The latter is useful for debugging lexers.
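For example, to debug a lexer you might dump its raw token stream with the `tokens` formatter (a sketch, reusing `style` and `iterator` from the snippets above):
```go
debug := formatters.Get("tokens")
if debug == nil {
	debug = formatters.Fallback
}
err := debug.Format(os.Stdout, style, iterator)
```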
<a id="markdown-styles" name="styles"></a>
### Styles
Chroma styles use the [same syntax](http://pygments.org/docs/styles/) as Pygments.
All Pygments styles have been converted to Chroma using the `_tools/style.py` script.
When you work with one of [Chroma's styles](https://github.com/alecthomas/chroma/tree/master/styles), know that the `chroma.Background` token type provides the default style for tokens. It does so by defining a foreground color and background color.
For example, this gives each token type not explicitly defined in the style a default color of `#f8f8f2` and uses `#000000` for the highlighted code block's background:
~~~go
chroma.Background: "#f8f8f2 bg:#000000",
~~~
Also, token types in a style file are hierarchical. For instance, when `CommentSpecial` is not defined, Chroma uses the token style from `Comment`. So when several comment tokens use the same color, you'll only need to define `Comment` and override the one that has a different color.
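A short illustrative sketch of that hierarchy (the colour values are made up):
```go
example := chroma.MustNewStyle("example", chroma.StyleEntries{
	chroma.Background:     "#f8f8f2 bg:#000000",
	chroma.Comment:        "#75715e",      // also applies to CommentSpecial...
	chroma.CommentSpecial: "bold #75715e", // ...unless it is overridden, as here.
})
```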
For a quick overview of the available styles and how they look, check out the [Chroma Style Gallery](https://xyproto.github.io/splash/docs/).
<a id="markdown-command-line-interface" name="command-line-interface"></a>
## Command-line interface
A command-line interface to Chroma is included. It can be installed with:
```sh
go get -u github.com/alecthomas/chroma/cmd/chroma
```
<a id="markdown-whats-missing-compared-to-pygments" name="whats-missing-compared-to-pygments"></a>
## What's missing compared to Pygments?
- Quite a few lexers, for various reasons (pull-requests welcome):
  - Pygments lexers for complex languages often include custom code to
    handle certain aspects, such as Perl6's ability to nest code inside
    regular expressions. These require time and effort to convert.
  - I mostly only converted languages I had heard of, to reduce the porting cost.
- Some more esoteric features of Pygments are omitted for simplicity.
- Though the Chroma API supports content detection, very few lexers implement it.
  I have plans to implement a statistical analyser at some point, but have not had the time.

@ -0,0 +1,136 @@
package main
import (
"io/ioutil"
"os"
"strings"
"text/template"
"github.com/aymerick/douceur/css"
"github.com/aymerick/douceur/parser"
"gopkg.in/alecthomas/kingpin.v3-unstable"
"github.com/alecthomas/chroma"
)
const (
outputTemplate = `package styles
import (
"github.com/alecthomas/chroma"
)
// {{.Name}} style.
var {{.Name}} = Register(chroma.MustNewStyle("{{.Name|Lower}}", chroma.StyleEntries{
{{- range .Rules}}
{{- if .Prelude|TokenType}}
chroma.{{.Prelude|TokenType}}: "{{.Declarations|TranslateDecls}}",
{{- end}}
{{- end}}
}))
`
)
var (
typeByClass = map[string]chroma.TokenType{
".hll": chroma.Background,
}
cssNamedColours = map[string]string{
"black": "#000000", "silver": "#c0c0c0", "gray": "#808080", "white": "#ffffff",
"maroon": "#800000", "red": "#ff0000", "purple": "#800080", "fuchsia": "#ff00ff",
"green": "#008000", "lime": "#00ff00", "olive": "#808000", "yellow": "#ffff00",
"navy": "#000080", "blue": "#0000ff", "teal": "#008080", "aqua": "#00ffff",
"orange": "#ffa500", "aliceblue": "#f0f8ff", "antiquewhite": "#faebd7", "aquamarine": "#7fffd4",
"azure": "#f0ffff", "beige": "#f5f5dc", "bisque": "#ffe4c4", "blanchedalmond": "#ffebcd",
"blueviolet": "#8a2be2", "brown": "#a52a2a", "burlywood": "#deb887", "cadetblue": "#5f9ea0",
"chartreuse": "#7fff00", "chocolate": "#d2691e", "coral": "#ff7f50", "cornflowerblue": "#6495ed",
"cornsilk": "#fff8dc", "crimson": "#dc143c", "cyan": "#00ffff", "darkblue": "#00008b",
"darkcyan": "#008b8b", "darkgoldenrod": "#b8860b", "darkgray": "#a9a9a9", "darkgreen": "#006400",
"darkgrey": "#a9a9a9", "darkkhaki": "#bdb76b", "darkmagenta": "#8b008b", "darkolivegreen": "#556b2f",
"darkorange": "#ff8c00", "darkorchid": "#9932cc", "darkred": "#8b0000", "darksalmon": "#e9967a",
"darkseagreen": "#8fbc8f", "darkslateblue": "#483d8b", "darkslategray": "#2f4f4f", "darkslategrey": "#2f4f4f",
"darkturquoise": "#00ced1", "darkviolet": "#9400d3", "deeppink": "#ff1493", "deepskyblue": "#00bfff",
"dimgray": "#696969", "dimgrey": "#696969", "dodgerblue": "#1e90ff", "firebrick": "#b22222",
"floralwhite": "#fffaf0", "forestgreen": "#228b22", "gainsboro": "#dcdcdc", "ghostwhite": "#f8f8ff",
"gold": "#ffd700", "goldenrod": "#daa520", "greenyellow": "#adff2f", "grey": "#808080",
"honeydew": "#f0fff0", "hotpink": "#ff69b4", "indianred": "#cd5c5c", "indigo": "#4b0082",
"ivory": "#fffff0", "khaki": "#f0e68c", "lavender": "#e6e6fa", "lavenderblush": "#fff0f5",
"lawngreen": "#7cfc00", "lemonchiffon": "#fffacd", "lightblue": "#add8e6", "lightcoral": "#f08080",
"lightcyan": "#e0ffff", "lightgoldenrodyellow": "#fafad2", "lightgray": "#d3d3d3", "lightgreen": "#90ee90",
"lightgrey": "#d3d3d3", "lightpink": "#ffb6c1", "lightsalmon": "#ffa07a", "lightseagreen": "#20b2aa",
"lightskyblue": "#87cefa", "lightslategray": "#778899", "lightslategrey": "#778899", "lightsteelblue": "#b0c4de",
"lightyellow": "#ffffe0", "limegreen": "#32cd32", "linen": "#faf0e6", "magenta": "#ff00ff",
"mediumaquamarine": "#66cdaa", "mediumblue": "#0000cd", "mediumorchid": "#ba55d3", "mediumpurple": "#9370db",
"mediumseagreen": "#3cb371", "mediumslateblue": "#7b68ee", "mediumspringgreen": "#00fa9a", "mediumturquoise": "#48d1cc",
"mediumvioletred": "#c71585", "midnightblue": "#191970", "mintcream": "#f5fffa", "mistyrose": "#ffe4e1",
"moccasin": "#ffe4b5", "navajowhite": "#ffdead", "oldlace": "#fdf5e6", "olivedrab": "#6b8e23",
"orangered": "#ff4500", "orchid": "#da70d6", "palegoldenrod": "#eee8aa", "palegreen": "#98fb98",
"paleturquoise": "#afeeee", "palevioletred": "#db7093", "papayawhip": "#ffefd5", "peachpuff": "#ffdab9",
"peru": "#cd853f", "pink": "#ffc0cb", "plum": "#dda0dd", "powderblue": "#b0e0e6",
"rosybrown": "#bc8f8f", "royalblue": "#4169e1", "saddlebrown": "#8b4513", "salmon": "#fa8072",
"sandybrown": "#f4a460", "seagreen": "#2e8b57", "seashell": "#fff5ee", "sienna": "#a0522d",
"skyblue": "#87ceeb", "slateblue": "#6a5acd", "slategray": "#708090", "slategrey": "#708090",
"snow": "#fffafa", "springgreen": "#00ff7f", "steelblue": "#4682b4", "tan": "#d2b48c",
"thistle": "#d8bfd8", "tomato": "#ff6347", "turquoise": "#40e0d0", "violet": "#ee82ee",
"wheat": "#f5deb3", "whitesmoke": "#f5f5f5", "yellowgreen": "#9acd32", "rebeccapurple": "#663399",
}
nameArg = kingpin.Arg("name", "Name of output style.").Required().String()
fileArg = kingpin.Arg("stylesheets", ".css file to import").Required().ExistingFile()
)
func init() {
for tt, str := range chroma.StandardTypes {
typeByClass["."+str] = tt
}
}
func translateDecls(decls []*css.Declaration) string {
out := []string{}
for _, decl := range decls {
switch decl.Property {
case "color":
clr := decl.Value
if c, ok := cssNamedColours[clr]; ok {
clr = c
}
out = append(out, clr)
case "background-color":
out = append(out, "bg:"+decl.Value)
case "font-style":
if strings.Contains(decl.Value, "italic") {
out = append(out, "italic")
}
case "font-weight":
if strings.Contains(decl.Value, "bold") {
out = append(out, "bold")
}
case "text-decoration":
if strings.Contains(decl.Value, "underline") {
out = append(out, "underline")
}
}
}
return strings.Join(out, " ")
}
func main() {
kingpin.Parse()
source, err := ioutil.ReadFile(*fileArg)
kingpin.FatalIfError(err, "")
css, err := parser.Parse(string(source))
kingpin.FatalIfError(err, "")
context := map[string]interface{}{
"Name": *nameArg,
"Rules": css.Rules,
}
tmpl := template.Must(template.New("style").Funcs(template.FuncMap{
"Lower": strings.ToLower,
"TranslateDecls": translateDecls,
"TokenType": func(s string) chroma.TokenType { return typeByClass[s] },
}).Parse(outputTemplate))
err = tmpl.Execute(os.Stdout, context)
kingpin.FatalIfError(err, "")
}

@ -0,0 +1,38 @@
package main
import (
"fmt"
"io/ioutil"
"os"
"github.com/alecthomas/chroma/formatters"
"github.com/alecthomas/chroma/lexers"
"github.com/alecthomas/chroma/styles"
"gopkg.in/alecthomas/kingpin.v3-unstable"
)
var (
filesArgs = kingpin.Arg("file", "Files to use to exercise lexers.").Required().ExistingFiles()
)
func main() {
	kingpin.CommandLine.Help = "Exercise lexers against a list of files."
kingpin.Parse()
for _, file := range *filesArgs {
lexer := lexers.Match(file)
if lexer == nil {
fmt.Printf("warning: could not find lexer for %q\n", file)
continue
}
fmt.Printf("%s: ", file)
os.Stdout.Sync()
text, err := ioutil.ReadFile(file)
kingpin.FatalIfError(err, "")
it, err := lexer.Tokenise(nil, string(text))
kingpin.FatalIfError(err, "%s failed to tokenise %q", lexer.Config().Name, file)
err = formatters.NoOp.Format(ioutil.Discard, styles.SwapOff, it)
kingpin.FatalIfError(err, "%s failed to format %q", lexer.Config().Name, file)
fmt.Printf("ok\n")
}
}

@ -0,0 +1,29 @@
fs default() {
gofmt fs { chromaLexer "pygments.lexers.hlb.HlbLexer"; }
}
fs script() {
local "." with option {
includePatterns "pygments2chroma.py"
}
}
fs runChromaPython(string package) {
image "python:alpine" with option { resolve; }
run "apk add -U git"
run "pip install -U pystache"
run "pip install -U -e git+https://github.com/hinshun/pygments.git@hlb-lexer#egg=pygments"
run string { format "python pygments2chroma.py %s > /out/lexer.go" package; } with option {
dir "/chroma"
mount script "/chroma"
mount fs { scratch; } "/out" as chromaLexer
}
}
fs runGoFormat(fs goFiles) {
image "golang:alpine" with option { resolve; }
run "gofmt -s -w /gofmt/*.go" with option {
dir "/gofmt"
mount goFiles "/gofmt" as gofmt
}
}

@ -0,0 +1,197 @@
import functools
import importlib
import json
import os
import re
import sys
import types
import pystache
from pygments import lexer as pygments_lexer
from pygments.token import _TokenType
TEMPLATE = r'''
package lexers
import (
. "github.com/alecthomas/chroma" // nolint
"github.com/alecthomas/chroma/lexers/internal"
)
// {{upper_name}} lexer.
var {{upper_name}} = internal.Register(MustNewLexer(
&Config{
Name: "{{name}}",
Aliases: []string{ {{#aliases}}"{{.}}", {{/aliases}} },
Filenames: []string{ {{#filenames}}"{{.}}", {{/filenames}} },
MimeTypes: []string{ {{#mimetypes}}"{{.}}", {{/mimetypes}} },
{{#re_not_multiline}}
NotMultiline: true,
{{/re_not_multiline}}
{{#re_dotall}}
DotAll: true,
{{/re_dotall}}
{{#re_ignorecase}}
CaseInsensitive: true,
{{/re_ignorecase}}
},
Rules{
{{#tokens}}
"{{state}}": {
{{#rules}}
{{{.}}},
{{/rules}}
},
{{/tokens}}
},
))
'''
def go_regex(s):
return go_string(s)
def go_string(s):
if '`' not in s:
return '`' + s + '`'
return json.dumps(s)
def to_camel_case(snake_str):
components = snake_str.split('_')
return ''.join(x.title() for x in components)
def warning(message):
print('warning: ' + message, file=sys.stderr)
def resolve_emitter(emitter):
if isinstance(emitter, types.FunctionType):
if repr(emitter).startswith('<function bygroups.'):
args = emitter.__closure__[0].cell_contents
emitter = 'ByGroups(%s)' % ', '.join(resolve_emitter(e) for e in args)
elif repr(emitter).startswith('<function using.'):
args = emitter.__closure__[0].cell_contents
if isinstance(args, dict):
state = 'root'
if 'stack' in args:
state = args['stack'][1]
args.pop('stack')
assert args == {}, args
emitter = 'UsingSelf("%s")' % state
elif issubclass(args, pygments_lexer.Lexer):
name = args.__name__
if name.endswith('Lexer'):
name = name[:-5]
emitter = 'Using(%s)' % name
else:
raise ValueError('only support "using" with lexer classes, not %r' % args)
else:
warning('unsupported emitter function %r' % emitter)
emitter = '?? %r ??' % emitter
elif isinstance(emitter, _TokenType):
emitter = str(emitter).replace('.', '')[5:]
elif emitter is None:
# This generally only occurs when a lookahead/behind assertion is used, so we just allow it
# through.
return 'None'
else:
raise ValueError('unsupported emitter type %r' % emitter)
assert isinstance(emitter, str)
return emitter
def process_state_action(action):
if isinstance(action, tuple):
return functools.reduce(lambda a, b: a + b, (process_state_action(a) for a in action))
if action.startswith('#'):
action = action[1:]
if action== 'pop':
action = 'Pop(1)'
elif action.startswith('pop:'):
action = 'Pop(%s)' % action[4:]
elif action == 'push':
action = 'Push()'
elif action.startswith('push:'):
action = 'Push("%s")' % action[5:]
else:
raise ValueError('unsupported action %r' % (action,))
else:
action = 'Push("%s")' % action
return (action,)
def translate_rules(rules):
out = []
for rule in rules:
if isinstance(rule, tuple):
regex = rule[0]
if isinstance(regex, str):
regex = go_regex(regex)
elif isinstance(regex, pygments_lexer.words):
regex = 'Words(%s, %s, %s)' % (go_string(regex.prefix),
go_string(regex.suffix),
', '.join(go_string(w) for w in regex.words))
else:
raise ValueError('expected regex string but got %r' % regex)
emitter = resolve_emitter(rule[1])
if len(rule) == 2:
modifier = 'nil'
elif type(rule[2]) is str:
modifier = process_state_action(rule[2])[0]
elif isinstance(rule[2], pygments_lexer.combined):
modifier = 'Combined("%s")' % '", "'.join(rule[2])
elif type(rule[2]) is tuple:
modifier = 'Push("%s")' % '", "'.join(rule[2])
else:
raise ValueError('unsupported modifier %r' % (rule[2],))
out.append('{{ {}, {}, {} }}'.format(regex, emitter, modifier))
elif isinstance(rule, pygments_lexer.include):
out.append('Include("{}")'.format(rule))
elif isinstance(rule, pygments_lexer.default):
out.append('Default({})'.format(', '.join(process_state_action(rule.state))))
else:
raise ValueError('unsupported rule %r' % (rule,))
return out
class TemplateView(object):
def __init__(self, **kwargs):
for key, value in kwargs.items():
setattr(self, key, value)
def re_not_multiline(self):
return not (self.regex_flags & re.MULTILINE)
def re_dotall(self):
return self.regex_flags & re.DOTALL
def re_ignorecase(self):
return self.regex_flags & re.IGNORECASE
def main():
package_name, symbol_name = sys.argv[1].rsplit(sep=".", maxsplit=1)
package = importlib.import_module(package_name)
lexer_cls = getattr(package, symbol_name)
assert issubclass(lexer_cls, pygments_lexer.RegexLexer), 'can only translate from RegexLexer'
print(pystache.render(TEMPLATE, TemplateView(
name=lexer_cls.name,
regex_flags=lexer_cls.flags,
upper_name=to_camel_case(lexer_cls.name),
aliases=lexer_cls.aliases,
filenames=lexer_cls.filenames,
mimetypes=lexer_cls.mimetypes,
tokens=[{'state': state, 'rules': translate_rules(rules)} for (state, rules) in lexer_cls.get_tokendefs().items()],
)))
if __name__ == '__main__':
main()

@ -0,0 +1,62 @@
import importlib
import sys
import pystache
from pygments.style import Style
from pygments.token import Token
TEMPLATE = r'''
package styles
import (
"github.com/alecthomas/chroma"
)
// {{upper_name}} style.
var {{upper_name}} = Register(chroma.MustNewStyle("{{name}}", chroma.StyleEntries{
{{#styles}}
chroma.{{type}}: "{{style}}",
{{/styles}}
}))
'''
def to_camel_case(snake_str):
components = snake_str.split('_')
return ''.join(x.title() for x in components)
def translate_token_type(t):
if t == Token:
t = Token.Background
return "".join(map(str, t))
def main():
name = sys.argv[1]
package_name, symbol_name = sys.argv[2].rsplit(sep=".", maxsplit=1)
package = importlib.import_module(package_name)
style_cls = getattr(package, symbol_name)
assert issubclass(style_cls, Style), 'can only translate from Style subclass'
styles = dict(style_cls.styles)
bg = "bg:" + style_cls.background_color
if Token in styles:
styles[Token] += " " + bg
else:
styles[Token] = bg
context = {
'upper_name': style_cls.__name__[:-5],
'name': name,
'styles': [{'type': translate_token_type(t), 'style': s}
for t, s in styles.items() if s],
}
print(pystache.render(TEMPLATE, context))
if __name__ == '__main__':
main()

@ -0,0 +1,284 @@
package main
import (
"bufio"
"fmt"
"io"
"io/ioutil"
"os"
"os/signal"
"runtime"
"runtime/pprof"
"sort"
"strconv"
"strings"
"github.com/alecthomas/kong"
"github.com/mattn/go-colorable"
"github.com/mattn/go-isatty"
"github.com/alecthomas/chroma"
"github.com/alecthomas/chroma/formatters"
"github.com/alecthomas/chroma/formatters/html"
"github.com/alecthomas/chroma/lexers"
"github.com/alecthomas/chroma/styles"
)
var (
// Populated by goreleaser.
version = "?"
commit = "?"
date = "?"
description = `
Chroma is a general purpose syntax highlighting library and corresponding
command, for Go.
`
cli struct {
Version kong.VersionFlag `help:"Show version."`
Profile string `hidden:"" help:"Enable profiling to file."`
List bool `help:"List lexers, styles and formatters."`
Unbuffered bool `help:"Do not buffer output."`
Trace bool `help:"Trace lexer states as they are traversed."`
Check bool `help:"Do not format, check for tokenisation errors instead."`
Filename string `help:"Filename to use for selecting a lexer when reading from stdin."`
Lexer string `help:"Lexer to use when formatting." default:"autodetect" short:"l" enum:"${lexers}"`
Style string `help:"Style to use for formatting." default:"swapoff" short:"s" enum:"${styles}"`
Formatter string `help:"Formatter to use." default:"terminal" short:"f" enum:"${formatters}"`
JSON bool `help:"Output JSON representation of tokens."`
HTML bool `help:"Enable HTML mode (equivalent to '--formatter html')."`
HTMLPrefix string `help:"HTML CSS class prefix." placeholder:"PREFIX"`
HTMLStyles bool `help:"Output HTML CSS styles."`
HTMLAllStyles bool `help:"Output all HTML CSS styles, including redundant ones."`
HTMLOnly bool `help:"Output HTML fragment."`
HTMLInlineStyles bool `help:"Output HTML with inline styles (no classes)."`
HTMLTabWidth int `help:"Set the HTML tab width." default:"8"`
HTMLLines bool `help:"Include line numbers in output."`
HTMLLinesTable bool `help:"Split line numbers and code in a HTML table"`
HTMLLinesStyle string `help:"Style for line numbers."`
HTMLHighlight string `help:"Highlight these lines." placeholder:"N[:M][,...]"`
HTMLHighlightStyle string `help:"Style used for highlighting lines."`
HTMLBaseLine int `help:"Base line number." default:"1"`
HTMLPreventSurroundingPre bool `help:"Prevent the surrounding pre tag."`
SVG bool `help:"Output SVG representation of tokens."`
Files []string `arg:"" optional:"" help:"Files to highlight." type:"existingfile"`
}
)
type flushableWriter interface {
io.Writer
Flush() error
}
type nopFlushableWriter struct{ io.Writer }
func (n *nopFlushableWriter) Flush() error { return nil }
func main() {
ctx := kong.Parse(&cli, kong.Description(description), kong.Vars{
"version": fmt.Sprintf("%s-%s-%s", version, commit, date),
"lexers": "autodetect," + strings.Join(lexers.Names(true), ","),
"styles": strings.Join(styles.Names(), ","),
"formatters": strings.Join(formatters.Names(), ","),
})
if cli.List {
listAll()
return
}
if cli.Profile != "" {
f, err := os.Create(cli.Profile)
ctx.FatalIfErrorf(err)
err = pprof.StartCPUProfile(f)
ctx.FatalIfErrorf(err)
signals := make(chan os.Signal, 1)
signal.Notify(signals, os.Interrupt)
go func() {
<-signals
pprof.StopCPUProfile()
os.Exit(128 + 3)
}()
defer pprof.StopCPUProfile()
}
var out io.Writer = os.Stdout
if runtime.GOOS == "windows" && isatty.IsTerminal(os.Stdout.Fd()) {
out = colorable.NewColorableStdout()
}
var w flushableWriter
if cli.Unbuffered {
w = &nopFlushableWriter{out}
} else {
w = bufio.NewWriterSize(out, 16384)
}
defer w.Flush() // nolint: errcheck
switch {
case cli.JSON:
cli.Formatter = "json"
case cli.HTML:
cli.Formatter = "html"
case cli.SVG:
cli.Formatter = "svg"
}
// Retrieve user-specified style, clone it, and add some overrides.
builder := styles.Get(cli.Style).Builder()
if cli.HTMLHighlightStyle != "" {
builder.Add(chroma.LineHighlight, cli.HTMLHighlightStyle)
}
if cli.HTMLLinesStyle != "" {
builder.Add(chroma.LineNumbers, cli.HTMLLinesStyle)
}
style, err := builder.Build()
ctx.FatalIfErrorf(err)
// Dump styles.
if cli.HTMLStyles {
options := []html.Option{html.WithClasses(true)}
if cli.HTMLAllStyles {
options = append(options, html.WithAllClasses(true))
}
formatter := html.New(options...)
err = formatter.WriteCSS(w, style)
ctx.FatalIfErrorf(err)
return
}
if cli.Formatter == "html" {
configureHTMLFormatter(ctx)
}
if len(cli.Files) == 0 {
contents, err := ioutil.ReadAll(os.Stdin)
ctx.FatalIfErrorf(err)
format(ctx, w, style, lex(ctx, cli.Filename, string(contents)))
} else {
for _, filename := range cli.Files {
contents, err := ioutil.ReadFile(filename)
ctx.FatalIfErrorf(err)
if cli.Check {
check(filename, lex(ctx, filename, string(contents)))
} else {
format(ctx, w, style, lex(ctx, filename, string(contents)))
}
}
}
}
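// configureHTMLFormatter builds an HTML formatter from the --html-* flags
// (including the parsed N[:M] highlight ranges) and registers it as "html".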
func configureHTMLFormatter(ctx *kong.Context) {
options := []html.Option{
html.TabWidth(cli.HTMLTabWidth),
html.BaseLineNumber(cli.HTMLBaseLine),
html.ClassPrefix(cli.HTMLPrefix),
html.WithAllClasses(cli.HTMLAllStyles),
html.WithClasses(!cli.HTMLInlineStyles),
html.Standalone(!cli.HTMLOnly),
html.WithLineNumbers(cli.HTMLLines),
html.LineNumbersInTable(cli.HTMLLinesTable),
html.PreventSurroundingPre(cli.HTMLPreventSurroundingPre),
}
if len(cli.HTMLHighlight) > 0 {
ranges := [][2]int{}
for _, span := range strings.Split(cli.HTMLHighlight, ",") {
parts := strings.Split(span, ":")
if len(parts) > 2 {
ctx.Fatalf("range should be N[:M], not %q", span)
}
start, err := strconv.ParseInt(parts[0], 10, 64)
ctx.FatalIfErrorf(err, "min value of range should be integer not %q", parts[0])
end := start
if len(parts) == 2 {
end, err = strconv.ParseInt(parts[1], 10, 64)
ctx.FatalIfErrorf(err, "max value of range should be integer not %q", parts[1])
}
ranges = append(ranges, [2]int{int(start), int(end)})
}
options = append(options, html.HighlightLines(ranges))
}
formatters.Register("html", html.New(options...))
}
func listAll() {
fmt.Println("lexers:")
sort.Sort(lexers.Registry.Lexers)
for _, l := range lexers.Registry.Lexers {
config := l.Config()
fmt.Printf(" %s\n", config.Name)
filenames := []string{}
filenames = append(filenames, config.Filenames...)
filenames = append(filenames, config.AliasFilenames...)
if len(config.Aliases) > 0 {
fmt.Printf(" aliases: %s\n", strings.Join(config.Aliases, " "))
}
if len(filenames) > 0 {
fmt.Printf(" filenames: %s\n", strings.Join(filenames, " "))
}
if len(config.MimeTypes) > 0 {
fmt.Printf(" mimetypes: %s\n", strings.Join(config.MimeTypes, " "))
}
}
fmt.Println()
fmt.Printf("styles:")
for _, name := range styles.Names() {
fmt.Printf(" %s", name)
}
fmt.Println()
fmt.Printf("formatters:")
for _, name := range formatters.Names() {
fmt.Printf(" %s", name)
}
fmt.Println()
}
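// lex selects a lexer for the given path and contents, enables tracing if
// requested, wraps it in Coalesce, and returns the resulting token iterator.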
func lex(ctx *kong.Context, path string, contents string) chroma.Iterator {
lexer := selexer(path, contents)
if lexer == nil {
lexer = lexers.Fallback
}
if rel, ok := lexer.(*chroma.RegexLexer); ok {
rel.Trace(cli.Trace)
}
lexer = chroma.Coalesce(lexer)
it, err := lexer.Tokenise(nil, contents)
ctx.FatalIfErrorf(err)
return it
}
func selexer(path, contents string) (lexer chroma.Lexer) {
if cli.Lexer != "autodetect" {
return lexers.Get(cli.Lexer)
}
if path != "" {
lexer := lexers.Match(path)
if lexer != nil {
return lexer
}
}
return lexers.Analyse(contents)
}
func format(ctx *kong.Context, w io.Writer, style *chroma.Style, it chroma.Iterator) {
formatter := formatters.Get(cli.Formatter)
err := formatter.Format(w, style, it)
ctx.FatalIfErrorf(err)
}
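// check reports any Error tokens in the stream, along with their line and column.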
func check(filename string, it chroma.Iterator) {
line, col := 1, 0
for token := it(); token != chroma.EOF; token = it() {
if token.Type == chroma.Error {
fmt.Printf("%s:%d:%d %q\n", filename, line, col, token.String())
}
for _, c := range token.String() {
col++
if c == '\n' {
line, col = line+1, 0
}
}
}
}

@ -0,0 +1,15 @@
module github.com/alecthomas/chroma/cmd/chromad
go 1.13
require (
github.com/GeertJohan/go.rice v1.0.1-0.20191102153406-d954009f7238
github.com/alecthomas/chroma v0.7.0
github.com/alecthomas/kong v0.2.1
github.com/alecthomas/kong-hcl v0.2.0
github.com/gorilla/csrf v1.6.2
github.com/gorilla/handlers v1.4.2
github.com/gorilla/mux v1.7.3
)
replace github.com/alecthomas/chroma => ../../

@ -0,0 +1,62 @@
github.com/GeertJohan/go.incremental v1.0.0 h1:7AH+pY1XUgQE4Y1HcXYaMqAI0m9yrFqo/jt0CW30vsg=
github.com/GeertJohan/go.incremental v1.0.0/go.mod h1:6fAjUhbVuX1KcMD3c8TEgVUqmo4seqhv0i0kdATSkM0=
github.com/GeertJohan/go.rice v1.0.1-0.20191102153406-d954009f7238 h1:e8qSWlZ2UCPm8BMGyhIjG7QKPQLvmvsFUt9XOc+Dba8=
github.com/GeertJohan/go.rice v1.0.1-0.20191102153406-d954009f7238/go.mod h1:af5vUNlDNkCjOZeSGFgIJxDje9qdjsO6hshx0gTmZt4=
github.com/akavel/rsrc v0.8.0 h1:zjWn7ukO9Kc5Q62DOJCcxGpXC18RawVtYAGdz2aLlfw=
github.com/akavel/rsrc v0.8.0/go.mod h1:uLoCtb9J+EyAqh+26kdrTgmzRBFPGOolLWKpdxkKq+c=
github.com/alecthomas/assert v0.0.0-20170929043011-405dbfeb8e38 h1:smF2tmSOzy2Mm+0dGI2AIUHY+w0BUc+4tn40djz7+6U=
github.com/alecthomas/assert v0.0.0-20170929043011-405dbfeb8e38/go.mod h1:r7bzyVFMNntcxPZXK3/+KdruV1H5KSlyVY0gc+NgInI=
github.com/alecthomas/colour v0.0.0-20160524082231-60882d9e2721 h1:JHZL0hZKJ1VENNfmXvHbgYlbUOvpzYzvy2aZU5gXVeo=
github.com/alecthomas/colour v0.0.0-20160524082231-60882d9e2721/go.mod h1:QO9JBoKquHd+jz9nshCh40fOfO+JzsoXy8qTHF68zU0=
github.com/alecthomas/kong v0.2.1-0.20190708041108-0548c6b1afae/go.mod h1:+inYUSluD+p4L8KdviBSgzcqEjUQOfC5fQDRFuc36lI=
github.com/alecthomas/kong v0.2.1-0.20190721020729-f7d3d9bfb5ed/go.mod h1:+inYUSluD+p4L8KdviBSgzcqEjUQOfC5fQDRFuc36lI=
github.com/alecthomas/kong v0.2.1 h1:V1tLBhyQBC4rsbXbcOvm3GBaytJSwRNX69fp1WJxbqQ=
github.com/alecthomas/kong v0.2.1/go.mod h1:+inYUSluD+p4L8KdviBSgzcqEjUQOfC5fQDRFuc36lI=
github.com/alecthomas/kong-hcl v0.2.0 h1:l1+pkGJm2BtRJF9dCq9hw6KZWEanteY4Ar9suW3Qm0g=
github.com/alecthomas/kong-hcl v0.2.0/go.mod h1:S5D46RHGG8Ubdxk8TuXBT9wndShsA8/JYSxxiI9y01Y=
github.com/alecthomas/repr v0.0.0-20180818092828-117648cd9897 h1:p9Sln00KOTlrYkxI1zYWl1QLnEqAqEARBEYa8FQnQcY=
github.com/alecthomas/repr v0.0.0-20180818092828-117648cd9897/go.mod h1:xTS7Pm1pD1mvyM075QCDSRqH6qRLXylzS24ZTpRiSzQ=
github.com/daaku/go.zipexe v1.0.0 h1:VSOgZtH418pH9L16hC/JrgSNJbbAL26pj7lmD1+CGdY=
github.com/daaku/go.zipexe v1.0.0/go.mod h1:z8IiR6TsVLEYKwXAoE/I+8ys/sDkgTzSL0CLnGVd57E=
github.com/danwakefield/fnmatch v0.0.0-20160403171240-cbb64ac3d964 h1:y5HC9v93H5EPKqaS1UYVg1uYah5Xf51mBfIoWehClUQ=
github.com/danwakefield/fnmatch v0.0.0-20160403171240-cbb64ac3d964/go.mod h1:Xd9hchkHSWYkEqJwUGisez3G1QY8Ryz0sdWrLPMGjLk=
github.com/davecgh/go-spew v1.1.0/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
github.com/davecgh/go-spew v1.1.1 h1:vj9j/u1bqnvCEfJOwUhtlOARqs3+rkHYY13jYWTU97c=
github.com/davecgh/go-spew v1.1.1/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
github.com/dlclark/regexp2 v1.1.6 h1:CqB4MjHw0MFCDj+PHHjiESmHX+N7t0tJzKvC6M97BRg=
github.com/dlclark/regexp2 v1.1.6/go.mod h1:2pZnwuY/m+8K6iRw6wQdMtk+rH5tNGR1i55kozfMjCc=
github.com/gorilla/csrf v1.6.2 h1:QqQ/OWwuFp4jMKgBFAzJVW3FMULdyUW7JoM4pEWuqKg=
github.com/gorilla/csrf v1.6.2/go.mod h1:7tSf8kmjNYr7IWDCYhd3U8Ck34iQ/Yw5CJu7bAkHEGI=
github.com/gorilla/handlers v1.4.2 h1:0QniY0USkHQ1RGCLfKxeNHK9bkDHGRYGNDFBCS+YARg=
github.com/gorilla/handlers v1.4.2/go.mod h1:Qkdc/uu4tH4g6mTK6auzZ766c4CA0Ng8+o/OAirnOIQ=
github.com/gorilla/mux v1.7.3 h1:gnP5JzjVOuiZD07fKKToCAOjS0yOpj/qPETTXCCS6hw=
github.com/gorilla/mux v1.7.3/go.mod h1:1lud6UwP+6orDFRuTfBEV8e9/aOM/c4fVVCaMa2zaAs=
github.com/gorilla/securecookie v1.1.1 h1:miw7JPhV+b/lAHSXz4qd/nN9jRiAFV5FwjeKyCS8BvQ=
github.com/gorilla/securecookie v1.1.1/go.mod h1:ra0sb63/xPlUeL+yeDciTfxMRAA+MP+HVt/4epWDjd4=
github.com/hashicorp/hcl v1.0.0 h1:0Anlzjpi4vEasTeNFn2mLJgTSwt0+6sfsiTG8qcWGx4=
github.com/hashicorp/hcl v1.0.0/go.mod h1:E5yfLk+7swimpb2L/Alb/PJmXilQ/rhwaUYs4T20WEQ=
github.com/jessevdk/go-flags v1.4.0 h1:4IU2WS7AumrZ/40jfhf4QVDMsQwqA7VEHozFRrGARJA=
github.com/jessevdk/go-flags v1.4.0/go.mod h1:4FA24M0QyGHXBuZZK/XkWh8h0e1EYbRYJSGM75WSRxI=
github.com/mattn/go-colorable v0.0.9/go.mod h1:9vuHe8Xs5qXnSaW/c/ABM9alt+Vo+STaOChaDxuIBZU=
github.com/mattn/go-isatty v0.0.4 h1:bnP0vzxcAdeI1zdubAl5PjU6zsERjGZb7raWodagDYs=
github.com/mattn/go-isatty v0.0.4/go.mod h1:M+lRXTBqGeGNdLjl/ufCoiOlB5xdOkqRJdNxMWT7Zi4=
github.com/mitchellh/mapstructure v1.1.2/go.mod h1:FVVH3fgwuzCH5S8UJGiWEs2h04kUh9fWfEaFds41c1Y=
github.com/nkovacs/streamquote v1.0.0 h1:PmVIV08Zlx2lZK5fFZlMZ04eHcDTIFJCv/5/0twVUow=
github.com/nkovacs/streamquote v1.0.0/go.mod h1:BN+NaZ2CmdKqUuTUXUEm9j95B2TRbpOWpxbJYzzgUsc=
github.com/pkg/errors v0.8.0/go.mod h1:bwawxfHBFNV+L2hUp1rHADufV3IMtnDRdf1r5NINEl0=
github.com/pkg/errors v0.8.1 h1:iURUrRGxPUNPdy5/HRSm+Yj6okJ6UtLINN0Q9M4+h3I=
github.com/pkg/errors v0.8.1/go.mod h1:bwawxfHBFNV+L2hUp1rHADufV3IMtnDRdf1r5NINEl0=
github.com/pmezard/go-difflib v1.0.0 h1:4DBwDE0NGyQoBHbLQYPwSUPoCMWR5BEzIk/f1lZbAQM=
github.com/pmezard/go-difflib v1.0.0/go.mod h1:iKH77koFhYxTK1pcRnkKkqfTogsbg7gZNVY4sRDYZ/4=
github.com/sergi/go-diff v1.0.0 h1:Kpca3qRNrduNnOQeazBd0ysaKrUJiIuISHxogkT9RPQ=
github.com/sergi/go-diff v1.0.0/go.mod h1:0CfEIISq7TuYL3j771MWULgwwjU+GofnZX9QAmXWZgo=
github.com/stretchr/objx v0.1.0/go.mod h1:HFkY916IF+rwdDfMAkV7OtwuqBVzrE8GR6GFx+wExME=
github.com/stretchr/testify v1.2.2/go.mod h1:a8OnRcib4nhh0OaRAV+Yts87kKdq0PP7pXfy6kDkUVs=
github.com/stretchr/testify v1.3.0 h1:TivCn/peBQ7UY8ooIcPgZFpTNSz0Q2U6UrFlUfqbe0Q=
github.com/stretchr/testify v1.3.0/go.mod h1:M5WIy9Dh21IEIfnGCwXGc5bZfKNJtfHm1UVUgZn+9EI=
github.com/valyala/bytebufferpool v1.0.0 h1:GqA5TC/0021Y/b9FG4Oi9Mr3q7XYx6KllzawFIhcdPw=
github.com/valyala/bytebufferpool v1.0.0/go.mod h1:6bBcMArwyJ5K/AmCkWv1jt77kVWyCJ6HpOuEn7z0Csc=
github.com/valyala/fasttemplate v1.0.1 h1:tY9CJiPnMXf1ERmG2EyK7gNUd+c6RKGD0IfU8WdUSz8=
github.com/valyala/fasttemplate v1.0.1/go.mod h1:UQGH1tvbgY+Nz5t2n7tXsz52dQxojPUpymEIMZ47gx8=
golang.org/x/sys v0.0.0-20181128092732-4ed8d59d0b35 h1:YAFjXN64LMvktoUZH9zgY4lGc/msGN7HQfoSuKCgaDU=
golang.org/x/sys v0.0.0-20181128092732-4ed8d59d0b35/go.mod h1:STP8DvDyc/dI5b8T5hshtkjS+E42TnysNCUPdjciGhY=

@ -0,0 +1,172 @@
package main
import (
"encoding/json"
"fmt"
"html/template"
"log"
"net/http"
"sort"
"strings"
rice "github.com/GeertJohan/go.rice"
"github.com/alecthomas/kong"
konghcl "github.com/alecthomas/kong-hcl"
"github.com/gorilla/csrf"
"github.com/gorilla/handlers"
"github.com/gorilla/mux"
"github.com/alecthomas/chroma"
"github.com/alecthomas/chroma/formatters/html"
"github.com/alecthomas/chroma/lexers"
"github.com/alecthomas/chroma/styles"
)
var (
templateFiles = rice.MustFindBox("templates")
staticFiles = rice.MustFindBox("static")
htmlTemplate = template.Must(template.New("html").Parse(templateFiles.MustString("index.html.tmpl")))
)
type context struct {
Background template.CSS
SelectedLanguage string
Languages []string
SelectedStyle string
Styles []string
CSRFField template.HTML
Version string
}
func index(w http.ResponseWriter, r *http.Request) {
ctx := newContext(r)
err := htmlTemplate.Execute(w, &ctx)
if err != nil {
panic(err)
}
}
type renderRequest struct {
Language string `json:"language"`
Style string `json:"style"`
Text string `json:"text"`
Classes bool `json:"classes"`
}
type renderResponse struct {
Error string `json:"error,omitempty"`
HTML string `json:"html,omitempty"`
Language string `json:"language,omitempty"`
Background string `json:"background,omitempty"`
}
func renderHandler(w http.ResponseWriter, r *http.Request) {
req := &renderRequest{}
err := json.NewDecoder(r.Body).Decode(&req)
var rep *renderResponse
if err != nil {
rep = &renderResponse{Error: err.Error()}
} else {
rep, err = render(req)
if err != nil {
rep = &renderResponse{Error: err.Error()}
}
}
w.Header().Set("Content-Type", "application/json")
_ = json.NewEncoder(w).Encode(rep)
}
func render(req *renderRequest) (*renderResponse, error) {
language := lexers.Get(req.Language)
if language == nil {
language = lexers.Analyse(req.Text)
if language != nil {
req.Language = language.Config().Name
}
}
if language == nil {
language = lexers.Fallback
}
tokens, err := chroma.Coalesce(language).Tokenise(nil, req.Text)
if err != nil {
return nil, err
}
style := styles.Get(req.Style)
if style == nil {
style = styles.Fallback
}
buf := &strings.Builder{}
options := []html.Option{}
if req.Classes {
options = append(options, html.WithClasses(true), html.Standalone(true))
}
formatter := html.New(options...)
err = formatter.Format(buf, style, tokens)
if err != nil {
return nil, err
}
lang := language.Config().Name
if language == lexers.Fallback {
lang = ""
}
return &renderResponse{
Language: lang,
HTML: buf.String(),
Background: html.StyleEntryToCSS(style.Get(chroma.Background)),
}, nil
}
func newContext(r *http.Request) context {
ctx := context{
SelectedStyle: "monokailight",
CSRFField: csrf.TemplateField(r),
Version: fmt.Sprintf("%d", staticFiles.Time().Unix()),
}
style := styles.Get(ctx.SelectedStyle)
if style == nil {
style = styles.Fallback
}
ctx.Background = template.CSS(html.StyleEntryToCSS(style.Get(chroma.Background)))
if ctx.SelectedStyle == "" {
ctx.SelectedStyle = "monokailight"
}
for _, lexer := range lexers.Registry.Lexers {
ctx.Languages = append(ctx.Languages, lexer.Config().Name)
}
sort.Strings(ctx.Languages)
for _, style := range styles.Registry {
ctx.Styles = append(ctx.Styles, style.Name)
}
sort.Strings(ctx.Styles)
return ctx
}
func main() {
var cli struct {
Config kong.ConfigFlag `help:"Load configuration." placeholder:"FILE"`
Bind string `help:"HTTP bind address." default:"127.0.0.1:8080"`
CSRFKey string `help:"CSRF key." default:""`
}
ctx := kong.Parse(&cli, kong.Configuration(konghcl.Loader))
log.Printf("Starting on http://%s\n", cli.Bind)
router := mux.NewRouter()
router.Handle("/", http.HandlerFunc(index)).Methods("GET")
router.Handle("/api/render", http.HandlerFunc(renderHandler)).Methods("POST")
router.Handle("/static/{file:.*}", http.StripPrefix("/static/", http.FileServer(staticFiles.HTTPBox()))).Methods("GET")
options := []csrf.Option{}
if cli.CSRFKey == "" {
options = append(options, csrf.Secure(false))
}
root := handlers.CORS()(csrf.Protect([]byte(cli.CSRFKey), options...)(router))
err := http.ListenAndServe(cli.Bind, root)
ctx.FatalIfErrorf(err)
}

@ -0,0 +1,88 @@
document.addEventListener("DOMContentLoaded", function () {
var style = document.createElement('style');
var ref = document.querySelector('script');
ref.parentNode.insertBefore(style, ref);
var form = document.getElementById('chroma');
var textArea = form.elements["text"];
var styleSelect = form.elements["style"];
var languageSelect = form.elements["language"];
var csrfToken = form.elements["gorilla.csrf.Token"].value;
var output = document.getElementById("output");
var htmlCheckbox = document.getElementById("html");
(document.querySelectorAll('.notification .delete') || []).forEach(($delete) => {
$notification = $delete.parentNode;
$delete.addEventListener('click', () => {
$notification.parentNode.removeChild($notification);
});
});
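    // debounce returns a wrapper that defers calls to func until `wait` ms have
    // passed since the last call; if `immediate` is set, func fires on the leading
    // edge instead and is suppressed until the wait elapses.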
function debounce(func, wait, immediate) {
var timeout;
return function () {
var context = this, args = arguments;
var later = function () {
timeout = null;
if (!immediate) func.apply(context, args);
};
var callNow = immediate && !timeout;
clearTimeout(timeout);
timeout = setTimeout(later, wait);
if (callNow) func.apply(context, args);
};
}
function getFormJSON() {
return {
"language": languageSelect.value,
"style": styleSelect.value,
"text": textArea.value,
"classes": htmlCheckbox.checked,
}
}
function update(event) {
fetch("api/render", {
method: 'POST',
mode: 'cors',
cache: 'no-cache',
credentials: 'same-origin',
headers: {
'X-CSRF-Token': csrfToken,
'Content-Type': 'application/json',
},
redirect: 'follow',
referrer: 'no-referrer',
body: JSON.stringify(getFormJSON()),
}).then(data => {
data.json().then(
value => {
if (value.language) {
languageSelect.value = value.language;
}
style.innerHTML = "#output { " + value.background + "}";
if (htmlCheckbox.checked) {
output.innerText = value.html;
} else {
output.innerHTML = value.html;
}
}
);
}).catch(reason => {
console.log(reason);
});
event.preventDefault();
}
var eventHandler = (event) => update(event);
var debouncedEventHandler = debounce(eventHandler, 250);
languageSelect.addEventListener('change', eventHandler);
styleSelect.addEventListener('change', eventHandler);
htmlCheckbox.addEventListener('change', eventHandler);
textArea.addEventListener('input', debouncedEventHandler);
textArea.addEventListener('change', debouncedEventHandler);
});

@ -0,0 +1,86 @@
<!doctype html>
<html>
<head>
<title>Chroma Playground</title>
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/bulma/0.7.5/css/bulma.min.css"/>
<style>
textarea {
font-family: Consolas, Monaco, Lucida Console, Liberation Mono, DejaVu Sans Mono, Bitstream Vera Sans Mono, Courier New, monospace;
}
#output {
{{.Background}};
}
#output pre {
padding: 0;
}
</style>
<script src="static/index.js?{{.Version}}"></script>
</head>
<body>
<div class="container">
<h1 class="title">Chroma Playground</h1>
<div class="notification">
<a href="https://github.com/alecthomas/chroma">Chroma</a> is a general purpose syntax highlighter in pure Go.
It takes source code and other structured text and converts it into syntax highlighted HTML, ANSI-coloured text,
etc. Chroma is based heavily on Pygments, and includes translators for Pygments lexers and styles.
</div>
<form id="chroma" method="post">
{{ .CSRFField }}
<div class="columns">
<div class="column field">
<label class="label">Language</label>
<div class="control">
<div class="select">
<select name="language" id="language">
<option value="" disabled{{if eq "" $.SelectedLanguage}} selected{{end}}>Language</option>
{{- range .Languages}}
<option value="{{.}}"{{if eq . $.SelectedLanguage}} selected{{end}}>{{.}}</option>
{{- end}}
</select>
</div>
</div>
</div>
<div class="column field">
<label class="label">Style</label>
<div class="control">
<div class="select">
<select name="style" id="style">
<option value="" disabled{{if eq "" $.SelectedStyle}} selected{{end}}>Style</option>
{{- range .Styles}}
<option value="{{.}}"{{if eq . $.SelectedStyle}} selected{{end}}>{{.}}</option>
{{- end}}
</select>
</div>
</div>
</div>
</div>
<div class="field">
<label class="label">Code</label>
<div class="control">
<textarea autocomplete="off" autocorrect="off" autocapitalize="off" spellcheck="false" class="textarea" id="text" name="text" rows="25" cols="80"></textarea>
</div>
</div>
<hr>
<label class="checkbox is-pulled-right">
<input name="html" id="html" type="checkbox">
Show HTML
</label>
<label class="label">
Output
</label>
<div class="field box" id="output"></div>
</form>
</div>
</body>
</html>

@ -0,0 +1,35 @@
package chroma
// Coalesce is a Lexer interceptor that collapses runs of common types into a single token.
func Coalesce(lexer Lexer) Lexer { return &coalescer{lexer} }
type coalescer struct{ Lexer }
func (d *coalescer) Tokenise(options *TokeniseOptions, text string) (Iterator, error) {
var prev Token
it, err := d.Lexer.Tokenise(options, text)
if err != nil {
return nil, err
}
return func() Token {
for token := it(); token != (EOF); token = it() {
if len(token.Value) == 0 {
continue
}
if prev == EOF {
prev = token
} else {
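				// Merge adjacent tokens of the same type, but cap merged values at 8192 bytes.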
if prev.Type == token.Type && len(prev.Value) < 8192 {
prev.Value += token.Value
} else {
out := prev
prev = token
return out
}
}
}
out := prev
prev = EOF
return out
}, nil
}

@ -0,0 +1,19 @@
package chroma
import (
"testing"
"github.com/alecthomas/assert"
)
func TestCoalesce(t *testing.T) {
lexer := Coalesce(MustNewLexer(nil, Rules{
"root": []Rule{
{`[!@#$%^&*()]`, Punctuation, nil},
},
}))
actual, err := Tokenise(lexer, nil, "!@#$")
assert.NoError(t, err)
expected := []Token{{Punctuation, "!@#$"}}
assert.Equal(t, expected, actual)
}

@ -0,0 +1,164 @@
package chroma
import (
"fmt"
"math"
"strconv"
"strings"
)
// ANSI2RGB maps ANSI colour names, as supported by Chroma, to hex RGB values.
var ANSI2RGB = map[string]string{
"#ansiblack": "000000",
"#ansidarkred": "7f0000",
"#ansidarkgreen": "007f00",
"#ansibrown": "7f7fe0",
"#ansidarkblue": "00007f",
"#ansipurple": "7f007f",
"#ansiteal": "007f7f",
"#ansilightgray": "e5e5e5",
// Normal
"#ansidarkgray": "555555",
"#ansired": "ff0000",
"#ansigreen": "00ff00",
"#ansiyellow": "ffff00",
"#ansiblue": "0000ff",
"#ansifuchsia": "ff00ff",
"#ansiturquoise": "00ffff",
"#ansiwhite": "ffffff",
// Aliases without the "ansi" prefix, because...why?
"#black": "000000",
"#darkred": "7f0000",
"#darkgreen": "007f00",
"#brown": "7f7fe0",
"#darkblue": "00007f",
"#purple": "7f007f",
"#teal": "007f7f",
"#lightgray": "e5e5e5",
// Normal
"#darkgray": "555555",
"#red": "ff0000",
"#green": "00ff00",
"#yellow": "ffff00",
"#blue": "0000ff",
"#fuchsia": "ff00ff",
"#turquoise": "00ffff",
"#white": "ffffff",
}
// Colour represents an RGB colour.
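//
// The zero value means "unset": a set colour is stored internally as its RGB
// value plus one, which is why ParseColour returns Colour(n + 1) and the
// channel accessors subtract one.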
type Colour int32
// NewColour creates a Colour directly from RGB values.
func NewColour(r, g, b uint8) Colour {
return ParseColour(fmt.Sprintf("%02x%02x%02x", r, g, b))
}
// Distance between this colour and another.
//
// This uses the approach described here (https://www.compuphase.com/cmetric.htm).
// This is not as accurate as LAB, et al., but is *vastly* simpler and sufficient for our needs.
func (c Colour) Distance(e2 Colour) float64 {
ar, ag, ab := int64(c.Red()), int64(c.Green()), int64(c.Blue())
br, bg, bb := int64(e2.Red()), int64(e2.Green()), int64(e2.Blue())
rmean := (ar + br) / 2
r := ar - br
g := ag - bg
b := ab - bb
return math.Sqrt(float64((((512 + rmean) * r * r) >> 8) + 4*g*g + (((767 - rmean) * b * b) >> 8)))
}
// Brighten returns a copy of this colour with its brightness adjusted.
//
// If factor is negative, the colour is darkened.
//
// Uses approach described here (http://www.pvladov.com/2012/09/make-color-lighter-or-darker.html).
func (c Colour) Brighten(factor float64) Colour {
r := float64(c.Red())
g := float64(c.Green())
b := float64(c.Blue())
if factor < 0 {
factor++
r *= factor
g *= factor
b *= factor
} else {
r = (255-r)*factor + r
g = (255-g)*factor + g
b = (255-b)*factor + b
}
return NewColour(uint8(r), uint8(g), uint8(b))
}
// BrightenOrDarken brightens a colour if it is < 0.5 brightness or darkens if > 0.5 brightness.
func (c Colour) BrightenOrDarken(factor float64) Colour {
if c.Brightness() < 0.5 {
return c.Brighten(factor)
}
return c.Brighten(-factor)
}
// Brightness of the colour (roughly) in the range 0.0 to 1.0
func (c Colour) Brightness() float64 {
return (float64(c.Red()) + float64(c.Green()) + float64(c.Blue())) / 255.0 / 3.0
}
// ParseColour in the forms #rgb, #rrggbb, #ansi<colour>, or #<colour>.
// Will return an "unset" colour if invalid.
func ParseColour(colour string) Colour {
colour = normaliseColour(colour)
n, err := strconv.ParseUint(colour, 16, 32)
if err != nil {
return 0
}
return Colour(n + 1)
}
// MustParseColour is like ParseColour except it panics if the colour is invalid.
//
// Will panic if colour is in an invalid format.
func MustParseColour(colour string) Colour {
parsed := ParseColour(colour)
if !parsed.IsSet() {
panic(fmt.Errorf("invalid colour %q", colour))
}
return parsed
}
// IsSet returns true if the colour is set.
func (c Colour) IsSet() bool { return c != 0 }
func (c Colour) String() string { return fmt.Sprintf("#%06x", int(c-1)) }
func (c Colour) GoString() string { return fmt.Sprintf("Colour(0x%06x)", int(c-1)) }
// Red component of colour.
func (c Colour) Red() uint8 { return uint8(((c - 1) >> 16) & 0xff) }
// Green component of colour.
func (c Colour) Green() uint8 { return uint8(((c - 1) >> 8) & 0xff) }
// Blue component of colour.
func (c Colour) Blue() uint8 { return uint8((c - 1) & 0xff) }
// Colours is an orderable set of colours.
type Colours []Colour
func (c Colours) Len() int { return len(c) }
func (c Colours) Swap(i, j int) { c[i], c[j] = c[j], c[i] }
func (c Colours) Less(i, j int) bool { return c[i] < c[j] }
// Convert colours to #rrggbb.
func normaliseColour(colour string) string {
if ansi, ok := ANSI2RGB[colour]; ok {
return ansi
}
if strings.HasPrefix(colour, "#") {
colour = colour[1:]
if len(colour) == 3 {
return colour[0:1] + colour[0:1] + colour[1:2] + colour[1:2] + colour[2:3] + colour[2:3]
}
}
return colour
}
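
A minimal standalone sketch (not part of colour.go) of how the Colour API above fits together; it only uses the functions defined in this file and assumes the package is imported as github.com/alecthomas/chroma:

package main

import (
    "fmt"

    "github.com/alecthomas/chroma"
)

func main() {
    // ParseColour accepts #rgb, #rrggbb and the #ansi<name> aliases from ANSI2RGB.
    red := chroma.ParseColour("#f00")       // expanded to #ff0000
    teal := chroma.ParseColour("#ansiteal") // resolved via ANSI2RGB
    grey := chroma.NewColour(128, 128, 128)

    // Distance uses the weighted-RGB metric described above; smaller means more similar.
    fmt.Println(red.Distance(teal), red.Distance(grey))

    // Brighten(0.5) moves half-way towards white; a negative factor darkens instead.
    fmt.Println(grey.Brighten(0.5), grey.Brighten(-0.5), grey.Brightness())
}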

View file

@ -0,0 +1,42 @@
package chroma
import (
"testing"
"github.com/alecthomas/assert"
)
func TestColourRGB(t *testing.T) {
colour := ParseColour("#8913af")
assert.Equal(t, uint8(0x89), colour.Red())
assert.Equal(t, uint8(0x13), colour.Green())
assert.Equal(t, uint8(0xaf), colour.Blue())
}
func TestColourString(t *testing.T) {
assert.Equal(t, "#8913af", ParseColour("#8913af").String())
}
func distance(a, b uint8) uint8 {
if a < b {
return b - a
}
return a - b
}
func TestColourBrighten(t *testing.T) {
actual := NewColour(128, 128, 128).Brighten(0.5)
// Closeish to what we expect is fine.
assert.True(t, distance(192, actual.Red()) <= 2)
assert.True(t, distance(192, actual.Blue()) <= 2)
assert.True(t, distance(192, actual.Green()) <= 2)
actual = NewColour(128, 128, 128).Brighten(-0.5)
assert.True(t, distance(65, actual.Red()) <= 2)
assert.True(t, distance(65, actual.Blue()) <= 2)
assert.True(t, distance(65, actual.Green()) <= 2)
}
func TestColourBrightness(t *testing.T) {
actual := NewColour(128, 128, 128).Brightness()
assert.True(t, distance(128, uint8(actual*255.0)) <= 2)
}

View file

@ -0,0 +1,137 @@
package chroma
import (
"bytes"
)
type delegatingLexer struct {
root Lexer
language Lexer
}
// DelegatingLexer combines two lexers to handle the common case of a language embedded inside another, such as PHP
// inside HTML or PHP inside plain text.
//
// It takes two lexers as arguments: a root lexer and a language lexer. First, everything is scanned using the language
// lexer, which must return "Other" for unrecognised tokens. Then all "Other" tokens are lexed using the root lexer.
// Finally, these two sets of tokens are merged.
//
// The lexers from the template lexer package use this base lexer.
func DelegatingLexer(root Lexer, language Lexer) Lexer {
return &delegatingLexer{
root: root,
language: language,
}
}
func (d *delegatingLexer) Config() *Config {
return d.language.Config()
}
// An insertion is the character range where language tokens should be inserted.
type insertion struct {
start, end int
tokens []Token
}
func (d *delegatingLexer) Tokenise(options *TokeniseOptions, text string) (Iterator, error) { // nolint: gocognit
tokens, err := Tokenise(Coalesce(d.language), options, text)
if err != nil {
return nil, err
}
// Compute insertions and gather "Other" tokens.
others := &bytes.Buffer{}
insertions := []*insertion{}
var insert *insertion
offset := 0
var last Token
for _, t := range tokens {
if t.Type == Other {
if last != EOF && insert != nil && last.Type != Other {
insert.end = offset
}
others.WriteString(t.Value)
} else {
if last == EOF || last.Type == Other {
insert = &insertion{start: offset}
insertions = append(insertions, insert)
}
insert.tokens = append(insert.tokens, t)
}
last = t
offset += len(t.Value)
}
if len(insertions) == 0 {
return d.root.Tokenise(options, text)
}
// Lex the other tokens.
rootTokens, err := Tokenise(Coalesce(d.root), options, others.String())
if err != nil {
return nil, err
}
// Interleave the two sets of tokens.
var out []Token
offset = 0 // Offset into text.
tokenIndex := 0
nextToken := func() Token {
if tokenIndex >= len(rootTokens) {
return EOF
}
t := rootTokens[tokenIndex]
tokenIndex++
return t
}
insertionIndex := 0
nextInsertion := func() *insertion {
if insertionIndex >= len(insertions) {
return nil
}
i := insertions[insertionIndex]
insertionIndex++
return i
}
t := nextToken()
i := nextInsertion()
for t != EOF || i != nil {
// fmt.Printf("%d->%d:%q %d->%d:%q\n", offset, offset+len(t.Value), t.Value, i.start, i.end, Stringify(i.tokens...))
if t == EOF || (i != nil && i.start < offset+len(t.Value)) {
var l Token
l, t = splitToken(t, i.start-offset)
if l != EOF {
out = append(out, l)
offset += len(l.Value)
}
out = append(out, i.tokens...)
offset += i.end - i.start
if t == EOF {
t = nextToken()
}
i = nextInsertion()
} else {
out = append(out, t)
offset += len(t.Value)
t = nextToken()
}
}
return Literator(out...), nil
}
func splitToken(t Token, offset int) (l Token, r Token) {
if t == EOF {
return EOF, EOF
}
if offset == 0 {
return EOF, t
}
if offset == len(t.Value) {
return t, EOF
}
l = t.Clone()
r = t.Clone()
l.Value = l.Value[:offset]
r.Value = r.Value[offset:]
return
}
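
As a rough usage sketch (not part of delegate.go), the common "language embedded in a document" case looks like this. It assumes the registered "html" and "php" lexers behave as the comment above describes, i.e. the PHP lexer emits Other for everything outside <?php ... ?> blocks:

package main

import (
    "os"

    "github.com/alecthomas/chroma"
    "github.com/alecthomas/chroma/formatters"
    "github.com/alecthomas/chroma/lexers"
    "github.com/alecthomas/chroma/styles"
)

func main() {
    // Root lexer handles the surrounding document, language lexer the embedded code.
    lexer := chroma.DelegatingLexer(lexers.Get("html"), lexers.Get("php"))

    it, err := lexer.Tokenise(nil, `<p>hello <?php echo "world"; ?></p>`)
    if err != nil {
        panic(err)
    }
    // Dump the merged token stream using the "tokens" formatter defined later in this import.
    _ = formatters.Tokens.Format(os.Stdout, styles.Fallback, it)
}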

View file

@ -0,0 +1,111 @@
package chroma
import (
"testing"
"github.com/alecthomas/assert"
)
func makeDelegationTestLexers() (lang Lexer, root Lexer) {
return MustNewLexer(nil, Rules{
"root": {
{`\<\?`, CommentPreproc, Push("inside")},
{`.`, Other, nil},
},
"inside": {
{`\?\>`, CommentPreproc, Pop(1)},
{`\bwhat\b`, Keyword, nil},
{`\s+`, Whitespace, nil},
},
}),
MustNewLexer(nil, Rules{
"root": {
{`\bhello\b`, Keyword, nil},
{`\b(world|there)\b`, Name, nil},
{`\s+`, Whitespace, nil},
},
})
}
func TestDelegate(t *testing.T) {
testdata := []struct {
name string
source string
expected []Token
}{
{"SourceInMiddle", `hello world <? what ?> there`, []Token{
{Keyword, "hello"},
{TextWhitespace, " "},
{Name, "world"},
{TextWhitespace, " "},
// lang
{CommentPreproc, "<?"},
{Whitespace, " "},
{Keyword, "what"},
{Whitespace, " "},
{CommentPreproc, "?>"},
// /lang
{TextWhitespace, " "},
{Name, "there"},
}},
{"SourceBeginning", `<? what ?> hello world there`, []Token{
{CommentPreproc, "<?"},
{TextWhitespace, " "},
{Keyword, "what"},
{TextWhitespace, " "},
{CommentPreproc, "?>"},
{TextWhitespace, " "},
{Keyword, "hello"},
{TextWhitespace, " "},
{Name, "world"},
{TextWhitespace, " "},
{Name, "there"},
}},
{"SourceEnd", `hello world <? what there`, []Token{
{Keyword, "hello"},
{TextWhitespace, " "},
{Name, "world"},
{TextWhitespace, " "},
// lang
{CommentPreproc, "<?"},
{Whitespace, " "},
{Keyword, "what"},
{TextWhitespace, " "},
{Error, "there"},
}},
{"SourceMultiple", "hello world <? what ?> hello there <? what ?> hello", []Token{
{Keyword, "hello"},
{TextWhitespace, " "},
{Name, "world"},
{TextWhitespace, " "},
{CommentPreproc, "<?"},
{TextWhitespace, " "},
{Keyword, "what"},
{TextWhitespace, " "},
{CommentPreproc, "?>"},
{TextWhitespace, " "},
{Keyword, "hello"},
{TextWhitespace, " "},
{Name, "there"},
{TextWhitespace, " "},
{CommentPreproc, "<?"},
{TextWhitespace, " "},
{Keyword, "what"},
{TextWhitespace, " "},
{CommentPreproc, "?>"},
{TextWhitespace, " "},
{Keyword, "hello"},
}},
}
lang, root := makeDelegationTestLexers()
delegate := DelegatingLexer(root, lang)
for _, test := range testdata {
// nolint: scopelint
t.Run(test.name, func(t *testing.T) {
it, err := delegate.Tokenise(nil, test.source)
assert.NoError(t, err)
actual := it.Tokens()
assert.Equal(t, test.expected, actual)
})
}
}

View file

@ -0,0 +1,7 @@
// Package chroma takes source code and other structured text and converts it into syntax highlighted HTML, ANSI-
// coloured text, etc.
//
// Chroma is based heavily on Pygments, and includes translators for Pygments lexers and styles.
//
// For more information, go here: https://github.com/alecthomas/chroma
package chroma
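
For orientation, the typical end-to-end pipeline looks roughly like this. This is a sketch, not upstream code; it relies only on the lexers, styles and formatters sub-packages shown elsewhere in this import:

package main

import (
    "os"

    "github.com/alecthomas/chroma/formatters"
    "github.com/alecthomas/chroma/lexers"
    "github.com/alecthomas/chroma/styles"
)

func main() {
    source := "package main\n\nfunc main() { println(\"hi\") }\n"

    lexer := lexers.Get("go")                   // pick a lexer by name
    style := styles.Get("github")               // pick a style; unknown names fall back to a default
    formatter := formatters.Get("terminal256")  // pick a registered formatter

    it, err := lexer.Tokenise(nil, source) // lex the source into an iterator of tokens
    if err != nil {
        panic(err)
    }
    if err := formatter.Format(os.Stdout, style, it); err != nil {
        panic(err)
    }
}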

View file

@ -0,0 +1,43 @@
package chroma
import (
"io"
)
// A Formatter for Chroma lexers.
type Formatter interface {
// Format returns a formatting function for tokens.
//
// If the iterator panics, the Formatter should recover.
Format(w io.Writer, style *Style, iterator Iterator) error
}
// A FormatterFunc is a Formatter implemented as a function.
//
// Guards against iterator panics.
type FormatterFunc func(w io.Writer, style *Style, iterator Iterator) error
func (f FormatterFunc) Format(w io.Writer, s *Style, it Iterator) (err error) { // nolint
defer func() {
if perr := recover(); perr != nil {
err = perr.(error)
}
}()
return f(w, s, it)
}
type recoveringFormatter struct {
Formatter
}
func (r recoveringFormatter) Format(w io.Writer, s *Style, it Iterator) (err error) {
defer func() {
if perr := recover(); perr != nil {
err = perr.(error)
}
}()
return r.Formatter.Format(w, s, it)
}
// RecoveringFormatter wraps a formatter with panic recovery.
func RecoveringFormatter(formatter Formatter) Formatter { return recoveringFormatter{formatter} }
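
A custom Formatter can be as small as a single FormatterFunc. This standalone sketch (not part of formatter.go) writes one "type: value" line per token; the names here are made up for illustration:

package main

import (
    "fmt"
    "io"
    "os"

    "github.com/alecthomas/chroma"
    "github.com/alecthomas/chroma/styles"
)

// lineFormatter prints each token as "TYPE: value".
var lineFormatter chroma.Formatter = chroma.FormatterFunc(
    func(w io.Writer, s *chroma.Style, it chroma.Iterator) error {
        for t := it(); t != chroma.EOF; t = it() {
            if _, err := fmt.Fprintf(w, "%s: %q\n", t.Type, t.Value); err != nil {
                return err
            }
        }
        return nil
    })

func main() {
    tokens := []chroma.Token{{Type: chroma.Keyword, Value: "func"}, {Type: chroma.Text, Value: " "}}
    _ = lineFormatter.Format(os.Stdout, styles.Fallback, chroma.Literator(tokens...))
}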

View file

@ -0,0 +1,57 @@
package formatters
import (
"io"
"sort"
"github.com/alecthomas/chroma"
"github.com/alecthomas/chroma/formatters/html"
"github.com/alecthomas/chroma/formatters/svg"
)
var (
// NoOp formatter.
NoOp = Register("noop", chroma.FormatterFunc(func(w io.Writer, s *chroma.Style, iterator chroma.Iterator) error {
for t := iterator(); t != chroma.EOF; t = iterator() {
if _, err := io.WriteString(w, t.Value); err != nil {
return err
}
}
return nil
}))
// Default HTML formatter outputs self-contained HTML.
htmlFull = Register("html", html.New(html.Standalone(true), html.WithClasses(true))) // nolint
SVG = Register("svg", svg.New(svg.EmbedFont("Liberation Mono", svg.FontLiberationMono, svg.WOFF)))
)
// Fallback formatter.
var Fallback = NoOp
// Registry of Formatters.
var Registry = map[string]chroma.Formatter{}
// Names of registered formatters.
func Names() []string {
out := []string{}
for name := range Registry {
out = append(out, name)
}
sort.Strings(out)
return out
}
// Get formatter by name.
//
// If the given formatter is not found, the Fallback formatter will be returned.
func Get(name string) chroma.Formatter {
if f, ok := Registry[name]; ok {
return f
}
return Fallback
}
// Register a named formatter.
func Register(name string, formatter chroma.Formatter) chroma.Formatter {
Registry[name] = formatter
return formatter
}
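
The registry above is a global map populated via Register. A sketch of registering and looking up a formatter (the "uppercase" formatter and its name are invented for illustration):

package main

import (
    "fmt"
    "io"
    "strings"

    "github.com/alecthomas/chroma"
    "github.com/alecthomas/chroma/formatters"
)

func main() {
    // Register a toy formatter that upper-cases token values.
    formatters.Register("uppercase", chroma.FormatterFunc(
        func(w io.Writer, s *chroma.Style, it chroma.Iterator) error {
            for t := it(); t != chroma.EOF; t = it() {
                if _, err := io.WriteString(w, strings.ToUpper(t.Value)); err != nil {
                    return err
                }
            }
            return nil
        }))

    fmt.Println(formatters.Names())       // includes "uppercase" alongside the built-ins
    f := formatters.Get("does-not-exist") // unknown names return the Fallback formatter
    fmt.Printf("%T\n", f)
}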

View file

@ -0,0 +1,443 @@
package html
import (
"fmt"
"html"
"io"
"sort"
"strings"
"github.com/alecthomas/chroma"
)
// Option sets an option of the HTML formatter.
type Option func(f *Formatter)
// Standalone configures the HTML formatter for generating a standalone HTML document.
func Standalone(b bool) Option { return func(f *Formatter) { f.standalone = b } }
// ClassPrefix sets the CSS class prefix.
func ClassPrefix(prefix string) Option { return func(f *Formatter) { f.prefix = prefix } }
// WithClasses emits HTML using CSS classes, rather than inline styles.
func WithClasses(b bool) Option { return func(f *Formatter) { f.Classes = b } }
// WithAllClasses disables an optimisation that omits redundant CSS classes.
func WithAllClasses(b bool) Option { return func(f *Formatter) { f.allClasses = b } }
// TabWidth sets the number of characters for a tab. Defaults to 8.
func TabWidth(width int) Option { return func(f *Formatter) { f.tabWidth = width } }
// PreventSurroundingPre prevents the generated code from being wrapped in the surrounding <pre> tags.
func PreventSurroundingPre(b bool) Option {
return func(f *Formatter) {
if b {
f.preWrapper = nopPreWrapper
} else {
f.preWrapper = defaultPreWrapper
}
}
}
// WithPreWrapper allows control of the surrounding pre tags.
func WithPreWrapper(wrapper PreWrapper) Option {
return func(f *Formatter) {
f.preWrapper = wrapper
}
}
// WithLineNumbers formats output with line numbers.
func WithLineNumbers(b bool) Option {
return func(f *Formatter) {
f.lineNumbers = b
}
}
// LineNumbersInTable will, when combined with WithLineNumbers, place the line numbers
// and the code in separate table <td> cells, which makes the code copy-and-paste friendly.
func LineNumbersInTable(b bool) Option {
return func(f *Formatter) {
f.lineNumbersInTable = b
}
}
// LinkableLineNumbers decorates the line numbers HTML elements with an "id"
// attribute so they can be linked.
func LinkableLineNumbers(b bool, prefix string) Option {
return func(f *Formatter) {
f.linkableLineNumbers = b
f.lineNumbersIDPrefix = prefix
}
}
// HighlightLines highlights the given line ranges with the Highlight style.
//
// A range is the beginning and ending of a range as 1-based line numbers, inclusive.
func HighlightLines(ranges [][2]int) Option {
return func(f *Formatter) {
f.highlightRanges = ranges
sort.Sort(f.highlightRanges)
}
}
// BaseLineNumber sets the initial number to start line numbering at. Defaults to 1.
func BaseLineNumber(n int) Option {
return func(f *Formatter) {
f.baseLineNumber = n
}
}
// New HTML formatter.
func New(options ...Option) *Formatter {
f := &Formatter{
baseLineNumber: 1,
preWrapper: defaultPreWrapper,
}
for _, option := range options {
option(f)
}
return f
}
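
To make the relationship between the options above concrete, here is a rough standalone sketch (not part of this file) of the two common modes: inline styles versus CSS classes plus WriteCSS. It uses the lexers and styles helpers shown elsewhere in this import:

package main

import (
    "os"

    "github.com/alecthomas/chroma/formatters/html"
    "github.com/alecthomas/chroma/lexers"
    "github.com/alecthomas/chroma/styles"
)

func main() {
    source := "echo hello"
    it, err := lexers.Get("bash").Tokenise(nil, source)
    if err != nil {
        panic(err)
    }

    // Inline styles: every span carries a style="..." attribute, no stylesheet needed.
    inline := html.New(html.WithLineNumbers(true), html.TabWidth(4))
    _ = inline.Format(os.Stdout, styles.Fallback, it)

    // Class-based: emit bare class names and serve the stylesheet separately via WriteCSS.
    classed := html.New(html.WithClasses(true), html.ClassPrefix("doc-"))
    _ = classed.WriteCSS(os.Stdout, styles.Fallback)

    it, _ = lexers.Get("bash").Tokenise(nil, source) // iterators are single-use, so re-tokenise
    _ = classed.Format(os.Stdout, styles.Fallback, it)
}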
// PreWrapper defines the operations supported in WithPreWrapper.
type PreWrapper interface {
// Start is called to write a start <pre> element.
// The code flag tells whether this block surrounds
// highlighted code. This will be false when surrounding
// line numbers.
Start(code bool, styleAttr string) string
// End is called to write the end </pre> element.
End(code bool) string
}
type preWrapper struct {
start func(code bool, styleAttr string) string
end func(code bool) string
}
func (p preWrapper) Start(code bool, styleAttr string) string {
return p.start(code, styleAttr)
}
func (p preWrapper) End(code bool) string {
return p.end(code)
}
var (
nopPreWrapper = preWrapper{
start: func(code bool, styleAttr string) string { return "" },
end: func(code bool) string { return "" },
}
defaultPreWrapper = preWrapper{
start: func(code bool, styleAttr string) string {
return fmt.Sprintf("<pre%s>", styleAttr)
},
end: func(code bool) string {
return "</pre>"
},
}
)
// Formatter that generates HTML.
type Formatter struct {
standalone bool
prefix string
Classes bool // Exported field to detect when classes are being used
allClasses bool
preWrapper PreWrapper
tabWidth int
lineNumbers bool
lineNumbersInTable bool
linkableLineNumbers bool
lineNumbersIDPrefix string
highlightRanges highlightRanges
baseLineNumber int
}
type highlightRanges [][2]int
func (h highlightRanges) Len() int { return len(h) }
func (h highlightRanges) Swap(i, j int) { h[i], h[j] = h[j], h[i] }
func (h highlightRanges) Less(i, j int) bool { return h[i][0] < h[j][0] }
func (f *Formatter) Format(w io.Writer, style *chroma.Style, iterator chroma.Iterator) (err error) {
return f.writeHTML(w, style, iterator.Tokens())
}
// We deliberately don't use html/template here because it is two orders of magnitude slower (benchmarked).
//
// OTOH we need to be super careful about correct escaping...
func (f *Formatter) writeHTML(w io.Writer, style *chroma.Style, tokens []chroma.Token) (err error) { // nolint: gocyclo
css := f.styleToCSS(style)
if !f.Classes {
for t, style := range css {
css[t] = compressStyle(style)
}
}
if f.standalone {
fmt.Fprint(w, "<html>\n")
if f.Classes {
fmt.Fprint(w, "<style type=\"text/css\">\n")
err = f.WriteCSS(w, style)
if err != nil {
return err
}
fmt.Fprintf(w, "body { %s; }\n", css[chroma.Background])
fmt.Fprint(w, "</style>")
}
fmt.Fprintf(w, "<body%s>\n", f.styleAttr(css, chroma.Background))
}
wrapInTable := f.lineNumbers && f.lineNumbersInTable
lines := chroma.SplitTokensIntoLines(tokens)
lineDigits := len(fmt.Sprintf("%d", f.baseLineNumber+len(lines)-1))
highlightIndex := 0
if wrapInTable {
// List line numbers in its own <td>
fmt.Fprintf(w, "<div%s>\n", f.styleAttr(css, chroma.Background))
fmt.Fprintf(w, "<table%s><tr>", f.styleAttr(css, chroma.LineTable))
fmt.Fprintf(w, "<td%s>\n", f.styleAttr(css, chroma.LineTableTD))
fmt.Fprint(w, f.preWrapper.Start(false, f.styleAttr(css, chroma.Background)))
for index := range lines {
line := f.baseLineNumber + index
highlight, next := f.shouldHighlight(highlightIndex, line)
if next {
highlightIndex++
}
if highlight {
fmt.Fprintf(w, "<span%s>", f.styleAttr(css, chroma.LineHighlight))
}
fmt.Fprintf(w, "<span%s%s>%*d\n</span>", f.styleAttr(css, chroma.LineNumbersTable), f.lineIDAttribute(line), lineDigits, line)
if highlight {
fmt.Fprintf(w, "</span>")
}
}
fmt.Fprint(w, f.preWrapper.End(false))
fmt.Fprint(w, "</td>\n")
fmt.Fprintf(w, "<td%s>\n", f.styleAttr(css, chroma.LineTableTD, "width:100%"))
}
fmt.Fprint(w, f.preWrapper.Start(true, f.styleAttr(css, chroma.Background)))
highlightIndex = 0
for index, tokens := range lines {
// 1-based line number.
line := f.baseLineNumber + index
highlight, next := f.shouldHighlight(highlightIndex, line)
if next {
highlightIndex++
}
if highlight {
fmt.Fprintf(w, "<span%s>", f.styleAttr(css, chroma.LineHighlight))
}
if f.lineNumbers && !wrapInTable {
fmt.Fprintf(w, "<span%s%s>%*d</span>", f.styleAttr(css, chroma.LineNumbers), f.lineIDAttribute(line), lineDigits, line)
}
for _, token := range tokens {
html := html.EscapeString(token.String())
attr := f.styleAttr(css, token.Type)
if attr != "" {
html = fmt.Sprintf("<span%s>%s</span>", attr, html)
}
fmt.Fprint(w, html)
}
if highlight {
fmt.Fprintf(w, "</span>")
}
}
fmt.Fprint(w, f.preWrapper.End(true))
if wrapInTable {
fmt.Fprint(w, "</td></tr></table>\n")
fmt.Fprint(w, "</div>\n")
}
if f.standalone {
fmt.Fprint(w, "\n</body>\n")
fmt.Fprint(w, "</html>\n")
}
return nil
}
func (f *Formatter) lineIDAttribute(line int) string {
if !f.linkableLineNumbers {
return ""
}
return fmt.Sprintf(" id=\"%s%d\"", f.lineNumbersIDPrefix, line)
}
func (f *Formatter) shouldHighlight(highlightIndex, line int) (bool, bool) {
next := false
for highlightIndex < len(f.highlightRanges) && line > f.highlightRanges[highlightIndex][1] {
highlightIndex++
next = true
}
if highlightIndex < len(f.highlightRanges) {
hrange := f.highlightRanges[highlightIndex]
if line >= hrange[0] && line <= hrange[1] {
return true, next
}
}
return false, next
}
func (f *Formatter) class(t chroma.TokenType) string {
for t != 0 {
if cls, ok := chroma.StandardTypes[t]; ok {
if cls != "" {
return f.prefix + cls
}
return ""
}
t = t.Parent()
}
if cls := chroma.StandardTypes[t]; cls != "" {
return f.prefix + cls
}
return ""
}
func (f *Formatter) styleAttr(styles map[chroma.TokenType]string, tt chroma.TokenType, extraCSS ...string) string {
if f.Classes {
cls := f.class(tt)
if cls == "" {
return ""
}
return fmt.Sprintf(` class="%s"`, cls)
}
if _, ok := styles[tt]; !ok {
tt = tt.SubCategory()
if _, ok := styles[tt]; !ok {
tt = tt.Category()
if _, ok := styles[tt]; !ok {
return ""
}
}
}
css := []string{styles[tt]}
css = append(css, extraCSS...)
return fmt.Sprintf(` style="%s"`, strings.Join(css, ";"))
}
func (f *Formatter) tabWidthStyle() string {
if f.tabWidth != 0 && f.tabWidth != 8 {
return fmt.Sprintf("; -moz-tab-size: %[1]d; -o-tab-size: %[1]d; tab-size: %[1]d", f.tabWidth)
}
return ""
}
// WriteCSS writes CSS style definitions (without any surrounding HTML).
func (f *Formatter) WriteCSS(w io.Writer, style *chroma.Style) error {
css := f.styleToCSS(style)
// Special-case background as it is mapped to the outer ".chroma" class.
if _, err := fmt.Fprintf(w, "/* %s */ .%schroma { %s }\n", chroma.Background, f.prefix, css[chroma.Background]); err != nil {
return err
}
// Special-case code column of table to expand width.
if f.lineNumbers && f.lineNumbersInTable {
if _, err := fmt.Fprintf(w, "/* %s */ .%schroma .%s:last-child { width: 100%%; }",
chroma.LineTableTD, f.prefix, f.class(chroma.LineTableTD)); err != nil {
return err
}
}
// Special-case line number highlighting when targeted.
if f.lineNumbers || f.lineNumbersInTable {
targetedLineCSS := StyleEntryToCSS(style.Get(chroma.LineHighlight))
for _, tt := range []chroma.TokenType{chroma.LineNumbers, chroma.LineNumbersTable} {
fmt.Fprintf(w, "/* %s targeted by URL anchor */ .%schroma .%s:target { %s }\n", tt, f.prefix, f.class(tt), targetedLineCSS)
}
}
tts := []int{}
for tt := range css {
tts = append(tts, int(tt))
}
sort.Ints(tts)
for _, ti := range tts {
tt := chroma.TokenType(ti)
if tt == chroma.Background {
continue
}
class := f.class(tt)
if class == "" {
continue
}
styles := css[tt]
if _, err := fmt.Fprintf(w, "/* %s */ .%schroma .%s { %s }\n", tt, f.prefix, class, styles); err != nil {
return err
}
}
return nil
}
func (f *Formatter) styleToCSS(style *chroma.Style) map[chroma.TokenType]string {
classes := map[chroma.TokenType]string{}
bg := style.Get(chroma.Background)
// Convert the style.
for t := range chroma.StandardTypes {
entry := style.Get(t)
if t != chroma.Background {
entry = entry.Sub(bg)
}
if !f.allClasses && entry.IsZero() {
continue
}
classes[t] = StyleEntryToCSS(entry)
}
classes[chroma.Background] += f.tabWidthStyle()
lineNumbersStyle := "margin-right: 0.4em; padding: 0 0.4em 0 0.4em;"
// All rules begin with default rules followed by user provided rules
classes[chroma.LineNumbers] = lineNumbersStyle + classes[chroma.LineNumbers]
classes[chroma.LineNumbersTable] = lineNumbersStyle + classes[chroma.LineNumbersTable]
classes[chroma.LineHighlight] = "display: block; width: 100%;" + classes[chroma.LineHighlight]
classes[chroma.LineTable] = "border-spacing: 0; padding: 0; margin: 0; border: 0; width: auto; overflow: auto; display: block;" + classes[chroma.LineTable]
classes[chroma.LineTableTD] = "vertical-align: top; padding: 0; margin: 0; border: 0;" + classes[chroma.LineTableTD]
return classes
}
// StyleEntryToCSS converts a chroma.StyleEntry to CSS attributes.
func StyleEntryToCSS(e chroma.StyleEntry) string {
styles := []string{}
if e.Colour.IsSet() {
styles = append(styles, "color: "+e.Colour.String())
}
if e.Background.IsSet() {
styles = append(styles, "background-color: "+e.Background.String())
}
if e.Bold == chroma.Yes {
styles = append(styles, "font-weight: bold")
}
if e.Italic == chroma.Yes {
styles = append(styles, "font-style: italic")
}
if e.Underline == chroma.Yes {
styles = append(styles, "text-decoration: underline")
}
return strings.Join(styles, "; ")
}
// Compress CSS attributes - remove spaces, transform 6-digit colours to 3.
func compressStyle(s string) string {
parts := strings.Split(s, ";")
out := []string{}
for _, p := range parts {
p = strings.Join(strings.Fields(p), " ")
p = strings.Replace(p, ": ", ":", 1)
if strings.Contains(p, "#") {
c := p[len(p)-6:]
if c[0] == c[1] && c[2] == c[3] && c[4] == c[5] {
p = p[:len(p)-6] + c[0:1] + c[2:3] + c[4:5]
}
}
out = append(out, p)
}
return strings.Join(out, ";")
}

View file

@ -0,0 +1,235 @@
package html
import (
"bytes"
"fmt"
"io/ioutil"
"strings"
"testing"
"github.com/alecthomas/assert"
"github.com/alecthomas/chroma"
"github.com/alecthomas/chroma/lexers"
"github.com/alecthomas/chroma/styles"
)
func TestCompressStyle(t *testing.T) {
style := "color: #888888; background-color: #faffff"
actual := compressStyle(style)
expected := "color:#888;background-color:#faffff"
assert.Equal(t, expected, actual)
}
func BenchmarkHTMLFormatter(b *testing.B) {
formatter := New()
b.ResetTimer()
for i := 0; i < b.N; i++ {
it, err := lexers.Get("go").Tokenise(nil, "package main\nfunc main()\n{\nprintln(`hello world`)\n}\n")
assert.NoError(b, err)
err = formatter.Format(ioutil.Discard, styles.Fallback, it)
assert.NoError(b, err)
}
}
func TestSplitTokensIntoLines(t *testing.T) {
in := []chroma.Token{
{Value: "hello", Type: chroma.NameKeyword},
{Value: " world\nwhat?\n", Type: chroma.NameKeyword},
}
expected := [][]chroma.Token{
{
{Type: chroma.NameKeyword, Value: "hello"},
{Type: chroma.NameKeyword, Value: " world\n"},
},
{
{Type: chroma.NameKeyword, Value: "what?\n"},
},
}
actual := chroma.SplitTokensIntoLines(in)
assert.Equal(t, expected, actual)
}
func TestFormatterStyleToCSS(t *testing.T) {
builder := styles.Get("github").Builder()
builder.Add(chroma.LineHighlight, "bg:#ffffcc")
builder.Add(chroma.LineNumbers, "bold")
style, err := builder.Build()
if err != nil {
t.Error(err)
}
formatter := New(WithClasses(true))
css := formatter.styleToCSS(style)
for _, s := range css {
if strings.HasPrefix(strings.TrimSpace(s), ";") {
t.Errorf("rule starts with semicolon - expected valid css rule without semicolon: %v", s)
}
}
}
func TestClassPrefix(t *testing.T) {
wantPrefix := "some-prefix-"
withPrefix := New(WithClasses(true), ClassPrefix(wantPrefix))
noPrefix := New(WithClasses(true))
for st := range chroma.StandardTypes {
if noPrefix.class(st) == "" {
if got := withPrefix.class(st); got != "" {
t.Errorf("Formatter.class(%v): prefix shouldn't be added to empty classes", st)
}
} else if got := withPrefix.class(st); !strings.HasPrefix(got, wantPrefix) {
t.Errorf("Formatter.class(%v): %q should have a class prefix", st, got)
}
}
var styleBuf bytes.Buffer
err := withPrefix.WriteCSS(&styleBuf, styles.Fallback)
assert.NoError(t, err)
if !strings.Contains(styleBuf.String(), ".some-prefix-chroma ") {
t.Error("Stylesheets should have a class prefix")
}
}
func TestTableLineNumberNewlines(t *testing.T) {
f := New(WithClasses(true), WithLineNumbers(true), LineNumbersInTable(true))
it, err := lexers.Get("go").Tokenise(nil, "package main\nfunc main()\n{\nprintln(`hello world`)\n}\n")
assert.NoError(t, err)
var buf bytes.Buffer
err = f.Format(&buf, styles.Fallback, it)
assert.NoError(t, err)
// Don't bother testing the whole output, just verify it's got line numbers
// in a <pre>-friendly format.
// Note: placing the newlines inside the <span> lets browser selections look
// better, instead of "skipping" over the span margin.
assert.Contains(t, buf.String(), `<span class="lnt">2
</span><span class="lnt">3
</span><span class="lnt">4
</span>`)
}
func TestTableLineNumberSpacing(t *testing.T) {
testCases := []struct {
baseLineNumber int
expectedBuf string
}{{
7,
`<span class="lnt"> 7
</span><span class="lnt"> 8
</span><span class="lnt"> 9
</span><span class="lnt">10
</span><span class="lnt">11
</span>`,
}, {
6,
`<span class="lnt"> 6
</span><span class="lnt"> 7
</span><span class="lnt"> 8
</span><span class="lnt"> 9
</span><span class="lnt">10
</span>`,
}, {
5,
`<span class="lnt">5
</span><span class="lnt">6
</span><span class="lnt">7
</span><span class="lnt">8
</span><span class="lnt">9
</span>`,
}}
for i, testCase := range testCases {
f := New(
WithClasses(true),
WithLineNumbers(true),
LineNumbersInTable(true),
BaseLineNumber(testCase.baseLineNumber),
)
it, err := lexers.Get("go").Tokenise(nil, "package main\nfunc main()\n{\nprintln(`hello world`)\n}\n")
assert.NoError(t, err)
var buf bytes.Buffer
err = f.Format(&buf, styles.Fallback, it)
assert.NoError(t, err, "Test Case %d", i)
assert.Contains(t, buf.String(), testCase.expectedBuf, "Test Case %d", i)
}
}
func TestWithPreWrapper(t *testing.T) {
wrapper := preWrapper{
start: func(code bool, styleAttr string) string {
return fmt.Sprintf("<foo%s id=\"code-%t\">", styleAttr, code)
},
end: func(code bool) string {
return "</foo>"
},
}
format := func(f *Formatter) string {
it, err := lexers.Get("bash").Tokenise(nil, "echo FOO")
assert.NoError(t, err)
var buf bytes.Buffer
err = f.Format(&buf, styles.Fallback, it)
assert.NoError(t, err)
return buf.String()
}
t.Run("Regular", func(t *testing.T) {
s := format(New(WithClasses(true)))
assert.Equal(t, s, `<pre class="chroma"><span class="nb">echo</span> FOO</pre>`)
})
t.Run("PreventSurroundingPre", func(t *testing.T) {
s := format(New(PreventSurroundingPre(true), WithClasses(true)))
assert.Equal(t, s, `<span class="nb">echo</span> FOO`)
})
t.Run("Wrapper", func(t *testing.T) {
s := format(New(WithPreWrapper(wrapper), WithClasses(true)))
assert.Equal(t, s, `<foo class="chroma" id="code-true"><span class="nb">echo</span> FOO</foo>`)
})
t.Run("Wrapper, LineNumbersInTable", func(t *testing.T) {
s := format(New(WithPreWrapper(wrapper), WithClasses(true), WithLineNumbers(true), LineNumbersInTable(true)))
assert.Equal(t, s, `<div class="chroma">
<table class="lntable"><tr><td class="lntd">
<foo class="chroma" id="code-false"><span class="lnt">1
</span></foo></td>
<td class="lntd">
<foo class="chroma" id="code-true"><span class="nb">echo</span> FOO</foo></td></tr></table>
</div>
`)
})
}
func TestReconfigureOptions(t *testing.T) {
options := []Option{
WithClasses(true),
WithLineNumbers(true),
}
options = append(options, WithLineNumbers(false))
f := New(options...)
it, err := lexers.Get("bash").Tokenise(nil, "echo FOO")
assert.NoError(t, err)
var buf bytes.Buffer
err = f.Format(&buf, styles.Fallback, it)
assert.NoError(t, err)
assert.Equal(t, `<pre class="chroma"><span class="nb">echo</span> FOO</pre>`, buf.String())
}
func TestWriteCssWithAllClasses(t *testing.T) {
formatter := New()
formatter.allClasses = true
var buf bytes.Buffer
err := formatter.WriteCSS(&buf, styles.Fallback)
assert.NoError(t, err)
assert.NotContains(t, buf.String(), ".chroma . {", "Generated css doesn't contain invalid css")
}

View file

@ -0,0 +1,31 @@
package formatters
import (
"encoding/json"
"fmt"
"io"
"github.com/alecthomas/chroma"
)
// JSON formatter outputs the raw token structures as JSON.
var JSON = Register("json", chroma.FormatterFunc(func(w io.Writer, s *chroma.Style, it chroma.Iterator) error {
fmt.Fprintln(w, "[")
i := 0
for t := it(); t != chroma.EOF; t = it() {
if i > 0 {
fmt.Fprintln(w, ",")
}
i++
bytes, err := json.Marshal(t)
if err != nil {
return err
}
if _, err := fmt.Fprint(w, " "+string(bytes)); err != nil {
return err
}
}
fmt.Fprintln(w)
fmt.Fprintln(w, "]")
return nil
}))

File diff suppressed because one or more lines are too long

View file

@ -0,0 +1,222 @@
// Package svg contains an SVG formatter.
package svg
import (
"encoding/base64"
"errors"
"fmt"
"io"
"io/ioutil"
"path"
"strings"
"github.com/alecthomas/chroma"
)
// Option sets an option of the SVG formatter.
type Option func(f *Formatter)
// FontFamily sets the font-family.
func FontFamily(fontFamily string) Option { return func(f *Formatter) { f.fontFamily = fontFamily } }
// EmbedFontFile embeds the given font file.
func EmbedFontFile(fontFamily string, fileName string) (option Option, err error) {
var format FontFormat
switch path.Ext(fileName) {
case ".woff":
format = WOFF
case ".woff2":
format = WOFF2
case ".ttf":
format = TRUETYPE
default:
return nil, errors.New("unexpected font file suffix")
}
var content []byte
if content, err = ioutil.ReadFile(fileName); err == nil {
option = EmbedFont(fontFamily, base64.StdEncoding.EncodeToString(content), format)
}
return
}
// EmbedFont embeds the given base64-encoded font.
func EmbedFont(fontFamily string, font string, format FontFormat) Option {
return func(f *Formatter) { f.fontFamily = fontFamily; f.embeddedFont = font; f.fontFormat = format }
}
// New SVG formatter.
func New(options ...Option) *Formatter {
f := &Formatter{fontFamily: "Consolas, Monaco, Lucida Console, Liberation Mono, DejaVu Sans Mono, Bitstream Vera Sans Mono, Courier New, monospace"}
for _, option := range options {
option(f)
}
return f
}
// Formatter that generates SVG.
type Formatter struct {
fontFamily string
embeddedFont string
fontFormat FontFormat
}
func (f *Formatter) Format(w io.Writer, style *chroma.Style, iterator chroma.Iterator) (err error) {
f.writeSVG(w, style, iterator.Tokens())
return err
}
var svgEscaper = strings.NewReplacer(
`&`, "&amp;",
`<`, "&lt;",
`>`, "&gt;",
`"`, "&quot;",
` `, "&#160;",
"\t", "&#160;&#160;&#160;&#160;",
)
// EscapeString escapes special characters.
func escapeString(s string) string {
return svgEscaper.Replace(s)
}
func (f *Formatter) writeSVG(w io.Writer, style *chroma.Style, tokens []chroma.Token) { // nolint: gocyclo
svgStyles := f.styleToSVG(style)
lines := chroma.SplitTokensIntoLines(tokens)
fmt.Fprint(w, "<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n")
fmt.Fprint(w, "<!DOCTYPE svg PUBLIC \"-//W3C//DTD SVG 1.0//EN\" \"http://www.w3.org/TR/2001/REC-SVG-20010904/DTD/svg10.dtd\">\n")
fmt.Fprintf(w, "<svg width=\"%dpx\" height=\"%dpx\" xmlns=\"http://www.w3.org/2000/svg\">\n", 8*maxLineWidth(lines), 10+int(16.8*float64(len(lines)+1)))
if f.embeddedFont != "" {
f.writeFontStyle(w)
}
fmt.Fprintf(w, "<rect width=\"100%%\" height=\"100%%\" fill=\"%s\"/>\n", style.Get(chroma.Background).Background.String())
fmt.Fprintf(w, "<g font-family=\"%s\" font-size=\"14px\" fill=\"%s\">\n", f.fontFamily, style.Get(chroma.Text).Colour.String())
f.writeTokenBackgrounds(w, lines, style)
for index, tokens := range lines {
fmt.Fprintf(w, "<text x=\"0\" y=\"%fem\" xml:space=\"preserve\">", 1.2*float64(index+1))
for _, token := range tokens {
text := escapeString(token.String())
attr := f.styleAttr(svgStyles, token.Type)
if attr != "" {
text = fmt.Sprintf("<tspan %s>%s</tspan>", attr, text)
}
fmt.Fprint(w, text)
}
fmt.Fprint(w, "</text>")
}
fmt.Fprint(w, "\n</g>\n")
fmt.Fprint(w, "</svg>\n")
}
func maxLineWidth(lines [][]chroma.Token) int {
maxWidth := 0
for _, tokens := range lines {
length := 0
for _, token := range tokens {
length += len(strings.Replace(token.String(), "\t", "    ", -1))
}
if length > maxWidth {
maxWidth = length
}
}
return maxWidth
}
// There is no background attribute for text in SVG so simply calculate the position and text
// of tokens with a background color that differs from the default and add a rectangle for each before
// adding the token.
func (f *Formatter) writeTokenBackgrounds(w io.Writer, lines [][]chroma.Token, style *chroma.Style) {
for index, tokens := range lines {
lineLength := 0
for _, token := range tokens {
length := len(strings.Replace(token.String(), "\t", "    ", -1))
tokenBackground := style.Get(token.Type).Background
if tokenBackground.IsSet() && tokenBackground != style.Get(chroma.Background).Background {
fmt.Fprintf(w, "<rect id=\"%s\" x=\"%dch\" y=\"%fem\" width=\"%dch\" height=\"1.2em\" fill=\"%s\" />\n", escapeString(token.String()), lineLength, 1.2*float64(index)+0.25, length, style.Get(token.Type).Background.String())
}
lineLength += length
}
}
}
// FontFormat is the format of a font embedded in the generated SVG.
type FontFormat int
// https://transfonter.org/formats
const (
WOFF FontFormat = iota
WOFF2
TRUETYPE
)
var fontFormats = [...]string{
"woff",
"woff2",
"truetype",
}
func (f *Formatter) writeFontStyle(w io.Writer) {
fmt.Fprintf(w, `<style>
@font-face {
font-family: '%s';
src: url(data:application/x-font-%s;charset=utf-8;base64,%s) format('%s');
font-weight: normal;
font-style: normal;
}
</style>`, f.fontFamily, fontFormats[f.fontFormat], f.embeddedFont, fontFormats[f.fontFormat])
}
func (f *Formatter) styleAttr(styles map[chroma.TokenType]string, tt chroma.TokenType) string {
if _, ok := styles[tt]; !ok {
tt = tt.SubCategory()
if _, ok := styles[tt]; !ok {
tt = tt.Category()
if _, ok := styles[tt]; !ok {
return ""
}
}
}
return styles[tt]
}
func (f *Formatter) styleToSVG(style *chroma.Style) map[chroma.TokenType]string {
converted := map[chroma.TokenType]string{}
bg := style.Get(chroma.Background)
// Convert the style.
for t := range chroma.StandardTypes {
entry := style.Get(t)
if t != chroma.Background {
entry = entry.Sub(bg)
}
if entry.IsZero() {
continue
}
converted[t] = StyleEntryToSVG(entry)
}
return converted
}
// StyleEntryToSVG converts a chroma.StyleEntry to SVG attributes.
func StyleEntryToSVG(e chroma.StyleEntry) string {
var styles []string
if e.Colour.IsSet() {
styles = append(styles, "fill=\""+e.Colour.String()+"\"")
}
if e.Bold == chroma.Yes {
styles = append(styles, "font-weight=\"bold\"")
}
if e.Italic == chroma.Yes {
styles = append(styles, "font-style=\"italic\"")
}
if e.Underline == chroma.Yes {
styles = append(styles, "text-decoration=\"underline\"")
}
return strings.Join(styles, " ")
}
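
A sketch of using this formatter directly (standalone, not upstream code). It embeds the bundled Liberation Mono font the same way the "svg" registration elsewhere in this import does; EmbedFontFile could be used instead to embed a .woff/.woff2/.ttf from disk:

package main

import (
    "os"

    "github.com/alecthomas/chroma/formatters/svg"
    "github.com/alecthomas/chroma/lexers"
    "github.com/alecthomas/chroma/styles"
)

func main() {
    // Embed a font so the SVG renders the same everywhere.
    formatter := svg.New(svg.EmbedFont("Liberation Mono", svg.FontLiberationMono, svg.WOFF))

    it, err := lexers.Get("go").Tokenise(nil, "package main\n")
    if err != nil {
        panic(err)
    }
    if err := formatter.Format(os.Stdout, styles.Fallback, it); err != nil {
        panic(err)
    }
}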

View file

@ -0,0 +1,18 @@
package formatters
import (
"fmt"
"io"
"github.com/alecthomas/chroma"
)
// Tokens formatter outputs the raw token structures.
var Tokens = Register("tokens", chroma.FormatterFunc(func(w io.Writer, s *chroma.Style, it chroma.Iterator) error {
for t := it(); t != chroma.EOF; t = it() {
if _, err := fmt.Fprintln(w, t.GoString()); err != nil {
return err
}
}
return nil
}))

View file

@ -0,0 +1,260 @@
package formatters
import (
"fmt"
"io"
"math"
"github.com/alecthomas/chroma"
)
type ttyTable struct {
foreground map[chroma.Colour]string
background map[chroma.Colour]string
}
var c = chroma.MustParseColour
var ttyTables = map[int]*ttyTable{
8: {
foreground: map[chroma.Colour]string{
c("#000000"): "\033[30m", c("#7f0000"): "\033[31m", c("#007f00"): "\033[32m", c("#7f7fe0"): "\033[33m",
c("#00007f"): "\033[34m", c("#7f007f"): "\033[35m", c("#007f7f"): "\033[36m", c("#e5e5e5"): "\033[37m",
c("#555555"): "\033[90m", c("#ff0000"): "\033[91m", c("#00ff00"): "\033[92m", c("#ffff00"): "\033[93m",
c("#0000ff"): "\033[94m", c("#ff00ff"): "\033[95m", c("#00ffff"): "\033[96m", c("#ffffff"): "\033[97m",
},
background: map[chroma.Colour]string{
c("#000000"): "\033[40m", c("#7f0000"): "\033[41m", c("#007f00"): "\033[42m", c("#7f7fe0"): "\033[43m",
c("#00007f"): "\033[44m", c("#7f007f"): "\033[45m", c("#007f7f"): "\033[46m", c("#e5e5e5"): "\033[47m",
c("#555555"): "\033[100m", c("#ff0000"): "\033[101m", c("#00ff00"): "\033[102m", c("#ffff00"): "\033[103m",
c("#0000ff"): "\033[104m", c("#ff00ff"): "\033[105m", c("#00ffff"): "\033[106m", c("#ffffff"): "\033[107m",
},
},
256: {
foreground: map[chroma.Colour]string{
c("#000000"): "\033[38;5;0m", c("#800000"): "\033[38;5;1m", c("#008000"): "\033[38;5;2m", c("#808000"): "\033[38;5;3m",
c("#000080"): "\033[38;5;4m", c("#800080"): "\033[38;5;5m", c("#008080"): "\033[38;5;6m", c("#c0c0c0"): "\033[38;5;7m",
c("#808080"): "\033[38;5;8m", c("#ff0000"): "\033[38;5;9m", c("#00ff00"): "\033[38;5;10m", c("#ffff00"): "\033[38;5;11m",
c("#0000ff"): "\033[38;5;12m", c("#ff00ff"): "\033[38;5;13m", c("#00ffff"): "\033[38;5;14m", c("#ffffff"): "\033[38;5;15m",
c("#000000"): "\033[38;5;16m", c("#00005f"): "\033[38;5;17m", c("#000087"): "\033[38;5;18m", c("#0000af"): "\033[38;5;19m",
c("#0000d7"): "\033[38;5;20m", c("#0000ff"): "\033[38;5;21m", c("#005f00"): "\033[38;5;22m", c("#005f5f"): "\033[38;5;23m",
c("#005f87"): "\033[38;5;24m", c("#005faf"): "\033[38;5;25m", c("#005fd7"): "\033[38;5;26m", c("#005fff"): "\033[38;5;27m",
c("#008700"): "\033[38;5;28m", c("#00875f"): "\033[38;5;29m", c("#008787"): "\033[38;5;30m", c("#0087af"): "\033[38;5;31m",
c("#0087d7"): "\033[38;5;32m", c("#0087ff"): "\033[38;5;33m", c("#00af00"): "\033[38;5;34m", c("#00af5f"): "\033[38;5;35m",
c("#00af87"): "\033[38;5;36m", c("#00afaf"): "\033[38;5;37m", c("#00afd7"): "\033[38;5;38m", c("#00afff"): "\033[38;5;39m",
c("#00d700"): "\033[38;5;40m", c("#00d75f"): "\033[38;5;41m", c("#00d787"): "\033[38;5;42m", c("#00d7af"): "\033[38;5;43m",
c("#00d7d7"): "\033[38;5;44m", c("#00d7ff"): "\033[38;5;45m", c("#00ff00"): "\033[38;5;46m", c("#00ff5f"): "\033[38;5;47m",
c("#00ff87"): "\033[38;5;48m", c("#00ffaf"): "\033[38;5;49m", c("#00ffd7"): "\033[38;5;50m", c("#00ffff"): "\033[38;5;51m",
c("#5f0000"): "\033[38;5;52m", c("#5f005f"): "\033[38;5;53m", c("#5f0087"): "\033[38;5;54m", c("#5f00af"): "\033[38;5;55m",
c("#5f00d7"): "\033[38;5;56m", c("#5f00ff"): "\033[38;5;57m", c("#5f5f00"): "\033[38;5;58m", c("#5f5f5f"): "\033[38;5;59m",
c("#5f5f87"): "\033[38;5;60m", c("#5f5faf"): "\033[38;5;61m", c("#5f5fd7"): "\033[38;5;62m", c("#5f5fff"): "\033[38;5;63m",
c("#5f8700"): "\033[38;5;64m", c("#5f875f"): "\033[38;5;65m", c("#5f8787"): "\033[38;5;66m", c("#5f87af"): "\033[38;5;67m",
c("#5f87d7"): "\033[38;5;68m", c("#5f87ff"): "\033[38;5;69m", c("#5faf00"): "\033[38;5;70m", c("#5faf5f"): "\033[38;5;71m",
c("#5faf87"): "\033[38;5;72m", c("#5fafaf"): "\033[38;5;73m", c("#5fafd7"): "\033[38;5;74m", c("#5fafff"): "\033[38;5;75m",
c("#5fd700"): "\033[38;5;76m", c("#5fd75f"): "\033[38;5;77m", c("#5fd787"): "\033[38;5;78m", c("#5fd7af"): "\033[38;5;79m",
c("#5fd7d7"): "\033[38;5;80m", c("#5fd7ff"): "\033[38;5;81m", c("#5fff00"): "\033[38;5;82m", c("#5fff5f"): "\033[38;5;83m",
c("#5fff87"): "\033[38;5;84m", c("#5fffaf"): "\033[38;5;85m", c("#5fffd7"): "\033[38;5;86m", c("#5fffff"): "\033[38;5;87m",
c("#870000"): "\033[38;5;88m", c("#87005f"): "\033[38;5;89m", c("#870087"): "\033[38;5;90m", c("#8700af"): "\033[38;5;91m",
c("#8700d7"): "\033[38;5;92m", c("#8700ff"): "\033[38;5;93m", c("#875f00"): "\033[38;5;94m", c("#875f5f"): "\033[38;5;95m",
c("#875f87"): "\033[38;5;96m", c("#875faf"): "\033[38;5;97m", c("#875fd7"): "\033[38;5;98m", c("#875fff"): "\033[38;5;99m",
c("#878700"): "\033[38;5;100m", c("#87875f"): "\033[38;5;101m", c("#878787"): "\033[38;5;102m", c("#8787af"): "\033[38;5;103m",
c("#8787d7"): "\033[38;5;104m", c("#8787ff"): "\033[38;5;105m", c("#87af00"): "\033[38;5;106m", c("#87af5f"): "\033[38;5;107m",
c("#87af87"): "\033[38;5;108m", c("#87afaf"): "\033[38;5;109m", c("#87afd7"): "\033[38;5;110m", c("#87afff"): "\033[38;5;111m",
c("#87d700"): "\033[38;5;112m", c("#87d75f"): "\033[38;5;113m", c("#87d787"): "\033[38;5;114m", c("#87d7af"): "\033[38;5;115m",
c("#87d7d7"): "\033[38;5;116m", c("#87d7ff"): "\033[38;5;117m", c("#87ff00"): "\033[38;5;118m", c("#87ff5f"): "\033[38;5;119m",
c("#87ff87"): "\033[38;5;120m", c("#87ffaf"): "\033[38;5;121m", c("#87ffd7"): "\033[38;5;122m", c("#87ffff"): "\033[38;5;123m",
c("#af0000"): "\033[38;5;124m", c("#af005f"): "\033[38;5;125m", c("#af0087"): "\033[38;5;126m", c("#af00af"): "\033[38;5;127m",
c("#af00d7"): "\033[38;5;128m", c("#af00ff"): "\033[38;5;129m", c("#af5f00"): "\033[38;5;130m", c("#af5f5f"): "\033[38;5;131m",
c("#af5f87"): "\033[38;5;132m", c("#af5faf"): "\033[38;5;133m", c("#af5fd7"): "\033[38;5;134m", c("#af5fff"): "\033[38;5;135m",
c("#af8700"): "\033[38;5;136m", c("#af875f"): "\033[38;5;137m", c("#af8787"): "\033[38;5;138m", c("#af87af"): "\033[38;5;139m",
c("#af87d7"): "\033[38;5;140m", c("#af87ff"): "\033[38;5;141m", c("#afaf00"): "\033[38;5;142m", c("#afaf5f"): "\033[38;5;143m",
c("#afaf87"): "\033[38;5;144m", c("#afafaf"): "\033[38;5;145m", c("#afafd7"): "\033[38;5;146m", c("#afafff"): "\033[38;5;147m",
c("#afd700"): "\033[38;5;148m", c("#afd75f"): "\033[38;5;149m", c("#afd787"): "\033[38;5;150m", c("#afd7af"): "\033[38;5;151m",
c("#afd7d7"): "\033[38;5;152m", c("#afd7ff"): "\033[38;5;153m", c("#afff00"): "\033[38;5;154m", c("#afff5f"): "\033[38;5;155m",
c("#afff87"): "\033[38;5;156m", c("#afffaf"): "\033[38;5;157m", c("#afffd7"): "\033[38;5;158m", c("#afffff"): "\033[38;5;159m",
c("#d70000"): "\033[38;5;160m", c("#d7005f"): "\033[38;5;161m", c("#d70087"): "\033[38;5;162m", c("#d700af"): "\033[38;5;163m",
c("#d700d7"): "\033[38;5;164m", c("#d700ff"): "\033[38;5;165m", c("#d75f00"): "\033[38;5;166m", c("#d75f5f"): "\033[38;5;167m",
c("#d75f87"): "\033[38;5;168m", c("#d75faf"): "\033[38;5;169m", c("#d75fd7"): "\033[38;5;170m", c("#d75fff"): "\033[38;5;171m",
c("#d78700"): "\033[38;5;172m", c("#d7875f"): "\033[38;5;173m", c("#d78787"): "\033[38;5;174m", c("#d787af"): "\033[38;5;175m",
c("#d787d7"): "\033[38;5;176m", c("#d787ff"): "\033[38;5;177m", c("#d7af00"): "\033[38;5;178m", c("#d7af5f"): "\033[38;5;179m",
c("#d7af87"): "\033[38;5;180m", c("#d7afaf"): "\033[38;5;181m", c("#d7afd7"): "\033[38;5;182m", c("#d7afff"): "\033[38;5;183m",
c("#d7d700"): "\033[38;5;184m", c("#d7d75f"): "\033[38;5;185m", c("#d7d787"): "\033[38;5;186m", c("#d7d7af"): "\033[38;5;187m",
c("#d7d7d7"): "\033[38;5;188m", c("#d7d7ff"): "\033[38;5;189m", c("#d7ff00"): "\033[38;5;190m", c("#d7ff5f"): "\033[38;5;191m",
c("#d7ff87"): "\033[38;5;192m", c("#d7ffaf"): "\033[38;5;193m", c("#d7ffd7"): "\033[38;5;194m", c("#d7ffff"): "\033[38;5;195m",
c("#ff0000"): "\033[38;5;196m", c("#ff005f"): "\033[38;5;197m", c("#ff0087"): "\033[38;5;198m", c("#ff00af"): "\033[38;5;199m",
c("#ff00d7"): "\033[38;5;200m", c("#ff00ff"): "\033[38;5;201m", c("#ff5f00"): "\033[38;5;202m", c("#ff5f5f"): "\033[38;5;203m",
c("#ff5f87"): "\033[38;5;204m", c("#ff5faf"): "\033[38;5;205m", c("#ff5fd7"): "\033[38;5;206m", c("#ff5fff"): "\033[38;5;207m",
c("#ff8700"): "\033[38;5;208m", c("#ff875f"): "\033[38;5;209m", c("#ff8787"): "\033[38;5;210m", c("#ff87af"): "\033[38;5;211m",
c("#ff87d7"): "\033[38;5;212m", c("#ff87ff"): "\033[38;5;213m", c("#ffaf00"): "\033[38;5;214m", c("#ffaf5f"): "\033[38;5;215m",
c("#ffaf87"): "\033[38;5;216m", c("#ffafaf"): "\033[38;5;217m", c("#ffafd7"): "\033[38;5;218m", c("#ffafff"): "\033[38;5;219m",
c("#ffd700"): "\033[38;5;220m", c("#ffd75f"): "\033[38;5;221m", c("#ffd787"): "\033[38;5;222m", c("#ffd7af"): "\033[38;5;223m",
c("#ffd7d7"): "\033[38;5;224m", c("#ffd7ff"): "\033[38;5;225m", c("#ffff00"): "\033[38;5;226m", c("#ffff5f"): "\033[38;5;227m",
c("#ffff87"): "\033[38;5;228m", c("#ffffaf"): "\033[38;5;229m", c("#ffffd7"): "\033[38;5;230m", c("#ffffff"): "\033[38;5;231m",
c("#080808"): "\033[38;5;232m", c("#121212"): "\033[38;5;233m", c("#1c1c1c"): "\033[38;5;234m", c("#262626"): "\033[38;5;235m",
c("#303030"): "\033[38;5;236m", c("#3a3a3a"): "\033[38;5;237m", c("#444444"): "\033[38;5;238m", c("#4e4e4e"): "\033[38;5;239m",
c("#585858"): "\033[38;5;240m", c("#626262"): "\033[38;5;241m", c("#6c6c6c"): "\033[38;5;242m", c("#767676"): "\033[38;5;243m",
c("#808080"): "\033[38;5;244m", c("#8a8a8a"): "\033[38;5;245m", c("#949494"): "\033[38;5;246m", c("#9e9e9e"): "\033[38;5;247m",
c("#a8a8a8"): "\033[38;5;248m", c("#b2b2b2"): "\033[38;5;249m", c("#bcbcbc"): "\033[38;5;250m", c("#c6c6c6"): "\033[38;5;251m",
c("#d0d0d0"): "\033[38;5;252m", c("#dadada"): "\033[38;5;253m", c("#e4e4e4"): "\033[38;5;254m", c("#eeeeee"): "\033[38;5;255m",
},
background: map[chroma.Colour]string{
c("#000000"): "\033[48;5;0m", c("#800000"): "\033[48;5;1m", c("#008000"): "\033[48;5;2m", c("#808000"): "\033[48;5;3m",
c("#000080"): "\033[48;5;4m", c("#800080"): "\033[48;5;5m", c("#008080"): "\033[48;5;6m", c("#c0c0c0"): "\033[48;5;7m",
c("#808080"): "\033[48;5;8m", c("#ff0000"): "\033[48;5;9m", c("#00ff00"): "\033[48;5;10m", c("#ffff00"): "\033[48;5;11m",
c("#0000ff"): "\033[48;5;12m", c("#ff00ff"): "\033[48;5;13m", c("#00ffff"): "\033[48;5;14m", c("#ffffff"): "\033[48;5;15m",
c("#000000"): "\033[48;5;16m", c("#00005f"): "\033[48;5;17m", c("#000087"): "\033[48;5;18m", c("#0000af"): "\033[48;5;19m",
c("#0000d7"): "\033[48;5;20m", c("#0000ff"): "\033[48;5;21m", c("#005f00"): "\033[48;5;22m", c("#005f5f"): "\033[48;5;23m",
c("#005f87"): "\033[48;5;24m", c("#005faf"): "\033[48;5;25m", c("#005fd7"): "\033[48;5;26m", c("#005fff"): "\033[48;5;27m",
c("#008700"): "\033[48;5;28m", c("#00875f"): "\033[48;5;29m", c("#008787"): "\033[48;5;30m", c("#0087af"): "\033[48;5;31m",
c("#0087d7"): "\033[48;5;32m", c("#0087ff"): "\033[48;5;33m", c("#00af00"): "\033[48;5;34m", c("#00af5f"): "\033[48;5;35m",
c("#00af87"): "\033[48;5;36m", c("#00afaf"): "\033[48;5;37m", c("#00afd7"): "\033[48;5;38m", c("#00afff"): "\033[48;5;39m",
c("#00d700"): "\033[48;5;40m", c("#00d75f"): "\033[48;5;41m", c("#00d787"): "\033[48;5;42m", c("#00d7af"): "\033[48;5;43m",
c("#00d7d7"): "\033[48;5;44m", c("#00d7ff"): "\033[48;5;45m", c("#00ff00"): "\033[48;5;46m", c("#00ff5f"): "\033[48;5;47m",
c("#00ff87"): "\033[48;5;48m", c("#00ffaf"): "\033[48;5;49m", c("#00ffd7"): "\033[48;5;50m", c("#00ffff"): "\033[48;5;51m",
c("#5f0000"): "\033[48;5;52m", c("#5f005f"): "\033[48;5;53m", c("#5f0087"): "\033[48;5;54m", c("#5f00af"): "\033[48;5;55m",
c("#5f00d7"): "\033[48;5;56m", c("#5f00ff"): "\033[48;5;57m", c("#5f5f00"): "\033[48;5;58m", c("#5f5f5f"): "\033[48;5;59m",
c("#5f5f87"): "\033[48;5;60m", c("#5f5faf"): "\033[48;5;61m", c("#5f5fd7"): "\033[48;5;62m", c("#5f5fff"): "\033[48;5;63m",
c("#5f8700"): "\033[48;5;64m", c("#5f875f"): "\033[48;5;65m", c("#5f8787"): "\033[48;5;66m", c("#5f87af"): "\033[48;5;67m",
c("#5f87d7"): "\033[48;5;68m", c("#5f87ff"): "\033[48;5;69m", c("#5faf00"): "\033[48;5;70m", c("#5faf5f"): "\033[48;5;71m",
c("#5faf87"): "\033[48;5;72m", c("#5fafaf"): "\033[48;5;73m", c("#5fafd7"): "\033[48;5;74m", c("#5fafff"): "\033[48;5;75m",
c("#5fd700"): "\033[48;5;76m", c("#5fd75f"): "\033[48;5;77m", c("#5fd787"): "\033[48;5;78m", c("#5fd7af"): "\033[48;5;79m",
c("#5fd7d7"): "\033[48;5;80m", c("#5fd7ff"): "\033[48;5;81m", c("#5fff00"): "\033[48;5;82m", c("#5fff5f"): "\033[48;5;83m",
c("#5fff87"): "\033[48;5;84m", c("#5fffaf"): "\033[48;5;85m", c("#5fffd7"): "\033[48;5;86m", c("#5fffff"): "\033[48;5;87m",
c("#870000"): "\033[48;5;88m", c("#87005f"): "\033[48;5;89m", c("#870087"): "\033[48;5;90m", c("#8700af"): "\033[48;5;91m",
c("#8700d7"): "\033[48;5;92m", c("#8700ff"): "\033[48;5;93m", c("#875f00"): "\033[48;5;94m", c("#875f5f"): "\033[48;5;95m",
c("#875f87"): "\033[48;5;96m", c("#875faf"): "\033[48;5;97m", c("#875fd7"): "\033[48;5;98m", c("#875fff"): "\033[48;5;99m",
c("#878700"): "\033[48;5;100m", c("#87875f"): "\033[48;5;101m", c("#878787"): "\033[48;5;102m", c("#8787af"): "\033[48;5;103m",
c("#8787d7"): "\033[48;5;104m", c("#8787ff"): "\033[48;5;105m", c("#87af00"): "\033[48;5;106m", c("#87af5f"): "\033[48;5;107m",
c("#87af87"): "\033[48;5;108m", c("#87afaf"): "\033[48;5;109m", c("#87afd7"): "\033[48;5;110m", c("#87afff"): "\033[48;5;111m",
c("#87d700"): "\033[48;5;112m", c("#87d75f"): "\033[48;5;113m", c("#87d787"): "\033[48;5;114m", c("#87d7af"): "\033[48;5;115m",
c("#87d7d7"): "\033[48;5;116m", c("#87d7ff"): "\033[48;5;117m", c("#87ff00"): "\033[48;5;118m", c("#87ff5f"): "\033[48;5;119m",
c("#87ff87"): "\033[48;5;120m", c("#87ffaf"): "\033[48;5;121m", c("#87ffd7"): "\033[48;5;122m", c("#87ffff"): "\033[48;5;123m",
c("#af0000"): "\033[48;5;124m", c("#af005f"): "\033[48;5;125m", c("#af0087"): "\033[48;5;126m", c("#af00af"): "\033[48;5;127m",
c("#af00d7"): "\033[48;5;128m", c("#af00ff"): "\033[48;5;129m", c("#af5f00"): "\033[48;5;130m", c("#af5f5f"): "\033[48;5;131m",
c("#af5f87"): "\033[48;5;132m", c("#af5faf"): "\033[48;5;133m", c("#af5fd7"): "\033[48;5;134m", c("#af5fff"): "\033[48;5;135m",
c("#af8700"): "\033[48;5;136m", c("#af875f"): "\033[48;5;137m", c("#af8787"): "\033[48;5;138m", c("#af87af"): "\033[48;5;139m",
c("#af87d7"): "\033[48;5;140m", c("#af87ff"): "\033[48;5;141m", c("#afaf00"): "\033[48;5;142m", c("#afaf5f"): "\033[48;5;143m",
c("#afaf87"): "\033[48;5;144m", c("#afafaf"): "\033[48;5;145m", c("#afafd7"): "\033[48;5;146m", c("#afafff"): "\033[48;5;147m",
c("#afd700"): "\033[48;5;148m", c("#afd75f"): "\033[48;5;149m", c("#afd787"): "\033[48;5;150m", c("#afd7af"): "\033[48;5;151m",
c("#afd7d7"): "\033[48;5;152m", c("#afd7ff"): "\033[48;5;153m", c("#afff00"): "\033[48;5;154m", c("#afff5f"): "\033[48;5;155m",
c("#afff87"): "\033[48;5;156m", c("#afffaf"): "\033[48;5;157m", c("#afffd7"): "\033[48;5;158m", c("#afffff"): "\033[48;5;159m",
c("#d70000"): "\033[48;5;160m", c("#d7005f"): "\033[48;5;161m", c("#d70087"): "\033[48;5;162m", c("#d700af"): "\033[48;5;163m",
c("#d700d7"): "\033[48;5;164m", c("#d700ff"): "\033[48;5;165m", c("#d75f00"): "\033[48;5;166m", c("#d75f5f"): "\033[48;5;167m",
c("#d75f87"): "\033[48;5;168m", c("#d75faf"): "\033[48;5;169m", c("#d75fd7"): "\033[48;5;170m", c("#d75fff"): "\033[48;5;171m",
c("#d78700"): "\033[48;5;172m", c("#d7875f"): "\033[48;5;173m", c("#d78787"): "\033[48;5;174m", c("#d787af"): "\033[48;5;175m",
c("#d787d7"): "\033[48;5;176m", c("#d787ff"): "\033[48;5;177m", c("#d7af00"): "\033[48;5;178m", c("#d7af5f"): "\033[48;5;179m",
c("#d7af87"): "\033[48;5;180m", c("#d7afaf"): "\033[48;5;181m", c("#d7afd7"): "\033[48;5;182m", c("#d7afff"): "\033[48;5;183m",
c("#d7d700"): "\033[48;5;184m", c("#d7d75f"): "\033[48;5;185m", c("#d7d787"): "\033[48;5;186m", c("#d7d7af"): "\033[48;5;187m",
c("#d7d7d7"): "\033[48;5;188m", c("#d7d7ff"): "\033[48;5;189m", c("#d7ff00"): "\033[48;5;190m", c("#d7ff5f"): "\033[48;5;191m",
c("#d7ff87"): "\033[48;5;192m", c("#d7ffaf"): "\033[48;5;193m", c("#d7ffd7"): "\033[48;5;194m", c("#d7ffff"): "\033[48;5;195m",
c("#ff0000"): "\033[48;5;196m", c("#ff005f"): "\033[48;5;197m", c("#ff0087"): "\033[48;5;198m", c("#ff00af"): "\033[48;5;199m",
c("#ff00d7"): "\033[48;5;200m", c("#ff00ff"): "\033[48;5;201m", c("#ff5f00"): "\033[48;5;202m", c("#ff5f5f"): "\033[48;5;203m",
c("#ff5f87"): "\033[48;5;204m", c("#ff5faf"): "\033[48;5;205m", c("#ff5fd7"): "\033[48;5;206m", c("#ff5fff"): "\033[48;5;207m",
c("#ff8700"): "\033[48;5;208m", c("#ff875f"): "\033[48;5;209m", c("#ff8787"): "\033[48;5;210m", c("#ff87af"): "\033[48;5;211m",
c("#ff87d7"): "\033[48;5;212m", c("#ff87ff"): "\033[48;5;213m", c("#ffaf00"): "\033[48;5;214m", c("#ffaf5f"): "\033[48;5;215m",
c("#ffaf87"): "\033[48;5;216m", c("#ffafaf"): "\033[48;5;217m", c("#ffafd7"): "\033[48;5;218m", c("#ffafff"): "\033[48;5;219m",
c("#ffd700"): "\033[48;5;220m", c("#ffd75f"): "\033[48;5;221m", c("#ffd787"): "\033[48;5;222m", c("#ffd7af"): "\033[48;5;223m",
c("#ffd7d7"): "\033[48;5;224m", c("#ffd7ff"): "\033[48;5;225m", c("#ffff00"): "\033[48;5;226m", c("#ffff5f"): "\033[48;5;227m",
c("#ffff87"): "\033[48;5;228m", c("#ffffaf"): "\033[48;5;229m", c("#ffffd7"): "\033[48;5;230m", c("#ffffff"): "\033[48;5;231m",
c("#080808"): "\033[48;5;232m", c("#121212"): "\033[48;5;233m", c("#1c1c1c"): "\033[48;5;234m", c("#262626"): "\033[48;5;235m",
c("#303030"): "\033[48;5;236m", c("#3a3a3a"): "\033[48;5;237m", c("#444444"): "\033[48;5;238m", c("#4e4e4e"): "\033[48;5;239m",
c("#585858"): "\033[48;5;240m", c("#626262"): "\033[48;5;241m", c("#6c6c6c"): "\033[48;5;242m", c("#767676"): "\033[48;5;243m",
c("#808080"): "\033[48;5;244m", c("#8a8a8a"): "\033[48;5;245m", c("#949494"): "\033[48;5;246m", c("#9e9e9e"): "\033[48;5;247m",
c("#a8a8a8"): "\033[48;5;248m", c("#b2b2b2"): "\033[48;5;249m", c("#bcbcbc"): "\033[48;5;250m", c("#c6c6c6"): "\033[48;5;251m",
c("#d0d0d0"): "\033[48;5;252m", c("#dadada"): "\033[48;5;253m", c("#e4e4e4"): "\033[48;5;254m", c("#eeeeee"): "\033[48;5;255m",
},
},
}
func entryToEscapeSequence(table *ttyTable, entry chroma.StyleEntry) string {
out := ""
if entry.Bold == chroma.Yes {
out += "\033[1m"
}
if entry.Underline == chroma.Yes {
out += "\033[4m"
}
if entry.Italic == chroma.Yes {
out += "\033[3m"
}
if entry.Colour.IsSet() {
out += table.foreground[findClosest(table, entry.Colour)]
}
if entry.Background.IsSet() {
out += table.background[findClosest(table, entry.Background)]
}
return out
}
func findClosest(table *ttyTable, seeking chroma.Colour) chroma.Colour {
closestColour := chroma.Colour(0)
closest := float64(math.MaxFloat64)
for colour := range table.foreground {
distance := colour.Distance(seeking)
if distance < closest {
closest = distance
closestColour = colour
}
}
return closestColour
}
func styleToEscapeSequence(table *ttyTable, style *chroma.Style) map[chroma.TokenType]string {
style = clearBackground(style)
out := map[chroma.TokenType]string{}
for _, ttype := range style.Types() {
entry := style.Get(ttype)
out[ttype] = entryToEscapeSequence(table, entry)
}
return out
}
// Clear the background colour.
func clearBackground(style *chroma.Style) *chroma.Style {
builder := style.Builder()
bg := builder.Get(chroma.Background)
bg.Background = 0
bg.NoInherit = true
builder.AddEntry(chroma.Background, bg)
style, _ = builder.Build()
return style
}
type indexedTTYFormatter struct {
table *ttyTable
}
func (c *indexedTTYFormatter) Format(w io.Writer, style *chroma.Style, it chroma.Iterator) (err error) {
theme := styleToEscapeSequence(c.table, style)
for token := it(); token != chroma.EOF; token = it() {
// TODO: Cache token lookups?
clr, ok := theme[token.Type]
if !ok {
clr, ok = theme[token.Type.SubCategory()]
if !ok {
clr = theme[token.Type.Category()]
// if !ok {
// clr = theme[chroma.InheritStyle]
// }
}
}
if clr != "" {
fmt.Fprint(w, clr)
}
fmt.Fprint(w, token.Value)
if clr != "" {
fmt.Fprintf(w, "\033[0m")
}
}
return nil
}
// TTY8 is an 8-colour terminal formatter.
//
// RGB values in the style are mapped to the closest of the 8 palette colours using the weighted-RGB metric in Colour.Distance.
var TTY8 = Register("terminal", &indexedTTYFormatter{ttyTables[8]})
// TTY256 is a 256-colour terminal formatter.
//
// RGB values in the style are mapped to the closest of the 256 palette colours using the weighted-RGB metric in Colour.Distance.
var TTY256 = Register("terminal256", &indexedTTYFormatter{ttyTables[256]})
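
Usage sketch for the indexed terminal formatters (standalone, not upstream code): style colours are snapped to the nearest palette entry via findClosest before the ANSI escape sequences are emitted, so terminals without true-colour support still get a reasonable approximation.

package main

import (
    "os"

    "github.com/alecthomas/chroma/formatters"
    "github.com/alecthomas/chroma/lexers"
    "github.com/alecthomas/chroma/styles"
)

func main() {
    it, err := lexers.Get("go").Tokenise(nil, "package main\n\nfunc main() {}\n")
    if err != nil {
        panic(err)
    }
    // "terminal" (8 colours) and "terminal256" are the names registered above.
    if err := formatters.Get("terminal256").Format(os.Stdout, styles.Fallback, it); err != nil {
        panic(err)
    }
}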

View file

@ -0,0 +1,13 @@
package formatters
import (
"testing"
"github.com/alecthomas/assert"
"github.com/alecthomas/chroma"
)
func TestClosestColour(t *testing.T) {
actual := findClosest(ttyTables[256], chroma.MustParseColour("#e06c75"))
assert.Equal(t, chroma.MustParseColour("#d75f87"), actual)
}

View file

@ -0,0 +1,42 @@
package formatters
import (
"fmt"
"io"
"github.com/alecthomas/chroma"
)
// TTY16m is a true-colour terminal formatter.
var TTY16m = Register("terminal16m", chroma.FormatterFunc(trueColourFormatter))
func trueColourFormatter(w io.Writer, style *chroma.Style, it chroma.Iterator) error {
style = clearBackground(style)
for token := it(); token != chroma.EOF; token = it() {
entry := style.Get(token.Type)
if !entry.IsZero() {
out := ""
if entry.Bold == chroma.Yes {
out += "\033[1m"
}
if entry.Underline == chroma.Yes {
out += "\033[4m"
}
if entry.Italic == chroma.Yes {
out += "\033[3m"
}
if entry.Colour.IsSet() {
out += fmt.Sprintf("\033[38;2;%d;%d;%dm", entry.Colour.Red(), entry.Colour.Green(), entry.Colour.Blue())
}
if entry.Background.IsSet() {
out += fmt.Sprintf("\033[48;2;%d;%d;%dm", entry.Background.Red(), entry.Background.Green(), entry.Background.Blue())
}
fmt.Fprint(w, out)
}
fmt.Fprint(w, token.Value)
if !entry.IsZero() {
fmt.Fprint(w, "\033[0m")
}
}
return nil
}
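For reference, a minimal sketch of the escape sequences this formatter emits, using only `MustParseColour` and the `Red`/`Green`/`Blue` accessors shown above (module path taken from `go.mod`):
```go
package main

import (
	"fmt"

	"github.com/alecthomas/chroma"
)

func main() {
	c := chroma.MustParseColour("#e06c75")
	// 38;2;R;G;B is the 24-bit foreground form emitted above; 48;2;R;G;B is the background form.
	fmt.Printf("\033[38;2;%d;%d;%dm", c.Red(), c.Green(), c.Blue())
	fmt.Print("sample text\033[0m\n")
}
```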

View file

@ -0,0 +1,18 @@
module github.com/alecthomas/chroma
go 1.13
require (
github.com/alecthomas/assert v0.0.0-20170929043011-405dbfeb8e38
github.com/alecthomas/colour v0.0.0-20160524082231-60882d9e2721 // indirect
github.com/alecthomas/kong v0.2.4
github.com/alecthomas/repr v0.0.0-20180818092828-117648cd9897 // indirect
github.com/danwakefield/fnmatch v0.0.0-20160403171240-cbb64ac3d964
github.com/dlclark/regexp2 v1.2.0
github.com/mattn/go-colorable v0.1.6
github.com/mattn/go-isatty v0.0.12
github.com/pkg/errors v0.9.1 // indirect
github.com/sergi/go-diff v1.0.0 // indirect
github.com/stretchr/testify v1.3.0 // indirect
golang.org/x/sys v0.0.0-20200413165638-669c56c373c4 // indirect
)

View file

@ -0,0 +1,36 @@
github.com/alecthomas/assert v0.0.0-20170929043011-405dbfeb8e38 h1:smF2tmSOzy2Mm+0dGI2AIUHY+w0BUc+4tn40djz7+6U=
github.com/alecthomas/assert v0.0.0-20170929043011-405dbfeb8e38/go.mod h1:r7bzyVFMNntcxPZXK3/+KdruV1H5KSlyVY0gc+NgInI=
github.com/alecthomas/colour v0.0.0-20160524082231-60882d9e2721 h1:JHZL0hZKJ1VENNfmXvHbgYlbUOvpzYzvy2aZU5gXVeo=
github.com/alecthomas/colour v0.0.0-20160524082231-60882d9e2721/go.mod h1:QO9JBoKquHd+jz9nshCh40fOfO+JzsoXy8qTHF68zU0=
github.com/alecthomas/kong v0.2.4 h1:Y0ZBCHAvHhTHw7FFJ2FzCAAG4pkbTgA45nc7BpMhDNk=
github.com/alecthomas/kong v0.2.4/go.mod h1:kQOmtJgV+Lb4aj+I2LEn40cbtawdWJ9Y8QLq+lElKxE=
github.com/alecthomas/repr v0.0.0-20180818092828-117648cd9897 h1:p9Sln00KOTlrYkxI1zYWl1QLnEqAqEARBEYa8FQnQcY=
github.com/alecthomas/repr v0.0.0-20180818092828-117648cd9897/go.mod h1:xTS7Pm1pD1mvyM075QCDSRqH6qRLXylzS24ZTpRiSzQ=
github.com/danwakefield/fnmatch v0.0.0-20160403171240-cbb64ac3d964 h1:y5HC9v93H5EPKqaS1UYVg1uYah5Xf51mBfIoWehClUQ=
github.com/danwakefield/fnmatch v0.0.0-20160403171240-cbb64ac3d964/go.mod h1:Xd9hchkHSWYkEqJwUGisez3G1QY8Ryz0sdWrLPMGjLk=
github.com/davecgh/go-spew v1.1.0/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
github.com/davecgh/go-spew v1.1.1 h1:vj9j/u1bqnvCEfJOwUhtlOARqs3+rkHYY13jYWTU97c=
github.com/davecgh/go-spew v1.1.1/go.mod h1:J7Y8YcW2NihsgmVo/mv3lAwl/skON4iLHjSsI+c5H38=
github.com/dlclark/regexp2 v1.2.0 h1:8sAhBGEM0dRWogWqWyQeIJnxjWO6oIjl8FKqREDsGfk=
github.com/dlclark/regexp2 v1.2.0/go.mod h1:2pZnwuY/m+8K6iRw6wQdMtk+rH5tNGR1i55kozfMjCc=
github.com/mattn/go-colorable v0.1.6 h1:6Su7aK7lXmJ/U79bYtBjLNaha4Fs1Rg9plHpcH+vvnE=
github.com/mattn/go-colorable v0.1.6/go.mod h1:u6P/XSegPjTcexA+o6vUJrdnUu04hMope9wVRipJSqc=
github.com/mattn/go-isatty v0.0.12 h1:wuysRhFDzyxgEmMf5xjvJ2M9dZoWAXNNr5LSBS7uHXY=
github.com/mattn/go-isatty v0.0.12/go.mod h1:cbi8OIDigv2wuxKPP5vlRcQ1OAZbq2CE4Kysco4FUpU=
github.com/pkg/errors v0.8.1 h1:iURUrRGxPUNPdy5/HRSm+Yj6okJ6UtLINN0Q9M4+h3I=
github.com/pkg/errors v0.8.1/go.mod h1:bwawxfHBFNV+L2hUp1rHADufV3IMtnDRdf1r5NINEl0=
github.com/pkg/errors v0.9.1 h1:FEBLx1zS214owpjy7qsBeixbURkuhQAwrK5UwLGTwt4=
github.com/pkg/errors v0.9.1/go.mod h1:bwawxfHBFNV+L2hUp1rHADufV3IMtnDRdf1r5NINEl0=
github.com/pmezard/go-difflib v1.0.0 h1:4DBwDE0NGyQoBHbLQYPwSUPoCMWR5BEzIk/f1lZbAQM=
github.com/pmezard/go-difflib v1.0.0/go.mod h1:iKH77koFhYxTK1pcRnkKkqfTogsbg7gZNVY4sRDYZ/4=
github.com/sergi/go-diff v1.0.0 h1:Kpca3qRNrduNnOQeazBd0ysaKrUJiIuISHxogkT9RPQ=
github.com/sergi/go-diff v1.0.0/go.mod h1:0CfEIISq7TuYL3j771MWULgwwjU+GofnZX9QAmXWZgo=
github.com/stretchr/objx v0.1.0/go.mod h1:HFkY916IF+rwdDfMAkV7OtwuqBVzrE8GR6GFx+wExME=
github.com/stretchr/testify v1.2.2 h1:bSDNvY7ZPG5RlJ8otE/7V6gMiyenm9RtJ7IUVIAoJ1w=
github.com/stretchr/testify v1.2.2/go.mod h1:a8OnRcib4nhh0OaRAV+Yts87kKdq0PP7pXfy6kDkUVs=
github.com/stretchr/testify v1.3.0 h1:TivCn/peBQ7UY8ooIcPgZFpTNSz0Q2U6UrFlUfqbe0Q=
github.com/stretchr/testify v1.3.0/go.mod h1:M5WIy9Dh21IEIfnGCwXGc5bZfKNJtfHm1UVUgZn+9EI=
golang.org/x/sys v0.0.0-20200116001909-b77594299b42/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
golang.org/x/sys v0.0.0-20200223170610-d5e6a3e2c0ae/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=
golang.org/x/sys v0.0.0-20200413165638-669c56c373c4 h1:opSr2sbRXk5X5/givKrrKj9HXxFpW2sdCiP8MJSKLQY=
golang.org/x/sys v0.0.0-20200413165638-669c56c373c4/go.mod h1:h1NjWce9XRLGQEsW7wpKNCjG9DtNlClVuFLEZdDNbEs=

View file

@ -0,0 +1,76 @@
package chroma
import "strings"
// An Iterator across tokens.
//
// EOF will be returned at the end of the Token stream.
//
// If an error occurs within an Iterator, it may propagate this in a panic. Formatters should recover.
type Iterator func() Token
// Tokens consumes all tokens from the iterator and returns them as a slice.
func (i Iterator) Tokens() []Token {
var out []Token
for t := i(); t != EOF; t = i() {
out = append(out, t)
}
return out
}
// Concaterator concatenates tokens from a series of iterators.
func Concaterator(iterators ...Iterator) Iterator {
return func() Token {
for len(iterators) > 0 {
t := iterators[0]()
if t != EOF {
return t
}
iterators = iterators[1:]
}
return EOF
}
}
// Literator converts a sequence of literal Tokens into an Iterator.
func Literator(tokens ...Token) Iterator {
return func() Token {
if len(tokens) == 0 {
return EOF
}
token := tokens[0]
tokens = tokens[1:]
return token
}
}
// SplitTokensIntoLines splits tokens containing newlines in two.
func SplitTokensIntoLines(tokens []Token) (out [][]Token) {
var line []Token // nolint: prealloc
for _, token := range tokens {
for strings.Contains(token.Value, "\n") {
parts := strings.SplitAfterN(token.Value, "\n", 2)
// Token becomes the tail.
token.Value = parts[1]
// Append the head to the line and flush the line.
clone := token.Clone()
clone.Value = parts[0]
line = append(line, clone)
out = append(out, line)
line = nil
}
line = append(line, token)
}
if len(line) > 0 {
out = append(out, line)
}
// Strip empty trailing token line.
if len(out) > 0 {
last := out[len(out)-1]
if len(last) == 1 && last[0].Value == "" {
out = out[:len(out)-1]
}
}
return
}
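A short sketch tying these helpers together: two literal iterators are concatenated, drained with `Tokens`, and split into per-line slices (import path taken from `go.mod`):
```go
package main

import (
	"fmt"

	"github.com/alecthomas/chroma"
)

func main() {
	head := chroma.Literator(chroma.Token{Type: chroma.Keyword, Value: "if true {\n"})
	tail := chroma.Literator(chroma.Token{Type: chroma.Text, Value: "}\n"})
	// Concaterator chains the two iterators; Tokens drains the result into a slice.
	tokens := chroma.Concaterator(head, tail).Tokens()
	// Each element holds the tokens of one source line, split at newline boundaries.
	for i, line := range chroma.SplitTokensIntoLines(tokens) {
		fmt.Println(i, line)
	}
}
```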

View file

@ -0,0 +1,125 @@
package chroma
import (
"fmt"
)
var (
defaultOptions = &TokeniseOptions{
State: "root",
EnsureLF: true,
}
)
// Config for a lexer.
type Config struct {
// Name of the lexer.
Name string
// Shortcuts for the lexer
Aliases []string
// File name globs
Filenames []string
// Secondary file name globs
AliasFilenames []string
// MIME types
MimeTypes []string
// Regex matching is case-insensitive.
CaseInsensitive bool
// Regex matches all characters.
DotAll bool
// Regex does not match across lines ($ matches EOL).
//
// Defaults to multiline.
NotMultiline bool
// Don't strip leading and trailing newlines from the input.
// DontStripNL bool
// Strip all leading and trailing whitespace from the input
// StripAll bool
// Make sure that the input ends with a newline. This
// is required for some lexers that consume input linewise.
EnsureNL bool
// If given and greater than 0, expand tabs in the input.
// TabSize int
// Priority of lexer.
//
// If this is 0 it will be treated as a default of 1.
Priority float32
}
// Token output to formatter.
type Token struct {
Type TokenType `json:"type"`
Value string `json:"value"`
}
func (t *Token) String() string { return t.Value }
func (t *Token) GoString() string { return fmt.Sprintf("&Token{%s, %q}", t.Type, t.Value) }
// Clone returns a clone of the Token.
func (t *Token) Clone() Token {
return *t
}
// EOF is returned by lexers at the end of input.
var EOF Token
// TokeniseOptions contains options for tokenisers.
type TokeniseOptions struct {
// State to start tokenisation in. Defaults to "root".
State string
// Nested tokenisation.
Nested bool
// If true, all EOLs are converted into LF
// by replacing CRLF and CR
EnsureLF bool
}
// A Lexer for tokenising source code.
type Lexer interface {
// Config describing the features of the Lexer.
Config() *Config
// Tokenise returns an Iterator over tokens in text.
Tokenise(options *TokeniseOptions, text string) (Iterator, error)
}
// Lexers is a slice of lexers sortable by name.
type Lexers []Lexer
func (l Lexers) Len() int { return len(l) }
func (l Lexers) Swap(i, j int) { l[i], l[j] = l[j], l[i] }
func (l Lexers) Less(i, j int) bool { return l[i].Config().Name < l[j].Config().Name }
// PrioritisedLexers is a slice of lexers sortable by priority.
type PrioritisedLexers []Lexer
func (l PrioritisedLexers) Len() int { return len(l) }
func (l PrioritisedLexers) Swap(i, j int) { l[i], l[j] = l[j], l[i] }
func (l PrioritisedLexers) Less(i, j int) bool {
ip := l[i].Config().Priority
if ip == 0 {
ip = 1
}
jp := l[j].Config().Priority
if jp == 0 {
jp = 1
}
return ip > jp
}
// Analyser determines how appropriate this lexer is for the given text.
type Analyser interface {
AnalyseText(text string) float32
}
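A sketch of the priority rule encoded in `PrioritisedLexers.Less` (a zero `Priority` is treated as 1, higher sorts first). The two throwaway lexers are built with `NewLexer` exactly as in the test that follows; the single catch-all rule exists only so they construct cleanly, and errors are ignored for brevity:
```go
package main

import (
	"fmt"
	"sort"

	"github.com/alecthomas/chroma"
)

func main() {
	// One catch-all rule so the lexers construct; the rules themselves are irrelevant here.
	rules := map[string][]chroma.Rule{"root": {{`.+`, chroma.Text, nil}}}
	plain, _ := chroma.NewLexer(&chroma.Config{Name: "Plain"}, rules)              // Priority 0 is treated as 1
	fancy, _ := chroma.NewLexer(&chroma.Config{Name: "Fancy", Priority: 2}, rules) // explicit higher priority
	lexers := chroma.PrioritisedLexers{plain, fancy}
	sort.Sort(lexers)
	fmt.Println(lexers[0].Config().Name) // "Fancy" sorts first
}
```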

View file

@ -0,0 +1,52 @@
package chroma
import (
"testing"
"github.com/alecthomas/assert"
)
func TestTokenTypeClassifiers(t *testing.T) {
assert.True(t, GenericDeleted.InCategory(Generic))
assert.True(t, LiteralStringBacktick.InSubCategory(String))
assert.Equal(t, LiteralStringBacktick.String(), "LiteralStringBacktick")
}
func TestSimpleLexer(t *testing.T) {
lexer, err := NewLexer(
&Config{
Name: "INI",
Aliases: []string{"ini", "cfg"},
Filenames: []string{"*.ini", "*.cfg"},
},
map[string][]Rule{
"root": {
{`\s+`, Whitespace, nil},
{`;.*?$`, Comment, nil},
{`\[.*?\]$`, Keyword, nil},
{`(.*?)(\s*)(=)(\s*)(.*?)$`, ByGroups(Name, Whitespace, Operator, Whitespace, String), nil},
},
},
)
assert.NoError(t, err)
actual, err := Tokenise(lexer, nil, `
; this is a comment
[section]
a = 10
`)
assert.NoError(t, err)
expected := []Token{
{Whitespace, "\n\t"},
{Comment, "; this is a comment"},
{Whitespace, "\n\t"},
{Keyword, "[section]"},
{Whitespace, "\n\t"},
{Name, "a"},
{Whitespace, " "},
{Operator, "="},
{Whitespace, " "},
{LiteralString, "10"},
{Whitespace, "\n"},
}
assert.Equal(t, expected, actual)
}

View file

@ -0,0 +1,37 @@
# Lexer tests
The tests in this directory feed a known input `testdata/<name>.actual` into the lexer for `<name>` and check
that its output matches `<name>.expected`.
## Running the tests
Run the tests as normal:
```sh
go test ./lexers
```
## Update existing tests
When you add a new test data file (`*.actual`), you need to regenerate the tests: that is how Chroma creates the corresponding `*.expected` file from the lexer's current output.
To regenerate all tests, type in your terminal:
```sh
RECORD=true go test ./lexers
```
This first sets the `RECORD` environment variable to `true`. Then it runs `go test` on the `./lexers` directory of the Chroma project.
(That environment variable tells Chroma it needs to output test data. After running `go test ./lexers` you can remove or reset that variable.)
### Windows users
Windows users will find that the `RECORD=true go test ./lexers` command fails in both the standard command prompt terminal and in PowerShell.
Instead we have to perform both steps separately:
- Set the `RECORD` environment variable to `true`.
+ In the regular command prompt window, the `set` command sets an environment variable for the current session: `set RECORD=true`. See [this page](https://superuser.com/questions/212150/how-to-set-env-variable-in-windows-cmd-line) for more.
+ In PowerShell, you can use the `$env:RECORD = 'true'` command for that. See [this article](https://mcpmag.com/articles/2019/03/28/environment-variables-in-powershell.aspx) for more.
+ You can also make a persistent environment variable by hand in the Windows computer settings. See [this article](https://www.computerhope.com/issues/ch000549.htm) for how.
- When the environment variable is set, run `go test ./lexers`.
Chroma will now regenerate the test files and print its results to the console window.
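For orientation only, here is a sketch of the record-versus-compare shape this workflow implies. It is *not* Chroma's actual test harness; the file name and output below are hypothetical.
```go
package lexers_test

import (
	"io/ioutil"
	"os"
	"testing"
)

// Sketch only: a golden-file test that either records or compares output,
// depending on the RECORD environment variable.
func TestGoldenFileSketch(t *testing.T) {
	const expectedPath = "testdata/example.expected" // hypothetical file
	actual := []byte("...tokenised output...")       // hypothetical lexer output
	if os.Getenv("RECORD") == "true" {
		// Record mode: overwrite the expected file with the current output.
		if err := ioutil.WriteFile(expectedPath, actual, 0600); err != nil {
			t.Fatal(err)
		}
		return
	}
	// Compare mode: the current output must match the recorded file.
	expected, err := ioutil.ReadFile(expectedPath)
	if err != nil {
		t.Fatal(err)
	}
	if string(expected) != string(actual) {
		t.Errorf("output differs from %s; re-run with RECORD=true to update", expectedPath)
	}
}
```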

View file

@ -0,0 +1,56 @@
package a
import (
. "github.com/alecthomas/chroma" // nolint
"github.com/alecthomas/chroma/lexers/internal"
)
// ABAP lexer.
var Abap = internal.Register(MustNewLexer(
&Config{
Name: "ABAP",
Aliases: []string{"abap"},
Filenames: []string{"*.abap", "*.ABAP"},
MimeTypes: []string{"text/x-abap"},
CaseInsensitive: true,
},
Rules{
"common": {
{`\s+`, Text, nil},
{`^\*.*$`, CommentSingle, nil},
{`\".*?\n`, CommentSingle, nil},
{`##\w+`, CommentSpecial, nil},
},
"variable-names": {
{`<\S+>`, NameVariable, nil},
{`\w[\w~]*(?:(\[\])|->\*)?`, NameVariable, nil},
},
"root": {
Include("common"),
{`CALL\s+(?:BADI|CUSTOMER-FUNCTION|FUNCTION)`, Keyword, nil},
{`(CALL\s+(?:DIALOG|SCREEN|SUBSCREEN|SELECTION-SCREEN|TRANSACTION|TRANSFORMATION))\b`, Keyword, nil},
{`(FORM|PERFORM)(\s+)(\w+)`, ByGroups(Keyword, Text, NameFunction), nil},
{`(PERFORM)(\s+)(\()(\w+)(\))`, ByGroups(Keyword, Text, Punctuation, NameVariable, Punctuation), nil},
{`(MODULE)(\s+)(\S+)(\s+)(INPUT|OUTPUT)`, ByGroups(Keyword, Text, NameFunction, Text, Keyword), nil},
{`(METHOD)(\s+)([\w~]+)`, ByGroups(Keyword, Text, NameFunction), nil},
{`(\s+)([\w\-]+)([=\-]>)([\w\-~]+)`, ByGroups(Text, NameVariable, Operator, NameFunction), nil},
{`(?<=(=|-)>)([\w\-~]+)(?=\()`, NameFunction, nil},
{`(TEXT)(-)(\d{3})`, ByGroups(Keyword, Punctuation, LiteralNumberInteger), nil},
{`(TEXT)(-)(\w{3})`, ByGroups(Keyword, Punctuation, NameVariable), nil},
{`(ADD-CORRESPONDING|AUTHORITY-CHECK|CLASS-DATA|CLASS-EVENTS|CLASS-METHODS|CLASS-POOL|DELETE-ADJACENT|DIVIDE-CORRESPONDING|EDITOR-CALL|ENHANCEMENT-POINT|ENHANCEMENT-SECTION|EXIT-COMMAND|FIELD-GROUPS|FIELD-SYMBOLS|FUNCTION-POOL|INTERFACE-POOL|INVERTED-DATE|LOAD-OF-PROGRAM|LOG-POINT|MESSAGE-ID|MOVE-CORRESPONDING|MULTIPLY-CORRESPONDING|NEW-LINE|NEW-PAGE|NEW-SECTION|NO-EXTENSION|OUTPUT-LENGTH|PRINT-CONTROL|SELECT-OPTIONS|START-OF-SELECTION|SUBTRACT-CORRESPONDING|SYNTAX-CHECK|SYSTEM-EXCEPTIONS|TYPE-POOL|TYPE-POOLS|NO-DISPLAY)\b`, Keyword, nil},
{`(?<![-\>])(CREATE\s+(PUBLIC|PRIVATE|DATA|OBJECT)|(PUBLIC|PRIVATE|PROTECTED)\s+SECTION|(TYPE|LIKE)\s+((LINE\s+OF|REF\s+TO|(SORTED|STANDARD|HASHED)\s+TABLE\s+OF))?|FROM\s+(DATABASE|MEMORY)|CALL\s+METHOD|(GROUP|ORDER) BY|HAVING|SEPARATED BY|GET\s+(BADI|BIT|CURSOR|DATASET|LOCALE|PARAMETER|PF-STATUS|(PROPERTY|REFERENCE)\s+OF|RUN\s+TIME|TIME\s+(STAMP)?)?|SET\s+(BIT|BLANK\s+LINES|COUNTRY|CURSOR|DATASET|EXTENDED\s+CHECK|HANDLER|HOLD\s+DATA|LANGUAGE|LEFT\s+SCROLL-BOUNDARY|LOCALE|MARGIN|PARAMETER|PF-STATUS|PROPERTY\s+OF|RUN\s+TIME\s+(ANALYZER|CLOCK\s+RESOLUTION)|SCREEN|TITLEBAR|UPADTE\s+TASK\s+LOCAL|USER-COMMAND)|CONVERT\s+((INVERTED-)?DATE|TIME|TIME\s+STAMP|TEXT)|(CLOSE|OPEN)\s+(DATASET|CURSOR)|(TO|FROM)\s+(DATA BUFFER|INTERNAL TABLE|MEMORY ID|DATABASE|SHARED\s+(MEMORY|BUFFER))|DESCRIBE\s+(DISTANCE\s+BETWEEN|FIELD|LIST|TABLE)|FREE\s(MEMORY|OBJECT)?|PROCESS\s+(BEFORE\s+OUTPUT|AFTER\s+INPUT|ON\s+(VALUE-REQUEST|HELP-REQUEST))|AT\s+(LINE-SELECTION|USER-COMMAND|END\s+OF|NEW)|AT\s+SELECTION-SCREEN(\s+(ON(\s+(BLOCK|(HELP|VALUE)-REQUEST\s+FOR|END\s+OF|RADIOBUTTON\s+GROUP))?|OUTPUT))?|SELECTION-SCREEN:?\s+((BEGIN|END)\s+OF\s+((TABBED\s+)?BLOCK|LINE|SCREEN)|COMMENT|FUNCTION\s+KEY|INCLUDE\s+BLOCKS|POSITION|PUSHBUTTON|SKIP|ULINE)|LEAVE\s+(LIST-PROCESSING|PROGRAM|SCREEN|TO LIST-PROCESSING|TO TRANSACTION)(ENDING|STARTING)\s+AT|FORMAT\s+(COLOR|INTENSIFIED|INVERSE|HOTSPOT|INPUT|FRAMES|RESET)|AS\s+(CHECKBOX|SUBSCREEN|WINDOW)|WITH\s+(((NON-)?UNIQUE)?\s+KEY|FRAME)|(BEGIN|END)\s+OF|DELETE(\s+ADJACENT\s+DUPLICATES\sFROM)?|COMPARING(\s+ALL\s+FIELDS)?|(INSERT|APPEND)(\s+INITIAL\s+LINE\s+(IN)?TO|\s+LINES\s+OF)?|IN\s+((BYTE|CHARACTER)\s+MODE|PROGRAM)|END-OF-(DEFINITION|PAGE|SELECTION)|WITH\s+FRAME(\s+TITLE)|(REPLACE|FIND)\s+((FIRST|ALL)\s+OCCURRENCES?\s+OF\s+)?(SUBSTRING|REGEX)?|MATCH\s+(LENGTH|COUNT|LINE|OFFSET)|(RESPECTING|IGNORING)\s+CASE|IN\s+UPDATE\s+TASK|(SOURCE|RESULT)\s+(XML)?|REFERENCE\s+INTO|AND\s+(MARK|RETURN)|CLIENT\s+SPECIFIED|CORRESPONDING\s+FIELDS\s+OF|IF\s+FOUND|FOR\s+EVENT|INHERITING\s+FROM|LEAVE\s+TO\s+SCREEN|LOOP\s+AT\s+(SCREEN)?|LOWER\s+CASE|MATCHCODE\s+OBJECT|MODIF\s+ID|MODIFY\s+SCREEN|NESTING\s+LEVEL|NO\s+INTERVALS|OF\s+STRUCTURE|RADIOBUTTON\s+GROUP|RANGE\s+OF|REF\s+TO|SUPPRESS DIALOG|TABLE\s+OF|UPPER\s+CASE|TRANSPORTING\s+NO\s+FIELDS|VALUE\s+CHECK|VISIBLE\s+LENGTH|HEADER\s+LINE|COMMON\s+PART)\b`, Keyword, nil},
{`(^|(?<=(\s|\.)))(ABBREVIATED|ABSTRACT|ADD|ALIASES|ALIGN|ALPHA|ASSERT|AS|ASSIGN(ING)?|AT(\s+FIRST)?|BACK|BLOCK|BREAK-POINT|CASE|CATCH|CHANGING|CHECK|CLASS|CLEAR|COLLECT|COLOR|COMMIT|CREATE|COMMUNICATION|COMPONENTS?|COMPUTE|CONCATENATE|CONDENSE|CONSTANTS|CONTEXTS|CONTINUE|CONTROLS|COUNTRY|CURRENCY|DATA|DATE|DECIMALS|DEFAULT|DEFINE|DEFINITION|DEFERRED|DEMAND|DETAIL|DIRECTORY|DIVIDE|DO|DUMMY|ELSE(IF)?|ENDAT|ENDCASE|ENDCATCH|ENDCLASS|ENDDO|ENDFORM|ENDFUNCTION|ENDIF|ENDINTERFACE|ENDLOOP|ENDMETHOD|ENDMODULE|ENDSELECT|ENDTRY|ENDWHILE|ENHANCEMENT|EVENTS|EXACT|EXCEPTIONS?|EXIT|EXPONENT|EXPORT|EXPORTING|EXTRACT|FETCH|FIELDS?|FOR|FORM|FORMAT|FREE|FROM|FUNCTION|HIDE|ID|IF|IMPORT|IMPLEMENTATION|IMPORTING|IN|INCLUDE|INCLUDING|INDEX|INFOTYPES|INITIALIZATION|INTERFACE|INTERFACES|INTO|LANGUAGE|LEAVE|LENGTH|LINES|LOAD|LOCAL|JOIN|KEY|NEXT|MAXIMUM|MESSAGE|METHOD[S]?|MINIMUM|MODULE|MODIFIER|MODIFY|MOVE|MULTIPLY|NODES|NUMBER|OBLIGATORY|OBJECT|OF|OFF|ON|OTHERS|OVERLAY|PACK|PAD|PARAMETERS|PERCENTAGE|POSITION|PROGRAM|PROVIDE|PUBLIC|PUT|PF\d\d|RAISE|RAISING|RANGES?|READ|RECEIVE|REDEFINITION|REFRESH|REJECT|REPORT|RESERVE|RESUME|RETRY|RETURN|RETURNING|RIGHT|ROLLBACK|REPLACE|SCROLL|SEARCH|SELECT|SHIFT|SIGN|SINGLE|SIZE|SKIP|SORT|SPLIT|STATICS|STOP|STYLE|SUBMATCHES|SUBMIT|SUBTRACT|SUM(?!\()|SUMMARY|SUMMING|SUPPLY|TABLE|TABLES|TIMESTAMP|TIMES?|TIMEZONE|TITLE|\??TO|TOP-OF-PAGE|TRANSFER|TRANSLATE|TRY|TYPES|ULINE|UNDER|UNPACK|UPDATE|USING|VALUE|VALUES|VIA|VARYING|VARY|WAIT|WHEN|WHERE|WIDTH|WHILE|WITH|WINDOW|WRITE|XSD|ZERO)\b`, Keyword, nil},
{`(abs|acos|asin|atan|boolc|boolx|bit_set|char_off|charlen|ceil|cmax|cmin|condense|contains|contains_any_of|contains_any_not_of|concat_lines_of|cos|cosh|count|count_any_of|count_any_not_of|dbmaxlen|distance|escape|exp|find|find_end|find_any_of|find_any_not_of|floor|frac|from_mixed|insert|lines|log|log10|match|matches|nmax|nmin|numofchar|repeat|replace|rescale|reverse|round|segment|shift_left|shift_right|sign|sin|sinh|sqrt|strlen|substring|substring_after|substring_from|substring_before|substring_to|tan|tanh|to_upper|to_lower|to_mixed|translate|trunc|xstrlen)(\()\b`, ByGroups(NameBuiltin, Punctuation), nil},
{`&[0-9]`, Name, nil},
{`[0-9]+`, LiteralNumberInteger, nil},
{`(?<=(\s|.))(AND|OR|EQ|NE|GT|LT|GE|LE|CO|CN|CA|NA|CS|NOT|NS|CP|NP|BYTE-CO|BYTE-CN|BYTE-CA|BYTE-NA|BYTE-CS|BYTE-NS|IS\s+(NOT\s+)?(INITIAL|ASSIGNED|REQUESTED|BOUND))\b`, OperatorWord, nil},
Include("variable-names"),
{`[?*<>=\-+&]`, Operator, nil},
{`'(''|[^'])*'`, LiteralStringSingle, nil},
{"`([^`])*`", LiteralStringSingle, nil},
{`([|}])([^{}|]*?)([|{])`, ByGroups(Punctuation, LiteralStringSingle, Punctuation), nil},
{`[/;:()\[\],.]`, Punctuation, nil},
{`(!)(\w+)`, ByGroups(Operator, Name), nil},
},
},
))

View file

@ -0,0 +1,38 @@
package a
import (
. "github.com/alecthomas/chroma" // nolint
"github.com/alecthomas/chroma/lexers/internal"
)
// Abnf lexer.
var Abnf = internal.Register(MustNewLexer(
&Config{
Name: "ABNF",
Aliases: []string{"abnf"},
Filenames: []string{"*.abnf"},
MimeTypes: []string{"text/x-abnf"},
},
Rules{
"root": {
{`;.*$`, CommentSingle, nil},
{`(%[si])?"[^"]*"`, Literal, nil},
{`%b[01]+\-[01]+\b`, Literal, nil},
{`%b[01]+(\.[01]+)*\b`, Literal, nil},
{`%d[0-9]+\-[0-9]+\b`, Literal, nil},
{`%d[0-9]+(\.[0-9]+)*\b`, Literal, nil},
{`%x[0-9a-fA-F]+\-[0-9a-fA-F]+\b`, Literal, nil},
{`%x[0-9a-fA-F]+(\.[0-9a-fA-F]+)*\b`, Literal, nil},
{`\b[0-9]+\*[0-9]+`, Operator, nil},
{`\b[0-9]+\*`, Operator, nil},
{`\b[0-9]+`, Operator, nil},
{`\*`, Operator, nil},
{Words(``, `\b`, `ALPHA`, `BIT`, `CHAR`, `CR`, `CRLF`, `CTL`, `DIGIT`, `DQUOTE`, `HEXDIG`, `HTAB`, `LF`, `LWSP`, `OCTET`, `SP`, `VCHAR`, `WSP`), Keyword, nil},
{`[a-zA-Z][a-zA-Z0-9-]+\b`, NameClass, nil},
{`(=/|=|/)`, Operator, nil},
{`[\[\]()]`, Punctuation, nil},
{`\s+`, Text, nil},
{`.`, Text, nil},
},
},
))
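Usage sketch for the lexer above. The `lexers/a` import path is an assumption based on the `package a` declaration; passing `nil` options falls back to the defaults defined in `lexer.go` (start state `root`, `EnsureLF`):
```go
package main

import (
	"fmt"

	"github.com/alecthomas/chroma/lexers/a" // assumed path for package a
)

func main() {
	it, err := a.Abnf.Tokenise(nil, "rule = ALPHA *(ALPHA / DIGIT)\r\n")
	if err != nil {
		panic(err)
	}
	// Print each token type alongside the text it matched.
	for _, tok := range it.Tokens() {
		fmt.Printf("%-25s %q\n", tok.Type, tok.Value)
	}
}
```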

View file

@ -0,0 +1,39 @@
package a
import (
. "github.com/alecthomas/chroma" // nolint
"github.com/alecthomas/chroma/lexers/internal"
)
// Actionscript lexer.
var Actionscript = internal.Register(MustNewLexer(
&Config{
Name: "ActionScript",
Aliases: []string{"as", "actionscript"},
Filenames: []string{"*.as"},
MimeTypes: []string{"application/x-actionscript", "text/x-actionscript", "text/actionscript"},
NotMultiline: true,
DotAll: true,
},
Rules{
"root": {
{`\s+`, Text, nil},
{`//.*?\n`, CommentSingle, nil},
{`/\*.*?\*/`, CommentMultiline, nil},
{`/(\\\\|\\/|[^/\n])*/[gim]*`, LiteralStringRegex, nil},
{`[~^*!%&<>|+=:;,/?\\-]+`, Operator, nil},
{`[{}\[\]();.]+`, Punctuation, nil},
{Words(``, `\b`, `case`, `default`, `for`, `each`, `in`, `while`, `do`, `break`, `return`, `continue`, `if`, `else`, `throw`, `try`, `catch`, `var`, `with`, `new`, `typeof`, `arguments`, `instanceof`, `this`, `switch`), Keyword, nil},
{Words(``, `\b`, `class`, `public`, `final`, `internal`, `native`, `override`, `private`, `protected`, `static`, `import`, `extends`, `implements`, `interface`, `intrinsic`, `return`, `super`, `dynamic`, `function`, `const`, `get`, `namespace`, `package`, `set`), KeywordDeclaration, nil},
{`(true|false|null|NaN|Infinity|-Infinity|undefined|Void)\b`, KeywordConstant, nil},
{Words(``, `\b`, `Accessibility`, `AccessibilityProperties`, `ActionScriptVersion`, `ActivityEvent`, `AntiAliasType`, `ApplicationDomain`, `AsBroadcaster`, `Array`, `AsyncErrorEvent`, `AVM1Movie`, `BevelFilter`, `Bitmap`, `BitmapData`, `BitmapDataChannel`, `BitmapFilter`, `BitmapFilterQuality`, `BitmapFilterType`, `BlendMode`, `BlurFilter`, `Boolean`, `ByteArray`, `Camera`, `Capabilities`, `CapsStyle`, `Class`, `Color`, `ColorMatrixFilter`, `ColorTransform`, `ContextMenu`, `ContextMenuBuiltInItems`, `ContextMenuEvent`, `ContextMenuItem`, `ConvultionFilter`, `CSMSettings`, `DataEvent`, `Date`, `DefinitionError`, `DeleteObjectSample`, `Dictionary`, `DisplacmentMapFilter`, `DisplayObject`, `DisplacmentMapFilterMode`, `DisplayObjectContainer`, `DropShadowFilter`, `Endian`, `EOFError`, `Error`, `ErrorEvent`, `EvalError`, `Event`, `EventDispatcher`, `EventPhase`, `ExternalInterface`, `FileFilter`, `FileReference`, `FileReferenceList`, `FocusDirection`, `FocusEvent`, `Font`, `FontStyle`, `FontType`, `FrameLabel`, `FullScreenEvent`, `Function`, `GlowFilter`, `GradientBevelFilter`, `GradientGlowFilter`, `GradientType`, `Graphics`, `GridFitType`, `HTTPStatusEvent`, `IBitmapDrawable`, `ID3Info`, `IDataInput`, `IDataOutput`, `IDynamicPropertyOutputIDynamicPropertyWriter`, `IEventDispatcher`, `IExternalizable`, `IllegalOperationError`, `IME`, `IMEConversionMode`, `IMEEvent`, `int`, `InteractiveObject`, `InterpolationMethod`, `InvalidSWFError`, `InvokeEvent`, `IOError`, `IOErrorEvent`, `JointStyle`, `Key`, `Keyboard`, `KeyboardEvent`, `KeyLocation`, `LineScaleMode`, `Loader`, `LoaderContext`, `LoaderInfo`, `LoadVars`, `LocalConnection`, `Locale`, `Math`, `Matrix`, `MemoryError`, `Microphone`, `MorphShape`, `Mouse`, `MouseEvent`, `MovieClip`, `MovieClipLoader`, `Namespace`, `NetConnection`, `NetStatusEvent`, `NetStream`, `NewObjectSample`, `Number`, `Object`, `ObjectEncoding`, `PixelSnapping`, `Point`, `PrintJob`, `PrintJobOptions`, `PrintJobOrientation`, `ProgressEvent`, `Proxy`, `QName`, `RangeError`, `Rectangle`, `ReferenceError`, `RegExp`, `Responder`, `Sample`, `Scene`, `ScriptTimeoutError`, `Security`, `SecurityDomain`, `SecurityError`, `SecurityErrorEvent`, `SecurityPanel`, `Selection`, `Shape`, `SharedObject`, `SharedObjectFlushStatus`, `SimpleButton`, `Socket`, `Sound`, `SoundChannel`, `SoundLoaderContext`, `SoundMixer`, `SoundTransform`, `SpreadMethod`, `Sprite`, `StackFrame`, `StackOverflowError`, `Stage`, `StageAlign`, `StageDisplayState`, `StageQuality`, `StageScaleMode`, `StaticText`, `StatusEvent`, `String`, `StyleSheet`, `SWFVersion`, `SyncEvent`, `SyntaxError`, `System`, `TextColorType`, `TextField`, `TextFieldAutoSize`, `TextFieldType`, `TextFormat`, `TextFormatAlign`, `TextLineMetrics`, `TextRenderer`, `TextSnapshot`, `Timer`, `TimerEvent`, `Transform`, `TypeError`, `uint`, `URIError`, `URLLoader`, `URLLoaderDataFormat`, `URLRequest`, `URLRequestHeader`, `URLRequestMethod`, `URLStream`, `URLVariabeles`, `VerifyError`, `Video`, `XML`, `XMLDocument`, `XMLList`, `XMLNode`, `XMLNodeType`, `XMLSocket`, `XMLUI`), NameBuiltin, nil},
{Words(``, `\b`, `decodeURI`, `decodeURIComponent`, `encodeURI`, `escape`, `eval`, `isFinite`, `isNaN`, `isXMLName`, `clearInterval`, `fscommand`, `getTimer`, `getURL`, `getVersion`, `parseFloat`, `parseInt`, `setInterval`, `trace`, `updateAfterEvent`, `unescape`), NameFunction, nil},
{`[$a-zA-Z_]\w*`, NameOther, nil},
{`[0-9][0-9]*\.[0-9]+([eE][0-9]+)?[fd]?`, LiteralNumberFloat, nil},
{`0x[0-9a-f]+`, LiteralNumberHex, nil},
{`[0-9]+`, LiteralNumberInteger, nil},
{`"(\\\\|\\"|[^"])*"`, LiteralStringDouble, nil},
{`'(\\\\|\\'|[^'])*'`, LiteralStringSingle, nil},
},
},
))

View file

@ -0,0 +1,56 @@
package a
import (
. "github.com/alecthomas/chroma" // nolint
"github.com/alecthomas/chroma/lexers/internal"
)
// Actionscript 3 lexer.
var Actionscript3 = internal.Register(MustNewLexer(
&Config{
Name: "ActionScript 3",
Aliases: []string{"as3", "actionscript3"},
Filenames: []string{"*.as"},
MimeTypes: []string{"application/x-actionscript3", "text/x-actionscript3", "text/actionscript3"},
DotAll: true,
},
Rules{
"root": {
{`\s+`, Text, nil},
{`(function\s+)([$a-zA-Z_]\w*)(\s*)(\()`, ByGroups(KeywordDeclaration, NameFunction, Text, Operator), Push("funcparams")},
{`(var|const)(\s+)([$a-zA-Z_]\w*)(\s*)(:)(\s*)([$a-zA-Z_]\w*(?:\.<\w+>)?)`, ByGroups(KeywordDeclaration, Text, Name, Text, Punctuation, Text, KeywordType), nil},
{`(import|package)(\s+)((?:[$a-zA-Z_]\w*|\.)+)(\s*)`, ByGroups(Keyword, Text, NameNamespace, Text), nil},
{`(new)(\s+)([$a-zA-Z_]\w*(?:\.<\w+>)?)(\s*)(\()`, ByGroups(Keyword, Text, KeywordType, Text, Operator), nil},
{`//.*?\n`, CommentSingle, nil},
{`/\*.*?\*/`, CommentMultiline, nil},
{`/(\\\\|\\/|[^\n])*/[gisx]*`, LiteralStringRegex, nil},
{`(\.)([$a-zA-Z_]\w*)`, ByGroups(Operator, NameAttribute), nil},
{`(case|default|for|each|in|while|do|break|return|continue|if|else|throw|try|catch|with|new|typeof|arguments|instanceof|this|switch|import|include|as|is)\b`, Keyword, nil},
{`(class|public|final|internal|native|override|private|protected|static|import|extends|implements|interface|intrinsic|return|super|dynamic|function|const|get|namespace|package|set)\b`, KeywordDeclaration, nil},
{`(true|false|null|NaN|Infinity|-Infinity|undefined|void)\b`, KeywordConstant, nil},
{`(decodeURI|decodeURIComponent|encodeURI|escape|eval|isFinite|isNaN|isXMLName|clearInterval|fscommand|getTimer|getURL|getVersion|isFinite|parseFloat|parseInt|setInterval|trace|updateAfterEvent|unescape)\b`, NameFunction, nil},
{`[$a-zA-Z_]\w*`, Name, nil},
{`[0-9][0-9]*\.[0-9]+([eE][0-9]+)?[fd]?`, LiteralNumberFloat, nil},
{`0x[0-9a-f]+`, LiteralNumberHex, nil},
{`[0-9]+`, LiteralNumberInteger, nil},
{`"(\\\\|\\"|[^"])*"`, LiteralStringDouble, nil},
{`'(\\\\|\\'|[^'])*'`, LiteralStringSingle, nil},
{`[~^*!%&<>|+=:;,/?\\{}\[\]().-]+`, Operator, nil},
},
"funcparams": {
{`\s+`, Text, nil},
{`(\s*)(\.\.\.)?([$a-zA-Z_]\w*)(\s*)(:)(\s*)([$a-zA-Z_]\w*(?:\.<\w+>)?|\*)(\s*)`, ByGroups(Text, Punctuation, Name, Text, Operator, Text, KeywordType, Text), Push("defval")},
{`\)`, Operator, Push("type")},
},
"type": {
{`(\s*)(:)(\s*)([$a-zA-Z_]\w*(?:\.<\w+>)?|\*)`, ByGroups(Text, Operator, Text, KeywordType), Pop(2)},
{`\s+`, Text, Pop(2)},
Default(Pop(2)),
},
"defval": {
{`(=)(\s*)([^(),]+)(\s*)(,?)`, ByGroups(Operator, Text, UsingSelf("root"), Text, Operator), Pop(1)},
{`,`, Operator, Pop(1)},
Default(Pop(1)),
},
},
))

View file

@ -0,0 +1,114 @@
package a
import (
. "github.com/alecthomas/chroma" // nolint
"github.com/alecthomas/chroma/lexers/internal"
)
// Ada lexer.
var Ada = internal.Register(MustNewLexer(
&Config{
Name: "Ada",
Aliases: []string{"ada", "ada95", "ada2005"},
Filenames: []string{"*.adb", "*.ads", "*.ada"},
MimeTypes: []string{"text/x-ada"},
CaseInsensitive: true,
},
Rules{
"root": {
{`[^\S\n]+`, Text, nil},
{`--.*?\n`, CommentSingle, nil},
{`[^\S\n]+`, Text, nil},
{`function|procedure|entry`, KeywordDeclaration, Push("subprogram")},
{`(subtype|type)(\s+)(\w+)`, ByGroups(KeywordDeclaration, Text, KeywordType), Push("type_def")},
{`task|protected`, KeywordDeclaration, nil},
{`(subtype)(\s+)`, ByGroups(KeywordDeclaration, Text), nil},
{`(end)(\s+)`, ByGroups(KeywordReserved, Text), Push("end")},
{`(pragma)(\s+)(\w+)`, ByGroups(KeywordReserved, Text, CommentPreproc), nil},
{`(true|false|null)\b`, KeywordConstant, nil},
{Words(``, `\b`, `Address`, `Byte`, `Boolean`, `Character`, `Controlled`, `Count`, `Cursor`, `Duration`, `File_Mode`, `File_Type`, `Float`, `Generator`, `Integer`, `Long_Float`, `Long_Integer`, `Long_Long_Float`, `Long_Long_Integer`, `Natural`, `Positive`, `Reference_Type`, `Short_Float`, `Short_Integer`, `Short_Short_Float`, `Short_Short_Integer`, `String`, `Wide_Character`, `Wide_String`), KeywordType, nil},
{`(and(\s+then)?|in|mod|not|or(\s+else)|rem)\b`, OperatorWord, nil},
{`generic|private`, KeywordDeclaration, nil},
{`package`, KeywordDeclaration, Push("package")},
{`array\b`, KeywordReserved, Push("array_def")},
{`(with|use)(\s+)`, ByGroups(KeywordNamespace, Text), Push("import")},
{`(\w+)(\s*)(:)(\s*)(constant)`, ByGroups(NameConstant, Text, Punctuation, Text, KeywordReserved), nil},
{`<<\w+>>`, NameLabel, nil},
{`(\w+)(\s*)(:)(\s*)(declare|begin|loop|for|while)`, ByGroups(NameLabel, Text, Punctuation, Text, KeywordReserved), nil},
{Words(`\b`, `\b`, `abort`, `abs`, `abstract`, `accept`, `access`, `aliased`, `all`, `array`, `at`, `begin`, `body`, `case`, `constant`, `declare`, `delay`, `delta`, `digits`, `do`, `else`, `elsif`, `end`, `entry`, `exception`, `exit`, `interface`, `for`, `goto`, `if`, `is`, `limited`, `loop`, `new`, `null`, `of`, `or`, `others`, `out`, `overriding`, `pragma`, `protected`, `raise`, `range`, `record`, `renames`, `requeue`, `return`, `reverse`, `select`, `separate`, `subtype`, `synchronized`, `task`, `tagged`, `terminate`, `then`, `type`, `until`, `when`, `while`, `xor`), KeywordReserved, nil},
{`"[^"]*"`, LiteralString, nil},
Include("attribute"),
Include("numbers"),
{`'[^']'`, LiteralStringChar, nil},
{`(\w+)(\s*|[(,])`, ByGroups(Name, UsingSelf("root")), nil},
{`(<>|=>|:=|[()|:;,.'])`, Punctuation, nil},
{`[*<>+=/&-]`, Operator, nil},
{`\n+`, Text, nil},
},
"numbers": {
{`[0-9_]+#[0-9a-f]+#`, LiteralNumberHex, nil},
{`[0-9_]+\.[0-9_]*`, LiteralNumberFloat, nil},
{`[0-9_]+`, LiteralNumberInteger, nil},
},
"attribute": {
{`(')(\w+)`, ByGroups(Punctuation, NameAttribute), nil},
},
"subprogram": {
{`\(`, Punctuation, Push("#pop", "formal_part")},
{`;`, Punctuation, Pop(1)},
{`is\b`, KeywordReserved, Pop(1)},
{`"[^"]+"|\w+`, NameFunction, nil},
Include("root"),
},
"end": {
{`(if|case|record|loop|select)`, KeywordReserved, nil},
{`"[^"]+"|[\w.]+`, NameFunction, nil},
{`\s+`, Text, nil},
{`;`, Punctuation, Pop(1)},
},
"type_def": {
{`;`, Punctuation, Pop(1)},
{`\(`, Punctuation, Push("formal_part")},
{`with|and|use`, KeywordReserved, nil},
{`array\b`, KeywordReserved, Push("#pop", "array_def")},
{`record\b`, KeywordReserved, Push("record_def")},
{`(null record)(;)`, ByGroups(KeywordReserved, Punctuation), Pop(1)},
Include("root"),
},
"array_def": {
{`;`, Punctuation, Pop(1)},
{`(\w+)(\s+)(range)`, ByGroups(KeywordType, Text, KeywordReserved), nil},
Include("root"),
},
"record_def": {
{`end record`, KeywordReserved, Pop(1)},
Include("root"),
},
"import": {
{`[\w.]+`, NameNamespace, Pop(1)},
Default(Pop(1)),
},
"formal_part": {
{`\)`, Punctuation, Pop(1)},
{`\w+`, NameVariable, nil},
{`,|:[^=]`, Punctuation, nil},
{`(in|not|null|out|access)\b`, KeywordReserved, nil},
Include("root"),
},
"package": {
{`body`, KeywordDeclaration, nil},
{`is\s+new|renames`, KeywordReserved, nil},
{`is`, KeywordReserved, Pop(1)},
{`;`, Punctuation, Pop(1)},
{`\(`, Punctuation, Push("package_instantiation")},
{`([\w.]+)`, NameClass, nil},
Include("root"),
},
"package_instantiation": {
{`("[^"]+"|\w+)(\s+)(=>)`, ByGroups(NameVariable, Text, Punctuation), nil},
{`[\w.\'"]`, Text, nil},
{`\)`, Punctuation, Pop(1)},
Include("root"),
},
},
))

View file

@ -0,0 +1,42 @@
package a
import (
. "github.com/alecthomas/chroma" // nolint
"github.com/alecthomas/chroma/lexers/internal"
)
// Angular2 lexer.
var Angular2 = internal.Register(MustNewLexer(
&Config{
Name: "Angular2",
Aliases: []string{"ng2"},
Filenames: []string{},
MimeTypes: []string{},
},
Rules{
"root": {
{`[^{([*#]+`, Other, nil},
{`(\{\{)(\s*)`, ByGroups(CommentPreproc, Text), Push("ngExpression")},
{`([([]+)([\w:.-]+)([\])]+)(\s*)(=)(\s*)`, ByGroups(Punctuation, NameAttribute, Punctuation, Text, Operator, Text), Push("attr")},
{`([([]+)([\w:.-]+)([\])]+)(\s*)`, ByGroups(Punctuation, NameAttribute, Punctuation, Text), nil},
{`([*#])([\w:.-]+)(\s*)(=)(\s*)`, ByGroups(Punctuation, NameAttribute, Punctuation, Operator), Push("attr")},
{`([*#])([\w:.-]+)(\s*)`, ByGroups(Punctuation, NameAttribute, Punctuation), nil},
},
"ngExpression": {
{`\s+(\|\s+)?`, Text, nil},
{`\}\}`, CommentPreproc, Pop(1)},
{`:?(true|false)`, LiteralStringBoolean, nil},
{`:?"(\\\\|\\"|[^"])*"`, LiteralStringDouble, nil},
{`:?'(\\\\|\\'|[^'])*'`, LiteralStringSingle, nil},
{`[0-9](\.[0-9]*)?(eE[+-][0-9])?[flFLdD]?|0[xX][0-9a-fA-F]+[Ll]?`, LiteralNumber, nil},
{`[a-zA-Z][\w-]*(\(.*\))?`, NameVariable, nil},
{`\.[\w-]+(\(.*\))?`, NameVariable, nil},
{`(\?)(\s*)([^}\s]+)(\s*)(:)(\s*)([^}\s]+)(\s*)`, ByGroups(Operator, Text, LiteralString, Text, Operator, Text, LiteralString, Text), nil},
},
"attr": {
{`".*?"`, LiteralString, Pop(1)},
{`'.*?'`, LiteralString, Pop(1)},
{`[^\s>]+`, LiteralString, Pop(1)},
},
},
))

View file

@ -0,0 +1,101 @@
package a
import (
. "github.com/alecthomas/chroma" // nolint
"github.com/alecthomas/chroma/lexers/internal"
)
// ANTLR lexer.
var ANTLR = internal.Register(MustNewLexer(
&Config{
Name: "ANTLR",
Aliases: []string{"antlr"},
Filenames: []string{},
MimeTypes: []string{},
},
Rules{
"whitespace": {
{`\s+`, TextWhitespace, nil},
},
"comments": {
{`//.*$`, Comment, nil},
{`/\*(.|\n)*?\*/`, Comment, nil},
},
"root": {
Include("whitespace"),
Include("comments"),
{`(lexer|parser|tree)?(\s*)(grammar\b)(\s*)([A-Za-z]\w*)(;)`, ByGroups(Keyword, TextWhitespace, Keyword, TextWhitespace, NameClass, Punctuation), nil},
{`options\b`, Keyword, Push("options")},
{`tokens\b`, Keyword, Push("tokens")},
{`(scope)(\s*)([A-Za-z]\w*)(\s*)(\{)`, ByGroups(Keyword, TextWhitespace, NameVariable, TextWhitespace, Punctuation), Push("action")},
{`(catch|finally)\b`, Keyword, Push("exception")},
{`(@[A-Za-z]\w*)(\s*)(::)?(\s*)([A-Za-z]\w*)(\s*)(\{)`, ByGroups(NameLabel, TextWhitespace, Punctuation, TextWhitespace, NameLabel, TextWhitespace, Punctuation), Push("action")},
{`((?:protected|private|public|fragment)\b)?(\s*)([A-Za-z]\w*)(!)?`, ByGroups(Keyword, TextWhitespace, NameLabel, Punctuation), Push("rule-alts", "rule-prelims")},
},
"exception": {
{`\n`, TextWhitespace, Pop(1)},
{`\s`, TextWhitespace, nil},
Include("comments"),
{`\[`, Punctuation, Push("nested-arg-action")},
{`\{`, Punctuation, Push("action")},
},
"rule-prelims": {
Include("whitespace"),
Include("comments"),
{`returns\b`, Keyword, nil},
{`\[`, Punctuation, Push("nested-arg-action")},
{`\{`, Punctuation, Push("action")},
{`(throws)(\s+)([A-Za-z]\w*)`, ByGroups(Keyword, TextWhitespace, NameLabel), nil},
{`(,)(\s*)([A-Za-z]\w*)`, ByGroups(Punctuation, TextWhitespace, NameLabel), nil},
{`options\b`, Keyword, Push("options")},
{`(scope)(\s+)(\{)`, ByGroups(Keyword, TextWhitespace, Punctuation), Push("action")},
{`(scope)(\s+)([A-Za-z]\w*)(\s*)(;)`, ByGroups(Keyword, TextWhitespace, NameLabel, TextWhitespace, Punctuation), nil},
{`(@[A-Za-z]\w*)(\s*)(\{)`, ByGroups(NameLabel, TextWhitespace, Punctuation), Push("action")},
{`:`, Punctuation, Pop(1)},
},
"rule-alts": {
Include("whitespace"),
Include("comments"),
{`options\b`, Keyword, Push("options")},
{`:`, Punctuation, nil},
{`'(\\\\|\\'|[^'])*'`, LiteralString, nil},
{`"(\\\\|\\"|[^"])*"`, LiteralString, nil},
{`<<([^>]|>[^>])>>`, LiteralString, nil},
{`\$?[A-Z_]\w*`, NameConstant, nil},
{`\$?[a-z_]\w*`, NameVariable, nil},
{`(\+|\||->|=>|=|\(|\)|\.\.|\.|\?|\*|\^|!|\#|~)`, Operator, nil},
{`,`, Punctuation, nil},
{`\[`, Punctuation, Push("nested-arg-action")},
{`\{`, Punctuation, Push("action")},
{`;`, Punctuation, Pop(1)},
},
"tokens": {
Include("whitespace"),
Include("comments"),
{`\{`, Punctuation, nil},
{`([A-Z]\w*)(\s*)(=)?(\s*)(\'(?:\\\\|\\\'|[^\']*)\')?(\s*)(;)`, ByGroups(NameLabel, TextWhitespace, Punctuation, TextWhitespace, LiteralString, TextWhitespace, Punctuation), nil},
{`\}`, Punctuation, Pop(1)},
},
"options": {
Include("whitespace"),
Include("comments"),
{`\{`, Punctuation, nil},
{`([A-Za-z]\w*)(\s*)(=)(\s*)([A-Za-z]\w*|\'(?:\\\\|\\\'|[^\']*)\'|[0-9]+|\*)(\s*)(;)`, ByGroups(NameVariable, TextWhitespace, Punctuation, TextWhitespace, Text, TextWhitespace, Punctuation), nil},
{`\}`, Punctuation, Pop(1)},
},
"action": {
{`([^${}\'"/\\]+|"(\\\\|\\"|[^"])*"|'(\\\\|\\'|[^'])*'|//.*$\n?|/\*(.|\n)*?\*/|/(?!\*)(\\\\|\\/|[^/])*/|\\(?!%)|/)+`, Other, nil},
{`(\\)(%)`, ByGroups(Punctuation, Other), nil},
{`(\$[a-zA-Z]+)(\.?)(text|value)?`, ByGroups(NameVariable, Punctuation, NameProperty), nil},
{`\{`, Punctuation, Push()},
{`\}`, Punctuation, Pop(1)},
},
"nested-arg-action": {
{`([^$\[\]\'"/]+|"(\\\\|\\"|[^"])*"|'(\\\\|\\'|[^'])*'|//.*$\n?|/\*(.|\n)*?\*/|/(?!\*)(\\\\|\\/|[^/])*/|/)+`, Other, nil},
{`\[`, Punctuation, Push()},
{`\]`, Punctuation, Pop(1)},
{`(\$[a-zA-Z]+)(\.?)(text|value)?`, ByGroups(NameVariable, Punctuation, NameProperty), nil},
{`(\\\\|\\\]|\\\[|[^\[\]])+`, Other, nil},
},
},
))

View file

@ -0,0 +1,38 @@
package a
import (
. "github.com/alecthomas/chroma" // nolint
"github.com/alecthomas/chroma/lexers/internal"
)
// Apacheconf lexer.
var Apacheconf = internal.Register(MustNewLexer(
&Config{
Name: "ApacheConf",
Aliases: []string{"apacheconf", "aconf", "apache"},
Filenames: []string{".htaccess", "apache.conf", "apache2.conf"},
MimeTypes: []string{"text/x-apacheconf"},
CaseInsensitive: true,
},
Rules{
"root": {
{`\s+`, Text, nil},
{`(#.*?)$`, Comment, nil},
{`(<[^\s>]+)(?:(\s+)(.*?))?(>)`, ByGroups(NameTag, Text, LiteralString, NameTag), nil},
{`([a-z]\w*)(\s+)`, ByGroups(NameBuiltin, Text), Push("value")},
{`\.+`, Text, nil},
},
"value": {
{`\\\n`, Text, nil},
{`$`, Text, Pop(1)},
{`\\`, Text, nil},
{`[^\S\n]+`, Text, nil},
{`\d+\.\d+\.\d+\.\d+(?:/\d+)?`, LiteralNumber, nil},
{`\d+`, LiteralNumber, nil},
{`/([a-z0-9][\w./-]+)`, LiteralStringOther, nil},
{`(on|off|none|any|all|double|email|dns|min|minimal|os|productonly|full|emerg|alert|crit|error|warn|notice|info|debug|registry|script|inetd|standalone|user|group)\b`, Keyword, nil},
{`"([^"\\]*(?:\\.[^"\\]*)*)"`, LiteralStringDouble, nil},
{`[^\s"\\]+`, Text, nil},
},
},
))

View file

@ -0,0 +1,36 @@
package a
import (
. "github.com/alecthomas/chroma" // nolint
"github.com/alecthomas/chroma/lexers/internal"
)
// Apl lexer.
var Apl = internal.Register(MustNewLexer(
&Config{
Name: "APL",
Aliases: []string{"apl"},
Filenames: []string{"*.apl"},
MimeTypes: []string{},
},
Rules{
"root": {
{`\s+`, Text, nil},
{`[⍝#].*$`, CommentSingle, nil},
{`\'((\'\')|[^\'])*\'`, LiteralStringSingle, nil},
{`"(("")|[^"])*"`, LiteralStringDouble, nil},
{`[⋄◇()]`, Punctuation, nil},
{`[\[\];]`, LiteralStringRegex, nil},
{`⎕[A-Za-zΔ∆⍙][A-Za-zΔ∆⍙_¯0-9]*`, NameFunction, nil},
{`[A-Za-zΔ∆⍙][A-Za-zΔ∆⍙_¯0-9]*`, NameVariable, nil},
{`¯?(0[Xx][0-9A-Fa-f]+|[0-9]*\.?[0-9]+([Ee][+¯]?[0-9]+)?|¯|∞)([Jj]¯?(0[Xx][0-9A-Fa-f]+|[0-9]*\.?[0-9]+([Ee][+¯]?[0-9]+)?|¯|∞))?`, LiteralNumber, nil},
{`[\.\\/⌿⍀¨⍣⍨⍠⍤∘]`, NameAttribute, nil},
{`[+\-×÷⌈⌊∣|?*⍟○!⌹<≤=>≥≠≡≢∊⍷∪∩~∨∧⍱⍲⍴,⍪⌽⊖⍉↑↓⊂⊃⌷⍋⍒⊤⊥⍕⍎⊣⊢⍁⍂≈⌸⍯↗]`, Operator, nil},
{``, NameConstant, nil},
{`[⎕⍞]`, NameVariableGlobal, nil},
{`[←→]`, KeywordDeclaration, nil},
{`[⍺⍵⍶⍹∇:]`, NameBuiltinPseudo, nil},
{`[{}]`, KeywordType, nil},
},
},
))

View file

@ -0,0 +1,55 @@
package a
import (
. "github.com/alecthomas/chroma" // nolint
"github.com/alecthomas/chroma/lexers/internal"
)
// Applescript lexer.
var Applescript = internal.Register(MustNewLexer(
&Config{
Name: "AppleScript",
Aliases: []string{"applescript"},
Filenames: []string{"*.applescript"},
MimeTypes: []string{},
DotAll: true,
},
Rules{
"root": {
{`\s+`, Text, nil},
{`¬\n`, LiteralStringEscape, nil},
{`'s\s+`, Text, nil},
{`(--|#).*?$`, Comment, nil},
{`\(\*`, CommentMultiline, Push("comment")},
{`[(){}!,.:]`, Punctuation, nil},
{`(«)([^»]+)(»)`, ByGroups(Text, NameBuiltin, Text), nil},
{`\b((?:considering|ignoring)\s*)(application responses|case|diacriticals|hyphens|numeric strings|punctuation|white space)`, ByGroups(Keyword, NameBuiltin), nil},
{`(-|\*|\+|&|≠|>=?|<=?|=|≥|≤|/|÷|\^)`, Operator, nil},
{`\b(and|or|is equal|equals|(is )?equal to|is not|isn't|isn't equal( to)?|is not equal( to)?|doesn't equal|does not equal|(is )?greater than|comes after|is not less than or equal( to)?|isn't less than or equal( to)?|(is )?less than|comes before|is not greater than or equal( to)?|isn't greater than or equal( to)?|(is )?greater than or equal( to)?|is not less than|isn't less than|does not come before|doesn't come before|(is )?less than or equal( to)?|is not greater than|isn't greater than|does not come after|doesn't come after|starts? with|begins? with|ends? with|contains?|does not contain|doesn't contain|is in|is contained by|is not in|is not contained by|isn't contained by|div|mod|not|(a )?(ref( to)?|reference to)|is|does)\b`, OperatorWord, nil},
{`^(\s*(?:on|end)\s+)(zoomed|write to file|will zoom|will show|will select tab view item|will resize( sub views)?|will resign active|will quit|will pop up|will open|will move|will miniaturize|will hide|will finish launching|will display outline cell|will display item cell|will display cell|will display browser cell|will dismiss|will close|will become active|was miniaturized|was hidden|update toolbar item|update parameters|update menu item|shown|should zoom|should selection change|should select tab view item|should select row|should select item|should select column|should quit( after last window closed)?|should open( untitled)?|should expand item|should end editing|should collapse item|should close|should begin editing|selection changing|selection changed|selected tab view item|scroll wheel|rows changed|right mouse up|right mouse dragged|right mouse down|resized( sub views)?|resigned main|resigned key|resigned active|read from file|prepare table drop|prepare table drag|prepare outline drop|prepare outline drag|prepare drop|plugin loaded|parameters updated|panel ended|opened|open untitled|number of rows|number of items|number of browser rows|moved|mouse up|mouse moved|mouse exited|mouse entered|mouse dragged|mouse down|miniaturized|load data representation|launched|keyboard up|keyboard down|items changed|item value changed|item value|item expandable|idle|exposed|end editing|drop|drag( (entered|exited|updated))?|double clicked|document nib name|dialog ended|deminiaturized|data representation|conclude drop|column resized|column moved|column clicked|closed|clicked toolbar item|clicked|choose menu item|child of item|changed|change item value|change cell value|cell value changed|cell value|bounds changed|begin editing|became main|became key|awake from nib|alert ended|activated|action|accept table drop|accept outline drop)`, ByGroups(Keyword, NameFunction), nil},
{`^(\s*)(in|on|script|to)(\s+)`, ByGroups(Text, Keyword, Text), nil},
{`\b(as )(alias |application |boolean |class |constant |date |file |integer |list |number |POSIX file |real |record |reference |RGB color |script |text |unit types|(?:Unicode )?text|string)\b`, ByGroups(Keyword, NameClass), nil},
{`\b(AppleScript|current application|false|linefeed|missing value|pi|quote|result|return|space|tab|text item delimiters|true|version)\b`, NameConstant, nil},
{`\b(ASCII (character|number)|activate|beep|choose URL|choose application|choose color|choose file( name)?|choose folder|choose from list|choose remote application|clipboard info|close( access)?|copy|count|current date|delay|delete|display (alert|dialog)|do shell script|duplicate|exists|get eof|get volume settings|info for|launch|list (disks|folder)|load script|log|make|mount volume|new|offset|open( (for access|location))?|path to|print|quit|random number|read|round|run( script)?|say|scripting components|set (eof|the clipboard to|volume)|store script|summarize|system attribute|system info|the clipboard|time to GMT|write|quoted form)\b`, NameBuiltin, nil},
{`\b(considering|else|error|exit|from|if|ignoring|in|repeat|tell|then|times|to|try|until|using terms from|while|with|with timeout( of)?|with transaction|by|continue|end|its?|me|my|return|of|as)\b`, Keyword, nil},
{`\b(global|local|prop(erty)?|set|get)\b`, Keyword, nil},
{`\b(but|put|returning|the)\b`, NameBuiltin, nil},
{`\b(attachment|attribute run|character|day|month|paragraph|word|year)s?\b`, NameBuiltin, nil},
{`\b(about|above|against|apart from|around|aside from|at|below|beneath|beside|between|for|given|instead of|on|onto|out of|over|since)\b`, NameBuiltin, nil},
{`\b(accepts arrow key|action method|active|alignment|allowed identifiers|allows branch selection|allows column reordering|allows column resizing|allows column selection|allows customization|allows editing text attributes|allows empty selection|allows mixed state|allows multiple selection|allows reordering|allows undo|alpha( value)?|alternate image|alternate increment value|alternate title|animation delay|associated file name|associated object|auto completes|auto display|auto enables items|auto repeat|auto resizes( outline column)?|auto save expanded items|auto save name|auto save table columns|auto saves configuration|auto scroll|auto sizes all columns to fit|auto sizes cells|background color|bezel state|bezel style|bezeled|border rect|border type|bordered|bounds( rotation)?|box type|button returned|button type|can choose directories|can choose files|can draw|can hide|cell( (background color|size|type))?|characters|class|click count|clicked( data)? column|clicked data item|clicked( data)? row|closeable|collating|color( (mode|panel))|command key down|configuration|content(s| (size|view( margins)?))?|context|continuous|control key down|control size|control tint|control view|controller visible|coordinate system|copies( on scroll)?|corner view|current cell|current column|current( field)? editor|current( menu)? item|current row|current tab view item|data source|default identifiers|delta (x|y|z)|destination window|directory|display mode|displayed cell|document( (edited|rect|view))?|double value|dragged column|dragged distance|dragged items|draws( cell)? background|draws grid|dynamically scrolls|echos bullets|edge|editable|edited( data)? column|edited data item|edited( data)? row|enabled|enclosing scroll view|ending page|error handling|event number|event type|excluded from windows menu|executable path|expanded|fax number|field editor|file kind|file name|file type|first responder|first visible column|flipped|floating|font( panel)?|formatter|frameworks path|frontmost|gave up|grid color|has data items|has horizontal ruler|has horizontal scroller|has parent data item|has resize indicator|has shadow|has sub menu|has vertical ruler|has vertical scroller|header cell|header view|hidden|hides when deactivated|highlights by|horizontal line scroll|horizontal page scroll|horizontal ruler view|horizontally resizable|icon image|id|identifier|ignores multiple clicks|image( (alignment|dims when disabled|frame style|scaling))?|imports graphics|increment value|indentation per level|indeterminate|index|integer value|intercell spacing|item height|key( (code|equivalent( modifier)?|window))?|knob thickness|label|last( visible)? column|leading offset|leaf|level|line scroll|loaded|localized sort|location|loop mode|main( (bunde|menu|window))?|marker follows cell|matrix mode|maximum( content)? size|maximum visible columns|menu( form representation)?|miniaturizable|miniaturized|minimized image|minimized title|minimum column width|minimum( content)? 
size|modal|modified|mouse down state|movie( (controller|file|rect))?|muted|name|needs display|next state|next text|number of tick marks|only tick mark values|opaque|open panel|option key down|outline table column|page scroll|pages across|pages down|palette label|pane splitter|parent data item|parent window|pasteboard|path( (names|separator))?|playing|plays every frame|plays selection only|position|preferred edge|preferred type|pressure|previous text|prompt|properties|prototype cell|pulls down|rate|released when closed|repeated|requested print time|required file type|resizable|resized column|resource path|returns records|reuses columns|rich text|roll over|row height|rulers visible|save panel|scripts path|scrollable|selectable( identifiers)?|selected cell|selected( data)? columns?|selected data items?|selected( data)? rows?|selected item identifier|selection by rect|send action on arrow key|sends action when done editing|separates columns|separator item|sequence number|services menu|shared frameworks path|shared support path|sheet|shift key down|shows alpha|shows state by|size( mode)?|smart insert delete enabled|sort case sensitivity|sort column|sort order|sort type|sorted( data rows)?|sound|source( mask)?|spell checking enabled|starting page|state|string value|sub menu|super menu|super view|tab key traverses cells|tab state|tab type|tab view|table view|tag|target( printer)?|text color|text container insert|text container origin|text returned|tick mark position|time stamp|title(d| (cell|font|height|position|rect))?|tool tip|toolbar|trailing offset|transparent|treat packages as directories|truncated labels|types|unmodified characters|update views|use sort indicator|user defaults|uses data source|uses ruler|uses threaded animation|uses title from previous column|value wraps|version|vertical( (line scroll|page scroll|ruler view))?|vertically resizable|view|visible( document rect)?|volume|width|window|windows menu|wraps|zoomable|zoomed)\b`, NameAttribute, nil},
{`\b(action cell|alert reply|application|box|browser( cell)?|bundle|button( cell)?|cell|clip view|color well|color-panel|combo box( item)?|control|data( (cell|column|item|row|source))?|default entry|dialog reply|document|drag info|drawer|event|font(-panel)?|formatter|image( (cell|view))?|matrix|menu( item)?|item|movie( view)?|open-panel|outline view|panel|pasteboard|plugin|popup button|progress indicator|responder|save-panel|scroll view|secure text field( cell)?|slider|sound|split view|stepper|tab view( item)?|table( (column|header cell|header view|view))|text( (field( cell)?|view))?|toolbar( item)?|user-defaults|view|window)s?\b`, NameBuiltin, nil},
{`\b(animate|append|call method|center|close drawer|close panel|display|display alert|display dialog|display panel|go|hide|highlight|increment|item for|load image|load movie|load nib|load panel|load sound|localized string|lock focus|log|open drawer|path for|pause|perform action|play|register|resume|scroll|select( all)?|show|size to fit|start|step back|step forward|stop|synchronize|unlock focus|update)\b`, NameBuiltin, nil},
{`\b((in )?back of|(in )?front of|[0-9]+(st|nd|rd|th)|first|second|third|fourth|fifth|sixth|seventh|eighth|ninth|tenth|after|back|before|behind|every|front|index|last|middle|some|that|through|thru|where|whose)\b`, NameBuiltin, nil},
{`"(\\\\|\\"|[^"])*"`, LiteralStringDouble, nil},
{`\b([a-zA-Z]\w*)\b`, NameVariable, nil},
{`[-+]?(\d+\.\d*|\d*\.\d+)(E[-+][0-9]+)?`, LiteralNumberFloat, nil},
{`[-+]?\d+`, LiteralNumberInteger, nil},
},
"comment": {
{`\(\*`, CommentMultiline, Push()},
{`\*\)`, CommentMultiline, Pop(1)},
{`[^*(]+`, CommentMultiline, nil},
{`[*(]`, CommentMultiline, nil},
},
},
))

View file

@ -0,0 +1,110 @@
package a
import (
. "github.com/alecthomas/chroma" // nolint
"github.com/alecthomas/chroma/lexers/internal"
)
// Arduino lexer.
var Arduino = internal.Register(MustNewLexer(
&Config{
Name: "Arduino",
Aliases: []string{"arduino"},
Filenames: []string{"*.ino"},
MimeTypes: []string{"text/x-arduino"},
EnsureNL: true,
},
Rules{
"statements": {
{Words(``, `\b`, `catch`, `const_cast`, `delete`, `dynamic_cast`, `explicit`, `export`, `friend`, `mutable`, `namespace`, `new`, `operator`, `private`, `protected`, `public`, `reinterpret_cast`, `restrict`, `static_cast`, `template`, `this`, `throw`, `throws`, `try`, `typeid`, `typename`, `using`, `virtual`, `constexpr`, `nullptr`, `decltype`, `thread_local`, `alignas`, `alignof`, `static_assert`, `noexcept`, `override`, `final`), Keyword, nil},
{`char(16_t|32_t)\b`, KeywordType, nil},
{`(class)\b`, ByGroups(Keyword, Text), Push("classname")},
{`(R)(")([^\\()\s]{,16})(\()((?:.|\n)*?)(\)\3)(")`, ByGroups(LiteralStringAffix, LiteralString, LiteralStringDelimiter, LiteralStringDelimiter, LiteralString, LiteralStringDelimiter, LiteralString), nil},
{`(u8|u|U)(")`, ByGroups(LiteralStringAffix, LiteralString), Push("string")},
{`(L?)(")`, ByGroups(LiteralStringAffix, LiteralString), Push("string")},
{`(L?)(')(\\.|\\[0-7]{1,3}|\\x[a-fA-F0-9]{1,2}|[^\\\'\n])(')`, ByGroups(LiteralStringAffix, LiteralStringChar, LiteralStringChar, LiteralStringChar), nil},
{`(\d+\.\d*|\.\d+|\d+)[eE][+-]?\d+[LlUu]*`, LiteralNumberFloat, nil},
{`(\d+\.\d*|\.\d+|\d+[fF])[fF]?`, LiteralNumberFloat, nil},
{`0x[0-9a-fA-F]+[LlUu]*`, LiteralNumberHex, nil},
{`0[0-7]+[LlUu]*`, LiteralNumberOct, nil},
{`\d+[LlUu]*`, LiteralNumberInteger, nil},
{`\*/`, Error, nil},
{`[~!%^&*+=|?:<>/-]`, Operator, nil},
{`[()\[\],.]`, Punctuation, nil},
{Words(``, `\b`, `asm`, `auto`, `break`, `case`, `const`, `continue`, `default`, `do`, `else`, `enum`, `extern`, `for`, `goto`, `if`, `register`, `restricted`, `return`, `sizeof`, `static`, `struct`, `switch`, `typedef`, `union`, `volatile`, `while`), Keyword, nil},
{`(_Bool|_Complex|_Imaginary|array|atomic_bool|atomic_char|atomic_int|atomic_llong|atomic_long|atomic_schar|atomic_short|atomic_uchar|atomic_uint|atomic_ullong|atomic_ulong|atomic_ushort|auto|bool|boolean|BooleanVariables|Byte|byte|Char|char|char16_t|char32_t|class|complex|Const|const|const_cast|delete|double|dynamic_cast|enum|explicit|extern|Float|float|friend|inline|Int|int|int16_t|int32_t|int64_t|int8_t|Long|long|new|NULL|null|operator|private|PROGMEM|protected|public|register|reinterpret_cast|short|signed|sizeof|Static|static|static_cast|String|struct|typedef|uint16_t|uint32_t|uint64_t|uint8_t|union|unsigned|virtual|Void|void|Volatile|volatile|word)\b`, KeywordType, nil},
// Start of: Arduino-specific syntax
{`(and|final|If|Loop|loop|not|or|override|setup|Setup|throw|try|xor)\b`, Keyword, nil}, // Addition to keywords already defined by C++
{`(ANALOG_MESSAGE|BIN|CHANGE|DEC|DEFAULT|DIGITAL_MESSAGE|EXTERNAL|FALLING|FIRMATA_STRING|HALF_PI|HEX|HIGH|INPUT|INPUT_PULLUP|INTERNAL|INTERNAL1V1|INTERNAL1V1|INTERNAL2V56|INTERNAL2V56|LED_BUILTIN|LED_BUILTIN_RX|LED_BUILTIN_TX|LOW|LSBFIRST|MSBFIRST|OCT|OUTPUT|PI|REPORT_ANALOG|REPORT_DIGITAL|RISING|SET_PIN_MODE|SYSEX_START|SYSTEM_RESET|TWO_PI)\b`, KeywordConstant, nil},
{`(boolean|const|byte|word|string|String|array)\b`, NameVariable, nil},
{`(Keyboard|KeyboardController|MouseController|SoftwareSerial|EthernetServer|EthernetClient|LiquidCrystal|RobotControl|GSMVoiceCall|EthernetUDP|EsploraTFT|HttpClient|RobotMotor|WiFiClient|GSMScanner|FileSystem|Scheduler|GSMServer|YunClient|YunServer|IPAddress|GSMClient|GSMModem|Keyboard|Ethernet|Console|GSMBand|Esplora|Stepper|Process|WiFiUDP|GSM_SMS|Mailbox|USBHost|Firmata|PImage|Client|Server|GSMPIN|FileIO|Bridge|Serial|EEPROM|Stream|Mouse|Audio|Servo|File|Task|GPRS|WiFi|Wire|TFT|GSM|SPI|SD)\b`, NameClass, nil},
{`(abs|Abs|accept|ACos|acos|acosf|addParameter|analogRead|AnalogRead|analogReadResolution|AnalogReadResolution|analogReference|AnalogReference|analogWrite|AnalogWrite|analogWriteResolution|AnalogWriteResolution|answerCall|asin|ASin|asinf|atan|ATan|atan2|ATan2|atan2f|atanf|attach|attached|attachGPRS|attachInterrupt|AttachInterrupt|autoscroll|available|availableForWrite|background|beep|begin|beginPacket|beginSD|beginSMS|beginSpeaker|beginTFT|beginTransmission|beginWrite|bit|Bit|BitClear|bitClear|bitRead|BitRead|bitSet|BitSet|BitWrite|bitWrite|blink|blinkVersion|BSSID|buffer|byte|cbrt|cbrtf|Ceil|ceil|ceilf|changePIN|char|charAt|checkPIN|checkPUK|checkReg|circle|cityNameRead|cityNameWrite|clear|clearScreen|click|close|compareTo|compassRead|concat|config|connect|connected|constrain|Constrain|copysign|copysignf|cos|Cos|cosf|cosh|coshf|countryNameRead|countryNameWrite|createChar|cursor|debugPrint|degrees|Delay|delay|DelayMicroseconds|delayMicroseconds|detach|DetachInterrupt|detachInterrupt|DigitalPinToInterrupt|digitalPinToInterrupt|DigitalRead|digitalRead|DigitalWrite|digitalWrite|disconnect|display|displayLogos|drawBMP|drawCompass|encryptionType|end|endPacket|endSMS|endsWith|endTransmission|endWrite|equals|equalsIgnoreCase|exists|exitValue|Exp|exp|expf|fabs|fabsf|fdim|fdimf|fill|find|findUntil|float|floor|Floor|floorf|flush|fma|fmaf|fmax|fmaxf|fmin|fminf|fmod|fmodf|gatewayIP|get|getAsynchronously|getBand|getButton|getBytes|getCurrentCarrier|getIMEI|getKey|getModifiers|getOemKey|getPINUsed|getResult|getSignalStrength|getSocket|getVoiceCallStatus|getXChange|getYChange|hangCall|height|highByte|HighByte|home|hypot|hypotf|image|indexOf|int|interrupts|IPAddress|IRread|isActionDone|isAlpha|isAlphaNumeric|isAscii|isControl|isDigit|isDirectory|isfinite|isGraph|isHexadecimalDigit|isinf|isListening|isLowerCase|isnan|isPIN|isPressed|isPrintable|isPunct|isSpace|isUpperCase|isValid|isWhitespace|keyboardRead|keyPressed|keyReleased|knobRead|lastIndexOf|ldexp|ldexpf|leftToRight|length|line|lineFollowConfig|listen|listenOnLocalhost|loadImage|localIP|log|Log|log10|log10f|logf|long|lowByte|LowByte|lrint|lrintf|lround|lroundf|macAddress|maintain|map|Map|Max|max|messageAvailable|Micros|micros|millis|Millis|Min|min|mkdir|motorsStop|motorsWrite|mouseDragged|mouseMoved|mousePressed|mouseReleased|move|noAutoscroll|noBlink|noBuffer|noCursor|noDisplay|noFill|noInterrupts|NoInterrupts|noListenOnLocalhost|noStroke|noTone|NoTone|onReceive|onRequest|open|openNextFile|overflow|parseCommand|parseFloat|parseInt|parsePacket|pauseMode|peek|PinMode|pinMode|playFile|playMelody|point|pointTo|position|Pow|pow|powf|prepare|press|print|printFirmwareVersion|println|printVersion|process|processInput|PulseIn|pulseIn|pulseInLong|PulseInLong|put|radians|random|Random|randomSeed|RandomSeed|read|readAccelerometer|readBlue|readButton|readBytes|readBytesUntil|readGreen|readJoystickButton|readJoystickSwitch|readJoystickX|readJoystickY|readLightSensor|readMessage|readMicrophone|readNetworks|readRed|readSlider|readString|readStringUntil|readTemperature|ready|rect|release|releaseAll|remoteIP|remoteNumber|remotePort|remove|replace|requestFrom|retrieveCallingNumber|rewindDirectory|rightToLeft|rmdir|robotNameRead|robotNameWrite|round|roundf|RSSI|run|runAsynchronously|running|runShellCommand|runShellCommandAsynchronously|scanNetworks|scrollDisplayLeft|scrollDisplayRight|seek|sendAnalog|sendDigitalPortPair|sendDigitalPorts|sendString|sendSysex|Serial_Available|Serial_Begin|Serial_End|Serial_Flush|Serial_Peek|Serial_Print|Serial_Println|Serial_Read|serialEvent|setBand|setBitOrder|setCharAt|setClockDivider|setCursor|setDataMode|setDNS|setFirmwareVersion|setMode|setPINUsed|setSpeed|setTextSize|setTimeout|ShiftIn|shiftIn|ShiftOut|shiftOut|shutdown|signbit|sin|Sin|sinf|sinh|sinhf|size|sizeof|Sq|sq|Sqrt|sqrt|sqrtf|SSID|startLoop|startsWith|step|stop|stroke|subnetMask|substring|switchPIN|tan|Tan|tanf|tanh|tanhf|tempoWrite|text|toCharArray|toInt|toLowerCase|tone|Tone|toUpperCase|transfer|trim|trunc|truncf|tuneWrite|turn|updateIR|userNameRead|userNameWrite|voiceCall|waitContinue|width|WiFiServer|word|write|writeBlue|writeGreen|writeJSON|writeMessage|writeMicroseconds|writeRed|writeRGB|yield|Yield)\b`, NameFunction, nil},
// End of: Arduino-specific syntax
{Words(``, `\b`, `inline`, `_inline`, `__inline`, `naked`, `restrict`, `thread`, `typename`), KeywordReserved, nil},
{`(__m(128i|128d|128|64))\b`, KeywordReserved, nil},
{Words(`__`, `\b`, `asm`, `int8`, `based`, `except`, `int16`, `stdcall`, `cdecl`, `fastcall`, `int32`, `declspec`, `finally`, `int64`, `try`, `leave`, `wchar_t`, `w64`, `unaligned`, `raise`, `noop`, `identifier`, `forceinline`, `assume`), KeywordReserved, nil},
{`(true|false|NULL)\b`, NameBuiltin, nil},
{`([a-zA-Z_]\w*)(\s*)(:)(?!:)`, ByGroups(NameLabel, Text, Punctuation), nil},
{`[a-zA-Z_]\w*`, Name, nil},
},
"root": {
Include("whitespace"),
{`((?:[\w*\s])+?(?:\s|[*]))([a-zA-Z_]\w*)(\s*\([^;]*?\))([^;{]*)(\{)`, ByGroups(UsingSelf("root"), NameFunction, UsingSelf("root"), UsingSelf("root"), Punctuation), Push("function")},
{`((?:[\w*\s])+?(?:\s|[*]))([a-zA-Z_]\w*)(\s*\([^;]*?\))([^;]*)(;)`, ByGroups(UsingSelf("root"), NameFunction, UsingSelf("root"), UsingSelf("root"), Punctuation), nil},
Default(Push("statement")),
{Words(`__`, `\b`, `virtual_inheritance`, `uuidof`, `super`, `single_inheritance`, `multiple_inheritance`, `interface`, `event`), KeywordReserved, nil},
{`__(offload|blockingoffload|outer)\b`, KeywordPseudo, nil},
},
"classname": {
{`[a-zA-Z_]\w*`, NameClass, Pop(1)},
{`\s*(?=>)`, Text, Pop(1)},
},
"whitespace": {
{`^#if\s+0`, CommentPreproc, Push("if0")},
{`^#`, CommentPreproc, Push("macro")},
{`^(\s*(?:/[*].*?[*]/\s*)?)(#if\s+0)`, ByGroups(UsingSelf("root"), CommentPreproc), Push("if0")},
{`^(\s*(?:/[*].*?[*]/\s*)?)(#)`, ByGroups(UsingSelf("root"), CommentPreproc), Push("macro")},
{`\n`, Text, nil},
{`\s+`, Text, nil},
{`\\\n`, Text, nil},
{`//(\n|[\w\W]*?[^\\]\n)`, CommentSingle, nil},
{`/(\\\n)?[*][\w\W]*?[*](\\\n)?/`, CommentMultiline, nil},
{`/(\\\n)?[*][\w\W]*`, CommentMultiline, nil},
},
"statement": {
Include("whitespace"),
Include("statements"),
{`[{}]`, Punctuation, nil},
{`;`, Punctuation, Pop(1)},
},
"function": {
Include("whitespace"),
Include("statements"),
{`;`, Punctuation, nil},
{`\{`, Punctuation, Push()},
{`\}`, Punctuation, Pop(1)},
},
"string": {
{`"`, LiteralString, Pop(1)},
{`\\([\\abfnrtv"\']|x[a-fA-F0-9]{2,4}|u[a-fA-F0-9]{4}|U[a-fA-F0-9]{8}|[0-7]{1,3})`, LiteralStringEscape, nil},
{`[^\\"\n]+`, LiteralString, nil},
{`\\\n`, LiteralString, nil},
{`\\`, LiteralString, nil},
},
"macro": {
{`(include)(\s*(?:/[*].*?[*]/\s*)?)([^\n]+)`, ByGroups(CommentPreproc, Text, CommentPreprocFile), nil},
{`[^/\n]+`, CommentPreproc, nil},
{`/[*](.|\n)*?[*]/`, CommentMultiline, nil},
{`//.*?\n`, CommentSingle, Pop(1)},
{`/`, CommentPreproc, nil},
{`(?<=\\)\n`, CommentPreproc, nil},
{`\n`, CommentPreproc, Pop(1)},
},
"if0": {
{`^\s*#if.*?(?<!\\)\n`, CommentPreproc, Push()},
{`^\s*#el(?:se|if).*\n`, CommentPreproc, Pop(1)},
{`^\s*#endif.*?(?<!\\)\n`, CommentPreproc, Pop(1)},
{`.*?\n`, Comment, nil},
},
},
))

View file

@ -0,0 +1,48 @@
package a
import (
. "github.com/alecthomas/chroma" // nolint
"github.com/alecthomas/chroma/lexers/internal"
)
// Awk lexer.
var Awk = internal.Register(MustNewLexer(
&Config{
Name: "Awk",
Aliases: []string{"awk", "gawk", "mawk", "nawk"},
Filenames: []string{"*.awk"},
MimeTypes: []string{"application/x-awk"},
},
Rules{
"commentsandwhitespace": {
{`\s+`, Text, nil},
{`#.*$`, CommentSingle, nil},
},
"slashstartsregex": {
Include("commentsandwhitespace"),
{`/(\\.|[^[/\\\n]|\[(\\.|[^\]\\\n])*])+/\B`, LiteralStringRegex, Pop(1)},
{`(?=/)`, Text, Push("#pop", "badregex")},
Default(Pop(1)),
},
"badregex": {
{`\n`, Text, Pop(1)},
},
"root": {
{`^(?=\s|/)`, Text, Push("slashstartsregex")},
Include("commentsandwhitespace"),
{`\+\+|--|\|\||&&|in\b|\$|!?~|(\*\*|[-<>+*%\^/!=|])=?`, Operator, Push("slashstartsregex")},
{`[{(\[;,]`, Punctuation, Push("slashstartsregex")},
{`[})\].]`, Punctuation, nil},
{`(break|continue|do|while|exit|for|if|else|return)\b`, Keyword, Push("slashstartsregex")},
{`function\b`, KeywordDeclaration, Push("slashstartsregex")},
{`(atan2|cos|exp|int|log|rand|sin|sqrt|srand|gensub|gsub|index|length|match|split|sprintf|sub|substr|tolower|toupper|close|fflush|getline|next|nextfile|print|printf|strftime|systime|delete|system)\b`, KeywordReserved, nil},
{`(ARGC|ARGIND|ARGV|BEGIN|CONVFMT|ENVIRON|END|ERRNO|FIELDWIDTHS|FILENAME|FNR|FS|IGNORECASE|NF|NR|OFMT|OFS|ORFS|RLENGTH|RS|RSTART|RT|SUBSEP)\b`, NameBuiltin, nil},
{`[$a-zA-Z_]\w*`, NameOther, nil},
{`[0-9][0-9]*\.[0-9]+([eE][0-9]+)?[fd]?`, LiteralNumberFloat, nil},
{`0x[0-9a-fA-F]+`, LiteralNumberHex, nil},
{`[0-9]+`, LiteralNumberInteger, nil},
{`"(\\\\|\\"|[^"])*"`, LiteralStringDouble, nil},
{`'(\\\\|\\'|[^'])*'`, LiteralStringSingle, nil},
},
},
))

View file

@ -0,0 +1,46 @@
package b
import (
. "github.com/alecthomas/chroma" // nolint
"github.com/alecthomas/chroma/lexers/internal"
)
// Ballerina lexer.
var Ballerina = internal.Register(MustNewLexer(
&Config{
Name: "Ballerina",
Aliases: []string{"ballerina"},
Filenames: []string{"*.bal"},
MimeTypes: []string{"text/x-ballerina"},
DotAll: true,
},
Rules{
"root": {
{`[^\S\n]+`, Text, nil},
{`//.*?\n`, CommentSingle, nil},
{`/\*.*?\*/`, CommentMultiline, nil},
{`(break|catch|continue|done|else|finally|foreach|forever|fork|if|lock|match|return|throw|transaction|try|while)\b`, Keyword, nil},
{`((?:(?:[^\W\d]|\$)[\w.\[\]$<>]*\s+)+?)((?:[^\W\d]|\$)[\w$]*)(\s*)(\()`, ByGroups(UsingSelf("root"), NameFunction, Text, Operator), nil},
{`@[^\W\d][\w.]*`, NameDecorator, nil},
{`(annotation|bind|but|endpoint|error|function|object|private|public|returns|service|type|var|with|worker)\b`, KeywordDeclaration, nil},
{`(boolean|byte|decimal|float|int|json|map|nil|record|string|table|xml)\b`, KeywordType, nil},
{`(true|false|null)\b`, KeywordConstant, nil},
{`(import)(\s+)`, ByGroups(KeywordNamespace, Text), Push("import")},
{`"(\\\\|\\"|[^"])*"`, LiteralString, nil},
{`'\\.'|'[^\\]'|'\\u[0-9a-fA-F]{4}'`, LiteralStringChar, nil},
{`(\.)((?:[^\W\d]|\$)[\w$]*)`, ByGroups(Operator, NameAttribute), nil},
{`^\s*([^\W\d]|\$)[\w$]*:`, NameLabel, nil},
{`([^\W\d]|\$)[\w$]*`, Name, nil},
{`([0-9][0-9_]*\.([0-9][0-9_]*)?|\.[0-9][0-9_]*)([eE][+\-]?[0-9][0-9_]*)?[fFdD]?|[0-9][eE][+\-]?[0-9][0-9_]*[fFdD]?|[0-9]([eE][+\-]?[0-9][0-9_]*)?[fFdD]|0[xX]([0-9a-fA-F][0-9a-fA-F_]*\.?|([0-9a-fA-F][0-9a-fA-F_]*)?\.[0-9a-fA-F][0-9a-fA-F_]*)[pP][+\-]?[0-9][0-9_]*[fFdD]?`, LiteralNumberFloat, nil},
{`0[xX][0-9a-fA-F][0-9a-fA-F_]*[lL]?`, LiteralNumberHex, nil},
{`0[bB][01][01_]*[lL]?`, LiteralNumberBin, nil},
{`0[0-7_]+[lL]?`, LiteralNumberOct, nil},
{`0|[1-9][0-9_]*[lL]?`, LiteralNumberInteger, nil},
{`[~^*!%&\[\](){}<>|+=:;,./?-]`, Operator, nil},
{`\n`, Text, nil},
},
"import": {
{`[\w.]+`, NameNamespace, Pop(1)},
},
},
))

View file

@ -0,0 +1,95 @@
package b
import (
"regexp"
. "github.com/alecthomas/chroma" // nolint
"github.com/alecthomas/chroma/lexers/internal"
)
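// bashAnalyserRe matches a shebang line that invokes bash, zsh, sh or ksh, optionally via env.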
var bashAnalyserRe = regexp.MustCompile(`(?m)^#!.*/bin/(?:env |)(?:bash|zsh|sh|ksh)`)
// Bash lexer.
var Bash = internal.Register(MustNewLexer(
&Config{
Name: "Bash",
Aliases: []string{"bash", "sh", "ksh", "zsh", "shell"},
Filenames: []string{"*.sh", "*.ksh", "*.bash", "*.ebuild", "*.eclass", "*.exheres-0", "*.exlib", "*.zsh", "*.zshrc", ".bashrc", "bashrc", ".bash_*", "bash_*", "zshrc", ".zshrc", "PKGBUILD"},
MimeTypes: []string{"application/x-sh", "application/x-shellscript"},
},
Rules{
"root": {
Include("basic"),
{"`", LiteralStringBacktick, Push("backticks")},
Include("data"),
Include("interp"),
},
"interp": {
{`\$\(\(`, Keyword, Push("math")},
{`\$\(`, Keyword, Push("paren")},
{`\$\{#?`, LiteralStringInterpol, Push("curly")},
{`\$[a-zA-Z_]\w*`, NameVariable, nil},
{`\$(?:\d+|[#$?!_*@-])`, NameVariable, nil},
{`\$`, Text, nil},
},
"basic": {
{`\b(if|fi|else|while|do|done|for|then|return|function|case|select|continue|until|esac|elif)(\s*)\b`, ByGroups(Keyword, Text), nil},
{"\\b(alias|bg|bind|break|builtin|caller|cd|command|compgen|complete|declare|dirs|disown|echo|enable|eval|exec|exit|export|false|fc|fg|getopts|hash|help|history|jobs|kill|let|local|logout|popd|printf|pushd|pwd|read|readonly|set|shift|shopt|source|suspend|test|time|times|trap|true|type|typeset|ulimit|umask|unalias|unset|wait)(?=[\\s)`])", NameBuiltin, nil},
{`\A#!.+\n`, CommentPreproc, nil},
{`#.*\S`, CommentSingle, nil},
{`\\[\w\W]`, LiteralStringEscape, nil},
{`(\b\w+)(\s*)(\+?=)`, ByGroups(NameVariable, Text, Operator), nil},
{`[\[\]{}()=]`, Operator, nil},
{`<<<`, Operator, nil},
{`<<-?\s*(\'?)\\?(\w+)[\w\W]+?\2`, LiteralString, nil},
{`&&|\|\|`, Operator, nil},
},
"data": {
{`(?s)\$?"(\\\\|\\[0-7]+|\\.|[^"\\$])*"`, LiteralStringDouble, nil},
{`"`, LiteralStringDouble, Push("string")},
{`(?s)\$'(\\\\|\\[0-7]+|\\.|[^'\\])*'`, LiteralStringSingle, nil},
{`(?s)'.*?'`, LiteralStringSingle, nil},
{`;`, Punctuation, nil},
{`&`, Punctuation, nil},
{`\|`, Punctuation, nil},
{`\s+`, Text, nil},
{`\d+(?= |$)`, LiteralNumber, nil},
{"[^=\\s\\[\\]{}()$\"\\'`\\\\<&|;]+", Text, nil},
{`<`, Text, nil},
},
"string": {
{`"`, LiteralStringDouble, Pop(1)},
{`(?s)(\\\\|\\[0-7]+|\\.|[^"\\$])+`, LiteralStringDouble, nil},
Include("interp"),
},
"curly": {
{`\}`, LiteralStringInterpol, Pop(1)},
{`:-`, Keyword, nil},
{`\w+`, NameVariable, nil},
{"[^}:\"\\'`$\\\\]+", Punctuation, nil},
{`:`, Punctuation, nil},
Include("root"),
},
"paren": {
{`\)`, Keyword, Pop(1)},
Include("root"),
},
"math": {
{`\)\)`, Keyword, Pop(1)},
{`[-+*/%^|&]|\*\*|\|\|`, Operator, nil},
{`\d+#\d+`, LiteralNumber, nil},
{`\d+#(?! )`, LiteralNumber, nil},
{`\d+`, LiteralNumber, nil},
Include("root"),
},
"backticks": {
{"`", LiteralStringBacktick, Pop(1)},
Include("root"),
},
},
).SetAnalyser(func(text string) float32 {
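// Score 1.0 when the source contains a recognised shell shebang line, otherwise 0.0.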
if bashAnalyserRe.FindString(text) != "" {
return 1.0
}
return 0.0
}))

View file

@ -0,0 +1,194 @@
package b
import (
. "github.com/alecthomas/chroma" // nolint
"github.com/alecthomas/chroma/lexers/internal"
)
// Batchfile lexer.
var Batchfile = internal.Register(MustNewLexer(
&Config{
Name: "Batchfile",
Aliases: []string{"bat", "batch", "dosbatch", "winbatch"},
Filenames: []string{"*.bat", "*.cmd"},
MimeTypes: []string{"application/x-dos-batch"},
CaseInsensitive: true,
},
Rules{
"root": {
{`\)((?=\()|(?=\^?[\t\v\f\r ,;=\xa0]|[&<>|\n\x1a]))(?:(?:[^\n\x1a^]|\^[\n\x1a]?[\w\W])*)`, CommentSingle, nil},
{`(?=((?:(?<=^[^:])|^[^:]?)[\t\v\f\r ,;=\xa0]*)(:))`, Text, Push("follow")},
{`(?:(?:(?:\^[\n\x1a])?[\t\v\f\r ,;=\xa0])+)`, UsingSelf("text"), nil},
Include("redirect"),
{`[\n\x1a]+`, Text, nil},
{`\(`, Punctuation, Push("root/compound")},
{`@+`, Punctuation, nil},
{`((?:for|if|rem)(?:(?=(?:\^[\n\x1a]?)?/)|(?:(?!\^)|(?<=m))(?:(?=\()|(?=\^?[\t\v\f\r ,;=\xa0]|[&<>|\n\x1a]))))((?:(?:(?:\^[\n\x1a])?[\t\v\f\r ,;=\xa0])+)?(?:(?:(?:\^[\n\x1a]?)?[^"\n\x1a&<>|\t\v\f\r ,;=\xa0])+)?(?:\^[\n\x1a]?)?/(?:\^[\n\x1a]?)?\?)`, ByGroups(Keyword, UsingSelf("text")), Push("follow")},
{`(goto(?=(?:\^[\n\x1a]?)?[\t\v\f\r ,;=\xa0+./:[\\\]]|[\n\x1a&<>|(]))((?:(?:"[^\n\x1a"]*(?:"|(?=[\n\x1a])))|(?:(?:%(?:\*|(?:~[a-z]*(?:\$[^:]+:)?)?\d|[^%:\n\x1a]+(?::(?:~(?:-?\d+)?(?:,(?:-?\d+)?)?|(?:[^%\n\x1a^]|\^[^%\n\x1a])[^=\n\x1a]*=(?:[^%\n\x1a^]|\^[^%\n\x1a])*)?)?%))|(?:\^?![^!:\n\x1a]+(?::(?:~(?:-?\d+)?(?:,(?:-?\d+)?)?|(?:[^!\n\x1a^]|\^[^!\n\x1a])[^=\n\x1a]*=(?:[^!\n\x1a^]|\^[^!\n\x1a])*)?)?\^?!))|[^"%\n\x1a&<>|])*(?:\^[\n\x1a]?)?/(?:\^[\n\x1a]?)?\?(?:(?:"[^\n\x1a"]*(?:"|(?=[\n\x1a])))|(?:(?:%(?:\*|(?:~[a-z]*(?:\$[^:]+:)?)?\d|[^%:\n\x1a]+(?::(?:~(?:-?\d+)?(?:,(?:-?\d+)?)?|(?:[^%\n\x1a^]|\^[^%\n\x1a])[^=\n\x1a]*=(?:[^%\n\x1a^]|\^[^%\n\x1a])*)?)?%))|(?:\^?![^!:\n\x1a]+(?::(?:~(?:-?\d+)?(?:,(?:-?\d+)?)?|(?:[^!\n\x1a^]|\^[^!\n\x1a])[^=\n\x1a]*=(?:[^!\n\x1a^]|\^[^!\n\x1a])*)?)?\^?!))|[^"%\n\x1a&<>|])*)`, ByGroups(Keyword, UsingSelf("text")), Push("follow")},
{Words(``, `(?=(?:\^[\n\x1a]?)?[\t\v\f\r ,;=\xa0+./:[\\\]]|[\n\x1a&<>|(])`, `assoc`, `break`, `cd`, `chdir`, `cls`, `color`, `copy`, `date`, `del`, `dir`, `dpath`, `echo`, `endlocal`, `erase`, `exit`, `ftype`, `keys`, `md`, `mkdir`, `mklink`, `move`, `path`, `pause`, `popd`, `prompt`, `pushd`, `rd`, `ren`, `rename`, `rmdir`, `setlocal`, `shift`, `start`, `time`, `title`, `type`, `ver`, `verify`, `vol`), Keyword, Push("follow")},
{`(call)((?:(?:(?:\^[\n\x1a])?[\t\v\f\r ,;=\xa0])+)?)(:)`, ByGroups(Keyword, UsingSelf("text"), Punctuation), Push("call")},
{`call(?=(?:\^[\n\x1a]?)?[\t\v\f\r ,;=\xa0+./:[\\\]]|[\n\x1a&<>|(])`, Keyword, nil},
{`(for(?=\^?[\t\v\f\r ,;=\xa0]|[&<>|\n\x1a])(?!\^))((?:(?:(?:\^[\n\x1a])?[\t\v\f\r ,;=\xa0])+))(/f(?=\^?[\t\v\f\r ,;=\xa0]|[&<>|\n\x1a]))`, ByGroups(Keyword, UsingSelf("text"), Keyword), Push("for/f", "for")},
{`(for(?=\^?[\t\v\f\r ,;=\xa0]|[&<>|\n\x1a])(?!\^))((?:(?:(?:\^[\n\x1a])?[\t\v\f\r ,;=\xa0])+))(/l(?=\^?[\t\v\f\r ,;=\xa0]|[&<>|\n\x1a]))`, ByGroups(Keyword, UsingSelf("text"), Keyword), Push("for/l", "for")},
{`for(?=\^?[\t\v\f\r ,;=\xa0]|[&<>|\n\x1a])(?!\^)`, Keyword, Push("for2", "for")},
{`(goto(?=(?:\^[\n\x1a]?)?[\t\v\f\r ,;=\xa0+./:[\\\]]|[\n\x1a&<>|(]))((?:(?:(?:\^[\n\x1a])?[\t\v\f\r ,;=\xa0])+)?)(:?)`, ByGroups(Keyword, UsingSelf("text"), Punctuation), Push("label")},
{`(if(?:(?=\()|(?=\^?[\t\v\f\r ,;=\xa0]|[&<>|\n\x1a]))(?!\^))((?:(?:(?:\^[\n\x1a])?[\t\v\f\r ,;=\xa0])+)?)((?:/i(?=\^?[\t\v\f\r ,;=\xa0]|[&<>|\n\x1a]))?)((?:(?:(?:\^[\n\x1a])?[\t\v\f\r ,;=\xa0])+)?)((?:not(?=\^?[\t\v\f\r ,;=\xa0]|[&<>|\n\x1a]))?)((?:(?:(?:\^[\n\x1a])?[\t\v\f\r ,;=\xa0])+)?)`, ByGroups(Keyword, UsingSelf("text"), Keyword, UsingSelf("text"), Keyword, UsingSelf("text")), Push("(?", "if")},
{`rem(((?=\()|(?=\^?[\t\v\f\r ,;=\xa0]|[&<>|\n\x1a]))(?:(?:(?:\^[\n\x1a])?[\t\v\f\r ,;=\xa0])+)?(?:[&<>|]+|(?:(?:"[^\n\x1a"]*(?:"|(?=[\n\x1a])))|(?:(?:%(?:\*|(?:~[a-z]*(?:\$[^:]+:)?)?\d|[^%:\n\x1a]+(?::(?:~(?:-?\d+)?(?:,(?:-?\d+)?)?|(?:[^%\n\x1a^]|\^[^%\n\x1a])[^=\n\x1a]*=(?:[^%\n\x1a^]|\^[^%\n\x1a])*)?)?%))|(?:\^?![^!:\n\x1a]+(?::(?:~(?:-?\d+)?(?:,(?:-?\d+)?)?|(?:[^!\n\x1a^]|\^[^!\n\x1a])[^=\n\x1a]*=(?:[^!\n\x1a^]|\^[^!\n\x1a])*)?)?\^?!))|(?:(?:(?:\^[\n\x1a]?)?[^"\n\x1a&<>|\t\v\f\r ,;=\xa0])+))+)?.*|(?=(?:\^[\n\x1a]?)?[\t\v\f\r ,;=\xa0+./:[\\\]]|[\n\x1a&<>|(])(?:(?:[^\n\x1a^]|\^[\n\x1a]?[\w\W])*))`, CommentSingle, Push("follow")},
{`(set(?=(?:\^[\n\x1a]?)?[\t\v\f\r ,;=\xa0+./:[\\\]]|[\n\x1a&<>|(]))((?:(?:\^[\n\x1a]?)?[^\S\n])*)(/a)`, ByGroups(Keyword, UsingSelf("text"), Keyword), Push("arithmetic")},
{`(set(?=(?:\^[\n\x1a]?)?[\t\v\f\r ,;=\xa0+./:[\\\]]|[\n\x1a&<>|(]))((?:(?:\^[\n\x1a]?)?[^\S\n])*)((?:/p)?)((?:(?:\^[\n\x1a]?)?[^\S\n])*)((?:(?:(?:\^[\n\x1a]?)?[^"\n\x1a&<>|^=]|\^[\n\x1a]?[^"=])+)?)((?:(?:\^[\n\x1a]?)?=)?)`, ByGroups(Keyword, UsingSelf("text"), Keyword, UsingSelf("text"), UsingSelf("variable"), Punctuation), Push("follow")},
Default(Push("follow")),
},
"follow": {
{`((?:(?<=^[^:])|^[^:]?)[\t\v\f\r ,;=\xa0]*)(:)([\t\v\f\r ,;=\xa0]*)((?:(?:[^\n\x1a&<>|\t\v\f\r ,;=\xa0+:^]|\^[\n\x1a]?[\w\W])*))(.*)`, ByGroups(Text, Punctuation, Text, NameLabel, CommentSingle), nil},
Include("redirect"),
{`(?=[\n\x1a])`, Text, Pop(1)},
{`\|\|?|&&?`, Punctuation, Pop(1)},
Include("text"),
},
"arithmetic": {
{`0[0-7]+`, LiteralNumberOct, nil},
{`0x[\da-f]+`, LiteralNumberHex, nil},
{`\d+`, LiteralNumberInteger, nil},
{`[(),]+`, Punctuation, nil},
{`([=+\-*/!~]|%|\^\^)+`, Operator, nil},
{`((?:"[^\n\x1a"]*(?:"|(?=[\n\x1a])))|(?:(?:%(?:\*|(?:~[a-z]*(?:\$[^:]+:)?)?\d|[^%:\n\x1a]+(?::(?:~(?:-?\d+)?(?:,(?:-?\d+)?)?|(?:[^%\n\x1a^]|\^[^%\n\x1a])[^=\n\x1a]*=(?:[^%\n\x1a^]|\^[^%\n\x1a])*)?)?%))|(?:\^?![^!:\n\x1a]+(?::(?:~(?:-?\d+)?(?:,(?:-?\d+)?)?|(?:[^!\n\x1a^]|\^[^!\n\x1a])[^=\n\x1a]*=(?:[^!\n\x1a^]|\^[^!\n\x1a])*)?)?\^?!))|(\^[\n\x1a]?)?[^()=+\-*/!~%^"\n\x1a&<>|\t\v\f\r ,;=\xa0]|\^[\n\x1a\t\v\f\r ,;=\xa0]?[\w\W])+`, UsingSelf("variable"), nil},
{`(?=[\x00|&])`, Text, Pop(1)},
Include("follow"),
},
"call": {
{`(:?)((?:(?:[^\n\x1a&<>|\t\v\f\r ,;=\xa0+:^]|\^[\n\x1a]?[\w\W])*))`, ByGroups(Punctuation, NameLabel), Pop(1)},
},
"label": {
{`((?:(?:[^\n\x1a&<>|\t\v\f\r ,;=\xa0+:^]|\^[\n\x1a]?[\w\W])*)?)((?:(?:"[^\n\x1a"]*(?:"|(?=[\n\x1a])))|(?:(?:%(?:\*|(?:~[a-z]*(?:\$[^:]+:)?)?\d|[^%:\n\x1a]+(?::(?:~(?:-?\d+)?(?:,(?:-?\d+)?)?|(?:[^%\n\x1a^]|\^[^%\n\x1a])[^=\n\x1a]*=(?:[^%\n\x1a^]|\^[^%\n\x1a])*)?)?%))|(?:\^?![^!:\n\x1a]+(?::(?:~(?:-?\d+)?(?:,(?:-?\d+)?)?|(?:[^!\n\x1a^]|\^[^!\n\x1a])[^=\n\x1a]*=(?:[^!\n\x1a^]|\^[^!\n\x1a])*)?)?\^?!))|\^[\n\x1a]?[\w\W]|[^"%^\n\x1a&<>|])*)`, ByGroups(NameLabel, CommentSingle), Pop(1)},
},
"redirect": {
{`((?:(?<=[\n\x1a\t\v\f\r ,;=\xa0])\d)?)(>>?&|<&)([\n\x1a\t\v\f\r ,;=\xa0]*)(\d)`, ByGroups(LiteralNumberInteger, Punctuation, Text, LiteralNumberInteger), nil},
{`((?:(?<=[\n\x1a\t\v\f\r ,;=\xa0])(?<!\^[\n\x1a])\d)?)(>>?|<)((?:(?:(?:\^[\n\x1a])?[\t\v\f\r ,;=\xa0])+)?(?:[&<>|]+|(?:(?:"[^\n\x1a"]*(?:"|(?=[\n\x1a])))|(?:(?:%(?:\*|(?:~[a-z]*(?:\$[^:]+:)?)?\d|[^%:\n\x1a]+(?::(?:~(?:-?\d+)?(?:,(?:-?\d+)?)?|(?:[^%\n\x1a^]|\^[^%\n\x1a])[^=\n\x1a]*=(?:[^%\n\x1a^]|\^[^%\n\x1a])*)?)?%))|(?:\^?![^!:\n\x1a]+(?::(?:~(?:-?\d+)?(?:,(?:-?\d+)?)?|(?:[^!\n\x1a^]|\^[^!\n\x1a])[^=\n\x1a]*=(?:[^!\n\x1a^]|\^[^!\n\x1a])*)?)?\^?!))|(?:(?:(?:\^[\n\x1a]?)?[^"\n\x1a&<>|\t\v\f\r ,;=\xa0])+))+))`, ByGroups(LiteralNumberInteger, Punctuation, UsingSelf("text")), nil},
},
"root/compound": {
{`\)`, Punctuation, Pop(1)},
{`(?=((?:(?<=^[^:])|^[^:]?)[\t\v\f\r ,;=\xa0]*)(:))`, Text, Push("follow/compound")},
{`(?:(?:(?:\^[\n\x1a])?[\t\v\f\r ,;=\xa0])+)`, UsingSelf("text"), nil},
Include("redirect/compound"),
{`[\n\x1a]+`, Text, nil},
{`\(`, Punctuation, Push("root/compound")},
{`@+`, Punctuation, nil},
{`((?:for|if|rem)(?:(?=(?:\^[\n\x1a]?)?/)|(?:(?!\^)|(?<=m))(?:(?=\()|(?:(?=\))|(?=\^?[\t\v\f\r ,;=\xa0]|[&<>|\n\x1a])))))((?:(?:(?:\^[\n\x1a])?[\t\v\f\r ,;=\xa0])+)?(?:(?:(?:\^[\n\x1a]?)?[^"\n\x1a&<>|\t\v\f\r ,;=\xa0)])+)?(?:\^[\n\x1a]?)?/(?:\^[\n\x1a]?)?\?)`, ByGroups(Keyword, UsingSelf("text")), Push("follow/compound")},
{`(goto(?:(?=\))|(?=(?:\^[\n\x1a]?)?[\t\v\f\r ,;=\xa0+./:[\\\]]|[\n\x1a&<>|(])))((?:(?:"[^\n\x1a"]*(?:"|(?=[\n\x1a])))|(?:(?:%(?:\*|(?:~[a-z]*(?:\$[^:]+:)?)?\d|[^%:\n\x1a]+(?::(?:~(?:-?\d+)?(?:,(?:-?\d+)?)?|(?:[^%\n\x1a^]|\^[^%\n\x1a])[^=\n\x1a]*=(?:[^%\n\x1a^]|\^[^%\n\x1a])*)?)?%))|(?:\^?![^!:\n\x1a]+(?::(?:~(?:-?\d+)?(?:,(?:-?\d+)?)?|(?:[^!\n\x1a^]|\^[^!\n\x1a])[^=\n\x1a]*=(?:[^!\n\x1a^]|\^[^!\n\x1a])*)?)?\^?!))|[^"%\n\x1a&<>|)])*(?:\^[\n\x1a]?)?/(?:\^[\n\x1a]?)?\?(?:(?:"[^\n\x1a"]*(?:"|(?=[\n\x1a])))|(?:(?:%(?:\*|(?:~[a-z]*(?:\$[^:]+:)?)?\d|[^%:\n\x1a]+(?::(?:~(?:-?\d+)?(?:,(?:-?\d+)?)?|(?:[^%\n\x1a^]|\^[^%\n\x1a])[^=\n\x1a]*=(?:[^%\n\x1a^]|\^[^%\n\x1a])*)?)?%))|(?:\^?![^!:\n\x1a]+(?::(?:~(?:-?\d+)?(?:,(?:-?\d+)?)?|(?:[^!\n\x1a^]|\^[^!\n\x1a])[^=\n\x1a]*=(?:[^!\n\x1a^]|\^[^!\n\x1a])*)?)?\^?!))|[^"%\n\x1a&<>|)])*)`, ByGroups(Keyword, UsingSelf("text")), Push("follow/compound")},
{Words(``, `(?:(?=\))|(?=(?:\^[\n\x1a]?)?[\t\v\f\r ,;=\xa0+./:[\\\]]|[\n\x1a&<>|(]))`, `assoc`, `break`, `cd`, `chdir`, `cls`, `color`, `copy`, `date`, `del`, `dir`, `dpath`, `echo`, `endlocal`, `erase`, `exit`, `ftype`, `keys`, `md`, `mkdir`, `mklink`, `move`, `path`, `pause`, `popd`, `prompt`, `pushd`, `rd`, `ren`, `rename`, `rmdir`, `setlocal`, `shift`, `start`, `time`, `title`, `type`, `ver`, `verify`, `vol`), Keyword, Push("follow/compound")},
{`(call)((?:(?:(?:\^[\n\x1a])?[\t\v\f\r ,;=\xa0])+)?)(:)`, ByGroups(Keyword, UsingSelf("text"), Punctuation), Push("call/compound")},
{`call(?:(?=\))|(?=(?:\^[\n\x1a]?)?[\t\v\f\r ,;=\xa0+./:[\\\]]|[\n\x1a&<>|(]))`, Keyword, nil},
{`(for(?:(?=\))|(?=\^?[\t\v\f\r ,;=\xa0]|[&<>|\n\x1a]))(?!\^))((?:(?:(?:\^[\n\x1a])?[\t\v\f\r ,;=\xa0])+))(/f(?:(?=\))|(?=\^?[\t\v\f\r ,;=\xa0]|[&<>|\n\x1a])))`, ByGroups(Keyword, UsingSelf("text"), Keyword), Push("for/f", "for")},
{`(for(?:(?=\))|(?=\^?[\t\v\f\r ,;=\xa0]|[&<>|\n\x1a]))(?!\^))((?:(?:(?:\^[\n\x1a])?[\t\v\f\r ,;=\xa0])+))(/l(?:(?=\))|(?=\^?[\t\v\f\r ,;=\xa0]|[&<>|\n\x1a])))`, ByGroups(Keyword, UsingSelf("text"), Keyword), Push("for/l", "for")},
{`for(?:(?=\))|(?=\^?[\t\v\f\r ,;=\xa0]|[&<>|\n\x1a]))(?!\^)`, Keyword, Push("for2", "for")},
{`(goto(?:(?=\))|(?=(?:\^[\n\x1a]?)?[\t\v\f\r ,;=\xa0+./:[\\\]]|[\n\x1a&<>|(])))((?:(?:(?:\^[\n\x1a])?[\t\v\f\r ,;=\xa0])+)?)(:?)`, ByGroups(Keyword, UsingSelf("text"), Punctuation), Push("label/compound")},
{`(if(?:(?=\()|(?:(?=\))|(?=\^?[\t\v\f\r ,;=\xa0]|[&<>|\n\x1a])))(?!\^))((?:(?:(?:\^[\n\x1a])?[\t\v\f\r ,;=\xa0])+)?)((?:/i(?:(?=\))|(?=\^?[\t\v\f\r ,;=\xa0]|[&<>|\n\x1a])))?)((?:(?:(?:\^[\n\x1a])?[\t\v\f\r ,;=\xa0])+)?)((?:not(?:(?=\))|(?=\^?[\t\v\f\r ,;=\xa0]|[&<>|\n\x1a])))?)((?:(?:(?:\^[\n\x1a])?[\t\v\f\r ,;=\xa0])+)?)`, ByGroups(Keyword, UsingSelf("text"), Keyword, UsingSelf("text"), Keyword, UsingSelf("text")), Push("(?", "if")},
{`rem(((?=\()|(?:(?=\))|(?=\^?[\t\v\f\r ,;=\xa0]|[&<>|\n\x1a])))(?:(?:(?:\^[\n\x1a])?[\t\v\f\r ,;=\xa0])+)?(?:[&<>|]+|(?:(?:"[^\n\x1a"]*(?:"|(?=[\n\x1a])))|(?:(?:%(?:\*|(?:~[a-z]*(?:\$[^:]+:)?)?\d|[^%:\n\x1a]+(?::(?:~(?:-?\d+)?(?:,(?:-?\d+)?)?|(?:[^%\n\x1a^]|\^[^%\n\x1a])[^=\n\x1a]*=(?:[^%\n\x1a^]|\^[^%\n\x1a])*)?)?%))|(?:\^?![^!:\n\x1a]+(?::(?:~(?:-?\d+)?(?:,(?:-?\d+)?)?|(?:[^!\n\x1a^]|\^[^!\n\x1a])[^=\n\x1a]*=(?:[^!\n\x1a^]|\^[^!\n\x1a])*)?)?\^?!))|(?:(?:(?:\^[\n\x1a]?)?[^"\n\x1a&<>|\t\v\f\r ,;=\xa0])+))+)?.*|(?:(?=\))|(?=(?:\^[\n\x1a]?)?[\t\v\f\r ,;=\xa0+./:[\\\]]|[\n\x1a&<>|(]))(?:(?:[^\n\x1a^)]|\^[\n\x1a]?[^)])*))`, CommentSingle, Push("follow/compound")},
{`(set(?:(?=\))|(?=(?:\^[\n\x1a]?)?[\t\v\f\r ,;=\xa0+./:[\\\]]|[\n\x1a&<>|(])))((?:(?:\^[\n\x1a]?)?[^\S\n])*)(/a)`, ByGroups(Keyword, UsingSelf("text"), Keyword), Push("arithmetic/compound")},
{`(set(?:(?=\))|(?=(?:\^[\n\x1a]?)?[\t\v\f\r ,;=\xa0+./:[\\\]]|[\n\x1a&<>|(])))((?:(?:\^[\n\x1a]?)?[^\S\n])*)((?:/p)?)((?:(?:\^[\n\x1a]?)?[^\S\n])*)((?:(?:(?:\^[\n\x1a]?)?[^"\n\x1a&<>|^=)]|\^[\n\x1a]?[^"=])+)?)((?:(?:\^[\n\x1a]?)?=)?)`, ByGroups(Keyword, UsingSelf("text"), Keyword, UsingSelf("text"), UsingSelf("variable"), Punctuation), Push("follow/compound")},
Default(Push("follow/compound")),
},
"follow/compound": {
{`(?=\))`, Text, Pop(1)},
{`((?:(?<=^[^:])|^[^:]?)[\t\v\f\r ,;=\xa0]*)(:)([\t\v\f\r ,;=\xa0]*)((?:(?:[^\n\x1a&<>|\t\v\f\r ,;=\xa0+:^)]|\^[\n\x1a]?[^)])*))(.*)`, ByGroups(Text, Punctuation, Text, NameLabel, CommentSingle), nil},
Include("redirect/compound"),
{`(?=[\n\x1a])`, Text, Pop(1)},
{`\|\|?|&&?`, Punctuation, Pop(1)},
Include("text"),
},
"arithmetic/compound": {
{`(?=\))`, Text, Pop(1)},
{`0[0-7]+`, LiteralNumberOct, nil},
{`0x[\da-f]+`, LiteralNumberHex, nil},
{`\d+`, LiteralNumberInteger, nil},
{`[(),]+`, Punctuation, nil},
{`([=+\-*/!~]|%|\^\^)+`, Operator, nil},
{`((?:"[^\n\x1a"]*(?:"|(?=[\n\x1a])))|(?:(?:%(?:\*|(?:~[a-z]*(?:\$[^:]+:)?)?\d|[^%:\n\x1a]+(?::(?:~(?:-?\d+)?(?:,(?:-?\d+)?)?|(?:[^%\n\x1a^]|\^[^%\n\x1a])[^=\n\x1a]*=(?:[^%\n\x1a^]|\^[^%\n\x1a])*)?)?%))|(?:\^?![^!:\n\x1a]+(?::(?:~(?:-?\d+)?(?:,(?:-?\d+)?)?|(?:[^!\n\x1a^]|\^[^!\n\x1a])[^=\n\x1a]*=(?:[^!\n\x1a^]|\^[^!\n\x1a])*)?)?\^?!))|(\^[\n\x1a]?)?[^()=+\-*/!~%^"\n\x1a&<>|\t\v\f\r ,;=\xa0]|\^[\n\x1a\t\v\f\r ,;=\xa0]?[^)])+`, UsingSelf("variable"), nil},
{`(?=[\x00|&])`, Text, Pop(1)},
Include("follow"),
},
"call/compound": {
{`(?=\))`, Text, Pop(1)},
{`(:?)((?:(?:[^\n\x1a&<>|\t\v\f\r ,;=\xa0+:^)]|\^[\n\x1a]?[^)])*))`, ByGroups(Punctuation, NameLabel), Pop(1)},
},
"label/compound": {
{`(?=\))`, Text, Pop(1)},
{`((?:(?:[^\n\x1a&<>|\t\v\f\r ,;=\xa0+:^)]|\^[\n\x1a]?[^)])*)?)((?:(?:"[^\n\x1a"]*(?:"|(?=[\n\x1a])))|(?:(?:%(?:\*|(?:~[a-z]*(?:\$[^:]+:)?)?\d|[^%:\n\x1a]+(?::(?:~(?:-?\d+)?(?:,(?:-?\d+)?)?|(?:[^%\n\x1a^]|\^[^%\n\x1a])[^=\n\x1a]*=(?:[^%\n\x1a^]|\^[^%\n\x1a])*)?)?%))|(?:\^?![^!:\n\x1a]+(?::(?:~(?:-?\d+)?(?:,(?:-?\d+)?)?|(?:[^!\n\x1a^]|\^[^!\n\x1a])[^=\n\x1a]*=(?:[^!\n\x1a^]|\^[^!\n\x1a])*)?)?\^?!))|\^[\n\x1a]?[^)]|[^"%^\n\x1a&<>|)])*)`, ByGroups(NameLabel, CommentSingle), Pop(1)},
},
"redirect/compound": {
{`((?:(?<=[\n\x1a\t\v\f\r ,;=\xa0])\d)?)(>>?&|<&)([\n\x1a\t\v\f\r ,;=\xa0]*)(\d)`, ByGroups(LiteralNumberInteger, Punctuation, Text, LiteralNumberInteger), nil},
{`((?:(?<=[\n\x1a\t\v\f\r ,;=\xa0])(?<!\^[\n\x1a])\d)?)(>>?|<)((?:(?:(?:\^[\n\x1a])?[\t\v\f\r ,;=\xa0])+)?(?:[&<>|]+|(?:(?:"[^\n\x1a"]*(?:"|(?=[\n\x1a])))|(?:(?:%(?:\*|(?:~[a-z]*(?:\$[^:]+:)?)?\d|[^%:\n\x1a]+(?::(?:~(?:-?\d+)?(?:,(?:-?\d+)?)?|(?:[^%\n\x1a^]|\^[^%\n\x1a])[^=\n\x1a]*=(?:[^%\n\x1a^]|\^[^%\n\x1a])*)?)?%))|(?:\^?![^!:\n\x1a]+(?::(?:~(?:-?\d+)?(?:,(?:-?\d+)?)?|(?:[^!\n\x1a^]|\^[^!\n\x1a])[^=\n\x1a]*=(?:[^!\n\x1a^]|\^[^!\n\x1a])*)?)?\^?!))|(?:(?:(?:\^[\n\x1a]?)?[^"\n\x1a&<>|\t\v\f\r ,;=\xa0)])+))+))`, ByGroups(LiteralNumberInteger, Punctuation, UsingSelf("text")), nil},
},
"variable-or-escape": {
{`(?:(?:%(?:\*|(?:~[a-z]*(?:\$[^:]+:)?)?\d|[^%:\n\x1a]+(?::(?:~(?:-?\d+)?(?:,(?:-?\d+)?)?|(?:[^%\n\x1a^]|\^[^%\n\x1a])[^=\n\x1a]*=(?:[^%\n\x1a^]|\^[^%\n\x1a])*)?)?%))|(?:\^?![^!:\n\x1a]+(?::(?:~(?:-?\d+)?(?:,(?:-?\d+)?)?|(?:[^!\n\x1a^]|\^[^!\n\x1a])[^=\n\x1a]*=(?:[^!\n\x1a^]|\^[^!\n\x1a])*)?)?\^?!))`, NameVariable, nil},
{`%%|\^[\n\x1a]?(\^!|[\w\W])`, LiteralStringEscape, nil},
},
"string": {
{`"`, LiteralStringDouble, Pop(1)},
{`(?:(?:%(?:\*|(?:~[a-z]*(?:\$[^:]+:)?)?\d|[^%:\n\x1a]+(?::(?:~(?:-?\d+)?(?:,(?:-?\d+)?)?|(?:[^%\n\x1a^]|\^[^%\n\x1a])[^=\n\x1a]*=(?:[^%\n\x1a^]|\^[^%\n\x1a])*)?)?%))|(?:\^?![^!:\n\x1a]+(?::(?:~(?:-?\d+)?(?:,(?:-?\d+)?)?|(?:[^!\n\x1a^]|\^[^!\n\x1a])[^=\n\x1a]*=(?:[^!\n\x1a^]|\^[^!\n\x1a])*)?)?\^?!))`, NameVariable, nil},
{`\^!|%%`, LiteralStringEscape, nil},
{`[^"%^\n\x1a]+|[%^]`, LiteralStringDouble, nil},
Default(Pop(1)),
},
"sqstring": {
Include("variable-or-escape"),
{`[^%]+|%`, LiteralStringSingle, nil},
},
"bqstring": {
Include("variable-or-escape"),
{`[^%]+|%`, LiteralStringBacktick, nil},
},
"text": {
{`"`, LiteralStringDouble, Push("string")},
Include("variable-or-escape"),
{`[^"%^\n\x1a&<>|\t\v\f\r ,;=\xa0\d)]+|.`, Text, nil},
},
"variable": {
{`"`, LiteralStringDouble, Push("string")},
Include("variable-or-escape"),
{`[^"%^\n\x1a]+|.`, NameVariable, nil},
},
"for": {
{`((?:(?:(?:\^[\n\x1a])?[\t\v\f\r ,;=\xa0])+))(in)((?:(?:(?:\^[\n\x1a])?[\t\v\f\r ,;=\xa0])+))(\()`, ByGroups(UsingSelf("text"), Keyword, UsingSelf("text"), Punctuation), Pop(1)},
Include("follow"),
},
"for2": {
{`\)`, Punctuation, nil},
{`((?:(?:(?:\^[\n\x1a])?[\t\v\f\r ,;=\xa0])+))(do(?=\^?[\t\v\f\r ,;=\xa0]|[&<>|\n\x1a]))`, ByGroups(UsingSelf("text"), Keyword), Pop(1)},
{`[\n\x1a]+`, Text, nil},
Include("follow"),
},
"for/f": {
{`(")((?:(?:(?:%(?:\*|(?:~[a-z]*(?:\$[^:]+:)?)?\d|[^%:\n\x1a]+(?::(?:~(?:-?\d+)?(?:,(?:-?\d+)?)?|(?:[^%\n\x1a^]|\^[^%\n\x1a])[^=\n\x1a]*=(?:[^%\n\x1a^]|\^[^%\n\x1a])*)?)?%))|(?:\^?![^!:\n\x1a]+(?::(?:~(?:-?\d+)?(?:,(?:-?\d+)?)?|(?:[^!\n\x1a^]|\^[^!\n\x1a])[^=\n\x1a]*=(?:[^!\n\x1a^]|\^[^!\n\x1a])*)?)?\^?!))|[^"])*?")([\n\x1a\t\v\f\r ,;=\xa0]*)(\))`, ByGroups(LiteralStringDouble, UsingSelf("string"), Text, Punctuation), nil},
{`"`, LiteralStringDouble, Push("#pop", "for2", "string")},
{`('(?:%%|(?:(?:%(?:\*|(?:~[a-z]*(?:\$[^:]+:)?)?\d|[^%:\n\x1a]+(?::(?:~(?:-?\d+)?(?:,(?:-?\d+)?)?|(?:[^%\n\x1a^]|\^[^%\n\x1a])[^=\n\x1a]*=(?:[^%\n\x1a^]|\^[^%\n\x1a])*)?)?%))|(?:\^?![^!:\n\x1a]+(?::(?:~(?:-?\d+)?(?:,(?:-?\d+)?)?|(?:[^!\n\x1a^]|\^[^!\n\x1a])[^=\n\x1a]*=(?:[^!\n\x1a^]|\^[^!\n\x1a])*)?)?\^?!))|[\w\W])*?')([\n\x1a\t\v\f\r ,;=\xa0]*)(\))`, ByGroups(UsingSelf("sqstring"), Text, Punctuation), nil},
{"(`(?:%%|(?:(?:%(?:\\*|(?:~[a-z]*(?:\\$[^:]+:)?)?\\d|[^%:\\n\\x1a]+(?::(?:~(?:-?\\d+)?(?:,(?:-?\\d+)?)?|(?:[^%\\n\\x1a^]|\\^[^%\\n\\x1a])[^=\\n\\x1a]*=(?:[^%\\n\\x1a^]|\\^[^%\\n\\x1a])*)?)?%))|(?:\\^?![^!:\\n\\x1a]+(?::(?:~(?:-?\\d+)?(?:,(?:-?\\d+)?)?|(?:[^!\\n\\x1a^]|\\^[^!\\n\\x1a])[^=\\n\\x1a]*=(?:[^!\\n\\x1a^]|\\^[^!\\n\\x1a])*)?)?\\^?!))|[\\w\\W])*?`)([\\n\\x1a\\t\\v\\f\\r ,;=\\xa0]*)(\\))", ByGroups(UsingSelf("bqstring"), Text, Punctuation), nil},
Include("for2"),
},
"for/l": {
{`-?\d+`, LiteralNumberInteger, nil},
Include("for2"),
},
"if": {
{`((?:cmdextversion|errorlevel)(?=\^?[\t\v\f\r ,;=\xa0]|[&<>|\n\x1a]))((?:(?:(?:\^[\n\x1a])?[\t\v\f\r ,;=\xa0])+))(\d+)`, ByGroups(Keyword, UsingSelf("text"), LiteralNumberInteger), Pop(1)},
{`(defined(?=\^?[\t\v\f\r ,;=\xa0]|[&<>|\n\x1a]))((?:(?:(?:\^[\n\x1a])?[\t\v\f\r ,;=\xa0])+))((?:[&<>|]+|(?:(?:"[^\n\x1a"]*(?:"|(?=[\n\x1a])))|(?:(?:%(?:\*|(?:~[a-z]*(?:\$[^:]+:)?)?\d|[^%:\n\x1a]+(?::(?:~(?:-?\d+)?(?:,(?:-?\d+)?)?|(?:[^%\n\x1a^]|\^[^%\n\x1a])[^=\n\x1a]*=(?:[^%\n\x1a^]|\^[^%\n\x1a])*)?)?%))|(?:\^?![^!:\n\x1a]+(?::(?:~(?:-?\d+)?(?:,(?:-?\d+)?)?|(?:[^!\n\x1a^]|\^[^!\n\x1a])[^=\n\x1a]*=(?:[^!\n\x1a^]|\^[^!\n\x1a])*)?)?\^?!))|(?:(?:(?:\^[\n\x1a]?)?[^"\n\x1a&<>|\t\v\f\r ,;=\xa0])+))+))`, ByGroups(Keyword, UsingSelf("text"), UsingSelf("variable")), Pop(1)},
{`(exist(?=\^?[\t\v\f\r ,;=\xa0]|[&<>|\n\x1a]))((?:(?:(?:\^[\n\x1a])?[\t\v\f\r ,;=\xa0])+)(?:[&<>|]+|(?:(?:"[^\n\x1a"]*(?:"|(?=[\n\x1a])))|(?:(?:%(?:\*|(?:~[a-z]*(?:\$[^:]+:)?)?\d|[^%:\n\x1a]+(?::(?:~(?:-?\d+)?(?:,(?:-?\d+)?)?|(?:[^%\n\x1a^]|\^[^%\n\x1a])[^=\n\x1a]*=(?:[^%\n\x1a^]|\^[^%\n\x1a])*)?)?%))|(?:\^?![^!:\n\x1a]+(?::(?:~(?:-?\d+)?(?:,(?:-?\d+)?)?|(?:[^!\n\x1a^]|\^[^!\n\x1a])[^=\n\x1a]*=(?:[^!\n\x1a^]|\^[^!\n\x1a])*)?)?\^?!))|(?:(?:(?:\^[\n\x1a]?)?[^"\n\x1a&<>|\t\v\f\r ,;=\xa0])+))+))`, ByGroups(Keyword, UsingSelf("text")), Pop(1)},
{`((?:-?(?:0[0-7]+|0x[\da-f]+|\d+)(?=\^?[\t\v\f\r ,;=\xa0]|[&<>|\n\x1a]))(?:(?:(?:\^[\n\x1a])?[\t\v\f\r ,;=\xa0])+))((?:equ|geq|gtr|leq|lss|neq))((?:(?:(?:\^[\n\x1a])?[\t\v\f\r ,;=\xa0])+)(?:-?(?:0[0-7]+|0x[\da-f]+|\d+)(?=\^?[\t\v\f\r ,;=\xa0]|[&<>|\n\x1a])))`, ByGroups(UsingSelf("arithmetic"), OperatorWord, UsingSelf("arithmetic")), Pop(1)},
{`(?:[&<>|]+|(?:(?:"[^\n\x1a"]*(?:"|(?=[\n\x1a])))|(?:(?:%(?:\*|(?:~[a-z]*(?:\$[^:]+:)?)?\d|[^%:\n\x1a]+(?::(?:~(?:-?\d+)?(?:,(?:-?\d+)?)?|(?:[^%\n\x1a^]|\^[^%\n\x1a])[^=\n\x1a]*=(?:[^%\n\x1a^]|\^[^%\n\x1a])*)?)?%))|(?:\^?![^!:\n\x1a]+(?::(?:~(?:-?\d+)?(?:,(?:-?\d+)?)?|(?:[^!\n\x1a^]|\^[^!\n\x1a])[^=\n\x1a]*=(?:[^!\n\x1a^]|\^[^!\n\x1a])*)?)?\^?!))|(?:(?:(?:\^[\n\x1a]?)?[^"\n\x1a&<>|\t\v\f\r ,;=\xa0])+))+)`, UsingSelf("text"), Push("#pop", "if2")},
},
"if2": {
{`((?:(?:(?:\^[\n\x1a])?[\t\v\f\r ,;=\xa0])+)?)(==)((?:(?:(?:\^[\n\x1a])?[\t\v\f\r ,;=\xa0])+)?(?:[&<>|]+|(?:(?:"[^\n\x1a"]*(?:"|(?=[\n\x1a])))|(?:(?:%(?:\*|(?:~[a-z]*(?:\$[^:]+:)?)?\d|[^%:\n\x1a]+(?::(?:~(?:-?\d+)?(?:,(?:-?\d+)?)?|(?:[^%\n\x1a^]|\^[^%\n\x1a])[^=\n\x1a]*=(?:[^%\n\x1a^]|\^[^%\n\x1a])*)?)?%))|(?:\^?![^!:\n\x1a]+(?::(?:~(?:-?\d+)?(?:,(?:-?\d+)?)?|(?:[^!\n\x1a^]|\^[^!\n\x1a])[^=\n\x1a]*=(?:[^!\n\x1a^]|\^[^!\n\x1a])*)?)?\^?!))|(?:(?:(?:\^[\n\x1a]?)?[^"\n\x1a&<>|\t\v\f\r ,;=\xa0])+))+))`, ByGroups(UsingSelf("text"), Operator, UsingSelf("text")), Pop(1)},
{`((?:(?:(?:\^[\n\x1a])?[\t\v\f\r ,;=\xa0])+))((?:equ|geq|gtr|leq|lss|neq))((?:(?:(?:\^[\n\x1a])?[\t\v\f\r ,;=\xa0])+)(?:[&<>|]+|(?:(?:"[^\n\x1a"]*(?:"|(?=[\n\x1a])))|(?:(?:%(?:\*|(?:~[a-z]*(?:\$[^:]+:)?)?\d|[^%:\n\x1a]+(?::(?:~(?:-?\d+)?(?:,(?:-?\d+)?)?|(?:[^%\n\x1a^]|\^[^%\n\x1a])[^=\n\x1a]*=(?:[^%\n\x1a^]|\^[^%\n\x1a])*)?)?%))|(?:\^?![^!:\n\x1a]+(?::(?:~(?:-?\d+)?(?:,(?:-?\d+)?)?|(?:[^!\n\x1a^]|\^[^!\n\x1a])[^=\n\x1a]*=(?:[^!\n\x1a^]|\^[^!\n\x1a])*)?)?\^?!))|(?:(?:(?:\^[\n\x1a]?)?[^"\n\x1a&<>|\t\v\f\r ,;=\xa0])+))+))`, ByGroups(UsingSelf("text"), OperatorWord, UsingSelf("text")), Pop(1)},
},
"(?": {
{`(?:(?:(?:\^[\n\x1a])?[\t\v\f\r ,;=\xa0])+)`, UsingSelf("text"), nil},
{`\(`, Punctuation, Push("#pop", "else?", "root/compound")},
Default(Pop(1)),
},
"else?": {
{`(?:(?:(?:\^[\n\x1a])?[\t\v\f\r ,;=\xa0])+)`, UsingSelf("text"), nil},
{`else(?=\^?[\t\v\f\r ,;=\xa0]|[&<>|\n\x1a])`, Keyword, Pop(1)},
Default(Pop(1)),
},
},
))

View file

@ -0,0 +1,76 @@
package b
import (
. "github.com/alecthomas/chroma" // nolint
"github.com/alecthomas/chroma/lexers/internal"
)
// Bibtex lexer.
var Bibtex = internal.Register(MustNewLexer(
&Config{
Name: "BibTeX",
Aliases: []string{"bib", "bibtex"},
Filenames: []string{"*.bib"},
MimeTypes: []string{"text/x-bibtex"},
NotMultiline: true,
CaseInsensitive: true,
},
Rules{
"root": {
Include("whitespace"),
{`@comment`, Comment, nil},
{`@preamble`, NameClass, Push("closing-brace", "value", "opening-brace")},
{`@string`, NameClass, Push("closing-brace", "field", "opening-brace")},
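// Any other @entry{...}: highlight the entry type, then the citation key and its fields.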
{"@[a-z_@!$&*+\\-./:;<>?\\[\\\\\\]^`|~][\\w@!$&*+\\-./:;<>?\\[\\\\\\]^`|~]*", NameClass, Push("closing-brace", "command-body", "opening-brace")},
{`.+`, Comment, nil},
},
"opening-brace": {
Include("whitespace"),
{`[{(]`, Punctuation, Pop(1)},
},
"closing-brace": {
Include("whitespace"),
{`[})]`, Punctuation, Pop(1)},
},
"command-body": {
Include("whitespace"),
{`[^\s\,\}]+`, NameLabel, Push("#pop", "fields")},
},
"fields": {
Include("whitespace"),
{`,`, Punctuation, Push("field")},
Default(Pop(1)),
},
"field": {
Include("whitespace"),
{"[a-z_@!$&*+\\-./:;<>?\\[\\\\\\]^`|~][\\w@!$&*+\\-./:;<>?\\[\\\\\\]^`|~]*", NameAttribute, Push("value", "=")},
Default(Pop(1)),
},
"=": {
Include("whitespace"),
{`=`, Punctuation, Pop(1)},
},
"value": {
Include("whitespace"),
{"[a-z_@!$&*+\\-./:;<>?\\[\\\\\\]^`|~][\\w@!$&*+\\-./:;<>?\\[\\\\\\]^`|~]*", NameVariable, nil},
{`"`, LiteralString, Push("quoted-string")},
{`\{`, LiteralString, Push("braced-string")},
{`[\d]+`, LiteralNumber, nil},
{`#`, Punctuation, nil},
Default(Pop(1)),
},
"quoted-string": {
{`\{`, LiteralString, Push("braced-string")},
{`"`, LiteralString, Pop(1)},
{`[^\{\"]+`, LiteralString, nil},
},
"braced-string": {
{`\{`, LiteralString, Push()},
{`\}`, LiteralString, Pop(1)},
{`[^\{\}]+`, LiteralString, nil},
},
"whitespace": {
{`\s+`, Text, nil},
},
},
))

View file

@ -0,0 +1,48 @@
package b
import (
. "github.com/alecthomas/chroma" // nolint
"github.com/alecthomas/chroma/lexers/internal"
)
// Blitzbasic lexer.
var Blitzbasic = internal.Register(MustNewLexer(
&Config{
Name: "BlitzBasic",
Aliases: []string{"blitzbasic", "b3d", "bplus"},
Filenames: []string{"*.bb", "*.decls"},
MimeTypes: []string{"text/x-bb"},
CaseInsensitive: true,
},
Rules{
"root": {
{`[ \t]+`, Text, nil},
{`;.*?\n`, CommentSingle, nil},
{`"`, LiteralStringDouble, Push("string")},
{`[0-9]+\.[0-9]*(?!\.)`, LiteralNumberFloat, nil},
{`\.[0-9]+(?!\.)`, LiteralNumberFloat, nil},
{`[0-9]+`, LiteralNumberInteger, nil},
{`\$[0-9a-f]+`, LiteralNumberHex, nil},
{`\%[10]+`, LiteralNumberBin, nil},
{Words(`\b`, `\b`, `Shl`, `Shr`, `Sar`, `Mod`, `Or`, `And`, `Not`, `Abs`, `Sgn`, `Handle`, `Int`, `Float`, `Str`, `First`, `Last`, `Before`, `After`), Operator, nil},
{`([+\-*/~=<>^])`, Operator, nil},
{`[(),:\[\]\\]`, Punctuation, nil},
{`\.([ \t]*)([a-z]\w*)`, NameLabel, nil},
{`\b(New)\b([ \t]+)([a-z]\w*)`, ByGroups(KeywordReserved, Text, NameClass), nil},
{`\b(Gosub|Goto)\b([ \t]+)([a-z]\w*)`, ByGroups(KeywordReserved, Text, NameLabel), nil},
{`\b(Object)\b([ \t]*)([.])([ \t]*)([a-z]\w*)\b`, ByGroups(Operator, Text, Punctuation, Text, NameClass), nil},
{`\b([a-z]\w*)(?:([ \t]*)(@{1,2}|[#$%])|([ \t]*)([.])([ \t]*)(?:([a-z]\w*)))?\b([ \t]*)(\()`, ByGroups(NameFunction, Text, KeywordType, Text, Punctuation, Text, NameClass, Text, Punctuation), nil},
{`\b(Function)\b([ \t]+)([a-z]\w*)(?:([ \t]*)(@{1,2}|[#$%])|([ \t]*)([.])([ \t]*)(?:([a-z]\w*)))?`, ByGroups(KeywordReserved, Text, NameFunction, Text, KeywordType, Text, Punctuation, Text, NameClass), nil},
{`\b(Type)([ \t]+)([a-z]\w*)`, ByGroups(KeywordReserved, Text, NameClass), nil},
{`\b(Pi|True|False|Null)\b`, KeywordConstant, nil},
{`\b(Local|Global|Const|Field|Dim)\b`, KeywordDeclaration, nil},
{Words(`\b`, `\b`, `End`, `Return`, `Exit`, `Chr`, `Len`, `Asc`, `New`, `Delete`, `Insert`, `Include`, `Function`, `Type`, `If`, `Then`, `Else`, `ElseIf`, `EndIf`, `For`, `To`, `Next`, `Step`, `Each`, `While`, `Wend`, `Repeat`, `Until`, `Forever`, `Select`, `Case`, `Default`, `Goto`, `Gosub`, `Data`, `Read`, `Restore`), KeywordReserved, nil},
{`([a-z]\w*)(?:([ \t]*)(@{1,2}|[#$%])|([ \t]*)([.])([ \t]*)(?:([a-z]\w*)))?`, ByGroups(NameVariable, Text, KeywordType, Text, Punctuation, Text, NameClass), nil},
},
"string": {
{`""`, LiteralStringDouble, nil},
{`"C?`, LiteralStringDouble, Pop(1)},
{`[^"]+`, LiteralStringDouble, nil},
},
},
))

View file

@ -0,0 +1,24 @@
package b
import (
. "github.com/alecthomas/chroma" // nolint
"github.com/alecthomas/chroma/lexers/internal"
)
// Bnf lexer.
var Bnf = internal.Register(MustNewLexer(
&Config{
Name: "BNF",
Aliases: []string{"bnf"},
Filenames: []string{"*.bnf"},
MimeTypes: []string{"text/x-bnf"},
},
Rules{
"root": {
{`(<)([ -;=?-~]+)(>)`, ByGroups(Punctuation, NameClass, Punctuation), nil},
{`::=`, Operator, nil},
{`[^<>:]+`, Text, nil},
{`.`, Text, nil},
},
},
))

View file

@ -0,0 +1,34 @@
package b
import (
. "github.com/alecthomas/chroma" // nolint
"github.com/alecthomas/chroma/lexers/internal"
)
// Brainfuck lexer.
var Brainfuck = internal.Register(MustNewLexer(
&Config{
Name: "Brainfuck",
Aliases: []string{"brainfuck", "bf"},
Filenames: []string{"*.bf", "*.b"},
MimeTypes: []string{"application/x-brainfuck"},
},
Rules{
"common": {
{`[.,]+`, NameTag, nil},
{`[+-]+`, NameBuiltin, nil},
{`[<>]+`, NameVariable, nil},
{`[^.,+\-<>\[\]]+`, Comment, nil},
},
"root": {
{`\[`, Keyword, Push("loop")},
{`\]`, Error, nil},
Include("common"),
},
"loop": {
{`\[`, Keyword, Push()},
{`\]`, Keyword, Pop(1)},
Include("common"),
},
},
))

View file

@ -0,0 +1,91 @@
package c
import (
. "github.com/alecthomas/chroma" // nolint
"github.com/alecthomas/chroma/lexers/internal"
)
// C lexer.
var C = internal.Register(MustNewLexer(
&Config{
Name: "C",
Aliases: []string{"c"},
Filenames: []string{"*.c", "*.h", "*.idc"},
MimeTypes: []string{"text/x-chdr", "text/x-csrc"},
},
Rules{
"whitespace": {
{`^#if\s+0`, CommentPreproc, Push("if0")},
{`^#`, CommentPreproc, Push("macro")},
{`^(\s*(?:/[*].*?[*]/\s*)?)(#if\s+0)`, ByGroups(UsingSelf("root"), CommentPreproc), Push("if0")},
{`^(\s*(?:/[*].*?[*]/\s*)?)(#)`, ByGroups(UsingSelf("root"), CommentPreproc), Push("macro")},
{`\n`, Text, nil},
{`\s+`, Text, nil},
{`\\\n`, Text, nil},
{`//(\n|[\w\W]*?[^\\]\n)`, CommentSingle, nil},
{`/(\\\n)?[*][\w\W]*?[*](\\\n)?/`, CommentMultiline, nil},
{`/(\\\n)?[*][\w\W]*`, CommentMultiline, nil},
},
"statements": {
{`(L?)(")`, ByGroups(LiteralStringAffix, LiteralString), Push("string")},
{`(L?)(')(\\.|\\[0-7]{1,3}|\\x[a-fA-F0-9]{1,2}|[^\\\'\n])(')`, ByGroups(LiteralStringAffix, LiteralStringChar, LiteralStringChar, LiteralStringChar), nil},
{`(\d+\.\d*|\.\d+|\d+)[eE][+-]?\d+[LlUu]*`, LiteralNumberFloat, nil},
{`(\d+\.\d*|\.\d+|\d+[fF])[fF]?`, LiteralNumberFloat, nil},
{`0x[0-9a-fA-F]+[LlUu]*`, LiteralNumberHex, nil},
{`0[0-7]+[LlUu]*`, LiteralNumberOct, nil},
{`\d+[LlUu]*`, LiteralNumberInteger, nil},
{`\*/`, Error, nil},
{`[~!%^&*+=|?:<>/-]`, Operator, nil},
{`[()\[\],.]`, Punctuation, nil},
{Words(``, `\b`, `asm`, `auto`, `break`, `case`, `const`, `continue`, `default`, `do`, `else`, `enum`, `extern`, `for`, `goto`, `if`, `register`, `restricted`, `return`, `sizeof`, `static`, `struct`, `switch`, `typedef`, `union`, `volatile`, `while`), Keyword, nil},
{`(bool|int|long|float|short|double|char|unsigned|signed|void)\b`, KeywordType, nil},
{Words(``, `\b`, `inline`, `_inline`, `__inline`, `naked`, `restrict`, `thread`, `typename`), KeywordReserved, nil},
{`(__m(128i|128d|128|64))\b`, KeywordReserved, nil},
{Words(`__`, `\b`, `asm`, `int8`, `based`, `except`, `int16`, `stdcall`, `cdecl`, `fastcall`, `int32`, `declspec`, `finally`, `int64`, `try`, `leave`, `wchar_t`, `w64`, `unaligned`, `raise`, `noop`, `identifier`, `forceinline`, `assume`), KeywordReserved, nil},
{`(true|false|NULL)\b`, NameBuiltin, nil},
{`([a-zA-Z_]\w*)(\s*)(:)(?!:)`, ByGroups(NameLabel, Text, Punctuation), nil},
{`[a-zA-Z_]\w*`, Name, nil},
},
"root": {
Include("whitespace"),
{`((?:[\w*\s])+?(?:\s|[*]))([a-zA-Z_]\w*)(\s*\([^;]*?\))([^;{]*)(\{)`, ByGroups(UsingSelf("root"), NameFunction, UsingSelf("root"), UsingSelf("root"), Punctuation), Push("function")},
{`((?:[\w*\s])+?(?:\s|[*]))([a-zA-Z_]\w*)(\s*\([^;]*?\))([^;]*)(;)`, ByGroups(UsingSelf("root"), NameFunction, UsingSelf("root"), UsingSelf("root"), Punctuation), nil},
Default(Push("statement")),
},
"statement": {
Include("whitespace"),
Include("statements"),
{`[{}]`, Punctuation, nil},
{`;`, Punctuation, Pop(1)},
},
"function": {
Include("whitespace"),
Include("statements"),
{`;`, Punctuation, nil},
{`\{`, Punctuation, Push()},
{`\}`, Punctuation, Pop(1)},
},
"string": {
{`"`, LiteralString, Pop(1)},
{`\\([\\abfnrtv"\']|x[a-fA-F0-9]{2,4}|u[a-fA-F0-9]{4}|U[a-fA-F0-9]{8}|[0-7]{1,3})`, LiteralStringEscape, nil},
{`[^\\"\n]+`, LiteralString, nil},
{`\\\n`, LiteralString, nil},
{`\\`, LiteralString, nil},
},
"macro": {
{`(include)(\s*(?:/[*].*?[*]/\s*)?)([^\n]+)`, ByGroups(CommentPreproc, Text, CommentPreprocFile), nil},
{`[^/\n]+`, CommentPreproc, nil},
{`/[*](.|\n)*?[*]/`, CommentMultiline, nil},
{`//.*?\n`, CommentSingle, Pop(1)},
{`/`, CommentPreproc, nil},
{`(?<=\\)\n`, CommentPreproc, nil},
{`\n`, CommentPreproc, Pop(1)},
},
"if0": {
{`^\s*#if.*?(?<!\\)\n`, CommentPreproc, Push()},
{`^\s*#el(?:se|if).*\n`, CommentPreproc, Pop(1)},
{`^\s*#endif.*?(?<!\\)\n`, CommentPreproc, Pop(1)},
{`.*?\n`, Comment, nil},
},
},
))

View file

@ -0,0 +1,61 @@
package c
import (
. "github.com/alecthomas/chroma" // nolint
"github.com/alecthomas/chroma/lexers/internal"
)
// CapNProto lexer for the Cap'n Proto schema language.
var CapNProto = internal.Register(MustNewLexer(
&Config{
Name: "Cap'n Proto",
Aliases: []string{"capnp"},
Filenames: []string{"*.capnp"},
MimeTypes: []string{},
},
Rules{
"root": {
{`#.*?$`, CommentSingle, nil},
{`@[0-9a-zA-Z]*`, NameDecorator, nil},
{`=`, Literal, Push("expression")},
{`:`, NameClass, Push("type")},
{`\$`, NameAttribute, Push("annotation")},
{`(struct|enum|interface|union|import|using|const|annotation|extends|in|of|on|as|with|from|fixed)\b`, Keyword, nil},
{`[\w.]+`, Name, nil},
{`[^#@=:$\w]+`, Text, nil},
},
"type": {
{`[^][=;,(){}$]+`, NameClass, nil},
{`[[(]`, NameClass, Push("parentype")},
Default(Pop(1)),
},
"parentype": {
{`[^][;()]+`, NameClass, nil},
{`[[(]`, NameClass, Push()},
{`[])]`, NameClass, Pop(1)},
Default(Pop(1)),
},
"expression": {
{`[^][;,(){}$]+`, Literal, nil},
{`[[(]`, Literal, Push("parenexp")},
Default(Pop(1)),
},
"parenexp": {
{`[^][;()]+`, Literal, nil},
{`[[(]`, Literal, Push()},
{`[])]`, Literal, Pop(1)},
Default(Pop(1)),
},
"annotation": {
{`[^][;,(){}=:]+`, NameAttribute, nil},
{`[[(]`, NameAttribute, Push("annexp")},
Default(Pop(1)),
},
"annexp": {
{`[^][;()]+`, NameAttribute, nil},
{`[[(]`, NameAttribute, Push()},
{`[])]`, NameAttribute, Pop(1)},
Default(Pop(1)),
},
},
))

View file

@ -0,0 +1,63 @@
package c
import (
. "github.com/alecthomas/chroma" // nolint
"github.com/alecthomas/chroma/lexers/internal"
)
// Ceylon lexer.
var Ceylon = internal.Register(MustNewLexer(
&Config{
Name: "Ceylon",
Aliases: []string{"ceylon"},
Filenames: []string{"*.ceylon"},
MimeTypes: []string{"text/x-ceylon"},
DotAll: true,
},
Rules{
"root": {
{`^(\s*(?:[a-zA-Z_][\w.\[\]]*\s+)+?)([a-zA-Z_]\w*)(\s*)(\()`, ByGroups(UsingSelf("root"), NameFunction, Text, Operator), nil},
{`[^\S\n]+`, Text, nil},
{`//.*?\n`, CommentSingle, nil},
{`/\*`, CommentMultiline, Push("comment")},
{`(shared|abstract|formal|default|actual|variable|deprecated|small|late|literal|doc|by|see|throws|optional|license|tagged|final|native|annotation|sealed)\b`, NameDecorator, nil},
{`(break|case|catch|continue|else|finally|for|in|if|return|switch|this|throw|try|while|is|exists|dynamic|nonempty|then|outer|assert|let)\b`, Keyword, nil},
{`(abstracts|extends|satisfies|super|given|of|out|assign)\b`, KeywordDeclaration, nil},
{`(function|value|void|new)\b`, KeywordType, nil},
{`(assembly|module|package)(\s+)`, ByGroups(KeywordNamespace, Text), nil},
{`(true|false|null)\b`, KeywordConstant, nil},
{`(class|interface|object|alias)(\s+)`, ByGroups(KeywordDeclaration, Text), Push("class")},
{`(import)(\s+)`, ByGroups(KeywordNamespace, Text), Push("import")},
{`"(\\\\|\\"|[^"])*"`, LiteralString, nil},
{`'\\.'|'[^\\]'|'\\\{#[0-9a-fA-F]{4}\}'`, LiteralStringChar, nil},
{"\".*``.*``.*\"", LiteralStringInterpol, nil},
{`(\.)([a-z_]\w*)`, ByGroups(Operator, NameAttribute), nil},
{`[a-zA-Z_]\w*:`, NameLabel, nil},
{`[a-zA-Z_]\w*`, Name, nil},
{`[~^*!%&\[\](){}<>|+=:;,./?-]`, Operator, nil},
{`\d{1,3}(_\d{3})+\.\d{1,3}(_\d{3})+[kMGTPmunpf]?`, LiteralNumberFloat, nil},
{`\d{1,3}(_\d{3})+\.[0-9]+([eE][+-]?[0-9]+)?[kMGTPmunpf]?`, LiteralNumberFloat, nil},
{`[0-9][0-9]*\.\d{1,3}(_\d{3})+[kMGTPmunpf]?`, LiteralNumberFloat, nil},
{`[0-9][0-9]*\.[0-9]+([eE][+-]?[0-9]+)?[kMGTPmunpf]?`, LiteralNumberFloat, nil},
{`#([0-9a-fA-F]{4})(_[0-9a-fA-F]{4})+`, LiteralNumberHex, nil},
{`#[0-9a-fA-F]+`, LiteralNumberHex, nil},
{`\$([01]{4})(_[01]{4})+`, LiteralNumberBin, nil},
{`\$[01]+`, LiteralNumberBin, nil},
{`\d{1,3}(_\d{3})+[kMGTP]?`, LiteralNumberInteger, nil},
{`[0-9]+[kMGTP]?`, LiteralNumberInteger, nil},
{`\n`, Text, nil},
},
"class": {
{`[A-Za-z_]\w*`, NameClass, Pop(1)},
},
"import": {
{`[a-z][\w.]*`, NameNamespace, Pop(1)},
},
"comment": {
{`[^*/]`, CommentMultiline, nil},
{`/\*`, CommentMultiline, Push()},
{`\*/`, CommentMultiline, Pop(1)},
{`[*/]`, CommentMultiline, nil},
},
},
))

View file

@ -0,0 +1,56 @@
package c
import (
. "github.com/alecthomas/chroma" // nolint
"github.com/alecthomas/chroma/lexers/internal"
)
// Cfengine3 lexer.
var Cfengine3 = internal.Register(MustNewLexer(
&Config{
Name: "CFEngine3",
Aliases: []string{"cfengine3", "cf3"},
Filenames: []string{"*.cf"},
MimeTypes: []string{},
},
Rules{
"root": {
{`#.*?\n`, Comment, nil},
{`(body)(\s+)(\S+)(\s+)(control)`, ByGroups(Keyword, Text, Keyword, Text, Keyword), nil},
{`(body|bundle)(\s+)(\S+)(\s+)(\w+)(\()`, ByGroups(Keyword, Text, Keyword, Text, NameFunction, Punctuation), Push("arglist")},
{`(body|bundle)(\s+)(\S+)(\s+)(\w+)`, ByGroups(Keyword, Text, Keyword, Text, NameFunction), nil},
{`(")([^"]+)(")(\s+)(string|slist|int|real)(\s*)(=>)(\s*)`, ByGroups(Punctuation, NameVariable, Punctuation, Text, KeywordType, Text, Operator, Text), nil},
{`(\S+)(\s*)(=>)(\s*)`, ByGroups(KeywordReserved, Text, Operator, Text), nil},
{`"`, LiteralString, Push("string")},
{`(\w+)(\()`, ByGroups(NameFunction, Punctuation), nil},
{`([\w.!&|()]+)(::)`, ByGroups(NameClass, Punctuation), nil},
{`(\w+)(:)`, ByGroups(KeywordDeclaration, Punctuation), nil},
{`@[{(][^)}]+[})]`, NameVariable, nil},
{`[(){},;]`, Punctuation, nil},
{`=>`, Operator, nil},
{`->`, Operator, nil},
{`\d+\.\d+`, LiteralNumberFloat, nil},
{`\d+`, LiteralNumberInteger, nil},
{`\w+`, NameFunction, nil},
{`\s+`, Text, nil},
},
"string": {
{`\$[{(]`, LiteralStringInterpol, Push("interpol")},
{`\\.`, LiteralStringEscape, nil},
{`"`, LiteralString, Pop(1)},
{`\n`, LiteralString, nil},
{`.`, LiteralString, nil},
},
"interpol": {
{`\$[{(]`, LiteralStringInterpol, Push()},
{`[})]`, LiteralStringInterpol, Pop(1)},
{`[^${()}]+`, LiteralStringInterpol, nil},
},
"arglist": {
{`\)`, Punctuation, Pop(1)},
{`,`, Punctuation, nil},
{`\w+`, NameVariable, nil},
{`\s+`, Text, nil},
},
},
))

View file

@ -0,0 +1,63 @@
package c
import (
. "github.com/alecthomas/chroma" // nolint
"github.com/alecthomas/chroma/lexers/internal"
)
// Chaiscript lexer.
var Chaiscript = internal.Register(MustNewLexer(
&Config{
Name: "ChaiScript",
Aliases: []string{"chai", "chaiscript"},
Filenames: []string{"*.chai"},
MimeTypes: []string{"text/x-chaiscript", "application/x-chaiscript"},
DotAll: true,
},
Rules{
"commentsandwhitespace": {
{`\s+`, Text, nil},
{`//.*?\n`, CommentSingle, nil},
{`/\*.*?\*/`, CommentMultiline, nil},
{`^\#.*?\n`, CommentSingle, nil},
},
"slashstartsregex": {
Include("commentsandwhitespace"),
{`/(\\.|[^[/\\\n]|\[(\\.|[^\]\\\n])*])+/([gim]+\b|\B)`, LiteralStringRegex, Pop(1)},
{`(?=/)`, Text, Push("#pop", "badregex")},
Default(Pop(1)),
},
"badregex": {
{`\n`, Text, Pop(1)},
},
"root": {
Include("commentsandwhitespace"),
{`\n`, Text, nil},
{`[^\S\n]+`, Text, nil},
{`\+\+|--|~|&&|\?|:|\|\||\\(?=\n)|\.\.(<<|>>>?|==?|!=?|[-<>+*%&|^/])=?`, Operator, Push("slashstartsregex")},
{`[{(\[;,]`, Punctuation, Push("slashstartsregex")},
{`[})\].]`, Punctuation, nil},
{`[=+\-*/]`, Operator, nil},
{`(for|in|while|do|break|return|continue|if|else|throw|try|catch)\b`, Keyword, Push("slashstartsregex")},
{`(var)\b`, KeywordDeclaration, Push("slashstartsregex")},
{`(attr|def|fun)\b`, KeywordReserved, nil},
{`(true|false)\b`, KeywordConstant, nil},
{`(eval|throw)\b`, NameBuiltin, nil},
{"`\\S+`", NameBuiltin, nil},
{`[$a-zA-Z_]\w*`, NameOther, nil},
{`[0-9][0-9]*\.[0-9]+([eE][0-9]+)?[fd]?`, LiteralNumberFloat, nil},
{`0x[0-9a-fA-F]+`, LiteralNumberHex, nil},
{`[0-9]+`, LiteralNumberInteger, nil},
{`"`, LiteralStringDouble, Push("dqstring")},
{`'(\\\\|\\'|[^'])*'`, LiteralStringSingle, nil},
},
"dqstring": {
{`\$\{[^"}]+?\}`, LiteralStringInterpol, nil},
{`\$`, LiteralStringDouble, nil},
{`\\\\`, LiteralStringDouble, nil},
{`\\"`, LiteralStringDouble, nil},
{`[^\\"$]+`, LiteralStringDouble, nil},
{`"`, LiteralStringDouble, Pop(1)},
},
},
))

View file

@ -0,0 +1,37 @@
package c
import (
. "github.com/alecthomas/chroma" // nolint
"github.com/alecthomas/chroma/lexers/internal"
. "github.com/alecthomas/chroma/lexers/p" // nolint
)
// Cheetah lexer.
var Cheetah = internal.Register(MustNewLexer(
&Config{
Name: "Cheetah",
Aliases: []string{"cheetah", "spitfire"},
Filenames: []string{"*.tmpl", "*.spt"},
MimeTypes: []string{"application/x-cheetah", "application/x-spitfire"},
},
Rules{
"root": {
{`(##[^\n]*)$`, ByGroups(Comment), nil},
{`#[*](.|\n)*?[*]#`, Comment, nil},
{`#end[^#\n]*(?:#|$)`, CommentPreproc, nil},
{`#slurp$`, CommentPreproc, nil},
{`(#[a-zA-Z]+)([^#\n]*)(#|$)`, ByGroups(CommentPreproc, Using(Python), CommentPreproc), nil},
{`(\$)([a-zA-Z_][\w.]*\w)`, ByGroups(CommentPreproc, Using(Python)), nil},
{`(\$\{!?)(.*?)(\})(?s)`, ByGroups(CommentPreproc, Using(Python), CommentPreproc), nil},
{`(?sx)
(.+?) # anything, followed by:
(?:
(?=\#[#a-zA-Z]*) | # an eval comment
(?=\$[a-zA-Z_{]) | # a substitution
\Z # end of string
)
`, Other, nil},
{`\s+`, Text, nil},
},
},
))

View file

@ -0,0 +1,306 @@
package c
import (
. "github.com/alecthomas/chroma" // nolint
"github.com/alecthomas/chroma/lexers/internal"
)
var (
clBuiltinFunctions = []string{
"<", "<=", "=", ">", ">=", "-", "/", "/=", "*", "+", "1-", "1+",
"abort", "abs", "acons", "acos", "acosh", "add-method", "adjoin",
"adjustable-array-p", "adjust-array", "allocate-instance",
"alpha-char-p", "alphanumericp", "append", "apply", "apropos",
"apropos-list", "aref", "arithmetic-error-operands",
"arithmetic-error-operation", "array-dimension", "array-dimensions",
"array-displacement", "array-element-type", "array-has-fill-pointer-p",
"array-in-bounds-p", "arrayp", "array-rank", "array-row-major-index",
"array-total-size", "ash", "asin", "asinh", "assoc", "assoc-if",
"assoc-if-not", "atan", "atanh", "atom", "bit", "bit-and", "bit-andc1",
"bit-andc2", "bit-eqv", "bit-ior", "bit-nand", "bit-nor", "bit-not",
"bit-orc1", "bit-orc2", "bit-vector-p", "bit-xor", "boole",
"both-case-p", "boundp", "break", "broadcast-stream-streams",
"butlast", "byte", "byte-position", "byte-size", "caaaar", "caaadr",
"caaar", "caadar", "caaddr", "caadr", "caar", "cadaar", "cadadr",
"cadar", "caddar", "cadddr", "caddr", "cadr", "call-next-method", "car",
"cdaaar", "cdaadr", "cdaar", "cdadar", "cdaddr", "cdadr", "cdar",
"cddaar", "cddadr", "cddar", "cdddar", "cddddr", "cdddr", "cddr", "cdr",
"ceiling", "cell-error-name", "cerror", "change-class", "char", "char<",
"char<=", "char=", "char>", "char>=", "char/=", "character",
"characterp", "char-code", "char-downcase", "char-equal",
"char-greaterp", "char-int", "char-lessp", "char-name",
"char-not-equal", "char-not-greaterp", "char-not-lessp", "char-upcase",
"cis", "class-name", "class-of", "clear-input", "clear-output",
"close", "clrhash", "code-char", "coerce", "compile",
"compiled-function-p", "compile-file", "compile-file-pathname",
"compiler-macro-function", "complement", "complex", "complexp",
"compute-applicable-methods", "compute-restarts", "concatenate",
"concatenated-stream-streams", "conjugate", "cons", "consp",
"constantly", "constantp", "continue", "copy-alist", "copy-list",
"copy-pprint-dispatch", "copy-readtable", "copy-seq", "copy-structure",
"copy-symbol", "copy-tree", "cos", "cosh", "count", "count-if",
"count-if-not", "decode-float", "decode-universal-time", "delete",
"delete-duplicates", "delete-file", "delete-if", "delete-if-not",
"delete-package", "denominator", "deposit-field", "describe",
"describe-object", "digit-char", "digit-char-p", "directory",
"directory-namestring", "disassemble", "documentation", "dpb",
"dribble", "echo-stream-input-stream", "echo-stream-output-stream",
"ed", "eighth", "elt", "encode-universal-time", "endp",
"enough-namestring", "ensure-directories-exist",
"ensure-generic-function", "eq", "eql", "equal", "equalp", "error",
"eval", "evenp", "every", "exp", "export", "expt", "fboundp",
"fceiling", "fdefinition", "ffloor", "fifth", "file-author",
"file-error-pathname", "file-length", "file-namestring",
"file-position", "file-string-length", "file-write-date",
"fill", "fill-pointer", "find", "find-all-symbols", "find-class",
"find-if", "find-if-not", "find-method", "find-package", "find-restart",
"find-symbol", "finish-output", "first", "float", "float-digits",
"floatp", "float-precision", "float-radix", "float-sign", "floor",
"fmakunbound", "force-output", "format", "fourth", "fresh-line",
"fround", "ftruncate", "funcall", "function-keywords",
"function-lambda-expression", "functionp", "gcd", "gensym", "gentemp",
"get", "get-decoded-time", "get-dispatch-macro-character", "getf",
"gethash", "get-internal-real-time", "get-internal-run-time",
"get-macro-character", "get-output-stream-string", "get-properties",
"get-setf-expansion", "get-universal-time", "graphic-char-p",
"hash-table-count", "hash-table-p", "hash-table-rehash-size",
"hash-table-rehash-threshold", "hash-table-size", "hash-table-test",
"host-namestring", "identity", "imagpart", "import",
"initialize-instance", "input-stream-p", "inspect",
"integer-decode-float", "integer-length", "integerp",
"interactive-stream-p", "intern", "intersection",
"invalid-method-error", "invoke-debugger", "invoke-restart",
"invoke-restart-interactively", "isqrt", "keywordp", "last", "lcm",
"ldb", "ldb-test", "ldiff", "length", "lisp-implementation-type",
"lisp-implementation-version", "list", "list*", "list-all-packages",
"listen", "list-length", "listp", "load",
"load-logical-pathname-translations", "log", "logand", "logandc1",
"logandc2", "logbitp", "logcount", "logeqv", "logical-pathname",
"logical-pathname-translations", "logior", "lognand", "lognor",
"lognot", "logorc1", "logorc2", "logtest", "logxor", "long-site-name",
"lower-case-p", "machine-instance", "machine-type", "machine-version",
"macroexpand", "macroexpand-1", "macro-function", "make-array",
"make-broadcast-stream", "make-concatenated-stream", "make-condition",
"make-dispatch-macro-character", "make-echo-stream", "make-hash-table",
"make-instance", "make-instances-obsolete", "make-list",
"make-load-form", "make-load-form-saving-slots", "make-package",
"make-pathname", "make-random-state", "make-sequence", "make-string",
"make-string-input-stream", "make-string-output-stream", "make-symbol",
"make-synonym-stream", "make-two-way-stream", "makunbound", "map",
"mapc", "mapcan", "mapcar", "mapcon", "maphash", "map-into", "mapl",
"maplist", "mask-field", "max", "member", "member-if", "member-if-not",
"merge", "merge-pathnames", "method-combination-error",
"method-qualifiers", "min", "minusp", "mismatch", "mod",
"muffle-warning", "name-char", "namestring", "nbutlast", "nconc",
"next-method-p", "nintersection", "ninth", "no-applicable-method",
"no-next-method", "not", "notany", "notevery", "nreconc", "nreverse",
"nset-difference", "nset-exclusive-or", "nstring-capitalize",
"nstring-downcase", "nstring-upcase", "nsublis", "nsubst", "nsubst-if",
"nsubst-if-not", "nsubstitute", "nsubstitute-if", "nsubstitute-if-not",
"nth", "nthcdr", "null", "numberp", "numerator", "nunion", "oddp",
"open", "open-stream-p", "output-stream-p", "package-error-package",
"package-name", "package-nicknames", "packagep",
"package-shadowing-symbols", "package-used-by-list", "package-use-list",
"pairlis", "parse-integer", "parse-namestring", "pathname",
"pathname-device", "pathname-directory", "pathname-host",
"pathname-match-p", "pathname-name", "pathnamep", "pathname-type",
"pathname-version", "peek-char", "phase", "plusp", "position",
"position-if", "position-if-not", "pprint", "pprint-dispatch",
"pprint-fill", "pprint-indent", "pprint-linear", "pprint-newline",
"pprint-tab", "pprint-tabular", "prin1", "prin1-to-string", "princ",
"princ-to-string", "print", "print-object", "probe-file", "proclaim",
"provide", "random", "random-state-p", "rassoc", "rassoc-if",
"rassoc-if-not", "rational", "rationalize", "rationalp", "read",
"read-byte", "read-char", "read-char-no-hang", "read-delimited-list",
"read-from-string", "read-line", "read-preserving-whitespace",
"read-sequence", "readtable-case", "readtablep", "realp", "realpart",
"reduce", "reinitialize-instance", "rem", "remhash", "remove",
"remove-duplicates", "remove-if", "remove-if-not", "remove-method",
"remprop", "rename-file", "rename-package", "replace", "require",
"rest", "restart-name", "revappend", "reverse", "room", "round",
"row-major-aref", "rplaca", "rplacd", "sbit", "scale-float", "schar",
"search", "second", "set", "set-difference",
"set-dispatch-macro-character", "set-exclusive-or",
"set-macro-character", "set-pprint-dispatch", "set-syntax-from-char",
"seventh", "shadow", "shadowing-import", "shared-initialize",
"short-site-name", "signal", "signum", "simple-bit-vector-p",
"simple-condition-format-arguments", "simple-condition-format-control",
"simple-string-p", "simple-vector-p", "sin", "sinh", "sixth", "sleep",
"slot-boundp", "slot-exists-p", "slot-makunbound", "slot-missing",
"slot-unbound", "slot-value", "software-type", "software-version",
"some", "sort", "special-operator-p", "sqrt", "stable-sort",
"standard-char-p", "store-value", "stream-element-type",
"stream-error-stream", "stream-external-format", "streamp", "string",
"string<", "string<=", "string=", "string>", "string>=", "string/=",
"string-capitalize", "string-downcase", "string-equal",
"string-greaterp", "string-left-trim", "string-lessp",
"string-not-equal", "string-not-greaterp", "string-not-lessp",
"stringp", "string-right-trim", "string-trim", "string-upcase",
"sublis", "subseq", "subsetp", "subst", "subst-if", "subst-if-not",
"substitute", "substitute-if", "substitute-if-not", "subtypep", "svref",
"sxhash", "symbol-function", "symbol-name", "symbolp", "symbol-package",
"symbol-plist", "symbol-value", "synonym-stream-symbol", "syntax:",
"tailp", "tan", "tanh", "tenth", "terpri", "third",
"translate-logical-pathname", "translate-pathname", "tree-equal",
"truename", "truncate", "two-way-stream-input-stream",
"two-way-stream-output-stream", "type-error-datum",
"type-error-expected-type", "type-of", "typep", "unbound-slot-instance",
"unexport", "unintern", "union", "unread-char", "unuse-package",
"update-instance-for-different-class",
"update-instance-for-redefined-class", "upgraded-array-element-type",
"upgraded-complex-part-type", "upper-case-p", "use-package",
"user-homedir-pathname", "use-value", "values", "values-list", "vector",
"vectorp", "vector-pop", "vector-push", "vector-push-extend", "warn",
"wild-pathname-p", "write", "write-byte", "write-char", "write-line",
"write-sequence", "write-string", "write-to-string", "yes-or-no-p",
"y-or-n-p", "zerop",
}
clSpecialForms = []string{
"block", "catch", "declare", "eval-when", "flet", "function", "go", "if",
"labels", "lambda", "let", "let*", "load-time-value", "locally", "macrolet",
"multiple-value-call", "multiple-value-prog1", "progn", "progv", "quote",
"return-from", "setq", "symbol-macrolet", "tagbody", "the", "throw",
"unwind-protect",
}
clMacros = []string{
"and", "assert", "call-method", "case", "ccase", "check-type", "cond",
"ctypecase", "decf", "declaim", "defclass", "defconstant", "defgeneric",
"define-compiler-macro", "define-condition", "define-method-combination",
"define-modify-macro", "define-setf-expander", "define-symbol-macro",
"defmacro", "defmethod", "defpackage", "defparameter", "defsetf",
"defstruct", "deftype", "defun", "defvar", "destructuring-bind", "do",
"do*", "do-all-symbols", "do-external-symbols", "dolist", "do-symbols",
"dotimes", "ecase", "etypecase", "formatter", "handler-bind",
"handler-case", "ignore-errors", "incf", "in-package", "lambda", "loop",
"loop-finish", "make-method", "multiple-value-bind", "multiple-value-list",
"multiple-value-setq", "nth-value", "or", "pop",
"pprint-exit-if-list-exhausted", "pprint-logical-block", "pprint-pop",
"print-unreadable-object", "prog", "prog*", "prog1", "prog2", "psetf",
"psetq", "push", "pushnew", "remf", "restart-bind", "restart-case",
"return", "rotatef", "setf", "shiftf", "step", "time", "trace", "typecase",
"unless", "untrace", "when", "with-accessors", "with-compilation-unit",
"with-condition-restarts", "with-hash-table-iterator",
"with-input-from-string", "with-open-file", "with-open-stream",
"with-output-to-string", "with-package-iterator", "with-simple-restart",
"with-slots", "with-standard-io-syntax",
}
clLambdaListKeywords = []string{
"&allow-other-keys", "&aux", "&body", "&environment", "&key", "&optional",
"&rest", "&whole",
}
clDeclarations = []string{
"dynamic-extent", "ignore", "optimize", "ftype", "inline", "special",
"ignorable", "notinline", "type",
}
clBuiltinTypes = []string{
"atom", "boolean", "base-char", "base-string", "bignum", "bit",
"compiled-function", "extended-char", "fixnum", "keyword", "nil",
"signed-byte", "short-float", "single-float", "double-float", "long-float",
"simple-array", "simple-base-string", "simple-bit-vector", "simple-string",
"simple-vector", "standard-char", "unsigned-byte",
// Condition Types
"arithmetic-error", "cell-error", "condition", "control-error",
"division-by-zero", "end-of-file", "error", "file-error",
"floating-point-inexact", "floating-point-overflow",
"floating-point-underflow", "floating-point-invalid-operation",
"parse-error", "package-error", "print-not-readable", "program-error",
"reader-error", "serious-condition", "simple-condition", "simple-error",
"simple-type-error", "simple-warning", "stream-error", "storage-condition",
"style-warning", "type-error", "unbound-variable", "unbound-slot",
"undefined-function", "warning",
}
clBuiltinClasses = []string{
"array", "broadcast-stream", "bit-vector", "built-in-class", "character",
"class", "complex", "concatenated-stream", "cons", "echo-stream",
"file-stream", "float", "function", "generic-function", "hash-table",
"integer", "list", "logical-pathname", "method-combination", "method",
"null", "number", "package", "pathname", "ratio", "rational", "readtable",
"real", "random-state", "restart", "sequence", "standard-class",
"standard-generic-function", "standard-method", "standard-object",
"string-stream", "stream", "string", "structure-class", "structure-object",
"symbol", "synonym-stream", "t", "two-way-stream", "vector",
}
)
// Common Lisp lexer.
var CommonLisp = internal.Register(TypeRemappingLexer(MustNewLexer(
&Config{
Name: "Common Lisp",
Aliases: []string{"common-lisp", "cl", "lisp"},
Filenames: []string{"*.cl", "*.lisp"},
MimeTypes: []string{"text/x-common-lisp"},
CaseInsensitive: true,
},
Rules{
"root": {
Default(Push("body")),
},
"multiline-comment": {
{`#\|`, CommentMultiline, Push()},
{`\|#`, CommentMultiline, Pop(1)},
{`[^|#]+`, CommentMultiline, nil},
{`[|#]`, CommentMultiline, nil},
},
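// Consumes a form commented out with `#+nil (...)` (see the matching rule in
// the "body" state below), balancing nested parentheses before popping.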
"commented-form": {
{`\(`, CommentPreproc, Push()},
{`\)`, CommentPreproc, Pop(1)},
{`[^()]+`, CommentPreproc, nil},
},
"body": {
{`\s+`, Text, nil},
{`;.*$`, CommentSingle, nil},
{`#\|`, CommentMultiline, Push("multiline-comment")},
{`#\d*Y.*$`, CommentSpecial, nil},
{`"(\\.|\\\n|[^"\\])*"`, LiteralString, nil},
{`:(\|[^|]+\||(?:\\.|[\w!$%&*+-/<=>?@\[\]^{}~])(?:\\.|[\w!$%&*+-/<=>?@\[\]^{}~]|[#.:])*)`, LiteralStringSymbol, nil},
{`::(\|[^|]+\||(?:\\.|[\w!$%&*+-/<=>?@\[\]^{}~])(?:\\.|[\w!$%&*+-/<=>?@\[\]^{}~]|[#.:])*)`, LiteralStringSymbol, nil},
{`:#(\|[^|]+\||(?:\\.|[\w!$%&*+-/<=>?@\[\]^{}~])(?:\\.|[\w!$%&*+-/<=>?@\[\]^{}~]|[#.:])*)`, LiteralStringSymbol, nil},
{`'(\|[^|]+\||(?:\\.|[\w!$%&*+-/<=>?@\[\]^{}~])(?:\\.|[\w!$%&*+-/<=>?@\[\]^{}~]|[#.:])*)`, LiteralStringSymbol, nil},
{`'`, Operator, nil},
{"`", Operator, nil},
{"[-+]?\\d+\\.?(?=[ \"()\\'\\n,;`])", LiteralNumberInteger, nil},
{"[-+]?\\d+/\\d+(?=[ \"()\\'\\n,;`])", LiteralNumber, nil},
{"[-+]?(\\d*\\.\\d+([defls][-+]?\\d+)?|\\d+(\\.\\d*)?[defls][-+]?\\d+)(?=[ \"()\\'\\n,;`])", LiteralNumberFloat, nil},
{"#\\\\.(?=[ \"()\\'\\n,;`])", LiteralStringChar, nil},
{`#\\(\|[^|]+\||(?:\\.|[\w!$%&*+-/<=>?@\[\]^{}~])(?:\\.|[\w!$%&*+-/<=>?@\[\]^{}~]|[#.:])*)`, LiteralStringChar, nil},
{`#\(`, Operator, Push("body")},
{`#\d*\*[01]*`, LiteralOther, nil},
{`#:(\|[^|]+\||(?:\\.|[\w!$%&*+-/<=>?@\[\]^{}~])(?:\\.|[\w!$%&*+-/<=>?@\[\]^{}~]|[#.:])*)`, LiteralStringSymbol, nil},
{`#[.,]`, Operator, nil},
{`#\'`, NameFunction, nil},
{`#b[+-]?[01]+(/[01]+)?`, LiteralNumberBin, nil},
{`#o[+-]?[0-7]+(/[0-7]+)?`, LiteralNumberOct, nil},
{`#x[+-]?[0-9a-f]+(/[0-9a-f]+)?`, LiteralNumberHex, nil},
{`#\d+r[+-]?[0-9a-z]+(/[0-9a-z]+)?`, LiteralNumber, nil},
{`(#c)(\()`, ByGroups(LiteralNumber, Punctuation), Push("body")},
{`(#\d+a)(\()`, ByGroups(LiteralOther, Punctuation), Push("body")},
{`(#s)(\()`, ByGroups(LiteralOther, Punctuation), Push("body")},
{`#p?"(\\.|[^"])*"`, LiteralOther, nil},
{`#\d+=`, Operator, nil},
{`#\d+#`, Operator, nil},
{"#+nil(?=[ \"()\\'\\n,;`])\\s*\\(", CommentPreproc, Push("commented-form")},
{`#[+-]`, Operator, nil},
{`(,@|,|\.)`, Operator, nil},
{"(t|nil)(?=[ \"()\\'\\n,;`])", NameConstant, nil},
{`\*(\|[^|]+\||(?:\\.|[\w!$%&*+-/<=>?@\[\]^{}~])(?:\\.|[\w!$%&*+-/<=>?@\[\]^{}~]|[#.:])*)\*`, NameVariableGlobal, nil},
{`(\|[^|]+\||(?:\\.|[\w!$%&*+-/<=>?@\[\]^{}~])(?:\\.|[\w!$%&*+-/<=>?@\[\]^{}~]|[#.:])*)`, NameVariable, nil},
{`\(`, Punctuation, Push("body")},
{`\)`, Punctuation, Pop(1)},
},
},
), TypeMapping{
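// Remap generic variable tokens to more specific token types whenever the
// token text matches one of the word lists defined above.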
{NameVariable, NameFunction, clBuiltinFunctions},
{NameVariable, Keyword, clSpecialForms},
{NameVariable, NameBuiltin, clMacros},
{NameVariable, Keyword, clLambdaListKeywords},
{NameVariable, Keyword, clDeclarations},
{NameVariable, KeywordType, clBuiltinTypes},
{NameVariable, NameClass, clBuiltinClasses},
}))
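For context, a minimal illustrative sketch (not part of the upstream diff) of how a lexer registered above can be driven from Go. It follows the same Tokenise/EOF loop as the cpp_test.go file later in this commit; the sample Lisp input, the `main` wrapper, and the assumption that `CommonLisp` is exported from the lexers/c package (as the neighbouring files are) are for demonstration only.

package main

import (
	"fmt"

	"github.com/alecthomas/chroma"
	"github.com/alecthomas/chroma/lexers/c"
)

func main() {
	// Tokenise a small Common Lisp snippet and print each token until EOF.
	it, err := c.CommonLisp.Tokenise(nil, "(defun add (a b) (+ a b))")
	if err != nil {
		panic(err)
	}
	for tok := it(); tok != chroma.EOF; tok = it() {
		fmt.Printf("%-30v %q\n", tok.Type, tok.Value)
	}
}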

View file

@ -0,0 +1,38 @@
package c
import (
. "github.com/alecthomas/chroma" // nolint
"github.com/alecthomas/chroma/lexers/internal"
)
// Clojure lexer.
var Clojure = internal.Register(MustNewLexer(
&Config{
Name: "Clojure",
Aliases: []string{"clojure", "clj"},
Filenames: []string{"*.clj"},
MimeTypes: []string{"text/x-clojure", "application/x-clojure"},
},
Rules{
"root": {
{`;.*$`, CommentSingle, nil},
{`[,\s]+`, Text, nil},
{`-?\d+\.\d+`, LiteralNumberFloat, nil},
{`-?\d+`, LiteralNumberInteger, nil},
{`0x-?[abcdef\d]+`, LiteralNumberHex, nil},
{`"(\\\\|\\"|[^"])*"`, LiteralString, nil},
{`'(?!#)[\w!$%*+<=>?/.#-]+`, LiteralStringSymbol, nil},
{`\\(.|[a-z]+)`, LiteralStringChar, nil},
{`::?#?(?!#)[\w!$%*+<=>?/.#-]+`, LiteralStringSymbol, nil},
{"~@|[`\\'#^~&@]", Operator, nil},
{Words(``, ` `, `.`, `def`, `do`, `fn`, `if`, `let`, `new`, `quote`, `var`, `loop`), Keyword, nil},
{Words(``, ` `, `def-`, `defn`, `defn-`, `defmacro`, `defmulti`, `defmethod`, `defstruct`, `defonce`, `declare`, `definline`, `definterface`, `defprotocol`, `defrecord`, `deftype`, `defproject`, `ns`), KeywordDeclaration, nil},
{Words(``, ` `, `*`, `+`, `-`, `->`, `/`, `<`, `<=`, `=`, `==`, `>`, `>=`, `..`, `accessor`, `agent`, `agent-errors`, `aget`, `alength`, `all-ns`, `alter`, `and`, `append-child`, `apply`, `array-map`, `aset`, `aset-boolean`, `aset-byte`, `aset-char`, `aset-double`, `aset-float`, `aset-int`, `aset-long`, `aset-short`, `assert`, `assoc`, `await`, `await-for`, `bean`, `binding`, `bit-and`, `bit-not`, `bit-or`, `bit-shift-left`, `bit-shift-right`, `bit-xor`, `boolean`, `branch?`, `butlast`, `byte`, `cast`, `char`, `children`, `class`, `clear-agent-errors`, `comment`, `commute`, `comp`, `comparator`, `complement`, `concat`, `conj`, `cons`, `constantly`, `cond`, `if-not`, `construct-proxy`, `contains?`, `count`, `create-ns`, `create-struct`, `cycle`, `dec`, `deref`, `difference`, `disj`, `dissoc`, `distinct`, `doall`, `doc`, `dorun`, `doseq`, `dosync`, `dotimes`, `doto`, `double`, `down`, `drop`, `drop-while`, `edit`, `end?`, `ensure`, `eval`, `every?`, `false?`, `ffirst`, `file-seq`, `filter`, `find`, `find-doc`, `find-ns`, `find-var`, `first`, `float`, `flush`, `for`, `fnseq`, `frest`, `gensym`, `get-proxy-class`, `get`, `hash-map`, `hash-set`, `identical?`, `identity`, `if-let`, `import`, `in-ns`, `inc`, `index`, `insert-child`, `insert-left`, `insert-right`, `inspect-table`, `inspect-tree`, `instance?`, `int`, `interleave`, `intersection`, `into`, `into-array`, `iterate`, `join`, `key`, `keys`, `keyword`, `keyword?`, `last`, `lazy-cat`, `lazy-cons`, `left`, `lefts`, `line-seq`, `list*`, `list`, `load`, `load-file`, `locking`, `long`, `loop`, `macroexpand`, `macroexpand-1`, `make-array`, `make-node`, `map`, `map-invert`, `map?`, `mapcat`, `max`, `max-key`, `memfn`, `merge`, `merge-with`, `meta`, `min`, `min-key`, `name`, `namespace`, `neg?`, `new`, `newline`, `next`, `nil?`, `node`, `not`, `not-any?`, `not-every?`, `not=`, `ns-imports`, `ns-interns`, `ns-map`, `ns-name`, `ns-publics`, `ns-refers`, `ns-resolve`, `ns-unmap`, `nth`, `nthrest`, `or`, `parse`, `partial`, `path`, `peek`, `pop`, `pos?`, `pr`, `pr-str`, `print`, `print-str`, `println`, `println-str`, `prn`, `prn-str`, `project`, `proxy`, `proxy-mappings`, `quot`, `rand`, `rand-int`, `range`, `re-find`, `re-groups`, `re-matcher`, `re-matches`, `re-pattern`, `re-seq`, `read`, `read-line`, `reduce`, `ref`, `ref-set`, `refer`, `rem`, `remove`, `remove-method`, `remove-ns`, `rename`, `rename-keys`, `repeat`, `replace`, `replicate`, `resolve`, `rest`, `resultset-seq`, `reverse`, `rfirst`, `right`, `rights`, `root`, `rrest`, `rseq`, `second`, `select`, `select-keys`, `send`, `send-off`, `seq`, `seq-zip`, `seq?`, `set`, `short`, `slurp`, `some`, `sort`, `sort-by`, `sorted-map`, `sorted-map-by`, `sorted-set`, `special-symbol?`, `split-at`, `split-with`, `str`, `string?`, `struct`, `struct-map`, `subs`, `subvec`, `symbol`, `symbol?`, `sync`, `take`, `take-nth`, `take-while`, `test`, `time`, `to-array`, `to-array-2d`, `tree-seq`, `true?`, `union`, `up`, `update-proxy`, `val`, `vals`, `var-get`, `var-set`, `var?`, `vector`, `vector-zip`, `vector?`, `when`, `when-first`, `when-let`, `when-not`, `with-local-vars`, `with-meta`, `with-open`, `with-out-str`, `xml-seq`, `xml-zip`, `zero?`, `zipmap`, `zipper`), NameBuiltin, nil},
{`(?<=\()(?!#)[\w!$%*+<=>?/.#-]+`, NameFunction, nil},
{`(?!#)[\w!$%*+<=>?/.#-]+`, NameVariable, nil},
{`(\[|\])`, Punctuation, nil},
{`(\{|\})`, Punctuation, nil},
{`(\(|\))`, Punctuation, nil},
},
},
))

View file

@ -0,0 +1,44 @@
package c
import (
. "github.com/alecthomas/chroma" // nolint
"github.com/alecthomas/chroma/lexers/internal"
)
// Cmake lexer.
var Cmake = internal.Register(MustNewLexer(
&Config{
Name: "CMake",
Aliases: []string{"cmake"},
Filenames: []string{"*.cmake", "CMakeLists.txt"},
MimeTypes: []string{"text/x-cmake"},
},
Rules{
"root": {
{`\b(\w+)([ \t]*)(\()`, ByGroups(NameBuiltin, Text, Punctuation), Push("args")},
Include("keywords"),
Include("ws"),
},
"args": {
{`\(`, Punctuation, Push()},
{`\)`, Punctuation, Pop(1)},
{`(\$\{)(.+?)(\})`, ByGroups(Operator, NameVariable, Operator), nil},
{`(\$ENV\{)(.+?)(\})`, ByGroups(Operator, NameVariable, Operator), nil},
{`(\$<)(.+?)(>)`, ByGroups(Operator, NameVariable, Operator), nil},
{`(?s)".*?"`, LiteralStringDouble, nil},
{`\\\S+`, LiteralString, nil},
{`[^)$"# \t\n]+`, LiteralString, nil},
{`\n`, Text, nil},
Include("keywords"),
Include("ws"),
},
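// Empty state: no string-specific rules are defined for CMake here.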
"string": {},
"keywords": {
{`\b(WIN32|UNIX|APPLE|CYGWIN|BORLAND|MINGW|MSVC|MSVC_IDE|MSVC60|MSVC70|MSVC71|MSVC80|MSVC90)\b`, Keyword, nil},
},
"ws": {
{`[ \t]+`, Text, nil},
{`#.*\n`, Comment, nil},
},
},
))

View file

@ -0,0 +1,51 @@
package c
import (
. "github.com/alecthomas/chroma" // nolint
"github.com/alecthomas/chroma/lexers/internal"
)
// Cobol lexer.
var Cobol = internal.Register(MustNewLexer(
&Config{
Name: "COBOL",
Aliases: []string{"cobol"},
Filenames: []string{"*.cob", "*.COB", "*.cpy", "*.CPY"},
MimeTypes: []string{"text/x-cobol"},
CaseInsensitive: true,
},
Rules{
"root": {
Include("comment"),
Include("strings"),
Include("core"),
Include("nums"),
{`[a-z0-9]([\w\-]*[a-z0-9]+)?`, NameVariable, nil},
{`[ \t]+`, Text, nil},
},
"comment": {
{`(^.{6}[*/].*\n|^.{6}|\*>.*\n)`, Comment, nil},
},
"core": {
{`(^|(?<=[^\w\-]))(ALL\s+)?((ZEROES)|(HIGH-VALUE|LOW-VALUE|QUOTE|SPACE|ZERO)(S)?)\s*($|(?=[^\w\-]))`, NameConstant, nil},
{Words(`(^|(?<=[^\w\-]))`, `\s*($|(?=[^\w\-]))`, `ACCEPT`, `ADD`, `ALLOCATE`, `CALL`, `CANCEL`, `CLOSE`, `COMPUTE`, `CONFIGURATION`, `CONTINUE`, `DATA`, `DELETE`, `DISPLAY`, `DIVIDE`, `DIVISION`, `ELSE`, `END`, `END-ACCEPT`, `END-ADD`, `END-CALL`, `END-COMPUTE`, `END-DELETE`, `END-DISPLAY`, `END-DIVIDE`, `END-EVALUATE`, `END-IF`, `END-MULTIPLY`, `END-OF-PAGE`, `END-PERFORM`, `END-READ`, `END-RETURN`, `END-REWRITE`, `END-SEARCH`, `END-START`, `END-STRING`, `END-SUBTRACT`, `END-UNSTRING`, `END-WRITE`, `ENVIRONMENT`, `EVALUATE`, `EXIT`, `FD`, `FILE`, `FILE-CONTROL`, `FOREVER`, `FREE`, `GENERATE`, `GO`, `GOBACK`, `IDENTIFICATION`, `IF`, `INITIALIZE`, `INITIATE`, `INPUT-OUTPUT`, `INSPECT`, `INVOKE`, `I-O-CONTROL`, `LINKAGE`, `LOCAL-STORAGE`, `MERGE`, `MOVE`, `MULTIPLY`, `OPEN`, `PERFORM`, `PROCEDURE`, `PROGRAM-ID`, `RAISE`, `READ`, `RELEASE`, `RESUME`, `RETURN`, `REWRITE`, `SCREEN`, `SD`, `SEARCH`, `SECTION`, `SET`, `SORT`, `START`, `STOP`, `STRING`, `SUBTRACT`, `SUPPRESS`, `TERMINATE`, `THEN`, `UNLOCK`, `UNSTRING`, `USE`, `VALIDATE`, `WORKING-STORAGE`, `WRITE`), KeywordReserved, nil},
{Words(`(^|(?<=[^\w\-]))`, `\s*($|(?=[^\w\-]))`, `ACCESS`, `ADDRESS`, `ADVANCING`, `AFTER`, `ALL`, `ALPHABET`, `ALPHABETIC`, `ALPHABETIC-LOWER`, `ALPHABETIC-UPPER`, `ALPHANUMERIC`, `ALPHANUMERIC-EDITED`, `ALSO`, `ALTER`, `ALTERNATEANY`, `ARE`, `AREA`, `AREAS`, `ARGUMENT-NUMBER`, `ARGUMENT-VALUE`, `AS`, `ASCENDING`, `ASSIGN`, `AT`, `AUTO`, `AUTO-SKIP`, `AUTOMATIC`, `AUTOTERMINATE`, `BACKGROUND-COLOR`, `BASED`, `BEEP`, `BEFORE`, `BELL`, `BLANK`, `BLINK`, `BLOCK`, `BOTTOM`, `BY`, `BYTE-LENGTH`, `CHAINING`, `CHARACTER`, `CHARACTERS`, `CLASS`, `CODE`, `CODE-SET`, `COL`, `COLLATING`, `COLS`, `COLUMN`, `COLUMNS`, `COMMA`, `COMMAND-LINE`, `COMMIT`, `COMMON`, `CONSTANT`, `CONTAINS`, `CONTENT`, `CONTROL`, `CONTROLS`, `CONVERTING`, `COPY`, `CORR`, `CORRESPONDING`, `COUNT`, `CRT`, `CURRENCY`, `CURSOR`, `CYCLE`, `DATE`, `DAY`, `DAY-OF-WEEK`, `DE`, `DEBUGGING`, `DECIMAL-POINT`, `DECLARATIVES`, `DEFAULT`, `DELIMITED`, `DELIMITER`, `DEPENDING`, `DESCENDING`, `DETAIL`, `DISK`, `DOWN`, `DUPLICATES`, `DYNAMIC`, `EBCDIC`, `ENTRY`, `ENVIRONMENT-NAME`, `ENVIRONMENT-VALUE`, `EOL`, `EOP`, `EOS`, `ERASE`, `ERROR`, `ESCAPE`, `EXCEPTION`, `EXCLUSIVE`, `EXTEND`, `EXTERNAL`, `FILE-ID`, `FILLER`, `FINAL`, `FIRST`, `FIXED`, `FLOAT-LONG`, `FLOAT-SHORT`, `FOOTING`, `FOR`, `FOREGROUND-COLOR`, `FORMAT`, `FROM`, `FULL`, `FUNCTION`, `FUNCTION-ID`, `GIVING`, `GLOBAL`, `GROUP`, `HEADING`, `HIGHLIGHT`, `I-O`, `ID`, `IGNORE`, `IGNORING`, `IN`, `INDEX`, `INDEXED`, `INDICATE`, `INITIAL`, `INITIALIZED`, `INPUT`, `INTO`, `INTRINSIC`, `INVALID`, `IS`, `JUST`, `JUSTIFIED`, `KEY`, `LABEL`, `LAST`, `LEADING`, `LEFT`, `LENGTH`, `LIMIT`, `LIMITS`, `LINAGE`, `LINAGE-COUNTER`, `LINE`, `LINES`, `LOCALE`, `LOCK`, `LOWLIGHT`, `MANUAL`, `MEMORY`, `MINUS`, `MODE`, `MULTIPLE`, `NATIONAL`, `NATIONAL-EDITED`, `NATIVE`, `NEGATIVE`, `NEXT`, `NO`, `NULL`, `NULLS`, `NUMBER`, `NUMBERS`, `NUMERIC`, `NUMERIC-EDITED`, `OBJECT-COMPUTER`, `OCCURS`, `OF`, `OFF`, `OMITTED`, `ON`, `ONLY`, `OPTIONAL`, `ORDER`, `ORGANIZATION`, `OTHER`, `OUTPUT`, `OVERFLOW`, `OVERLINE`, `PACKED-DECIMAL`, `PADDING`, `PAGE`, `PARAGRAPH`, `PLUS`, `POINTER`, `POSITION`, `POSITIVE`, `PRESENT`, `PREVIOUS`, `PRINTER`, `PRINTING`, `PROCEDURE-POINTER`, `PROCEDURES`, `PROCEED`, `PROGRAM`, `PROGRAM-POINTER`, `PROMPT`, `QUOTE`, `QUOTES`, `RANDOM`, `RD`, `RECORD`, `RECORDING`, `RECORDS`, `RECURSIVE`, `REDEFINES`, `REEL`, `REFERENCE`, `RELATIVE`, `REMAINDER`, `REMOVAL`, `RENAMES`, `REPLACING`, `REPORT`, `REPORTING`, `REPORTS`, `REPOSITORY`, `REQUIRED`, `RESERVE`, `RETURNING`, `REVERSE-VIDEO`, `REWIND`, `RIGHT`, `ROLLBACK`, `ROUNDED`, `RUN`, `SAME`, `SCROLL`, `SECURE`, `SEGMENT-LIMIT`, `SELECT`, `SENTENCE`, `SEPARATE`, `SEQUENCE`, `SEQUENTIAL`, `SHARING`, `SIGN`, `SIGNED`, `SIGNED-INT`, `SIGNED-LONG`, `SIGNED-SHORT`, `SIZE`, `SORT-MERGE`, `SOURCE`, `SOURCE-COMPUTER`, `SPECIAL-NAMES`, `STANDARD`, `STANDARD-1`, `STANDARD-2`, `STATUS`, `SUM`, `SYMBOLIC`, `SYNC`, `SYNCHRONIZED`, `TALLYING`, `TAPE`, `TEST`, `THROUGH`, `THRU`, `TIME`, `TIMES`, `TO`, `TOP`, `TRAILING`, `TRANSFORM`, `TYPE`, `UNDERLINE`, `UNIT`, `UNSIGNED`, `UNSIGNED-INT`, `UNSIGNED-LONG`, `UNSIGNED-SHORT`, `UNTIL`, `UP`, `UPDATE`, `UPON`, `USAGE`, `USING`, `VALUE`, `VALUES`, `VARYING`, `WAIT`, `WHEN`, `WITH`, `WORDS`, `YYYYDDD`, `YYYYMMDD`), KeywordPseudo, nil},
{Words(`(^|(?<=[^\w\-]))`, `\s*($|(?=[^\w\-]))`, `ACTIVE-CLASS`, `ALIGNED`, `ANYCASE`, `ARITHMETIC`, `ATTRIBUTE`, `B-AND`, `B-NOT`, `B-OR`, `B-XOR`, `BIT`, `BOOLEAN`, `CD`, `CENTER`, `CF`, `CH`, `CHAIN`, `CLASS-ID`, `CLASSIFICATION`, `COMMUNICATION`, `CONDITION`, `DATA-POINTER`, `DESTINATION`, `DISABLE`, `EC`, `EGI`, `EMI`, `ENABLE`, `END-RECEIVE`, `ENTRY-CONVENTION`, `EO`, `ESI`, `EXCEPTION-OBJECT`, `EXPANDS`, `FACTORY`, `FLOAT-BINARY-16`, `FLOAT-BINARY-34`, `FLOAT-BINARY-7`, `FLOAT-DECIMAL-16`, `FLOAT-DECIMAL-34`, `FLOAT-EXTENDED`, `FORMAT`, `FUNCTION-POINTER`, `GET`, `GROUP-USAGE`, `IMPLEMENTS`, `INFINITY`, `INHERITS`, `INTERFACE`, `INTERFACE-ID`, `INVOKE`, `LC_ALL`, `LC_COLLATE`, `LC_CTYPE`, `LC_MESSAGES`, `LC_MONETARY`, `LC_NUMERIC`, `LC_TIME`, `LINE-COUNTER`, `MESSAGE`, `METHOD`, `METHOD-ID`, `NESTED`, `NONE`, `NORMAL`, `OBJECT`, `OBJECT-REFERENCE`, `OPTIONS`, `OVERRIDE`, `PAGE-COUNTER`, `PF`, `PH`, `PROPERTY`, `PROTOTYPE`, `PURGE`, `QUEUE`, `RAISE`, `RAISING`, `RECEIVE`, `RELATION`, `REPLACE`, `REPRESENTS-NOT-A-NUMBER`, `RESET`, `RESUME`, `RETRY`, `RF`, `RH`, `SECONDS`, `SEGMENT`, `SELF`, `SEND`, `SOURCES`, `STATEMENT`, `STEP`, `STRONG`, `SUB-QUEUE-1`, `SUB-QUEUE-2`, `SUB-QUEUE-3`, `SUPER`, `SYMBOL`, `SYSTEM-DEFAULT`, `TABLE`, `TERMINAL`, `TEXT`, `TYPEDEF`, `UCS-4`, `UNIVERSAL`, `USER-DEFAULT`, `UTF-16`, `UTF-8`, `VAL-STATUS`, `VALID`, `VALIDATE`, `VALIDATE-STATUS`), Error, nil},
{`(^|(?<=[^\w\-]))(PIC\s+.+?(?=(\s|\.\s))|PICTURE\s+.+?(?=(\s|\.\s))|(COMPUTATIONAL)(-[1-5X])?|(COMP)(-[1-5X])?|BINARY-C-LONG|BINARY-CHAR|BINARY-DOUBLE|BINARY-LONG|BINARY-SHORT|BINARY)\s*($|(?=[^\w\-]))`, KeywordType, nil},
{`(\*\*|\*|\+|-|/|<=|>=|<|>|==|/=|=)`, Operator, nil},
{`([(),;:&%.])`, Punctuation, nil},
{`(^|(?<=[^\w\-]))(ABS|ACOS|ANNUITY|ASIN|ATAN|BYTE-LENGTH|CHAR|COMBINED-DATETIME|CONCATENATE|COS|CURRENT-DATE|DATE-OF-INTEGER|DATE-TO-YYYYMMDD|DAY-OF-INTEGER|DAY-TO-YYYYDDD|EXCEPTION-(?:FILE|LOCATION|STATEMENT|STATUS)|EXP10|EXP|E|FACTORIAL|FRACTION-PART|INTEGER-OF-(?:DATE|DAY|PART)|INTEGER|LENGTH|LOCALE-(?:DATE|TIME(?:-FROM-SECONDS)?)|LOG(?:10)?|LOWER-CASE|MAX|MEAN|MEDIAN|MIDRANGE|MIN|MOD|NUMVAL(?:-C)?|ORD(?:-MAX|-MIN)?|PI|PRESENT-VALUE|RANDOM|RANGE|REM|REVERSE|SECONDS-FROM-FORMATTED-TIME|SECONDS-PAST-MIDNIGHT|SIGN|SIN|SQRT|STANDARD-DEVIATION|STORED-CHAR-LENGTH|SUBSTITUTE(?:-CASE)?|SUM|TAN|TEST-DATE-YYYYMMDD|TEST-DAY-YYYYDDD|TRIM|UPPER-CASE|VARIANCE|WHEN-COMPILED|YEAR-TO-YYYY)\s*($|(?=[^\w\-]))`, NameFunction, nil},
{`(^|(?<=[^\w\-]))(true|false)\s*($|(?=[^\w\-]))`, NameBuiltin, nil},
{`(^|(?<=[^\w\-]))(equal|equals|ne|lt|le|gt|ge|greater|less|than|not|and|or)\s*($|(?=[^\w\-]))`, OperatorWord, nil},
},
"strings": {
{`"[^"\n]*("|\n)`, LiteralStringDouble, nil},
{`'[^'\n]*('|\n)`, LiteralStringSingle, nil},
},
"nums": {
{`\d+(\s*|\.$|$)`, LiteralNumberInteger, nil},
{`[+-]?\d*\.\d+(E[-+]?\d+)?`, LiteralNumberFloat, nil},
{`[+-]?\d+\.\d*(E[-+]?\d+)?`, LiteralNumberFloat, nil},
},
},
))

View file

@ -0,0 +1,91 @@
package c
import (
. "github.com/alecthomas/chroma" // nolint
"github.com/alecthomas/chroma/lexers/internal"
)
// Coffeescript lexer.
var Coffeescript = internal.Register(MustNewLexer(
&Config{
Name: "CoffeeScript",
Aliases: []string{"coffee-script", "coffeescript", "coffee"},
Filenames: []string{"*.coffee"},
MimeTypes: []string{"text/coffeescript"},
NotMultiline: true,
DotAll: true,
},
Rules{
"commentsandwhitespace": {
{`\s+`, Text, nil},
{`###[^#].*?###`, CommentMultiline, nil},
{`#(?!##[^#]).*?\n`, CommentSingle, nil},
},
"multilineregex": {
{`[^/#]+`, LiteralStringRegex, nil},
{`///([gim]+\b|\B)`, LiteralStringRegex, Pop(1)},
{`#\{`, LiteralStringInterpol, Push("interpoling_string")},
{`[/#]`, LiteralStringRegex, nil},
},
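// Entered after tokens that can legally be followed by a regex literal, so
// that `/` is read as the start of a regex rather than division; if no regex
// follows, the Default rule simply pops back out.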
"slashstartsregex": {
Include("commentsandwhitespace"),
{`///`, LiteralStringRegex, Push("#pop", "multilineregex")},
{`/(?! )(\\.|[^[/\\\n]|\[(\\.|[^\]\\\n])*])+/([gim]+\b|\B)`, LiteralStringRegex, Pop(1)},
{`/`, Operator, nil},
Default(Pop(1)),
},
"root": {
Include("commentsandwhitespace"),
{`^(?=\s|/)`, Text, Push("slashstartsregex")},
{"\\+\\+|~|&&|\\band\\b|\\bor\\b|\\bis\\b|\\bisnt\\b|\\bnot\\b|\\?|:|\\|\\||\\\\(?=\\n)|(<<|>>>?|==?(?!>)|!=?|=(?!>)|-(?!>)|[<>+*`%&\\|\\^/])=?", Operator, Push("slashstartsregex")},
{`(?:\([^()]*\))?\s*[=-]>`, NameFunction, Push("slashstartsregex")},
{`[{(\[;,]`, Punctuation, Push("slashstartsregex")},
{`[})\].]`, Punctuation, nil},
{`(?<![.$])(for|own|in|of|while|until|loop|break|return|continue|switch|when|then|if|unless|else|throw|try|catch|finally|new|delete|typeof|instanceof|super|extends|this|class|by)\b`, Keyword, Push("slashstartsregex")},
{`(?<![.$])(true|false|yes|no|on|off|null|NaN|Infinity|undefined)\b`, KeywordConstant, nil},
{`(Array|Boolean|Date|Error|Function|Math|netscape|Number|Object|Packages|RegExp|String|sun|decodeURI|decodeURIComponent|encodeURI|encodeURIComponent|eval|isFinite|isNaN|parseFloat|parseInt|document|window)\b`, NameBuiltin, nil},
{`[$a-zA-Z_][\w.:$]*\s*[:=]\s`, NameVariable, Push("slashstartsregex")},
{`@[$a-zA-Z_][\w.:$]*\s*[:=]\s`, NameVariableInstance, Push("slashstartsregex")},
{`@`, NameOther, Push("slashstartsregex")},
{`@?[$a-zA-Z_][\w$]*`, NameOther, nil},
{`[0-9][0-9]*\.[0-9]+([eE][0-9]+)?[fd]?`, LiteralNumberFloat, nil},
{`0x[0-9a-fA-F]+`, LiteralNumberHex, nil},
{`[0-9]+`, LiteralNumberInteger, nil},
{`"""`, LiteralString, Push("tdqs")},
{`'''`, LiteralString, Push("tsqs")},
{`"`, LiteralString, Push("dqs")},
{`'`, LiteralString, Push("sqs")},
},
"strings": {
{`[^#\\\'"]+`, LiteralString, nil},
},
"interpoling_string": {
{`\}`, LiteralStringInterpol, Pop(1)},
Include("root"),
},
"dqs": {
{`"`, LiteralString, Pop(1)},
{`\\.|\'`, LiteralString, nil},
{`#\{`, LiteralStringInterpol, Push("interpoling_string")},
{`#`, LiteralString, nil},
Include("strings"),
},
"sqs": {
{`'`, LiteralString, Pop(1)},
{`#|\\.|"`, LiteralString, nil},
Include("strings"),
},
"tdqs": {
{`"""`, LiteralString, Pop(1)},
{`\\.|\'|"`, LiteralString, nil},
{`#\{`, LiteralStringInterpol, Push("interpoling_string")},
{`#`, LiteralString, nil},
Include("strings"),
},
"tsqs": {
{`'''`, LiteralString, Pop(1)},
{`#|\\.|\'|"`, LiteralString, nil},
Include("strings"),
},
},
))

View file

@ -0,0 +1,48 @@
package c
import (
. "github.com/alecthomas/chroma" // nolint
"github.com/alecthomas/chroma/lexers/internal"
)
// Cfstatement lexer (ColdFusion script statements).
var Cfstatement = internal.Register(MustNewLexer(
&Config{
Name: "cfstatement",
Aliases: []string{"cfs"},
Filenames: []string{},
MimeTypes: []string{},
NotMultiline: true,
CaseInsensitive: true,
},
Rules{
"root": {
{`//.*?\n`, CommentSingle, nil},
{`/\*(?:.|\n)*?\*/`, CommentMultiline, nil},
{`\+\+|--`, Operator, nil},
{`[-+*/^&=!]`, Operator, nil},
{`<=|>=|<|>|==`, Operator, nil},
{`mod\b`, Operator, nil},
{`(eq|lt|gt|lte|gte|not|is|and|or)\b`, Operator, nil},
{`\|\||&&`, Operator, nil},
{`\?`, Operator, nil},
{`"`, LiteralStringDouble, Push("string")},
{`'.*?'`, LiteralStringSingle, nil},
{`\d+`, LiteralNumber, nil},
{`(if|else|len|var|xml|default|break|switch|component|property|function|do|try|catch|in|continue|for|return|while|required|any|array|binary|boolean|component|date|guid|numeric|query|string|struct|uuid|case)\b`, Keyword, nil},
{`(true|false|null)\b`, KeywordConstant, nil},
{`(application|session|client|cookie|super|this|variables|arguments)\b`, NameConstant, nil},
{`([a-z_$][\w.]*)(\s*)(\()`, ByGroups(NameFunction, Text, Punctuation), nil},
{`[a-z_$][\w.]*`, NameVariable, nil},
{`[()\[\]{};:,.\\]`, Punctuation, nil},
{`\s+`, Text, nil},
},
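// Inside double-quoted strings: `""` is an escaped quote and `#...#` is
// ColdFusion expression interpolation.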
"string": {
{`""`, LiteralStringDouble, nil},
{`#.+?#`, LiteralStringInterpol, nil},
{`[^"#]+`, LiteralStringDouble, nil},
{`#`, LiteralStringDouble, nil},
{`"`, LiteralStringDouble, Pop(1)},
},
},
))

View file

@ -0,0 +1,63 @@
package c
import (
. "github.com/alecthomas/chroma" // nolint
"github.com/alecthomas/chroma/lexers/internal"
)
// Coq lexer.
var Coq = internal.Register(MustNewLexer(
&Config{
Name: "Coq",
Aliases: []string{"coq"},
Filenames: []string{"*.v"},
MimeTypes: []string{"text/x-coq"},
},
Rules{
"root": {
{`\s+`, Text, nil},
{`false|true|\(\)|\[\]`, NameBuiltinPseudo, nil},
{`\(\*`, Comment, Push("comment")},
{Words(`\b`, `\b`, `Section`, `Module`, `End`, `Require`, `Import`, `Export`, `Variable`, `Variables`, `Parameter`, `Parameters`, `Axiom`, `Hypothesis`, `Hypotheses`, `Notation`, `Local`, `Tactic`, `Reserved`, `Scope`, `Open`, `Close`, `Bind`, `Delimit`, `Definition`, `Let`, `Ltac`, `Fixpoint`, `CoFixpoint`, `Morphism`, `Relation`, `Implicit`, `Arguments`, `Set`, `Unset`, `Contextual`, `Strict`, `Prenex`, `Implicits`, `Inductive`, `CoInductive`, `Record`, `Structure`, `Canonical`, `Coercion`, `Theorem`, `Lemma`, `Corollary`, `Proposition`, `Fact`, `Remark`, `Example`, `Proof`, `Goal`, `Save`, `Qed`, `Defined`, `Hint`, `Resolve`, `Rewrite`, `View`, `Search`, `Show`, `Print`, `Printing`, `All`, `Graph`, `Projections`, `inside`, `outside`, `Check`, `Global`, `Instance`, `Class`, `Existing`, `Universe`, `Polymorphic`, `Monomorphic`, `Context`), KeywordNamespace, nil},
{Words(`\b`, `\b`, `forall`, `exists`, `exists2`, `fun`, `fix`, `cofix`, `struct`, `match`, `end`, `in`, `return`, `let`, `if`, `is`, `then`, `else`, `for`, `of`, `nosimpl`, `with`, `as`), Keyword, nil},
{Words(`\b`, `\b`, `Type`, `Prop`), KeywordType, nil},
{Words(`\b`, `\b`, `pose`, `set`, `move`, `case`, `elim`, `apply`, `clear`, `hnf`, `intro`, `intros`, `generalize`, `rename`, `pattern`, `after`, `destruct`, `induction`, `using`, `refine`, `inversion`, `injection`, `rewrite`, `congr`, `unlock`, `compute`, `ring`, `field`, `replace`, `fold`, `unfold`, `change`, `cutrewrite`, `simpl`, `have`, `suff`, `wlog`, `suffices`, `without`, `loss`, `nat_norm`, `assert`, `cut`, `trivial`, `revert`, `bool_congr`, `nat_congr`, `symmetry`, `transitivity`, `auto`, `split`, `left`, `right`, `autorewrite`, `tauto`, `setoid_rewrite`, `intuition`, `eauto`, `eapply`, `econstructor`, `etransitivity`, `constructor`, `erewrite`, `red`, `cbv`, `lazy`, `vm_compute`, `native_compute`, `subst`), Keyword, nil},
{Words(`\b`, `\b`, `by`, `done`, `exact`, `reflexivity`, `tauto`, `romega`, `omega`, `assumption`, `solve`, `contradiction`, `discriminate`, `congruence`), KeywordPseudo, nil},
{Words(`\b`, `\b`, `do`, `last`, `first`, `try`, `idtac`, `repeat`), KeywordReserved, nil},
{`\b([A-Z][\w\']*)`, Name, nil},
{"(\u03bb|\u03a0|\\|\\}|\\{\\||\\\\/|/\\\\|=>|~|\\}|\\|]|\\||\\{<|\\{|`|_|]|\\[\\||\\[>|\\[<|\\[|\\?\\?|\\?|>\\}|>]|>|=|<->|<-|<|;;|;|:>|:=|::|:|\\.\\.|\\.|->|-\\.|-|,|\\+|\\*|\\)|\\(|&&|&|#|!=)", Operator, nil},
{`([=<>@^|&+\*/$%-]|[!?~])?[!$%&*+\./:<=>?@^|~-]`, Operator, nil},
{`\b(unit|nat|bool|string|ascii|list)\b`, KeywordType, nil},
{`[^\W\d][\w']*`, Name, nil},
{`\d[\d_]*`, LiteralNumberInteger, nil},
{`0[xX][\da-fA-F][\da-fA-F_]*`, LiteralNumberHex, nil},
{`0[oO][0-7][0-7_]*`, LiteralNumberOct, nil},
{`0[bB][01][01_]*`, LiteralNumberBin, nil},
{`-?\d[\d_]*(.[\d_]*)?([eE][+\-]?\d[\d_]*)`, LiteralNumberFloat, nil},
{`'(?:(\\[\\\"'ntbr ])|(\\[0-9]{3})|(\\x[0-9a-fA-F]{2}))'`, LiteralStringChar, nil},
{`'.'`, LiteralStringChar, nil},
{`'`, Keyword, nil},
{`"`, LiteralStringDouble, Push("string")},
{`[~?][a-z][\w\']*:`, Name, nil},
},
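// Coq block comments nest: push on each `(*` and pop on the matching `*)`.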
"comment": {
{`[^(*)]+`, Comment, nil},
{`\(\*`, Comment, Push()},
{`\*\)`, Comment, Pop(1)},
{`[(*)]`, Comment, nil},
},
"string": {
{`[^"]+`, LiteralStringDouble, nil},
{`""`, LiteralStringDouble, nil},
{`"`, LiteralStringDouble, Pop(1)},
},
"dotted": {
{`\s+`, Text, nil},
{`\.`, Punctuation, nil},
{`[A-Z][\w\']*(?=\s*\.)`, NameNamespace, nil},
{`[A-Z][\w\']*`, NameClass, Pop(1)},
{`[a-z][a-z0-9_\']*`, Name, Pop(1)},
Default(Pop(1)),
},
},
))

View file

@ -0,0 +1,106 @@
package c
import (
. "github.com/alecthomas/chroma" // nolint
"github.com/alecthomas/chroma/lexers/internal"
)
// CPP lexer.
var CPP = internal.Register(MustNewLexer(
&Config{
Name: "C++",
Aliases: []string{"cpp", "c++"},
Filenames: []string{"*.cpp", "*.hpp", "*.c++", "*.h++", "*.cc", "*.hh", "*.cxx", "*.hxx", "*.C", "*.H", "*.cp", "*.CPP"},
MimeTypes: []string{"text/x-c++hdr", "text/x-c++src"},
EnsureNL: true,
},
Rules{
"statements": {
{Words(``, `\b`, `catch`, `const_cast`, `delete`, `dynamic_cast`, `explicit`, `export`, `friend`, `mutable`, `namespace`, `new`, `operator`, `private`, `protected`, `public`, `reinterpret_cast`, `restrict`, `static_cast`, `template`, `this`, `throw`, `throws`, `try`, `typeid`, `typename`, `using`, `virtual`, `constexpr`, `nullptr`, `decltype`, `thread_local`, `alignas`, `alignof`, `static_assert`, `noexcept`, `override`, `final`, `concept`, `requires`, `consteval`, `co_await`, `co_return`, `co_yield`), Keyword, nil},
{`(enum)\b(\s+)(class)\b(\s*)`, ByGroups(Keyword, Text, Keyword, Text), Push("classname")},
{`(class|struct|enum|union)\b(\s*)`, ByGroups(Keyword, Text), Push("classname")},
{`\[\[.+\]\]`, NameAttribute, nil},
{`(R)(")([^\\()\s]{,16})(\()((?:.|\n)*?)(\)\3)(")`, ByGroups(LiteralStringAffix, LiteralString, LiteralStringDelimiter, LiteralStringDelimiter, LiteralString, LiteralStringDelimiter, LiteralString), nil},
{`(u8|u|U)(")`, ByGroups(LiteralStringAffix, LiteralString), Push("string")},
{`(L?)(")`, ByGroups(LiteralStringAffix, LiteralString), Push("string")},
{`(L?)(')(\\.|\\[0-7]{1,3}|\\x[a-fA-F0-9]{1,2}|[^\\\'\n])(')`, ByGroups(LiteralStringAffix, LiteralStringChar, LiteralStringChar, LiteralStringChar), nil},
{`(\d+\.\d*|\.\d+|\d+)[eE][+-]?\d+[LlUu]*`, LiteralNumberFloat, nil},
{`(\d+\.\d*|\.\d+|\d+[fF])[fF]?`, LiteralNumberFloat, nil},
{`0[xX]([0-9A-Fa-f]('?[0-9A-Fa-f]+)*)[LlUu]*`, LiteralNumberHex, nil},
{`0('?[0-7]+)+[LlUu]*`, LiteralNumberOct, nil},
{`0[Bb][01]('?[01]+)*[LlUu]*`, LiteralNumberBin, nil},
{`[0-9]('?[0-9]+)*[LlUu]*`, LiteralNumberInteger, nil},
{`\*/`, Error, nil},
{`[~!%^&*+=|?:<>/-]`, Operator, nil},
{`[()\[\],.]`, Punctuation, nil},
{Words(``, `\b`, `asm`, `auto`, `break`, `case`, `const`, `continue`, `default`, `do`, `else`, `enum`, `extern`, `for`, `goto`, `if`, `register`, `restricted`, `return`, `sizeof`, `static`, `struct`, `switch`, `typedef`, `union`, `volatile`, `while`), Keyword, nil},
{`(bool|int|long|float|short|double|char((8|16|32)_t)?|wchar_t|unsigned|signed|void|u?int(_fast|_least|)(8|16|32|64)_t)\b`, KeywordType, nil},
{Words(``, `\b`, `inline`, `_inline`, `__inline`, `naked`, `restrict`, `thread`, `typename`), KeywordReserved, nil},
{`(__m(128i|128d|128|64))\b`, KeywordReserved, nil},
{Words(`__`, `\b`, `asm`, `int8`, `based`, `except`, `int16`, `stdcall`, `cdecl`, `fastcall`, `int32`, `declspec`, `finally`, `int64`, `try`, `leave`, `w64`, `unaligned`, `raise`, `noop`, `identifier`, `forceinline`, `assume`), KeywordReserved, nil},
{`(true|false|NULL)\b`, NameBuiltin, nil},
{`([a-zA-Z_]\w*)(\s*)(:)(?!:)`, ByGroups(NameLabel, Text, Punctuation), nil},
{`[a-zA-Z_]\w*`, Name, nil},
},
"root": {
Include("whitespace"),
{`((?:[\w*\s])+?(?:\s|[*]))([a-zA-Z_]\w*)(\s*\([^;]*?\))([^;{]*)(\{)`, ByGroups(UsingSelf("root"), NameFunction, UsingSelf("root"), UsingSelf("root"), Punctuation), Push("function")},
{`((?:[\w*\s])+?(?:\s|[*]))([a-zA-Z_]\w*)(\s*\([^;]*?\))([^;]*)(;)`, ByGroups(UsingSelf("root"), NameFunction, UsingSelf("root"), UsingSelf("root"), Punctuation), nil},
Default(Push("statement")),
{Words(`__`, `\b`, `virtual_inheritance`, `uuidof`, `super`, `single_inheritance`, `multiple_inheritance`, `interface`, `event`), KeywordReserved, nil},
{`__(offload|blockingoffload|outer)\b`, KeywordPseudo, nil},
},
"classname": {
{`(\[\[.+\]\])(\s*)`, ByGroups(NameAttribute, Text), nil},
{`[a-zA-Z_]\w*`, NameClass, Pop(1)},
{`\s*(?=[>{])`, Text, Pop(1)},
},
"whitespace": {
{`^#if\s+0`, CommentPreproc, Push("if0")},
{`^#`, CommentPreproc, Push("macro")},
{`^(\s*(?:/[*].*?[*]/\s*)?)(#if\s+0)`, ByGroups(UsingSelf("root"), CommentPreproc), Push("if0")},
{`^(\s*(?:/[*].*?[*]/\s*)?)(#)`, ByGroups(UsingSelf("root"), CommentPreproc), Push("macro")},
{`\n`, Text, nil},
{`\s+`, Text, nil},
{`\\\n`, Text, nil},
{`//(\n|[\w\W]*?[^\\]\n)`, CommentSingle, nil},
{`/(\\\n)?[*][\w\W]*?[*](\\\n)?/`, CommentMultiline, nil},
{`/(\\\n)?[*][\w\W]*`, CommentMultiline, nil},
},
"statement": {
Include("whitespace"),
Include("statements"),
{`[{]`, Punctuation, Push("root")},
{`[;}]`, Punctuation, Pop(1)},
},
"function": {
Include("whitespace"),
Include("statements"),
{`;`, Punctuation, nil},
{`\{`, Punctuation, Push()},
{`\}`, Punctuation, Pop(1)},
},
"string": {
{`"`, LiteralString, Pop(1)},
{`\\([\\abfnrtv"\']|x[a-fA-F0-9]{2,4}|u[a-fA-F0-9]{4}|U[a-fA-F0-9]{8}|[0-7]{1,3})`, LiteralStringEscape, nil},
{`[^\\"\n]+`, LiteralString, nil},
{`\\\n`, LiteralString, nil},
{`\\`, LiteralString, nil},
},
"macro": {
{`(include)(\s*(?:/[*].*?[*]/\s*)?)([^\n]+)`, ByGroups(CommentPreproc, Text, CommentPreprocFile), nil},
{`[^/\n]+`, CommentPreproc, nil},
{`/[*](.|\n)*?[*]/`, CommentMultiline, nil},
{`//.*?\n`, CommentSingle, Pop(1)},
{`/`, CommentPreproc, nil},
{`(?<=\\)\n`, CommentPreproc, nil},
{`\n`, CommentPreproc, Pop(1)},
},
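// Skips `#if 0` ... `#endif` blocks as comments, pushing on nested #if
// directives so that only the matching #endif (or #else/#elif) pops.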
"if0": {
{`^\s*#if.*?(?<!\\)\n`, CommentPreproc, Push()},
{`^\s*#el(?:se|if).*\n`, CommentPreproc, Pop(1)},
{`^\s*#endif.*?(?<!\\)\n`, CommentPreproc, Pop(1)},
{`.*?\n`, Comment, nil},
},
},
))

View file

@ -0,0 +1,27 @@
package c_test
import (
"testing"
"github.com/alecthomas/assert"
"github.com/alecthomas/chroma"
"github.com/alecthomas/chroma/lexers/c"
)
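// Regression test for chroma issue #290: the loop below simply drains the
// token iterator to check that tokenising this input terminates and that
// Tokenise itself returns no error.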
func TestIssue290(t *testing.T) {
input := `// 64-bit floats have 53 digits of precision, including the whole-number-part.
double a = 0011111110111001100110011001100110011001100110011001100110011010; // imperfect representation of 0.1
double b = 0011111111001001100110011001100110011001100110011001100110011010; // imperfect representation of 0.2
double c = 0011111111010011001100110011001100110011001100110011001100110011; // imperfect representation of 0.3
double a + b = 0011111111010011001100110011001100110011001100110011001100110100; // Note that this is not quite equal to the "canonical" 0.3!a
`
it, err := c.CPP.Tokenise(nil, input)
assert.NoError(t, err)
for {
token := it()
if token == chroma.EOF {
break
}
}
}

View file

@ -0,0 +1,69 @@
package c
import (
. "github.com/alecthomas/chroma" // nolint
"github.com/alecthomas/chroma/lexers/internal"
)
// CassandraCQL lexer.
var CassandraCQL = internal.Register(MustNewLexer(
&Config{
Name: "Cassandra CQL",
Aliases: []string{"cassandra", "cql"},
Filenames: []string{"*.cql"},
MimeTypes: []string{"text/x-cql"},
NotMultiline: true,
CaseInsensitive: true,
},
Rules{
"root": {
{`\s+`, TextWhitespace, nil},
{`(--|\/\/).*\n?`, CommentSingle, nil},
{`/\*`, CommentMultiline, Push("multiline-comments")},
{`(ascii|bigint|blob|boolean|counter|date|decimal|double|float|frozen|inet|int|list|map|set|smallint|text|time|timestamp|timeuuid|tinyint|tuple|uuid|varchar|varint)\b`, NameBuiltin, nil},
{Words(``, `\b`, `ADD`, `AGGREGATE`, `ALL`, `ALLOW`, `ALTER`, `AND`, `ANY`, `APPLY`, `AS`, `ASC`, `AUTHORIZE`, `BATCH`, `BEGIN`, `BY`, `CLUSTERING`, `COLUMNFAMILY`, `COMPACT`, `CONSISTENCY`, `COUNT`, `CREATE`, `CUSTOM`, `DELETE`, `DESC`, `DISTINCT`, `DROP`, `EACH_QUORUM`, `ENTRIES`, `EXISTS`, `FILTERING`, `FROM`, `FULL`, `GRANT`, `IF`, `IN`, `INDEX`, `INFINITY`, `INSERT`, `INTO`, `KEY`, `KEYS`, `KEYSPACE`, `KEYSPACES`, `LEVEL`, `LIMIT`, `LOCAL_ONE`, `LOCAL_QUORUM`, `MATERIALIZED`, `MODIFY`, `NAN`, `NORECURSIVE`, `NOSUPERUSER`, `NOT`, `OF`, `ON`, `ONE`, `ORDER`, `PARTITION`, `PASSWORD`, `PER`, `PERMISSION`, `PERMISSIONS`, `PRIMARY`, `QUORUM`, `RENAME`, `REVOKE`, `SCHEMA`, `SELECT`, `STATIC`, `STORAGE`, `SUPERUSER`, `TABLE`, `THREE`, `TO`, `TOKEN`, `TRUNCATE`, `TTL`, `TWO`, `TYPE`, `UNLOGGED`, `UPDATE`, `USE`, `USER`, `USERS`, `USING`, `VALUES`, `VIEW`, `WHERE`, `WITH`, `WRITETIME`, `REPLICATION`, `OR`, `REPLACE`, `FUNCTION`, `CALLED`, `INPUT`, `RETURNS`, `LANGUAGE`, `ROLE`, `ROLES`, `TRIGGER`, `DURABLE_WRITES`, `LOGIN`, `OPTIONS`, `LOGGED`, `SFUNC`, `STYPE`, `FINALFUNC`, `INITCOND`, `IS`, `CONTAINS`, `JSON`, `PAGING`, `OFF`), Keyword, nil},
{"[+*/<>=~!@#%^&|`?-]+", Operator, nil},
{`(?s)(java|javascript)(\s+)(AS)(\s+)('|\$\$)(.*?)(\5)`,
UsingByGroup(
internal.Get,
1, 6,
NameBuiltin, TextWhitespace, Keyword, TextWhitespace,
LiteralStringHeredoc, LiteralStringHeredoc, LiteralStringHeredoc,
),
nil,
},
{`(true|false|null)\b`, KeywordConstant, nil},
{`0x[0-9a-f]+`, LiteralNumberHex, nil},
{`[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}`, LiteralNumberHex, nil},
{`\.[0-9]+(e[+-]?[0-9]+)?`, Error, nil},
{`-?[0-9]+(\.[0-9])?(e[+-]?[0-9]+)?`, LiteralNumberFloat, nil},
{`[0-9]+`, LiteralNumberInteger, nil},
{`'`, LiteralStringSingle, Push("string")},
{`"`, LiteralStringName, Push("quoted-ident")},
{`\$\$`, LiteralStringHeredoc, Push("dollar-string")},
{`[a-z_]\w*`, Name, nil},
{`:(['"]?)[a-z]\w*\b\1`, NameVariable, nil},
{`[;:()\[\]\{\},.]`, Punctuation, nil},
},
"multiline-comments": {
{`/\*`, CommentMultiline, Push("multiline-comments")},
{`\*/`, CommentMultiline, Pop(1)},
{`[^/*]+`, CommentMultiline, nil},
{`[/*]`, CommentMultiline, nil},
},
"string": {
{`[^']+`, LiteralStringSingle, nil},
{`''`, LiteralStringSingle, nil},
{`'`, LiteralStringSingle, Pop(1)},
},
"quoted-ident": {
{`[^"]+`, LiteralStringName, nil},
{`""`, LiteralStringName, nil},
{`"`, LiteralStringName, Pop(1)},
},
"dollar-string": {
{`[^\$]+`, LiteralStringHeredoc, nil},
{`\$\$`, LiteralStringHeredoc, Pop(1)},
},
},
))

View file

@ -0,0 +1,262 @@
package c
import (
. "github.com/alecthomas/chroma" // nolint
"github.com/alecthomas/chroma/lexers/internal"
)
// Crystal lexer.
var Crystal = internal.Register(MustNewLexer(
&Config{
Name: "Crystal",
Aliases: []string{"cr", "crystal"},
Filenames: []string{"*.cr"},
MimeTypes: []string{"text/x-crystal"},
DotAll: true,
},
Rules{
"root": {
{`#.*?$`, CommentSingle, nil},
{Words(``, `\b`, `abstract`, `asm`, `as`, `begin`, `break`, `case`, `do`, `else`, `elsif`, `end`, `ensure`, `extend`, `ifdef`, `if`, `include`, `instance_sizeof`, `next`, `of`, `pointerof`, `private`, `protected`, `rescue`, `return`, `require`, `sizeof`, `super`, `then`, `typeof`, `unless`, `until`, `when`, `while`, `with`, `yield`), Keyword, nil},
{Words(``, `\b`, `true`, `false`, `nil`), KeywordConstant, nil},
{`(module|lib)(\s+)([a-zA-Z_]\w*(?:::[a-zA-Z_]\w*)*)`, ByGroups(Keyword, Text, NameNamespace), nil},
{`(def|fun|macro)(\s+)((?:[a-zA-Z_]\w*::)*)`, ByGroups(Keyword, Text, NameNamespace), Push("funcname")},
{"def(?=[*%&^`~+-/\\[<>=])", Keyword, Push("funcname")},
{`(class|struct|union|type|alias|enum)(\s+)((?:[a-zA-Z_]\w*::)*)`, ByGroups(Keyword, Text, NameNamespace), Push("classname")},
{`(self|out|uninitialized)\b|(is_a|responds_to)\?`, KeywordPseudo, nil},
{Words(``, `\b`, `debugger`, `record`, `pp`, `assert_responds_to`, `spawn`, `parallel`, `getter`, `setter`, `property`, `delegate`, `def_hash`, `def_equals`, `def_equals_and_hash`, `forward_missing_to`), NameBuiltinPseudo, nil},
{`getter[!?]|property[!?]|__(DIR|FILE|LINE)__\b`, NameBuiltinPseudo, nil},
{Words(`(?<!\.)`, `\b`, `Object`, `Value`, `Struct`, `Reference`, `Proc`, `Class`, `Nil`, `Symbol`, `Enum`, `Void`, `Bool`, `Number`, `Int`, `Int8`, `Int16`, `Int32`, `Int64`, `UInt8`, `UInt16`, `UInt32`, `UInt64`, `Float`, `Float32`, `Float64`, `Char`, `String`, `Pointer`, `Slice`, `Range`, `Exception`, `Regex`, `Mutex`, `StaticArray`, `Array`, `Hash`, `Set`, `Tuple`, `Deque`, `Box`, `Process`, `File`, `Dir`, `Time`, `Channel`, `Concurrent`, `Scheduler`, `abort`, `at_exit`, `caller`, `delay`, `exit`, `fork`, `future`, `get_stack_top`, `gets`, `lazy`, `loop`, `main`, `p`, `print`, `printf`, `puts`, `raise`, `rand`, `read_line`, `sleep`, `sprintf`, `system`, `with_color`), NameBuiltin, nil},
{"(?<!\\w)(<<-?)([\"`\\']?)([a-zA-Z_]\\w*)(\\2)(.*?\\n)", StringHeredoc, nil},
{`(<<-?)("|\')()(\2)(.*?\n)`, StringHeredoc, nil},
{`__END__`, CommentPreproc, Push("end-part")},
{`(?:^|(?<=[=<>~!:])|(?<=(?:\s|;)when\s)|(?<=(?:\s|;)or\s)|(?<=(?:\s|;)and\s)|(?<=\.index\s)|(?<=\.scan\s)|(?<=\.sub\s)|(?<=\.sub!\s)|(?<=\.gsub\s)|(?<=\.gsub!\s)|(?<=\.match\s)|(?<=(?:\s|;)if\s)|(?<=(?:\s|;)elsif\s)|(?<=^when\s)|(?<=^index\s)|(?<=^scan\s)|(?<=^sub\s)|(?<=^gsub\s)|(?<=^sub!\s)|(?<=^gsub!\s)|(?<=^match\s)|(?<=^if\s)|(?<=^elsif\s))(\s*)(/)`, ByGroups(Text, LiteralStringRegex), Push("multiline-regex")},
{`(?<=\(|,|\[)/`, LiteralStringRegex, Push("multiline-regex")},
{`(\s+)(/)(?![\s=])`, ByGroups(Text, LiteralStringRegex), Push("multiline-regex")},
{`(0o[0-7]+(?:_[0-7]+)*(?:_?[iu][0-9]+)?)\b(\s*)([/?])?`, ByGroups(LiteralNumberOct, Text, Operator), nil},
{`(0x[0-9A-Fa-f]+(?:_[0-9A-Fa-f]+)*(?:_?[iu][0-9]+)?)\b(\s*)([/?])?`, ByGroups(LiteralNumberHex, Text, Operator), nil},
{`(0b[01]+(?:_[01]+)*(?:_?[iu][0-9]+)?)\b(\s*)([/?])?`, ByGroups(LiteralNumberBin, Text, Operator), nil},
{`((?:0(?![0-9])|[1-9][\d_]*)(?:\.\d[\d_]*)(?:e[+-]?[0-9]+)?(?:_?f[0-9]+)?)(\s*)([/?])?`, ByGroups(LiteralNumberFloat, Text, Operator), nil},
{`((?:0(?![0-9])|[1-9][\d_]*)(?:\.\d[\d_]*)?(?:e[+-]?[0-9]+)(?:_?f[0-9]+)?)(\s*)([/?])?`, ByGroups(LiteralNumberFloat, Text, Operator), nil},
{`((?:0(?![0-9])|[1-9][\d_]*)(?:\.\d[\d_]*)?(?:e[+-]?[0-9]+)?(?:_?f[0-9]+))(\s*)([/?])?`, ByGroups(LiteralNumberFloat, Text, Operator), nil},
{`(0\b|[1-9][\d]*(?:_\d+)*(?:_?[iu][0-9]+)?)\b(\s*)([/?])?`, ByGroups(LiteralNumberInteger, Text, Operator), nil},
{`@@[a-zA-Z_]\w*`, NameVariableClass, nil},
{`@[a-zA-Z_]\w*`, NameVariableInstance, nil},
{`\$\w+`, NameVariableGlobal, nil},
{"\\$[!@&`\\'+~=/\\\\,;.<>_*$?:\"^-]", NameVariableGlobal, nil},
{`\$-[0adFiIlpvw]`, NameVariableGlobal, nil},
{`::`, Operator, nil},
Include("strings"),
{`\?(\\[MC]-)*(\\([\\befnrtv#"\']|x[a-fA-F0-9]{1,2}|[0-7]{1,3})|\S)(?!\w)`, LiteralStringChar, nil},
{`[A-Z][A-Z_]+\b`, NameConstant, nil},
{`\{%`, LiteralStringInterpol, Push("in-macro-control")},
{`\{\{`, LiteralStringInterpol, Push("in-macro-expr")},
{`(@\[)(\s*)([A-Z]\w*)`, ByGroups(Operator, Text, NameDecorator), Push("in-attr")},
{Words(`(\.|::)`, ``, `!=`, `!~`, `!`, `%`, `&&`, `&`, `**`, `*`, `+`, `-`, `/`, `<=>`, `<<`, `<=`, `<`, `===`, `==`, `=~`, `=`, `>=`, `>>`, `>`, `[]=`, `[]?`, `[]`, `^`, `||`, `|`, `~`), ByGroups(Operator, NameOperator), nil},
{"(\\.|::)([a-zA-Z_]\\w*[!?]?|[*%&^`~+\\-/\\[<>=])", ByGroups(Operator, Name), nil},
{`[a-zA-Z_]\w*(?:[!?](?!=))?`, Name, nil},
{`(\[|\]\??|\*\*|<=>?|>=|<<?|>>?|=~|===|!~|&&?|\|\||\.{1,3})`, Operator, nil},
{`[-+/*%=<>&!^|~]=?`, Operator, nil},
{`[(){};,/?:\\]`, Punctuation, nil},
{`\s+`, Text, nil},
},
"funcname": {
{"(?:([a-zA-Z_]\\w*)(\\.))?([a-zA-Z_]\\w*[!?]?|\\*\\*?|[-+]@?|[/%&|^`~]|\\[\\]=?|<<|>>|<=?>|>=?|===?)", ByGroups(NameClass, Operator, NameFunction), Pop(1)},
Default(Pop(1)),
},
"classname": {
{`[A-Z_]\w*`, NameClass, nil},
{`(\()(\s*)([A-Z_]\w*)(\s*)(\))`, ByGroups(Punctuation, Text, NameClass, Text, Punctuation), nil},
Default(Pop(1)),
},
"in-intp": {
{`\{`, LiteralStringInterpol, Push()},
{`\}`, LiteralStringInterpol, Pop(1)},
Include("root"),
},
"string-intp": {
{`#\{`, LiteralStringInterpol, Push("in-intp")},
},
"string-escaped": {
{`\\([\\befnstv#"\']|x[a-fA-F0-9]{1,2}|[0-7]{1,3})`, LiteralStringEscape, nil},
},
"string-intp-escaped": {
Include("string-intp"),
Include("string-escaped"),
},
"interpolated-regex": {
Include("string-intp"),
{`[\\#]`, LiteralStringRegex, nil},
{`[^\\#]+`, LiteralStringRegex, nil},
},
"interpolated-string": {
Include("string-intp"),
{`[\\#]`, LiteralStringOther, nil},
{`[^\\#]+`, LiteralStringOther, nil},
},
"multiline-regex": {
Include("string-intp"),
{`\\\\`, LiteralStringRegex, nil},
{`\\/`, LiteralStringRegex, nil},
{`[\\#]`, LiteralStringRegex, nil},
{`[^\\/#]+`, LiteralStringRegex, nil},
{`/[imsx]*`, LiteralStringRegex, Pop(1)},
},
"end-part": {
{`.+`, CommentPreproc, Pop(1)},
},
"in-macro-control": {
{`\{%`, LiteralStringInterpol, Push()},
{`%\}`, LiteralStringInterpol, Pop(1)},
{`for\b|in\b`, Keyword, nil},
Include("root"),
},
"in-macro-expr": {
{`\{\{`, LiteralStringInterpol, Push()},
{`\}\}`, LiteralStringInterpol, Pop(1)},
Include("root"),
},
"in-attr": {
{`\[`, Operator, Push()},
{`\]`, Operator, Pop(1)},
Include("root"),
},
"strings": {
{`\:@{0,2}[a-zA-Z_]\w*[!?]?`, LiteralStringSymbol, nil},
{Words(`\:@{0,2}`, ``, `!=`, `!~`, `!`, `%`, `&&`, `&`, `**`, `*`, `+`, `-`, `/`, `<=>`, `<<`, `<=`, `<`, `===`, `==`, `=~`, `=`, `>=`, `>>`, `>`, `[]=`, `[]?`, `[]`, `^`, `||`, `|`, `~`), LiteralStringSymbol, nil},
{`:'(\\\\|\\'|[^'])*'`, LiteralStringSymbol, nil},
{`'(\\\\|\\'|[^']|\\[^'\\]+)'`, LiteralStringChar, nil},
{`:"`, LiteralStringSymbol, Push("simple-sym")},
{`([a-zA-Z_]\w*)(:)(?!:)`, ByGroups(LiteralStringSymbol, Punctuation), nil},
{`"`, LiteralStringDouble, Push("simple-string")},
{"(?<!\\.)`", LiteralStringBacktick, Push("simple-backtick")},
{`%\{`, LiteralStringOther, Push("cb-intp-string")},
{`%[wi]\{`, LiteralStringOther, Push("cb-string")},
{`%r\{`, LiteralStringRegex, Push("cb-regex")},
{`%\[`, LiteralStringOther, Push("sb-intp-string")},
{`%[wi]\[`, LiteralStringOther, Push("sb-string")},
{`%r\[`, LiteralStringRegex, Push("sb-regex")},
{`%\(`, LiteralStringOther, Push("pa-intp-string")},
{`%[wi]\(`, LiteralStringOther, Push("pa-string")},
{`%r\(`, LiteralStringRegex, Push("pa-regex")},
{`%<`, LiteralStringOther, Push("ab-intp-string")},
{`%[wi]<`, LiteralStringOther, Push("ab-string")},
{`%r<`, LiteralStringRegex, Push("ab-regex")},
{`(%r([\W_]))((?:\\\2|(?!\2).)*)(\2[imsx]*)`, String, nil},
{`(%[wi]([\W_]))((?:\\\2|(?!\2).)*)(\2)`, String, nil},
{`(?<=[-+/*%=<>&!^|~,(])(\s*)(%([\t ])(?:(?:\\\3|(?!\3).)*)\3)`, ByGroups(Text, LiteralStringOther, None), nil},
{`^(\s*)(%([\t ])(?:(?:\\\3|(?!\3).)*)\3)`, ByGroups(Text, LiteralStringOther, None), nil},
{`(%([\[{(<]))((?:\\\2|(?!\2).)*)(\2)`, String, nil},
},
"simple-string": {
Include("string-intp-escaped"),
{`[^\\"#]+`, LiteralStringDouble, nil},
{`[\\#]`, LiteralStringDouble, nil},
{`"`, LiteralStringDouble, Pop(1)},
},
"simple-sym": {
Include("string-escaped"),
{`[^\\"#]+`, LiteralStringSymbol, nil},
{`[\\#]`, LiteralStringSymbol, nil},
{`"`, LiteralStringSymbol, Pop(1)},
},
"simple-backtick": {
Include("string-intp-escaped"),
{"[^\\\\`#]+", LiteralStringBacktick, nil},
{`[\\#]`, LiteralStringBacktick, nil},
{"`", LiteralStringBacktick, Pop(1)},
},
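// The cb-/sb-/pa-/ab- states below lex %-literals delimited by curly braces,
// square brackets, parentheses and angle brackets respectively; the -intp
// variants allow #{...} interpolation and escapes, and the -regex variants
// accept trailing imsx flags on the closing delimiter.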
"cb-intp-string": {
{`\\[\{]`, LiteralStringOther, nil},
{`\{`, LiteralStringOther, Push()},
{`\}`, LiteralStringOther, Pop(1)},
Include("string-intp-escaped"),
{`[\\#{}]`, LiteralStringOther, nil},
{`[^\\#{}]+`, LiteralStringOther, nil},
},
"cb-string": {
{`\\[\\{}]`, LiteralStringOther, nil},
{`\{`, LiteralStringOther, Push()},
{`\}`, LiteralStringOther, Pop(1)},
{`[\\#{}]`, LiteralStringOther, nil},
{`[^\\#{}]+`, LiteralStringOther, nil},
},
"cb-regex": {
{`\\[\\{}]`, LiteralStringRegex, nil},
{`\{`, LiteralStringRegex, Push()},
{`\}[imsx]*`, LiteralStringRegex, Pop(1)},
Include("string-intp"),
{`[\\#{}]`, LiteralStringRegex, nil},
{`[^\\#{}]+`, LiteralStringRegex, nil},
},
"sb-intp-string": {
{`\\[\[]`, LiteralStringOther, nil},
{`\[`, LiteralStringOther, Push()},
{`\]`, LiteralStringOther, Pop(1)},
Include("string-intp-escaped"),
{`[\\#\[\]]`, LiteralStringOther, nil},
{`[^\\#\[\]]+`, LiteralStringOther, nil},
},
"sb-string": {
{`\\[\\\[\]]`, LiteralStringOther, nil},
{`\[`, LiteralStringOther, Push()},
{`\]`, LiteralStringOther, Pop(1)},
{`[\\#\[\]]`, LiteralStringOther, nil},
{`[^\\#\[\]]+`, LiteralStringOther, nil},
},
"sb-regex": {
{`\\[\\\[\]]`, LiteralStringRegex, nil},
{`\[`, LiteralStringRegex, Push()},
{`\][imsx]*`, LiteralStringRegex, Pop(1)},
Include("string-intp"),
{`[\\#\[\]]`, LiteralStringRegex, nil},
{`[^\\#\[\]]+`, LiteralStringRegex, nil},
},
"pa-intp-string": {
{`\\[\(]`, LiteralStringOther, nil},
{`\(`, LiteralStringOther, Push()},
{`\)`, LiteralStringOther, Pop(1)},
Include("string-intp-escaped"),
{`[\\#()]`, LiteralStringOther, nil},
{`[^\\#()]+`, LiteralStringOther, nil},
},
"pa-string": {
{`\\[\\()]`, LiteralStringOther, nil},
{`\(`, LiteralStringOther, Push()},
{`\)`, LiteralStringOther, Pop(1)},
{`[\\#()]`, LiteralStringOther, nil},
{`[^\\#()]+`, LiteralStringOther, nil},
},
"pa-regex": {
{`\\[\\()]`, LiteralStringRegex, nil},
{`\(`, LiteralStringRegex, Push()},
{`\)[imsx]*`, LiteralStringRegex, Pop(1)},
Include("string-intp"),
{`[\\#()]`, LiteralStringRegex, nil},
{`[^\\#()]+`, LiteralStringRegex, nil},
},
"ab-intp-string": {
{`\\[<]`, LiteralStringOther, nil},
{`<`, LiteralStringOther, Push()},
{`>`, LiteralStringOther, Pop(1)},
Include("string-intp-escaped"),
{`[\\#<>]`, LiteralStringOther, nil},
{`[^\\#<>]+`, LiteralStringOther, nil},
},
"ab-string": {
{`\\[\\<>]`, LiteralStringOther, nil},
{`<`, LiteralStringOther, Push()},
{`>`, LiteralStringOther, Pop(1)},
{`[\\#<>]`, LiteralStringOther, nil},
{`[^\\#<>]+`, LiteralStringOther, nil},
},
"ab-regex": {
{`\\[\\<>]`, LiteralStringRegex, nil},
{`<`, LiteralStringRegex, Push()},
{`>[imsx]*`, LiteralStringRegex, Pop(1)},
Include("string-intp"),
{`[\\#<>]`, LiteralStringRegex, nil},
{`[^\\#<>]+`, LiteralStringRegex, nil},
},
},
))

View file

@ -0,0 +1,51 @@
package c
import (
. "github.com/alecthomas/chroma" // nolint
"github.com/alecthomas/chroma/lexers/internal"
)
// CSharp lexer.
var CSharp = internal.Register(MustNewLexer(
&Config{
Name: "C#",
Aliases: []string{"csharp", "c#"},
Filenames: []string{"*.cs"},
MimeTypes: []string{"text/x-csharp"},
DotAll: true,
EnsureNL: true,
},
Rules{
"root": {
{`^\s*\[.*?\]`, NameAttribute, nil},
{`[^\S\n]+`, Text, nil},
{`\\\n`, Text, nil},
{`//.*?\n`, CommentSingle, nil},
{`/[*].*?[*]/`, CommentMultiline, nil},
{`\n`, Text, nil},
{`[~!%^&*()+=|\[\]:;,.<>/?-]`, Punctuation, nil},
{`[{}]`, Punctuation, nil},
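// Verbatim (@"..."), interpolated ($"..." / $@"..."), and ordinary string
// literals; inside the first two forms `""` stands for an escaped quote.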
{`@"(""|[^"])*"`, LiteralString, nil},
{`\$@?"(""|[^"])*"`, LiteralString, nil},
{`"(\\\\|\\"|[^"\n])*["\n]`, LiteralString, nil},
{`'\\.'|'[^\\]'`, LiteralStringChar, nil},
{`[0-9](\.[0-9]*)?([eE][+-][0-9]+)?[flFLdD]?|0[xX][0-9a-fA-F]+[Ll]?`, LiteralNumber, nil},
{`#[ \t]*(if|endif|else|elif|define|undef|line|error|warning|region|endregion|pragma)\b.*?\n`, CommentPreproc, nil},
{`\b(extern)(\s+)(alias)\b`, ByGroups(Keyword, Text, Keyword), nil},
{`(abstract|as|async|await|base|break|by|case|catch|checked|const|continue|default|delegate|do|else|enum|event|explicit|extern|false|finally|fixed|for|foreach|goto|if|implicit|in|interface|internal|is|let|lock|new|null|on|operator|out|override|params|private|protected|public|readonly|ref|return|sealed|sizeof|stackalloc|static|switch|this|throw|true|try|typeof|unchecked|unsafe|virtual|void|while|get|set|new|partial|yield|add|remove|value|alias|ascending|descending|from|group|into|orderby|select|thenby|where|join|equals)\b`, Keyword, nil},
{`(global)(::)`, ByGroups(Keyword, Punctuation), nil},
{`(bool|byte|char|decimal|double|dynamic|float|int|long|object|sbyte|short|string|uint|ulong|ushort|var)\b\??`, KeywordType, nil},
{`(class|struct)(\s+)`, ByGroups(Keyword, Text), Push("class")},
{`(namespace|using)(\s+)`, ByGroups(Keyword, Text), Push("namespace")},
{`@?[_a-zA-Z]\w*`, Name, nil},
},
"class": {
{`@?[_a-zA-Z]\w*`, NameClass, Pop(1)},
Default(Pop(1)),
},
"namespace": {
{`(?=\()`, Text, Pop(1)},
{`(@?[_a-zA-Z]\w*|\.)+`, NameNamespace, Pop(1)},
},
},
))

File diff suppressed because one or more lines are too long

View file

@ -0,0 +1,135 @@
package c
import (
. "github.com/alecthomas/chroma" // nolint
"github.com/alecthomas/chroma/lexers/internal"
)
// Cython lexer.
var Cython = internal.Register(MustNewLexer(
&Config{
Name: "Cython",
Aliases: []string{"cython", "pyx", "pyrex"},
Filenames: []string{"*.pyx", "*.pxd", "*.pxi"},
MimeTypes: []string{"text/x-cython", "application/x-cython"},
},
Rules{
"root": {
{`\n`, Text, nil},
{`^(\s*)("""(?:.|\n)*?""")`, ByGroups(Text, LiteralStringDoc), nil},
{`^(\s*)('''(?:.|\n)*?''')`, ByGroups(Text, LiteralStringDoc), nil},
{`[^\S\n]+`, Text, nil},
{`#.*$`, Comment, nil},
{`[]{}:(),;[]`, Punctuation, nil},
{`\\\n`, Text, nil},
{`\\`, Text, nil},
{`(in|is|and|or|not)\b`, OperatorWord, nil},
{`(<)([a-zA-Z0-9.?]+)(>)`, ByGroups(Punctuation, KeywordType, Punctuation), nil},
{`!=|==|<<|>>|[-~+/*%=<>&^|.?]`, Operator, nil},
{`(from)(\d+)(<=)(\s+)(<)(\d+)(:)`, ByGroups(Keyword, LiteralNumberInteger, Operator, Name, Operator, Name, Punctuation), nil},
Include("keywords"),
{`(def|property)(\s+)`, ByGroups(Keyword, Text), Push("funcname")},
{`(cp?def)(\s+)`, ByGroups(Keyword, Text), Push("cdef")},
{`(cdef)(:)`, ByGroups(Keyword, Punctuation), nil},
{`(class|struct)(\s+)`, ByGroups(Keyword, Text), Push("classname")},
{`(from)(\s+)`, ByGroups(Keyword, Text), Push("fromimport")},
{`(c?import)(\s+)`, ByGroups(Keyword, Text), Push("import")},
Include("builtins"),
Include("backtick"),
{`(?:[rR]|[uU][rR]|[rR][uU])"""`, LiteralString, Push("tdqs")},
{`(?:[rR]|[uU][rR]|[rR][uU])'''`, LiteralString, Push("tsqs")},
{`(?:[rR]|[uU][rR]|[rR][uU])"`, LiteralString, Push("dqs")},
{`(?:[rR]|[uU][rR]|[rR][uU])'`, LiteralString, Push("sqs")},
{`[uU]?"""`, LiteralString, Combined("stringescape", "tdqs")},
{`[uU]?'''`, LiteralString, Combined("stringescape", "tsqs")},
{`[uU]?"`, LiteralString, Combined("stringescape", "dqs")},
{`[uU]?'`, LiteralString, Combined("stringescape", "sqs")},
Include("name"),
Include("numbers"),
},
"keywords": {
{Words(``, `\b`, `assert`, `break`, `by`, `continue`, `ctypedef`, `del`, `elif`, `else`, `except`, `except?`, `exec`, `finally`, `for`, `fused`, `gil`, `global`, `if`, `include`, `lambda`, `nogil`, `pass`, `print`, `raise`, `return`, `try`, `while`, `yield`, `as`, `with`), Keyword, nil},
{`(DEF|IF|ELIF|ELSE)\b`, CommentPreproc, nil},
},
"builtins": {
{Words(`(?<!\.)`, `\b`, `__import__`, `abs`, `all`, `any`, `apply`, `basestring`, `bin`, `bool`, `buffer`, `bytearray`, `bytes`, `callable`, `chr`, `classmethod`, `cmp`, `coerce`, `compile`, `complex`, `delattr`, `dict`, `dir`, `divmod`, `enumerate`, `eval`, `execfile`, `exit`, `file`, `filter`, `float`, `frozenset`, `getattr`, `globals`, `hasattr`, `hash`, `hex`, `id`, `input`, `int`, `intern`, `isinstance`, `issubclass`, `iter`, `len`, `list`, `locals`, `long`, `map`, `max`, `min`, `next`, `object`, `oct`, `open`, `ord`, `pow`, `property`, `range`, `raw_input`, `reduce`, `reload`, `repr`, `reversed`, `round`, `set`, `setattr`, `slice`, `sorted`, `staticmethod`, `str`, `sum`, `super`, `tuple`, `type`, `unichr`, `unicode`, `unsigned`, `vars`, `xrange`, `zip`), NameBuiltin, nil},
{`(?<!\.)(self|None|Ellipsis|NotImplemented|False|True|NULL)\b`, NameBuiltinPseudo, nil},
{Words(`(?<!\.)`, `\b`, `ArithmeticError`, `AssertionError`, `AttributeError`, `BaseException`, `DeprecationWarning`, `EOFError`, `EnvironmentError`, `Exception`, `FloatingPointError`, `FutureWarning`, `GeneratorExit`, `IOError`, `ImportError`, `ImportWarning`, `IndentationError`, `IndexError`, `KeyError`, `KeyboardInterrupt`, `LookupError`, `MemoryError`, `NameError`, `NotImplemented`, `NotImplementedError`, `OSError`, `OverflowError`, `OverflowWarning`, `PendingDeprecationWarning`, `ReferenceError`, `RuntimeError`, `RuntimeWarning`, `StandardError`, `StopIteration`, `SyntaxError`, `SyntaxWarning`, `SystemError`, `SystemExit`, `TabError`, `TypeError`, `UnboundLocalError`, `UnicodeDecodeError`, `UnicodeEncodeError`, `UnicodeError`, `UnicodeTranslateError`, `UnicodeWarning`, `UserWarning`, `ValueError`, `Warning`, `ZeroDivisionError`), NameException, nil},
},
"numbers": {
{`(\d+\.?\d*|\d*\.\d+)([eE][+-]?[0-9]+)?`, LiteralNumberFloat, nil},
{`0\d+`, LiteralNumberOct, nil},
{`0[xX][a-fA-F0-9]+`, LiteralNumberHex, nil},
{`\d+L`, LiteralNumberIntegerLong, nil},
{`\d+`, LiteralNumberInteger, nil},
},
"backtick": {
{"`.*?`", LiteralStringBacktick, nil},
},
"name": {
{`@\w+`, NameDecorator, nil},
{`[a-zA-Z_]\w*`, Name, nil},
},
"funcname": {
{`[a-zA-Z_]\w*`, NameFunction, Pop(1)},
},
"cdef": {
{`(public|readonly|extern|api|inline)\b`, KeywordReserved, nil},
{`(struct|enum|union|class)\b`, Keyword, nil},
{`([a-zA-Z_]\w*)(\s*)(?=[(:#=]|$)`, ByGroups(NameFunction, Text), Pop(1)},
{`([a-zA-Z_]\w*)(\s*)(,)`, ByGroups(NameFunction, Text, Punctuation), nil},
{`from\b`, Keyword, Pop(1)},
{`as\b`, Keyword, nil},
{`:`, Punctuation, Pop(1)},
{`(?=["\'])`, Text, Pop(1)},
{`[a-zA-Z_]\w*`, KeywordType, nil},
{`.`, Text, nil},
},
"classname": {
{`[a-zA-Z_]\w*`, NameClass, Pop(1)},
},
"import": {
{`(\s+)(as)(\s+)`, ByGroups(Text, Keyword, Text), nil},
{`[a-zA-Z_][\w.]*`, NameNamespace, nil},
{`(\s*)(,)(\s*)`, ByGroups(Text, Operator, Text), nil},
Default(Pop(1)),
},
"fromimport": {
{`(\s+)(c?import)\b`, ByGroups(Text, Keyword), Pop(1)},
{`[a-zA-Z_.][\w.]*`, NameNamespace, nil},
Default(Pop(1)),
},
"stringescape": {
{`\\([\\abfnrtv"\']|\n|N\{.*?\}|u[a-fA-F0-9]{4}|U[a-fA-F0-9]{8}|x[a-fA-F0-9]{2}|[0-7]{1,3})`, LiteralStringEscape, nil},
},
"strings": {
{`%(\([a-zA-Z0-9]+\))?[-#0 +]*([0-9]+|[*])?(\.([0-9]+|[*]))?[hlL]?[E-GXc-giorsux%]`, LiteralStringInterpol, nil},
{`[^\\\'"%\n]+`, LiteralString, nil},
{`[\'"\\]`, LiteralString, nil},
{`%`, LiteralString, nil},
},
"nl": {
{`\n`, LiteralString, nil},
},
"dqs": {
{`"`, LiteralString, Pop(1)},
{`\\\\|\\"|\\\n`, LiteralStringEscape, nil},
Include("strings"),
},
"sqs": {
{`'`, LiteralString, Pop(1)},
{`\\\\|\\'|\\\n`, LiteralStringEscape, nil},
Include("strings"),
},
"tdqs": {
{`"""`, LiteralString, Pop(1)},
Include("strings"),
Include("nl"),
},
"tsqs": {
{`'''`, LiteralString, Pop(1)},
Include("strings"),
Include("nl"),
},
},
))
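
The Aliases and Filenames fields above are what drive lexer selection. A short sketch of the two lookup paths, assuming the chroma v1 lexers helpers Match (glob against Filenames) and Get (name or alias); the filename is hypothetical:

package main

import (
	"fmt"

	"github.com/alecthomas/chroma/lexers"
)

func main() {
	// Filename-based selection: "helpers.pyx" matches the "*.pyx" glob above.
	if byName := lexers.Match("helpers.pyx"); byName != nil {
		fmt.Println("matched by filename:", byName.Config().Name)
	}
	// Alias-based selection: any of "cython", "pyx" or "pyrex" resolves here.
	if byAlias := lexers.Get("pyrex"); byAlias != nil {
		fmt.Println("matched by alias:", byAlias.Config().Name)
	}
}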


@ -0,0 +1,2 @@
// Package circular exists to break circular dependencies between lexers.
package circular


@ -0,0 +1,91 @@
package circular
import (
"strings"
. "github.com/alecthomas/chroma" // nolint
"github.com/alecthomas/chroma/lexers/h"
"github.com/alecthomas/chroma/lexers/internal"
)
// PHP lexer.
var PHP = internal.Register(DelegatingLexer(h.HTML, MustNewLexer(
&Config{
Name: "PHP",
Aliases: []string{"php", "php3", "php4", "php5"},
Filenames: []string{"*.php", "*.php[345]", "*.inc"},
MimeTypes: []string{"text/x-php"},
DotAll: true,
CaseInsensitive: true,
EnsureNL: true,
},
Rules{
"root": {
{`<\?(php)?`, CommentPreproc, Push("php")},
{`[^<]+`, Other, nil},
{`<`, Other, nil},
},
"php": {
{`\?>`, CommentPreproc, Pop(1)},
{`(<<<)([\'"]?)((?:[\\_a-z]|[^\x00-\x7f])(?:[\\\w]|[^\x00-\x7f])*)(\2\n.*?\n\s*)(\3)(;?)(\n)`, ByGroups(LiteralString, LiteralString, LiteralStringDelimiter, LiteralString, LiteralStringDelimiter, Punctuation, Text), nil},
{`\s+`, Text, nil},
{`#.*?\n`, CommentSingle, nil},
{`//.*?\n`, CommentSingle, nil},
{`/\*\*/`, CommentMultiline, nil},
{`/\*\*.*?\*/`, LiteralStringDoc, nil},
{`/\*.*?\*/`, CommentMultiline, nil},
{`(->|::)(\s*)((?:[\\_a-z]|[^\x00-\x7f])(?:[\\\w]|[^\x00-\x7f])*)`, ByGroups(Operator, Text, NameAttribute), nil},
{`[~!%^&*+=|:.<>/@-]+`, Operator, nil},
{`\?`, Operator, nil},
{`[\[\]{}();,]+`, Punctuation, nil},
{`(class)(\s+)`, ByGroups(Keyword, Text), Push("classname")},
{`(function)(\s*)(?=\()`, ByGroups(Keyword, Text), nil},
{`(function)(\s+)(&?)(\s*)`, ByGroups(Keyword, Text, Operator, Text), Push("functionname")},
{`(const)(\s+)((?:[\\_a-z]|[^\x00-\x7f])(?:[\\\w]|[^\x00-\x7f])*)`, ByGroups(Keyword, Text, NameConstant), nil},
{`(and|E_PARSE|old_function|E_ERROR|or|as|E_WARNING|parent|eval|PHP_OS|break|exit|case|extends|PHP_VERSION|cfunction|FALSE|print|for|require|continue|foreach|require_once|declare|return|default|static|do|switch|die|stdClass|echo|else|TRUE|elseif|var|empty|if|xor|enddeclare|include|virtual|endfor|include_once|while|endforeach|global|endif|list|endswitch|new|endwhile|not|array|E_ALL|NULL|final|php_user_filter|interface|implements|public|private|protected|abstract|clone|try|catch|throw|this|use|namespace|trait|yield|finally)\b`, Keyword, nil},
{`(true|false|null)\b`, KeywordConstant, nil},
Include("magicconstants"),
{`\$\{\$+(?:[\\_a-z]|[^\x00-\x7f])(?:[\\\w]|[^\x00-\x7f])*\}`, NameVariable, nil},
{`\$+(?:[\\_a-z]|[^\x00-\x7f])(?:[\\\w]|[^\x00-\x7f])*`, NameVariable, nil},
{`(?:[\\_a-z]|[^\x00-\x7f])(?:[\\\w]|[^\x00-\x7f])*`, NameOther, nil},
{`(\d+\.\d*|\d*\.\d+)(e[+-]?[0-9]+)?`, LiteralNumberFloat, nil},
{`\d+e[+-]?[0-9]+`, LiteralNumberFloat, nil},
{`0[0-7]+`, LiteralNumberOct, nil},
{`0x[a-f0-9]+`, LiteralNumberHex, nil},
{`\d+`, LiteralNumberInteger, nil},
{`0b[01]+`, LiteralNumberBin, nil},
{`'([^'\\]*(?:\\.[^'\\]*)*)'`, LiteralStringSingle, nil},
{"`([^`\\\\]*(?:\\\\.[^`\\\\]*)*)`", LiteralStringBacktick, nil},
{`"`, LiteralStringDouble, Push("string")},
},
"magicfuncs": {
{Words(``, `\b`, `__construct`, `__destruct`, `__call`, `__callStatic`, `__get`, `__set`, `__isset`, `__unset`, `__sleep`, `__wakeup`, `__toString`, `__invoke`, `__set_state`, `__clone`, `__debugInfo`), NameFunctionMagic, nil},
},
"magicconstants": {
{Words(``, `\b`, `__LINE__`, `__FILE__`, `__DIR__`, `__FUNCTION__`, `__CLASS__`, `__TRAIT__`, `__METHOD__`, `__NAMESPACE__`), NameConstant, nil},
},
"classname": {
{`(?:[\\_a-z]|[^\x00-\x7f])(?:[\\\w]|[^\x00-\x7f])*`, NameClass, Pop(1)},
},
"functionname": {
Include("magicfuncs"),
{`(?:[\\_a-z]|[^\x00-\x7f])(?:[\\\w]|[^\x00-\x7f])*`, NameFunction, Pop(1)},
Default(Pop(1)),
},
"string": {
{`"`, LiteralStringDouble, Pop(1)},
{`[^{$"\\]+`, LiteralStringDouble, nil},
{`\\([nrt"$\\]|[0-7]{1,3}|x[0-9a-f]{1,2})`, LiteralStringEscape, nil},
{`\$(?:[\\_a-z]|[^\x00-\x7f])(?:[\\\w]|[^\x00-\x7f])*(\[\S+?\]|->(?:[\\_a-z]|[^\x00-\x7f])(?:[\\\w]|[^\x00-\x7f])*)?`, LiteralStringInterpol, nil},
{`(\{\$\{)(.*?)(\}\})`, ByGroups(LiteralStringInterpol, UsingSelf("root"), LiteralStringInterpol), nil},
{`(\{)(\$.*?)(\})`, ByGroups(LiteralStringInterpol, UsingSelf("root"), LiteralStringInterpol), nil},
{`(\$\{)(\S+)(\})`, ByGroups(LiteralStringInterpol, NameVariable, LiteralStringInterpol), nil},
{`[${\\]`, LiteralStringDouble, nil},
},
},
).SetAnalyser(func(text string) float32 {
if strings.Contains(text, "<?php") {
return 0.5
}
return 0.0
})))
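
The SetAnalyser hook above is chroma's content-based detection: the closure scores text containing "<?php" at 0.5 and everything else at 0.0. A sketch of reading such a score through the chroma.Analyser interface; the type assertion is deliberately guarded because a delegating lexer like this one may not expose the analyser directly:

package main

import (
	"fmt"

	"github.com/alecthomas/chroma"
	"github.com/alecthomas/chroma/lexers"
)

func main() {
	src := "<?php echo strtoupper('hello'); ?>"
	lexer := lexers.Get("php")
	// Lexers that define an analyser satisfy chroma.Analyser; a non-zero
	// score suggests the text is written in that language.
	if analyser, ok := lexer.(chroma.Analyser); ok {
		fmt.Printf("php score: %.1f\n", analyser.AnalyseText(src))
	} else {
		fmt.Println("analyser not exposed on this lexer")
	}
}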


@ -0,0 +1,69 @@
package d
import (
. "github.com/alecthomas/chroma" // nolint
"github.com/alecthomas/chroma/lexers/internal"
)
// D lexer. https://dlang.org/spec/lex.html
var D = internal.Register(MustNewLexer(
&Config{
Name: "D",
Aliases: []string{"d"},
Filenames: []string{"*.d", "*.di"},
MimeTypes: []string{"text/x-d"},
EnsureNL: true,
},
Rules{
"root": {
{`[^\S\n]+`, Text, nil},
// https://dlang.org/spec/lex.html#comment
{`//.*?\n`, CommentSingle, nil},
{`/\*.*?\*/`, CommentMultiline, nil},
{`/\+.*?\+/`, CommentMultiline, nil},
// https://dlang.org/spec/lex.html#keywords
{`(asm|assert|body|break|case|cast|catch|continue|default|debug|delete|deprecated|do|else|finally|for|foreach|foreach_reverse|goto|if|in|invariant|is|macro|mixin|new|out|pragma|return|super|switch|this|throw|try|version|while|with)\b`, Keyword, nil},
{`__(FILE|FILE_FULL_PATH|MODULE|LINE|FUNCTION|PRETTY_FUNCTION|DATE|EOF|TIME|TIMESTAMP|VENDOR|VERSION)__\b`, NameBuiltin, nil},
{`__(traits|vector|parameters)\b`, NameBuiltin, nil},
{`((?:(?:[^\W\d]|\$)[\w.\[\]$<>]*\s+)+?)((?:[^\W\d]|\$)[\w$]*)(\s*)(\()`, ByGroups(UsingSelf("root"), NameFunction, Text, Operator), nil},
// https://dlang.org/spec/attribute.html#uda
{`@[\w.]*`, NameDecorator, nil},
{`(abstract|auto|alias|align|const|delegate|enum|export|final|function|inout|lazy|nothrow|override|package|private|protected|public|pure|static|synchronized|template|volatile|__gshared)\b`, KeywordDeclaration, nil},
// https://dlang.org/spec/type.html#basic-data-types
{`(void|bool|byte|ubyte|short|ushort|int|uint|long|ulong|cent|ucent|float|double|real|ifloat|idouble|ireal|cfloat|cdouble|creal|char|wchar|dchar|string|wstring|dstring)\b`, KeywordType, nil},
{`(module)(\s+)`, ByGroups(KeywordNamespace, Text), Push("import")},
{`(true|false|null)\b`, KeywordConstant, nil},
{`(class|interface|struct|template|union)(\s+)`, ByGroups(KeywordDeclaration, Text), Push("class")},
{`(import)(\s+)`, ByGroups(KeywordNamespace, Text), Push("import")},
// https://dlang.org/spec/lex.html#string_literals
// TODO support delimited strings
{`[qr]?"(\\\\|\\"|[^"])*"[cwd]?`, LiteralString, nil},
{"(`)([^`]*)(`)[cwd]?", LiteralString, nil},
{`'\\.'|'[^\\]'|'\\u[0-9a-fA-F]{4}'`, LiteralStringChar, nil},
{`(\.)((?:[^\W\d]|\$)[\w$]*)`, ByGroups(Operator, NameAttribute), nil},
{`^\s*([^\W\d]|\$)[\w$]*:`, NameLabel, nil},
// https://dlang.org/spec/lex.html#floatliteral
{`([0-9][0-9_]*\.([0-9][0-9_]*)?|\.[0-9][0-9_]*)([eE][+\-]?[0-9][0-9_]*)?[fFL]?i?|[0-9][eE][+\-]?[0-9][0-9_]*[fFL]?|[0-9]([eE][+\-]?[0-9][0-9_]*)?[fFL]|0[xX]([0-9a-fA-F][0-9a-fA-F_]*\.?|([0-9a-fA-F][0-9a-fA-F_]*)?\.[0-9a-fA-F][0-9a-fA-F_]*)[pP][+\-]?[0-9][0-9_]*[fFL]?`, LiteralNumberFloat, nil},
// https://dlang.org/spec/lex.html#integerliteral
{`0[xX][0-9a-fA-F][0-9a-fA-F_]*[lL]?`, LiteralNumberHex, nil},
{`0[bB][01][01_]*[lL]?`, LiteralNumberBin, nil},
{`0[0-7_]+[lL]?`, LiteralNumberOct, nil},
{`0|[1-9][0-9_]*[lL]?`, LiteralNumberInteger, nil},
{`([~^*!%&\[\](){}<>|+=:;,./?-]|q{)`, Operator, nil},
{`([^\W\d]|\$)[\w$]*`, Name, nil},
{`\n`, Text, nil},
},
"class": {
{`([^\W\d]|\$)[\w$]*`, NameClass, Pop(1)},
},
"import": {
{`[\w.]+\*?`, NameNamespace, Pop(1)},
},
},
))


@ -0,0 +1,91 @@
package d
import (
. "github.com/alecthomas/chroma" // nolint
"github.com/alecthomas/chroma/lexers/internal"
)
// Dart lexer.
var Dart = internal.Register(MustNewLexer(
&Config{
Name: "Dart",
Aliases: []string{"dart"},
Filenames: []string{"*.dart"},
MimeTypes: []string{"text/x-dart"},
DotAll: true,
},
Rules{
"root": {
Include("string_literal"),
{`#!(.*?)$`, CommentPreproc, nil},
{`\b(import|export)\b`, Keyword, Push("import_decl")},
{`\b(library|source|part of|part)\b`, Keyword, nil},
{`[^\S\n]+`, Text, nil},
{`//.*?\n`, CommentSingle, nil},
{`/\*.*?\*/`, CommentMultiline, nil},
{`\b(class)\b(\s+)`, ByGroups(KeywordDeclaration, Text), Push("class")},
{`\b(assert|break|case|catch|continue|default|do|else|finally|for|if|in|is|new|return|super|switch|this|throw|try|while)\b`, Keyword, nil},
{`\b(abstract|async|await|const|extends|factory|final|get|implements|native|operator|set|static|sync|typedef|var|with|yield)\b`, KeywordDeclaration, nil},
{`\b(bool|double|dynamic|int|num|Object|String|void)\b`, KeywordType, nil},
{`\b(false|null|true)\b`, KeywordConstant, nil},
{`[~!%^&*+=|?:<>/-]|as\b`, Operator, nil},
{`[a-zA-Z_$]\w*:`, NameLabel, nil},
{`[a-zA-Z_$]\w*`, Name, nil},
{`[(){}\[\],.;]`, Punctuation, nil},
{`0[xX][0-9a-fA-F]+`, LiteralNumberHex, nil},
{`\d+(\.\d*)?([eE][+-]?\d+)?`, LiteralNumber, nil},
{`\.\d+([eE][+-]?\d+)?`, LiteralNumber, nil},
{`\n`, Text, nil},
},
"class": {
{`[a-zA-Z_$]\w*`, NameClass, Pop(1)},
},
"import_decl": {
Include("string_literal"),
{`\s+`, Text, nil},
{`\b(as|show|hide)\b`, Keyword, nil},
{`[a-zA-Z_$]\w*`, Name, nil},
{`\,`, Punctuation, nil},
{`\;`, Punctuation, Pop(1)},
},
"string_literal": {
{`r"""([\w\W]*?)"""`, LiteralStringDouble, nil},
{`r'''([\w\W]*?)'''`, LiteralStringSingle, nil},
{`r"(.*?)"`, LiteralStringDouble, nil},
{`r'(.*?)'`, LiteralStringSingle, nil},
{`"""`, LiteralStringDouble, Push("string_double_multiline")},
{`'''`, LiteralStringSingle, Push("string_single_multiline")},
{`"`, LiteralStringDouble, Push("string_double")},
{`'`, LiteralStringSingle, Push("string_single")},
},
"string_common": {
{`\\(x[0-9A-Fa-f]{2}|u[0-9A-Fa-f]{4}|u\{[0-9A-Fa-f]*\}|[a-z'\"$\\])`, LiteralStringEscape, nil},
{`(\$)([a-zA-Z_]\w*)`, ByGroups(LiteralStringInterpol, Name), nil},
{`(\$\{)(.*?)(\})`, ByGroups(LiteralStringInterpol, UsingSelf("root"), LiteralStringInterpol), nil},
},
"string_double": {
{`"`, LiteralStringDouble, Pop(1)},
{`[^"$\\\n]+`, LiteralStringDouble, nil},
Include("string_common"),
{`\$+`, LiteralStringDouble, nil},
},
"string_double_multiline": {
{`"""`, LiteralStringDouble, Pop(1)},
{`[^"$\\]+`, LiteralStringDouble, nil},
Include("string_common"),
{`(\$|\")+`, LiteralStringDouble, nil},
},
"string_single": {
{`'`, LiteralStringSingle, Pop(1)},
{`[^'$\\\n]+`, LiteralStringSingle, nil},
Include("string_common"),
{`\$+`, LiteralStringSingle, nil},
},
"string_single_multiline": {
{`'''`, LiteralStringSingle, Pop(1)},
{`[^\'$\\]+`, LiteralStringSingle, nil},
Include("string_common"),
{`(\$|\')+`, LiteralStringSingle, nil},
},
},
))


@ -0,0 +1,29 @@
package d
import (
. "github.com/alecthomas/chroma" // nolint
"github.com/alecthomas/chroma/lexers/internal"
)
// Diff lexer.
var Diff = internal.Register(MustNewLexer(
&Config{
Name: "Diff",
Aliases: []string{"diff", "udiff"},
EnsureNL: true,
Filenames: []string{"*.diff", "*.patch"},
MimeTypes: []string{"text/x-diff", "text/x-patch"},
},
Rules{
"root": {
{` .*\n`, Text, nil},
{`\+.*\n`, GenericInserted, nil},
{`-.*\n`, GenericDeleted, nil},
{`!.*\n`, GenericStrong, nil},
{`@.*\n`, GenericSubheading, nil},
{`([Ii]ndex|diff).*\n`, GenericHeading, nil},
{`=.*\n`, GenericHeading, nil},
{`.*\n`, Text, nil},
},
},
))


@ -0,0 +1,53 @@
package d
import (
. "github.com/alecthomas/chroma" // nolint
"github.com/alecthomas/chroma/lexers/internal"
)
// Django/Jinja lexer.
var DjangoJinja = internal.Register(MustNewLexer(
&Config{
Name: "Django/Jinja",
Aliases: []string{"django", "jinja"},
Filenames: []string{},
MimeTypes: []string{"application/x-django-templating", "application/x-jinja"},
DotAll: true,
},
Rules{
"root": {
{`[^{]+`, Other, nil},
{`\{\{`, CommentPreproc, Push("var")},
{`\{[*#].*?[*#]\}`, Comment, nil},
{`(\{%)(-?\s*)(comment)(\s*-?)(%\})(.*?)(\{%)(-?\s*)(endcomment)(\s*-?)(%\})`, ByGroups(CommentPreproc, Text, Keyword, Text, CommentPreproc, Comment, CommentPreproc, Text, Keyword, Text, CommentPreproc), nil},
{`(\{%)(-?\s*)(raw)(\s*-?)(%\})(.*?)(\{%)(-?\s*)(endraw)(\s*-?)(%\})`, ByGroups(CommentPreproc, Text, Keyword, Text, CommentPreproc, Text, CommentPreproc, Text, Keyword, Text, CommentPreproc), nil},
{`(\{%)(-?\s*)(filter)(\s+)([a-zA-Z_]\w*)`, ByGroups(CommentPreproc, Text, Keyword, Text, NameFunction), Push("block")},
{`(\{%)(-?\s*)([a-zA-Z_]\w*)`, ByGroups(CommentPreproc, Text, Keyword), Push("block")},
{`\{`, Other, nil},
},
"varnames": {
{`(\|)(\s*)([a-zA-Z_]\w*)`, ByGroups(Operator, Text, NameFunction), nil},
{`(is)(\s+)(not)?(\s+)?([a-zA-Z_]\w*)`, ByGroups(Keyword, Text, Keyword, Text, NameFunction), nil},
{`(_|true|false|none|True|False|None)\b`, KeywordPseudo, nil},
{`(in|as|reversed|recursive|not|and|or|is|if|else|import|with(?:(?:out)?\s*context)?|scoped|ignore\s+missing)\b`, Keyword, nil},
{`(loop|block|super|forloop)\b`, NameBuiltin, nil},
{`[a-zA-Z_][\w-]*`, NameVariable, nil},
{`\.\w+`, NameVariable, nil},
{`:?"(\\\\|\\"|[^"])*"`, LiteralStringDouble, nil},
{`:?'(\\\\|\\'|[^'])*'`, LiteralStringSingle, nil},
{`([{}()\[\]+\-*/,:~]|[><=]=?)`, Operator, nil},
{`[0-9](\.[0-9]*)?(eE[+-][0-9])?[flFLdD]?|0[xX][0-9a-fA-F]+[Ll]?`, LiteralNumber, nil},
},
"var": {
{`\s+`, Text, nil},
{`(-?)(\}\})`, ByGroups(Text, CommentPreproc), Pop(1)},
Include("varnames"),
},
"block": {
{`\s+`, Text, nil},
{`(-?)(%\})`, ByGroups(Text, CommentPreproc), Pop(1)},
Include("varnames"),
{`.`, Punctuation, nil},
},
},
))


@ -0,0 +1,31 @@
package d
import (
. "github.com/alecthomas/chroma" // nolint
"github.com/alecthomas/chroma/lexers/b"
"github.com/alecthomas/chroma/lexers/internal"
"github.com/alecthomas/chroma/lexers/j"
)
// Docker lexer.
var Docker = internal.Register(MustNewLexer(
&Config{
Name: "Docker",
Aliases: []string{"docker", "dockerfile"},
Filenames: []string{"Dockerfile", "*.docker"},
MimeTypes: []string{"text/x-dockerfile-config"},
CaseInsensitive: true,
},
Rules{
"root": {
{`#.*`, Comment, nil},
{`(ONBUILD)((?:\s*\\?\s*))`, ByGroups(Keyword, Using(b.Bash)), nil},
{`(HEALTHCHECK)(((?:\s*\\?\s*)--\w+=\w+(?:\s*\\?\s*))*)`, ByGroups(Keyword, Using(b.Bash)), nil},
{`(VOLUME|ENTRYPOINT|CMD|SHELL)((?:\s*\\?\s*))(\[.*?\])`, ByGroups(Keyword, Using(b.Bash), Using(j.JSON)), nil},
{`(LABEL|ENV|ARG)((?:(?:\s*\\?\s*)\w+=\w+(?:\s*\\?\s*))*)`, ByGroups(Keyword, Using(b.Bash)), nil},
{`((?:FROM|MAINTAINER|EXPOSE|WORKDIR|USER|STOPSIGNAL)|VOLUME)\b(.*)`, ByGroups(Keyword, LiteralString), nil},
{`((?:RUN|CMD|ENTRYPOINT|ENV|ARG|LABEL|ADD|COPY))`, Keyword, nil},
{`(.*\\\n)*.+`, Using(b.Bash), nil},
},
},
))
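
The Docker rules above delegate RUN/CMD payloads to other registered lexers via Using(b.Bash) and Using(j.JSON), so one Tokenise call produces a mixed token stream. A sketch of the longer-form pipeline that the quick helper wraps, assuming the stock formatters and styles packages; the Dockerfile source is only an example:

package main

import (
	"log"
	"os"

	"github.com/alecthomas/chroma"
	"github.com/alecthomas/chroma/formatters"
	"github.com/alecthomas/chroma/lexers"
	"github.com/alecthomas/chroma/styles"
)

func main() {
	src := "FROM alpine:3.12\nRUN apk add --no-cache curl\nCMD [\"curl\", \"--version\"]\n"

	lexer := lexers.Get("docker")
	if lexer == nil {
		lexer = lexers.Fallback
	}
	// Coalesce merges adjacent tokens of the same type to shrink the stream.
	lexer = chroma.Coalesce(lexer)

	style := styles.Get("monokai")
	if style == nil {
		style = styles.Fallback
	}
	formatter := formatters.Get("terminal")
	if formatter == nil {
		formatter = formatters.Fallback
	}

	iterator, err := lexer.Tokenise(nil, src)
	if err != nil {
		log.Fatal(err)
	}
	if err := formatter.Format(os.Stdout, style, iterator); err != nil {
		log.Fatal(err)
	}
}

The same steps apply to any lexer in this tree; only the Get argument changes.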


@ -0,0 +1,69 @@
package d
import (
. "github.com/alecthomas/chroma" // nolint
"github.com/alecthomas/chroma/lexers/internal"
)
// Dtd lexer.
var Dtd = internal.Register(MustNewLexer(
&Config{
Name: "DTD",
Aliases: []string{"dtd"},
Filenames: []string{"*.dtd"},
MimeTypes: []string{"application/xml-dtd"},
DotAll: true,
},
Rules{
"root": {
Include("common"),
{`(<!ELEMENT)(\s+)(\S+)`, ByGroups(Keyword, Text, NameTag), Push("element")},
{`(<!ATTLIST)(\s+)(\S+)`, ByGroups(Keyword, Text, NameTag), Push("attlist")},
{`(<!ENTITY)(\s+)(\S+)`, ByGroups(Keyword, Text, NameEntity), Push("entity")},
{`(<!NOTATION)(\s+)(\S+)`, ByGroups(Keyword, Text, NameTag), Push("notation")},
{`(<!\[)([^\[\s]+)(\s*)(\[)`, ByGroups(Keyword, NameEntity, Text, Keyword), nil},
{`(<!DOCTYPE)(\s+)([^>\s]+)`, ByGroups(Keyword, Text, NameTag), nil},
{`PUBLIC|SYSTEM`, KeywordConstant, nil},
{`[\[\]>]`, Keyword, nil},
},
"common": {
{`\s+`, Text, nil},
{`(%|&)[^;]*;`, NameEntity, nil},
{`<!--`, Comment, Push("comment")},
{`[(|)*,?+]`, Operator, nil},
{`"[^"]*"`, LiteralStringDouble, nil},
{`\'[^\']*\'`, LiteralStringSingle, nil},
},
"comment": {
{`[^-]+`, Comment, nil},
{`-->`, Comment, Pop(1)},
{`-`, Comment, nil},
},
"element": {
Include("common"),
{`EMPTY|ANY|#PCDATA`, KeywordConstant, nil},
{`[^>\s|()?+*,]+`, NameTag, nil},
{`>`, Keyword, Pop(1)},
},
"attlist": {
Include("common"),
{`CDATA|IDREFS|IDREF|ID|NMTOKENS|NMTOKEN|ENTITIES|ENTITY|NOTATION`, KeywordConstant, nil},
{`#REQUIRED|#IMPLIED|#FIXED`, KeywordConstant, nil},
{`xml:space|xml:lang`, KeywordReserved, nil},
{`[^>\s|()?+*,]+`, NameAttribute, nil},
{`>`, Keyword, Pop(1)},
},
"entity": {
Include("common"),
{`SYSTEM|PUBLIC|NDATA`, KeywordConstant, nil},
{`[^>\s|()?+*,]+`, NameEntity, nil},
{`>`, Keyword, Pop(1)},
},
"notation": {
Include("common"),
{`SYSTEM|PUBLIC`, KeywordConstant, nil},
{`[^>\s|()?+*,]+`, NameAttribute, nil},
{`>`, Keyword, Pop(1)},
},
},
))


@ -0,0 +1,51 @@
package e
import (
. "github.com/alecthomas/chroma" // nolint
"github.com/alecthomas/chroma/lexers/internal"
)
// Ebnf lexer.
var Ebnf = internal.Register(MustNewLexer(
&Config{
Name: "EBNF",
Aliases: []string{"ebnf"},
Filenames: []string{"*.ebnf"},
MimeTypes: []string{"text/x-ebnf"},
},
Rules{
"root": {
Include("whitespace"),
Include("comment_start"),
Include("identifier"),
{`=`, Operator, Push("production")},
},
"production": {
Include("whitespace"),
Include("comment_start"),
Include("identifier"),
{`"[^"]*"`, LiteralStringDouble, nil},
{`'[^']*'`, LiteralStringSingle, nil},
{`(\?[^?]*\?)`, NameEntity, nil},
{`[\[\]{}(),|]`, Punctuation, nil},
{`-`, Operator, nil},
{`;`, Punctuation, Pop(1)},
{`\.`, Punctuation, Pop(1)},
},
"whitespace": {
{`\s+`, Text, nil},
},
"comment_start": {
{`\(\*`, CommentMultiline, Push("comment")},
},
"comment": {
{`[^*)]`, CommentMultiline, nil},
Include("comment_start"),
{`\*\)`, CommentMultiline, Pop(1)},
{`[*)]`, CommentMultiline, nil},
},
"identifier": {
{`([a-zA-Z][\w \-]*)`, Keyword, nil},
},
},
))


@ -0,0 +1,270 @@
package e
import (
. "github.com/alecthomas/chroma" // nolint
"github.com/alecthomas/chroma/lexers/internal"
)
// Elixir lexer.
var Elixir = internal.Register(MustNewLexer(
&Config{
Name: "Elixir",
Aliases: []string{"elixir", "ex", "exs"},
Filenames: []string{"*.ex", "*.exs"},
MimeTypes: []string{"text/x-elixir"},
},
Rules{
"root": {
{`\s+`, Text, nil},
{`#.*$`, CommentSingle, nil},
{`(\?)(\\x\{)([\da-fA-F]+)(\})`, ByGroups(LiteralStringChar, LiteralStringEscape, LiteralNumberHex, LiteralStringEscape), nil},
{`(\?)(\\x[\da-fA-F]{1,2})`, ByGroups(LiteralStringChar, LiteralStringEscape), nil},
{`(\?)(\\[abdefnrstv])`, ByGroups(LiteralStringChar, LiteralStringEscape), nil},
{`\?\\?.`, LiteralStringChar, nil},
{`:::`, LiteralStringSymbol, nil},
{`::`, Operator, nil},
{`:(?:\.\.\.|<<>>|%\{\}|%|\{\})`, LiteralStringSymbol, nil},
{`:(?:(?:\.\.\.|[a-z_]\w*[!?]?)|[A-Z]\w*(?:\.[A-Z]\w*)*|(?:\<\<\<|\>\>\>|\|\|\||\&\&\&|\^\^\^|\~\~\~|\=\=\=|\!\=\=|\~\>\>|\<\~\>|\|\~\>|\<\|\>|\=\=|\!\=|\<\=|\>\=|\&\&|\|\||\<\>|\+\+|\-\-|\|\>|\=\~|\-\>|\<\-|\||\.|\=|\~\>|\<\~|\<|\>|\+|\-|\*|\/|\!|\^|\&))`, LiteralStringSymbol, nil},
{`:"`, LiteralStringSymbol, Push("string_double_atom")},
{`:'`, LiteralStringSymbol, Push("string_single_atom")},
{`((?:\.\.\.|<<>>|%\{\}|%|\{\})|(?:(?:\.\.\.|[a-z_]\w*[!?]?)|[A-Z]\w*(?:\.[A-Z]\w*)*|(?:\<\<\<|\>\>\>|\|\|\||\&\&\&|\^\^\^|\~\~\~|\=\=\=|\!\=\=|\~\>\>|\<\~\>|\|\~\>|\<\|\>|\=\=|\!\=|\<\=|\>\=|\&\&|\|\||\<\>|\+\+|\-\-|\|\>|\=\~|\-\>|\<\-|\||\.|\=|\~\>|\<\~|\<|\>|\+|\-|\*|\/|\!|\^|\&)))(:)(?=\s|\n)`, ByGroups(LiteralStringSymbol, Punctuation), nil},
{`@(?:\.\.\.|[a-z_]\w*[!?]?)`, NameAttribute, nil},
{`(?:\.\.\.|[a-z_]\w*[!?]?)`, Name, nil},
{`(%?)([A-Z]\w*(?:\.[A-Z]\w*)*)`, ByGroups(Punctuation, NameClass), nil},
{`\<\<\<|\>\>\>|\|\|\||\&\&\&|\^\^\^|\~\~\~|\=\=\=|\!\=\=|\~\>\>|\<\~\>|\|\~\>|\<\|\>`, Operator, nil},
{`\=\=|\!\=|\<\=|\>\=|\&\&|\|\||\<\>|\+\+|\-\-|\|\>|\=\~|\-\>|\<\-|\||\.|\=|\~\>|\<\~`, Operator, nil},
{`\\\\|\<\<|\>\>|\=\>|\(|\)|\:|\;|\,|\[|\]`, Punctuation, nil},
{`&\d`, NameEntity, nil},
{`\<|\>|\+|\-|\*|\/|\!|\^|\&`, Operator, nil},
{`0b[01](_?[01])*`, LiteralNumberBin, nil},
{`0o[0-7](_?[0-7])*`, LiteralNumberOct, nil},
{`0x[\da-fA-F](_?[\dA-Fa-f])*`, LiteralNumberHex, nil},
{`\d(_?\d)*\.\d(_?\d)*([eE][-+]?\d(_?\d)*)?`, LiteralNumberFloat, nil},
{`\d(_?\d)*`, LiteralNumberInteger, nil},
{`"""\s*`, LiteralStringHeredoc, Push("heredoc_double")},
{`'''\s*$`, LiteralStringHeredoc, Push("heredoc_single")},
{`"`, LiteralStringDouble, Push("string_double")},
{`'`, LiteralStringSingle, Push("string_single")},
Include("sigils"),
{`%\{`, Punctuation, Push("map_key")},
{`\{`, Punctuation, Push("tuple")},
},
"heredoc_double": {
{`^\s*"""`, LiteralStringHeredoc, Pop(1)},
Include("heredoc_interpol"),
},
"heredoc_single": {
{`^\s*'''`, LiteralStringHeredoc, Pop(1)},
Include("heredoc_interpol"),
},
"heredoc_interpol": {
{`[^#\\\n]+`, LiteralStringHeredoc, nil},
Include("escapes"),
{`\\.`, LiteralStringHeredoc, nil},
{`\n+`, LiteralStringHeredoc, nil},
Include("interpol"),
},
"heredoc_no_interpol": {
{`[^\\\n]+`, LiteralStringHeredoc, nil},
{`\\.`, LiteralStringHeredoc, nil},
{`\n+`, LiteralStringHeredoc, nil},
},
"escapes": {
{`(\\x\{)([\da-fA-F]+)(\})`, ByGroups(LiteralStringEscape, LiteralNumberHex, LiteralStringEscape), nil},
{`(\\x[\da-fA-F]{1,2})`, LiteralStringEscape, nil},
{`(\\[abdefnrstv])`, LiteralStringEscape, nil},
},
"interpol": {
{`#\{`, LiteralStringInterpol, Push("interpol_string")},
},
"interpol_string": {
{`\}`, LiteralStringInterpol, Pop(1)},
Include("root"),
},
"map_key": {
Include("root"),
{`:`, Punctuation, Push("map_val")},
{`=>`, Punctuation, Push("map_val")},
{`\}`, Punctuation, Pop(1)},
},
"map_val": {
Include("root"),
{`,`, Punctuation, Pop(1)},
{`(?=\})`, Punctuation, Pop(1)},
},
"tuple": {
Include("root"),
{`\}`, Punctuation, Pop(1)},
},
"string_double": {
{`[^#"\\]+`, LiteralStringDouble, nil},
Include("escapes"),
{`\\.`, LiteralStringDouble, nil},
{`(")`, ByGroups(LiteralStringDouble), Pop(1)},
Include("interpol"),
},
"string_single": {
{`[^#'\\]+`, LiteralStringSingle, nil},
Include("escapes"),
{`\\.`, LiteralStringSingle, nil},
{`(')`, ByGroups(LiteralStringSingle), Pop(1)},
Include("interpol"),
},
"string_double_atom": {
{`[^#"\\]+`, LiteralStringSymbol, nil},
Include("escapes"),
{`\\.`, LiteralStringSymbol, nil},
{`(")`, ByGroups(LiteralStringSymbol), Pop(1)},
Include("interpol"),
},
"string_single_atom": {
{`[^#'\\]+`, LiteralStringSymbol, nil},
Include("escapes"),
{`\\.`, LiteralStringSymbol, nil},
{`(')`, ByGroups(LiteralStringSymbol), Pop(1)},
Include("interpol"),
},
"sigils": {
{`(~[a-z])(""")`, ByGroups(LiteralStringOther, LiteralStringHeredoc), Push("triquot-end", "triquot-intp")},
{`(~[A-Z])(""")`, ByGroups(LiteralStringOther, LiteralStringHeredoc), Push("triquot-end", "triquot-no-intp")},
{`(~[a-z])(''')`, ByGroups(LiteralStringOther, LiteralStringHeredoc), Push("triapos-end", "triapos-intp")},
{`(~[A-Z])(''')`, ByGroups(LiteralStringOther, LiteralStringHeredoc), Push("triapos-end", "triapos-no-intp")},
{`~[a-z]\{`, LiteralStringOther, Push("cb-intp")},
{`~[A-Z]\{`, LiteralStringOther, Push("cb-no-intp")},
{`~[a-z]\[`, LiteralStringOther, Push("sb-intp")},
{`~[A-Z]\[`, LiteralStringOther, Push("sb-no-intp")},
{`~[a-z]\(`, LiteralStringOther, Push("pa-intp")},
{`~[A-Z]\(`, LiteralStringOther, Push("pa-no-intp")},
{`~[a-z]<`, LiteralStringOther, Push("ab-intp")},
{`~[A-Z]<`, LiteralStringOther, Push("ab-no-intp")},
{`~[a-z]/`, LiteralStringOther, Push("slas-intp")},
{`~[A-Z]/`, LiteralStringOther, Push("slas-no-intp")},
{`~[a-z]\|`, LiteralStringOther, Push("pipe-intp")},
{`~[A-Z]\|`, LiteralStringOther, Push("pipe-no-intp")},
{`~[a-z]"`, LiteralStringOther, Push("quot-intp")},
{`~[A-Z]"`, LiteralStringOther, Push("quot-no-intp")},
{`~[a-z]'`, LiteralStringOther, Push("apos-intp")},
{`~[A-Z]'`, LiteralStringOther, Push("apos-no-intp")},
},
"triquot-end": {
{`[a-zA-Z]+`, LiteralStringOther, Pop(1)},
Default(Pop(1)),
},
"triquot-intp": {
{`^\s*"""`, LiteralStringHeredoc, Pop(1)},
Include("heredoc_interpol"),
},
"triquot-no-intp": {
{`^\s*"""`, LiteralStringHeredoc, Pop(1)},
Include("heredoc_no_interpol"),
},
"triapos-end": {
{`[a-zA-Z]+`, LiteralStringOther, Pop(1)},
Default(Pop(1)),
},
"triapos-intp": {
{`^\s*'''`, LiteralStringHeredoc, Pop(1)},
Include("heredoc_interpol"),
},
"triapos-no-intp": {
{`^\s*'''`, LiteralStringHeredoc, Pop(1)},
Include("heredoc_no_interpol"),
},
"cb-intp": {
{`[^#\}\\]+`, LiteralStringOther, nil},
Include("escapes"),
{`\\.`, LiteralStringOther, nil},
{`\}[a-zA-Z]*`, LiteralStringOther, Pop(1)},
Include("interpol"),
},
"cb-no-intp": {
{`[^\}\\]+`, LiteralStringOther, nil},
{`\\.`, LiteralStringOther, nil},
{`\}[a-zA-Z]*`, LiteralStringOther, Pop(1)},
},
"sb-intp": {
{`[^#\]\\]+`, LiteralStringOther, nil},
Include("escapes"),
{`\\.`, LiteralStringOther, nil},
{`\][a-zA-Z]*`, LiteralStringOther, Pop(1)},
Include("interpol"),
},
"sb-no-intp": {
{`[^\]\\]+`, LiteralStringOther, nil},
{`\\.`, LiteralStringOther, nil},
{`\][a-zA-Z]*`, LiteralStringOther, Pop(1)},
},
"pa-intp": {
{`[^#\)\\]+`, LiteralStringOther, nil},
Include("escapes"),
{`\\.`, LiteralStringOther, nil},
{`\)[a-zA-Z]*`, LiteralStringOther, Pop(1)},
Include("interpol"),
},
"pa-no-intp": {
{`[^\)\\]+`, LiteralStringOther, nil},
{`\\.`, LiteralStringOther, nil},
{`\)[a-zA-Z]*`, LiteralStringOther, Pop(1)},
},
"ab-intp": {
{`[^#>\\]+`, LiteralStringOther, nil},
Include("escapes"),
{`\\.`, LiteralStringOther, nil},
{`>[a-zA-Z]*`, LiteralStringOther, Pop(1)},
Include("interpol"),
},
"ab-no-intp": {
{`[^>\\]+`, LiteralStringOther, nil},
{`\\.`, LiteralStringOther, nil},
{`>[a-zA-Z]*`, LiteralStringOther, Pop(1)},
},
"slas-intp": {
{`[^#/\\]+`, LiteralStringOther, nil},
Include("escapes"),
{`\\.`, LiteralStringOther, nil},
{`/[a-zA-Z]*`, LiteralStringOther, Pop(1)},
Include("interpol"),
},
"slas-no-intp": {
{`[^/\\]+`, LiteralStringOther, nil},
{`\\.`, LiteralStringOther, nil},
{`/[a-zA-Z]*`, LiteralStringOther, Pop(1)},
},
"pipe-intp": {
{`[^#\|\\]+`, LiteralStringOther, nil},
Include("escapes"),
{`\\.`, LiteralStringOther, nil},
{`\|[a-zA-Z]*`, LiteralStringOther, Pop(1)},
Include("interpol"),
},
"pipe-no-intp": {
{`[^\|\\]+`, LiteralStringOther, nil},
{`\\.`, LiteralStringOther, nil},
{`\|[a-zA-Z]*`, LiteralStringOther, Pop(1)},
},
"quot-intp": {
{`[^#"\\]+`, LiteralStringOther, nil},
Include("escapes"),
{`\\.`, LiteralStringOther, nil},
{`"[a-zA-Z]*`, LiteralStringOther, Pop(1)},
Include("interpol"),
},
"quot-no-intp": {
{`[^"\\]+`, LiteralStringOther, nil},
{`\\.`, LiteralStringOther, nil},
{`"[a-zA-Z]*`, LiteralStringOther, Pop(1)},
},
"apos-intp": {
{`[^#'\\]+`, LiteralStringOther, nil},
Include("escapes"),
{`\\.`, LiteralStringOther, nil},
{`'[a-zA-Z]*`, LiteralStringOther, Pop(1)},
Include("interpol"),
},
"apos-no-intp": {
{`[^'\\]+`, LiteralStringOther, nil},
{`\\.`, LiteralStringOther, nil},
{`'[a-zA-Z]*`, LiteralStringOther, Pop(1)},
},
},
))


@ -0,0 +1,59 @@
package e
import (
. "github.com/alecthomas/chroma" // nolint
"github.com/alecthomas/chroma/lexers/internal"
)
// Elm lexer.
var Elm = internal.Register(MustNewLexer(
&Config{
Name: "Elm",
Aliases: []string{"elm"},
Filenames: []string{"*.elm"},
MimeTypes: []string{"text/x-elm"},
},
Rules{
"root": {
{`\{-`, CommentMultiline, Push("comment")},
{`--.*`, CommentSingle, nil},
{`\s+`, Text, nil},
{`"`, LiteralString, Push("doublequote")},
{`^\s*module\s*`, KeywordNamespace, Push("imports")},
{`^\s*import\s*`, KeywordNamespace, Push("imports")},
{`\[glsl\|.*`, NameEntity, Push("shader")},
{Words(``, `\b`, `alias`, `as`, `case`, `else`, `if`, `import`, `in`, `let`, `module`, `of`, `port`, `then`, `type`, `where`), KeywordReserved, nil},
{`[A-Z]\w*`, KeywordType, nil},
{`^main `, KeywordReserved, nil},
{Words(`\(`, `\)`, `~`, `||`, `|>`, `|`, "`", `^`, `\`, `'`, `>>`, `>=`, `>`, `==`, `=`, `<~`, `<|`, `<=`, `<<`, `<-`, `<`, `::`, `:`, `/=`, `//`, `/`, `..`, `.`, `->`, `-`, `++`, `+`, `*`, `&&`, `%`), NameFunction, nil},
{Words(``, ``, `~`, `||`, `|>`, `|`, "`", `^`, `\`, `'`, `>>`, `>=`, `>`, `==`, `=`, `<~`, `<|`, `<=`, `<<`, `<-`, `<`, `::`, `:`, `/=`, `//`, `/`, `..`, `.`, `->`, `-`, `++`, `+`, `*`, `&&`, `%`), NameFunction, nil},
Include("numbers"),
{`[a-z_][a-zA-Z_\']*`, NameVariable, nil},
{`[,()\[\]{}]`, Punctuation, nil},
},
"comment": {
{`-(?!\})`, CommentMultiline, nil},
{`\{-`, CommentMultiline, Push("comment")},
{`[^-}]`, CommentMultiline, nil},
{`-\}`, CommentMultiline, Pop(1)},
},
"doublequote": {
{`\\u[0-9a-fA-F]{4}`, LiteralStringEscape, nil},
{`\\[nrfvb\\"]`, LiteralStringEscape, nil},
{`[^"]`, LiteralString, nil},
{`"`, LiteralString, Pop(1)},
},
"imports": {
{`\w+(\.\w+)*`, NameClass, Pop(1)},
},
"numbers": {
{`_?\d+\.(?=\d+)`, LiteralNumberFloat, nil},
{`_?\d+`, LiteralNumberInteger, nil},
},
"shader": {
{`\|(?!\])`, NameEntity, nil},
{`\|\]`, NameEntity, Pop(1)},
{`.*\n`, NameEntity, nil},
},
},
))


@ -0,0 +1,582 @@
package e
import (
. "github.com/alecthomas/chroma" // nolint
"github.com/alecthomas/chroma/lexers/internal"
)
var (
emacsMacros = []string{
"atomic-change-group", "case", "block", "cl-block", "cl-callf", "cl-callf2",
"cl-case", "cl-decf", "cl-declaim", "cl-declare",
"cl-define-compiler-macro", "cl-defmacro", "cl-defstruct",
"cl-defsubst", "cl-deftype", "cl-defun", "cl-destructuring-bind",
"cl-do", "cl-do*", "cl-do-all-symbols", "cl-do-symbols", "cl-dolist",
"cl-dotimes", "cl-ecase", "cl-etypecase", "eval-when", "cl-eval-when", "cl-flet",
"cl-flet*", "cl-function", "cl-incf", "cl-labels", "cl-letf",
"cl-letf*", "cl-load-time-value", "cl-locally", "cl-loop",
"cl-macrolet", "cl-multiple-value-bind", "cl-multiple-value-setq",
"cl-progv", "cl-psetf", "cl-psetq", "cl-pushnew", "cl-remf",
"cl-return", "cl-return-from", "cl-rotatef", "cl-shiftf",
"cl-symbol-macrolet", "cl-tagbody", "cl-the", "cl-typecase",
"combine-after-change-calls", "condition-case-unless-debug", "decf",
"declaim", "declare", "declare-function", "def-edebug-spec",
"defadvice", "defclass", "defcustom", "defface", "defgeneric",
"defgroup", "define-advice", "define-alternatives",
"define-compiler-macro", "define-derived-mode", "define-generic-mode",
"define-global-minor-mode", "define-globalized-minor-mode",
"define-minor-mode", "define-modify-macro",
"define-obsolete-face-alias", "define-obsolete-function-alias",
"define-obsolete-variable-alias", "define-setf-expander",
"define-skeleton", "defmacro", "defmethod", "defsetf", "defstruct",
"defsubst", "deftheme", "deftype", "defun", "defvar-local",
"delay-mode-hooks", "destructuring-bind", "do", "do*",
"do-all-symbols", "do-symbols", "dolist", "dont-compile", "dotimes",
"dotimes-with-progress-reporter", "ecase", "ert-deftest", "etypecase",
"eval-and-compile", "eval-when-compile", "flet", "ignore-errors",
"incf", "labels", "lambda", "letrec", "lexical-let", "lexical-let*",
"loop", "multiple-value-bind", "multiple-value-setq", "noreturn",
"oref", "oref-default", "oset", "oset-default", "pcase",
"pcase-defmacro", "pcase-dolist", "pcase-exhaustive", "pcase-let",
"pcase-let*", "pop", "psetf", "psetq", "push", "pushnew", "remf",
"return", "rotatef", "rx", "save-match-data", "save-selected-window",
"save-window-excursion", "setf", "setq-local", "shiftf",
"track-mouse", "typecase", "unless", "use-package", "when",
"while-no-input", "with-case-table", "with-category-table",
"with-coding-priority", "with-current-buffer", "with-demoted-errors",
"with-eval-after-load", "with-file-modes", "with-local-quit",
"with-output-to-string", "with-output-to-temp-buffer",
"with-parsed-tramp-file-name", "with-selected-frame",
"with-selected-window", "with-silent-modifications", "with-slots",
"with-syntax-table", "with-temp-buffer", "with-temp-file",
"with-temp-message", "with-timeout", "with-tramp-connection-property",
"with-tramp-file-property", "with-tramp-progress-reporter",
"with-wrapper-hook", "load-time-value", "locally", "macrolet", "progv",
"return-from",
}
emacsSpecialForms = []string{
"and", "catch", "cond", "condition-case", "defconst", "defvar",
"function", "if", "interactive", "let", "let*", "or", "prog1",
"prog2", "progn", "quote", "save-current-buffer", "save-excursion",
"save-restriction", "setq", "setq-default", "subr-arity",
"unwind-protect", "while",
}
emacsBuiltinFunction = []string{
"%", "*", "+", "-", "/", "/=", "1+", "1-", "<", "<=", "=", ">", ">=",
"Snarf-documentation", "abort-recursive-edit", "abs",
"accept-process-output", "access-file", "accessible-keymaps", "acos",
"active-minibuffer-window", "add-face-text-property",
"add-name-to-file", "add-text-properties", "all-completions",
"append", "apply", "apropos-internal", "aref", "arrayp", "aset",
"ash", "asin", "assoc", "assoc-string", "assq", "atan", "atom",
"autoload", "autoload-do-load", "backtrace", "backtrace--locals",
"backtrace-debug", "backtrace-eval", "backtrace-frame",
"backward-char", "backward-prefix-chars", "barf-if-buffer-read-only",
"base64-decode-region", "base64-decode-string",
"base64-encode-region", "base64-encode-string", "beginning-of-line",
"bidi-find-overridden-directionality", "bidi-resolved-levels",
"bitmap-spec-p", "bobp", "bolp", "bool-vector",
"bool-vector-count-consecutive", "bool-vector-count-population",
"bool-vector-exclusive-or", "bool-vector-intersection",
"bool-vector-not", "bool-vector-p", "bool-vector-set-difference",
"bool-vector-subsetp", "bool-vector-union", "boundp",
"buffer-base-buffer", "buffer-chars-modified-tick",
"buffer-enable-undo", "buffer-file-name", "buffer-has-markers-at",
"buffer-list", "buffer-live-p", "buffer-local-value",
"buffer-local-variables", "buffer-modified-p", "buffer-modified-tick",
"buffer-name", "buffer-size", "buffer-string", "buffer-substring",
"buffer-substring-no-properties", "buffer-swap-text", "bufferp",
"bury-buffer-internal", "byte-code", "byte-code-function-p",
"byte-to-position", "byte-to-string", "byteorder",
"call-interactively", "call-last-kbd-macro", "call-process",
"call-process-region", "cancel-kbd-macro-events", "capitalize",
"capitalize-region", "capitalize-word", "car", "car-less-than-car",
"car-safe", "case-table-p", "category-docstring",
"category-set-mnemonics", "category-table", "category-table-p",
"ccl-execute", "ccl-execute-on-string", "ccl-program-p", "cdr",
"cdr-safe", "ceiling", "char-after", "char-before",
"char-category-set", "char-charset", "char-equal", "char-or-string-p",
"char-resolve-modifiers", "char-syntax", "char-table-extra-slot",
"char-table-p", "char-table-parent", "char-table-range",
"char-table-subtype", "char-to-string", "char-width", "characterp",
"charset-after", "charset-id-internal", "charset-plist",
"charset-priority-list", "charsetp", "check-coding-system",
"check-coding-systems-region", "clear-buffer-auto-save-failure",
"clear-charset-maps", "clear-face-cache", "clear-font-cache",
"clear-image-cache", "clear-string", "clear-this-command-keys",
"close-font", "clrhash", "coding-system-aliases",
"coding-system-base", "coding-system-eol-type", "coding-system-p",
"coding-system-plist", "coding-system-priority-list",
"coding-system-put", "color-distance", "color-gray-p",
"color-supported-p", "combine-after-change-execute",
"command-error-default-function", "command-remapping", "commandp",
"compare-buffer-substrings", "compare-strings",
"compare-window-configurations", "completing-read",
"compose-region-internal", "compose-string-internal",
"composition-get-gstring", "compute-motion", "concat", "cons",
"consp", "constrain-to-field", "continue-process",
"controlling-tty-p", "coordinates-in-window-p", "copy-alist",
"copy-category-table", "copy-file", "copy-hash-table", "copy-keymap",
"copy-marker", "copy-sequence", "copy-syntax-table", "copysign",
"cos", "current-active-maps", "current-bidi-paragraph-direction",
"current-buffer", "current-case-table", "current-column",
"current-global-map", "current-idle-time", "current-indentation",
"current-input-mode", "current-local-map", "current-message",
"current-minor-mode-maps", "current-time", "current-time-string",
"current-time-zone", "current-window-configuration",
"cygwin-convert-file-name-from-windows",
"cygwin-convert-file-name-to-windows", "daemon-initialized",
"daemonp", "dbus--init-bus", "dbus-get-unique-name",
"dbus-message-internal", "debug-timer-check", "declare-equiv-charset",
"decode-big5-char", "decode-char", "decode-coding-region",
"decode-coding-string", "decode-sjis-char", "decode-time",
"default-boundp", "default-file-modes", "default-printer-name",
"default-toplevel-value", "default-value", "define-category",
"define-charset-alias", "define-charset-internal",
"define-coding-system-alias", "define-coding-system-internal",
"define-fringe-bitmap", "define-hash-table-test", "define-key",
"define-prefix-command", "delete",
"delete-all-overlays", "delete-and-extract-region", "delete-char",
"delete-directory-internal", "delete-field", "delete-file",
"delete-frame", "delete-other-windows-internal", "delete-overlay",
"delete-process", "delete-region", "delete-terminal",
"delete-window-internal", "delq", "describe-buffer-bindings",
"describe-vector", "destroy-fringe-bitmap", "detect-coding-region",
"detect-coding-string", "ding", "directory-file-name",
"directory-files", "directory-files-and-attributes", "discard-input",
"display-supports-face-attributes-p", "do-auto-save", "documentation",
"documentation-property", "downcase", "downcase-region",
"downcase-word", "draw-string", "dump-colors", "dump-emacs",
"dump-face", "dump-frame-glyph-matrix", "dump-glyph-matrix",
"dump-glyph-row", "dump-redisplay-history", "dump-tool-bar-row",
"elt", "emacs-pid", "encode-big5-char", "encode-char",
"encode-coding-region", "encode-coding-string", "encode-sjis-char",
"encode-time", "end-kbd-macro", "end-of-line", "eobp", "eolp", "eq",
"eql", "equal", "equal-including-properties", "erase-buffer",
"error-message-string", "eval", "eval-buffer", "eval-region",
"event-convert-list", "execute-kbd-macro", "exit-recursive-edit",
"exp", "expand-file-name", "expt", "external-debugging-output",
"face-attribute-relative-p", "face-attributes-as-vector", "face-font",
"fboundp", "fceiling", "fetch-bytecode", "ffloor",
"field-beginning", "field-end", "field-string",
"field-string-no-properties", "file-accessible-directory-p",
"file-acl", "file-attributes", "file-attributes-lessp",
"file-directory-p", "file-executable-p", "file-exists-p",
"file-locked-p", "file-modes", "file-name-absolute-p",
"file-name-all-completions", "file-name-as-directory",
"file-name-completion", "file-name-directory",
"file-name-nondirectory", "file-newer-than-file-p", "file-readable-p",
"file-regular-p", "file-selinux-context", "file-symlink-p",
"file-system-info", "file-system-info", "file-writable-p",
"fillarray", "find-charset-region", "find-charset-string",
"find-coding-systems-region-internal", "find-composition-internal",
"find-file-name-handler", "find-font", "find-operation-coding-system",
"float", "float-time", "floatp", "floor", "fmakunbound",
"following-char", "font-at", "font-drive-otf", "font-face-attributes",
"font-family-list", "font-get", "font-get-glyphs",
"font-get-system-font", "font-get-system-normal-font", "font-info",
"font-match-p", "font-otf-alternates", "font-put",
"font-shape-gstring", "font-spec", "font-variation-glyphs",
"font-xlfd-name", "fontp", "fontset-font", "fontset-info",
"fontset-list", "fontset-list-all", "force-mode-line-update",
"force-window-update", "format", "format-mode-line",
"format-network-address", "format-time-string", "forward-char",
"forward-comment", "forward-line", "forward-word",
"frame-border-width", "frame-bottom-divider-width",
"frame-can-run-window-configuration-change-hook", "frame-char-height",
"frame-char-width", "frame-face-alist", "frame-first-window",
"frame-focus", "frame-font-cache", "frame-fringe-width", "frame-list",
"frame-live-p", "frame-or-buffer-changed-p", "frame-parameter",
"frame-parameters", "frame-pixel-height", "frame-pixel-width",
"frame-pointer-visible-p", "frame-right-divider-width",
"frame-root-window", "frame-scroll-bar-height",
"frame-scroll-bar-width", "frame-selected-window", "frame-terminal",
"frame-text-cols", "frame-text-height", "frame-text-lines",
"frame-text-width", "frame-total-cols", "frame-total-lines",
"frame-visible-p", "framep", "frexp", "fringe-bitmaps-at-pos",
"fround", "fset", "ftruncate", "funcall", "funcall-interactively",
"function-equal", "functionp", "gap-position", "gap-size",
"garbage-collect", "gc-status", "generate-new-buffer-name", "get",
"get-buffer", "get-buffer-create", "get-buffer-process",
"get-buffer-window", "get-byte", "get-char-property",
"get-char-property-and-overlay", "get-file-buffer", "get-file-char",
"get-internal-run-time", "get-load-suffixes", "get-pos-property",
"get-process", "get-screen-color", "get-text-property",
"get-unicode-property-internal", "get-unused-category",
"get-unused-iso-final-char", "getenv-internal", "gethash",
"gfile-add-watch", "gfile-rm-watch", "global-key-binding",
"gnutls-available-p", "gnutls-boot", "gnutls-bye", "gnutls-deinit",
"gnutls-error-fatalp", "gnutls-error-string", "gnutls-errorp",
"gnutls-get-initstage", "gnutls-peer-status",
"gnutls-peer-status-warning-describe", "goto-char", "gpm-mouse-start",
"gpm-mouse-stop", "group-gid", "group-real-gid",
"handle-save-session", "handle-switch-frame", "hash-table-count",
"hash-table-p", "hash-table-rehash-size",
"hash-table-rehash-threshold", "hash-table-size", "hash-table-test",
"hash-table-weakness", "iconify-frame", "identity", "image-flush",
"image-mask-p", "image-metadata", "image-size", "imagemagick-types",
"imagep", "indent-to", "indirect-function", "indirect-variable",
"init-image-library", "inotify-add-watch", "inotify-rm-watch",
"input-pending-p", "insert", "insert-and-inherit",
"insert-before-markers", "insert-before-markers-and-inherit",
"insert-buffer-substring", "insert-byte", "insert-char",
"insert-file-contents", "insert-startup-screen", "int86",
"integer-or-marker-p", "integerp", "interactive-form", "intern",
"intern-soft", "internal--track-mouse", "internal-char-font",
"internal-complete-buffer", "internal-copy-lisp-face",
"internal-default-process-filter",
"internal-default-process-sentinel", "internal-describe-syntax-value",
"internal-event-symbol-parse-modifiers",
"internal-face-x-get-resource", "internal-get-lisp-face-attribute",
"internal-lisp-face-attribute-values", "internal-lisp-face-empty-p",
"internal-lisp-face-equal-p", "internal-lisp-face-p",
"internal-make-lisp-face", "internal-make-var-non-special",
"internal-merge-in-global-face",
"internal-set-alternative-font-family-alist",
"internal-set-alternative-font-registry-alist",
"internal-set-font-selection-order",
"internal-set-lisp-face-attribute",
"internal-set-lisp-face-attribute-from-resource",
"internal-show-cursor", "internal-show-cursor-p", "interrupt-process",
"invisible-p", "invocation-directory", "invocation-name", "isnan",
"iso-charset", "key-binding", "key-description",
"keyboard-coding-system", "keymap-parent", "keymap-prompt", "keymapp",
"keywordp", "kill-all-local-variables", "kill-buffer", "kill-emacs",
"kill-local-variable", "kill-process", "last-nonminibuffer-frame",
"lax-plist-get", "lax-plist-put", "ldexp", "length",
"libxml-parse-html-region", "libxml-parse-xml-region",
"line-beginning-position", "line-end-position", "line-pixel-height",
"list", "list-fonts", "list-system-processes", "listp", "load",
"load-average", "local-key-binding", "local-variable-if-set-p",
"local-variable-p", "locale-info", "locate-file-internal",
"lock-buffer", "log", "logand", "logb", "logior", "lognot", "logxor",
"looking-at", "lookup-image", "lookup-image-map", "lookup-key",
"lower-frame", "lsh", "macroexpand", "make-bool-vector",
"make-byte-code", "make-category-set", "make-category-table",
"make-char", "make-char-table", "make-directory-internal",
"make-frame-invisible", "make-frame-visible", "make-hash-table",
"make-indirect-buffer", "make-keymap", "make-list",
"make-local-variable", "make-marker", "make-network-process",
"make-overlay", "make-serial-process", "make-sparse-keymap",
"make-string", "make-symbol", "make-symbolic-link", "make-temp-name",
"make-terminal-frame", "make-variable-buffer-local",
"make-variable-frame-local", "make-vector", "makunbound",
"map-char-table", "map-charset-chars", "map-keymap",
"map-keymap-internal", "mapatoms", "mapc", "mapcar", "mapconcat",
"maphash", "mark-marker", "marker-buffer", "marker-insertion-type",
"marker-position", "markerp", "match-beginning", "match-data",
"match-end", "matching-paren", "max", "max-char", "md5", "member",
"memory-info", "memory-limit", "memory-use-counts", "memq", "memql",
"menu-bar-menu-at-x-y", "menu-or-popup-active-p",
"menu-or-popup-active-p", "merge-face-attribute", "message",
"message-box", "message-or-box", "min",
"minibuffer-completion-contents", "minibuffer-contents",
"minibuffer-contents-no-properties", "minibuffer-depth",
"minibuffer-prompt", "minibuffer-prompt-end",
"minibuffer-selected-window", "minibuffer-window", "minibufferp",
"minor-mode-key-binding", "mod", "modify-category-entry",
"modify-frame-parameters", "modify-syntax-entry",
"mouse-pixel-position", "mouse-position", "move-overlay",
"move-point-visually", "move-to-column", "move-to-window-line",
"msdos-downcase-filename", "msdos-long-file-names", "msdos-memget",
"msdos-memput", "msdos-mouse-disable", "msdos-mouse-enable",
"msdos-mouse-init", "msdos-mouse-p", "msdos-remember-default-colors",
"msdos-set-keyboard", "msdos-set-mouse-buttons",
"multibyte-char-to-unibyte", "multibyte-string-p", "narrow-to-region",
"natnump", "nconc", "network-interface-info",
"network-interface-list", "new-fontset", "newline-cache-check",
"next-char-property-change", "next-frame", "next-overlay-change",
"next-property-change", "next-read-file-uses-dialog-p",
"next-single-char-property-change", "next-single-property-change",
"next-window", "nlistp", "nreverse", "nth", "nthcdr", "null",
"number-or-marker-p", "number-to-string", "numberp",
"open-dribble-file", "open-font", "open-termscript",
"optimize-char-table", "other-buffer", "other-window-for-scrolling",
"overlay-buffer", "overlay-end", "overlay-get", "overlay-lists",
"overlay-properties", "overlay-put", "overlay-recenter",
"overlay-start", "overlayp", "overlays-at", "overlays-in",
"parse-partial-sexp", "play-sound-internal", "plist-get",
"plist-member", "plist-put", "point", "point-marker", "point-max",
"point-max-marker", "point-min", "point-min-marker",
"pos-visible-in-window-p", "position-bytes", "posix-looking-at",
"posix-search-backward", "posix-search-forward", "posix-string-match",
"posn-at-point", "posn-at-x-y", "preceding-char",
"prefix-numeric-value", "previous-char-property-change",
"previous-frame", "previous-overlay-change",
"previous-property-change", "previous-single-char-property-change",
"previous-single-property-change", "previous-window", "prin1",
"prin1-to-string", "princ", "print", "process-attributes",
"process-buffer", "process-coding-system", "process-command",
"process-connection", "process-contact", "process-datagram-address",
"process-exit-status", "process-filter", "process-filter-multibyte-p",
"process-id", "process-inherit-coding-system-flag", "process-list",
"process-mark", "process-name", "process-plist",
"process-query-on-exit-flag", "process-running-child-p",
"process-send-eof", "process-send-region", "process-send-string",
"process-sentinel", "process-status", "process-tty-name",
"process-type", "processp", "profiler-cpu-log",
"profiler-cpu-running-p", "profiler-cpu-start", "profiler-cpu-stop",
"profiler-memory-log", "profiler-memory-running-p",
"profiler-memory-start", "profiler-memory-stop", "propertize",
"purecopy", "put", "put-text-property",
"put-unicode-property-internal", "puthash", "query-font",
"query-fontset", "quit-process", "raise-frame", "random", "rassoc",
"rassq", "re-search-backward", "re-search-forward", "read",
"read-buffer", "read-char", "read-char-exclusive",
"read-coding-system", "read-command", "read-event",
"read-from-minibuffer", "read-from-string", "read-function",
"read-key-sequence", "read-key-sequence-vector",
"read-no-blanks-input", "read-non-nil-coding-system", "read-string",
"read-variable", "recent-auto-save-p", "recent-doskeys",
"recent-keys", "recenter", "recursion-depth", "recursive-edit",
"redirect-debugging-output", "redirect-frame-focus", "redisplay",
"redraw-display", "redraw-frame", "regexp-quote", "region-beginning",
"region-end", "register-ccl-program", "register-code-conversion-map",
"remhash", "remove-list-of-text-properties", "remove-text-properties",
"rename-buffer", "rename-file", "replace-match",
"reset-this-command-lengths", "resize-mini-window-internal",
"restore-buffer-modified-p", "resume-tty", "reverse", "round",
"run-hook-with-args", "run-hook-with-args-until-failure",
"run-hook-with-args-until-success", "run-hook-wrapped", "run-hooks",
"run-window-configuration-change-hook", "run-window-scroll-functions",
"safe-length", "scan-lists", "scan-sexps", "scroll-down",
"scroll-left", "scroll-other-window", "scroll-right", "scroll-up",
"search-backward", "search-forward", "secure-hash", "select-frame",
"select-window", "selected-frame", "selected-window",
"self-insert-command", "send-string-to-terminal", "sequencep",
"serial-process-configure", "set", "set-buffer",
"set-buffer-auto-saved", "set-buffer-major-mode",
"set-buffer-modified-p", "set-buffer-multibyte", "set-case-table",
"set-category-table", "set-char-table-extra-slot",
"set-char-table-parent", "set-char-table-range", "set-charset-plist",
"set-charset-priority", "set-coding-system-priority",
"set-cursor-size", "set-default", "set-default-file-modes",
"set-default-toplevel-value", "set-file-acl", "set-file-modes",
"set-file-selinux-context", "set-file-times", "set-fontset-font",
"set-frame-height", "set-frame-position", "set-frame-selected-window",
"set-frame-size", "set-frame-width", "set-fringe-bitmap-face",
"set-input-interrupt-mode", "set-input-meta-mode", "set-input-mode",
"set-keyboard-coding-system-internal", "set-keymap-parent",
"set-marker", "set-marker-insertion-type", "set-match-data",
"set-message-beep", "set-minibuffer-window",
"set-mouse-pixel-position", "set-mouse-position",
"set-network-process-option", "set-output-flow-control",
"set-process-buffer", "set-process-coding-system",
"set-process-datagram-address", "set-process-filter",
"set-process-filter-multibyte",
"set-process-inherit-coding-system-flag", "set-process-plist",
"set-process-query-on-exit-flag", "set-process-sentinel",
"set-process-window-size", "set-quit-char",
"set-safe-terminal-coding-system-internal", "set-screen-color",
"set-standard-case-table", "set-syntax-table",
"set-terminal-coding-system-internal", "set-terminal-local-value",
"set-terminal-parameter", "set-text-properties", "set-time-zone-rule",
"set-visited-file-modtime", "set-window-buffer",
"set-window-combination-limit", "set-window-configuration",
"set-window-dedicated-p", "set-window-display-table",
"set-window-fringes", "set-window-hscroll", "set-window-margins",
"set-window-new-normal", "set-window-new-pixel",
"set-window-new-total", "set-window-next-buffers",
"set-window-parameter", "set-window-point", "set-window-prev-buffers",
"set-window-redisplay-end-trigger", "set-window-scroll-bars",
"set-window-start", "set-window-vscroll", "setcar", "setcdr",
"setplist", "show-face-resources", "signal", "signal-process", "sin",
"single-key-description", "skip-chars-backward", "skip-chars-forward",
"skip-syntax-backward", "skip-syntax-forward", "sleep-for", "sort",
"sort-charsets", "special-variable-p", "split-char",
"split-window-internal", "sqrt", "standard-case-table",
"standard-category-table", "standard-syntax-table", "start-kbd-macro",
"start-process", "stop-process", "store-kbd-macro-event", "string",
"string-as-multibyte", "string-as-unibyte", "string-bytes",
"string-collate-equalp", "string-collate-lessp", "string-equal",
"string-lessp", "string-make-multibyte", "string-make-unibyte",
"string-match", "string-to-char", "string-to-multibyte",
"string-to-number", "string-to-syntax", "string-to-unibyte",
"string-width", "stringp", "subr-name", "subrp",
"subst-char-in-region", "substitute-command-keys",
"substitute-in-file-name", "substring", "substring-no-properties",
"suspend-emacs", "suspend-tty", "suspicious-object", "sxhash",
"symbol-function", "symbol-name", "symbol-plist", "symbol-value",
"symbolp", "syntax-table", "syntax-table-p", "system-groups",
"system-move-file-to-trash", "system-name", "system-users", "tan",
"terminal-coding-system", "terminal-list", "terminal-live-p",
"terminal-local-value", "terminal-name", "terminal-parameter",
"terminal-parameters", "terpri", "test-completion",
"text-char-description", "text-properties-at", "text-property-any",
"text-property-not-all", "this-command-keys",
"this-command-keys-vector", "this-single-command-keys",
"this-single-command-raw-keys", "time-add", "time-less-p",
"time-subtract", "tool-bar-get-system-style", "tool-bar-height",
"tool-bar-pixel-width", "top-level", "trace-redisplay",
"trace-to-stderr", "translate-region-internal", "transpose-regions",
"truncate", "try-completion", "tty-display-color-cells",
"tty-display-color-p", "tty-no-underline",
"tty-suppress-bold-inverse-default-colors", "tty-top-frame",
"tty-type", "type-of", "undo-boundary", "unencodable-char-position",
"unhandled-file-name-directory", "unibyte-char-to-multibyte",
"unibyte-string", "unicode-property-table-internal", "unify-charset",
"unintern", "unix-sync", "unlock-buffer", "upcase", "upcase-initials",
"upcase-initials-region", "upcase-region", "upcase-word",
"use-global-map", "use-local-map", "user-full-name",
"user-login-name", "user-real-login-name", "user-real-uid",
"user-uid", "variable-binding-locus", "vconcat", "vector",
"vector-or-char-table-p", "vectorp", "verify-visited-file-modtime",
"vertical-motion", "visible-frame-list", "visited-file-modtime",
"w16-get-clipboard-data", "w16-selection-exists-p",
"w16-set-clipboard-data", "w32-battery-status",
"w32-default-color-map", "w32-define-rgb-color",
"w32-display-monitor-attributes-list", "w32-frame-menu-bar-size",
"w32-frame-rect", "w32-get-clipboard-data",
"w32-get-codepage-charset", "w32-get-console-codepage",
"w32-get-console-output-codepage", "w32-get-current-locale-id",
"w32-get-default-locale-id", "w32-get-keyboard-layout",
"w32-get-locale-info", "w32-get-valid-codepages",
"w32-get-valid-keyboard-layouts", "w32-get-valid-locale-ids",
"w32-has-winsock", "w32-long-file-name", "w32-reconstruct-hot-key",
"w32-register-hot-key", "w32-registered-hot-keys",
"w32-selection-exists-p", "w32-send-sys-command",
"w32-set-clipboard-data", "w32-set-console-codepage",
"w32-set-console-output-codepage", "w32-set-current-locale",
"w32-set-keyboard-layout", "w32-set-process-priority",
"w32-shell-execute", "w32-short-file-name", "w32-toggle-lock-key",
"w32-unload-winsock", "w32-unregister-hot-key", "w32-window-exists-p",
"w32notify-add-watch", "w32notify-rm-watch",
"waiting-for-user-input-p", "where-is-internal", "widen",
"widget-apply", "widget-get", "widget-put",
"window-absolute-pixel-edges", "window-at", "window-body-height",
"window-body-width", "window-bottom-divider-width", "window-buffer",
"window-combination-limit", "window-configuration-frame",
"window-configuration-p", "window-dedicated-p",
"window-display-table", "window-edges", "window-end", "window-frame",
"window-fringes", "window-header-line-height", "window-hscroll",
"window-inside-absolute-pixel-edges", "window-inside-edges",
"window-inside-pixel-edges", "window-left-child",
"window-left-column", "window-line-height", "window-list",
"window-list-1", "window-live-p", "window-margins",
"window-minibuffer-p", "window-mode-line-height", "window-new-normal",
"window-new-pixel", "window-new-total", "window-next-buffers",
"window-next-sibling", "window-normal-size", "window-old-point",
"window-parameter", "window-parameters", "window-parent",
"window-pixel-edges", "window-pixel-height", "window-pixel-left",
"window-pixel-top", "window-pixel-width", "window-point",
"window-prev-buffers", "window-prev-sibling",
"window-redisplay-end-trigger", "window-resize-apply",
"window-resize-apply-total", "window-right-divider-width",
"window-scroll-bar-height", "window-scroll-bar-width",
"window-scroll-bars", "window-start", "window-system",
"window-text-height", "window-text-pixel-size", "window-text-width",
"window-top-child", "window-top-line", "window-total-height",
"window-total-width", "window-use-time", "window-valid-p",
"window-vscroll", "windowp", "write-char", "write-region",
"x-backspace-delete-keys-p", "x-change-window-property",
"x-change-window-property", "x-close-connection",
"x-close-connection", "x-create-frame", "x-create-frame",
"x-delete-window-property", "x-delete-window-property",
"x-disown-selection-internal", "x-display-backing-store",
"x-display-backing-store", "x-display-color-cells",
"x-display-color-cells", "x-display-grayscale-p",
"x-display-grayscale-p", "x-display-list", "x-display-list",
"x-display-mm-height", "x-display-mm-height", "x-display-mm-width",
"x-display-mm-width", "x-display-monitor-attributes-list",
"x-display-pixel-height", "x-display-pixel-height",
"x-display-pixel-width", "x-display-pixel-width", "x-display-planes",
"x-display-planes", "x-display-save-under", "x-display-save-under",
"x-display-screens", "x-display-screens", "x-display-visual-class",
"x-display-visual-class", "x-family-fonts", "x-file-dialog",
"x-file-dialog", "x-file-dialog", "x-focus-frame", "x-frame-geometry",
"x-frame-geometry", "x-get-atom-name", "x-get-resource",
"x-get-selection-internal", "x-hide-tip", "x-hide-tip",
"x-list-fonts", "x-load-color-file", "x-menu-bar-open-internal",
"x-menu-bar-open-internal", "x-open-connection", "x-open-connection",
"x-own-selection-internal", "x-parse-geometry", "x-popup-dialog",
"x-popup-menu", "x-register-dnd-atom", "x-select-font",
"x-select-font", "x-selection-exists-p", "x-selection-owner-p",
"x-send-client-message", "x-server-max-request-size",
"x-server-max-request-size", "x-server-vendor", "x-server-vendor",
"x-server-version", "x-server-version", "x-show-tip", "x-show-tip",
"x-synchronize", "x-synchronize", "x-uses-old-gtk-dialog",
"x-window-property", "x-window-property", "x-wm-set-size-hint",
"xw-color-defined-p", "xw-color-defined-p", "xw-color-values",
"xw-color-values", "xw-display-color-p", "xw-display-color-p",
"yes-or-no-p", "zlib-available-p", "zlib-decompress-region",
"forward-point",
}
emacsBuiltinFunctionHighlighted = []string{
"defvaralias", "provide", "require",
"with-no-warnings", "define-widget", "with-electric-help",
"throw", "defalias", "featurep",
}
emacsLambdaListKeywords = []string{
"&allow-other-keys", "&aux", "&body", "&environment", "&key", "&optional",
"&rest", "&whole",
}
emacsErrorKeywords = []string{
"cl-assert", "cl-check-type", "error", "signal",
"user-error", "warn",
}
)
// EmacsLisp lexer.
var EmacsLisp = internal.Register(TypeRemappingLexer(MustNewLexer(
&Config{
Name: "EmacsLisp",
Aliases: []string{"emacs", "elisp", "emacs-lisp"},
Filenames: []string{"*.el"},
MimeTypes: []string{"text/x-elisp", "application/x-elisp"},
},
Rules{
"root": {
Default(Push("body")),
},
"body": {
{`\s+`, Text, nil},
{`;.*$`, CommentSingle, nil},
{`"`, LiteralString, Push("string")},
{`\?([^\\]|\\.)`, LiteralStringChar, nil},
{`:((?:\\.|[\w!$%&*+-/<=>?@^{}~|])(?:\\.|[\w!$%&*+-/<=>?@^{}~|]|[#.:])*)`, NameBuiltin, nil},
{`::((?:\\.|[\w!$%&*+-/<=>?@^{}~|])(?:\\.|[\w!$%&*+-/<=>?@^{}~|]|[#.:])*)`, LiteralStringSymbol, nil},
{`'((?:\\.|[\w!$%&*+-/<=>?@^{}~|])(?:\\.|[\w!$%&*+-/<=>?@^{}~|]|[#.:])*)`, LiteralStringSymbol, nil},
{`'`, Operator, nil},
{"`", Operator, nil},
{"[-+]?\\d+\\.?(?=[ \"()\\]\\'\\n,;`])", LiteralNumberInteger, nil},
{"[-+]?\\d+/\\d+(?=[ \"()\\]\\'\\n,;`])", LiteralNumber, nil},
{"[-+]?(\\d*\\.\\d+([defls][-+]?\\d+)?|\\d+(\\.\\d*)?[defls][-+]?\\d+)(?=[ \"()\\]\\'\\n,;`])", LiteralNumberFloat, nil},
{`\[|\]`, Punctuation, nil},
{`#:((?:\\.|[\w!$%&*+-/<=>?@^{}~|])(?:\\.|[\w!$%&*+-/<=>?@^{}~|]|[#.:])*)`, LiteralStringSymbol, nil},
{`#\^\^?`, Operator, nil},
{`#\'`, NameFunction, nil},
{`#[bB][+-]?[01]+(/[01]+)?`, LiteralNumberBin, nil},
{`#[oO][+-]?[0-7]+(/[0-7]+)?`, LiteralNumberOct, nil},
{`#[xX][+-]?[0-9a-fA-F]+(/[0-9a-fA-F]+)?`, LiteralNumberHex, nil},
{`#\d+r[+-]?[0-9a-zA-Z]+(/[0-9a-zA-Z]+)?`, LiteralNumber, nil},
{`#\d+=`, Operator, nil},
{`#\d+#`, Operator, nil},
{`(,@|,|\.|:)`, Operator, nil},
{"(t|nil)(?=[ \"()\\]\\'\\n,;`])", NameConstant, nil},
{`\*((?:\\.|[\w!$%&*+-/<=>?@^{}~|])(?:\\.|[\w!$%&*+-/<=>?@^{}~|]|[#.:])*)\*`, NameVariableGlobal, nil},
{`((?:\\.|[\w!$%&*+-/<=>?@^{}~|])(?:\\.|[\w!$%&*+-/<=>?@^{}~|]|[#.:])*)`, NameVariable, nil},
{`#\(`, Operator, Push("body")},
{`\(`, Punctuation, Push("body")},
{`\)`, Punctuation, Pop(1)},
},
"string": {
{"[^\"\\\\`]+", LiteralString, nil},
{"`((?:\\\\.|[\\w!$%&*+-/<=>?@^{}~|])(?:\\\\.|[\\w!$%&*+-/<=>?@^{}~|]|[#.:])*)\\'", LiteralStringSymbol, nil},
{"`", LiteralString, nil},
{`\\.`, LiteralString, nil},
{`\\\n`, LiteralString, nil},
{`"`, LiteralString, Pop(1)},
},
},
), TypeMapping{
{NameVariable, NameFunction, emacsBuiltinFunction},
{NameVariable, NameBuiltin, emacsSpecialForms},
{NameVariable, NameException, emacsErrorKeywords},
{NameVariable, NameBuiltin, append(emacsBuiltinFunctionHighlighted, emacsMacros...)},
{NameVariable, KeywordPseudo, emacsLambdaListKeywords},
}))
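
For reference, a minimal sketch of exercising the lexer registered above through chroma's public API; the emacs-lisp alias lookup, the sample Elisp input and the token dump are illustrative choices rather than anything this file prescribes.

package main

import (
	"fmt"

	"github.com/alecthomas/chroma"
	"github.com/alecthomas/chroma/lexers"
)

func main() {
	// "emacs-lisp" is one of the aliases declared in the Config above.
	lexer := lexers.Get("emacs-lisp")
	it, err := lexer.Tokenise(nil, "(require 'cl-lib)\n")
	if err != nil {
		panic(err)
	}
	// The TypeRemappingLexer wrapper promotes "require" from NameVariable
	// to NameBuiltin before the tokens reach a formatter.
	for tok := it(); tok != chroma.EOF; tok = it() {
		fmt.Printf("%s %q\n", tok.Type, tok.Value)
	}
}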

View file

@ -0,0 +1,66 @@
package e
import (
. "github.com/alecthomas/chroma" // nolint
"github.com/alecthomas/chroma/lexers/internal"
)
// Erlang lexer.
var Erlang = internal.Register(MustNewLexer(
&Config{
Name: "Erlang",
Aliases: []string{"erlang"},
Filenames: []string{"*.erl", "*.hrl", "*.es", "*.escript"},
MimeTypes: []string{"text/x-erlang"},
},
Rules{
"root": {
{`\s+`, Text, nil},
{`%.*\n`, Comment, nil},
{Words(``, `\b`, `after`, `begin`, `case`, `catch`, `cond`, `end`, `fun`, `if`, `let`, `of`, `query`, `receive`, `try`, `when`), Keyword, nil},
{Words(``, `\b`, `abs`, `append_element`, `apply`, `atom_to_list`, `binary_to_list`, `bitstring_to_list`, `binary_to_term`, `bit_size`, `bump_reductions`, `byte_size`, `cancel_timer`, `check_process_code`, `delete_module`, `demonitor`, `disconnect_node`, `display`, `element`, `erase`, `exit`, `float`, `float_to_list`, `fun_info`, `fun_to_list`, `function_exported`, `garbage_collect`, `get`, `get_keys`, `group_leader`, `hash`, `hd`, `integer_to_list`, `iolist_to_binary`, `iolist_size`, `is_atom`, `is_binary`, `is_bitstring`, `is_boolean`, `is_builtin`, `is_float`, `is_function`, `is_integer`, `is_list`, `is_number`, `is_pid`, `is_port`, `is_process_alive`, `is_record`, `is_reference`, `is_tuple`, `length`, `link`, `list_to_atom`, `list_to_binary`, `list_to_bitstring`, `list_to_existing_atom`, `list_to_float`, `list_to_integer`, `list_to_pid`, `list_to_tuple`, `load_module`, `localtime_to_universaltime`, `make_tuple`, `md5`, `md5_final`, `md5_update`, `memory`, `module_loaded`, `monitor`, `monitor_node`, `node`, `nodes`, `open_port`, `phash`, `phash2`, `pid_to_list`, `port_close`, `port_command`, `port_connect`, `port_control`, `port_call`, `port_info`, `port_to_list`, `process_display`, `process_flag`, `process_info`, `purge_module`, `put`, `read_timer`, `ref_to_list`, `register`, `resume_process`, `round`, `send`, `send_after`, `send_nosuspend`, `set_cookie`, `setelement`, `size`, `spawn`, `spawn_link`, `spawn_monitor`, `spawn_opt`, `split_binary`, `start_timer`, `statistics`, `suspend_process`, `system_flag`, `system_info`, `system_monitor`, `system_profile`, `term_to_binary`, `tl`, `trace`, `trace_delivered`, `trace_info`, `trace_pattern`, `trunc`, `tuple_size`, `tuple_to_list`, `universaltime_to_localtime`, `unlink`, `unregister`, `whereis`), NameBuiltin, nil},
{Words(``, `\b`, `and`, `andalso`, `band`, `bnot`, `bor`, `bsl`, `bsr`, `bxor`, `div`, `not`, `or`, `orelse`, `rem`, `xor`), OperatorWord, nil},
{`^-`, Punctuation, Push("directive")},
{`(\+\+?|--?|\*|/|<|>|/=|=:=|=/=|=<|>=|==?|<-|!|\?)`, Operator, nil},
{`"`, LiteralString, Push("string")},
{`<<`, NameLabel, nil},
{`>>`, NameLabel, nil},
{`((?:[a-z]\w*|'[^\n']*[^\\]'))(:)`, ByGroups(NameNamespace, Punctuation), nil},
{`(?:^|(?<=:))((?:[a-z]\w*|'[^\n']*[^\\]'))(\s*)(\()`, ByGroups(NameFunction, Text, Punctuation), nil},
{`[+-]?(?:[2-9]|[12][0-9]|3[0-6])#[0-9a-zA-Z]+`, LiteralNumberInteger, nil},
{`[+-]?\d+\.\d+`, LiteralNumberFloat, nil},
{`[+-]?\d+`, LiteralNumberInteger, nil},
{`[]\[:_@\".{}()|;,]`, Punctuation, nil},
{`(?:[A-Z_]\w*)`, NameVariable, nil},
{`(?:[a-z]\w*|'[^\n']*[^\\]')`, Name, nil},
{`\?(?:(?:[A-Z_]\w*)|(?:[a-z]\w*|'[^\n']*[^\\]'))`, NameConstant, nil},
{`\$(?:(?:\\(?:[bdefnrstv\'"\\]|[0-7][0-7]?[0-7]?|(?:x[0-9a-fA-F]{2}|x\{[0-9a-fA-F]+\})|\^[a-zA-Z]))|\\[ %]|[^\\])`, LiteralStringChar, nil},
{`#(?:[a-z]\w*|'[^\n']*[^\\]')(:?\.(?:[a-z]\w*|'[^\n']*[^\\]'))?`, NameLabel, nil},
{`\A#!.+\n`, CommentHashbang, nil},
{`#\{`, Punctuation, Push("map_key")},
},
"string": {
{`(?:\\(?:[bdefnrstv\'"\\]|[0-7][0-7]?[0-7]?|(?:x[0-9a-fA-F]{2}|x\{[0-9a-fA-F]+\})|\^[a-zA-Z]))`, LiteralStringEscape, nil},
{`"`, LiteralString, Pop(1)},
{`~[0-9.*]*[~#+BPWXb-ginpswx]`, LiteralStringInterpol, nil},
{`[^"\\~]+`, LiteralString, nil},
{`~`, LiteralString, nil},
},
"directive": {
{`(define)(\s*)(\()((?:(?:[A-Z_]\w*)|(?:[a-z]\w*|'[^\n']*[^\\]')))`, ByGroups(NameEntity, Text, Punctuation, NameConstant), Pop(1)},
{`(record)(\s*)(\()((?:(?:[A-Z_]\w*)|(?:[a-z]\w*|'[^\n']*[^\\]')))`, ByGroups(NameEntity, Text, Punctuation, NameLabel), Pop(1)},
{`(?:[a-z]\w*|'[^\n']*[^\\]')`, NameEntity, Pop(1)},
},
"map_key": {
Include("root"),
{`=>`, Punctuation, Push("map_val")},
{`:=`, Punctuation, Push("map_val")},
{`\}`, Punctuation, Pop(1)},
},
"map_val": {
Include("root"),
{`,`, Punctuation, Pop(1)},
{`(?=\})`, Punctuation, Pop(1)},
},
},
))
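
A hedged sketch of how the filename globs declared above feed lexer selection, and of what wrapping the result in chroma.Coalesce buys; the ring.erl filename and the one-line Erlang input are made up for illustration.

package main

import (
	"fmt"

	"github.com/alecthomas/chroma"
	"github.com/alecthomas/chroma/lexers"
)

// countTokens drains an iterator and reports how many tokens it yielded.
func countTokens(it chroma.Iterator) int {
	n := 0
	for tok := it(); tok != chroma.EOF; tok = it() {
		n++
	}
	return n
}

func main() {
	src := "start() -> spawn(fun() -> ok end).\n" // illustrative input
	lexer := lexers.Match("ring.erl")             // resolved via the "*.erl" glob above
	if lexer == nil {
		lexer = lexers.Fallback
	}
	raw, err := lexer.Tokenise(nil, src)
	if err != nil {
		panic(err)
	}
	merged, err := chroma.Coalesce(lexer).Tokenise(nil, src)
	if err != nil {
		panic(err)
	}
	// Coalesce folds adjacent tokens of the same type into one, so the
	// second count can never exceed the first.
	fmt.Println(countTokens(raw), ">=", countTokens(merged))
}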

View file

@ -0,0 +1,115 @@
package f
import (
. "github.com/alecthomas/chroma" // nolint
"github.com/alecthomas/chroma/lexers/internal"
)
// Factor lexer.
var Factor = internal.Register(MustNewLexer(
&Config{
Name: "Factor",
Aliases: []string{"factor"},
Filenames: []string{"*.factor"},
MimeTypes: []string{"text/x-factor"},
},
Rules{
"root": {
{`#!.*$`, CommentPreproc, nil},
Default(Push("base")),
},
"base": {
{`\s+`, Text, nil},
{`((?:MACRO|MEMO|TYPED)?:[:]?)(\s+)(\S+)`, ByGroups(Keyword, Text, NameFunction), nil},
{`(M:[:]?)(\s+)(\S+)(\s+)(\S+)`, ByGroups(Keyword, Text, NameClass, Text, NameFunction), nil},
{`(C:)(\s+)(\S+)(\s+)(\S+)`, ByGroups(Keyword, Text, NameFunction, Text, NameClass), nil},
{`(GENERIC:)(\s+)(\S+)`, ByGroups(Keyword, Text, NameFunction), nil},
{`(HOOK:|GENERIC#)(\s+)(\S+)(\s+)(\S+)`, ByGroups(Keyword, Text, NameFunction, Text, NameFunction), nil},
{`\(\s`, NameFunction, Push("stackeffect")},
{`;\s`, Keyword, nil},
{`(USING:)(\s+)`, ByGroups(KeywordNamespace, Text), Push("vocabs")},
{`(USE:|UNUSE:|IN:|QUALIFIED:)(\s+)(\S+)`, ByGroups(KeywordNamespace, Text, NameNamespace), nil},
{`(QUALIFIED-WITH:)(\s+)(\S+)(\s+)(\S+)`, ByGroups(KeywordNamespace, Text, NameNamespace, Text, NameNamespace), nil},
{`(FROM:|EXCLUDE:)(\s+)(\S+)(\s+=>\s)`, ByGroups(KeywordNamespace, Text, NameNamespace, Text), Push("words")},
{`(RENAME:)(\s+)(\S+)(\s+)(\S+)(\s+=>\s+)(\S+)`, ByGroups(KeywordNamespace, Text, NameFunction, Text, NameNamespace, Text, NameFunction), nil},
{`(ALIAS:|TYPEDEF:)(\s+)(\S+)(\s+)(\S+)`, ByGroups(KeywordNamespace, Text, NameFunction, Text, NameFunction), nil},
{`(DEFER:|FORGET:|POSTPONE:)(\s+)(\S+)`, ByGroups(KeywordNamespace, Text, NameFunction), nil},
{`(TUPLE:|ERROR:)(\s+)(\S+)(\s+<\s+)(\S+)`, ByGroups(Keyword, Text, NameClass, Text, NameClass), Push("slots")},
{`(TUPLE:|ERROR:|BUILTIN:)(\s+)(\S+)`, ByGroups(Keyword, Text, NameClass), Push("slots")},
{`(MIXIN:|UNION:|INTERSECTION:)(\s+)(\S+)`, ByGroups(Keyword, Text, NameClass), nil},
{`(PREDICATE:)(\s+)(\S+)(\s+<\s+)(\S+)`, ByGroups(Keyword, Text, NameClass, Text, NameClass), nil},
{`(C:)(\s+)(\S+)(\s+)(\S+)`, ByGroups(Keyword, Text, NameFunction, Text, NameClass), nil},
{`(INSTANCE:)(\s+)(\S+)(\s+)(\S+)`, ByGroups(Keyword, Text, NameClass, Text, NameClass), nil},
{`(SLOT:)(\s+)(\S+)`, ByGroups(Keyword, Text, NameFunction), nil},
{`(SINGLETON:)(\s+)(\S+)`, ByGroups(Keyword, Text, NameClass), nil},
{`SINGLETONS:`, Keyword, Push("classes")},
{`(CONSTANT:|SYMBOL:|MAIN:|HELP:)(\s+)(\S+)`, ByGroups(Keyword, Text, NameFunction), nil},
{`SYMBOLS:\s`, Keyword, Push("words")},
{`SYNTAX:\s`, Keyword, nil},
{`ALIEN:\s`, Keyword, nil},
{`(STRUCT:)(\s+)(\S+)`, ByGroups(Keyword, Text, NameClass), nil},
{`(FUNCTION:)(\s+\S+\s+)(\S+)(\s+\(\s+[^)]+\)\s)`, ByGroups(KeywordNamespace, Text, NameFunction, Text), nil},
{`(FUNCTION-ALIAS:)(\s+)(\S+)(\s+\S+\s+)(\S+)(\s+\(\s+[^)]+\)\s)`, ByGroups(KeywordNamespace, Text, NameFunction, Text, NameFunction, Text), nil},
{`(?:<PRIVATE|PRIVATE>)\s`, KeywordNamespace, nil},
{`"""\s+(?:.|\n)*?\s+"""`, LiteralString, nil},
{`"(?:\\\\|\\"|[^"])*"`, LiteralString, nil},
{`\S+"\s+(?:\\\\|\\"|[^"])*"`, LiteralString, nil},
{`CHAR:\s+(?:\\[\\abfnrstv]|[^\\]\S*)\s`, LiteralStringChar, nil},
{`!\s+.*$`, Comment, nil},
{`#!\s+.*$`, Comment, nil},
{`/\*\s+(?:.|\n)*?\s\*/\s`, Comment, nil},
{`[tf]\s`, NameConstant, nil},
{`[\\$]\s+\S+`, NameConstant, nil},
{`M\\\s+\S+\s+\S+`, NameConstant, nil},
{`[+-]?(?:[\d,]*\d)?\.(?:\d([\d,]*\d)?)?(?:[eE][+-]?\d+)?\s`, LiteralNumber, nil},
{`[+-]?\d(?:[\d,]*\d)?(?:[eE][+-]?\d+)?\s`, LiteralNumber, nil},
{`0x[a-fA-F\d](?:[a-fA-F\d,]*[a-fA-F\d])?(?:p\d([\d,]*\d)?)?\s`, LiteralNumber, nil},
{`NAN:\s+[a-fA-F\d](?:[a-fA-F\d,]*[a-fA-F\d])?(?:p\d([\d,]*\d)?)?\s`, LiteralNumber, nil},
{`0b[01]+\s`, LiteralNumberBin, nil},
{`0o[0-7]+\s`, LiteralNumberOct, nil},
{`(?:\d([\d,]*\d)?)?\+\d(?:[\d,]*\d)?/\d(?:[\d,]*\d)?\s`, LiteralNumber, nil},
{`(?:\-\d([\d,]*\d)?)?\-\d(?:[\d,]*\d)?/\d(?:[\d,]*\d)?\s`, LiteralNumber, nil},
{`(?:deprecated|final|foldable|flushable|inline|recursive)\s`, Keyword, nil},
{Words(``, `\s`, `-rot`, `2bi`, `2bi@`, `2bi*`, `2curry`, `2dip`, `2drop`, `2dup`, `2keep`, `2nip`, `2over`, `2tri`, `2tri@`, `2tri*`, `3bi`, `3curry`, `3dip`, `3drop`, `3dup`, `3keep`, `3tri`, `4dip`, `4drop`, `4dup`, `4keep`, `<wrapper>`, `=`, `>boolean`, `clone`, `?`, `?execute`, `?if`, `and`, `assert`, `assert=`, `assert?`, `bi`, `bi-curry`, `bi-curry@`, `bi-curry*`, `bi@`, `bi*`, `boa`, `boolean`, `boolean?`, `both?`, `build`, `call`, `callstack`, `callstack>array`, `callstack?`, `clear`, `(clone)`, `compose`, `compose?`, `curry`, `curry?`, `datastack`, `die`, `dip`, `do`, `drop`, `dup`, `dupd`, `either?`, `eq?`, `equal?`, `execute`, `hashcode`, `hashcode*`, `identity-hashcode`, `identity-tuple`, `identity-tuple?`, `if`, `if*`, `keep`, `loop`, `most`, `new`, `nip`, `not`, `null`, `object`, `or`, `over`, `pick`, `prepose`, `retainstack`, `rot`, `same?`, `swap`, `swapd`, `throw`, `tri`, `tri-curry`, `tri-curry@`, `tri-curry*`, `tri@`, `tri*`, `tuple`, `tuple?`, `unless`, `unless*`, `until`, `when`, `when*`, `while`, `with`, `wrapper`, `wrapper?`, `xor`), NameBuiltin, nil},
{Words(``, `\s`, `2cache`, `<enum>`, `>alist`, `?at`, `?of`, `assoc`, `assoc-all?`, `assoc-any?`, `assoc-clone-like`, `assoc-combine`, `assoc-diff`, `assoc-diff!`, `assoc-differ`, `assoc-each`, `assoc-empty?`, `assoc-filter`, `assoc-filter!`, `assoc-filter-as`, `assoc-find`, `assoc-hashcode`, `assoc-intersect`, `assoc-like`, `assoc-map`, `assoc-map-as`, `assoc-partition`, `assoc-refine`, `assoc-size`, `assoc-stack`, `assoc-subset?`, `assoc-union`, `assoc-union!`, `assoc=`, `assoc>map`, `assoc?`, `at`, `at+`, `at*`, `cache`, `change-at`, `clear-assoc`, `delete-at`, `delete-at*`, `enum`, `enum?`, `extract-keys`, `inc-at`, `key?`, `keys`, `map>assoc`, `maybe-set-at`, `new-assoc`, `of`, `push-at`, `rename-at`, `set-at`, `sift-keys`, `sift-values`, `substitute`, `unzip`, `value-at`, `value-at*`, `value?`, `values`, `zip`), NameBuiltin, nil},
{Words(``, `\s`, `2cleave`, `2cleave>quot`, `3cleave`, `3cleave>quot`, `4cleave`, `4cleave>quot`, `alist>quot`, `call-effect`, `case`, `case-find`, `case>quot`, `cleave`, `cleave>quot`, `cond`, `cond>quot`, `deep-spread>quot`, `execute-effect`, `linear-case-quot`, `no-case`, `no-case?`, `no-cond`, `no-cond?`, `recursive-hashcode`, `shallow-spread>quot`, `spread`, `to-fixed-point`, `wrong-values`, `wrong-values?`), NameBuiltin, nil},
{Words(``, `\s`, `-`, `/`, `/f`, `/i`, `/mod`, `2/`, `2^`, `<`, `<=`, `<fp-nan>`, `>`, `>=`, `>bignum`, `>fixnum`, `>float`, `>integer`, `(all-integers?)`, `(each-integer)`, `(find-integer)`, `*`, `+`, `?1+`, `abs`, `align`, `all-integers?`, `bignum`, `bignum?`, `bit?`, `bitand`, `bitnot`, `bitor`, `bits>double`, `bits>float`, `bitxor`, `complex`, `complex?`, `denominator`, `double>bits`, `each-integer`, `even?`, `find-integer`, `find-last-integer`, `fixnum`, `fixnum?`, `float`, `float>bits`, `float?`, `fp-bitwise=`, `fp-infinity?`, `fp-nan-payload`, `fp-nan?`, `fp-qnan?`, `fp-sign`, `fp-snan?`, `fp-special?`, `if-zero`, `imaginary-part`, `integer`, `integer>fixnum`, `integer>fixnum-strict`, `integer?`, `log2`, `log2-expects-positive`, `log2-expects-positive?`, `mod`, `neg`, `neg?`, `next-float`, `next-power-of-2`, `number`, `number=`, `number?`, `numerator`, `odd?`, `out-of-fixnum-range`, `out-of-fixnum-range?`, `power-of-2?`, `prev-float`, `ratio`, `ratio?`, `rational`, `rational?`, `real`, `real-part`, `real?`, `recip`, `rem`, `sgn`, `shift`, `sq`, `times`, `u<`, `u<=`, `u>`, `u>=`, `unless-zero`, `unordered?`, `when-zero`, `zero?`), NameBuiltin, nil},
{Words(``, `\s`, `1sequence`, `2all?`, `2each`, `2map`, `2map-as`, `2map-reduce`, `2reduce`, `2selector`, `2sequence`, `3append`, `3append-as`, `3each`, `3map`, `3map-as`, `3sequence`, `4sequence`, `<repetition>`, `<reversed>`, `<slice>`, `?first`, `?last`, `?nth`, `?second`, `?set-nth`, `accumulate`, `accumulate!`, `accumulate-as`, `all?`, `any?`, `append`, `append!`, `append-as`, `assert-sequence`, `assert-sequence=`, `assert-sequence?`, `binary-reduce`, `bounds-check`, `bounds-check?`, `bounds-error`, `bounds-error?`, `but-last`, `but-last-slice`, `cartesian-each`, `cartesian-map`, `cartesian-product`, `change-nth`, `check-slice`, `check-slice-error`, `clone-like`, `collapse-slice`, `collector`, `collector-for`, `concat`, `concat-as`, `copy`, `count`, `cut`, `cut-slice`, `cut*`, `delete-all`, `delete-slice`, `drop-prefix`, `each`, `each-from`, `each-index`, `empty?`, `exchange`, `filter`, `filter!`, `filter-as`, `find`, `find-from`, `find-index`, `find-index-from`, `find-last`, `find-last-from`, `first`, `first2`, `first3`, `first4`, `flip`, `follow`, `fourth`, `glue`, `halves`, `harvest`, `head`, `head-slice`, `head-slice*`, `head*`, `head?`, `if-empty`, `immutable`, `immutable-sequence`, `immutable-sequence?`, `immutable?`, `index`, `index-from`, `indices`, `infimum`, `infimum-by`, `insert-nth`, `interleave`, `iota`, `iota-tuple`, `iota-tuple?`, `join`, `join-as`, `last`, `last-index`, `last-index-from`, `length`, `lengthen`, `like`, `longer`, `longer?`, `longest`, `map`, `map!`, `map-as`, `map-find`, `map-find-last`, `map-index`, `map-integers`, `map-reduce`, `map-sum`, `max-length`, `member-eq?`, `member?`, `midpoint@`, `min-length`, `mismatch`, `move`, `new-like`, `new-resizable`, `new-sequence`, `non-negative-integer-expected`, `non-negative-integer-expected?`, `nth`, `nths`, `pad-head`, `pad-tail`, `padding`, `partition`, `pop`, `pop*`, `prefix`, `prepend`, `prepend-as`, `produce`, `produce-as`, `product`, `push`, `push-all`, `push-either`, `push-if`, `reduce`, `reduce-index`, `remove`, `remove!`, `remove-eq`, `remove-eq!`, `remove-nth`, `remove-nth!`, `repetition`, `repetition?`, `replace-slice`, `replicate`, `replicate-as`, `rest`, `rest-slice`, `reverse`, `reverse!`, `reversed`, `reversed?`, `second`, `selector`, `selector-for`, `sequence`, `sequence-hashcode`, `sequence=`, `sequence?`, `set-first`, `set-fourth`, `set-last`, `set-length`, `set-nth`, `set-second`, `set-third`, `short`, `shorten`, `shorter`, `shorter?`, `shortest`, `sift`, `slice`, `slice-error`, `slice-error?`, `slice?`, `snip`, `snip-slice`, `start`, `start*`, `subseq`, `subseq?`, `suffix`, `suffix!`, `sum`, `sum-lengths`, `supremum`, `supremum-by`, `surround`, `tail`, `tail-slice`, `tail-slice*`, `tail*`, `tail?`, `third`, `trim`, `trim-head`, `trim-head-slice`, `trim-slice`, `trim-tail`, `trim-tail-slice`, `unclip`, `unclip-last`, `unclip-last-slice`, `unclip-slice`, `unless-empty`, `virtual-exemplar`, `virtual-sequence`, `virtual-sequence?`, `virtual@`, `when-empty`), NameBuiltin, nil},
{Words(``, `\s`, `+@`, `change`, `change-global`, `counter`, `dec`, `get`, `get-global`, `global`, `inc`, `init-namespaces`, `initialize`, `is-global`, `make-assoc`, `namespace`, `namestack`, `off`, `on`, `set`, `set-global`, `set-namestack`, `toggle`, `with-global`, `with-scope`, `with-variable`, `with-variables`), NameBuiltin, nil},
{Words(``, `\s`, `1array`, `2array`, `3array`, `4array`, `<array>`, `>array`, `array`, `array?`, `pair`, `pair?`, `resize-array`), NameBuiltin, nil},
{Words(``, `\s`, `(each-stream-block-slice)`, `(each-stream-block)`, `(stream-contents-by-block)`, `(stream-contents-by-element)`, `(stream-contents-by-length-or-block)`, `(stream-contents-by-length)`, `+byte+`, `+character+`, `bad-seek-type`, `bad-seek-type?`, `bl`, `contents`, `each-block`, `each-block-size`, `each-block-slice`, `each-line`, `each-morsel`, `each-stream-block`, `each-stream-block-slice`, `each-stream-line`, `error-stream`, `flush`, `input-stream`, `input-stream?`, `invalid-read-buffer`, `invalid-read-buffer?`, `lines`, `nl`, `output-stream`, `output-stream?`, `print`, `read`, `read-into`, `read-partial`, `read-partial-into`, `read-until`, `read1`, `readln`, `seek-absolute`, `seek-absolute?`, `seek-end`, `seek-end?`, `seek-input`, `seek-output`, `seek-relative`, `seek-relative?`, `stream-bl`, `stream-contents`, `stream-contents*`, `stream-copy`, `stream-copy*`, `stream-element-type`, `stream-flush`, `stream-length`, `stream-lines`, `stream-nl`, `stream-print`, `stream-read`, `stream-read-into`, `stream-read-partial`, `stream-read-partial-into`, `stream-read-partial-unsafe`, `stream-read-unsafe`, `stream-read-until`, `stream-read1`, `stream-readln`, `stream-seek`, `stream-seekable?`, `stream-tell`, `stream-write`, `stream-write1`, `tell-input`, `tell-output`, `with-error-stream`, `with-error-stream*`, `with-error>output`, `with-input-output+error-streams`, `with-input-output+error-streams*`, `with-input-stream`, `with-input-stream*`, `with-output-stream`, `with-output-stream*`, `with-output>error`, `with-output+error-stream`, `with-output+error-stream*`, `with-streams`, `with-streams*`, `write`, `write1`), NameBuiltin, nil},
{Words(``, `\s`, `1string`, `<string>`, `>string`, `resize-string`, `string`, `string?`), NameBuiltin, nil},
{Words(``, `\s`, `1vector`, `<vector>`, `>vector`, `?push`, `vector`, `vector?`), NameBuiltin, nil},
{Words(``, `\s`, `<condition>`, `<continuation>`, `<restart>`, `attempt-all`, `attempt-all-error`, `attempt-all-error?`, `callback-error-hook`, `callcc0`, `callcc1`, `cleanup`, `compute-restarts`, `condition`, `condition?`, `continuation`, `continuation?`, `continue`, `continue-restart`, `continue-with`, `current-continuation`, `error`, `error-continuation`, `error-in-thread`, `error-thread`, `ifcc`, `ignore-errors`, `in-callback?`, `original-error`, `recover`, `restart`, `restart?`, `restarts`, `rethrow`, `rethrow-restarts`, `return`, `return-continuation`, `thread-error-hook`, `throw-continue`, `throw-restarts`, `with-datastack`, `with-return`), NameBuiltin, nil},
{`\S+`, Text, nil},
},
"stackeffect": {
{`\s+`, Text, nil},
{`\(\s+`, NameFunction, Push("stackeffect")},
{`\)\s`, NameFunction, Pop(1)},
{`--\s`, NameFunction, nil},
{`\S+`, NameVariable, nil},
},
"slots": {
{`\s+`, Text, nil},
{`;\s`, Keyword, Pop(1)},
{`(\{\s+)(\S+)(\s+[^}]+\s+\}\s)`, ByGroups(Text, NameVariable, Text), nil},
{`\S+`, NameVariable, nil},
},
"vocabs": {
{`\s+`, Text, nil},
{`;\s`, Keyword, Pop(1)},
{`\S+`, NameNamespace, nil},
},
"classes": {
{`\s+`, Text, nil},
{`;\s`, Keyword, Pop(1)},
{`\S+`, NameClass, nil},
},
"words": {
{`\s+`, Text, nil},
{`;\s`, Keyword, Pop(1)},
{`\S+`, NameFunction, nil},
},
},
))
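
The same lexer can also be reached through the quick helper package, which looks up lexer, formatter and style by name; the Factor one-liner and the html/monokai choices below are assumptions made for this sketch.

package main

import (
	"log"
	"os"

	"github.com/alecthomas/chroma/quick"
)

func main() {
	// Illustrative Factor snippet; "factor" is the alias registered above.
	src := `: hello ( -- ) "Hello, world" print ;`
	// quick.Highlight wires together lexer, formatter and style lookup;
	// "html" and "monokai" name an existing formatter and style.
	if err := quick.Highlight(os.Stdout, src, "factor", "html", "monokai"); err != nil {
		log.Fatal(err)
	}
}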

View file

@ -0,0 +1,65 @@
package f
import (
. "github.com/alecthomas/chroma" // nolint
"github.com/alecthomas/chroma/lexers/internal"
)
// Fish lexer.
var Fish = internal.Register(MustNewLexer(
&Config{
Name: "Fish",
Aliases: []string{"fish", "fishshell"},
Filenames: []string{"*.fish", "*.load"},
MimeTypes: []string{"application/x-fish"},
},
Rules{
"root": {
Include("basic"),
Include("data"),
Include("interp"),
},
"interp": {
{`\$\(\(`, Keyword, Push("math")},
{`\(`, Keyword, Push("paren")},
{`\$#?(\w+|.)`, NameVariable, nil},
},
"basic": {
{`\b(begin|end|if|else|while|break|for|in|return|function|block|case|continue|switch|not|and|or|set|echo|exit|pwd|true|false|cd|count|test)(\s*)\b`, ByGroups(Keyword, Text), nil},
{`\b(alias|bg|bind|breakpoint|builtin|command|commandline|complete|contains|dirh|dirs|emit|eval|exec|fg|fish|fish_config|fish_indent|fish_pager|fish_prompt|fish_right_prompt|fish_update_completions|fishd|funced|funcsave|functions|help|history|isatty|jobs|math|mimedb|nextd|open|popd|prevd|psub|pushd|random|read|set_color|source|status|trap|type|ulimit|umask|vared|fc|getopts|hash|kill|printf|time|wait)\s*\b(?!\.)`, NameBuiltin, nil},
{`#.*\n`, Comment, nil},
{`\\[\w\W]`, LiteralStringEscape, nil},
{`(\b\w+)(\s*)(=)`, ByGroups(NameVariable, Text, Operator), nil},
{`[\[\]()=]`, Operator, nil},
{`<<-?\s*(\'?)\\?(\w+)[\w\W]+?\2`, LiteralString, nil},
},
"data": {
{`(?s)\$?"(\\\\|\\[0-7]+|\\.|[^"\\$])*"`, LiteralStringDouble, nil},
{`"`, LiteralStringDouble, Push("string")},
{`(?s)\$'(\\\\|\\[0-7]+|\\.|[^'\\])*'`, LiteralStringSingle, nil},
{`(?s)'.*?'`, LiteralStringSingle, nil},
{`;`, Punctuation, nil},
{`&|\||\^|<|>`, Operator, nil},
{`\s+`, Text, nil},
{`\d+(?= |\Z)`, LiteralNumber, nil},
{"[^=\\s\\[\\]{}()$\"\\'`\\\\<&|;]+", Text, nil},
},
"string": {
{`"`, LiteralStringDouble, Pop(1)},
{`(?s)(\\\\|\\[0-7]+|\\.|[^"\\$])+`, LiteralStringDouble, nil},
Include("interp"),
},
"paren": {
{`\)`, Keyword, Pop(1)},
Include("root"),
},
"math": {
{`\)\)`, Keyword, Pop(1)},
{`[-+*/%^|&]|\*\*|\|\|`, Operator, nil},
{`\d+#\d+`, LiteralNumber, nil},
{`\d+#(?! )`, LiteralNumber, nil},
{`\d+`, LiteralNumber, nil},
Include("root"),
},
},
))
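
A small sketch of rendering fish source to a colour terminal with the lexer above; the fishshell alias, the sample command line and the fallback style are illustrative assumptions.

package main

import (
	"os"

	"github.com/alecthomas/chroma/formatters"
	"github.com/alecthomas/chroma/lexers"
	"github.com/alecthomas/chroma/styles"
)

func main() {
	// "fishshell" is the second alias declared in the Config above.
	lexer := lexers.Get("fishshell")
	it, err := lexer.Tokenise(nil, "set -x PATH $HOME/bin $PATH\n")
	if err != nil {
		panic(err)
	}
	// formatters.TTY emits ANSI-coloured output; styles.Fallback is the
	// built-in default style.
	if err := formatters.TTY.Format(os.Stdout, styles.Fallback, it); err != nil {
		panic(err)
	}
}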

View file

@ -0,0 +1,40 @@
package f
import (
. "github.com/alecthomas/chroma" // nolint
"github.com/alecthomas/chroma/lexers/internal"
)
// Forth lexer.
var Forth = internal.Register(MustNewLexer(
&Config{
Name: "Forth",
Aliases: []string{"forth"},
Filenames: []string{"*.frt", "*.fth", "*.fs"},
MimeTypes: []string{"application/x-forth"},
CaseInsensitive: true,
},
Rules{
"root": {
{`\s+`, Text, nil},
{`\\.*?\n`, CommentSingle, nil},
{`\([\s].*?\)`, CommentSingle, nil},
{`(:|variable|constant|value|buffer:)(\s+)`, ByGroups(KeywordNamespace, Text), Push("worddef")},
{`([.sc]")(\s+?)`, ByGroups(LiteralString, Text), Push("stringdef")},
{`(blk|block|buffer|evaluate|flush|load|save-buffers|update|empty-buffers|list|refill|scr|thru|\#s|\*\/mod|\+loop|\/mod|0<|0=|1\+|1-|2!|2\*|2\/|2@|2drop|2dup|2over|2swap|>body|>in|>number|>r|\?dup|abort|abort\"|abs|accept|align|aligned|allot|and|base|begin|bl|c!|c,|c@|cell\+|cells|char|char\+|chars|constant|count|cr|create|decimal|depth|do|does>|drop|dup|else|emit|environment\?|evaluate|execute|exit|fill|find|fm\/mod|here|hold|i|if|immediate|invert|j|key|leave|literal|loop|lshift|m\*|max|min|mod|move|negate|or|over|postpone|quit|r>|r@|recurse|repeat|rot|rshift|s\"|s>d|sign|sm\/rem|source|space|spaces|state|swap|then|type|u\.|u\<|um\*|um\/mod|unloop|until|variable|while|word|xor|\[char\]|\[\'\]|@|!|\#|<\#|\#>|:|;|\+|-|\*|\/|,|<|>|\|1\+|1-|\.|\.r|0<>|0>|2>r|2r>|2r@|:noname|\?do|again|c\"|case|compile,|endcase|endof|erase|false|hex|marker|nip|of|pad|parse|pick|refill|restore-input|roll|save-input|source-id|to|true|tuck|u\.r|u>|unused|value|within|\[compile\]|\#tib|convert|expect|query|span|tib|2constant|2literal|2variable|d\+|d-|d\.|d\.r|d0<|d0=|d2\*|d2\/|d<|d=|d>s|dabs|dmax|dmin|dnegate|m\*\/|m\+|2rot|du<|catch|throw|abort|abort\"|at-xy|key\?|page|ekey|ekey>char|ekey\?|emit\?|ms|time&date|BIN|CLOSE-FILE|CREATE-FILE|DELETE-FILE|FILE-POSITION|FILE-SIZE|INCLUDE-FILE|INCLUDED|OPEN-FILE|R\/O|R\/W|READ-FILE|READ-LINE|REPOSITION-FILE|RESIZE-FILE|S\"|SOURCE-ID|W/O|WRITE-FILE|WRITE-LINE|FILE-STATUS|FLUSH-FILE|REFILL|RENAME-FILE|>float|d>f|f!|f\*|f\+|f-|f\/|f0<|f0=|f<|f>d|f@|falign|faligned|fconstant|fdepth|fdrop|fdup|fliteral|float\+|floats|floor|fmax|fmin|fnegate|fover|frot|fround|fswap|fvariable|represent|df!|df@|dfalign|dfaligned|dfloat\+|dfloats|f\*\*|f\.|fabs|facos|facosh|falog|fasin|fasinh|fatan|fatan2|fatanh|fcos|fcosh|fe\.|fexp|fexpm1|fln|flnp1|flog|fs\.|fsin|fsincos|fsinh|fsqrt|ftan|ftanh|f~|precision|set-precision|sf!|sf@|sfalign|sfaligned|sfloat\+|sfloats|\(local\)|to|locals\||allocate|free|resize|definitions|find|forth-wordlist|get-current|get-order|search-wordlist|set-current|set-order|wordlist|also|forth|only|order|previous|-trailing|\/string|blank|cmove|cmove>|compare|search|sliteral|.s|dump|see|words|;code|ahead|assembler|bye|code|cs-pick|cs-roll|editor|state|\[else\]|\[if\]|\[then\]|forget|defer|defer@|defer!|action-of|begin-structure|field:|buffer:|parse-name|buffer:|traverse-wordlist|n>r|nr>|2value|fvalue|name>interpret|name>compile|name>string|cfield:|end-structure)\s`, Keyword, nil},
{`(\$[0-9A-F]+)`, LiteralNumberHex, nil},
{`(\#|%|&|\-|\+)?[0-9]+`, LiteralNumberInteger, nil},
{`(\#|%|&|\-|\+)?[0-9.]+`, KeywordType, nil},
{`(@i|!i|@e|!e|pause|noop|turnkey|sleep|itype|icompare|sp@|sp!|rp@|rp!|up@|up!|>a|a>|a@|a!|a@+|a@-|>b|b>|b@|b!|b@+|b@-|find-name|1ms|sp0|rp0|\(evaluate\)|int-trap|int!)\s`, NameConstant, nil},
{`(do-recognizer|r:fail|recognizer:|get-recognizers|set-recognizers|r:float|r>comp|r>int|r>post|r:name|r:word|r:dnum|r:num|recognizer|forth-recognizer|rec:num|rec:float|rec:word)\s`, NameDecorator, nil},
{`(Evalue|Rvalue|Uvalue|Edefer|Rdefer|Udefer)(\s+)`, ByGroups(KeywordNamespace, Text), Push("worddef")},
{`[^\s]+(?=[\s])`, NameFunction, nil},
},
"worddef": {
{`\S+`, NameClass, Pop(1)},
},
"stringdef": {
{`[^"]+`, LiteralString, Pop(1)},
},
},
))
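
Since the Config above sets CaseInsensitive, upper- and lower-case spellings of the same word should tokenise identically; a sketch of checking that, with made-up Forth input.

package main

import (
	"fmt"

	"github.com/alecthomas/chroma"
	"github.com/alecthomas/chroma/lexers"
)

// tokenTypes collects the token types produced for src, ignoring values.
func tokenTypes(lexer chroma.Lexer, src string) []chroma.TokenType {
	it, err := lexer.Tokenise(nil, src)
	if err != nil {
		panic(err)
	}
	var types []chroma.TokenType
	for tok := it(); tok != chroma.EOF; tok = it() {
		types = append(types, tok.Type)
	}
	return types
}

func main() {
	lexer := lexers.Get("forth")
	// Both lines should print the same sequence of token types because the
	// rules above are compiled case-insensitively.
	fmt.Println(tokenTypes(lexer, "variable counter\n"))
	fmt.Println(tokenTypes(lexer, "VARIABLE COUNTER\n"))
}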

View file

@ -0,0 +1,47 @@
package f
import (
. "github.com/alecthomas/chroma" // nolint
"github.com/alecthomas/chroma/lexers/internal"
)
// Fortran lexer.
var Fortran = internal.Register(MustNewLexer(
&Config{
Name: "Fortran",
Aliases: []string{"fortran"},
Filenames: []string{"*.f03", "*.f90", "*.F03", "*.F90"},
MimeTypes: []string{"text/x-fortran"},
CaseInsensitive: true,
},
Rules{
"root": {
{`^#.*\n`, CommentPreproc, nil},
{`!.*\n`, Comment, nil},
Include("strings"),
Include("core"),
{`[a-z][\w$]*`, Name, nil},
Include("nums"),
{`[\s]+`, Text, nil},
},
"core": {
{Words(`\b`, `\s*\b`, `ABSTRACT`, `ACCEPT`, `ALL`, `ALLSTOP`, `ALLOCATABLE`, `ALLOCATE`, `ARRAY`, `ASSIGN`, `ASSOCIATE`, `ASYNCHRONOUS`, `BACKSPACE`, `BIND`, `BLOCK`, `BLOCKDATA`, `BYTE`, `CALL`, `CASE`, `CLASS`, `CLOSE`, `CODIMENSION`, `COMMON`, `CONCURRENT`, `CONTIGUOUS`, `CONTAINS`, `CONTINUE`, `CRITICAL`, `CYCLE`, `DATA`, `DEALLOCATE`, `DECODE`, `DEFERRED`, `DIMENSION`, `DO`, `ELEMENTAL`, `ELSE`, `ENCODE`, `END`, `ENTRY`, `ENUM`, `ENUMERATOR`, `EQUIVALENCE`, `EXIT`, `EXTENDS`, `EXTERNAL`, `EXTRINSIC`, `FILE`, `FINAL`, `FORALL`, `FORMAT`, `FUNCTION`, `GENERIC`, `GOTO`, `IF`, `IMAGES`, `IMPLICIT`, `IMPORT`, `IMPURE`, `INCLUDE`, `INQUIRE`, `INTENT`, `INTERFACE`, `INTRINSIC`, `IS`, `LOCK`, `MEMORY`, `MODULE`, `NAMELIST`, `NULLIFY`, `NONE`, `NON_INTRINSIC`, `NON_OVERRIDABLE`, `NOPASS`, `OPEN`, `OPTIONAL`, `OPTIONS`, `PARAMETER`, `PASS`, `PAUSE`, `POINTER`, `PRINT`, `PRIVATE`, `PROGRAM`, `PROCEDURE`, `PROTECTED`, `PUBLIC`, `PURE`, `READ`, `RECURSIVE`, `RESULT`, `RETURN`, `REWIND`, `SAVE`, `SELECT`, `SEQUENCE`, `STOP`, `SUBMODULE`, `SUBROUTINE`, `SYNC`, `SYNCALL`, `SYNCIMAGES`, `SYNCMEMORY`, `TARGET`, `THEN`, `TYPE`, `UNLOCK`, `USE`, `VALUE`, `VOLATILE`, `WHERE`, `WRITE`, `WHILE`), Keyword, nil},
{Words(`\b`, `\s*\b`, `CHARACTER`, `COMPLEX`, `DOUBLE PRECISION`, `DOUBLE COMPLEX`, `INTEGER`, `LOGICAL`, `REAL`, `C_INT`, `C_SHORT`, `C_LONG`, `C_LONG_LONG`, `C_SIGNED_CHAR`, `C_SIZE_T`, `C_INT8_T`, `C_INT16_T`, `C_INT32_T`, `C_INT64_T`, `C_INT_LEAST8_T`, `C_INT_LEAST16_T`, `C_INT_LEAST32_T`, `C_INT_LEAST64_T`, `C_INT_FAST8_T`, `C_INT_FAST16_T`, `C_INT_FAST32_T`, `C_INT_FAST64_T`, `C_INTMAX_T`, `C_INTPTR_T`, `C_FLOAT`, `C_DOUBLE`, `C_LONG_DOUBLE`, `C_FLOAT_COMPLEX`, `C_DOUBLE_COMPLEX`, `C_LONG_DOUBLE_COMPLEX`, `C_BOOL`, `C_CHAR`, `C_PTR`, `C_FUNPTR`), KeywordType, nil},
{`(\*\*|\*|\+|-|\/|<|>|<=|>=|==|\/=|=)`, Operator, nil},
{`(::)`, KeywordDeclaration, nil},
{`[()\[\],:&%;.]`, Punctuation, nil},
{Words(`\b`, `\s*\b`, `Abort`, `Abs`, `Access`, `AChar`, `ACos`, `ACosH`, `AdjustL`, `AdjustR`, `AImag`, `AInt`, `Alarm`, `All`, `Allocated`, `ALog`, `AMax`, `AMin`, `AMod`, `And`, `ANInt`, `Any`, `ASin`, `ASinH`, `Associated`, `ATan`, `ATanH`, `Atomic_Define`, `Atomic_Ref`, `BesJ`, `BesJN`, `Bessel_J0`, `Bessel_J1`, `Bessel_JN`, `Bessel_Y0`, `Bessel_Y1`, `Bessel_YN`, `BesY`, `BesYN`, `BGE`, `BGT`, `BLE`, `BLT`, `Bit_Size`, `BTest`, `CAbs`, `CCos`, `Ceiling`, `CExp`, `Char`, `ChDir`, `ChMod`, `CLog`, `Cmplx`, `Command_Argument_Count`, `Complex`, `Conjg`, `Cos`, `CosH`, `Count`, `CPU_Time`, `CShift`, `CSin`, `CSqRt`, `CTime`, `C_Loc`, `C_Associated`, `C_Null_Ptr`, `C_Null_Funptr`, `C_F_Pointer`, `C_F_ProcPointer`, `C_Null_Char`, `C_Alert`, `C_Backspace`, `C_Form_Feed`, `C_FunLoc`, `C_Sizeof`, `C_New_Line`, `C_Carriage_Return`, `C_Horizontal_Tab`, `C_Vertical_Tab`, `DAbs`, `DACos`, `DASin`, `DATan`, `Date_and_Time`, `DbesJ`, `DbesJN`, `DbesY`, `DbesYN`, `Dble`, `DCos`, `DCosH`, `DDiM`, `DErF`, `DErFC`, `DExp`, `Digits`, `DiM`, `DInt`, `DLog`, `DMax`, `DMin`, `DMod`, `DNInt`, `Dot_Product`, `DProd`, `DSign`, `DSinH`, `DShiftL`, `DShiftR`, `DSin`, `DSqRt`, `DTanH`, `DTan`, `DTime`, `EOShift`, `Epsilon`, `ErF`, `ErFC`, `ErFC_Scaled`, `ETime`, `Execute_Command_Line`, `Exit`, `Exp`, `Exponent`, `Extends_Type_Of`, `FDate`, `FGet`, `FGetC`, `FindLoc`, `Float`, `Floor`, `Flush`, `FNum`, `FPutC`, `FPut`, `Fraction`, `FSeek`, `FStat`, `FTell`, `Gamma`, `GError`, `GetArg`, `Get_Command`, `Get_Command_Argument`, `Get_Environment_Variable`, `GetCWD`, `GetEnv`, `GetGId`, `GetLog`, `GetPId`, `GetUId`, `GMTime`, `HostNm`, `Huge`, `Hypot`, `IAbs`, `IAChar`, `IAll`, `IAnd`, `IAny`, `IArgC`, `IBClr`, `IBits`, `IBSet`, `IChar`, `IDate`, `IDiM`, `IDInt`, `IDNInt`, `IEOr`, `IErrNo`, `IFix`, `Imag`, `ImagPart`, `Image_Index`, `Index`, `Int`, `IOr`, `IParity`, `IRand`, `IsaTty`, `IShft`, `IShftC`, `ISign`, `Iso_C_Binding`, `Is_Contiguous`, `Is_Iostat_End`, `Is_Iostat_Eor`, `ITime`, `Kill`, `Kind`, `LBound`, `LCoBound`, `Len`, `Len_Trim`, `LGe`, `LGt`, `Link`, `LLe`, `LLt`, `LnBlnk`, `Loc`, `Log`, `Log_Gamma`, `Logical`, `Long`, `LShift`, `LStat`, `LTime`, `MaskL`, `MaskR`, `MatMul`, `Max`, `MaxExponent`, `MaxLoc`, `MaxVal`, `MClock`, `Merge`, `Merge_Bits`, `Move_Alloc`, `Min`, `MinExponent`, `MinLoc`, `MinVal`, `Mod`, `Modulo`, `MvBits`, `Nearest`, `New_Line`, `NInt`, `Norm2`, `Not`, `Null`, `Num_Images`, `Or`, `Pack`, `Parity`, `PError`, `Precision`, `Present`, `Product`, `Radix`, `Rand`, `Random_Number`, `Random_Seed`, `Range`, `Real`, `RealPart`, `Rename`, `Repeat`, `Reshape`, `RRSpacing`, `RShift`, `Same_Type_As`, `Scale`, `Scan`, `Second`, `Selected_Char_Kind`, `Selected_Int_Kind`, `Selected_Real_Kind`, `Set_Exponent`, `Shape`, `ShiftA`, `ShiftL`, `ShiftR`, `Short`, `Sign`, `Signal`, `SinH`, `Sin`, `Sleep`, `Sngl`, `Spacing`, `Spread`, `SqRt`, `SRand`, `Stat`, `Storage_Size`, `Sum`, `SymLnk`, `System`, `System_Clock`, `Tan`, `TanH`, `Time`, `This_Image`, `Tiny`, `TrailZ`, `Transfer`, `Transpose`, `Trim`, `TtyNam`, `UBound`, `UCoBound`, `UMask`, `Unlink`, `Unpack`, `Verify`, `XOr`, `ZAbs`, `ZCos`, `ZExp`, `ZLog`, `ZSin`, `ZSqRt`), NameBuiltin, nil},
{`\.(true|false)\.`, NameBuiltin, nil},
{`\.(eq|ne|lt|le|gt|ge|not|and|or|eqv|neqv)\.`, OperatorWord, nil},
},
"strings": {
{`(?s)"(\\\\|\\[0-7]+|\\.|[^"\\])*"`, LiteralStringDouble, nil},
{`(?s)'(\\\\|\\[0-7]+|\\.|[^'\\])*'`, LiteralStringSingle, nil},
},
"nums": {
{`\d+(?![.e])(_[a-z]\w+)?`, LiteralNumberInteger, nil},
{`[+-]?\d*\.\d+([ed][-+]?\d+)?(_[a-z]\w+)?`, LiteralNumberFloat, nil},
{`[+-]?\d+\.\d*([ed][-+]?\d+)?(_[a-z]\w+)?`, LiteralNumberFloat, nil},
},
},
))
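
The Filenames list above spells out both *.f90 and *.F90 because glob matching on filenames is case-sensitive; a sketch of the lookup with invented filenames.

package main

import (
	"fmt"

	"github.com/alecthomas/chroma/lexers"
)

func main() {
	// Hypothetical filenames; both should resolve to the Fortran lexer,
	// one via the "*.f90" glob and one via "*.F90".
	for _, name := range []string{"fluid.f90", "FLUID.F90"} {
		if lexer := lexers.Match(name); lexer != nil {
			fmt.Println(name, "->", lexer.Config().Name)
		}
	}
}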

Some files were not shown because too many files have changed in this diff.