Major refactoring: split type checker, fix bugs, improve naming
- Split 8280-line type_checker.nim into 13 modules in type_checking/ directory (types, type_utils, comparison, pragmas, user_defines, borrow_check, interfaces, builtins, names, resolution, expressions, declarations + entry point)
- Fix 10 bugs: parser expect() dead variable, fmterr lexing label, VM debug issues, debugger double-print, parseCmp precedence, unreachable handler
- Fix 12 typos/doc errors across token, lexer, parser, multibyte, opcodes, testing, and config modules
- Remove dead code: unused import std/math, ObjectKind variants
- Extract shared utilities (unwrapType, exprKey, nameKey) to type_system.nim
- Rename optimizer.nim to ref_analysis.nim (reflects actual purpose)
- Refactor peon.nim: extract CompilationConfig struct and parseCommandLine proc
- Rename AST fields for clarity: UnaryExpr.a->operand, BinaryExpr.a/b->left/right, constr->resolvedConstraint, VM stack ops, multibyte functions, opcode sets
- Remove duplicate planning document

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
.gitignore (vendored, 7 additions)
@@ -150,3 +150,10 @@ cython_debug/
 bin/
 .buildcache
 *.vsix
+
+# Local test/build executables
+/peon_program
+/tests/cli_flags
+/tests/module_typecheck
+/tests/parse_regressions
+/tests/typecheck
@@ -1,422 +0,0 @@
# Deref, Borrowed Match, And UFCS Plan

This document captures a concrete implementation plan for four related frontend features:

- a dereference surface, with `x[]` as the preferred spelling
- an address-of operator that maps to the current `borrow(...)` semantics
- Rust-style shorthand variant bindings such as `case Ok(value)`
- UFCS-style calls such as `x.len()`, with field access always winning over UFCS lookup

It is based on the repository state as of March 24, 2026.


## Summary

The shortest path is:

1. Treat `x[]` as a real dereference/place expression.
2. Add a dedicated address-of surface that lowers through the existing borrow pipeline.
3. Split match destructuring into two modes:
   - `match x` matches by value
   - `match x[]` matches through a borrowed view and must not move payload fields out
4. Add UFCS only for `CallExpr(GetterExpr(...))`, and only after ordinary field resolution fails.

That keeps the language rule simple:

- field access always has priority over UFCS
- borrowed matches are explicit
- by-value matches stay available for extraction

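The rules above can be sketched with hypothetical peon surface snippets. None of this is guaranteed to compile against the current frontend: the spellings are the proposal itself, and `Result`, `value`, and `inspect` are illustrative names only.

```
var r = Result.Ok(42)     # qualified enum constructor (works today)
var b = &r                # proposed address-of: sugar for borrow(r)

match b[]:                # proposed borrowed match: no payload moves out
    case Ok(value):       # positional shorthand binding
        inspect(value)    # `value` is a borrowed view of the payload
    case Err(_):
        pass

var n = "abc".len()       # proposed UFCS: rewritten to len("abc")
```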
## Current Repository State

### 1. `x[]` is already parseable

The postfix parser already treats `()`, `.`, and `[]` as one chain, and `parseSlice()` allows an empty bracket list. The parser therefore already accepts `x[]`; the blocker is semantic lowering, not syntax.

Relevant files:

- `src/frontend/parsing/parser.nim`
  - `parseSlice()`
  - `call()`

Current gap:

- `sliceExpr()` in `src/frontend/compiler/type_checker.nim` rejects anything other than exactly one element, so `x[]` currently dies during typechecking instead of becoming a deref expression.

### 2. The compiler already has borrow machinery

The current borrow surface is the builtin `borrow(x)`.

Relevant files:

- `src/peon/stdlib/builtins/misc.pn`
- `src/frontend/compiler/type_checker.nim`
  - `builtinBorrowExpr()`
  - `borrowType()`
- `src/frontend/compiler/type_system.nim`
  - `TypedBorrowExpr`
- `src/backend/c/codegen/generator.nim`
  - `generateBorrowExpr()`

This is useful because the address-of operator does not need a new ownership model. It only needs syntax plus a lowering path into the existing borrow representation.

### 3. `case Ok(value)` already works syntactically

The current match binder already supports positional payload bindings by field order, so `case Ok(value)` is already accepted by the typechecker for a single-field variant. The verbose `case Ok(value = value)` form is not required for that case today.

Relevant files:

- `src/frontend/compiler/type_checker.nim`
  - `resolveMatchVariant()`
  - `typeMatchBindings()`

What is missing is not the shorthand syntax. What is missing is correct ownership-mode semantics for match bindings.

### 4. Match bindings are copy/deep-copy style today

Current typed representation:

- `TypedMatchArm` only stores `variant`, `bindings`, and `body`
- it does not record whether a binding is by-value or by-borrow

Current frontend behavior:

- `matchStmt()` creates names for payload bindings and pushes them into the arm scope
- `matchBindingName()` always gives the binding the field's value type directly
- there is no distinction between owned and borrowed destructuring

Current ownership analysis:

- match bindings are classified from their value type alone
- there is no way to tell ownership analysis that a binding is a borrow of a payload field rather than a copied-out local

Current backend behavior:

- the bytecode backend loads each bound field and then calls `emitDeepCopyIfNeeded(...)` before installing the local
- the native C backend emits a fresh local initialized with `generateValueCopy(...)`

Relevant files:

- `src/frontend/compiler/type_system.nim`
  - `TypedMatchArm`
- `src/frontend/compiler/type_checker.nim`
  - `matchTargetType()`
  - `matchBindingName()`
  - `typeMatchBindings()`
  - `matchStmt()`
- `src/frontend/compiler/ownership_analysis.nim`
  - match-arm handling in `visitNode()`
- `src/backend/bytecode/codegen/generator.nim`
  - `generateMatchStmt()`
- `src/backend/c/codegen/generator.nim`
  - `generateMatchBindingBlock()`
  - `generateMatchStmt()`

Implication:

- today, payload bindings do not model "move out of owned scrutinee" versus "borrow from dereferenced scrutinee"
- both backends currently materialize new locals from the payload

### 5. UFCS is not implemented today

The parser already builds `GetterExpr` followed by `CallExpr`, so `x.len()` parses.

The typechecker does not treat that as method syntax. In `call()`:

- `CallExpr(IdentExpr(...))` resolves normal free calls
- `CallExpr(GetterExpr(...))` only handles:
  - qualified enum constructors such as `Type.Variant(...)`
  - module-member calls such as `module.fn(...)`

For ordinary values, `call(getterExpr)` currently falls through to `moduleMembers(...)`, which errors unless the left-hand side is a module alias.

Relevant files:

- `src/frontend/compiler/type_checker.nim`
  - `getter()`
  - `call()`
  - `moduleMembers()`

## Proposed Semantics

### 1. Deref syntax

Recommendation:

- make `x[]` the canonical deref spelling
- do not add `*x` in the same patch unless there is a strong ergonomics reason

Why:

- the parser already accepts `x[]`
- the requested borrowed-match spelling is `match x[]`
- postfix deref composes naturally with the existing postfix chain:
  - `x[].field`
  - `x[][i]`
  - `x[].len()`

Semantic rule:

- `x[]` means "open the pointee/place behind a managed handle or borrow"
- it must preserve place-ness so later field/index access and pattern matching can operate on the underlying storage without forcing a value copy

### 2. Address-of syntax

Recommendation:

- add a dedicated address-of operator and lower it to the same typed form as `borrow(...)`
- prefer `&x` unless the parser/operator bootstrapping makes that too fragile

Important constraint:

- `&` already exists as a binary operator in the stdlib bitwise surface
- if prefix `&` is chosen, it should be treated as core syntax, not as "this only parses if `operator \`&\`` happens to be in scope"

Semantic rule:

- address-of is exactly sugar for `borrow(...)`
- it keeps the current "stable value location only" rule
- it returns a `lent` handle, not an owning `ref`

### 3. Match binding modes

Peon does not currently have Rust's `Copy` trait model. The practical split in this compiler is therefore:

- by-value destructuring
- by-borrow destructuring

Recommended rule:

- `match x`
  - matches by value
  - payload bindings are materialized as locals sourced from the payload
  - for managed ownership-sensitive payloads, this path is where move-out semantics belong
- `match x[]`
  - matches through a borrowed/dereferenced view
  - payload bindings do not move out of the variant
  - each binding becomes a borrowed view of the corresponding payload field

This matches the user-facing rule from Rust closely enough to stay predictable:

- owned scrutinee: extract
- borrowed scrutinee: bind borrows

Binding surface:

- keep positional shorthand such as `Ok(value)`
- keep keyword bindings such as `Ok(value = renamed)`
- allow `_` in either form

No new pattern syntax is required for the shorthand request itself.

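The two modes can be contrasted with a hypothetical peon pair (`result`, `consume`, and `inspect` are illustrative names, not existing API):

```
# By-value match: each binding is a fresh local sourced from the
# payload; for managed payloads this is where move-out would live.
match result:
    case Ok(value):
        consume(value)

# Proposed borrowed match: each binding is a view into the variant's
# storage; nothing moves out of `result`.
match result[]:
    case Ok(value):
        inspect(value)
```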
### 4. UFCS

Recommended rule:

- `x.f(...)` resolves in this order:
  1. real field/module/type-member access through the existing `getter()` path
  2. if that succeeds and yields a callable value, call it normally
  3. if ordinary field access fails because `x` has no such field, try UFCS by rewriting to `f(x, ...)`

Consequences:

- fields always win
- if `x` has a field `len` and that field is not callable, `x.len()` is an error
  - the compiler must not silently fall back to UFCS in that case
- `x.len` without `()` remains ordinary field access only

Scope of UFCS:

- only rewrite `CallExpr(GetterExpr(...))`
- do not introduce first-class method values in this patch
- keep qualified enum constructors such as `Result.Ok(...)` working as they do today

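The field-priority rule can be illustrated with a hypothetical peon snippet (the names `p` and `s` and their shapes are made up for illustration):

```
# Assume `p` has a real field `len` holding an int.
var n = p.len      # plain field access, never UFCS
# p.len()          # error: the field exists but is not callable,
#                  # and the compiler must not fall back to len(p)

# Assume `s` has no field named `len`.
var m = s.len()    # UFCS: rewritten to len(s)
```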
## Implementation Plan

### Workstream 1: Add typed deref and address-of nodes

Frontend changes:

- Extend the typed AST with explicit nodes for:
  - dereference/place-open
  - address-of, unless it reuses `TypedBorrowExpr` directly
- Decide whether the surface AST needs a dedicated node for deref, or whether an empty `SliceExpr` is lowered directly into a typed deref node
- Keep `x[]` distinct from index access as early as possible in semantic lowering

Files:

- `src/frontend/compiler/type_system.nim`
- `src/frontend/compiler/type_checker.nim`

Recommendation:

- do not add a parse-tree node for `x[]` unless it materially simplifies later passes
- parse it as `SliceExpr` and lower it into a dedicated typed deref expression in the typechecker

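A minimal Nim sketch of what this workstream might add to the typed AST. The node and field names here are hypothetical; the real `TypedExpr` hierarchy lives in `type_system.nim` and may dictate a different shape:

```nim
type
    # Hypothetical node for `x[]` after typechecking: `target` is the
    # handle/borrow being opened. The node's result type would be the
    # pointee type, and it must remain a place expression so that
    # `x[].field` does not force a value copy.
    TypedDerefExpr* = ref object of TypedExpr
        target*: TypedExpr
```

Address-of likely needs no new node at all: if `&x` lowers to the same typed form as `borrow(x)`, it can reuse `TypedBorrowExpr` directly.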
### Workstream 2: Teach the typechecker the two match modes

Changes:

- Extend `matchTargetType()` so variant matching can recognize borrowed/dereferenced enum views, not just plain values and `ref`
- Introduce explicit match binding mode metadata, for example:
  - `ByValue`
  - `ByBorrow`
- Extend `TypedMatchArm` so each arm records enough information for later passes to know how bindings were created
- Change `matchStmt()` so the scrutinee determines the default binding mode:
  - plain value target -> `ByValue`
  - deref/borrow target -> `ByBorrow`
- Keep `typeMatchBindings()` as the parser-facing binder for names, but route binding type construction through a new helper that can choose:
  - field type directly for `ByValue`
  - borrowed field type for `ByBorrow`

Files:

- `src/frontend/compiler/type_system.nim`
- `src/frontend/compiler/type_checker.nim`

Important detail:

- borrowed match bindings should not be represented as ordinary copied values with the same field type
- they need either `lent` field types or another explicit representation that later passes can recognize as borrowed aliases

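The binding-mode metadata could look roughly like this in Nim. All names are illustrative, and `lentTypeOf` stands in for whatever constructor the type system actually uses to build a `lent` view of a field type:

```nim
type
    MatchBindingMode* = enum
        ByValue    # `match x`: bindings are fresh locals
        ByBorrow   # `match x[]`: bindings alias payload fields

# Hypothetical helper: choose the type a payload binding receives.
proc bindingType(fieldType: Type, mode: MatchBindingMode): Type =
    case mode
    of ByValue:
        fieldType                 # field type directly
    of ByBorrow:
        lentTypeOf(fieldType)     # borrowed view of the field
```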
### Workstream 3: Preserve ownership information through analysis

Changes:

- Update ownership analysis so match bindings are classified from binding mode, not just from the stored type
- Borrowed match bindings must be recorded as borrowed aliases of the payload field
- By-value match bindings should keep the current cleanup/ownership behavior

Files:

- `src/frontend/compiler/ownership_analysis.nim`

Stretch goal:

- decide whether matching an owning scrutinee with by-value bindings should count as consuming the original binding for move-tracked cases
- if yes, hook that into the existing move-tracking logic instead of faking it in codegen

### Workstream 4: Lower borrowed match bindings correctly in both backends

Bytecode backend:

- stop unconditionally deep-copying payload bindings
- add a second lowering path for borrowed match bindings
- that path should bind a view/reference to the payload slot instead of materializing a copied value

Native C backend:

- stop unconditionally emitting `generateValueCopy(...)` for every match binding
- add a borrowed-binding path that binds either:
  - a direct pointer/view into the payload storage
  - or a `lent` wrapper built from the payload address
- ensure cleanup flags do not destroy payload state that remains owned by the match target

Files:

- `src/backend/bytecode/codegen/generator.nim`
- `src/backend/c/codegen/generator.nim`

Important detail:

- the C backend currently copies the whole scrutinee into `match_target` and then copies fields out again
- borrowed match lowering must avoid turning that into an accidental copy-out model

### Workstream 5: Add UFCS without breaking field priority

Typechecker changes:

- split the current `call(getterExpr)` path into three explicit cases:
  - qualified enum constructor
  - module-member call
  - value-target call site that may become UFCS
- For value targets:
  1. try ordinary `getter()` first
  2. if it succeeds, perform a normal call on the returned expression
  3. only if there is no real field, search visible free functions named `getter.name`
  4. resolve them against the rewritten signature `(target, args...)`

Builtin handling:

- UFCS should reuse the existing builtin special cases for:
  - `borrow`
  - `move`
  - `clone`
  - `len`
  - `high`
- `new` should stay out of UFCS unless there is a separate reason to support it

Files:

- `src/frontend/compiler/type_checker.nim`

Recommendation:

- implement UFCS only in `call(getterExpr)`
- do not touch ordinary `getter()` resolution semantics

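The value-target branch can be sketched in Nim-flavoured pseudocode. `resolveField`, `lookupFreeFunctions`, and `resolveCall` are hypothetical stand-ins for the real helpers in `type_checker.nim`:

```nim
# Sketch of the proposed resolution order for `x.f(args)`.
proc callOnGetter(self: TypeChecker, target: TypedExpr,
                  name: string, args: seq[TypedExpr]): TypedExpr =
    let field = self.resolveField(target, name)
    if field != nil:
        # A real field always wins. If it is not callable, this call
        # is an error -- with no silent UFCS fallback.
        return self.resolveCall(field, args)
    # No such field: UFCS rewrite to f(target, args...)
    let candidates = self.lookupFreeFunctions(name)
    if candidates.len == 0:
        self.error("no field or function named " & name)
    return self.resolveCall(candidates, @[target] & args)
```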
### Workstream 6: Tests and language docs

Add parser/typechecker/codegen coverage for:

- `x[]` parses and typechecks as deref
- `match x[]` is treated as a variant match on a borrowed view
- `case Ok(value)` binds positionally
- borrowed match bindings do not move payload state out of the variant
- by-value match bindings preserve the intended ownership semantics for managed refs
- `x.len()` resolves to `len(x)`
- field priority:
  - if `x.len` is a field, `x.len()` does not fall back to UFCS
- chaining:
  - `x[].field`
  - `x[].len()`
  - `(&x)[]` or the final chosen spelling

Likely test files:

- `tests/parse.nim`
- `tests/parse_regressions.nim`
- `tests/typecheck.nim`
- `tests/codegen.nim`
- `tests/c_codegen.nim`
- `tests/variants.nim`

## Recommended Landing Order

1. Land typed deref and address-of.
2. Switch `matchTargetType()` and `TypedMatchArm` to explicit binding modes.
3. Update both backends so borrowed and by-value match bindings lower differently.
4. Add UFCS last, once deref and borrowed-place semantics exist.

This order keeps the highest-risk semantic work first and avoids hiding ownership bugs behind call-syntax sugar.

## Open Decisions

### 1. Whether by-value match should be a true move for all payload types

Current repository behavior is copy/deep-copy oriented for match bindings. A fully Rust-like move-out model would require more than parser sugar.

Recommendation:

- treat this as a deliberate semantic change, not as an incidental side effect of `Ok(value)`
- implement borrowed match first
- then decide whether by-value bindings should remain "copy/deep-copy locals" for plain values while moving ownership-sensitive handles, or whether Peon wants a stricter aggregate move model

### 2. Final address-of spelling

Recommendation:

- prefer `&x` if it is promoted to core syntax
- otherwise keep `borrow(x)` as the implementation primitive and choose a non-conflicting surface spelling later

### 3. Whether `*x` is still worth adding

Recommendation:

- no for the first pass
- `x[]` already fits the requested borrowed-match story and uses syntax the parser can accept immediately

@@ -1,5 +1,5 @@
-import frontend/compiler/type_checker
-import frontend/compiler/optimizer
+import frontend/compiler/type_checking
+import frontend/compiler/ref_analysis
 import backend/bytecode/opcodes
 import backend/bytecode/tooling/multibyte
 import util/literals
@@ -379,7 +379,7 @@ proc makeConstant(self: BytecodeGenerator, value: TypedExpr): array[3, uint8] =
     of Tiny:
         result = self.chunk.writeConstant([uint8(parseInt(lit))])
     of Short:
-        result = self.chunk.writeConstant(parseInt(lit).toDouble())
+        result = self.chunk.writeConstant(parseInt(lit).toUint16Bytes())
     of Long:
         result = self.chunk.writeConstant(parseInt(lit).toQuad())
     of LongLong:
@@ -921,8 +921,8 @@ proc endScope(self: BytecodeGenerator) =
         # the pop count is greater than zero
         while popCount > 0:
             self.emitByte(PopN, self.currentNode.node.token.line)
-            self.emitBytes(popCount.toDouble(), self.currentNode.node.token.line)
-            popCount -= popCount.toDouble().fromDouble().int
+            self.emitBytes(popCount.toUint16Bytes(), self.currentNode.node.token.line)
+            popCount -= popCount.toUint16Bytes().fromUint16Bytes().int
     elif popCount == 1:
         # We only emit PopN if we're popping more than one value
         self.emitByte(PopC, self.currentNode.node.token.line)
@@ -952,8 +952,8 @@ proc emitScopeCleanupTo(self: BytecodeGenerator, cleanupDepth, line: int) =
     if popCount > 1:
         while popCount > 0:
             self.emitByte(PopN, line)
-            self.emitBytes(popCount.toDouble(), line)
-            popCount -= popCount.toDouble().fromDouble().int
+            self.emitBytes(popCount.toUint16Bytes(), line)
+            popCount -= popCount.toUint16Bytes().fromUint16Bytes().int
     elif popCount == 1:
         self.emitByte(PopC, line)

@@ -1278,12 +1278,12 @@ proc generateUnary(self: BytecodeGenerator, expression: TypedExpr) =
     let node = UnaryExpr(expression.node)
     var
         default: TypedExpr
-        signature = @[("", TypedUnaryExpr(expression).a.kind, default, false)]
+        signature = @[("", TypedUnaryExpr(expression).operand.kind, default, false)]
         fn = Type(kind: Function, returnType: Type(kind: Any), signature: signature)
-    let impl = self.typeChecker.match(node.token.lexeme, fn.signature, @[TypedUnaryExpr(expression).a], node)
+    let impl = self.typeChecker.match(node.token.lexeme, fn.signature, @[TypedUnaryExpr(expression).operand], node)
     self.generateCall(TypedCallExpr(node: node,
                                     callee: newTypedNamedExpr(Expression(node), impl),
-                                    args: @[TypedUnaryExpr(expression).a],
+                                    args: @[TypedUnaryExpr(expression).operand],
                                     kind: impl.valueType.returnType))

@@ -1291,34 +1291,34 @@ proc generateBinary(self: BytecodeGenerator, expression: TypedExpr) =
     ## Emits code for binary expressions
     let node = BinaryExpr(expression.node)
     let binary = TypedBinaryExpr(expression)
-    let leftType = binary.a.kind.unwrapType()
-    let rightType = binary.b.kind.unwrapType()
+    let leftType = binary.left.kind.unwrapType()
+    let rightType = binary.right.kind.unwrapType()
     let resultType = expression.kind.unwrapType()
     if node.token.lexeme == "and":
-        self.generateExpression(binary.a)
+        self.generateExpression(binary.left)
         let jump = self.emitJump(JumpIfFalseOrPop, node.token.line)
-        self.generateExpression(binary.b)
+        self.generateExpression(binary.right)
         self.patchJump(jump)
         return
     if node.token.lexeme == "or":
-        self.generateExpression(binary.a)
+        self.generateExpression(binary.left)
         let jump = self.emitJump(JumpIfTrue, node.token.line)
-        self.generateExpression(binary.b)
+        self.generateExpression(binary.right)
         self.patchJump(jump)
         return
     if node.token.lexeme == "+" and leftType.kind == TypeKind.String and rightType.kind == TypeKind.String:
-        self.generateExpression(binary.b)
-        self.generateExpression(binary.a)
+        self.generateExpression(binary.right)
+        self.generateExpression(binary.left)
         self.emitByte(ConcatString, node.token.line)
         return
     if leftType.kind == TypeKind.String and rightType.kind == TypeKind.String and node.token.lexeme in ["==", "!="]:
-        self.generateExpression(binary.b)
-        self.generateExpression(binary.a)
+        self.generateExpression(binary.right)
+        self.generateExpression(binary.left)
         self.emitByte(if node.token.lexeme == "==": Equal else: NotEqual, node.token.line)
         return
     if leftType.kind == Integer and rightType.kind == Integer:
-        self.generateExpression(binary.b)
-        self.generateExpression(binary.a)
+        self.generateExpression(binary.right)
+        self.generateExpression(binary.left)
         case node.token.lexeme
         of "+":
             self.emitByte(Add, node.token.line)
@@ -1356,8 +1356,8 @@ proc generateBinary(self: BytecodeGenerator, expression: TypedExpr) =
         else:
             discard
     if leftType.kind == Float and rightType.kind == Float:
-        self.generateExpression(binary.b)
-        self.generateExpression(binary.a)
+        self.generateExpression(binary.right)
+        self.generateExpression(binary.left)
         let wide = leftType.width == Full
         case node.token.lexeme
         of "+":
@@ -1393,20 +1393,20 @@ proc generateBinary(self: BytecodeGenerator, expression: TypedExpr) =
         else:
             discard
     if leftType.kind == Boolean and rightType.kind == Boolean and node.token.lexeme in ["==", "!="]:
-        self.generateExpression(binary.b)
-        self.generateExpression(binary.a)
+        self.generateExpression(binary.right)
+        self.generateExpression(binary.left)
         self.emitByte(if node.token.lexeme == "==": Equal else: NotEqual, node.token.line)
         return
     var
         default: TypedExpr
-        signature = @[("", binary.a.kind, default, false),
-                      ("", binary.b.kind, default, false)]
+        signature = @[("", binary.left.kind, default, false),
+                      ("", binary.right.kind, default, false)]
         fn = Type(kind: Function, returnType: Type(kind: Any), signature: signature)
     let impl = self.typeChecker.match(node.token.lexeme, fn.signature,
-                                      @[binary.a, binary.b], node)
+                                      @[binary.left, binary.right], node)
     self.generateCall(TypedCallExpr(node: node,
                                     callee: newTypedNamedExpr(Expression(node), impl),
-                                    args: @[binary.a, binary.b],
+                                    args: @[binary.left, binary.right],
                                     kind: impl.valueType.returnType))


@@ -1902,13 +1902,13 @@ proc generateFunDecl(self: BytecodeGenerator, decl: TypedFunDecl) =
     self.chunk.functions.add(0.toTriple()) # Patched later
     self.chunk.functions.add(uint8(node.parameters.len()))
     if not node.name.isNil():
-        self.chunk.functions.add(decl.name.ident.token.lexeme.len().toDouble())
+        self.chunk.functions.add(decl.name.ident.token.lexeme.len().toUint16Bytes())
         var s = decl.name.ident.token.lexeme
         if s.len() >= uint16.high().int:
             s = node.name.token.lexeme[0..uint16.high()]
         self.chunk.functions.add(s.toBytes())
     else:
-        self.chunk.functions.add(0.toDouble())
+        self.chunk.functions.add(0.toUint16Bytes())
     let
         previousFunction = self.currentFunction
         previousStackSize = self.stackSize

@@ -1,4 +1,4 @@
-# Copyright 2023 Mattia Giambirtone & All Contributors
+# Copyright 2026 Mattia Giambirtone & All Contributors
 #
 # Licensed under the Apache License, Version 2.0 (the "License");
 # you may not use this file except in compliance with the License.
@@ -360,7 +360,7 @@ const argumentDoubleInstructions* = {PopN, }
 const argumentTripleInstructions* = {ImportModule, LoadField, StoreField, MakeFieldRef}

 # Instructions taking two hardcoded 24-bit arguments
-const argumentSextupleInstructions* = {MakeObject, MakeModuleGlobalRef}
+const dualTripleArgInstructions* = {MakeObject, MakeModuleGlobalRef}


 # Jump instructions jump at relative or absolute bytecode offsets
@@ -445,7 +445,7 @@ proc writeModule*(self: Chunk, module: ModuleInfo) =
     self.modules.add(module.start.toTriple())
     self.modules.add(module.stop.toTriple())
     self.modules.add(module.globals.toTriple())
-    self.modules.add(module.name.len().toDouble())
+    self.modules.add(module.name.len().toUint16Bytes())
     self.modules.add(module.name.toBytes())

@@ -458,7 +458,7 @@ proc getModules*(self: Chunk): seq[ModuleInfo] =
             globals: int([self.modules[idx + 6], self.modules[idx + 7], self.modules[idx + 8]].fromTriple()),
             name: ""))
         idx += 9
-        let size = int([self.modules[idx], self.modules[idx + 1]].fromDouble())
+        let size = int([self.modules[idx], self.modules[idx + 1]].fromUint16Bytes())
         idx += 2
         result[^1].name = self.modules[idx ..< idx + size].fromBytes()
         idx += size

@@ -126,16 +126,15 @@ proc stackTripleInstruction(self: BytecodeDebugger, instruction: OpCode) =

 proc stackDoubleInstruction(self: BytecodeDebugger, instruction: OpCode) =
     ## Debugs instructions that operate on a single value on the stack using a 16-bit operand
-    var slot = [self.chunk.code[self.current + 1], self.chunk.code[self.current + 2]].fromDouble()
+    var slot = [self.chunk.code[self.current + 1], self.chunk.code[self.current + 2]].fromUint16Bytes()
     printInstruction(instruction)
-    stdout.write(&", points to index ")
     stdout.styledWriteLine(fgGreen, &", points to index ", fgYellow, $slot)
     self.current += 3


 proc argumentDoubleInstruction(self: BytecodeDebugger, instruction: OpCode) =
     ## Debugs instructions that operate on a hardcoded value on the stack using a 16-bit operand
-    var slot = [self.chunk.code[self.current + 1], self.chunk.code[self.current + 2]].fromDouble()
+    var slot = [self.chunk.code[self.current + 1], self.chunk.code[self.current + 2]].fromUint16Bytes()
     printInstruction(instruction)
     stdout.styledWriteLine(fgGreen, &", has argument ", fgYellow, $slot)
     self.current += 3
@@ -241,7 +240,7 @@ proc disassembleInstruction*(self: BytecodeDebugger) =
         self.argumentDoubleInstruction(opcode)
     of argumentTripleInstructions:
         self.argumentTripleInstruction(opcode)
-    of argumentSextupleInstructions:
+    of dualTripleArgInstructions:
         self.argumentSextupleInstruction(opcode)
     of LoadModuleGlobal, StoreModuleGlobal:
         self.moduleGlobalInstruction(opcode)
@@ -268,7 +267,7 @@ proc parseFunctions(self: BytecodeDebugger) =
         idx += 3
         argc = int(self.chunk.functions[idx])
         inc(idx)
-        size = int([self.chunk.functions[idx], self.chunk.functions[idx + 1]].fromDouble())
+        size = int([self.chunk.functions[idx], self.chunk.functions[idx + 1]].fromUint16Bytes())
         idx += 2
         name = self.chunk.functions[idx..<idx + size].fromBytes()
         inc(idx, size)

@@ -15,7 +15,7 @@
 ## Utilities to handle multibyte sequences


-proc toDouble*(input: int | uint | uint16): array[2, uint8] =
+proc toUint16Bytes*(input: int | uint | uint16): array[2, uint8] =
     ## Converts an unsigned integer
     ## to an array[2, uint8]
     result = cast[array[2, uint8]](uint16(input))
@@ -36,8 +36,8 @@ proc toLong*(input: int | uint | uint16 | uint32 | uint64): array[8, uint8] =
     result = cast[array[8, uint8]](uint(input))


-proc fromDouble*(input: array[2, uint8]): uint16 =
-    ## Rebuilds the output of toDouble into
+proc fromUint16Bytes*(input: array[2, uint8]): uint16 =
+    ## Rebuilds the output of toUint16Bytes into
     ## an uint16
     copyMem(result.addr, unsafeAddr(input), sizeof(uint16))

@@ -49,14 +49,14 @@ proc fromTriple*(input: array[3, uint8]): uint =


 proc fromQuad*(input: array[4, uint8]): uint =
-    ## Rebuilts the output of toQuad into
-    ## an uint
+    ## Rebuilds the output of toQuad into
+    ## a uint
     copyMem(result.addr, unsafeAddr(input), sizeof(uint32))


 proc fromLong*(input: array[8, uint8]): uint =
-    ## Rebuilts the output of toQuad into
-    ## an uint
+    ## Rebuilds the output of toLong into
+    ## a uint
     copyMem(result.addr, unsafeAddr(input), sizeof(uint64))


@@ -68,7 +68,7 @@ proc toBytes*(s: string): seq[byte] =


 proc toBytes*(s: int): array[8, uint8] =
-    ## Converts
+    ## Converts an int to its byte representation
     result = cast[array[8, uint8]](s)


@@ -61,8 +61,7 @@ type
     ObjectKind* = enum
         ## A tag for heap-allocated
         ## peon objects
-        String, List,
-        Dict, Tuple,
+        String,
         Structure,
     HeapObject* = object
         ## A tagged box for a heap-allocated
@@ -382,7 +381,7 @@ proc markRoots(self: var PeonVM): HashSet[ptr HeapObject] =
     for p in result:
         if p.mark():
             when debugMarkGC:
-                echo &"DEBUG - GC: Marked object: {obj[]}"
+                echo &"DEBUG - GC: Marked object: {p[]}"
     when debugGC:
         echo "DEBUG - GC: Mark phase complete"

@@ -414,6 +413,8 @@ proc trace(self: var PeonVM, roots: HashSet[ptr HeapObject]) =
|
||||
let child = cast[ptr HeapObject](value)
|
||||
if child.mark():
|
||||
worklist.add(child)
|
||||
when debugGC:
|
||||
inc(count)
|
||||
else:
|
||||
discard # TODO: Other types
|
||||
inc(idx)
|
||||
@@ -619,33 +620,26 @@ func peekCall*(self: PeonVM, distance: int = 0): uint64 =
|
||||
self.calls[self.calls.high() + distance]
|
||||
|
||||
|
||||
func pushc(self: var PeonVM, val: uint64) =
|
||||
func pushCall(self: var PeonVM, val: uint64) =
|
||||
## Pushes a value onto the
|
||||
## call stack
|
||||
self.calls.add(val)
|
||||
|
||||
|
||||
func popc(self: var PeonVM): uint64 =
|
||||
func popCall(self: var PeonVM): uint64 =
|
||||
## Pops a value off the call
|
||||
## stack and returns it
|
||||
return self.calls.pop()
|
||||
|
||||
|
||||
func peekc(self: PeonVM, distance: int = 0): uint64 {.used.} =
|
||||
## Returns the value at the given
|
||||
## distance from the top of the
|
||||
## call stack without consuming it
|
||||
return self.calls[self.calls.high() + distance]
|
||||
|
||||
|
||||
func getc(self: PeonVM, idx: int): uint64 =
|
||||
func getCallSlot(self: PeonVM, idx: int): uint64 =
|
||||
## Getter method that abstracts
|
||||
## indexing our call stack through
|
||||
## stack frames
|
||||
return self.calls[idx.uint64 + self.frames[^1]]
|
||||
|
||||
|
||||
func setc(self: var PeonVM, idx: int, val: uint64) =
|
||||
func setCallSlot(self: var PeonVM, idx: int, val: uint64) =
|
||||
## Setter method that abstracts
|
||||
## indexing our call stack through
|
||||
## stack frames
|
||||
@@ -668,7 +662,7 @@ proc readShort(self: var PeonVM): uint16 =
|
||||
## bytecode and returns them
|
||||
## as an unsigned 16 bit
|
||||
## integer
|
||||
return [self.readByte(), self.readByte()].fromDouble()
|
||||
return [self.readByte(), self.readByte()].fromUint16Bytes()
|
||||
|
||||
|
||||
proc readLong(self: var PeonVM): uint32 =
|
||||
@@ -719,7 +713,7 @@ proc constReadUInt64(self: var PeonVM, idx: int): uint64 =
|
||||
proc constReadUInt32(self: var PeonVM, idx: int): uint32 =
|
||||
## Reads a constant from the
|
||||
## chunk's constant table and
|
||||
## returns it as an int32
|
||||
## returns it as a uint32
|
||||
var arr = [self.chunk.consts[idx], self.chunk.consts[idx + 1],
|
||||
self.chunk.consts[idx + 2], self.chunk.consts[idx + 3]]
|
||||
copyMem(result.addr, arr.addr, sizeof(arr))
|
||||
@@ -728,7 +722,7 @@ proc constReadUInt32(self: var PeonVM, idx: int): uint32 =
|
||||
proc constReadInt32(self: var PeonVM, idx: int): int32 =
|
||||
## Reads a constant from the
|
||||
## chunk's constant table and
|
||||
## returns it as an uint32
|
||||
## returns it as an int32
|
||||
var arr = [self.chunk.consts[idx], self.chunk.consts[idx + 1],
|
||||
self.chunk.consts[idx + 2], self.chunk.consts[idx + 3]]
|
||||
copyMem(result.addr, arr.addr, sizeof(arr))
|
||||
@@ -798,8 +792,8 @@ proc constReadString(self: var PeonVM, size, idx: int): ptr HeapObject =
|
||||
proc beginFrame(self: var PeonVM, target, retAddr: uint64, moduleId: int = -1) =
|
||||
## Creates a new call frame and jumps to the requested target
|
||||
self.ip = target
|
||||
self.pushc(target)
|
||||
self.pushc(retAddr)
|
||||
self.pushCall(target)
|
||||
self.pushCall(retAddr)
|
||||
self.results.add(self.getNil())
|
||||
self.frames.add(uint64(self.calls.len() - 2))
|
||||
self.frameModules.add(moduleId)
|
||||
@@ -808,9 +802,9 @@ proc beginFrame(self: var PeonVM, target, retAddr: uint64, moduleId: int = -1) =
|
||||
proc beginFunctionFrame(self: var PeonVM, target, retAddr, env: uint64) =
|
||||
## Creates a new ordinary function frame with a hidden env slot.
|
||||
self.ip = target
|
||||
self.pushc(target)
|
||||
self.pushc(retAddr)
|
||||
self.pushc(env)
|
||||
self.pushCall(target)
|
||||
self.pushCall(retAddr)
|
||||
self.pushCall(env)
|
||||
self.results.add(self.getNil())
|
||||
self.frames.add(uint64(self.calls.len() - 3))
|
||||
self.frameModules.add(-1)
|
||||
@@ -1099,7 +1093,7 @@ proc dispatch*(self: var PeonVM) {.inline.} =
|
||||
self.beginFunctionFrame(jmpAddr, retAddr, self.getNil())
|
||||
# Loads the arguments onto the stack
|
||||
for _ in 0..<argc:
|
||||
self.pushc(self.pop())
|
||||
self.pushCall(self.pop())
|
||||
# Pops the function and return address
|
||||
# off the operand stack since they're
|
||||
# not needed there anymore
|
||||
@@ -1117,7 +1111,7 @@ proc dispatch*(self: var PeonVM) {.inline.} =
|
||||
let env = closure.values[1]
|
||||
self.beginFunctionFrame(jmpAddr, retAddr, env)
|
||||
for _ in 0..<argc:
|
||||
self.pushc(self.pop())
|
||||
self.pushCall(self.pop())
|
||||
discard self.pop()
|
||||
discard self.pop()
|
||||
of Return:
|
||||
@@ -1165,28 +1159,28 @@ proc dispatch*(self: var PeonVM) {.inline.} =
|
||||
let idx = self.readLong()
|
||||
when debugVM:
|
||||
assert idx.int in 0..self.calls.high(), "StoreVar index is out of bounds"
|
||||
self.setc(idx.int, self.pop())
|
||||
self.setCallSlot(idx.int, self.pop())
|
||||
of AddVar:
|
||||
# Adds a new variable to the call stack. This is just
|
||||
# an optimization for StoreVar that avoids using an if
|
||||
# condition in the VM's bytecode dispatch loop (which is
|
||||
# not a great idea)
|
||||
self.pushc(self.pop())
|
||||
self.pushCall(self.pop())
|
||||
of MakeLocalRef:
|
||||
let idx = self.readLong().int
|
||||
self.push(cast[uint64](self.makeLocalVarRef(idx)))
|
||||
of LoadVar:
|
||||
# Pushes a local variable from the call stack
|
||||
# onto the operand stack
|
||||
self.push(self.getc(self.readLong().int))
|
||||
self.push(self.getCallSlot(self.readLong().int))
|
||||
of LoadRefSlot:
|
||||
let idx = self.readLong().int
|
||||
let reference = cast[ptr HeapObject](self.getc(idx))
|
||||
let reference = cast[ptr HeapObject](self.getCallSlot(idx))
|
||||
self.push(self.loadVarRef(reference))
|
||||
of StoreRefSlot:
|
||||
let idx = self.readLong().int
|
||||
let value = self.pop()
|
||||
let reference = cast[ptr HeapObject](self.getc(idx))
|
||||
let reference = cast[ptr HeapObject](self.getCallSlot(idx))
|
||||
self.storeVarRef(reference, value)
|
||||
of LoadGlobal:
|
||||
# Pushes a global variable from the call stack
|
||||
@@ -1337,21 +1331,21 @@ proc dispatch*(self: var PeonVM) {.inline.} =
|
||||
continue
|
||||
of PopC:
|
||||
# Pops a value off the call stack
|
||||
discard self.popc()
|
||||
discard self.popCall()
|
||||
of Pop:
|
||||
# Pops a value off the operand stack
|
||||
discard self.pop()
|
||||
of PushC:
|
||||
# Pops a value off the operand stack
|
||||
# and pushes it onto the call stack
|
||||
self.pushc(self.pop())
|
||||
self.pushCall(self.pop())
|
||||
of PopN:
|
||||
# Pops N elements off the call stack
|
||||
for _ in 0..<int(self.readShort()):
|
||||
discard self.popc()
|
||||
discard self.popCall()
|
||||
of LoadTOS:
|
||||
# Pushes the top of the call stack onto the operand stack
|
||||
self.push(self.peekc())
|
||||
self.push(self.peekCall())
|
||||
of DupTop:
|
||||
self.push(self.peek())
|
||||
# Jump opcodes
|
||||
@@ -1643,9 +1637,6 @@ proc run*(self: var PeonVM, chunk: Chunk, breakpoints: seq[uint64] = @[], repl:
|
||||
stderr.writeLine(&"VM: Fatal error at bytecode offset {self.ip - 1}: {e.name} -> {e.msg}")
|
||||
except CatchableError as e:
|
||||
stderr.writeLine(&"VM: Fatal error at bytecode offset {self.ip - 1}: {e.name} -> {e.msg}")
|
||||
except NilAccessDefect:
|
||||
stderr.writeLine(&"VM: Memory Access Violation (bytecode offset {self.ip}): SIGSEGV")
|
||||
quit(1)
|
||||
if not repl:
|
||||
# We clean up after ourselves!
|
||||
self.collect()
|
||||
|
||||
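The renamed call-stack helpers above index slots relative to the current frame's base pointer (`idx + self.frames[^1]`). A minimal Python sketch of that frame discipline (a simplified model, not the real `PeonVM`):

```python
class CallStack:
    # Slot reads/writes are frame-relative: getCallSlot/setCallSlot add the
    # topmost frame's base pointer, just like the renamed Nim helpers.
    def __init__(self):
        self.calls = []   # the call stack proper
        self.frames = []  # base pointer of each active frame

    def push_call(self, val):
        self.calls.append(val)

    def pop_call(self):
        return self.calls.pop()

    def begin_frame(self, target, ret_addr):
        # Mirrors beginFrame: push target and return address, then record
        # the frame base as "two slots down from the top".
        self.push_call(target)
        self.push_call(ret_addr)
        self.frames.append(len(self.calls) - 2)

    def get_call_slot(self, idx):
        return self.calls[idx + self.frames[-1]]

    def set_call_slot(self, idx, val):
        self.calls[idx + self.frames[-1]] = val
```

Slot 0 of a frame is thus always the call target and slot 1 the return address, with locals following.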
@@ -11,7 +11,7 @@
 # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 # See the License for the specific language governing permissions and
 # limitations under the License.
-import frontend/compiler/optimizer
+import frontend/compiler/ref_analysis
 import frontend/compiler/type_system


@@ -1,5 +1,5 @@
-import frontend/compiler/type_checker
-import frontend/compiler/optimizer
+import frontend/compiler/type_checking
+import frontend/compiler/ref_analysis
 import backend/c/codegen/names
 import backend/c/codegen/types as c_types
 import backend/c/codegen/ownership
@@ -2386,11 +2386,11 @@ proc collectStringLiterals(self: NativeCGenerator, module: CompiledModule, expr:
         self.collectStringLiterals(module, assignment.indexExpr)
         self.collectStringLiterals(module, assignment.value)
     of NodeKind.unaryExpr:
-        self.collectStringLiterals(module, TypedUnaryExpr(expr).a)
+        self.collectStringLiterals(module, TypedUnaryExpr(expr).operand)
     of NodeKind.binaryExpr:
         let binary = TypedBinaryExpr(expr)
-        self.collectStringLiterals(module, binary.a)
-        self.collectStringLiterals(module, binary.b)
+        self.collectStringLiterals(module, binary.left)
+        self.collectStringLiterals(module, binary.right)
     else:
         discard

@@ -2567,11 +2567,11 @@ proc collectNestedFunctions(self: NativeCGenerator, module: CompiledModule, expr
         self.collectNestedFunctions(module, assignment.indexExpr)
         self.collectNestedFunctions(module, assignment.value)
     of NodeKind.unaryExpr:
-        self.collectNestedFunctions(module, TypedUnaryExpr(expr).a)
+        self.collectNestedFunctions(module, TypedUnaryExpr(expr).operand)
     of NodeKind.binaryExpr:
         let binary = TypedBinaryExpr(expr)
-        self.collectNestedFunctions(module, binary.a)
-        self.collectNestedFunctions(module, binary.b)
+        self.collectNestedFunctions(module, binary.left)
+        self.collectNestedFunctions(module, binary.right)
     else:
         discard

@@ -2686,11 +2686,11 @@ proc registerReachableTypes(self: NativeCGenerator, module: CompiledModule, expr
         self.registerReachableTypes(module, assignment.indexExpr)
         self.registerReachableTypes(module, assignment.value)
     of NodeKind.unaryExpr:
-        self.registerReachableTypes(module, TypedUnaryExpr(expr).a)
+        self.registerReachableTypes(module, TypedUnaryExpr(expr).operand)
     of NodeKind.binaryExpr:
         let binary = TypedBinaryExpr(expr)
-        self.registerReachableTypes(module, binary.a)
-        self.registerReachableTypes(module, binary.b)
+        self.registerReachableTypes(module, binary.left)
+        self.registerReachableTypes(module, binary.right)
     else:
         discard

@@ -4169,7 +4169,7 @@ proc generateExpression(self: NativeCGenerator, expression: TypedExpr): string =
         result = &"(({self.lowerType(expression.kind)})({decodeCharLiteral(expression.node.token.lexeme)}))"
     of NodeKind.unaryExpr:
         let node = TypedUnaryExpr(expression)
-        let operand = self.generateExpression(node.a)
+        let operand = self.generateExpression(node.operand)
         let resultType = expression.kind.unwrapType()
         if resultType.kind == Integer and resultType.signed and
            self.overflowChecksEnabled() and node.node.token.lexeme == "-":
@@ -4182,17 +4182,17 @@ proc generateExpression(self: NativeCGenerator, expression: TypedExpr): string =
         result = &"""({self.mapOperator(node.node.token.lexeme, "unary")} {operand})"""
     of NodeKind.binaryExpr:
         let node = TypedBinaryExpr(expression)
-        let leftType = node.a.kind.unwrapType()
-        let rightType = node.b.kind.unwrapType()
+        let leftType = node.left.kind.unwrapType()
+        let rightType = node.right.kind.unwrapType()
         if leftType.kind == TypeKind.String and rightType.kind == TypeKind.String and node.node.token.lexeme == "+":
-            return &"peon_concat_string({self.generateExpression(node.a)}, {self.generateExpression(node.b)})"
+            return &"peon_concat_string({self.generateExpression(node.left)}, {self.generateExpression(node.right)})"
         if leftType.kind == TypeKind.String and rightType.kind == TypeKind.String and node.node.token.lexeme in ["==", "!="]:
-            let compare = &"peon_str_eq({self.generateExpression(node.a)}, {self.generateExpression(node.b)})"
+            let compare = &"peon_str_eq({self.generateExpression(node.left)}, {self.generateExpression(node.right)})"
             if node.node.token.lexeme == "==":
                 return compare
             return &"(!{compare})"
-        let left = self.generateExpression(node.a)
-        let right = self.generateExpression(node.b)
+        let left = self.generateExpression(node.left)
+        let right = self.generateExpression(node.right)
         let resultType = expression.kind.unwrapType()
         if leftType.kind == Integer and rightType.kind == Integer and resultType.kind == Integer and
            self.overflowChecksEnabled() and node.node.token.lexeme in ["+", "-", "*", "/"]:
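All three collector procs above share the same traversal shape: recurse into `operand` for unary nodes and into `left`/`right` for binary nodes. A tiny Python model of that walk (dict-based stand-ins for the typed AST, not the compiler's real node types):

```python
def collect_leaves(node, out):
    # Recursive descent mirroring the renamed fields: unary nodes carry a
    # single `operand`, binary nodes carry `left` and `right`.
    kind = node["kind"]
    if kind == "unaryExpr":
        collect_leaves(node["operand"], out)
    elif kind == "binaryExpr":
        collect_leaves(node["left"], out)
        collect_leaves(node["right"], out)
    else:
        # Anything else is treated as a leaf here.
        out.append(node["value"])
```

The rename pays off exactly in code like this: `left`/`right` reads unambiguously where `a`/`b` did not.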
@@ -1,4 +1,4 @@
-import frontend/compiler/type_checker
+import frontend/compiler/type_checking
 import std/tables


@@ -111,6 +111,7 @@ func aliasSourceName*(expr: TypedExpr): Name =
     nil


+# TODO: Implement ownership helpers for the C backend if needed
 proc emitOwnershipHelpers*(): string = ""
@@ -31,8 +31,8 @@ const debugStressGC* {.booldefine.} = false # Make the GC run a collection at
 const debugMarkGC* {.booldefine.} = false # Trace the marking phase object by object (extremely verbose)
 const PeonBytecodeMarker* = "PEON_BYTECODE" # Magic value at the beginning of bytecode files
 const HeapGrowFactor* = 2 # The growth factor used by the GC to schedule the next collection
-const FirstGC* = 1024 * 1024; # How many bytes to allocate before running the first GC
-const enableVMChecks* {.booldefine.} = true; # Enables all types of compiler checks in the VM
+const FirstGC* = 1024 * 1024 # How many bytes to allocate before running the first GC
+const enableVMChecks* {.booldefine.} = true # Enables all types of compiler checks in the VM
 # List of paths where peon looks for modules, in order (empty path means current directory, which always takes precedence)
 const moduleLookupPaths*: seq[string] = @["", "src/peon/stdlib", absolutePath(joinPath(".local", "peon", "stdlib"), getenv("HOME"))]
 when HeapGrowFactor <= 1:
@@ -41,7 +41,7 @@ const PeonVersion* = (major: 0, minor: 2, patch: 0)
 const PeonRelease* = "alpha"
 const PeonCommitHash* = staticExec("git rev-parse HEAD")
 const PeonBranch* = staticExec("git symbolic-ref HEAD 2>/dev/null | cut -f 3 -d /")
-const PeonVersionString* = &"Peon {PeonVersion.major}.{PeonVersion.minor}.{PeonVersion.patch} {PeonRelease} ({PeonBranch}, {CompileDate}, {CompileTime}, {PeonCommitHash[0..PeonCommitHash.high() mod 8]}) [Nim {NimVersion}] on {hostOS} ({hostCPU})"
+const PeonVersionString* = &"Peon {PeonVersion.major}.{PeonVersion.minor}.{PeonVersion.patch} {PeonRelease} ({PeonBranch}, {CompileDate}, {CompileTime}, {PeonCommitHash[0..7]}) [Nim {NimVersion}] on {hostOS} ({hostCPU})"
 const HelpMessage* = """The peon programming language, Copyright (C) 2026 Mattia Giambirtone & All Contributors

 This program is free software, see the license distributed with this program or check
@@ -64,8 +64,7 @@ Options

     -h, --help      Show this help text and exit
     -v, --version   Print the current peon version and exit
-    -s, --string    Use the passed string as if it was a file
-    -w, --warnings  Turn warnings on or off (default: on). Acceptable values are
+    -w, --warnings  Turn warnings on or off (default: on). Acceptable values are
                     yes/on and no/off
     --noWarn        Disable a specific warning (examples: --noWarn:UserWarning,
                     --noWarn:RawPointerLeak)
@@ -95,6 +94,7 @@ Options
     -o, --output    Rename the generated artifact
+    -s, --string    Run the given string as if it were a file (the filename is set to '<string>')
     --cc            Use the specified C compiler when targeting the native C backend

     --linker        Use the specified linker when targeting the native C backend
     --passC         Pass additional options to the native C compiler
     --passL         Pass additional options to the native linker invocation
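The `PeonCommitHash[0..PeonCommitHash.high() mod 8]` fix above is worth spelling out: for a full 40-character git hash, `high() mod 8` is `39 mod 8 == 7`, so the old slice was only accidentally correct and would break for any other hash length. A Python illustration (the hash value below is made up):

```python
def short_hash(commit_hash: str) -> str:
    # The fixed expression: always take the first eight characters.
    return commit_hash[:8]

# A full git hash is 40 hex digits.
full = "0123456789abcdef0123456789abcdef01234567"

# The old slice end, (len - 1) % 8, equals 7 only because len == 40;
# for any other length it silently yields a wrong prefix length.
old_slice_end = (len(full) - 1) % 8
```

For a 12-character hash, say, `(12 - 1) % 8 == 3`, so the old code would have printed only four characters.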
@@ -1,4 +1,4 @@
-import frontend/compiler/type_checker
+import frontend/compiler/type_checking

 import std/sets
 import std/strformat
@@ -24,12 +24,6 @@
     escapingClosures: HashSet[uint]


-func nameKey(name: Name): uint =
-    if name.isNil():
-        return 0
-    cast[uint](name)
-
-
 proc newClosureConverter(checker: TypeChecker): ClosureConverter =
     ClosureConverter(checker: checker,
                      counter: 0,
@@ -199,11 +193,11 @@ proc collectCaptures(self: ClosureConverter, fn: TypedFunDecl, captures: var seq
         self.collectCaptures(fn, captures, seen, assignment.indexExpr)
         self.collectCaptures(fn, captures, seen, assignment.value)
     of NodeKind.unaryExpr:
-        self.collectCaptures(fn, captures, seen, TypedUnaryExpr(expr).a)
+        self.collectCaptures(fn, captures, seen, TypedUnaryExpr(expr).operand)
     of NodeKind.binaryExpr:
         let binary = TypedBinaryExpr(expr)
-        self.collectCaptures(fn, captures, seen, binary.a)
-        self.collectCaptures(fn, captures, seen, binary.b)
+        self.collectCaptures(fn, captures, seen, binary.left)
+        self.collectCaptures(fn, captures, seen, binary.right)
     else:
         discard

@@ -407,11 +401,11 @@ proc noteEscapingClosureExpr(self: ClosureConverter, expr: TypedExpr) =
         self.noteEscapingClosureExpr(assignment.indexExpr)
         self.noteEscapingClosureExpr(assignment.value)
     of NodeKind.unaryExpr:
-        self.noteEscapingClosureExpr(TypedUnaryExpr(expr).a)
+        self.noteEscapingClosureExpr(TypedUnaryExpr(expr).operand)
     of NodeKind.binaryExpr:
         let binary = TypedBinaryExpr(expr)
-        self.noteEscapingClosureExpr(binary.a)
-        self.noteEscapingClosureExpr(binary.b)
+        self.noteEscapingClosureExpr(binary.left)
+        self.noteEscapingClosureExpr(binary.right)
     else:
         discard

@@ -558,11 +552,11 @@ proc analyzeExpr(self: ClosureConverter, expr: TypedExpr) =
         self.analyzeExpr(assignment.indexExpr)
         self.analyzeExpr(assignment.value)
     of NodeKind.unaryExpr:
-        self.analyzeExpr(TypedUnaryExpr(expr).a)
+        self.analyzeExpr(TypedUnaryExpr(expr).operand)
     of NodeKind.binaryExpr:
         let binary = TypedBinaryExpr(expr)
-        self.analyzeExpr(binary.a)
-        self.analyzeExpr(binary.b)
+        self.analyzeExpr(binary.left)
+        self.analyzeExpr(binary.right)
     else:
         discard

@@ -737,11 +731,11 @@ proc rewriteExpr(self: ClosureConverter, expr: TypedExpr, current: TypedFunDecl)
         assignment.indexExpr = self.rewriteExpr(assignment.indexExpr, current)
         assignment.value = self.rewriteExpr(assignment.value, current)
     of NodeKind.unaryExpr:
-        TypedUnaryExpr(expr).a = self.rewriteExpr(TypedUnaryExpr(expr).a, current)
+        TypedUnaryExpr(expr).operand = self.rewriteExpr(TypedUnaryExpr(expr).operand, current)
     of NodeKind.binaryExpr:
         let binary = TypedBinaryExpr(expr)
-        binary.a = self.rewriteExpr(binary.a, current)
-        binary.b = self.rewriteExpr(binary.b, current)
+        binary.left = self.rewriteExpr(binary.left, current)
+        binary.right = self.rewriteExpr(binary.right, current)
     else:
         discard
     expr
@@ -60,18 +60,6 @@ proc fail(message: string, node: ASTNode) {.noreturn.} =
     raise ComptimeEvalError(msg: message, node: node)


-proc unwrapType(self: Type): Type {.inline.} =
-    case self.kind:
-        of Typevar:
-            return self.wrapped
-        of Union:
-            result = Type(kind: Union, constraints: @[], displayName: self.displayName)
-            for typ in self.constraints:
-                result.constraints.add((match: typ.match, kind: typ.kind.unwrapType(), value: typ.value))
-        else:
-            return self
-
-
 proc emitCtfeByte(chunk: Chunk, byt: OpCode | uint8, line: int) {.inline.} =
     chunk.write(uint8(byt), line)

@@ -361,14 +349,14 @@ proc emitCtfeExpression(ctx: ComptimeEvalContext, chunk: Chunk, expression: Type
         if ctx.builtinForUnary.isNil():
             fail("compile-time evaluation cannot resolve unary operators in this context", expression.node)
         let unary = TypedUnaryExpr(expression)
-        let builtin = ctx.builtinForUnary(UnaryExpr(expression.node), unary.a)
-        ctx.emitCtfeBuiltin(chunk, builtin, @[unary.a], expression.node.token.line)
+        let builtin = ctx.builtinForUnary(UnaryExpr(expression.node), unary.operand)
+        ctx.emitCtfeBuiltin(chunk, builtin, @[unary.operand], expression.node.token.line)
     of binaryExpr:
         if ctx.builtinForBinary.isNil():
             fail("compile-time evaluation cannot resolve binary operators in this context", expression.node)
         let binary = TypedBinaryExpr(expression)
-        let builtin = ctx.builtinForBinary(BinaryExpr(expression.node), binary.a, binary.b)
-        ctx.emitCtfeBuiltin(chunk, builtin, @[binary.a, binary.b], expression.node.token.line)
+        let builtin = ctx.builtinForBinary(BinaryExpr(expression.node), binary.left, binary.right)
+        ctx.emitCtfeBuiltin(chunk, builtin, @[binary.left, binary.right], expression.node.token.line)
     else:
         fail("expression is not statically evaluable", expression.node)
@@ -25,20 +25,6 @@ func newMonomorphizer(): Monomorphizer =
                   emittedFunctions: @[])


-proc unwrapType(self: Type): Type {.inline.} =
-    if self.isNil():
-        return nil
-    case self.kind
-    of Typevar:
-        return self.wrapped
-    of Union:
-        result = Type(kind: Union, constraints: @[], displayName: self.displayName)
-        for typ in self.constraints:
-            result.constraints.add((match: typ.match, kind: typ.kind.unwrapType(), value: typ.value))
-    else:
-        return self
-
-
 proc isBuiltinFunction(name: Name): bool =
     not name.isNil() and
     not name.valueType.isNil() and
@@ -871,16 +857,16 @@ proc rewriteExpr(self: Monomorphizer, expression: var TypedExpr, bindings: Table
         return
     case expression.node.kind
     of NodeKind.unaryExpr:
-        var a = TypedUnaryExpr(expression).a
-        self.rewriteExpr(a, bindings)
-        TypedUnaryExpr(expression).a = a
+        var operand = TypedUnaryExpr(expression).operand
+        self.rewriteExpr(operand, bindings)
+        TypedUnaryExpr(expression).operand = operand
     of NodeKind.binaryExpr:
-        var a = TypedBinaryExpr(expression).a
-        var b = TypedBinaryExpr(expression).b
-        self.rewriteExpr(a, bindings)
-        self.rewriteExpr(b, bindings)
-        TypedBinaryExpr(expression).a = a
-        TypedBinaryExpr(expression).b = b
+        var left = TypedBinaryExpr(expression).left
+        var right = TypedBinaryExpr(expression).right
+        self.rewriteExpr(left, bindings)
+        self.rewriteExpr(right, bindings)
+        TypedBinaryExpr(expression).left = left
+        TypedBinaryExpr(expression).right = right
     of NodeKind.callExpr:
         let call = TypedCallExpr(expression)
         var callee = call.callee
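The two deleted `unwrapType` copies (here and in comptime_eval) had identical behavior, now shared via type_system.nim: typevars unwrap to the type they wrap, unions rebuild their constraints with each member unwrapped, and everything else passes through unchanged. A Python sketch of those semantics (class names are illustrative stand-ins, not the real compiler types):

```python
from dataclasses import dataclass

@dataclass
class Type:
    pass

@dataclass
class Typevar(Type):
    wrapped: Type

@dataclass
class Union(Type):
    constraints: list

def unwrap_type(t):
    # nil stays nil, mirroring the monomorphizer's copy of the helper.
    if t is None:
        return None
    # A typevar unwraps to whatever it wraps.
    if isinstance(t, Typevar):
        return t.wrapped
    # A union is rebuilt with each constraint member unwrapped in turn.
    if isinstance(t, Union):
        return Union([unwrap_type(c) for c in t.constraints])
    # Concrete types are returned as-is.
    return t
```

Deduplicating this into one module removes the risk of the two copies drifting apart (the comptime_eval copy, notably, lacked the nil check).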
@@ -12,7 +12,7 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.

-import frontend/compiler/type_checker
+import frontend/compiler/type_checking

 import std/tables

@@ -34,11 +34,6 @@ func newOwnershipInfo*: OwnershipInfo =
                   nameClasses: newTable[uint, OwnershipClass]())


-func exprKey(expr: TypedExpr): uint {.inline.} = cast[uint](expr)
-
-func nameKey(name: Name): uint {.inline.} = cast[uint](name)
-
-
 func classifyType*(typ: Type): OwnershipClass =
     if typ.isNil():
         return PlainValue
@@ -171,11 +166,11 @@ proc visitExpr(self: OwnershipInfo, expr: TypedExpr) =
         self.record(assignment.name)
         self.visitExpr(assignment.value)
     of NodeKind.unaryExpr:
-        self.visitExpr(TypedUnaryExpr(expr).a)
+        self.visitExpr(TypedUnaryExpr(expr).operand)
     of NodeKind.binaryExpr:
         let binary = TypedBinaryExpr(expr)
-        self.visitExpr(binary.a)
-        self.visitExpr(binary.b)
+        self.visitExpr(binary.left)
+        self.visitExpr(binary.right)
     else:
         discard
@@ -12,7 +12,7 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.

-import frontend/compiler/type_checker
+import frontend/compiler/type_checking
 import frontend/compiler/ownership_analysis

 import std/tables

@@ -37,8 +37,6 @@ type

 func newCheckPlan*: CheckPlan = CheckPlan(derefChecks: newTable[uint, CheckDecision]())

-func exprKey(expression: TypedExpr): uint {.inline.} = cast[uint](expression)
-
 func baseKey(expression: TypedExpr): uint =
     if expression.isNil():
         return 0
@@ -174,11 +172,11 @@ proc planExpr(plan: CheckPlan, ownershipInfo: OwnershipInfo, expression: TypedEx
         plan.planExpr(ownershipInfo, assignment.value, validated)
         validated.invalidate()
     of NodeKind.unaryExpr:
-        plan.planExpr(ownershipInfo, TypedUnaryExpr(expression).a, validated)
+        plan.planExpr(ownershipInfo, TypedUnaryExpr(expression).operand, validated)
     of NodeKind.binaryExpr:
         let binary = TypedBinaryExpr(expression)
-        plan.planExpr(ownershipInfo, binary.a, validated)
-        plan.planExpr(ownershipInfo, binary.b, validated)
+        plan.planExpr(ownershipInfo, binary.left, validated)
+        plan.planExpr(ownershipInfo, binary.right, validated)
     else:
         discard
File diff suppressed because it is too large

src/frontend/compiler/type_checking.nim (new file, 596 lines)
@@ -0,0 +1,596 @@
|
||||
# Copyright 2026 Mattia Giambirtone & All Contributors
|
||||
#
|
||||
# Licensed under the Apache License, Version 2.0 (the "License");
|
||||
# you may not use this file except in compliance with the License.
|
||||
# You may obtain a copy of the License at
|
||||
#
|
||||
# http://www.apache.org/licenses/LICENSE-2.0
|
||||
#
|
||||
# Unless required by applicable law or agreed to in writing, software
|
||||
# distributed under the License is distributed on an "AS IS" BASIS,
|
||||
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
# See the License for the specific language governing permissions and
|
||||
# limitations under the License.
|
||||
|
||||
## Entry-point module for the peon type checker. Ties together all the
|
||||
## submodules extracted from the original monolithic type_checker and
|
||||
## exposes the public API consumed by the rest of the compiler.
|
||||
|
||||
import frontend/compiler/type_checking/types
|
||||
import frontend/compiler/type_checking/type_utils
|
||||
import frontend/compiler/type_checking/comparison
|
||||
import frontend/compiler/type_checking/pragmas
|
||||
import frontend/compiler/type_checking/user_defines
|
||||
import frontend/compiler/type_checking/borrow_check
|
||||
import frontend/compiler/type_checking/interfaces
|
||||
import frontend/compiler/type_checking/builtins
|
||||
import frontend/compiler/type_checking/names
|
||||
import frontend/compiler/type_checking/resolution
|
||||
import frontend/compiler/type_checking/expressions
|
||||
import frontend/compiler/type_checking/declarations
|
||||
|
||||
import frontend/compiler/module_loader
|
||||
import frontend/compiler/comptime_eval
|
||||
import frontend/parsing/parser
|
||||
|
||||
import std/os
|
||||
import std/sets
|
||||
import std/tables
|
||||
import std/strformat
|
||||
|
||||
export types, type_utils, comparison, user_defines, names, expressions, resolution
|
||||
|
||||
|
||||
## Forward declarations
|
||||
proc typecheck(self: TypeChecker, node: ASTNode): TypedNode
|
||||
proc typecheckModule(self: TypeChecker, module: LoadedModule): seq[TypedNode]
|
||||
|
||||
|
||||
# ---------------------------------------------------------------------------
|
||||
# Module import / export helpers
|
||||
# ---------------------------------------------------------------------------
|
||||
|
||||
proc importKey(node: ImportStmt): string =
|
||||
## Returns a stable cache key for a specific import site
|
||||
&"{node.file}:{node.token.pos.start}:{node.token.line}"
|
||||
|
||||
|
||||
proc recordModuleExport(self: TypeChecker, name: Name) =
|
||||
## Records a name as exportable from the current module
|
||||
let modulePath = self.currentModule.absPath
|
||||
if not self.moduleExports.hasKey(modulePath):
|
||||
self.moduleExports[modulePath] = @[]
|
||||
if name notin self.moduleExports[modulePath]:
|
||||
self.moduleExports[modulePath].add(name)
|
||||
|
||||
|
||||
proc getOrCreateModule(self: TypeChecker, path: string): Name =
|
||||
## Creates or retrieves the stable module descriptor for the given path
|
||||
let absPath = absolutePath(path)
|
||||
if self.knownModules.hasKey(absPath):
|
||||
return self.knownModules[absPath]
|
||||
let moduleName = Name(kind: NameKind.Module,
|
||||
depth: 0,
|
||||
isPrivate: true,
|
||||
owner: nil,
|
||||
file: absPath,
|
||||
path: absPath,
|
||||
absPath: absPath,
|
||||
ident: newIdentExpr(Token(lexeme: splitFile(absPath).name, kind: Identifier)),
|
||||
line: 1,
|
||||
names: newTable[string, seq[Name]]())
|
||||
    moduleName.module = moduleName
    self.knownModules[absPath] = moduleName
    return moduleName


proc findModuleExports(self: TypeChecker, modulePath, symbol: string): seq[Name] =
    ## Finds all exported names with the given symbol in the module export table
    for name in self.moduleExports.getOrDefault(modulePath, @[]):
        if name.ident.token.lexeme == symbol:
            result.add(name)


proc bindImport(self: TypeChecker, imported: Name, importer: Name) =
    ## Marks an exported name as visible to the importing module
    if importer notin imported.exportedTo:
        imported.exportedTo.add(importer)


proc importedModuleAlias(importNode: ImportStmt): IdentExpr =
    ## Derives the local module alias from the imported module path.
    let modulePath = importNode.moduleName.token.lexeme
    var alias = splitFile(modulePath).name
    if alias.len() == 0:
        alias = modulePath
    let offset = max(0, modulePath.len() - alias.len())
    let token = Token(kind: Identifier,
                      lexeme: alias,
                      line: importNode.moduleName.token.line,
                      pos: (importNode.moduleName.token.pos.start + offset,
                            importNode.moduleName.token.pos.start + offset + alias.len()),
                      relPos: (importNode.moduleName.token.relPos.start + offset,
                               importNode.moduleName.token.relPos.start + offset + alias.len()))
    result = newIdentExpr(token)
    result.file = importNode.file


proc declareModuleAlias(self: TypeChecker, importNode: ImportStmt, importedModule: Name): Name =
    ## Declares a local module alias for the import statement
    var exports = newTable[string, seq[Name]]()
    for name in self.moduleExports.getOrDefault(importedModule.absPath, @[]):
        if not exports.hasKey(name.ident.token.lexeme):
            exports[name.ident.token.lexeme] = @[]
        exports[name.ident.token.lexeme].add(name)
    let aliasIdent = importedModuleAlias(importNode)
    result = Name(kind: NameKind.Module,
                  depth: self.scopeDepth,
                  ident: aliasIdent,
                  module: self.currentModule,
                  file: self.file,
                  isPrivate: true,
                  valueType: nil,
                  owner: self.currentFunction,
                  line: importNode.token.line,
                  node: nil,
                  path: importNode.moduleName.token.lexeme,
                  absPath: importedModule.absPath,
                  names: exports)
    self.addName(result)


proc declareImportedAlias(self: TypeChecker, imported: Name, localName: IdentExpr): Name =
    ## Declares a local alias for an imported exported name
    result = Name(kind: imported.kind,
                  ident: localName,
                  module: self.currentModule,
                  file: self.file,
                  depth: self.scopeDepth,
                  isPrivate: true,
                  valueType: imported.valueType,
                  owner: self.currentFunction,
                  line: localName.token.line,
                  node: imported.node)
    self.addName(result)
    if not result.valueType.isNil() and result.valueType.kind == TypeKind.Function:
        self.recordCapabilities(result)


# ---------------------------------------------------------------------------
# Import / export statements
# ---------------------------------------------------------------------------

proc importStmt(self: TypeChecker, node: ImportStmt): TypedImportStmt =
    ## Typechecks imports by loading the imported module, exposing its exports,
    ## and preserving the import site for later code generation
    let key = importKey(node)
    if self.resolvedImports.hasKey(key):
        return self.resolvedImports[key]
    if self.moduleLoader.isNil():
        self.error("imports require module-aware typechecking", node)
    let importedPath = absolutePath(resolveModulePath(self.file, node.moduleName.token.lexeme, node.moduleName.token.line))
    let loaded = self.moduleLoader.loadModule(importedPath,
                                              moduleName=node.moduleName.token.lexeme,
                                              importer=self.file,
                                              importLine=node.moduleName.token.line)
    discard self.typecheckModule(loaded)
    let importedModule = self.getOrCreateModule(importedPath)
    var localBindings: seq[Name] = @[]
    let exports =
        if not node.fromImport:
            self.moduleExports.getOrDefault(importedPath, @[])
        else:
            block:
                var selected: seq[Name] = @[]
                for item in node.items:
                    let matches = self.findModuleExports(importedPath, item.name.token.lexeme)
                    if matches.len() == 0:
                        self.error(&"cannot import '{item.name.token.lexeme}' from '{node.moduleName.token.lexeme}': name is not exported", item.name)
                    for exported in matches:
                        let localName = if item.alias.isNil(): item.name else: item.alias
                        localBindings.add(self.declareImportedAlias(exported, localName))
                        selected.add(exported)
                selected
    if not node.fromImport:
        localBindings.add(self.declareModuleAlias(node, importedModule))
    for exported in exports:
        self.bindImport(exported, self.currentModule)
    result = newTypedImportStmt(node, importedModule, exports, localBindings)
    self.resolvedImports[key] = result


proc exportStmt(self: TypeChecker, node: ExportStmt) =
    ## Records an export from the current module, including re-exports
    let exportedExpr = self.expression(node.name)
    let exported = exportedExpr.getName()
    if exported.isNil():
        self.error("export statements require an identifier or module member access", node.name)
    self.recordModuleExport(exported)
    if exported.kind == NameKind.Module:
        for members in exported.names.values():
            for member in members:
                self.recordModuleExport(member)


# ---------------------------------------------------------------------------
# Pre-declaration pass
# ---------------------------------------------------------------------------

proc predeclareTopLevelTypes(self: TypeChecker, tree: ParseTree) =
    for node in tree:
        if node.kind != NodeKind.typeDecl:
            continue
        discard self.declare(TypeDecl(node))


# ---------------------------------------------------------------------------
# Main AST node dispatcher
# ---------------------------------------------------------------------------

proc typecheck(self: TypeChecker, node: ASTNode): TypedNode =
    ## Dispatches typeless AST nodes to typecheck them and turn
    ## them into typed ones
    if not node.isNil():
        node.activeDefines = snapshotActiveDefines(self.userDefines)
    case node.kind:
        of NodeKind.binaryExpr, NodeKind.unaryExpr, NodeKind.genericExpr,
           NodeKind.assignExpr, NodeKind.blockExpr, NodeKind.ifExpr,
           NodeKind.whenExpr, NodeKind.whileExpr, NodeKind.matchExpr,
           NodeKind.identExpr, NodeKind.getterExpr, NodeKind.setterExpr,
           NodeKind.groupingExpr, NodeKind.callExpr,
           NodeKind.intExpr, NodeKind.floatExpr, NodeKind.octExpr,
           NodeKind.binExpr, NodeKind.hexExpr, NodeKind.trueExpr, NodeKind.falseExpr,
           NodeKind.nanExpr, NodeKind.infExpr:
            result = self.expression(Expression(node))
        of NodeKind.exprStmt:
            let statement = ExprStmt(node)
            result = TypedExprStmt(node: statement, expression: TypedExpr(self.typecheck(statement.expression)))
        of NodeKind.whileStmt:
            result = self.whileStmt(WhileStmt(node))
        of NodeKind.blockStmt:
            result = self.blockStmt(BlockStmt(node))
        of NodeKind.namedBlockStmt:
            result = self.namedBlockStmt(NamedBlockStmt(node))
        of NodeKind.ifStmt:
            result = self.ifStmt(IfStmt(node))
        of NodeKind.whenStmt:
            result = self.whenStmt(WhenStmt(node))
        of NodeKind.matchStmt:
            result = self.matchStmt(MatchStmt(node))
        of NodeKind.breakStmt:
            result = self.breakStmt(BreakStmt(node))
        of NodeKind.continueStmt:
            result = self.continueStmt(ContinueStmt(node))
        of NodeKind.returnStmt:
            result = self.returnStmt(ReturnStmt(node))
        of NodeKind.varDecl:
            result = self.varDecl(VarDecl(node))
        of NodeKind.funDecl:
            result = self.funDecl(FunDecl(node))
        of NodeKind.typeDecl:
            result = self.typeDecl(TypeDecl(node))
        of NodeKind.importStmt:
            result = self.importStmt(ImportStmt(node))
        of NodeKind.exportStmt:
            self.exportStmt(ExportStmt(node))
        of NodeKind.pragmaExpr:
            # Pragma "expressions" (they're more like compiler directives)
            # don't really return anything
            self.pragmaExpr(Pragma(node))
        else:
            self.error(&"failed to dispatch node of type {node.kind}", node)


# ---------------------------------------------------------------------------
# Module-level type checking
# ---------------------------------------------------------------------------

proc typecheckModule(self: TypeChecker, module: LoadedModule): seq[TypedNode] =
    ## Typechecks a single loaded module within the current checker session
    let modulePath = absolutePath(module.path)
    if self.typedModules.hasKey(modulePath):
        return self.typedModules[modulePath]
    if modulePath in self.typingModules:
        self.error(&"cyclic dependency detected while typechecking '{modulePath}'", module.tree[0])
    self.typingModules.incl(modulePath)
    let
        previousCurrent = self.current
        previousTree = self.tree
        previousSource = self.source
        previousFile = self.file
        previousCurrentFunction = self.currentFunction
        previousCurrentModule = self.currentModule
        previousUserDefines = self.userDefines
        previousPushedUserDefines = self.pushedUserDefines
        previousPushedConfigNode = self.pushedConfigNode
        previousDisabledWarnings = cloneDisabledWarnings(self.disabledWarnings)
        previousPushedDisabledWarnings = cloneDisabledWarnings(self.pushedDisabledWarnings)
    self.current = 0
    self.tree = module.tree
    self.source = module.source
    self.file = module.path
    self.currentFunction = nil
    self.currentModule = self.getOrCreateModule(modulePath)
    self.userDefines = cloneUserDefines(self.baseUserDefines)
    self.pushedUserDefines = nil
    self.pushedConfigNode = nil
    self.disabledWarnings = cloneDisabledWarnings(self.disabledWarnings)
    self.pushedDisabledWarnings = @[]
    for node in module.tree:
        if node.kind == NodeKind.importStmt:
            discard self.importStmt(ImportStmt(node))
    self.predeclareTopLevelTypes(module.tree)
    while not self.done():
        let node = self.step()
        let typed = self.typecheck(node)
        if not typed.isNil():
            result.add(typed)
    if not self.pushedUserDefines.isNil():
        self.error("unterminated configuration context: missing '#pragma[pop]'", self.pushedConfigNode)
    self.validateDeclaredInterfaces(result)
    self.validateUnusedWarnings(result)
    validateRawPointerLeakWarnings(self, result)
    self.typedModules[modulePath] = result
    self.current = previousCurrent
    self.tree = previousTree
    self.source = previousSource
    self.file = previousFile
    self.currentFunction = previousCurrentFunction
    self.currentModule = previousCurrentModule
    self.userDefines = previousUserDefines
    self.pushedUserDefines = previousPushedUserDefines
    self.pushedConfigNode = previousPushedConfigNode
    self.disabledWarnings = previousDisabledWarnings
    self.pushedDisabledWarnings = previousPushedDisabledWarnings
    self.typingModules.excl(modulePath)


# ---------------------------------------------------------------------------
# Public API
# ---------------------------------------------------------------------------

proc typecheck*(self: TypeChecker, tree: ParseTree, file, source: string, showMismatches: bool = false,
                disabledWarnings: seq[WarningKind] = @[]): seq[TypedNode] =
    ## Transforms a sequence of typeless AST nodes
    ## into a sequence of typed AST nodes
    self.file = file
    self.entryModulePath =
        if file in ["", "<string>"]:
            file
        else:
            absolutePath(file)
    self.source = source
    self.tree = tree
    self.current = 0
    self.scopeDepth = -1
    self.isMainModule = false
    self.currentFunction = nil
    self.showMismatches = showMismatches
    self.disabledWarnings = cloneDisabledWarnings(disabledWarnings)
    self.names = @[]
    self.capabilities = newTable[string, seq[TypeCapabilities]]()
    self.intrinsicInterfaces = newTable[string, seq[Type]]()
    self.moduleLoader = nil
    self.typedModules = newTable[string, seq[TypedNode]]()
    self.moduleExports = newTable[string, seq[Name]]()
    self.knownModules = newTable[string, Name]()
    self.resolvedImports = newTable[string, TypedImportStmt]()
    self.typingModules = initHashSet[string]()
    self.initializedNames = initHashSet[uint]()
    self.nameInits = newTable[uint, TypedExpr]()
    self.typedFunctions = newTable[uint, TypedFunDecl]()
    self.lentReturnSummaries = newTable[uint, LentOriginSet]()
    self.loopDepth = 0
    self.namedBlocks = @[]
    self.lambdaCounter = 0
    self.userDefines = cloneUserDefines(self.baseUserDefines)
    self.pushedUserDefines = nil
    self.pushedConfigNode = nil
    self.pushedDisabledWarnings = @[]
    self.beginScope()
    var mainModule = Name(kind: NameKind.Module,
                          depth: 0,
                          isPrivate: true,
                          owner: nil,
                          file: self.file,
                          path: self.file,
                          absPath: self.file,
                          ident: newIdentExpr(Token(lexeme: self.file, kind: Identifier)),
                          line: 1,
                          names: newTable[string, seq[Name]]())
    mainModule.module = mainModule
    self.addName(mainModule)
    self.currentModule = mainModule
    self.predeclareTopLevelTypes(tree)
    # Every peon program has a hidden entry point in
    # which user code is wrapped. Think of it as if
    # peon is implicitly writing the main() function
    # of your program and putting all of your code in
    # there
    var main = Name(depth: 0,
                    isPrivate: true,
                    owner: self.currentModule,
                    file: self.file,
                    module: self.currentModule,
                    valueType: Type(kind: Function,
                                    returnType: nil,
                                    signature: @[],
                                    ),
                    ident: newIdentExpr(Token(lexeme: "", kind: Identifier)),
                    line: 1)
    self.addName(main)
    var node: TypedNode
    while not self.done():
        node = self.typecheck(self.step())
        if node.isNil():
            continue
        result.add(node)
    if not self.pushedUserDefines.isNil():
        self.error("unterminated configuration context: missing '#pragma[pop]'", self.pushedConfigNode)
    self.validateDeclaredInterfaces(result)
    self.validateUnusedWarnings(result)
    validateRawPointerLeakWarnings(self, result)
    doAssert self.scopeDepth == 0
    # Do not close the global scope if
    # we're being imported
    if self.isMainModule:
        self.endScope()


proc typecheck*(self: TypeChecker, module: LoadedModule, loader: ModuleLoader,
                showMismatches: bool = false, disabledWarnings: seq[WarningKind] = @[]): seq[TypedNode] =
    ## Typechecks a loader-owned module graph and returns the typed main module
    self.current = 0
    self.tree = @[]
    self.scopeDepth = -1
    self.source = ""
    self.file = module.path
    self.entryModulePath = absolutePath(module.path)
    self.isMainModule = true
    self.currentFunction = nil
    self.currentModule = nil
    self.disabledWarnings = cloneDisabledWarnings(disabledWarnings)
    self.names = @[]
    self.showMismatches = showMismatches
    self.capabilities = newTable[string, seq[TypeCapabilities]]()
    self.intrinsicInterfaces = newTable[string, seq[Type]]()
    self.moduleLoader = loader
    self.typedModules = newTable[string, seq[TypedNode]]()
    self.moduleExports = newTable[string, seq[Name]]()
    self.knownModules = newTable[string, Name]()
    self.resolvedImports = newTable[string, TypedImportStmt]()
    self.typingModules = initHashSet[string]()
    self.initializedNames = initHashSet[uint]()
    self.nameInits = newTable[uint, TypedExpr]()
    self.typedFunctions = newTable[uint, TypedFunDecl]()
    self.lentReturnSummaries = newTable[uint, LentOriginSet]()
    self.loopDepth = 0
    self.namedBlocks = @[]
    self.lambdaCounter = 0
    self.userDefines = cloneUserDefines(self.baseUserDefines)
    self.pushedUserDefines = nil
    self.pushedConfigNode = nil
    self.pushedDisabledWarnings = @[]
    self.beginScope()
    result = self.typecheckModule(module)


proc collectTypedModules*(self: TypeChecker, order: seq[string]): seq[TypedNode] =
    ## Collects typed modules in dependency order for code generation
    for path in order:
        result.add(self.typedModules.getOrDefault(absolutePath(path), @[]))


# ---------------------------------------------------------------------------
# Constructor
# ---------------------------------------------------------------------------

proc typecheckWrapper(self: TypeChecker, node: ASTNode): TypedNode =
    typecheck(self, node)

proc funDeclWrapper(self: TypeChecker, node: FunDecl, name: Name): TypedFunDecl =
    funDecl(self, node, name)

proc typecheckModuleWrapper(self: TypeChecker, module: LoadedModule): seq[TypedNode] =
    typecheckModule(self, module)

proc getOrCreateModuleWrapper(self: TypeChecker, path: string): Name =
    getOrCreateModule(self, path)

proc recordModuleExportWrapper(self: TypeChecker, name: Name) =
    recordModuleExport(self, name)

proc findModuleExportsWrapper(self: TypeChecker, modulePath, symbol: string): seq[Name] =
    findModuleExports(self, modulePath, symbol)

proc bindImportWrapper(self: TypeChecker, imported: Name, importer: Name) =
    bindImport(self, imported, importer)

proc declareModuleAliasWrapper(self: TypeChecker, importNode: ImportStmt, importedModule: Name): Name =
    declareModuleAlias(self, importNode, importedModule)

proc declareImportedAliasWrapper(self: TypeChecker, imported: Name, localName: IdentExpr): Name =
    declareImportedAlias(self, imported, localName)


proc matchWrapper(self: TypeChecker, name: string, sig: TypeSignature,
                  args: seq[TypedExpr], node: ASTNode): Name =
    match(self, name, sig, args, node)


proc attachConstructContextWrapper(self: TypeChecker, construct: TypedConstructExpr, expected: Type) =
    ## Wrapper for the proc-var hook
    let genericArgs = self.inferConstructGenericArgs(construct.callee, expected)
    if genericArgs.len() == 0:
        return
    if not construct.variant.isNil():
        construct.variant = self.instantiateDeclaredType(construct.callee, construct.variant, genericArgs, construct.node)
        return
    construct.templateDecl = construct.callee
    construct.genericArgs = genericArgs


proc inferConstructGenericArgsWrapper(self: TypeChecker, callee: Name, expected: Type): seq[GenericArg] =
    ## Wrapper for the proc-var hook
    inferConstructGenericArgs(self, callee, expected)


proc instantiateDeclaredTypeWrapper(self: TypeChecker, templateDecl: Name, typ: Type,
                                    genericArgs: seq[GenericArg], node: ASTNode): Type =
    ## Wrapper for the proc-var hook
    instantiateDeclaredType(self, templateDecl, typ, genericArgs, node)


proc dispatchPragmasWrapper(self: TypeChecker, name: Name) =
    ## Wrapper for the proc-var hook
    dispatchPragmas(self, name)


proc newTypeChecker*(userDefines: TableRef[string, UserDefine] = nil): TypeChecker =
    let resolved = resolveUserDefines(userDefines)
    result = newTypeCheckerBase(resolved)
    result.expressionImpl = expression
    result.attachConstructContextImpl = attachConstructContextWrapper
    result.inferConstructGenericArgsImpl = inferConstructGenericArgsWrapper
    result.instantiateDeclaredTypeImpl = instantiateDeclaredTypeWrapper
    result.dispatchPragmasImpl = dispatchPragmasWrapper
    result.matchImpl = matchWrapper
    result.typecheckImpl = typecheckWrapper
    result.funDeclImpl = funDeclWrapper
    result.typecheckModuleImpl = typecheckModuleWrapper
    result.getOrCreateModuleImpl = getOrCreateModuleWrapper
    result.recordModuleExportImpl = recordModuleExportWrapper
    result.findModuleExportsImpl = findModuleExportsWrapper
    result.bindImportImpl = bindImportWrapper
    result.declareModuleAliasImpl = declareModuleAliasWrapper
    result.declareImportedAliasImpl = declareImportedAliasWrapper
    # Register all pragma handlers
    result.pragmas["magic"] = PragmaFunc(kind: Immediate, handler: handleMagicPragma)
    result.pragmas["pure"] = PragmaFunc(kind: Immediate, handler: handlePurePragma)
    result.pragmas["error"] = PragmaFunc(kind: Delayed, handler: handleErrorPragma)
    result.pragmas["warn"] = PragmaFunc(kind: Delayed, handler: handleWarnPragma)
    result.pragmas["warning"] = PragmaFunc(kind: Delayed, handler: handleWarnPragma)
    result.pragmas["deprecated"] = PragmaFunc(kind: Delayed, handler: handleDeprecatedPragma)
    result.pragmas["used"] = PragmaFunc(kind: Immediate, handler: handleUsedPragma)
    result.pragmas["noinit"] = PragmaFunc(kind: Immediate, handler: handleNoInitPragma)
    result.pragmas["noreturn"] = PragmaFunc(kind: Immediate, handler: handleNoReturnPragma)
    result.pragmas["inline"] = PragmaFunc(kind: Immediate, handler: handleInlinePragma)
    result.pragmas["importc"] = PragmaFunc(kind: Immediate, handler: handleImportCPragma)
    result.pragmas["exportc"] = PragmaFunc(kind: Immediate, handler: handleExportCPragma)
    result.pragmas["header"] = PragmaFunc(kind: Immediate, handler: handleHeaderPragma)
    result.pragmas["define"] = PragmaFunc(kind: Immediate, handler: handleDefinePragma)
    result.pragmas["booldefine"] = PragmaFunc(kind: Immediate, handler: handleBoolDefinePragma)
    result.pragmas["push"] = PragmaFunc(kind: Immediate, handler: handlePushPragma)
    result.pragmas["pop"] = PragmaFunc(kind: Immediate, handler: handlePopPragma)
    result.pragmas["checks"] = PragmaFunc(kind: Immediate, handler: handleChecksConfigPragma)
    result.pragmas["boundChecks"] = PragmaFunc(kind: Immediate, handler: handleBoundChecksConfigPragma)
    result.pragmas["overflowCheck"] = PragmaFunc(kind: Immediate, handler: handleOverflowCheckConfigPragma)
    result.pragmas["overflowChecks"] = PragmaFunc(kind: Immediate, handler: handleOverflowCheckConfigPragma)
    result.pragmas["floatCheck"] = PragmaFunc(kind: Immediate, handler: handleFloatCheckConfigPragma)
    result.pragmas["floatChecks"] = PragmaFunc(kind: Immediate, handler: handleFloatCheckConfigPragma)
    result.pragmas["lineTrace"] = PragmaFunc(kind: Immediate, handler: handleLineTraceConfigPragma)
    result.pragmas["stackTrace"] = PragmaFunc(kind: Immediate, handler: handleStackTraceConfigPragma)
    result.pragmas["debug"] = PragmaFunc(kind: Immediate, handler: handleDebugConfigPragma)
    result.pragmas["release"] = PragmaFunc(kind: Immediate, handler: handleReleaseConfigPragma)
    result.pragmas["danger"] = PragmaFunc(kind: Immediate, handler: handleDangerConfigPragma)
    result.pragmas["noWarn"] = PragmaFunc(kind: Immediate, handler: handleNoWarnConfigPragma)
492
src/frontend/compiler/type_checking/borrow_check.nim
Normal file
@@ -0,0 +1,492 @@
# Copyright 2026 Mattia Giambirtone & All Contributors
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

import frontend/compiler/type_checking/types
import frontend/compiler/type_checking/type_utils
import frontend/compiler/type_checking/comparison

import std/sets
import std/tables
import std/strformat
import std/strutils


# borrowSourceMutable, borrowType moved to comparison.nim

# Forward declarations
proc lentReturnSummary*(self: TypeChecker, function: Name, active: var HashSet[uint]): LentOriginSet
proc substituteParamOrigins*(self: TypeChecker, summary: LentOriginSet, args: seq[TypedExpr], function: Name,
                             seenExprs: var HashSet[uint], activeFns: var HashSet[uint]): LentOriginSet

proc canPassVarArgument*(self: TypeChecker, expr: TypedExpr): bool =
    if expr.isNil():
        return false
    let name = expr.getName()
    if not name.isNil() and name.kind == NameKind.Var:
        return self.borrowSourceMutable(expr)
    if not self.borrowSourceMutable(expr):
        return false
    expr of TypedFieldExpr or expr of TypedIndexExpr or expr of TypedDerefExpr


proc describeVarArgumentMismatch*(self: TypeChecker, expr: TypedExpr): string =
    if expr.isNil() or expr.kind.isNil():
        return "the argument is not a mutable l-value"

    let name = expr.getName()
    if not name.isNil() and name.kind == NameKind.Var:
        if name.isVarParam:
            return "the argument is not a mutable l-value"
        if not name.node.isNil() and name.node.kind == NodeKind.varDecl:
            let decl = VarDecl(name.node)
            if decl.constant:
                return &"'{name.ident.token.lexeme}' is a constant"
            if not decl.mutable:
                return &"'{name.ident.token.lexeme}' is immutable"

    let targetType = expr.kind.unwrapType()
    if targetType.kind == Pointer and targetType.value.unwrapType().kind == TypeKind.Const:
        return "the argument points to const storage"
    if targetType.kind in {Reference, TypeKind.Lent} and not targetType.mutable:
        if not name.isNil():
            return &"'{name.ident.token.lexeme}' is an immutable handle"
        return "the argument is an immutable handle"

    if expr of TypedFieldExpr:
        return self.describeVarArgumentMismatch(TypedFieldExpr(expr).obj)
    if expr of TypedIndexExpr:
        return self.describeVarArgumentMismatch(TypedIndexExpr(expr).obj)
    if expr of TypedDerefExpr:
        let source = TypedDerefExpr(expr).target
        let sourceType = source.kind.unwrapType()
        if sourceType.kind == Pointer and sourceType.value.unwrapType().kind == TypeKind.Const:
            return "the argument points to const storage"
        if sourceType.kind in {Reference, TypeKind.Lent} and not sourceType.mutable:
            let sourceName = source.getName()
            if not sourceName.isNil():
                return &"'{sourceName.ident.token.lexeme}' is an immutable handle"
            return "the argument is an immutable handle"

    "the argument is not a mutable l-value"


proc lowerVarArgument*(self: TypeChecker, arg: TypedExpr, expected: Type,
                       node: ASTNode): TypedExpr =
    if not self.canPassVarArgument(arg):
        self.error("var parameters currently require a mutable l-value", node)
    self.coerceWithContext(arg, expected, node)


proc canPassMoveArgument*(self: TypeChecker, expr: TypedExpr): bool =
    if expr.isNil():
        return false
    let name = expr.getName()
    if not name.isNil() and name.kind == NameKind.Var:
        return not self.borrowType(expr).isNil()
    if self.borrowType(expr).isNil():
        return false
    expr of TypedFieldExpr or expr of TypedIndexExpr or expr of TypedDerefExpr


proc lowerMoveArgument*(self: TypeChecker, arg: TypedExpr, expected: Type,
                        node: ASTNode): TypedExpr =
    if not self.canPassMoveArgument(arg):
        self.error("move sources currently require a stable l-value", node)
    self.coerceWithContext(arg, expected, node)


proc isTemporaryBorrow*(self: TypeChecker, expr: TypedExpr): bool =
    expr of TypedBorrowExpr and self.borrowType(TypedBorrowExpr(expr).target).isNil()


func tracksMove*(self: Name): bool =
    ## Returns whether a name requires move tracking. A name tracks moves
    ## when it is a variable with a non-nil value type and one of these
    ## conditions holds: the type is a Reference (owned handle); or the
    ## type is a concrete value type (not Lent, Const, Any, Typevar,
    ## Generic, Union, or an interface Structure) that may have copy/move/
    ## destroy hooks.
    if self.isNil() or self.kind != NameKind.Var or self.valueType.isNil():
        return false
    let unwrapped = self.valueType.unwrapType()
    # Owned handles always need move tracking
    if unwrapped.kind == Reference:
        return true
    # Types that are inherently borrowed / polymorphic / abstract
    # never need move tracking on their own
    let exemptKind = unwrapped.kind in {TypeKind.Lent, TypeKind.Const,
                                        Any, Typevar, Generic, Union}
    let isAbstractStructure = unwrapped.kind == Structure and unwrapped.isInterface
    if exemptKind or isAbstractStructure:
        return unwrapped.copyHook != nil or
               unwrapped.moveHook != nil or
               unwrapped.destroyHook != nil
    # Concrete value types with lifecycle hooks
    if unwrapped.copyHook != nil or unwrapped.moveHook != nil or unwrapped.destroyHook != nil:
        return true
    # All other concrete value types (not exempt, not abstract)
    true


proc clearMoved*(self: TypeChecker, name: Name) =
    if name.isNil():
        return
    self.movedNames.excl(name.nameKey())


proc ensureNotMoved*(self: TypeChecker, name: Name, node: ASTNode) =
    if name.isNil() or name.nameKey() notin self.movedNames:
        return
    self.error(&"cannot use '{name.ident.token.lexeme}' after it was moved", node)


proc moveSourceName*(expr: TypedExpr): Name =
    if expr.isNil():
        return nil
    if expr of TypedMoveExpr:
        return moveSourceName(TypedMoveExpr(expr).target)
    if expr of TypedIdentExpr:
        return TypedIdentExpr(expr).name
    nil


proc consumeMoves*(self: TypeChecker, expr: TypedExpr)
|
||||
|
||||
|
||||
proc consumeMoves*(self: TypeChecker, expr: TypedExpr) =
|
||||
if expr.isNil():
|
||||
return
|
||||
if expr of TypedMoveExpr:
|
||||
let key = cast[uint](expr)
|
||||
if key in self.consumedMoves:
|
||||
return
|
||||
self.consumedMoves.incl(key)
|
||||
let name = moveSourceName(expr)
|
||||
if not name.isNil() and name.tracksMove():
|
||||
self.ensureNotMoved(name, expr.node)
|
||||
self.movedNames.incl(name.nameKey())
|
||||
self.consumeMoves(TypedMoveExpr(expr).target)
|
||||
return
|
||||
if expr of TypedDerefExpr:
|
||||
self.consumeMoves(TypedDerefExpr(expr).target)
|
||||
return
|
||||
if expr of TypedBorrowExpr:
|
||||
self.consumeMoves(TypedBorrowExpr(expr).target)
|
||||
return
|
||||
if expr of TypedConstructExpr:
|
||||
for field in TypedConstructExpr(expr).fields.values():
|
||||
self.consumeMoves(field)
|
||||
return
|
||||
if expr of TypedArrayConstructExpr:
|
||||
for element in TypedArrayConstructExpr(expr).elements:
|
||||
self.consumeMoves(element)
|
||||
return
|
||||
if expr of TypedGenericExpr:
|
||||
for arg in TypedGenericExpr(expr).args:
|
||||
self.consumeMoves(arg)
|
||||
return
|
||||
case expr.node.kind
|
||||
of NodeKind.callExpr:
|
||||
self.consumeMoves(TypedCallExpr(expr).callee)
|
||||
for arg in TypedCallExpr(expr).args:
|
||||
self.consumeMoves(arg)
|
||||
of NodeKind.getterExpr:
|
||||
if expr of TypedFieldExpr:
|
||||
self.consumeMoves(TypedFieldExpr(expr).obj)
|
||||
elif expr of TypedIndexExpr:
|
||||
self.consumeMoves(TypedIndexExpr(expr).obj)
|
||||
self.consumeMoves(TypedIndexExpr(expr).indexExpr)
|
||||
of NodeKind.sliceExpr:
|
||||
self.consumeMoves(TypedIndexExpr(expr).obj)
|
||||
self.consumeMoves(TypedIndexExpr(expr).indexExpr)
|
||||
of NodeKind.assignExpr, NodeKind.setterExpr:
|
||||
let assignment = TypedAssignExpr(expr)
|
||||
self.consumeMoves(assignment.obj)
|
||||
self.consumeMoves(assignment.indexExpr)
|
||||
self.consumeMoves(assignment.value)
|
||||
of NodeKind.unaryExpr:
|
||||
self.consumeMoves(TypedUnaryExpr(expr).operand)
|
||||
of NodeKind.binaryExpr:
|
||||
let binary = TypedBinaryExpr(expr)
|
||||
self.consumeMoves(binary.left)
|
||||
self.consumeMoves(binary.right)
|
||||
else:
|
||||
discard
|
||||
|
||||
|
||||
proc isParameterOf*(self: TypeChecker, function, name: Name): bool =
  if function.isNil() or name.isNil() or function.node.isNil() or function.node.kind != NodeKind.funDecl:
    return false
  name.ident.token.lexeme in FunDecl(function.node).parameters


proc getParameterDecl*(self: TypeChecker, function, name: Name): Parameter =
  if not self.isParameterOf(function, name):
    return nil
  FunDecl(function.node).parameters.getOrDefault(name.ident.token.lexeme)


proc isVarParameter*(self: TypeChecker, function, name: Name): bool =
  let parameter = self.getParameterDecl(function, name)
  not parameter.isNil() and parameter.isVar


proc parameterIndex*(self: TypeChecker, function, name: Name): int =
  if not self.isParameterOf(function, name):
    return -1
  var index = 0
  for parameterName in FunDecl(function.node).parameters.keys():
    if parameterName == name.ident.token.lexeme:
      return index
    inc(index)
  -1


proc isGlobalStorageName*(name: Name): bool =
  not name.isNil() and name.depth <= 0


proc classifyStorageNameOrigin*(self: TypeChecker, function, name: Name): LentOriginSet =
  if name.isNil() or name.kind != NameKind.Var or name.valueType.isNil():
    return @[unknownLentOrigin()]
  let valueType = name.valueType.unwrapType()
  if self.isParameterOf(function, name):
    if valueType.kind in {TypeKind.Lent, Pointer, TypeKind.Const} or
        self.isVarParameter(function, name):
      let index = self.parameterIndex(function, name)
      if index >= 0:
        return @[paramLentOrigin(index)]
      return @[unknownLentOrigin()]
    return @[localLentOrigin(name)]
  if name.isGlobalStorageName():
    return @[globalLentOrigin(name)]
  @[localLentOrigin(name)]


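# Illustrative example (hypothetical peon surface syntax, not part of this
# module): given a function
#
#   fn pick(a: lent int64, b: int64): lent int64 { var local = 0'i64; ... }
#
# classifyStorageNameOrigin maps `a` to paramLentOrigin(0) (its storage is
# owned by the caller), `b` to localLentOrigin (a by-value parameter is
# local storage), `local` to localLentOrigin, and a module-level variable
# to globalLentOrigin.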
proc storageOrigins*(self: TypeChecker, expr: TypedExpr, function: Name,
                     seenExprs: var HashSet[uint], activeFns: var HashSet[uint]): LentOriginSet
proc lentOrigins*(self: TypeChecker, expr: TypedExpr, function: Name,
                  seenExprs: var HashSet[uint], activeFns: var HashSet[uint]): LentOriginSet


proc storageOrigins*(self: TypeChecker, expr: TypedExpr, function: Name,
                     seenExprs: var HashSet[uint], activeFns: var HashSet[uint]): LentOriginSet =
  ## Traces the storage origin of an expression back through borrows,
  ## moves, derefs, field accesses, and index operations to determine
  ## which parameter, local, or global variable ultimately owns the
  ## underlying memory. This is the first stage of the lent-origin
  ## pipeline: storageOrigins identifies the concrete storage location,
  ## while lentOrigins (below) resolves borrowed-handle expressions by
  ## delegating back to storageOrigins for the borrow target.
  if expr.isNil():
    return @[unknownLentOrigin()]
  let key = cast[uint](expr)
  if key in seenExprs:
    return @[]
  seenExprs.incl(key)
  defer: seenExprs.excl(key)

  if expr of TypedBorrowExpr:
    return self.storageOrigins(TypedBorrowExpr(expr).target, function, seenExprs, activeFns)
  if expr of TypedMoveExpr:
    return self.storageOrigins(TypedMoveExpr(expr).target, function, seenExprs, activeFns)
  if expr of TypedDerefExpr:
    return self.storageOrigins(TypedDerefExpr(expr).target, function, seenExprs, activeFns)
  if expr of TypedFieldExpr:
    return self.storageOrigins(TypedFieldExpr(expr).obj, function, seenExprs, activeFns)
  if expr of TypedIndexExpr:
    return self.storageOrigins(TypedIndexExpr(expr).obj, function, seenExprs, activeFns)
  if expr of TypedIdentExpr:
    let name = TypedIdentExpr(expr).name
    if name.isNil():
      return @[unknownLentOrigin()]
    let valueType =
      if name.valueType.isNil():
        nil
      else:
        name.valueType.unwrapType()
    if not valueType.isNil() and valueType.kind == TypeKind.Lent and self.nameInits.hasKey(name.nameKey()):
      return self.lentOrigins(self.nameInits[name.nameKey()], function, seenExprs, activeFns)
    if not valueType.isNil() and valueType.kind in {Pointer, TypeKind.Const} and self.nameInits.hasKey(name.nameKey()):
      return self.storageOrigins(self.nameInits[name.nameKey()], function, seenExprs, activeFns)
    return self.classifyStorageNameOrigin(function, name)
  if not expr.kind.isNil() and expr.kind.unwrapType().kind == TypeKind.Lent:
    return self.lentOrigins(expr, function, seenExprs, activeFns)
  @[unknownLentOrigin()]


proc lentOrigins*(self: TypeChecker, expr: TypedExpr, function: Name,
                  seenExprs: var HashSet[uint], activeFns: var HashSet[uint]): LentOriginSet =
  ## Computes the set of storage origins that a lent (borrowed-handle)
  ## expression may refer to. While storageOrigins walks through
  ## structural wrappers (field, index, deref) to find the owning
  ## variable, lentOrigins specifically handles lent-typed expressions:
  ## borrow expressions delegate to storageOrigins on their target,
  ## call expressions use the callee's lentReturnSummary to substitute
  ## argument origins, and identifiers chase through their initializer
  ## expression. Together with storageOrigins, this forms the core of
  ## the lent-origin pipeline used by ensureLentReturnsDoNotEscape to
  ## verify that borrowed values returned from a function do not refer
  ## to local storage.
  if expr.isNil() or expr.kind.isNil():
    return @[]
  let key = cast[uint](expr)
  if key in seenExprs:
    return @[]
  seenExprs.incl(key)
  defer: seenExprs.excl(key)

  let kind = expr.kind.unwrapType()
  if kind.kind != TypeKind.Lent:
    return @[]

  if expr of TypedBorrowExpr:
    return self.storageOrigins(TypedBorrowExpr(expr).target, function, seenExprs, activeFns)
  if expr of TypedMoveExpr:
    return self.lentOrigins(TypedMoveExpr(expr).target, function, seenExprs, activeFns)
  if expr of TypedIdentExpr:
    let name = TypedIdentExpr(expr).name
    if not name.isNil() and self.nameInits.hasKey(name.nameKey()):
      return self.lentOrigins(self.nameInits[name.nameKey()], function, seenExprs, activeFns)
    return self.classifyStorageNameOrigin(function, name)
  if expr of TypedCallExpr:
    let call = TypedCallExpr(expr)
    let summary = self.lentReturnSummary(call.getCallableName(), activeFns)
    return self.substituteParamOrigins(summary, call.args, function, seenExprs, activeFns)
  if expr of TypedFieldExpr:
    return self.storageOrigins(TypedFieldExpr(expr).obj, function, seenExprs, activeFns)
  if expr of TypedIndexExpr:
    return self.storageOrigins(TypedIndexExpr(expr).obj, function, seenExprs, activeFns)
  if expr of TypedDerefExpr:
    return self.storageOrigins(TypedDerefExpr(expr).target, function, seenExprs, activeFns)
  @[unknownLentOrigin()]


proc argumentOrigins*(self: TypeChecker, expr: TypedExpr, function: Name,
                      seenExprs: var HashSet[uint], activeFns: var HashSet[uint]): LentOriginSet =
  if expr.isNil() or expr.kind.isNil():
    return @[unknownLentOrigin()]
  let kind = expr.kind.unwrapType()
  if kind.kind == TypeKind.Lent:
    return self.lentOrigins(expr, function, seenExprs, activeFns)
  self.storageOrigins(expr, function, seenExprs, activeFns)


proc substituteParamOrigins*(self: TypeChecker, summary: LentOriginSet, args: seq[TypedExpr], function: Name,
                             seenExprs: var HashSet[uint], activeFns: var HashSet[uint]): LentOriginSet =
  for origin in summary:
    case origin.kind:
    of ParamLentOrigin:
      if origin.paramIndex < 0 or origin.paramIndex >= args.len:
        result.addLentOrigin(unknownLentOrigin())
      else:
        result.addLentOrigins(self.argumentOrigins(args[origin.paramIndex], function, seenExprs, activeFns))
    else:
      result.addLentOrigin(origin)


proc collectReturnStatements*(self: TypeChecker, node: TypedNode, returns: var seq[TypedReturnStmt]) =
  if node.isNil() or node of TypedFunDecl:
    return
  if node of TypedReturnStmt:
    returns.add(TypedReturnStmt(node))
  elif node of TypedBlockStmt:
    for piece in TypedBlockStmt(node).body:
      self.collectReturnStatements(piece, returns)
  elif node of TypedIfStmt:
    self.collectReturnStatements(TypedIfStmt(node).thenBranch, returns)
    self.collectReturnStatements(TypedIfStmt(node).elseBranch, returns)
  elif node of TypedWhenStmt:
    self.collectReturnStatements(TypedWhenStmt(node).body, returns)
  elif node of TypedWhileStmt:
    self.collectReturnStatements(TypedWhileStmt(node).body, returns)
  elif node of TypedMatchStmt:
    let stmt = TypedMatchStmt(node)
    for arm in stmt.arms:
      self.collectReturnStatements(arm.body, returns)
    self.collectReturnStatements(stmt.default, returns)


proc collectFunctionReturnOrigins*(self: TypeChecker, function: Name, body: TypedBlockStmt,
                                   activeFns: var HashSet[uint]): LentOriginSet =
  var returns: seq[TypedReturnStmt] = @[]
  self.collectReturnStatements(body, returns)
  for ret in returns:
    if ret.value.isNil():
      continue
    var seenExprs = initHashSet[uint]()
    result.addLentOrigins(self.lentOrigins(ret.value, function, seenExprs, activeFns))


proc lentReturnSummary*(self: TypeChecker, function: Name, active: var HashSet[uint]): LentOriginSet =
  ## Computes a fixed-point summary of the lent-origin set for a
  ## function's return value. The summary describes which parameter
  ## slots (or global/unknown origins) the returned borrowed handle
  ## may point to. It is cached in self.lentReturnSummaries and
  ## re-used by substituteParamOrigins at each call site so that
  ## callers can determine whether returned borrows escape local
  ## storage. The active set guards against infinite recursion in
  ## mutually recursive functions.
  if function.isNil() or function.valueType.isNil() or function.valueType.returnType.isNil():
    return @[]
  if function.valueType.returnType.unwrapType().kind != TypeKind.Lent:
    return @[]
  let key = function.nameKey()
  if key in active:
    return self.lentReturnSummaries.getOrDefault(key, @[])
  if not self.typedFunctions.hasKey(key):
    return @[unknownLentOrigin()]

  active.incl(key)
  var summary = self.lentReturnSummaries.getOrDefault(key, @[])
  while true:
    let next = self.collectFunctionReturnOrigins(function, self.typedFunctions[key].body, active)
    if sameLentOrigins(summary, next):
      break
    summary = next
    self.lentReturnSummaries[key] = summary
  active.excl(key)
  self.lentReturnSummaries[key] = summary
  summary


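# Illustrative example (hypothetical peon surface syntax, not part of this
# module): the escape check below accepts a borrow whose origin is a
# parameter and rejects one whose origin is local storage:
#
#   fn first(s: lent string): lent string { return s; }   # ok: param origin
#   fn oops(): lent string {
#     var s = "hi";
#     return borrow(s);   # rejected: refers to local storage
#   }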
proc ensureLentReturnsDoNotEscape*(self: TypeChecker, function: Name, body: TypedBlockStmt) =
  ## Validates that every return statement in the given function body
  ## returns a borrowed (lent) value whose origins are safe to expose
  ## to callers. Specifically, this rejects returns whose lent origins
  ## contain local storage (the borrow would dangle after the function
  ## returns) or unknown provenance (the analysis cannot prove safety).
  ## This is the top-level entry point for the lent-escape check,
  ## called once per lent-returning function after type checking its
  ## body.
  if function.isNil() or function.valueType.isNil() or function.valueType.returnType.isNil():
    return
  if function.valueType.returnType.unwrapType().kind != TypeKind.Lent:
    return
  var activeFns = initHashSet[uint]()
  discard self.lentReturnSummary(function, activeFns)
  var returns: seq[TypedReturnStmt] = @[]
  self.collectReturnStatements(body, returns)
  for ret in returns:
    if ret.value.isNil():
      continue
    var seenExprs = initHashSet[uint]()
    let origins = self.lentOrigins(ret.value, function, seenExprs, activeFns)
    if origins.containsLocalLentOrigin():
      self.error("cannot return borrowed value that refers to local storage", ret.value.node)
    if origins.containsUnknownLentOrigin():
      self.error("cannot return borrowed value with unknown provenance", ret.value.node)


# isBorrowedHandleMismatch moved to comparison.nim
525
src/frontend/compiler/type_checking/builtins.nim
Normal file
@@ -0,0 +1,525 @@
# Copyright 2026 Mattia Giambirtone & All Contributors
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

import frontend/compiler/type_checking/types
import frontend/compiler/type_checking/type_utils
import frontend/compiler/type_checking/comparison
import frontend/compiler/type_checking/borrow_check
import frontend/compiler/type_checking/names

import std/sets
import std/tables
import std/strformat
import std/strutils
import std/sequtils


func isSeqType*(target: Type): bool
proc lengthTargetType*(exprType: Type): Type
proc indexTargetType*(exprType: Type): Type

proc builtinNewType*(self: TypeChecker, node: CallExpr): Type =
  if node.arguments.keyword.len() != 0:
    self.error("builtin 'new' only accepts a positional type argument", node)
  if node.arguments.positionals.len() != 1:
    self.error(&"builtin 'new' expects exactly 1 argument (got {node.arguments.positionals.len()})", node)
  let typeExpr = self.ensureType(node.arguments.positionals[0])
  let storage = typeExpr.kind.unwrapType()
  if storage.kind == Reference:
    self.error("builtin 'new' expects a plain storage type, not a managed ref type", node.arguments.positionals[0])
  if not self.supportsManagedRefStorage(storage):
    self.error(&"builtin 'new' expects a plain storage type that can live behind a managed ref, got {self.stringify(storage, true)} instead",
               node.arguments.positionals[0])
  result = storage.toRef().withHandleMutability(true)


proc rawAllocationType*(self: TypeChecker, node: Expression, label: string): Type =
  let typeExpr = self.ensureType(node)
  let storage = typeExpr.kind.unwrapType()
  if storage.kind in {Reference, TypeKind.Lent, UncheckedArray}:
    self.error(&"builtin '{label}' expects a plain storage type, got {self.stringify(storage, true)} instead",
               node)
  self.validateRawPointerPointee(storage, node)
  storage


proc builtinRawAllocType*(self: TypeChecker, node: CallExpr, label: string): Type =
  if node.arguments.keyword.len() != 0:
    self.error(&"builtin '{label}' only accepts positional arguments", node)
  if node.arguments.positionals.len() != 2:
    self.error(&"builtin '{label}' expects exactly 2 arguments (got {node.arguments.positionals.len()})", node)
  let storage = self.rawAllocationType(node.arguments.positionals[0], label)
  let count = self.inferOrError(node.arguments.positionals[1])
  if count.kind.unwrapType().kind != Integer:
    self.error(&"builtin '{label}' expects an integer item count, got {self.stringify(count.kind, true)} instead",
               node.arguments.positionals[1])
  result = storage.toPtr()


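# Illustrative example (hypothetical peon surface syntax, not part of this
# module): assuming 'alloc' is one of the raw-allocation builtins routed
# through builtinRawAllocType, the checks above mean that
#
#   var p = alloc(int64, 4);    # ok: plain storage type + integer count
#   var q = alloc(ref Foo, 1);  # rejected: expects a plain storage type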
proc builtinBorrowExpr*(self: TypeChecker, node: CallExpr, argument: TypedExpr): TypedExpr =
  if node.arguments.keyword.len() != 0:
    self.error("builtin 'borrow' only accepts a positional value argument", node)
  if node.arguments.positionals.len() != 1:
    self.error(&"builtin 'borrow' expects exactly 1 argument (got {node.arguments.positionals.len()})", node)
  let borrowed = self.borrowType(argument)
  if borrowed.isNil():
    self.error("builtin 'borrow' expects a stable value location", node.arguments.positionals[0])
  result = newTypedBorrowExpr(node, borrowed, argument)


proc builtinDeallocExpr*(self: TypeChecker, node: CallExpr, callee: Name, argument: TypedExpr): TypedExpr =
  if node.arguments.keyword.len() != 0:
    self.error("builtin 'dealloc' only accepts a positional pointer argument", node)
  if node.arguments.positionals.len() != 1:
    self.error(&"builtin 'dealloc' expects exactly 1 argument (got {node.arguments.positionals.len()})", node)
  let pointerType = argument.kind.unwrapType()
  if pointerType.kind != Pointer:
    self.error(&"builtin 'dealloc' expects a raw pointer value, got {self.stringify(argument.kind, true)} instead",
               node.arguments.positionals[0])
  result = newTypedCallExpr(node, callee, @[argument])


proc builtinReallocExpr*(self: TypeChecker, node: CallExpr, callee: Name,
                         pointerArg, countArg: TypedExpr): TypedExpr =
  if node.arguments.keyword.len() != 0:
    self.error("builtin 'realloc' only accepts positional arguments", node)
  if node.arguments.positionals.len() != 2:
    self.error(&"builtin 'realloc' expects exactly 2 arguments (got {node.arguments.positionals.len()})", node)
  let pointerType = pointerArg.kind.unwrapType()
  if pointerType.kind != Pointer:
    self.error(&"builtin 'realloc' expects a raw pointer value, got {self.stringify(pointerArg.kind, true)} instead",
               node.arguments.positionals[0])
  result = newTypedCallExpr(node, callee, @[pointerArg, countArg])
  result.kind = pointerType


proc currentHookTarget*(self: TypeChecker, hookName: string, arity: int): Type =
  if self.currentFunction.isNil() or self.currentFunction.ident.isNil() or
      self.currentFunction.ident.token.lexeme != hookName or
      self.currentFunction.valueType.isNil() or
      self.currentFunction.valueType.signature.len() != arity:
    return nil
  self.currentFunction.valueType.signature[0].kind.unwrapType()


proc currentCopyHookTarget*(self: TypeChecker): Type =
  self.currentHookTarget("copy=", 2)


proc currentMoveHookTarget*(self: TypeChecker): Type =
  self.currentHookTarget("move=", 2)


proc currentDestroyHookTarget*(self: TypeChecker): Type =
  self.currentHookTarget("destroy=", 1)


proc builtinDefaultCopyExpr*(self: TypeChecker, node: CallExpr, callee: Name): TypedExpr =
  if node.arguments.keyword.len() != 0:
    self.error("builtin 'defaultCopy' only accepts positional arguments", node)
  if node.arguments.positionals.len() != 0:
    self.error(&"builtin 'defaultCopy' expects exactly 0 arguments (got {node.arguments.positionals.len()})", node)
  if self.currentCopyHookTarget().isNil():
    self.error("builtin 'defaultCopy' can only be used inside a 'copy=' hook", node)
  result = newTypedCallExpr(node, callee, @[])


proc builtinCopyExpr*(self: TypeChecker, node: CallExpr, callee: Name,
                      dst, src: TypedExpr, genericArgs: seq[GenericArg] = @[]): TypedExpr =
  if node.arguments.keyword.len() != 0:
    self.error("builtin 'copy' only accepts positional arguments", node)
  if node.arguments.positionals.len() != 2:
    self.error(&"builtin 'copy' expects exactly 2 arguments (got {node.arguments.positionals.len()})", node)
  let loweredDst = self.lowerVarArgument(dst, dst.kind, node.arguments.positionals[0])
  if loweredDst.kind.unwrapType().kind == Reference:
    self.error("builtin 'copy' does not support managed refs; use clone() or move(..., ...)", node.arguments.positionals[0])
  result = newTypedCallExpr(node, callee, @[loweredDst, src],
                            templateDecl=(if genericArgs.len > 0: callee else: nil),
                            genericArgs=genericArgs)


proc builtinDefaultMoveExpr*(self: TypeChecker, node: CallExpr, callee: Name): TypedExpr =
  if node.arguments.keyword.len() != 0:
    self.error("builtin 'defaultMove' only accepts positional arguments", node)
  if node.arguments.positionals.len() != 0:
    self.error(&"builtin 'defaultMove' expects exactly 0 arguments (got {node.arguments.positionals.len()})", node)
  if self.currentMoveHookTarget().isNil():
    self.error("builtin 'defaultMove' can only be used inside a 'move=' hook", node)
  result = newTypedCallExpr(node, callee, @[])


proc builtinDefaultDestroyExpr*(self: TypeChecker, node: CallExpr, callee: Name): TypedExpr =
  if node.arguments.keyword.len() != 0:
    self.error("builtin 'defaultDestroy' only accepts positional arguments", node)
  if node.arguments.positionals.len() != 0:
    self.error(&"builtin 'defaultDestroy' expects exactly 0 arguments (got {node.arguments.positionals.len()})", node)
  if self.currentDestroyHookTarget().isNil():
    self.error("builtin 'defaultDestroy' can only be used inside a 'destroy=' hook", node)
  result = newTypedCallExpr(node, callee, @[])


proc builtinDestroyExpr*(self: TypeChecker, node: CallExpr, callee: Name,
                         argument: TypedExpr, genericArgs: seq[GenericArg] = @[]): TypedExpr =
  if node.arguments.keyword.len() != 0:
    self.error("builtin 'destroy' only accepts a positional value argument", node)
  if node.arguments.positionals.len() != 1:
    self.error(&"builtin 'destroy' expects exactly 1 argument (got {node.arguments.positionals.len()})", node)
  let lowered = self.lowerVarArgument(argument, argument.kind, node.arguments.positionals[0])
  if lowered.kind.unwrapType().kind == Reference:
    self.error("builtin 'destroy' does not support managed refs; refs are destroyed automatically and custom ref cleanup runs at that point", node.arguments.positionals[0])
  let destroyed = lowered.getName()
  if not destroyed.isNil():
    self.clearInitialized(destroyed)
    self.nameInits.del(destroyed.nameKey())
    if destroyed.tracksMove():
      self.movedNames.incl(destroyed.nameKey())
  result = newTypedCallExpr(node, callee, @[lowered],
                            templateDecl=(if genericArgs.len > 0: callee else: nil),
                            genericArgs=genericArgs)


# hasImplicitCopyCapability moved to comparison.nim


proc validateValueHookSignature*(self: TypeChecker, name: Name, node: FunDecl,
                                 label: string, arity: int): Type =
  if name.isNil() or node.isNil() or name.ident.token.lexeme != label:
    return nil
  if name.valueType.signature.len() != arity:
    self.error(&"'{label}' hook must take exactly {arity} parameter" &
               (if arity == 1: "" else: "s"), node)
  if not name.valueType.returnType.isNil():
    self.error(&"'{label}' hook must not return a value", node.returnType)
  let target = name.valueType.signature[0].kind.unwrapType()
  if target.kind != Structure or target.intrinsic:
    self.error(&"'{label}' hook can only target user-defined value types", node)
  target


proc validateDestroyHookTarget*(self: TypeChecker, name: Name, node: FunDecl): Type =
  if name.isNil() or node.isNil() or name.ident.token.lexeme != "destroy=":
    return nil
  if name.valueType.signature.len() != 1:
    self.error("'destroy=' hook must take exactly 1 parameter", node)
  if not name.valueType.returnType.isNil():
    self.error("'destroy=' hook must not return a value", node.returnType)
  let target = name.valueType.signature[0].kind.unwrapType()
  case target.kind
  of Structure:
    if target.intrinsic:
      self.error("'destroy=' hook can only target user-defined value types or ref object types", node)
  of Reference:
    let payload = target.value.unwrapType()
    if payload.kind != Structure or payload.intrinsic:
      self.error("'destroy=' hook can only target user-defined value types or ref object types", node)
  else:
    self.error("'destroy=' hook can only target user-defined value types or ref object types", node)
  target


proc validateCopyHook*(self: TypeChecker, name: Name, node: FunDecl) =
  if name.isNil() or node.isNil() or name.ident.token.lexeme != "copy=":
    return
  let target = self.validateValueHookSignature(name, node, "copy=", 2)
  let dst = name.valueType.signature[0]
  let src = name.valueType.signature[1]
  if not dst.isVar:
    self.error("'copy=' destination parameter must be declared as 'var'", node)
  if src.isVar:
    self.error("'copy=' source parameter must not be declared as 'var'", node)
  if not self.compare(dst.kind, target) or not self.compare(src.kind, target):
    self.error("'copy=' hook parameters must both use the hooked type", node)
  if not target.copyHook.isNil() and target.copyHook != name:
    self.error(&"type '{target.name}' already defines a 'copy=' hook", node)
  target.copyHook = name
  self.markResolved(name)


proc validateMoveHook*(self: TypeChecker, name: Name, node: FunDecl) =
  if name.isNil() or node.isNil() or name.ident.token.lexeme != "move=":
    return
  let target = self.validateValueHookSignature(name, node, "move=", 2)
  let dst = name.valueType.signature[0]
  let src = name.valueType.signature[1]
  if not dst.isVar:
    self.error("'move=' destination parameter must be declared as 'var'", node)
  if not src.isVar:
    self.error("'move=' source parameter must be declared as 'var'", node)
  if not self.compare(dst.kind, target) or not self.compare(src.kind, target):
    self.error("'move=' hook parameters must both use the hooked type", node)
  if not target.moveHook.isNil() and target.moveHook != name:
    self.error(&"type '{target.name}' already defines a 'move=' hook", node)
  target.moveHook = name
  self.markResolved(name)


proc validateDestroyHook*(self: TypeChecker, name: Name, node: FunDecl) =
  if name.isNil() or node.isNil() or name.ident.token.lexeme != "destroy=":
    return
  let target = self.validateDestroyHookTarget(name, node)
  let parameter = name.valueType.signature[0]
  if not parameter.isVar:
    self.error("'destroy=' hook parameter must be declared as 'var'", node)
  if not self.compare(parameter.kind, target):
    self.error("'destroy=' hook parameter must use the hooked type", node)
  if not target.destroyHook.isNil() and target.destroyHook != name:
    self.error(&"type '{target.name}' already defines a 'destroy=' hook", node)
  target.destroyHook = name
  self.markResolved(name)


proc builtinMoveExpr*(self: TypeChecker, node: CallExpr, callee: Name,
                      args: seq[TypedExpr], genericArgs: seq[GenericArg] = @[]): TypedExpr =
  if node.arguments.keyword.len() != 0:
    self.error("builtin 'move' only accepts positional arguments", node)
  if node.arguments.positionals.len() == 1:
    let argument = args[0]
    let moved = argument.kind.unwrapType()
    if moved.kind == Reference:
      return newTypedMoveExpr(node, argument.kind, argument)
    self.error(&"builtin 'move' expects a managed ref value, got {self.stringify(argument.kind, true)} instead",
               node.arguments.positionals[0])
  if node.arguments.positionals.len() != 2:
    self.error(&"builtin 'move' expects exactly 1 or 2 arguments (got {node.arguments.positionals.len()})", node)
  let loweredDst = self.lowerVarArgument(args[0], args[0].kind, node.arguments.positionals[0])
  let loweredSrc = self.lowerMoveArgument(args[1], args[1].kind, node.arguments.positionals[1])
  let moved = loweredSrc.getName()
  if not moved.isNil():
    self.clearInitialized(moved)
    self.nameInits.del(moved.nameKey())
    if moved.tracksMove():
      self.movedNames.incl(moved.nameKey())
  result = newTypedCallExpr(node, callee, @[loweredDst, loweredSrc],
                            templateDecl=(if genericArgs.len > 0: callee else: nil),
                            genericArgs=genericArgs)


proc isOwningManagedRef*(self: TypeChecker, argument: TypedExpr): bool =
  if argument.isNil() or argument.kind.isNil() or argument.kind.unwrapType().kind != Reference:
    return false
  if argument of TypedMoveExpr or argument of TypedConstructExpr or argument of TypedArrayConstructExpr:
    return true
  case argument.node.kind:
  of identExpr:
    let name = argument.getName()
    if name.isNil():
      return false
    if name.kind == NameKind.Var and name.node.isNil():
      return true
    if not name.node.isNil() and name.node.kind == NodeKind.varDecl:
      return true
    if not self.currentFunction.isNil() and
        not self.currentFunction.node.isNil() and
        self.currentFunction.node.kind == NodeKind.funDecl:
      return name.ident.token.lexeme in FunDecl(self.currentFunction.node).parameters
    return false
  of callExpr:
    return true
  else:
    return false


proc builtinCloneExpr*(self: TypeChecker, node: CallExpr, callee: Name, returnType: Type,
                       argument: TypedExpr, genericArgs: seq[GenericArg] = @[]): TypedExpr =
  if node.arguments.keyword.len() != 0:
    self.error("builtin 'clone' only accepts a positional value argument", node)
  if node.arguments.positionals.len() != 1:
    self.error(&"builtin 'clone' expects exactly 1 argument (got {node.arguments.positionals.len()})", node)
  let cloned = argument.kind.unwrapType()
  if cloned.kind != Reference:
    self.error(&"builtin 'clone' expects a managed ref value, got {self.stringify(argument.kind, true)} instead",
               node.arguments.positionals[0])
  if not self.isOwningManagedRef(argument):
    self.error("builtin 'clone' expects an owning managed ref value", node.arguments.positionals[0])
  result = newTypedCallExpr(node, callee, @[argument],
                            templateDecl=(if genericArgs.len > 0: callee else: nil),
                            genericArgs=genericArgs)
  result.kind =
    if returnType.isNil():
      argument.kind
    else:
      returnType


proc builtinLowType*(self: TypeChecker, node: CallExpr): Type =
  if node.arguments.keyword.len() != 0:
    self.error("builtin 'low' only accepts a positional argument", node)
  if node.arguments.positionals.len() != 1:
    self.error(&"builtin 'low' expects exactly 1 argument (got {node.arguments.positionals.len()})", node)
  let typeExpr = self.ensureType(node.arguments.positionals[0])
  let target = typeExpr.kind.unwrapType()
  case target.kind
  of Integer:
    return target
  of CInterop:
    if target.ckind.isCInteropInteger():
      return target
  of Array, TypeKind.String:
    return "int64".toIntrinsic()
  of Structure:
    if isSeqType(target):
      return "int64".toIntrinsic()
  else:
    discard
  self.error(&"builtin 'low' expects an integer, string, array, or seq type, got {self.stringify(target, true)} instead",
             node.arguments.positionals[0])


proc builtinLowExpr*(self: TypeChecker, node: CallExpr, callee: Name, returnType: Type,
|
||||
argument: TypedExpr, label: string,
|
||||
genericArgs: seq[GenericArg] = @[]): TypedExpr =
|
||||
if node.arguments.keyword.len() != 0:
|
||||
self.error(&"builtin '{label}' only accepts a positional value argument", node)
|
||||
if node.arguments.positionals.len() != 1:
|
||||
self.error(&"builtin '{label}' expects exactly 1 argument (got {node.arguments.positionals.len()})", node)
|
||||
let target = lengthTargetType(argument.kind)
|
||||
if target.isNil() or target.kind notin {TypeKind.String, Array}:
|
||||
self.error(&"builtin '{label}' expects a string or array value, got {self.stringify(argument.kind, true)} instead",
|
||||
node.arguments.positionals[0])
|
||||
result = newTypedCallExpr(node, callee, @[argument],
|
||||
templateDecl=(if genericArgs.len > 0: callee else: nil),
|
||||
genericArgs=genericArgs)
|
||||
result.kind =
|
||||
if returnType.isNil():
|
||||
"int64".toIntrinsic()
|
||||
else:
|
||||
returnType
|
||||
|
||||
|
||||
proc builtinLenLikeExpr*(self: TypeChecker, node: CallExpr, callee: Name, returnType: Type,
|
||||
argument: TypedExpr, label: string,
|
||||
genericArgs: seq[GenericArg] = @[]): TypedExpr =
|
||||
if node.arguments.keyword.len() != 0:
|
||||
self.error(&"builtin '{label}' only accepts a positional value argument", node)
|
||||
if node.arguments.positionals.len() != 1:
|
||||
self.error(&"builtin '{label}' expects exactly 1 argument (got {node.arguments.positionals.len()})", node)
|
||||
let target = lengthTargetType(argument.kind)
|
||||
if target.isNil() or target.kind notin {TypeKind.String, Array}:
|
||||
self.error(&"builtin '{label}' expects a string or array value, got {self.stringify(argument.kind, true)} instead",
|
||||
node.arguments.positionals[0])
|
||||
result = newTypedCallExpr(node, callee, @[argument],
|
||||
templateDecl=(if genericArgs.len > 0: callee else: nil),
|
||||
genericArgs=genericArgs)
|
||||
result.kind =
|
||||
if returnType.isNil():
|
||||
"int64".toIntrinsic()
|
||||
else:
|
||||
returnType
|
||||
|
||||
|
||||
proc bitcastSupportedType*(typ: Type): bool =
|
||||
if typ.isNil():
|
||||
return false
|
||||
case typ.unwrapType().kind
|
||||
of Integer, Float, Boolean, Byte, Char, Pointer:
|
||||
true
|
||||
of CInterop:
|
||||
typ.unwrapType().ckind notin {CVoid, CString}
|
||||
else:
|
||||
false
|
||||
|
||||
|
||||
proc bitcastTypeSize*(typ: Type): int =
|
||||
if typ.isNil():
|
||||
return -1
|
||||
let target = typ.unwrapType()
|
||||
case target.kind
|
||||
of Integer:
|
||||
int(target.size) div 8
|
||||
of Float:
|
||||
int(target.width) div 8
|
||||
of Boolean, Byte, Char:
|
||||
1
|
||||
of CInterop:
|
||||
case target.ckind
|
||||
of CBool:
|
||||
1
|
||||
of CFloat:
|
||||
int(sizeof(cfloat))
|
||||
of CDouble:
|
||||
int(sizeof(cdouble))
|
||||
of CString:
|
||||
sizeof(pointer)
|
||||
of CVoid:
|
||||
-1
|
||||
else:
|
||||
let info = cinteropIntegerInfo(target.ckind)
|
||||
info.bits div 8
|
||||
of Pointer:
|
||||
sizeof(pointer)
|
||||
else:
|
||||
-1
|
||||
|
||||
|
||||
proc builtinCastExpr*(self: TypeChecker, node: CallExpr, generic: GenericExpr): TypedExpr =
|
||||
if generic.args.len != 1:
|
||||
self.error(&"builtin 'cast' expects exactly 1 explicit target type argument (got {generic.args.len})", generic)
|
||||
if node.arguments.keyword.len() != 0:
|
||||
self.error("builtin 'cast' only accepts a positional value argument", node)
|
||||
if node.arguments.positionals.len() != 1:
|
||||
self.error(&"builtin 'cast' expects exactly 1 argument (got {node.arguments.positionals.len()})", node)
|
||||
|
||||
let candidates = self.findAll("cast").filterIt(
|
||||
not it.valueType.isNil() and
|
||||
it.valueType.kind == TypeKind.Function and
|
||||
it.valueType.isBuiltin and
|
||||
self.builtinMagic(it) == "cast"
|
||||
)
|
||||
if candidates.len() != 1:
|
||||
self.error("reference to undefined builtin 'cast'", generic.ident)
|
||||
|
||||
let targetType = self.ensureType(generic.args[0]).kind.unwrapType()
|
||||
let argument = self.expressionImpl(self, node.arguments.positionals[0])
|
||||
let sourceType = argument.kind.unwrapType()
|
||||
|
||||
if not bitcastSupportedType(sourceType) or not bitcastSupportedType(targetType):
|
||||
self.error(&"builtin 'cast' only supports raw pointer and scalar bitcasts, got {self.stringify(sourceType, true)} -> {self.stringify(targetType, true)}",
|
||||
node)
|
||||
|
||||
let sourceSize = bitcastTypeSize(sourceType)
|
||||
let targetSize = bitcastTypeSize(targetType)
|
||||
if sourceSize != targetSize:
|
||||
self.error(&"builtin 'cast' requires source and target types to have the same size, got {self.stringify(sourceType, true)} -> {self.stringify(targetType, true)}",
|
||||
node)
|
||||
|
||||
result = newTypedCallExpr(node,
|
||||
candidates[0],
|
||||
@[argument],
|
||||
templateDecl = candidates[0],
|
||||
genericArgs = @[newTypeGenericArg(targetType)],
|
||||
calleeNode = Expression(generic))
|
||||
result.kind = targetType
|
||||
|
||||
|
||||
proc lengthTargetType*(exprType: Type): Type =
|
||||
if exprType.isNil():
|
||||
return nil
|
||||
var current = exprType.unwrapType()
|
||||
while current.kind in {Reference, TypeKind.Lent, TypeKind.Const, Pointer}:
|
||||
current = current.value.unwrapType()
|
||||
current
|
||||
|
||||
|
||||
proc indexTargetType*(exprType: Type): Type =
|
||||
if exprType.isNil():
|
||||
return nil
|
||||
var current = exprType.unwrapType()
|
||||
while current.kind in {Reference, TypeKind.Lent}:
|
||||
current = current.value.unwrapType()
|
||||
current
|
||||
|
||||
|
||||
func isSeqType*(target: Type): bool =
|
||||
let unwrapped = target.unwrapType()
|
||||
unwrapped.kind == Structure and unwrapped.name == "seq"
|
||||
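The cast machinery above only admits conversions between scalars and pointers of identical byte size. A minimal standalone sketch of that rule, using plain Nim `sizeof` rather than the compiler's own `Type` machinery (illustrative only, not part of the diff):

```nim
# Illustrative sketch of the size-equality rule builtinCastExpr enforces.
# These mirror bitcastTypeSize's byte-size comparison, not the compiler itself.
assert sizeof(float64) == sizeof(int64)    # a 64-bit float <-> int bitcast is admissible
assert sizeof(float32) != sizeof(int64)    # mismatched sizes would be rejected
assert sizeof(pointer) == sizeof(pointer)  # pointer <-> pointer is always same-size
```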
818 src/frontend/compiler/type_checking/comparison.nim Normal file
@@ -0,0 +1,818 @@
# Copyright 2026 Mattia Giambirtone & All Contributors
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

import frontend/compiler/type_checking/types
import frontend/compiler/type_checking/type_utils

import std/sets
import std/tables
import std/strformat
import std/strutils
import std/sequtils
import std/parseutils


## Forward declarations for functions defined in this module
proc compare*(self: TypeChecker, a, b: Type): bool
proc coerceWithContext*(self: TypeChecker, term: TypedExpr, expected: Type, node: ASTNode = nil): TypedExpr
proc checkWithContext*(self: TypeChecker, term: TypedExpr, expected: Type, node: ASTNode = nil): Type {.inline, discardable.}

## Implementations of helpers needed by comparison that were formerly
## forward-declared across modules in the monolithic type_checker.

func hasImplicitCopyCapability*(self: TypeChecker, typ: Type): bool =
  if typ.isNil():
    return false
  let target = typ.unwrapType()
  case target.kind
  of Reference, TypeKind.Lent, TypeKind.Const, Any, Typevar, Generic, Union:
    false
  of Structure:
    not target.isInterface
  else:
    true


proc implementedInterfaces*(self: TypeChecker, typ: Type): seq[Type] =
  proc matchesDeclared(candidate: Type, target: Type): bool =
    let storage =
      if candidate.isNil():
        nil
      elif candidate.kind == Reference:
        candidate.value.unwrapType()
      else:
        candidate.unwrapType()
    not storage.isNil() and self.compare(storage, target) and self.compare(target, storage)

  let target = typ.unwrapType()
  if target.isNil():
    return @[]
  if target.interfaces.len() > 0:
    return target.interfaces
  let intrinsicKey = intrinsicTypeKey(target)
  if intrinsicKey.len() > 0 and self.intrinsicInterfaces.hasKey(intrinsicKey):
    return self.intrinsicInterfaces[intrinsicKey]
  for moduleNodes in self.typedModules.values():
    for node in moduleNodes:
      if node.isNil() or node.node.kind != NodeKind.typeDecl:
        continue
      let declared = TypedTypeDecl(node).name.valueType
      if matchesDeclared(declared, target):
        return declared.unwrapType().interfaces
  for scope in self.names:
    for bucket in scope.values():
      for candidate in bucket:
        if candidate.isNil() or candidate.node.isNil() or candidate.node.kind != NodeKind.typeDecl:
          continue
        if matchesDeclared(candidate.valueType, target):
          return candidate.valueType.unwrapType().interfaces
  @[]


proc borrowSourceMutable*(self: TypeChecker, expr: TypedExpr): bool =
  if expr.isNil() or expr.kind.isNil():
    return false
  let directName = expr.getName()
  if not directName.isNil() and directName.kind == NameKind.Var:
    if directName.isVarParam:
      return true
    if not directName.node.isNil() and directName.node.kind == NodeKind.varDecl:
      let decl = VarDecl(directName.node)
      return decl.mutable and not decl.constant

  proc derefSourceMutable(source: TypedExpr): bool =
    if source.isNil() or source.kind.isNil():
      return false
    let sourceType = source.kind.unwrapType()
    case sourceType.kind
    of Reference, TypeKind.Lent, Pointer, TypeKind.Const:
      sourceType.kind in {Reference, TypeKind.Lent} and sourceType.mutable
    else:
      false

  let actual = expr.kind.unwrapType()
  case actual.kind
  of Reference, TypeKind.Lent:
    case expr.node.kind
    of identExpr, getterExpr, sliceExpr:
      return actual.mutable
    else:
      return false
  else:
    case expr.node.kind
    of identExpr:
      let name = expr.getName()
      if name.isNil() or name.kind != NameKind.Var:
        return false
      if name.isVarParam:
        return true
      if not name.node.isNil() and name.node.kind == NodeKind.varDecl:
        let decl = VarDecl(name.node)
        return decl.mutable and not decl.constant
      return false
    of getterExpr:
      if expr of TypedFieldExpr:
        return self.borrowSourceMutable(TypedFieldExpr(expr).obj)
      return false
    of sliceExpr:
      if expr of TypedDerefExpr:
        return derefSourceMutable(TypedDerefExpr(expr).target)
      if expr of TypedIndexExpr:
        return self.borrowSourceMutable(TypedIndexExpr(expr).obj)
      return false
    else:
      return false


proc borrowType*(self: TypeChecker, expr: TypedExpr): Type =
  if expr.isNil() or expr.kind.isNil():
    return nil

  proc derefSourceInfo(source: TypedExpr): Type =
    if source.isNil() or source.kind.isNil():
      return nil
    let sourceType = source.kind.unwrapType()
    case sourceType.kind
    of Reference, TypeKind.Lent, Pointer, TypeKind.Const:
      sourceType.value.unwrapType()
    else:
      nil

  proc makeBorrow(target: Type): Type =
    let base =
      if target.unwrapType().kind == Reference:
        target.unwrapType().withHandleMutability(false)
      else:
        target.unwrapType()
    result = base.toLent()
    result.mutable = false

  let actual = expr.kind.unwrapType()
  case actual.kind
  of Reference, TypeKind.Lent:
    case expr.node.kind
    of identExpr, getterExpr, sliceExpr:
      if actual.kind == TypeKind.Lent:
        result = actual.normalizedBorrowedHandle()
        result.mutable = false
        return result
      return makeBorrow(actual)
    else:
      return nil
  else:
    case expr.node.kind
    of identExpr:
      let name = expr.getName()
      if name.isNil() or name.kind != NameKind.Var:
        return nil
      if name.isVarParam:
        return makeBorrow(actual)
      return makeBorrow(actual)
    of getterExpr:
      if expr of TypedFieldExpr:
        if self.borrowType(TypedFieldExpr(expr).obj).isNil():
          return nil
        return makeBorrow(actual)
      return nil
    of sliceExpr:
      if expr of TypedDerefExpr:
        if derefSourceInfo(TypedDerefExpr(expr).target).isNil():
          return nil
        return makeBorrow(actual)
      if expr of TypedIndexExpr:
        if self.borrowType(TypedIndexExpr(expr).obj).isNil():
          return nil
        return makeBorrow(actual)
      return nil
    else:
      return nil


proc rawPointerMismatch*(self: TypeChecker, actual: TypedExpr, expected: Type): bool =
  if actual.isNil() or actual.kind.isNil() or expected.isNil():
    return false
  let source = actual.kind.unwrapType()
  let target = expected.unwrapType()
  source.kind == Pointer and
    target.kind == Pointer and
    (source.value.unwrapType().kind == Any) != (target.value.unwrapType().kind == Any)


proc rawPointerLabel*(self: TypeChecker, typ: Type): string =
  if typ.isNil():
    return self.stringify(typ, true)
  let target = typ.unwrapType()
  if target.kind == Pointer and target.value.unwrapType().kind == Any:
    return "pointer"
  self.stringify(typ, true)


proc isBorrowedHandleMismatch*(self: TypeChecker, term: TypedExpr, expected: Type): bool =
  if term.isNil() or term.kind.isNil() or expected.isNil():
    return false
  let actual = term.kind.unwrapType()
  let target = expected.unwrapType()
  if target.kind != Reference or actual.kind != TypeKind.Lent:
    return false
  let borrowedOwner = actual.value.unwrapType()
  if borrowedOwner.kind != Reference:
    return false
  self.compare(borrowedOwner.value, target.value)


proc borrowedHandleExprLabel*(self: TypeChecker, term: TypedExpr): string =
  if term of TypedBorrowExpr:
    let sourceName = TypedBorrowExpr(term).target.getName()
    if not sourceName.isNil():
      return &"borrow({sourceName.ident.token.lexeme})"
  return "this expression"


proc borrowedHandleMoveHint*(self: TypeChecker, term: TypedExpr): string =
  if term of TypedBorrowExpr:
    let sourceName = TypedBorrowExpr(term).target.getName()
    if not sourceName.isNil():
      return &"use move({sourceName.ident.token.lexeme}) to transfer ownership"
  return "use move(...) to transfer ownership"


proc infer(self: TypeChecker, node: LiteralExpr): TypedExpr

proc infer*(self: TypeChecker, node: Expression): TypedExpr =
  ## Infers the type of the given expression using the
  ## expression dispatch hook
  if node.isConst():
    return self.infer(LiteralExpr(node))
  result = self.expressionImpl(self, node)


proc inferOrError*(self: TypeChecker, node: Expression): TypedExpr =
  ## Attempts to infer the type of
  ## the given expression and raises an
  ## error if it fails
  result = self.infer(node)
  if result.isNil():
    self.error("expression has no type", node)


proc attachConstructContext(self: TypeChecker, construct: TypedConstructExpr, expected: Type) =
  self.attachConstructContextImpl(self, construct, expected)


proc inferConstructGenericArgs(self: TypeChecker, callee: Name, expected: Type): seq[GenericArg] =
  self.inferConstructGenericArgsImpl(self, callee, expected)


proc instantiateDeclaredType(self: TypeChecker, templateDecl: Name, typ: Type,
                             genericArgs: seq[GenericArg], node: ASTNode = nil): Type =
  self.instantiateDeclaredTypeImpl(self, templateDecl, typ, genericArgs, node)


proc isAny*(typ: Type): bool =
  ## Returns true if the given type is
  ## of (or contains) the any type.
  ## Typevars are unwrapped first
  case typ.kind:
  of Any:
    return true
  of Union:
    for condition in typ.constraints:
      if condition.kind.isAny():
        return true
  of Typevar:
    return typ.wrapped.isAny()
  else:
    return false


proc isStaticallyConvertible(value: TypedExpr, b: Type): bool =
  ## Returns whether a given typed expression can
  ## be statically converted to the given type.
  ## This is useful for things like "var x: int16 = 1;",
  ## where the inferred type of 1 would be int64:
  ## that would cause a type error, because an int16 is
  ## expected. This is meant to make the type system
  ## a little more context-aware and friendly to use
  case b.kind:
  of Integer:
    if value.kind.kind == Integer and value.node.isConst():
      try:
        let lexeme = value.node.token.lexeme.split("'")[0]
        if b.signed:
          let bits = b.size.int
          let upperBound =
            if bits == 64:
              high(int64)
            else:
              (1'i64 shl (bits - 1)) - 1
          let lowerBound =
            if bits == 64:
              low(int64)
            else:
              -(1'i64 shl (bits - 1))
          let integer = parseBiggestInt(lexeme)
          return integer <= BiggestInt(upperBound) and integer >= BiggestInt(lowerBound)
        else:
          let bits = b.size.int
          let upperBound =
            if bits == 64:
              high(uint64)
            else:
              (1'u64 shl bits) - 1
          let integer = parseBiggestUInt(lexeme)
          return integer <= upperBound and integer >= 0
      except ValueError:
        return false
  of CInterop:
    case b.ckind
    of CBool:
      return value.kind.kind == Boolean and value.node.isConst()
    of CString:
      return value.kind.kind == String and value.node.isConst()
    of CFloat, CDouble:
      return value.kind.kind == Float and value.node.isConst()
    else:
      if value.kind.kind == Integer and value.node.isConst():
        try:
          let lexeme = value.node.token.lexeme.split("'")[0]
          let info = cinteropIntegerInfo(b.ckind)
          if info.signed:
            let upperBound =
              if info.bits == 64:
                high(int64)
              else:
                (1'i64 shl (info.bits - 1)) - 1
            let lowerBound =
              if info.bits == 64:
                low(int64)
              else:
                -(1'i64 shl (info.bits - 1))
            let integer = parseBiggestInt(lexeme)
            return integer <= BiggestInt(upperBound) and integer >= BiggestInt(lowerBound)
          let upperBound =
            if info.bits == 64:
              high(uint64)
            else:
              (1'u64 shl info.bits) - 1
          let integer = parseBiggestUInt(lexeme)
          return integer <= upperBound and integer >= 0
        except ValueError:
          return false
  of Byte:
    if value.kind.kind == Integer and value.node.isConst():
      try:
        let integer = parseBiggestUInt(value.node.token.lexeme.split("'")[0])
        return integer <= BiggestUInt(uint8.high)
      except ValueError:
        return false
  else: # TODO
    discard
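The signed and unsigned bound arithmetic above can be sanity-checked in isolation. This standalone sketch mirrors the shift-based bound computation from `isStaticallyConvertible` for an 8-bit width (illustrative only; variable names are not from the source):

```nim
# Mirrors the bound computation used by isStaticallyConvertible for bits = 8.
let bits = 8
let upperSigned = (1'i64 shl (bits - 1)) - 1   # largest int8 value
let lowerSigned = -(1'i64 shl (bits - 1))      # smallest int8 value
let upperUnsigned = (1'u64 shl bits) - 1       # largest uint8 value
assert upperSigned == 127
assert lowerSigned == -128
assert upperUnsigned == 255'u64
```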
proc infer(self: TypeChecker, node: LiteralExpr): TypedExpr =
  case node.kind:
  of trueExpr, falseExpr:
    return newTypedExpr(node, "bool".toIntrinsic())
  of strExpr:
    return newTypedExpr(node, "string".toIntrinsic())
  of charExpr:
    return newTypedExpr(node, "char".toIntrinsic())
  of nanExpr:
    return newTypedExpr(node, "NaN".toIntrinsic())
  of infExpr:
    return newTypedExpr(node, "Inf".toIntrinsic())
  of intExpr, binExpr, octExpr, hexExpr:
    let size = node.token.lexeme.split("'")
    if size.len() == 1:
      return newTypedExpr(node, "int64".toIntrinsic())
    let typ = size[1].toIntrinsic()
    if typ.isNil() or typ.kind != TypeKind.Integer:
      self.error(&"invalid type specifier '{size[1]}' for integer", node)
    result = newTypedExpr(node, typ)
    # Temporary hack so isStaticallyConvertible doesn't
    # break when trying to parse the integer's lexeme
    let prevLexeme = node.token.lexeme
    node.token.lexeme = size[0]
    let isConvertible = result.isStaticallyConvertible(typ)
    node.token.lexeme = prevLexeme
    if not isConvertible:
      self.error(&"invalid value '{size[0]}' for {self.stringify(typ)}", node)
  of floatExpr:
    let size = node.token.lexeme.split("'")
    if size.len() == 1:
      return newTypedExpr(node, "float".toIntrinsic())
    let typ = size[1].toIntrinsic()
    if typ.isNil() or typ.kind != TypeKind.Float:
      self.error(&"invalid type specifier '{size[1]}' for float", node)
    result = newTypedExpr(node, typ)
    if result.isNil():
      self.error(&"invalid type specifier '{size[1]}' for float", node)
  else:
    discard


proc isSubsetOf*(self: TypeChecker, a, b: TypeCapabilities): bool =
  ## Checks if type union a is a subset of type union b.
  ## If a contains more elements than b, false is returned
  if len(a) > len(b):
    return false
  for cond1 in a:
    var matches = 0
    for cond2 in b:
      if self.compare(cond1.kind, cond2.kind) and cond1.match == cond2.match:
        inc(matches)
    if matches == 0:
      # Type is nowhere in b: not a subset
      return false
  return true


proc matchUnion*(self: TypeChecker, a: Type, b: TypeCapabilities): bool =
  ## Returns whether a non-union type a matches
  ## the given untagged union b
  assert a.kind != Union
  for constraint in b:
    if self.compare(constraint.kind, a) and constraint.match:
      return true
  return false


proc matchGeneric*(self: TypeChecker, a: Type, b: TypeCapabilities): bool =
  ## Returns whether a concrete type a matches the
  ## given generic type b
  for constraint in b:
    if not self.compare(constraint.kind, a) or not constraint.match:
      return false
  return true


proc compareImpl(self: TypeChecker, a, b: Type, active: var HashSet[TypeComparePair]): bool


proc implementsInterface(self: TypeChecker, concrete, iface: Type,
                         active: var HashSet[TypeComparePair]): bool =
  let actual = concrete.unwrapType()
  let target = iface.unwrapType()
  if actual.isNil() or target.isNil():
    return false
  if target.kind != Structure or not target.isInterface:
    return false
  if target.name == "Copy" and self.hasImplicitCopyCapability(actual):
    return true
  if actual.kind == Structure and actual.isInterface and actual.name == target.name:
    return true
  for implemented in self.implementedInterfaces(actual):
    if self.compareImpl(implemented, target, active):
      return true
  false


proc matchUnionImpl(self: TypeChecker, a: Type, b: TypeCapabilities,
                    active: var HashSet[TypeComparePair]): bool =
  ## Checks whether type a is part of the type capability set b while
  ## preserving cycle information across recursive comparisons.
  for cond in b:
    var matches = self.compareImpl(a, cond.kind, active)
    if not cond.match:
      matches = not matches
    if matches:
      return true
  return false


proc compareImpl(self: TypeChecker, a, b: Type, active: var HashSet[TypeComparePair]): bool =
  ## Compares two types and returns whether they are
  ## compatible with each other
  if a.isNil() or b.isNil():
    return a.isNil() and b.isNil()
  if a.kind == Typevar:
    return self.compareImpl(a.wrapped, b, active)
  if b.kind == Typevar:
    return self.compareImpl(a, b.wrapped, active)
  if a.isAny() or b.isAny():
    return true
  let pair = (left: cast[uint](a), right: cast[uint](b))
  if pair.left != 0'u and pair.right != 0'u:
    if pair in active:
      return true
    active.incl(pair)
    defer:
      active.excl(pair)
  if a.kind == Structure and a.isInterface and (b.kind != Structure or not b.isInterface):
    return self.implementsInterface(b, a, active)
  if b.kind == Structure and b.isInterface and (a.kind != Structure or not a.isInterface):
    return self.implementsInterface(a, b, active)
  if a.kind == b.kind:
    case a.kind:
    of Typevar:
      return self.compareImpl(a.wrapped, b.wrapped, active)
    of Structure, EnumEntry:
      if a.kind == EnumEntry and not a.parent.isNil() and not b.parent.isNil():
        if not self.compareImpl(a.parent, b.parent, active):
          return false
        if a.tag != b.tag:
          return false
      # Compare type names, if they both have it
      # (some internally generated types may not
      # have names)
      if a.name.len() > 0 and b.name.len() > 0:
        if a.name != b.name:
          return false
        var hashSet = initHashSet[string]()
        for generic in a.generics.keys():
          hashSet.incl(generic)
        for generic in b.generics.keys():
          hashSet.incl(generic)
        for generic in hashSet:
          if generic notin a.generics:
            return false
          if generic notin b.generics:
            return false
        for generic in hashSet:
          if not self.compareImpl(a.generics[generic], b.generics[generic], active):
            return false
        return true
      # Compare fields
      var hashSet = initHashSet[string]()
      for field in a.fields.keys():
        hashSet.incl(field)
      for field in b.fields.keys():
        hashSet.incl(field)
      # Ensure both types have the same field
      # names
      for field in hashSet:
        if field notin a.fields:
          return false
        if field notin b.fields:
          return false
      # Ensure fields have matching types
      for field in hashSet:
        if not self.compareImpl(a.fields[field], b.fields[field], active):
          return false
      hashSet.clear()
      # Compare generic arguments
      for generic in a.generics.keys():
        hashSet.incl(generic)
      for generic in b.generics.keys():
        hashSet.incl(generic)
      # Ensure both types have the same generic
      # argument names
      for generic in hashSet:
        if generic notin a.generics:
          return false
        if generic notin b.generics:
          return false
      for generic in hashSet:
        if not self.compareImpl(a.generics[generic], b.generics[generic], active):
          return false
      return true
    of Boolean, Infinity, TypeKind.Nan, Any, Char, Byte, String:
      return true
    of Integer:
      return a.size == b.size and a.signed == b.signed
    of Float:
      return a.width == b.width
    of CInterop:
      return a.ckind == b.ckind
    of Array:
      if a.lengthParam.isNil() and b.lengthParam.isNil():
        if a.length != b.length:
          return false
      else:
        if a.lengthParam.isNil() or b.lengthParam.isNil():
          return false
        if not self.compareImpl(a.lengthParam, b.lengthParam, active):
          return false
      return self.compareImpl(a.elementType, b.elementType, active)
    of UncheckedArray:
      return self.compareImpl(a.elementType, b.elementType, active)
    of TypeKind.Lent, Reference, Pointer, TypeKind.Const:
      if a.kind in {Reference, TypeKind.Lent} and b.mutable and not a.mutable:
        return false
      if a.kind == Pointer:
        let leftValue = a.value.unwrapType()
        let rightValue = b.value.unwrapType()
        if leftValue.kind == UncheckedArray and rightValue.kind == UncheckedArray:
          return self.compareImpl(leftValue.elementType, rightValue.elementType, active)
        if leftValue.kind == UncheckedArray:
          return self.compareImpl(leftValue.elementType, rightValue, active)
        if rightValue.kind == UncheckedArray:
          return self.compareImpl(leftValue, rightValue.elementType, active)
      return self.compareImpl(a.value, b.value, active)
    of Union, Generic:
      if a.constraints.len() > b.constraints.len():
        return self.isSubsetOf(b.constraints, a.constraints)
      return self.isSubsetOf(a.constraints, b.constraints)
    of Function:
      if a.signature.len() != b.signature.len():
        return false
      if a.returnType.isNil() != b.returnType.isNil():
        return false
      if not a.returnType.isNil() and not self.compareImpl(a.returnType, b.returnType, active):
        return false
      for i, parameter in a.signature:
        let other = b.signature[i]
        if parameter.name != other.name:
          return false
        if parameter.isVar != other.isVar:
          return false
        if not self.compareImpl(parameter.kind, other.kind, active):
          return false
      return true
  if a.kind == Union:
    return self.matchUnionImpl(b, a.constraints, active)
  if b.kind == Union:
    return self.matchUnionImpl(a, b.constraints, active)
  if a.kind == Generic:
    return self.matchGeneric(b, a.constraints)
  if b.kind == Generic:
    return self.matchGeneric(a, b.constraints)
  return false


proc compare*(self: TypeChecker, a, b: Type): bool =
  var active = initHashSet[TypeComparePair]()
  self.compareImpl(a, b, active)
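The comparison entry point seeds an empty set of "active" pairs so that recursive (self-referential) types terminate: a pair already under comparison is assumed equal. A reduced standalone sketch of that guard pattern, using a plain tuple of ids in place of `TypeComparePair` (illustrative only; `guardedCompare` is not part of the source):

```nim
import std/sets

# Reduced model of compareImpl's cycle guard: pairs already "active"
# are treated as equal, so mutually recursive comparisons terminate.
var active = initHashSet[(uint, uint)]()

proc guardedCompare(a, b: uint): bool =
  let pair = (a, b)
  if pair in active:
    return true          # assume equality inside a cycle
  active.incl(pair)
  defer: active.excl(pair)
  # ... a real structural comparison would recurse here ...
  a == b

assert guardedCompare(1'u, 1'u)
assert not guardedCompare(1'u, 2'u)
assert active.len == 0   # the guard set is clean after each call
```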
proc compareWithContext*(self: TypeChecker, a: TypedExpr, b: Type): Type =
    ## Same as compare(), except it takes in extra context
    ## for its first argument and attempts to convert it to
    ## the given type if the first simple comparison fails.
    ## The type of the resulting expression (which will be
    ## the same input type if no static type conversion is
    ## needed) is returned. If the simple and context-aware
    ## comparisons both fail, nil is returned
    let expected = b.unwrapType()

    proc isFreshOwnedHandle(expr: TypedExpr): bool =
        if expr.isNil():
            return false
        let exprType = expr.kind.unwrapType()
        if exprType.kind notin {Reference, TypeKind.Lent}:
            return false
        expr of TypedConstructExpr or
            expr of TypedArrayConstructExpr or
            expr.node.kind == NodeKind.callExpr

    if expected.kind in {Reference, TypeKind.Lent}:
        let actual = a.kind.unwrapType()
        if actual.kind == expected.kind and
           self.compare(actual.value, expected.value) and
           expected.mutable and
           not actual.mutable and
           isFreshOwnedHandle(a):
            return expected
        if actual.kind == expected.kind and
           self.compare(actual.value, expected.value) and
           self.compare(actual, expected) and
           not self.compare(expected, actual):
            return expected

    if a of TypedArrayConstructExpr and TypedArrayConstructExpr(a).isLiteral:
        let payloadType =
            case expected.kind:
                of Reference:
                    let payload = expected.value.unwrapType()
                    if payload.kind == Array: payload else: nil
                of Array:
                    expected
                else:
                    nil
        if not payloadType.isNil():
            let literal = TypedArrayConstructExpr(a)
            if literal.elements.len != payloadType.length:
                return nil
            for element in literal.elements.mitems():
                element = self.coerceWithContext(element, payloadType.elementType, element.node)
            if expected.kind == Reference:
                return expected
            return payloadType

    if expected.kind == TypeKind.Lent:
        let borrowed = self.borrowType(a)
        if not borrowed.isNil() and
           self.compare(borrowed.value, expected.value) and
           (not expected.mutable or self.borrowSourceMutable(a)):
            result = borrowed.deepCopy()
            result.mutable = expected.mutable
            return

    if a of TypedConstructExpr:
        let construct = TypedConstructExpr(a)
        let genericArgs = self.inferConstructGenericArgs(construct.callee, expected)
        if genericArgs.len() > 0:
            let resolvedType = self.instantiateDeclaredType(construct.callee, construct.kind.unwrapType(), genericArgs, construct.node)
            if self.compare(resolvedType, expected):
                if not construct.variant.isNil():
                    for candidate in expected.variants:
                        if candidate.name == construct.variant.name:
                            construct.variant = candidate
                            break
                return expected

    result = a.kind
    if self.compare(result, b):
        if self.rawPointerMismatch(a, b):
            return nil
        return
    if a.isStaticallyConvertible(b):
        return b
    return nil


proc check*(self: TypeChecker, term, expected: Type, node: ASTNode = nil): Type {.inline, discardable.} =
    ## Checks the given type against a known one. Raises an
    ## error if appropriate and returns the first type
    ## otherwise. The node is passed in to error() in case
    ## of a failure for more precise error reporting
    if not self.compare(term, expected):
        self.error(&"expecting an expression of type {self.stringify(expected, true)}, got {self.stringify(term, true)} instead", node)
    result = term


proc check*(self: TypeChecker, term, expected: TypedExpr, node: ASTNode = nil): TypedExpr {.inline, discardable.} =
    ## Checks the type of the given expression against a known one.
    ## Raises an error if appropriate and returns the typed expression
    ## otherwise. The node is passed in to error() in case of a failure
    ## for more precise error reporting
    if not self.compare(term.kind, expected.kind):
        self.error(&"expecting an expression of type {self.stringify(expected, true)}, got {self.stringify(term, true)} instead", node)
    result = term


proc check*(self: TypeChecker, term: TypedExpr, expected: Type, node: ASTNode = nil): TypedExpr {.inline, discardable.} =
    ## Checks the type of the given expression against a known one.
    ## Raises an error if appropriate and returns the typed expression
    ## otherwise. The node is passed in to error() in case of a failure
    ## for more precise error reporting
    if not self.compare(term.kind, expected):
        self.error(&"expecting an expression of type {self.stringify(expected, true)}, got {self.stringify(term, true)} instead", node)
    result = term


proc checkWithContext*(self: TypeChecker, term: TypedExpr, expected: Type, node: ASTNode = nil): Type {.inline, discardable.} =
    ## The same as check(), but it uses compareWithContext() rather
    ## than compare() to perform type comparisons
    result = self.compareWithContext(term, expected)
    if result.isNil():
        if self.rawPointerMismatch(term, expected):
            self.error(&"cannot implicitly convert {self.rawPointerLabel(term.kind)} to {self.rawPointerLabel(expected)}; use cast[{self.rawPointerLabel(expected)}](...) instead",
                       node)
        if self.isBorrowedHandleMismatch(term, expected):
            self.error(&"cannot use borrowed handle where an owning ref is required: " &
                       &"{self.borrowedHandleExprLabel(term)} produces a non-owning borrow, " &
                       &"but this context expects {self.stringify(expected, true)}; " &
                       self.borrowedHandleMoveHint(term),
                       node)
        self.error(&"expecting an expression of type {self.stringify(expected, true)}, got {self.stringify(term, true)} instead", node)


proc coerceCallArgument*(self: TypeChecker, term: TypedExpr, expected: Type, node: ASTNode = nil): TypedExpr =
    ## Coerces a call argument to its parameter's type, implicitly
    ## borrowing plain values passed to immutable 'lent' parameters
    let target = expected.unwrapType()
    if target.kind == TypeKind.Lent and not target.mutable and self.borrowType(term).isNil():
        let actual = term.kind.unwrapType()
        if actual.kind notin {Reference, Pointer, TypeKind.Lent, TypeKind.Const} and
           self.compare(term.kind, target.value):
            return newTypedBorrowExpr(Expression(term.node), expected, term)
    self.coerceWithContext(term, expected, node)


proc coerceWithContext*(self: TypeChecker, term: TypedExpr, expected: Type, node: ASTNode = nil): TypedExpr =
    ## Like checkWithContext(), but also wraps the expression in
    ## implicit borrow/move nodes where the target type requires them
    let resolved = self.checkWithContext(term, expected, node)
    let target = resolved.unwrapType()
    let actual = term.kind.unwrapType()
    if term of TypedConstructExpr:
        self.attachConstructContext(TypedConstructExpr(term), resolved)
    if target.kind == TypeKind.Lent and actual.kind != TypeKind.Lent:
        return newTypedBorrowExpr(Expression(term.node), resolved, term)
    if target.kind == Reference and
       actual.kind == Reference and
       not (term of TypedMoveExpr) and
       not (term of TypedConstructExpr) and
       not (term of TypedArrayConstructExpr) and
       term.node.kind != NodeKind.callExpr:
        return newTypedMoveExpr(Expression(term.node), resolved, term)
    result = term
    result.kind = resolved


proc check*(self: TypeChecker, term: Expression, expected: Type, node: ASTNode = nil): TypedExpr {.inline, discardable.} =
    ## Checks the type of the given expression against a known one.
    ## Raises an error if appropriate and returns the typed expression
    ## otherwise
    result = self.check(self.inferOrError(term), expected, node)
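From the user's side, the implicit-borrow path in coerceCallArgument is what lets a plain value be passed where an immutable `lent` parameter is expected. A hedged, comment-only sketch of the intended behavior (the `greet`/`Person` names are invented for illustration, and the peon surface syntax is approximate):

```nim
# Hypothetical peon source, not from the repository:
#
#   fn greet(who: lent Person) { ... }
#
#   var alice = Person(name: "Alice");
#   greet(alice);   # 'alice' is a plain value, not a handle:
#                   # coerceCallArgument wraps the argument in a
#                   # TypedBorrowExpr, so the call borrows instead
#                   # of moving or copying.
#
# Mutable 'lent' parameters do not take this shortcut: the borrow
# is only synthesized when 'target.mutable' is false, as checked
# at the top of coerceCallArgument.
```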
429 src/frontend/compiler/type_checking/declarations.nim (new file)
@@ -0,0 +1,429 @@
# Copyright 2026 Mattia Giambirtone & All Contributors
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

import frontend/compiler/type_checking/types
import frontend/compiler/type_checking/type_utils
import frontend/compiler/type_checking/comparison
import frontend/compiler/type_checking/pragmas
import frontend/compiler/type_checking/names
import frontend/compiler/type_checking/resolution
import frontend/compiler/type_checking/expressions
import frontend/compiler/type_checking/borrow_check
import frontend/compiler/type_checking/interfaces
import frontend/compiler/type_checking/user_defines
import frontend/compiler/type_checking/builtins

import std/sets
import std/tables
import std/strformat
import std/strutils
import std/sequtils

proc invalidCAbiType(self: TypeChecker, typ: Type, node: ASTNode) =
    let unwrapped = typ.unwrapType()
    if not unwrapped.isNil() and unwrapped.kind == String:
        self.error("type 'string' is not allowed in a C ABI signature; use cstring instead", node)
    self.error(&"type '{self.stringify(typ, true)}' is not allowed in a C ABI signature", node)


proc validateCAbiPointerTarget(self: TypeChecker, typ: Type, node: ASTNode)


proc validateCAbiType(self: TypeChecker, typ: Type, node: ASTNode, allowBareVoid = false) =
    if typ.isNil():
        return
    let unwrapped = typ.unwrapType()
    case unwrapped.kind
    of CInterop:
        case unwrapped.ckind
        of CVoid:
            if not allowBareVoid:
                self.error("type 'cvoid' is only valid behind 'ptr' in a C ABI signature", node)
        else:
            discard
    of Pointer:
        self.validateCAbiPointerTarget(unwrapped.value, node)
    else:
        self.invalidCAbiType(typ, node)


proc validateCAbiPointerTarget(self: TypeChecker, typ: Type, node: ASTNode) =
    if typ.isNil():
        return
    let unwrapped = typ.unwrapType()
    case unwrapped.kind
    of CInterop:
        case unwrapped.ckind
        of CString:
            self.error("type 'ptr cstring' is not allowed in a C ABI signature", node)
        else:
            discard
    of UncheckedArray:
        let elementType = unwrapped.elementType.unwrapType()
        if not elementType.isNil() and elementType.kind == CInterop and elementType.ckind == CString:
            self.error("type 'ptr UncheckedArray[cstring]' is not allowed in a C ABI signature", node)
        self.validateCAbiType(unwrapped.elementType, node)
    else:
        self.invalidCAbiType(typ, node)


proc validateCAbiDeclaration(self: TypeChecker, name: Name, node: FunDecl, pragmaName,
                             genericError: string) =
    if not functionHasPragma(name, pragmaName):
        return
    if node.generics.len() > 0:
        self.error(genericError, node.name)
    var index = 0
    for parameter in node.parameters.values():
        if parameter.isVar:
            self.error("var parameters are not allowed in a C ABI signature", parameter.ident)
        self.validateCAbiType(name.valueType.signature[index].kind, parameter.ident)
        inc(index)
    self.validateCAbiType(name.valueType.returnType, if node.returnType.isNil(): node.name else: node.returnType,
                          allowBareVoid = true)


proc validateImportCDeclaration(self: TypeChecker, name: Name, node: FunDecl) =
    self.validateCAbiDeclaration(name, node, "importc", "importc does not support generic functions")


proc validateExportCDeclaration(self: TypeChecker, name: Name, node: FunDecl) =
    self.validateCAbiDeclaration(name, node, "exportc", "generic functions cannot be exported to C")

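The C ABI rules enforced above can be summarized by a few signatures. A hedged, comment-only sketch of what the validator accepts and rejects (function names and the exact pragma spelling are invented for illustration; only the accept/reject rules come from the code above):

```nim
# Hypothetical peon declarations, for illustration only:
#
#   #pragma[importc]
#   fn puts(s: cstring): cint;              # ok: C-interop types pass
#
#   #pragma[importc]
#   fn read(buf: ptr cvoid): cint;          # ok: cvoid is behind 'ptr'
#
#   fn bad1(s: string);       # rejected: 'use cstring instead'
#   fn bad2(v: cvoid);        # rejected: bare cvoid outside 'ptr'
#   fn bad3(p: ptr cstring);  # rejected: 'ptr cstring' is disallowed
#   fn bad4[T](x: T);         # rejected: importc forbids generics
#
# Bare 'cvoid' is only accepted as a return type (allowBareVoid = true).
```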
proc varDecl*(self: TypeChecker, node: VarDecl): TypedVarDecl =
    ## Typechecks variable declarations
    var
        name = self.declare(node)
        init: TypedExpr
        typ: Type
    let pragmaKind = findVarDefinePragma(node)
    if pragmaKind == ConflictingDefinePragmas:
        self.error("variables cannot use both 'define' and 'booldefine' pragmas", node)
    let hasUserDefine = pragmaKind != NoDefinePragma and
                        not self.userDefines.isNil() and
                        self.userDefines.hasKey(node.name.token.lexeme)
    let userDefine =
        if hasUserDefine:
            self.userDefines[node.name.token.lexeme]
        else:
            UserDefine()
    if node.value.isNil():
        if node.constant:
            self.error("const declaration requires an initializer", node)
    else:
        if not hasUserDefine and node.constant and not node.value.isConst():
            self.error("const initializer must be a value of constant type", node.value)
        let originalInit = TypedExpr(self.inferOrError(node.value))
        typ = originalInit.kind
        if not hasUserDefine:
            init = originalInit
    let inferredBorrow = not init.isNil() and init of TypedBorrowExpr
    if not node.valueType.isNil():
        # An explicit type declaration always takes precedence.

        # Check that the inferred expression represents a type
        # and not a value. This is to guard against things
        # like "var x: 1 = 1;". We unwrap it immediately because
        # we don't actually want the returned expression to be
        # a wrapped type, we just do this step as a safety check
        typ = self.ensureType(node.valueType).kind.unwrapType()
    elif not typ.isNil() and typ.kind in {Reference, TypeKind.Lent}:
        if not inferredBorrow and (init.isNil() or init.node.isNil() or init.node.kind != NodeKind.mutExpr):
            typ = typ.withHandleMutability(node.mutable)
    if typ.isNil():
        self.error("expecting either a type declaration or an initializer value, but neither was found", node)
    if not inferredBorrow and
       not typ.isNil() and typ.kind in {Reference, TypeKind.Lent} and node.mutable and not typ.mutable:
        typ = typ.withHandleMutability(true)
    # Now check that the type of the initializer, if it exists,
    # matches the type of the variable
    case pragmaKind:
        of ConflictingDefinePragmas:
            self.error("variables cannot use both 'define' and 'booldefine' pragmas", node)
        of BoolDefinePragma:
            if typ.unwrapType().kind != Boolean:
                self.error("'booldefine' can only be used with bool variables", node)
        of DefinePragma:
            if typ.unwrapType().kind notin {Boolean, Integer, Float, String}:
                self.error("'define' can only be used with primitive bool, integer, float, or string variables", node)
        of NoDefinePragma:
            discard
    if hasUserDefine:
        init = self.inferOrError(self.defineLiteralExpr(node, typ, node.name.token.lexeme, userDefine, pragmaKind))
        if node.constant and not init.node.isConst():
            self.error("const initializer must be a value of constant type", init.node)
    if not init.isNil():
        init = self.coerceWithContext(init, typ, init.node)
        self.consumeMoves(init)
    name.valueType = typ
    if not init.isNil():
        self.nameInits[name.nameKey()] = init
    self.markInitialized(name)
    if not name.isPrivate:
        self.recordModuleExportImpl(self, name)
    result = newTypedVarDecl(node, name, init)

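The define/booldefine handling in varDecl plays the same role as Nim's `-d:` switches: a user-supplied define replaces the declared initializer before type checking continues. A hedged, comment-only sketch (the peon syntax and names are invented for illustration; the type restrictions come from the checks above):

```nim
# Hypothetical peon source, for illustration only:
#
#   #pragma[booldefine]
#   const debugMode = false;   # may be overridden by a user define
#
#   #pragma[define]
#   const logLevel = "info";   # 'define' only allows primitive
#                              # bool/int/float/string variables
#
# When a user define with the same name is supplied, varDecl swaps
# the initializer for the supplied literal (defineLiteralExpr) and
# then coerces it against the declared type as usual.
```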
proc funDecl*(self: TypeChecker, node: FunDecl, name: Name = nil): TypedFunDecl =
    ## Typechecks function declarations

    # Some things are just not possible in life
    if node.name.token.lexeme in [".", ]:
        self.error(&"due to compiler limitations, the '{node.name.token.lexeme}' operator cannot be overridden", node.name)
    var name = name
    if name.isNil():
        name = self.declare(node)
    var node = node
    let typedBody =
        if node.body.isNil():
            newTypedBlockStmt(newBlockStmt(@[], node.token), @[])
        else:
            newTypedBlockStmt(BlockStmt(node.body), @[])
    result = newTypedFunDecl(node, name, typedBody)
    self.typedFunctions[name.nameKey()] = result
    # Begin a new scope
    self.beginScope()
    # First we declare the function's generics, if it has any
    self.declareGenerics(name)
    # We now declare and typecheck the function's
    # arguments and return type
    for paramName in node.parameters.keys():
        let parameter = node.parameters[paramName]
        var paramName = Name(kind: NameKind.Var, ident: parameter.ident, module: self.currentModule, file: self.file, depth: self.scopeDepth,
                             isPrivate: true, valueType: nil, owner: self.currentFunction, line: parameter.ident.token.line,
                             node: nil)
        var default = if not parameter.default.isNil(): self.expression(parameter.default) else: nil
        case parameter.valueType.kind:
            of NodeKind.binaryExpr, NodeKind.unaryExpr:
                # Untagged type union
                paramName.valueType = Type(kind: Union, constraints: @[])
                self.expandTypeConstraints(parameter.valueType, paramName.valueType.constraints)
            else:
                # Ensure the expression represents a type and not a value
                paramName.valueType = self.ensureType(parameter.valueType).kind.unwrapType()
        if not parameter.default.isNil():
            paramName.valueType = self.check(paramName.valueType, default.kind)
        paramName.isVarParam = parameter.isVar
        name.valueType.signature.add((parameter.ident.token.lexeme, paramName.valueType, default, parameter.isVar))
        self.addName(paramName)
    if not node.returnType.isNil():
        name.valueType.returnType = self.ensureType(node.returnType).kind.unwrapType()
    self.validateCopyHook(name, node)
    self.validateMoveHook(name, node)
    self.validateDestroyHook(name, node)
    self.recordCapabilities(name)
    if not name.isPrivate:
        self.recordModuleExportImpl(self, name)
    if name.valueType.isBuiltin:
        self.endScope()
        return
    self.validateImportCDeclaration(name, node)
    self.validateExportCDeclaration(name, node)
    if node.body.isNil():
        self.endScope()
        if functionHasPragma(name, "importc"):
            return
        self.error("forward declarations are only supported for importc functions", node)
    if BlockStmt(node.body).body.len() == 0:
        self.error("cannot declare function with empty body")
    # We store the current function to restore
    # it later
    let function = self.currentFunction
    let previousUserDefines = self.userDefines
    let previousPushedUserDefines = self.pushedUserDefines
    let previousPushedConfigNode = self.pushedConfigNode
    let previousDisabledWarnings = cloneDisabledWarnings(self.disabledWarnings)
    let previousPushedDisabledWarnings = cloneDisabledWarnings(self.pushedDisabledWarnings)
    self.currentFunction = name
    self.userDefines = cloneUserDefines(self.userDefines)
    self.pushedUserDefines = nil
    self.pushedConfigNode = nil
    self.disabledWarnings = cloneDisabledWarnings(self.disabledWarnings)
    self.pushedDisabledWarnings = @[]
    for pragma in node.pragmas:
        if not isConfigPragmaName(pragma.name.token.lexeme):
            continue
        if pragma.name.token.lexeme in ["push", "pop"]:
            self.error(&"'{pragma.name.token.lexeme}' pragma is not needed on function bodies; attach the desired config pragma directly",
                       pragma)
        self.pragmas[pragma.name.token.lexeme].handler(self, pragma, nil)
    node.activeDefines = snapshotActiveDefines(self.userDefines)
    for decl in BlockStmt(node.body).body:
        result.body.body.add(self.typecheckImpl(self, decl))
    if not self.pushedUserDefines.isNil():
        self.error("unterminated configuration context: missing '#pragma[pop]'", self.pushedConfigNode)
    self.ensureLentReturnsDoNotEscape(name, result.body)
    if not name.valueType.returnType.isNil() and not self.blockDefinitelyReturns(result.body):
        self.error("not all control-flow paths return a value", node.body)
    self.endScope()
    self.userDefines = previousUserDefines
    self.pushedUserDefines = previousPushedUserDefines
    self.pushedConfigNode = previousPushedConfigNode
    self.disabledWarnings = previousDisabledWarnings
    self.pushedDisabledWarnings = previousPushedDisabledWarnings
    # Restores the enclosing function (if any).
    # Makes nested calls work (including recursion)
    self.currentFunction = function


# declareGenerics moved to names.nim

proc typeDecl*(self: TypeChecker, node: TypeDecl, name: Name = nil): TypedTypeDecl =
    ## Typechecks type declarations
    var name = name
    var declaredVariants: seq[Name] = @[]
    if name.isNil():
        let existing = self.find(node.name.token.lexeme, "typevar".toIntrinsic())
        if not existing.isNil() and existing.node == node:
            name = existing
        else:
            name = self.declare(node)
    if node.isEnum:
        result = newTypedEnumDecl(node, name, @[], name.valueType)
    else:
        result = newTypedTypeDecl(node, name, newTable[string, TypedExpr]())
    if node.isInterface and node.isRef:
        self.error("interfaces cannot be declared as ref objects", node)
    if node.value.isNil() and node.isEnum:
        var variantNames = initHashSet[string]()
        for i, variant in node.members:
            if variant.name.token.lexeme in variantNames:
                self.error(&"duplicate variant name '{variant.name.token.lexeme}' in type '{name.ident.token.lexeme}'", variant)
            variantNames.incl(variant.name.token.lexeme)
            let variantType = Type(kind: EnumEntry,
                                   name: variant.name.token.lexeme,
                                   generics: newOrderedTable[string, Type](),
                                   genericParams: @[],
                                   pragmas: newTable[string, Pragma](),
                                   fields: newTable[string, Type](),
                                   fieldOrder: @[],
                                   isInterface: false,
                                   isEnum: true,
                                   variants: @[],
                                   parent: name.valueType,
                                   tag: i,
                                   interfaces: @[])
            let variantName = Name(depth: self.scopeDepth,
                                   module: self.currentModule,
                                   node: variant,
                                   ident: variant.name,
                                   line: variant.name.token.line,
                                   isPrivate: name.isPrivate,
                                   owner: self.currentFunction,
                                   valueType: variantType.wrapType())
            self.addName(variantName)
            if not variantName.isPrivate:
                self.recordModuleExportImpl(self, variantName)
            declaredVariants.add(variantName)
    self.beginScope()
    # Declare the type's generics
    self.declareGenerics(name)
    if node.isInterface:
        for requirement in node.requirements:
            result.requirements.add(self.interfaceRequirement(name, requirement))
        name.valueType = name.valueType.wrapType()
        if not name.isPrivate:
            self.recordModuleExportImpl(self, name)
        self.endScope()
        return
    if node.value.isNil():
        # The type is neither a type union nor a type alias
        if not node.isEnum:
            # The type is a structure type
            if node.isRef and node.interfaces.len() > 0:
                self.error("interfaces are only supported on value object types", node)
            let declaredType = name.valueType.unwrapType()
            let storageType =
                if declaredType.kind == Reference: declaredType.value
                else: declaredType
            for ifaceExpr in node.interfaces:
                let ifaceType = self.ensureType(ifaceExpr).kind.unwrapType()
                if ifaceType.kind != Structure or not ifaceType.isInterface:
                    self.error("object interface lists may only contain interface types", ifaceExpr)
                storageType.interfaces.add(ifaceType)
                result.interfaces.add(self.findOrError(ifaceType.name, "typevar".toIntrinsic(), ifaceExpr))
            let intrinsicKey = intrinsicTypeKey(storageType)
            if intrinsicKey.len() > 0:
                self.intrinsicInterfaces[intrinsicKey] = storageType.interfaces
            var fieldType: TypedExpr
            var n: Name
            for field in node.fields.values():
                fieldType = self.expression(field.valueType)
                if fieldType.kind.kind != Reference:
                    # Check for self-recursion of non-ref types (which would
                    # require infinite memory)
                    n = fieldType.getName()
                    if isDirectSelfTypeReference(name, n):
                        self.error(&"illegal self-recursion in member '{field.ident.token.lexeme}' for non-ref type '{name.ident.token.lexeme}'", fieldType.node)
                result.fields[field.ident.token.lexeme] = fieldType
                storageType.fields[field.ident.token.lexeme] = fieldType.kind
                storageType.fieldOrder.add(field.ident.token.lexeme)
                if field.isPrivate:
                    storageType.privateFields.add(field.ident.token.lexeme)
        else:
            # Tagged variants live under the parent enum type and are kept out
            # of the surrounding lexical scope.
            let typedEnum = TypedEnumDecl(result)
            var fieldType: TypedExpr
            var n: Name
            for i, variant in node.members:
                let variantName = declaredVariants[i]
                let variantType = variantName.valueType.unwrapType()
                var typedVariant = newTypedTypeDecl(variant, variantName, newTable[string, TypedExpr]())

                self.beginScope()
                self.declareGenerics(variantName)
                for field in variant.fields.values():
                    fieldType = self.expression(field.valueType)
                    if fieldType.kind.kind != Reference:
                        n = fieldType.getName()
                        if isDirectSelfTypeReference(name, n):
                            self.error(&"illegal self-recursion in member '{field.ident.token.lexeme}' for non-ref type '{name.ident.token.lexeme}'", fieldType.node)
                    typedVariant.fields[field.ident.token.lexeme] = fieldType
                    variantType.fields[field.ident.token.lexeme] = fieldType.kind
                    variantType.fieldOrder.add(field.ident.token.lexeme)
                self.endScope()

                name.valueType.variants.add(variantType)
                typedEnum.variants.add(typedVariant)
    else:
        case node.value.kind:
            of NodeKind.identExpr, NodeKind.genericExpr, NodeKind.refExpr,
               NodeKind.mutExpr,
               NodeKind.ptrExpr, NodeKind.constExpr, NodeKind.lentExpr:
                # Type alias
                name.valueType = self.expression(node.value).kind.unwrapType()
            of NodeKind.binaryExpr, NodeKind.unaryExpr:
                # Untagged type union
                name.valueType = Type(kind: Union, constraints: @[], displayName: name.ident.token.lexeme)
                self.expandTypeConstraints(node.value, name.valueType.constraints)
            else:
                # Unreachable due to how we parse things. Still, better to be safe
                self.error(&"got node of unexpected type {node.value.kind} at typeDecl()")
    # Turn the declared type into a typevar so that future references
    # to it will be distinct from its instances
    name.valueType = name.valueType.wrapType()
    if not name.isPrivate:
        self.recordModuleExportImpl(self, name)
    self.endScope()


proc pragmaExpr*(self: TypeChecker, pragma: Pragma) =
    ## Validates pragma expressions (not bound to a name)
    if pragma.name.token.lexeme notin self.pragmas:
        self.error(&"unknown pragma '{pragma.name.token.lexeme}'")
    self.pragmas[pragma.name.token.lexeme].handler(self, pragma, nil)
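The enum path in typeDecl assigns each variant a tag equal to its declaration index and wraps it in an EnumEntry type whose parent points back at the enum. A hedged, comment-only sketch of what such a declaration might look like on the peon side (syntax approximate, names invented for illustration):

```nim
# Hypothetical peon source, for illustration only:
#
#   type Shape = enum {
#       Circle(radius: float64),       # tag 0
#       Rect(w: float64, h: float64),  # tag 1
#   }
#
# Each variant becomes a Name wrapping an EnumEntry type whose
# 'parent' is Shape; redeclaring a variant name is a compile-time
# "duplicate variant name" error, enforced via the variantNames set.
```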
2430 src/frontend/compiler/type_checking/expressions.nim (new file)
File diff suppressed because it is too large

245 src/frontend/compiler/type_checking/interfaces.nim (new file)
@@ -0,0 +1,245 @@
# Copyright 2026 Mattia Giambirtone & All Contributors
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

import frontend/compiler/type_checking/types
import frontend/compiler/type_checking/type_utils
import frontend/compiler/type_checking/comparison
import frontend/compiler/type_checking/names

import std/sets
import std/os
import std/tables
import std/strformat
import std/strutils
import std/sequtils


# implementedInterfaces moved to comparison.nim

func isDirectSelfTypeReference*(owner, candidate: Name): bool =
    not owner.isNil() and
    not candidate.isNil() and
    owner == candidate


proc interfaceRequirement*(self: TypeChecker, iface: Name, node: FunDecl): TypedFunDecl =
    let requirementType = Type(kind: Function,
                               generics: newOrderedTable[string, Type](),
                               genericParams: @[],
                               pragmas: newTable[string, Pragma](),
                               signature: @[],
                               returnType: nil)
    let requirementName = Name(depth: self.scopeDepth,
                               module: self.currentModule,
                               node: node,
                               ident: node.name,
                               line: node.name.token.line,
                               isPrivate: true,
                               owner: iface,
                               kind: NameKind.Function,
                               valueType: requirementType,
                               resolved: true)
    result = TypedFunDecl(node: node, name: requirementName, body: nil)
    self.beginScope()
    self.declareGenerics(requirementName)
    for parameter in node.parameters.values():
        if not parameter.default.isNil():
            self.error("interface requirements cannot declare default arguments", parameter.ident)
        let parameterType =
            case parameter.valueType.kind:
                of NodeKind.binaryExpr, NodeKind.unaryExpr:
                    var typ = Type(kind: Union, constraints: @[])
                    self.expandTypeConstraints(parameter.valueType, typ.constraints)
                    typ
                else:
                    self.ensureType(parameter.valueType).kind.unwrapType()
        requirementType.signature.add((parameter.ident.token.lexeme, parameterType, nil, parameter.isVar))
    if not node.returnType.isNil():
        requirementType.returnType = self.ensureType(node.returnType).kind.unwrapType()
    self.endScope()


proc fullInterfaceRequirementSignature*(targetType: Type, requirement: TypedFunDecl): TypeSignature =
    if requirement.isNil() or requirement.name.isNil() or
       requirement.name.valueType.isNil() or requirement.name.valueType.kind != Function:
        return @[("self", targetType, nil, false)]
    let signature = requirement.name.valueType.signature
    if signature.len() > 0 and signature[0].name == "self":
        result = signature
        result[0] = ("self", targetType, signature[0].default, signature[0].isVar)
        return
    result = @[("self", targetType, nil, false)]
    for parameter in signature:
        result.add(parameter)


proc interfaceRequirementMatches*(self: TypeChecker, candidate: Name, targetType: Type,
                                  requirement: TypedFunDecl): bool =
    if candidate.isNil() or candidate.valueType.isNil() or candidate.valueType.kind != Function:
        return false
    if requirement.isNil() or requirement.name.isNil() or requirement.name.valueType.isNil():
        return false
    let expectedSignature = fullInterfaceRequirementSignature(targetType, requirement)
    let expectedReturn = requirement.name.valueType.returnType
    if candidate.valueType.signature.len() != expectedSignature.len():
        return false
    for i, parameter in expectedSignature:
        let actual = candidate.valueType.signature[i]
        if actual.isVar != parameter.isVar:
            return false
        if not self.compare(actual.kind, parameter.kind):
            return false
    if expectedReturn.isNil() != candidate.valueType.returnType.isNil():
        return false
    if not expectedReturn.isNil() and
       not self.compare(candidate.valueType.returnType, expectedReturn):
        return false
    true


proc findTypedTypeDecl*(self: TypeChecker, typed: seq[TypedNode], name: Name): TypedTypeDecl =
    if name.isNil():
        return nil
    for node in typed:
        if node.isNil() or node.node.kind != NodeKind.typeDecl:
            continue
        let decl = TypedTypeDecl(node)
        if decl.name == name or decl.name.node == name.node:
            return decl
    for moduleNodes in self.typedModules.values():
        for node in moduleNodes:
            if node.isNil() or node.node.kind != NodeKind.typeDecl:
                continue
            let decl = TypedTypeDecl(node)
            if decl.name == name or decl.name.node == name.node:
                return decl


proc validateInterfaceImplementation*(self: TypeChecker, typ, ifaceDecl: TypedTypeDecl) =
    if typ.isNil() or typ.name.isNil() or ifaceDecl.isNil() or ifaceDecl.name.isNil():
        return
    let targetType = typ.name.valueType.unwrapType()
    let typeLabel = typ.name.ident.token.lexeme
    let ifaceLabel = ifaceDecl.name.ident.token.lexeme
    if ifaceLabel == "Copy":
        if targetType.copyHook.isNil() and not self.hasImplicitCopyCapability(targetType):
            self.error(&"type '{typeLabel}' does not implement interface '{ifaceLabel}': missing 'copy=' hook", typ.node)
        return
    for requirement in ifaceDecl.requirements:
        let expectedSignature = fullInterfaceRequirementSignature(targetType, requirement)
        let expectedReturn = requirement.name.valueType.returnType
        let candidates = self.findAll(requirement.name.ident.token.lexeme).filterIt(
            it.module == typ.name.module and
            not it.valueType.isNil() and
            it.valueType.kind == Function
        )
        var found = false
        for candidate in candidates:
            if self.interfaceRequirementMatches(candidate, targetType, requirement):
                found = true
                self.markResolved(candidate)
                break
        if found:
            continue
        var message = &"type '{typeLabel}' does not implement interface '{ifaceLabel}': missing function '{requirement.name.ident.token.lexeme}' with signature {self.stringify(expectedSignature, true)}"
        if expectedReturn.isNil():
            message &= " and return type void"
        else:
            message &= &" and return type {self.stringify(expectedReturn, true)}"
        self.error(message, typ.node)


proc validateDeclaredInterfaces*(self: TypeChecker, typed: seq[TypedNode]) =
    for node in typed:
        if node.isNil() or node.node.kind != NodeKind.typeDecl:
            continue
        let decl = TypedTypeDecl(node)
        if TypeDecl(decl.node).isInterface:
            continue
        for iface in decl.interfaces:
            let ifaceDecl = self.findTypedTypeDecl(typed, iface)
            if ifaceDecl.isNil():
                self.error(&"internal error: failed to resolve typed interface metadata for '{iface.ident.token.lexeme}'", decl.node)
            self.validateInterfaceImplementation(decl, ifaceDecl)


proc hasCustomDestroyHook*(self: TypeChecker, typ: Type): bool =
|
||||
if typ.isNil():
|
||||
return false
|
||||
let target = typ.unwrapType()
|
||||
if not target.destroyHook.isNil():
|
||||
return true
|
||||
for candidate in self.findAll("destroy="):
|
||||
if candidate.isNil() or candidate.valueType.isNil() or candidate.valueType.kind != Function:
|
||||
continue
|
||||
if candidate.valueType.signature.len() != 1 or not candidate.valueType.signature[0].isVar:
|
||||
continue
|
||||
if self.compare(candidate.valueType.signature[0].kind, target):
|
||||
return true
|
||||
false
|
||||
|
||||
|
||||
proc containsUnmanagedRawPointer*(self: TypeChecker, typ: Type, active: var HashSet[uint]): bool =
|
||||
if typ.isNil():
|
||||
return false
|
||||
let target = typ.unwrapType()
|
||||
let key = cast[uint](target)
|
||||
if key in active:
|
||||
return false
|
||||
active.incl(key)
|
||||
defer:
|
||||
active.excl(key)
|
||||
|
||||
if self.hasCustomDestroyHook(target):
|
||||
return false
|
||||
|
||||
case target.kind
|
||||
of Pointer:
|
||||
true
|
||||
of Reference, Lent, Const:
|
||||
self.containsUnmanagedRawPointer(target.value, active)
|
||||
of Array, UncheckedArray:
|
||||
self.containsUnmanagedRawPointer(target.elementType, active)
|
||||
of Structure, EnumEntry:
|
||||
if target.kind == Structure and target.isEnum:
|
||||
for variant in target.variants:
|
||||
if self.containsUnmanagedRawPointer(variant, active):
|
||||
return true
|
||||
return false
|
||||
for fieldType in target.fields.values():
|
||||
if self.containsUnmanagedRawPointer(fieldType, active):
|
||||
return true
|
||||
false
|
||||
else:
|
||||
false
|
||||
|
||||
|
||||
proc validateRawPointerLeakWarnings*(self: TypeChecker, typed: seq[TypedNode]) =
|
||||
for node in typed:
|
||||
if node.isNil() or node.node.kind != NodeKind.typeDecl:
|
||||
continue
|
||||
let decl = TypedTypeDecl(node)
|
||||
let astDecl = TypeDecl(decl.node)
|
||||
if astDecl.isInterface:
|
||||
continue
|
||||
let target = decl.name.valueType.unwrapType()
|
||||
if target.isNil() or target.intrinsic or target.kind notin {Structure, Reference}:
|
||||
continue
|
||||
var active = initHashSet[uint]()
|
||||
if self.containsUnmanagedRawPointer(target, active):
|
||||
self.warning(RawPointerLeak,
|
||||
&"type '{decl.name.ident.token.lexeme}' transitively contains raw pointers but does not define a custom 'destroy=' hook; default destruction will not free raw-pointer-owned memory",
|
||||
decl.name,
|
||||
decl.node)
|
||||
421 src/frontend/compiler/type_checking/names.nim Normal file
@@ -0,0 +1,421 @@
# Copyright 2026 Mattia Giambirtone & All Contributors
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

import frontend/compiler/type_checking/types
import frontend/compiler/type_checking/type_utils
import frontend/compiler/type_checking/comparison

import std/sets
import std/tables
import std/strformat
import std/strutils
import std/algorithm


# Forward declarations
proc validateClosedScopeUnusedWarnings*(self: TypeChecker, closed: TableRef[string, seq[Name]])
proc addName*(self: TypeChecker, name: Name)


# Scope management

proc beginScope*(self: TypeChecker) =
  ## Begins a new lexical scope
  inc(self.scopeDepth)
  self.names.add(newTable[string, seq[Name]]())


proc endScope*(self: TypeChecker) =
  ## Closes the current lexical scope and reverts to the outer one
  let closed = self.names.pop()
  self.validateClosedScopeUnusedWarnings(closed)
  for bucket in closed.values():
    for name in bucket:
      self.clearInitialized(name)
      self.nameInits.del(name.nameKey())
      self.movedNames.excl(name.nameKey())
  dec(self.scopeDepth)


# markResolved moved to types.nim


# Name lookup

proc find*(self: TypeChecker, name: string, kind: Type = "any".toIntrinsic(), skip: int = 0): Name =
  ## Looks up a name in all scopes starting from the current
  ## one. Optionally matches it to the given type. The skip
  ## argument determines how many matching references are
  ## ignored before returning a match (defaults to 0)

  ## A note about how namespaces are implemented. At each scope depth lies
  ## a dictionary which maps strings to a list of names. Why a list of names
  ## rather than a single name? Well, that's to allow cool things like overloading
  ## existing functions with different type signatures. We still disallow most cases
  ## of re-declaration, though (for example, shadowing a function with a variable in the
  ## same scope and module) because it would just be confusing
  var depth = self.scopeDepth
  var skip = skip
  while depth >= 0:
    if self.names[depth].hasKey(name):
      for obj in reversed(self.names[depth][name]):
        let sameModule =
          obj.module.isNil() or self.currentModule.isNil() or
          obj.module.absPath == self.currentModule.absPath
        if not sameModule:
          # We don't own this name, but we
          # may still have access to it
          if obj.isPrivate or self.currentModule notin obj.exportedTo:
            # The name is either private in its owner
            # module, so we definitely can't use it, or
            # said module has not explicitly exported it
            # to us. If the name is public but not exported
            # in its owner module, then we act as if it's
            # private. This is to avoid namespace pollution
            # from imports (i.e. if module A imports modules
            # C and D and module B imports module A, then B
            # might not want to also have access to C's and D's
            # names as they might clash with its own stuff)
            continue
        # If we got here, we can access the name
        if obj.kind == NameKind.Module:
          if kind.isNil() or kind.isAny():
            if skip > 0:
              dec(skip)
              continue
            result = obj
            break
          continue
        if self.compare(obj.valueType, kind):
          if skip > 0:
            dec(skip)
            continue
          result = obj
          break
    if not result.isNil():
      break
    dec(depth)


proc findAll*(self: TypeChecker, name: string, kind: Type = "any".toIntrinsic()): seq[Name] =
  ## Like find(), but doesn't stop at the first match. Returns
  ## a list of matches
  var skip = 0
  while true:
    var found = self.find(name, kind, skip)
    if found.isNil():
      break
    result.add(found)
    inc(skip)


proc declare*(self: TypeChecker, node: ASTNode): Name {.discardable.} =
  ## Declares a name into the current scope
  var scope = self.names[self.scopeDepth]
  var name: Name
  var declaredName: string = ""
  case node.kind:
    of NodeKind.varDecl:
      var node = VarDecl(node)
      declaredName = node.name.token.lexeme
      # Creates a new Name entry so that self.identifier can find it later
      name = Name(depth: self.scopeDepth,
                  ident: node.name,
                  isPrivate: node.isPrivate,
                  module: self.currentModule,
                  valueType: nil, # Done later in varDecl (for better semantics)
                  line: node.token.line,
                  owner: self.currentFunction,
                  kind: NameKind.Var,
                  node: node,
                  )
      self.addName(name)
    of NodeKind.funDecl:
      var node = FunDecl(node)
      declaredName = node.name.token.lexeme
      var kind = Type(kind: Function,
                      generics: newOrderedTable[string, Type](),
                      genericParams: @[],
                      pragmas: newTable[string, Pragma](),
                      )
      name = Name(depth: self.scopeDepth,
                  module: self.currentModule,
                  node: node,
                  ident: node.name,
                  line: node.name.token.line,
                  isPrivate: node.isPrivate,
                  owner: self.currentFunction,
                  kind: NameKind.Function,
                  valueType: kind)
      self.addName(name)
    of NodeKind.typeDecl:
      var node = TypeDecl(node)
      declaredName = node.name.token.lexeme
      var kind = Type(kind: Structure,
                      name: declaredName,
                      generics: newOrderedTable[string, Type](),
                      genericParams: @[],
                      fields: newTable[string, Type](),
                      fieldOrder: @[],
                      isInterface: node.isInterface,
                      isEnum: node.isEnum,
                      variants: @[],
                      parent: nil,
                      tag: -1,
                      interfaces: @[])
      if node.isRef:
        kind = kind.toRef()
      name = Name(depth: self.scopeDepth,
                  module: self.currentModule,
                  node: node,
                  ident: node.name,
                  line: node.name.token.line,
                  isPrivate: node.isPrivate,
                  owner: self.currentFunction,
                  valueType: kind)
      self.addName(name)
    else:
      self.error("not implemented")
  # TODO: enums
  if not name.isNil():
    self.dispatchPragmasImpl(self, name)
  return name


proc addName*(self: TypeChecker, name: Name) =
  ## Adds a name to the current lexical scope
  if name.file == "":
    name.file = self.file
  var scope = self.names[self.scopeDepth]
  if scope.hasKey(name.ident.token.lexeme):
    for obj in scope[name.ident.token.lexeme]:
      if not name.valueType.isNil() and name.valueType.kind == TypeKind.Function:
        # We don't check for name clashes for functions because we allow overloading.
        # Our match() proc will take care of reporting any ambiguity errors at call time
        continue
      if (obj.kind in [NameKind.Var, NameKind.Module] or
          (not obj.valueType.isNil() and obj.valueType.kind in [TypeKind.Structure, TypeKind.EnumEntry])) and
         name.module == obj.module:
        self.error(&"re-declaration of '{obj.ident.token.lexeme}' is not allowed (previously declared in {obj.module.ident.token.lexeme}:{obj.ident.token.line}:{obj.ident.token.relPos.start})", name.node)
  else:
    scope[name.ident.token.lexeme] = @[]
  scope[name.ident.token.lexeme].add(name)
  if not requiresInitializationTracking(name) or not VarDecl(name.node).value.isNil():
    self.markInitialized(name)


proc recordCapabilities*(self: TypeChecker, function: Name) =
  ## Records type capabilities to be able to match against them
  ## later
  if not self.capabilities.hasKey(function.ident.token.lexeme):
    self.capabilities[function.ident.token.lexeme] = @[]
  var newCapability: TypeCapabilities
  for parameter in function.valueType.signature:
    case parameter.kind.kind:
      of Generic, Union:
        for constraint in parameter.kind.constraints:
          newCapability.add((constraint.match, constraint.kind.unwrapType(), constraint.value))
      else:
        newCapability.add((true, parameter.kind.unwrapType(), parameter.default))
  self.capabilities[function.ident.token.lexeme].add(newCapability)


proc validateUnusedWarnings*(self: TypeChecker, typed: seq[TypedNode]) =
  if not self.isEntryModule():
    return
  var warned = initHashSet[uint]()
  for scope in self.names:
    for bucket in scope.values():
      for name in bucket:
        if name.isNil() or name.node.isNil() or name.module != self.currentModule or name.node.file != self.file:
          continue
        let key = cast[uint](name)
        if key in warned:
          continue
        warned.incl(key)
        if name.resolved or hasPragma(Declaration(name.node).pragmas, "used"):
          continue
        case name.kind
        of NameKind.Var:
          if name.owner.isNil():
            if not name.isPrivate:
              continue
            self.warning(UnusedGlobalVariable,
                         &"global variable '{name.ident.token.lexeme}' is never used",
                         name,
                         name.node)
          else:
            self.warning(UnusedLocalVariable,
                         &"local variable '{name.ident.token.lexeme}' is never used",
                         name,
                         name.node)
        of NameKind.Function:
          if not name.isPrivate or name.ident.token.lexeme.len == 0:
            continue
          self.warning(UnusedFunction,
                       &"function '{name.ident.token.lexeme}' is never used",
                       name,
                       name.node)
        else:
          if not name.isPrivate or name.valueType.isNil():
            continue
          let target = name.valueType.unwrapType()
          if target.kind == EnumEntry:
            continue
          self.warning(UnusedTypeDeclaration,
                       &"type '{name.ident.token.lexeme}' is never used",
                       name,
                       name.node)
  for node in typed:
    if node.isNil() or node.node.kind != NodeKind.importStmt:
      continue
    let stmt = TypedImportStmt(node)
    if stmt.localBindings.len == 0 and stmt.importedNames.len == 0:
      continue
    var used = false
    for binding in stmt.localBindings:
      if not binding.isNil() and binding.resolved:
        used = true
        break
    if not used:
      for imported in stmt.importedNames:
        if not imported.isNil() and imported.resolved:
          used = true
          break
    if not used:
      self.warning(UnusedImport,
                   &"import from module '{stmt.importedModule.ident.token.lexeme}' is never used",
                   node = stmt.node)


proc validateClosedScopeUnusedWarnings*(self: TypeChecker, closed: TableRef[string, seq[Name]]) =
  if self.scopeDepth == 0 or not self.isEntryModule():
    return
  var warned = initHashSet[uint]()
  for bucket in closed.values():
    for name in bucket:
      if name.isNil() or name.node.isNil() or name.module != self.currentModule or name.node.file != self.file:
        continue
      let key = cast[uint](name)
      if key in warned:
        continue
      warned.incl(key)
      if name.resolved or hasPragma(Declaration(name.node).pragmas, "used"):
        continue
      case name.kind
      of NameKind.Var:
        self.warning(UnusedLocalVariable,
                     &"local variable '{name.ident.token.lexeme}' is never used",
                     name,
                     name.node)
      of NameKind.Function:
        if name.ident.token.lexeme.len == 0:
          continue
        self.warning(UnusedFunction,
                     &"function '{name.ident.token.lexeme}' is never used",
                     name,
                     name.node)
      else:
        if name.valueType.isNil():
          continue
        let target = name.valueType.unwrapType()
        if target.kind == EnumEntry:
          continue
        self.warning(UnusedTypeDeclaration,
                     &"type '{name.ident.token.lexeme}' is never used",
                     name,
                     name.node)


proc ensureType*(self: TypeChecker, node: Expression): TypedExpr =
  ## Ensures the given expression resolves to a type (rather
  ## than a value) and returns it
  if not node.isNil() and node.kind == NodeKind.identExpr:
    let ident = IdentExpr(node)
    let name = self.find(ident.name.lexeme, "typevar".toIntrinsic())
    if not name.isNil():
      self.markResolved(name)
      return newTypedIdentExpr(ident, name)
  return self.check(node, "typevar".toIntrinsic())


proc expandTypeConstraints*(self: TypeChecker, condition: Expression, list: var TypeCapabilities, accept: bool = true) =
  ## Recursively unpacks a type constraint
  case condition.kind:
    of identExpr, genericExpr:
      let inferred = self.ensureType(condition)
      let typ = inferred.kind

      list.add((accept, typ.unwrapType(), inferred))
    of binaryExpr:
      let condition = BinaryExpr(condition)
      case condition.operator.lexeme:
        of "|", "&":
          self.expandTypeConstraints(condition.left, list)
          self.expandTypeConstraints(condition.right, list)
        else:
          self.error("invalid type constraint", condition)
    of unaryExpr:
      let condition = UnaryExpr(condition)
      case condition.operator.lexeme:
        of "~":
          self.expandTypeConstraints(condition.operand, list, accept=false)
        else:
          self.error("invalid type constraint in", condition)
    else:
      self.error("invalid type constraint", condition)


proc declareGenerics*(self: TypeChecker, name: Name) =
  ## Helper to declare the generic arguments of the
  ## given name, if it has any
  var
    constraints: seq[tuple[match: bool, kind: Type, value: TypedExpr]] = @[]
  let genericOwner =
    if name.isNil() or name.valueType.isNil():
      nil
    else:
      constructorPayloadType(name.valueType.unwrapType())
  if not genericOwner.isNil() and name.node.generics.len() > 0:
    genericOwner.genericParams.setLen(0)
  for gen in name.node.generics.values():
    case gen.kind:
      of OpaqueTypeParam:
        constraints = @[(match: true, kind: "any".toIntrinsic(), value: nil)]
      of FiniteTypeParam:
        self.expandTypeConstraints(gen.resolvedConstraint, constraints)
      of ConstParam:
        let declaredType = self.ensureType(gen.resolvedConstraint).kind.unwrapType()
        constraints = @[(match: true, kind: declaredType, value: nil)]
    let generic = Name(kind: Default,
                       ident: gen.ident,
                       module: self.currentModule,
                       owner: self.currentFunction,
                       file: self.currentModule.file,
                       depth: self.scopeDepth,
                       isPrivate: true,
                       valueType: Type(kind: Generic,
                                       constraints: constraints,
                                       displayName: gen.ident.token.lexeme,
                                       constant: gen.kind == ConstParam),
                       line: gen.ident.token.line,
                       )
    self.addName(generic)
    genericOwner.generics[gen.ident.token.lexeme] = generic.valueType
    genericOwner.genericParams.add(TypeGenericParam(name: gen.ident.token.lexeme,
                                                    kind: gen.kind,
                                                    symbol: generic.valueType,
                                                    valueType: (if gen.kind == ConstParam: constraints[0].kind else: nil)))
    constraints.setLen(0)
423 src/frontend/compiler/type_checking/pragmas.nim Normal file
@@ -0,0 +1,423 @@
# Copyright 2026 Mattia Giambirtone & All Contributors
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

import frontend/compiler/type_checking/types
import frontend/compiler/type_checking/type_utils
import frontend/compiler/type_checking/comparison
import frontend/compiler/type_checking/user_defines

import std/strformat
import std/strutils
import std/tables


proc handleMagicPragma*(self: TypeChecker, pragma: Pragma, name: Name) =
  ## Handles the "magic" pragma. Assumes the given name is already
  ## declared
  if pragma.args.len() != 1:
    self.error(&"'magic' pragma: wrong number of arguments (expected 1, got {len(pragma.args)} instead)")
  elif pragma.args[0].kind != strExpr:
    self.error(&"'magic' pragma: wrong argument type (constant string expected, got {self.stringify(self.inferOrError(pragma.args[0]), true)} instead)")
  elif name.node.kind == NodeKind.typeDecl:
    if pragma.args[0].token.lexeme[1..^2] == "owo":
      # :3
      self.error("Thou hast discovered ze forbidden type. Huzzah! :3", pragma)
    name.valueType = pragma.args[0].token.lexeme[1..^2].toIntrinsic()
  elif name.node.kind == NodeKind.funDecl:
    name.valueType.isBuiltin = true
    name.valueType.builtinOp = pragma.args[0].token.lexeme[1..^2]
  else:
    self.error("'magic' pragma is not valid in this context")


proc pragmaMessage*(self: TypeChecker, pragma: Pragma, label: string): string =
  if pragma.args.len() != 1:
    self.error(&"'{label}' pragma: wrong number of arguments", pragma)
  if pragma.args[0].kind != strExpr:
    self.error(&"'{label}' pragma: wrong type of argument (constant string expected)", pragma.args[0])
  pragma.args[0].token.lexeme[1..^2]


proc parsePragmaWarningKind*(self: TypeChecker, node: Expression, label: string): WarningKind =
  let raw =
    case node.kind
    of identExpr:
      node.token.lexeme
    of strExpr:
      node.token.lexeme[1..^2]
    else:
      self.error(&"'{label}' pragma expects warning names as bare identifiers or strings", node)
      ""
  try:
    result = parseWarningKind(raw)
  except ValueError:
    self.error(&"'{label}' pragma: invalid warning name '{raw}'", node)


proc pragmaWarningKinds*(self: TypeChecker, pragma: Pragma, label: string): seq[WarningKind] =
  if pragma.args.len() != 1:
    self.error(&"'{label}' pragma expects exactly 1 warning name or warning list", pragma)
  let arg = pragma.args[0]
  if arg.kind == arrayExpr:
    for element in ArrayExpr(arg).elements:
      result.add(self.parsePragmaWarningKind(element, label))
  else:
    result.add(self.parsePragmaWarningKind(arg, label))


proc disableWarning*(self: TypeChecker, kind: WarningKind) =
  if kind notin self.disabledWarnings:
    self.disabledWarnings.add(kind)


# hasPragma moved to types.nim


proc handleErrorPragma*(self: TypeChecker, pragma: Pragma, name: Name) =
  ## Handles the "error" pragma
  self.error(self.pragmaMessage(pragma, "error"), if name.isNil(): pragma else: name.node)


proc handleWarnPragma*(self: TypeChecker, pragma: Pragma, name: Name) =
  ## Handles the "warn" pragma
  self.warning(UserWarning, self.pragmaMessage(pragma, pragma.name.token.lexeme), name,
               if name.isNil(): pragma else: name.node)


proc handleDeprecatedPragma*(self: TypeChecker, pragma: Pragma, name: Name) =
  let message =
    if name.isNil() or name.node.isNil() or name.node.kind == NodeKind.funDecl:
      "this function is deprecated"
    else:
      "this object is deprecated"
  self.warning(UserWarning, message, name, if name.isNil(): pragma else: name.node)


proc handleUsedPragma*(self: TypeChecker, pragma: Pragma, name: Name) =
  if pragma.args.len() != 0:
    self.error("'used' pragma does not take arguments", pragma)
  elif name.isNil() or name.node.isNil() or name.node.kind notin {NodeKind.varDecl, NodeKind.funDecl, NodeKind.typeDecl}:
    self.error("'used' pragma is only valid on variables, functions, and types", pragma)
  else:
    self.markResolved(name)


proc handlePurePragma*(self: TypeChecker, pragma: Pragma, name: Name) =
  ## Handles the "pure" pragma
  # TODO: Implement purity tracking for functions/lambdas
  case name.node.kind:
    of NodeKind.funDecl, lambdaExpr:
      discard
    else:
      self.error("'pure' pragma is not valid in this context")


proc handleNoInitPragma*(self: TypeChecker, pragma: Pragma, name: Name) =
  ## Handles the "noinit" pragma
  if pragma.args.len() != 0:
    self.error("'noinit' pragma does not take arguments")
  if name.isNil():
    self.error("'noinit' pragma is not valid in this context", pragma)
  case name.node.kind:
    of NodeKind.typeDecl:
      let typ = name.valueType.unwrapType()
      if typ.pragmas.isNil():
        typ.pragmas = newTable[string, Pragma]()
      typ.pragmas["noinit"] = pragma
    of NodeKind.varDecl:
      discard
    else:
      self.error("'noinit' pragma is not valid in this context", pragma)


proc handleNoReturnPragma*(self: TypeChecker, pragma: Pragma, name: Name) =
  ## Handles the "noreturn" pragma
  if pragma.args.len() != 0:
    self.error("'noreturn' pragma does not take arguments")
  elif name.isNil() or name.node.isNil() or name.node.kind != NodeKind.funDecl:
    self.error("'noreturn' pragma is not valid in this context", pragma)
  else:
    if name.valueType.pragmas.isNil():
      name.valueType.pragmas = newTable[string, Pragma]()
    name.valueType.pragmas["noreturn"] = pragma


proc handleInlinePragma*(self: TypeChecker, pragma: Pragma, name: Name) =
  ## Handles the "inline" pragma
  if pragma.args.len() != 0:
    self.error("'inline' pragma does not take arguments")
  elif name.isNil() or name.node.isNil() or name.node.kind != NodeKind.funDecl:
    self.error("'inline' pragma is only valid on named functions", pragma)
  else:
    if name.valueType.pragmas.isNil():
      name.valueType.pragmas = newTable[string, Pragma]()
    name.valueType.pragmas["inline"] = pragma


proc pragmaExternalSymbol*(self: TypeChecker, pragma: Pragma, label: string): string =
  if pragma.args.len() == 0:
    return ""
  if pragma.args.len() != 1:
    self.error(&"'{label}' pragma expects at most 1 string or bare identifier argument", pragma)
  case pragma.args[0].kind
  of identExpr:
    result = pragma.args[0].token.lexeme
  of strExpr:
    result = pragma.args[0].token.lexeme[1..^2]
  else:
    self.error(&"'{label}' pragma expects a string or bare identifier argument", pragma.args[0])


proc handleImportCPragma*(self: TypeChecker, pragma: Pragma, name: Name) =
  discard self.pragmaExternalSymbol(pragma, "importc")
  if name.isNil() or name.node.isNil() or name.node.kind != NodeKind.funDecl:
    self.error("importc is only valid on top-level forward function declarations", pragma)
  let decl = FunDecl(name.node)
  if not name.owner.isNil() or not decl.body.isNil():
    self.error("importc is only valid on top-level forward function declarations", pragma)
  if hasPragma(decl.pragmas, "exportc"):
    self.error("importc and exportc cannot be combined", pragma)
  if name.valueType.pragmas.isNil():
    name.valueType.pragmas = newTable[string, Pragma]()
  name.valueType.pragmas["importc"] = pragma


proc handleExportCPragma*(self: TypeChecker, pragma: Pragma, name: Name) =
  discard self.pragmaExternalSymbol(pragma, "exportc")
  if name.isNil() or name.node.isNil() or name.node.kind != NodeKind.funDecl:
    self.error("exportc is only valid on top-level named function definitions", pragma)
  let decl = FunDecl(name.node)
  if not name.owner.isNil() or decl.body.isNil():
    self.error("exportc is only valid on top-level named function definitions", pragma)
  if hasPragma(decl.pragmas, "importc"):
    self.error("importc and exportc cannot be combined", pragma)
  if name.valueType.pragmas.isNil():
    name.valueType.pragmas = newTable[string, Pragma]()
  name.valueType.pragmas["exportc"] = pragma


proc handleHeaderPragma*(self: TypeChecker, pragma: Pragma, name: Name) =
  if pragma.args.len() != 1:
    self.error("'header' pragma expects exactly 1 string argument", pragma)
  if pragma.args[0].kind != strExpr:
    self.error("'header' pragma expects exactly 1 string argument", pragma.args[0])
  if name.isNil() or name.node.isNil() or name.node.kind != NodeKind.funDecl or
     not hasPragma(FunDecl(name.node).pragmas, "importc"):
    self.error("header pragma requires importc in the first C interop slice", pragma)
  if name.valueType.pragmas.isNil():
    name.valueType.pragmas = newTable[string, Pragma]()
  name.valueType.pragmas["header"] = pragma


proc handleDefinePragma*(self: TypeChecker, pragma: Pragma, name: Name) =
  ## Handles the "define" pragma
  if pragma.args.len() != 0:
    self.error("'define' pragma does not take arguments")
  elif name.isNil() or name.node.isNil() or name.node.kind != NodeKind.varDecl:
    self.error("'define' pragma is only valid on variables", pragma)


proc handleBoolDefinePragma*(self: TypeChecker, pragma: Pragma, name: Name) =
  ## Handles the "booldefine" pragma
  if pragma.args.len() != 0:
    self.error("'booldefine' pragma does not take arguments")
  elif name.isNil() or name.node.isNil() or name.node.kind != NodeKind.varDecl:
    self.error("'booldefine' pragma is only valid on variables", pragma)


proc requireStandaloneConfigPragma*(self: TypeChecker, pragma: Pragma, name: Name, label: string) =
  if not name.isNil():
    self.error(&"'{label}' pragma is only valid as a standalone directive", pragma)


proc isConfigPragmaName*(name: string): bool =
  name in ["push", "pop", "checks", "boundChecks",
           "overflowCheck", "overflowChecks", "floatCheck", "floatChecks",
           "lineTrace", "stackTrace", "debug", "release", "danger", "noWarn"]


proc pragmaArgValue*(self: TypeChecker, pragma: Pragma, label: string): string =
  if pragma.args.len() == 0:
    return ""
  if pragma.args.len() != 1:
    self.error(&"'{label}' pragma expects at most 1 argument", pragma)
  let arg = pragma.args[0]
  case arg.kind
  of identExpr, intExpr, strExpr, trueExpr, falseExpr:
    result = arg.token.lexeme
  else:
    self.error(&"'{label}' pragma expects a literal or bare identifier argument", arg)


proc pragmaBoolValue*(self: TypeChecker, pragma: Pragma, label: string, defaultValue = true): bool =
  if pragma.args.len() == 0:
    return defaultValue
  let raw = self.pragmaArgValue(pragma, label)
  let normalized =
    if pragma.args[0].kind == strExpr:
      raw[1..^2]
    else:
      raw
  try:
    result = parseBoolDefineValue(normalized)
  except ValueError:
    self.error(&"'{label}' pragma expects a boolean value", pragma.args[0])


proc setResolvedDefine*(self: TypeChecker, pragma: Pragma, defineName: string, value: bool) =
  let normalizedDefine = canonicalCompilerDefineName(defineName)
  if self.userDefines.isNil():
    self.userDefines = newTable[string, UserDefine]()
  self.userDefines[normalizedDefine] = boolUserDefine(value, explicit = true)
  try:
    self.userDefines = resolveUserDefines(self.userDefines)
  except ValueError as exc:
    self.error(exc.msg, pragma)


proc setResolvedBuildMode*(self: TypeChecker, pragma: Pragma, mode: CanonicalBuildMode) =
  if self.userDefines.isNil():
    self.userDefines = newTable[string, UserDefine]()
  for defineName in ["debug", "release", "danger"]:
    self.userDefines.del(defineName)
  self.userDefines[buildModeDefineName(mode)] = boolUserDefine(true, explicit = true)
  try:
    self.userDefines = resolveUserDefines(self.userDefines)
  except ValueError as exc:
    self.error(exc.msg, pragma)


proc handlePushPragma*(self: TypeChecker, pragma: Pragma, name: Name) =
|
||||
self.requireStandaloneConfigPragma(pragma, name, "push")
|
||||
if pragma.args.len() != 0:
|
||||
self.error("'push' pragma does not take arguments", pragma)
|
||||
if not self.pushedUserDefines.isNil():
|
||||
let pushLine =
|
||||
if self.pushedConfigNode.isNil() or self.pushedConfigNode.token.isNil():
|
||||
"?"
|
||||
else:
|
||||
$self.pushedConfigNode.token.line
|
||||
self.error(&"cannot push a new configuration context before popping the one opened at line {pushLine}", pragma)
|
||||
self.pushedUserDefines = cloneUserDefines(self.userDefines)
|
||||
self.pushedDisabledWarnings = cloneDisabledWarnings(self.disabledWarnings)
|
||||
self.pushedConfigNode = pragma
|
||||
|
||||
|
||||
proc handlePopPragma*(self: TypeChecker, pragma: Pragma, name: Name) =
|
||||
self.requireStandaloneConfigPragma(pragma, name, "pop")
|
||||
if pragma.args.len() != 0:
|
||||
self.error("'pop' pragma does not take arguments", pragma)
|
||||
if self.pushedUserDefines.isNil():
|
||||
self.error("'pop' pragma does not have a matching 'push'", pragma)
|
||||
self.userDefines = cloneUserDefines(self.pushedUserDefines)
|
||||
self.disabledWarnings = self.pushedDisabledWarnings
|
||||
self.pushedUserDefines = nil
|
||||
self.pushedDisabledWarnings = @[]
|
||||
self.pushedConfigNode = nil
|
||||
|
||||
|
||||
proc handleChecksConfigPragma*(self: TypeChecker, pragma: Pragma, name: Name) =
|
||||
self.requireStandaloneConfigPragma(pragma, name, "checks")
|
||||
self.setResolvedDefine(pragma, "checks", self.pragmaBoolValue(pragma, "checks"))
|
||||
|
||||
|
||||
proc handleBoundChecksConfigPragma*(self: TypeChecker, pragma: Pragma, name: Name) =
|
||||
self.requireStandaloneConfigPragma(pragma, name, "boundChecks")
|
||||
self.setResolvedDefine(pragma, "boundChecks", self.pragmaBoolValue(pragma, "boundChecks"))
|
||||
|
||||
|
||||
proc handleOverflowCheckConfigPragma*(self: TypeChecker, pragma: Pragma, name: Name) =
|
||||
self.requireStandaloneConfigPragma(pragma, name, "overflowChecks")
|
||||
self.setResolvedDefine(pragma, "overflowChecks", self.pragmaBoolValue(pragma, "overflowChecks"))
|
||||
|
||||
|
||||
proc handleFloatCheckConfigPragma*(self: TypeChecker, pragma: Pragma, name: Name) =
|
||||
self.requireStandaloneConfigPragma(pragma, name, "floatChecks")
|
||||
self.setResolvedDefine(pragma, "floatChecks", self.pragmaBoolValue(pragma, "floatChecks"))
|
||||
|
||||
|
||||
proc handleLineTraceConfigPragma*(self: TypeChecker, pragma: Pragma, name: Name) =
|
||||
self.requireStandaloneConfigPragma(pragma, name, "lineTrace")
|
||||
self.setResolvedDefine(pragma, "lineTrace", self.pragmaBoolValue(pragma, "lineTrace"))
|
||||
|
||||
|
||||
proc handleStackTraceConfigPragma*(self: TypeChecker, pragma: Pragma, name: Name) =
|
||||
self.requireStandaloneConfigPragma(pragma, name, "stackTrace")
|
||||
self.setResolvedDefine(pragma, "stackTrace", self.pragmaBoolValue(pragma, "stackTrace"))
|
||||
|
||||
|
||||
proc handleDebugConfigPragma*(self: TypeChecker, pragma: Pragma, name: Name) =
|
||||
self.requireStandaloneConfigPragma(pragma, name, "debug")
|
||||
if not self.pragmaBoolValue(pragma, "debug"):
|
||||
self.error("'debug' pragma cannot be set to off; use 'release' or 'danger' instead", pragma)
|
||||
self.setResolvedBuildMode(pragma, CanonicalBuildMode.DebugBuildMode)
|
||||
|
||||
|
||||
proc handleReleaseConfigPragma*(self: TypeChecker, pragma: Pragma, name: Name) =
|
||||
self.requireStandaloneConfigPragma(pragma, name, "release")
|
||||
if not self.pragmaBoolValue(pragma, "release"):
|
||||
self.error("'release' pragma cannot be set to off; use 'debug' or 'danger' instead", pragma)
|
||||
self.setResolvedBuildMode(pragma, CanonicalBuildMode.ReleaseBuildMode)
|
||||
|
||||
|
||||
proc handleDangerConfigPragma*(self: TypeChecker, pragma: Pragma, name: Name) =
|
||||
self.requireStandaloneConfigPragma(pragma, name, "danger")
|
||||
if not self.pragmaBoolValue(pragma, "danger"):
|
||||
self.error("'danger' pragma cannot be set to off; use 'debug' or 'release' instead", pragma)
|
||||
self.setResolvedBuildMode(pragma, CanonicalBuildMode.DangerBuildMode)
|
||||
|
||||
|
||||
proc handleNoWarnConfigPragma*(self: TypeChecker, pragma: Pragma, name: Name) =
|
||||
self.requireStandaloneConfigPragma(pragma, name, "noWarn")
|
||||
for kind in self.pragmaWarningKinds(pragma, "noWarn"):
|
||||
self.disableWarning(kind)
|
||||
|
||||
|
||||
proc dispatchPragmas*(self: TypeChecker, name: Name) =
|
||||
## Dispatches pragmas bound to objects
|
||||
if name.node.isNil():
|
||||
return
|
||||
var pragmas: seq[Pragma] = @[]
|
||||
case name.node.kind:
|
||||
of NodeKind.funDecl, NodeKind.typeDecl, NodeKind.varDecl:
|
||||
pragmas = Declaration(name.node).pragmas
|
||||
else:
|
||||
discard # TODO
|
||||
var f: PragmaFunc
|
||||
for pragma in pragmas:
|
||||
if pragma.name.token.lexeme notin self.pragmas:
|
||||
self.error(&"unknown pragma '{pragma.name.token.lexeme}'")
|
||||
if name.node.kind == NodeKind.funDecl and isConfigPragmaName(pragma.name.token.lexeme):
|
||||
continue
|
||||
f = self.pragmas[pragma.name.token.lexeme]
|
||||
if f.kind != Immediate:
|
||||
continue
|
||||
f.handler(self, pragma, name)
|
||||
|
||||
|
||||
proc dispatchDelayedPragmas*(self: TypeChecker, name: Name) {.used.} =
|
||||
## Dispatches pragmas bound to objects once they
|
||||
## are used
|
||||
if name.node.isNil():
|
||||
return
|
||||
var pragmas: seq[Pragma] = @[]
|
||||
pragmas = Declaration(name.node).pragmas
|
||||
var f: PragmaFunc
|
||||
for pragma in pragmas:
|
||||
if pragma.name.token.lexeme notin self.pragmas:
|
||||
self.error(&"unknown pragma '{pragma.name.token.lexeme}'")
|
||||
f = self.pragmas[pragma.name.token.lexeme]
|
||||
if f.kind == Immediate:
|
||||
continue
|
||||
f.handler(self, pragma, name)
|
||||
1012 src/frontend/compiler/type_checking/resolution.nim (new file)
File diff suppressed because it is too large
363 src/frontend/compiler/type_checking/type_utils.nim (new file)
@@ -0,0 +1,363 @@
# Copyright 2026 Mattia Giambirtone & All Contributors
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#     http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

import frontend/compiler/type_checking/types

import std/strutils
import std/strformat
import std/parseutils
import std/tables


func toIntrinsic*(name: string): Type


proc wrapType*(self: Type): Type {.inline.} =
    ## Wraps a type in a typevar
    case self.kind:
        of Union:
            result = Type(kind: Union, constraints: @[], displayName: self.displayName)
            for typ in self.constraints:
                result.constraints.add((match: typ.match, kind: typ.kind.wrapType(), value: typ.value))
        else:
            result = Type(kind: Typevar, wrapped: self)


func toRef*(self: Type): Type {.inline.} =
    Type(kind: Reference, value: self, mutable: false, generics: self.generics,
         genericParams: self.genericParams, pragmas: self.pragmas)

func toPtr*(self: Type): Type {.inline.} =
    Type(kind: Pointer, value: self, generics: self.generics,
         genericParams: self.genericParams, pragmas: self.pragmas)

func toLent*(self: Type): Type {.inline.} =
    result = "lent".toIntrinsic()
    result.value = self
    result.mutable = false
    result.generics = self.generics
    result.genericParams = self.genericParams
    result.pragmas = self.pragmas

func toConst*(self: Type): Type {.inline.} =
    result = "const".toIntrinsic()
    result.value = self
    result.generics = self.generics
    result.genericParams = self.genericParams
    result.pragmas = self.pragmas

func withHandleMutability*(self: Type, mutable: bool): Type {.inline.} =
    if self.isNil():
        return nil
    let unwrapped = self.unwrapType()
    if unwrapped.kind notin {Reference, TypeKind.Lent}:
        return self
    result = unwrapped.deepCopy()
    result.mutable = mutable


func normalizedBorrowedHandle*(self: Type): Type {.inline.} =
    if self.isNil():
        return nil
    let unwrapped = self.unwrapType()
    result = unwrapped.deepCopy()
    if result.kind != TypeKind.Lent or result.value.isNil():
        return result
    let inner = result.value.unwrapType()
    if inner.kind == Reference:
        result.value = inner.withHandleMutability(false)


proc validateRawPointerPointee*(self: TypeChecker, target: Type, node: ASTNode) =
    if target.isNil():
        return
    let unwrapped = target.unwrapType()
    case unwrapped.kind
    of Reference:
        self.error("raw pointers cannot point to managed ref types", node)
    of TypeKind.Lent:
        self.error("raw pointers cannot point to lent types", node)
    of UncheckedArray:
        self.validateRawPointerPointee(unwrapped.elementType, node)
    of TypeKind.Const:
        self.validateRawPointerPointee(unwrapped.value, node)
    else:
        discard


proc supportsManagedRefStorage*(self: TypeChecker, storage: Type): bool {.inline.} =
    if storage.isNil():
        return false
    let unwrapped = storage.unwrapType()
    case unwrapped.kind
    of Structure:
        not unwrapped.isEnum
    of Array, Integer, Float, Boolean, Byte, Char:
        true
    of CInterop:
        unwrapped.ckind != CVoid
    else:
        false


func toCInterop*(name: string): Type =
    case name
    of "cvoid":
        Type(kind: CInterop, ckind: CVoid)
    of "cchar":
        Type(kind: CInterop, ckind: CChar)
    of "cschar":
        Type(kind: CInterop, ckind: CSChar)
    of "cuchar":
        Type(kind: CInterop, ckind: CUChar)
    of "cshort":
        Type(kind: CInterop, ckind: CShort)
    of "cushort":
        Type(kind: CInterop, ckind: CUShort)
    of "cint":
        Type(kind: CInterop, ckind: CInt)
    of "cuint":
        Type(kind: CInterop, ckind: CUInt)
    of "clong":
        Type(kind: CInterop, ckind: CLong)
    of "culong":
        Type(kind: CInterop, ckind: CULong)
    of "clonglong":
        Type(kind: CInterop, ckind: CLongLong)
    of "culonglong":
        Type(kind: CInterop, ckind: CULongLong)
    of "cfloat":
        Type(kind: CInterop, ckind: CFloat)
    of "cdouble":
        Type(kind: CInterop, ckind: CDouble)
    of "cbool":
        Type(kind: CInterop, ckind: CBool)
    of "csize":
        Type(kind: CInterop, ckind: CSize)
    of "cptrdiff":
        Type(kind: CInterop, ckind: CPtrdiff)
    of "cstring":
        Type(kind: CInterop, ckind: CString)
    else:
        raise newException(ValueError, &"invalid C interop intrinsic '{name}'")


func cinteropIntegerInfo*(kind: CInteropKind): tuple[signed: bool, bits: int] =
    case kind
    of CChar:
        (signed: low(cchar) < cchar(0), bits: int(sizeof(cchar)) * 8)
    of CSChar:
        (signed: true, bits: int(sizeof(cschar)) * 8)
    of CUChar:
        (signed: false, bits: int(sizeof(uint8)) * 8)
    of CShort:
        (signed: true, bits: int(sizeof(cshort)) * 8)
    of CUShort:
        (signed: false, bits: int(sizeof(cushort)) * 8)
    of CInt:
        (signed: true, bits: int(sizeof(cint)) * 8)
    of CUInt:
        (signed: false, bits: int(sizeof(cuint)) * 8)
    of CLong:
        (signed: true, bits: int(sizeof(clong)) * 8)
    of CULong:
        (signed: false, bits: int(sizeof(culong)) * 8)
    of CLongLong:
        (signed: true, bits: int(sizeof(clonglong)) * 8)
    of CULongLong:
        (signed: false, bits: int(sizeof(culonglong)) * 8)
    of CSize:
        (signed: false, bits: int(sizeof(csize_t)) * 8)
    of CPtrdiff:
        (signed: true, bits: int(sizeof(pointer)) * 8)
    else:
        (signed: false, bits: 0)


func toIntrinsic*(name: string): Type =
    ## Converts a string to an intrinsic
    ## type
    case name:
        of "typevar":
            return Type(kind: Typevar,
                        intrinsic: true,
                        wrapped: "any".toIntrinsic(),
                        genericParams: @[TypeGenericParam(name: "T", kind: OpaqueTypeParam)])
        of "array":
            return Type(kind: Array,
                        intrinsic: true,
                        length: -1,
                        elementType: "any".toIntrinsic(),
                        generics: newOrderedTable[string, Type](),
                        genericParams: @[
                            TypeGenericParam(name: "N", kind: ConstParam, valueType: "int64".toIntrinsic()),
                            TypeGenericParam(name: "T", kind: OpaqueTypeParam)
                        ],
                        pragmas: newTable[string, Pragma]())
        of "UncheckedArray":
            return Type(kind: UncheckedArray,
                        intrinsic: true,
                        length: -1,
                        elementType: "any".toIntrinsic(),
                        generics: newOrderedTable[string, Type](),
                        genericParams: @[
                            TypeGenericParam(name: "T", kind: OpaqueTypeParam)
                        ],
                        pragmas: newTable[string, Pragma]())
        of "any":
            return Type(kind: Any)
        of "int64", "i64":
            return Type(kind: Integer, size: LongLong, signed: true)
        of "uint64", "u64":
            return Type(kind: Integer, size: LongLong, signed: false)
        of "int32", "i32":
            return Type(kind: Integer, size: Long, signed: true)
        of "uint32", "u32":
            return Type(kind: Integer, size: Long, signed: false)
        of "int16", "i16":
            return Type(kind: Integer, size: Short, signed: true)
        of "uint16", "u16":
            return Type(kind: Integer, size: Short, signed: false)
        of "int8", "i8":
            return Type(kind: Integer, size: Tiny, signed: true)
        of "uint8", "u8":
            return Type(kind: Integer, size: Tiny, signed: false)
        of "float", "float64", "f64":
            return Type(kind: Float, width: Full)
        of "float32", "f32":
            return Type(kind: Float, width: Half)
        of "byte":
            return Type(kind: Byte)
        of "char":
            return Type(kind: Char)
        of "NaN", "nan":
            return Type(kind: TypeKind.Nan)
        of "Inf", "inf":
            return Type(kind: Infinity, positive: true)
        of "NegInf", "neginf", "-inf":
            return Type(kind: Infinity)
        of "bool":
            return Type(kind: Boolean)
        of "string":
            return Type(kind: String)
        of "cvoid", "cchar", "cschar", "cuchar", "cshort", "cushort", "cint",
           "cuint", "clong", "culong", "clonglong", "culonglong", "cfloat",
           "cdouble", "cbool", "csize", "cptrdiff", "cstring":
            return toCInterop(name)
        of "pointer":
            return Type(kind: Pointer, value: "any".toIntrinsic())
        of "lent":
            return Type(kind: TypeKind.Lent, value: "any".toIntrinsic())
        of "const":
            return Type(kind: TypeKind.Const, value: "any".toIntrinsic())
        of "ref":
            return Type(kind: Reference, value: "any".toIntrinsic())
        else:
            raise newException(ValueError, &"invalid intrinsic '{name}'")


func intrinsicTypeKey*(typ: Type): string =
    let target = typ.unwrapType()
    if target.isNil():
        return ""
    case target.kind
    of Integer:
        case target.size
        of LongLong:
            return if target.signed: "int64" else: "uint64"
        of Long:
            return if target.signed: "int32" else: "uint32"
        of Short:
            return if target.signed: "int16" else: "uint16"
        of Tiny:
            return if target.signed: "int8" else: "uint8"
    of Float:
        return if target.width == Full: "float64" else: "float32"
    of CInterop:
        return cinteropName(target.ckind)
    of Boolean:
        return "bool"
    of Byte:
        return "byte"
    of Char:
        return "char"
    of String:
        return "string"
    of TypeKind.Nan:
        return "NaN"
    of Infinity:
        return if target.positive: "Inf" else: ""
    of Typevar:
        return "typevar"
    of Array:
        return "array"
    of Pointer:
        return "pointer"
    of UncheckedArray:
        return "UncheckedArray"
    else:
        return ""


proc isTypeSet*(self: TypeChecker, typ: Type): bool =
    ## Returns whether the given type represents
    ## a set of types
    case typ.kind:
        of Typevar, Union, Generic:
            return true
        else:
            return false


proc constructorPayloadType*(resultType: Type): Type =
    ## Returns the payload-bearing type for construction checks
    case resultType.kind:
        of Reference:
            return resultType.value
        else:
            return resultType


proc containsDeferredGeneric*(typ: Type): bool =
    if typ.isNil():
        return false
    let target = typ.unwrapType()
    case target.kind
    of Generic:
        return true
    of Reference, Pointer, TypeKind.Lent, TypeKind.Const:
        return containsDeferredGeneric(target.value)
    of Array, UncheckedArray:
        return containsDeferredGeneric(target.elementType) or
               containsDeferredGeneric(target.lengthParam)
    of Structure, EnumEntry:
        for generic in target.generics.values():
            if containsDeferredGeneric(generic):
                return true
        return false
    of Function:
        if containsDeferredGeneric(target.returnType):
            return true
        for parameter in target.signature:
            if containsDeferredGeneric(parameter.kind):
                return true
        return false
    of Union:
        for constraint in target.constraints:
            if containsDeferredGeneric(constraint.kind):
                return true
        return false
    else:
        return false
638 src/frontend/compiler/type_checking/types.nim (new file)
@@ -0,0 +1,638 @@
|
||||
# Copyright 2026 Mattia Giambirtone & All Contributors
|
||||
#
|
||||
# Licensed under the Apache License, Version 2.0 (the "License");
|
||||
# you may not use this file except in compliance with the License.
|
||||
# You may obtain a copy of the License at
|
||||
#
|
||||
# http://www.apache.org/licenses/LICENSE-2.0
|
||||
#
|
||||
# Unless required by applicable law or agreed to in writing, software
|
||||
# distributed under the License is distributed on an "AS IS" BASIS,
|
||||
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
|
||||
# See the License for the specific language governing permissions and
|
||||
# limitations under the License.
|
||||
|
||||
import frontend/compiler/type_system
|
||||
import frontend/compiler/module_loader
|
||||
import frontend/parsing/parser
|
||||
import frontend/parsing/lexer
|
||||
|
||||
import std/os
|
||||
import std/sets
|
||||
import std/tables
|
||||
import std/terminal
|
||||
import std/strutils
|
||||
import std/sequtils
|
||||
import std/strformat
|
||||
|
||||
export type_system # which re-exports ast, errors
|
||||
|
||||
|
||||
type
|
||||
UserDefine* = object
|
||||
explicit*: bool
|
||||
hasValue*: bool
|
||||
value*: string
|
||||
|
||||
CanonicalBuildMode* = enum
|
||||
DebugBuildMode,
|
||||
ReleaseBuildMode,
|
||||
DangerBuildMode
|
||||
|
||||
CheckDefineOverrides* = object
|
||||
hasBuildModeOverride*: bool
|
||||
buildMode*: CanonicalBuildMode
|
||||
hasChecksOverride*: bool
|
||||
checks*: bool
|
||||
hasLineTraceOverride*: bool
|
||||
lineTrace*: bool
|
||||
hasStackTraceOverride*: bool
|
||||
stackTrace*: bool
|
||||
hasBoundChecksOverride*: bool
|
||||
boundChecks*: bool
|
||||
hasOverflowCheckOverride*: bool
|
||||
overflowCheck*: bool
|
||||
hasFloatCheckOverride*: bool
|
||||
floatCheck*: bool
|
||||
|
||||
TypeCheckError* = ref object of PeonException
|
||||
## A typechecking error with location information
|
||||
node*: ASTNode
|
||||
function*: Declaration
|
||||
# The instance of the typechecker that
|
||||
# raised the error
|
||||
instance*: TypeChecker
|
||||
|
||||
PragmaFunc* = object
|
||||
## An internal function called
|
||||
## by pragmas
|
||||
kind*: PragmaKind
|
||||
handler*: proc (self: TypeChecker, pragma: Pragma, name: Name)
|
||||
|
||||
PragmaKind* = enum
|
||||
## An enumeration of pragma types
|
||||
Immediate,
|
||||
Delayed
|
||||
|
||||
LentOriginKind* = enum
|
||||
LocalLentOrigin,
|
||||
ParamLentOrigin,
|
||||
GlobalLentOrigin,
|
||||
UnknownLentOrigin
|
||||
|
||||
LentOrigin* = object
|
||||
kind*: LentOriginKind
|
||||
name*: Name
|
||||
paramIndex*: int
|
||||
|
||||
LentOriginSet* = seq[LentOrigin]
|
||||
|
||||
VarDefinePragmaKind* = enum
|
||||
NoDefinePragma,
|
||||
DefinePragma,
|
||||
BoolDefinePragma,
|
||||
ConflictingDefinePragmas
|
||||
|
||||
TypeComparePair* = tuple[left: uint, right: uint]
|
||||
|
||||
TypeChecker* = ref object
|
||||
## The Peon type checker
|
||||
current*: int # The current node we're looking at
|
||||
tree*: ParseTree # The AST for the current module
|
||||
scopeDepth*: int # The current scope depth (0 == global, > 0 == local)
|
||||
source*: string # The module's raw source code
|
||||
file*: string # The module's filename
|
||||
entryModulePath*: string
|
||||
isMainModule*: bool # Are we the main module?
|
||||
currentFunction*: Name # The current function we're checking
|
||||
currentModule*: Name # The current module we're checking
|
||||
disabledWarnings*: seq[WarningKind] # List of disabled warnings
|
||||
pushedDisabledWarnings*: seq[WarningKind]
|
||||
names*: seq[TableRef[string, seq[Name]]] # Maps scope depths to namespaces
|
||||
# Internal procedures called by pragmas
|
||||
pragmas*: TableRef[string, PragmaFunc]
|
||||
# Show full info about type mismatches when dispatching
|
||||
# function calls fails (we hide this under a boolean flag
|
||||
# because the output is quite verbose)
|
||||
showMismatches*: bool
|
||||
capabilities*: TableRef[string, seq[TypeCapabilities]]
|
||||
intrinsicInterfaces*: TableRef[string, seq[Type]]
|
||||
moduleLoader*: ModuleLoader
|
||||
typedModules*: TableRef[string, seq[TypedNode]]
|
||||
moduleExports*: TableRef[string, seq[Name]]
|
||||
knownModules*: TableRef[string, Name]
|
||||
resolvedImports*: TableRef[string, TypedImportStmt]
|
||||
typingModules*: HashSet[string]
|
||||
movedNames*: HashSet[uint]
|
||||
consumedMoves*: HashSet[uint]
|
||||
initializedNames*: HashSet[uint]
|
||||
nameInits*: TableRef[uint, TypedExpr]
|
||||
typedFunctions*: TableRef[uint, TypedFunDecl]
|
||||
lentReturnSummaries*: TableRef[uint, LentOriginSet]
|
||||
loopDepth*: int
|
||||
namedBlocks*: seq[string]
|
||||
lambdaCounter*: int
|
||||
baseUserDefines*: TableRef[string, UserDefine]
|
||||
userDefines*: TableRef[string, UserDefine]
|
||||
pushedUserDefines*: TableRef[string, UserDefine]
|
||||
pushedConfigNode*: ASTNode
|
||||
# Temporarily stores deferred overload candidates when a call
|
||||
# involves a Generic type parameter and multiple overloads
|
||||
# match equally. Consumed by lowerResolvedCall to populate
|
||||
# TypedCallExpr.deferredCandidates.
|
||||
pendingDeferredCandidates*: seq[Name]
|
||||
# Proc-var hooks to break import cycles between submodules.
|
||||
# Set by the main type_checking entry-point module.
|
||||
expressionImpl*: proc(self: TypeChecker, node: Expression): TypedExpr
|
||||
attachConstructContextImpl*: proc(self: TypeChecker, construct: TypedConstructExpr, expected: Type)
|
||||
inferConstructGenericArgsImpl*: proc(self: TypeChecker, callee: Name, expected: Type): seq[GenericArg]
|
||||
instantiateDeclaredTypeImpl*: proc(self: TypeChecker, templateDecl: Name, typ: Type,
|
||||
genericArgs: seq[GenericArg], node: ASTNode): Type
|
||||
dispatchPragmasImpl*: proc(self: TypeChecker, name: Name)
|
||||
matchImpl*: proc(self: TypeChecker, name: string, sig: TypeSignature,
|
||||
args: seq[TypedExpr], node: ASTNode): Name
|
||||
typecheckImpl*: proc(self: TypeChecker, node: ASTNode): TypedNode
|
||||
funDeclImpl*: proc(self: TypeChecker, node: FunDecl, name: Name): TypedFunDecl
|
||||
typecheckModuleImpl*: proc(self: TypeChecker, module: LoadedModule): seq[TypedNode]
|
||||
getOrCreateModuleImpl*: proc(self: TypeChecker, path: string): Name
|
||||
recordModuleExportImpl*: proc(self: TypeChecker, name: Name)
|
||||
findModuleExportsImpl*: proc(self: TypeChecker, modulePath, symbol: string): seq[Name]
|
||||
bindImportImpl*: proc(self: TypeChecker, imported: Name, importer: Name)
|
||||
declareModuleAliasImpl*: proc(self: TypeChecker, importNode: ImportStmt, importedModule: Name): Name
|
||||
declareImportedAliasImpl*: proc(self: TypeChecker, imported: Name, localName: IdentExpr): Name
|
||||
|
||||
|
||||
# Forward declaration needed by getCurrentNode
|
||||
proc done*(self: TypeChecker): bool {.inline.}
|
||||
|
||||
# Public getters for nicer error formatting
|
||||
proc getCurrentNode*(self: TypeChecker): ASTNode = (if self.done(): self.tree[^1] else: self.tree[self.current - 1])
|
||||
proc getCurrentFunction*(self: TypeChecker): Declaration {.inline.} = (if self.currentFunction.isNil(): nil else: self.currentFunction.node)
|
||||
proc getSource*(self: TypeChecker): string {.inline.} = self.source
|
||||
proc getFile*(self: TypeChecker): string {.inline.} = self.file
|
||||
proc setCurrentModule*(self: TypeChecker, module: Name) {.inline.} = self.currentModule = module
|
||||
|
||||
|
||||
proc newTypeCheckerBase*(resolvedDefines: TableRef[string, UserDefine] = nil): TypeChecker =
|
||||
new(result)
|
||||
result.current = 0
|
||||
result.tree = @[]
|
||||
result.scopeDepth = 0
|
||||
result.source = ""
|
||||
result.file = ""
|
||||
result.entryModulePath = ""
|
||||
result.isMainModule = false
|
||||
result.currentFunction = nil
|
||||
result.disabledWarnings = @[]
|
||||
result.names = @[]
|
||||
result.capabilities = newTable[string, seq[TypeCapabilities]]()
|
||||
result.intrinsicInterfaces = newTable[string, seq[Type]]()
|
||||
result.moduleLoader = nil
|
||||
result.typedModules = newTable[string, seq[TypedNode]]()
|
||||
result.moduleExports = newTable[string, seq[Name]]()
|
||||
result.knownModules = newTable[string, Name]()
|
||||
result.resolvedImports = newTable[string, TypedImportStmt]()
|
||||
result.typingModules = initHashSet[string]()
|
||||
result.movedNames = initHashSet[uint]()
|
||||
result.consumedMoves = initHashSet[uint]()
|
||||
result.initializedNames = initHashSet[uint]()
|
||||
result.nameInits = newTable[uint, TypedExpr]()
|
||||
result.typedFunctions = newTable[uint, TypedFunDecl]()
|
||||
result.lentReturnSummaries = newTable[uint, LentOriginSet]()
|
||||
result.loopDepth = 0
|
||||
result.namedBlocks = @[]
|
||||
result.lambdaCounter = 0
|
||||
result.baseUserDefines = resolvedDefines
|
||||
result.userDefines = resolvedDefines
|
||||
result.pushedUserDefines = nil
|
||||
result.pushedConfigNode = nil
|
||||
result.pushedDisabledWarnings = @[]
|
||||
result.pragmas = newTable[string, PragmaFunc]()
|
||||
result.showMismatches = false
|
||||
result.pendingDeferredCandidates = @[]
|
||||
result.expressionImpl = nil
|
||||
|
||||
|
||||
proc `$`*(self: Name): string = $(self[])
|
||||
proc `$`*(self: Type): string = $(self[])
|
||||
|
||||
|
||||
# Cursor helpers
|
||||
|
||||
proc done*(self: TypeChecker): bool {.inline.} = self.current == self.tree.len()
|
||||
|
||||
|
||||
proc peek*(self: TypeChecker): ASTNode {.inline.} =
|
||||
if self.tree.len() == 0:
|
||||
return nil
|
||||
if self.done():
|
||||
return self.tree[^1]
|
||||
return self.tree[self.current]
|
||||
|
||||
|
||||
proc step*(self: TypeChecker): ASTNode {.inline.} =
|
||||
if self.tree.len() == 0:
|
||||
return nil
|
||||
if self.done():
|
||||
return self.tree[^1]
|
||||
result = self.peek()
|
||||
inc(self.current)
|
||||
|
||||
|
||||
# LentOrigin constructors and set operations

func localLentOrigin*(name: Name = nil): LentOrigin {.inline.} =
    LentOrigin(kind: LocalLentOrigin, name: name)


func paramLentOrigin*(index: int): LentOrigin {.inline.} =
    LentOrigin(kind: ParamLentOrigin, paramIndex: index)


func globalLentOrigin*(name: Name = nil): LentOrigin {.inline.} =
    LentOrigin(kind: GlobalLentOrigin, name: name)


func unknownLentOrigin*(): LentOrigin {.inline.} =
    LentOrigin(kind: UnknownLentOrigin)


proc sameLentOrigin*(a, b: LentOrigin): bool =
    if a.kind != b.kind:
        return false
    case a.kind:
        of ParamLentOrigin:
            a.paramIndex == b.paramIndex
        of LocalLentOrigin, GlobalLentOrigin:
            a.name == b.name
        of UnknownLentOrigin:
            true


proc addLentOrigin*(origins: var LentOriginSet, origin: LentOrigin) =
    for existing in origins:
        if sameLentOrigin(existing, origin):
            return
    origins.add(origin)


proc addLentOrigins*(origins: var LentOriginSet, extra: LentOriginSet) =
    for origin in extra:
        origins.addLentOrigin(origin)


proc sameLentOrigins*(a, b: LentOriginSet): bool =
    if a.len != b.len:
        return false
    for origin in a:
        var found = false
        for candidate in b:
            if sameLentOrigin(origin, candidate):
                found = true
                break
        if not found:
            return false
    true


proc containsLocalLentOrigin*(origins: LentOriginSet): bool =
    for origin in origins:
        if origin.kind == LocalLentOrigin:
            return true
    false


proc containsUnknownLentOrigin*(origins: LentOriginSet): bool =
    for origin in origins:
        if origin.kind == UnknownLentOrigin:
            return true
    false


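The `LentOrigin` helpers above implement a small deduplicated vector-set: insertion scans linearly and drops structural duplicates, and set equality is order-insensitive. A standalone sketch of those semantics (simplified stand-in types, not the real `LentOrigin`/`Name` definitions):

```nim
# Simplified model: origins keyed by kind plus an integer payload instead of
# Name/paramIndex; the set is a plain seq with linear-scan deduplication.
type
    SketchKind = enum LocalO, ParamO, GlobalO, UnknownO
    SketchOrigin = object
        kind: SketchKind
        payload: int

proc same(a, b: SketchOrigin): bool =
    a.kind == b.kind and (a.kind == UnknownO or a.payload == b.payload)

proc addOrigin(origins: var seq[SketchOrigin], origin: SketchOrigin) =
    for existing in origins:
        if same(existing, origin):
            return  # duplicate: leave the set unchanged
    origins.add(origin)

var s: seq[SketchOrigin]
s.addOrigin(SketchOrigin(kind: ParamO, payload: 0))
s.addOrigin(SketchOrigin(kind: ParamO, payload: 0))  # duplicate, ignored
s.addOrigin(SketchOrigin(kind: LocalO, payload: 1))
assert s.len == 2
```

A seq-backed set keeps insertion order and avoids hashing `ref` objects, which matters here because origin identity is structural, not pointer-based.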
# Utility helpers

proc cloneDisabledWarnings*(warnings: seq[WarningKind]): seq[WarningKind] =
    for warning in warnings:
        result.add(warning)


proc isEntryModule*(self: TypeChecker): bool =
    if self.entryModulePath.len == 0:
        return true
    if self.file in ["", "<string>"] or self.entryModulePath == "<string>":
        return self.file == self.entryModulePath
    absolutePath(self.file) == self.entryModulePath


# Error and warning reporting

proc error*(self: TypeChecker, message: string, node: ASTNode = nil) {.inline.} =
    ## Raises a TypeCheckError exception
    let currentNode = self.getCurrentNode()
    let node = if node.isNil(): currentNode else: node
    let function = if not self.currentFunction.isNil(): self.currentFunction.node else: nil
    let line =
        if not node.isNil() and not node.token.isNil() and node.token.line > 0:
            node.token.line
        elif not currentNode.isNil() and not currentNode.token.isNil():
            currentNode.token.line
        else:
            -1
    let file =
        if not node.isNil() and node.file.len() > 0:
            node.file
        elif not currentNode.isNil() and currentNode.file.len() > 0:
            currentNode.file
        else:
            self.file
    raise TypeCheckError(msg: message, node: node, line: line, file: file, instance: self, function: function)


proc warning*(self: TypeChecker, kind: WarningKind, message: string, name: Name = nil, node: ASTNode = nil) =
    ## Emits a warning to stderr, unless the given warning kind is disabled
    if kind in self.disabledWarnings:
        return
    var node: ASTNode = node
    var fn: Declaration
    if name.isNil():
        if node.isNil():
            node = self.getCurrentNode()
        fn = self.getCurrentFunction()
    else:
        node = name.node
        if node.isNil():
            node = self.getCurrentNode()
        if not name.owner.isNil():
            fn = name.owner.node
        else:
            fn = self.getCurrentFunction()
    var file = self.file
    if not name.isNil():
        file =
            if not name.owner.isNil():
                name.owner.file
            else:
                name.file
    var pos = node.getRelativeBoundaries()
    if file notin ["<string>", ""]:
        file = relativePath(file, getCurrentDir())
    stderr.styledWrite(fgYellow, styleBright, "Warning in ", fgRed, &"{file}:{node.token.line}:{pos.start}")
    if not fn.isNil() and fn.kind == funDecl:
        stderr.styledWrite(fgYellow, styleBright, " in function ", fgRed, FunDecl(fn).name.token.lexeme)
    stderr.styledWriteLine(styleBright, fgYellow, &" ({kind}): ", fgDefault, message)
    try:
        # We try to be as specific as possible with the warning message, pointing to the
        # line it belongs to, but since warnings are not always raised from the source
        # file they're generated in, we take into account the fact that retrieving the
        # exact warning location may fail and bail out silently if it does
        let line = self.source.splitLines()[node.token.line - 1].strip(chars={'\n'})
        stderr.styledWrite(fgYellow, styleBright, "Source line: ", resetStyle, fgDefault, line[0..<pos.start])
        stderr.styledWrite(fgYellow, styleUnderscore, line[pos.start..pos.stop])
        stderr.styledWriteLine(fgDefault, line[pos.stop + 1..^1])
    except IndexDefect:
        # Something probably went wrong (wrong line metadata): bad idea to crash!
        stderr.styledWriteLine(resetStyle, fgRed, "Failed to retrieve line information")


# Initialization tracking

func requiresInitializationTracking*(name: Name): bool =
    not name.isNil() and
    name.kind == NameKind.Var and
    not name.node.isNil() and
    name.node.kind == NodeKind.varDecl


proc markInitialized*(self: TypeChecker, name: Name) =
    if name.isNil():
        return
    self.initializedNames.incl(name.nameKey())


proc clearInitialized*(self: TypeChecker, name: Name) =
    if name.isNil():
        return
    self.initializedNames.excl(name.nameKey())


proc isInitialized*(self: TypeChecker, name: Name): bool =
    if not requiresInitializationTracking(name):
        return true
    name.nameKey() in self.initializedNames


proc ensureInitialized*(self: TypeChecker, name: Name, node: ASTNode) =
    if self.isInitialized(name):
        return
    self.error(&"cannot use '{name.ident.token.lexeme}' before it is initialized", node)


proc intersectInitialized*(a, b: HashSet[uint]): HashSet[uint] =
    result = initHashSet[uint]()
    for key in a:
        if key in b:
            result.incl(key)


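`intersectInitialized` supports branch merging: after an `if`/`else`, a variable only counts as definitely initialized if every branch initialized it. A minimal self-contained sketch of that merge rule (the `thenBranch`/`elseBranch` names are illustrative, not from the source):

```nim
import std/sets

# Same logic as intersectInitialized: keep only keys present in both sets.
proc intersectInit(a, b: HashSet[uint]): HashSet[uint] =
    result = initHashSet[uint]()
    for key in a:
        if key in b:
            result.incl(key)

let thenBranch = toHashSet([1'u, 2'u])  # names initialized in the then-branch
let elseBranch = toHashSet([2'u, 3'u])  # names initialized in the else-branch
# Only name 2 is initialized on every path through the conditional
assert intersectInit(thenBranch, elseBranch) == toHashSet([2'u])
```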
# Stringify overloads

proc stringify*(self: TypeChecker, typ: Type, verbose: bool = false): string =
    ## Returns the string representation of a type
    if typ.isNil():
        return "void"
    case typ.kind:
        of Char, Byte, String, TypeKind.Nan, Any:
            result &= ($typ.kind).toLowerAscii()
        of CInterop:
            result = cinteropName(typ.ckind)
        of Structure:
            result &= typ.name
            if typ.generics.len() > 0:
                result &= "["
                var i = 0
                for gen in typ.generics.keys():
                    result &= &"{gen}: {self.stringify(typ.generics[gen], true)}"
                    if i < typ.generics.len() - 1:
                        result &= ", "
                    inc(i)
                result &= "]"
        of EnumEntry:
            if not typ.parent.isNil() and typ.parent.name.len() > 0:
                result &= typ.parent.name & "."
            result &= typ.name
        of Boolean:
            result = "bool"
        of Infinity:
            result = "inf"
        of Integer:
            if not typ.signed:
                result &= "u"
            result &= &"int{int(typ.size)}"
        of Float:
            result &= "float"
            case typ.width:
                of Half:
                    result &= "32"
                of Full:
                    result &= "64"
        of Pointer:
            result &= &"ptr {self.stringify(typ.value)}"
        of Reference:
            if typ.mutable:
                result &= "mut "
            result &= &"ref {self.stringify(typ.value)}"
        of TypeKind.Const:
            result &= &"const {self.stringify(typ.value)}"
        of Array:
            let lengthLabel =
                if not typ.lengthParam.isNil():
                    self.stringify(typ.lengthParam)
                else:
                    $typ.length
            result &= &"array[{lengthLabel}, {self.stringify(typ.elementType)}]"
        of UncheckedArray:
            result &= &"UncheckedArray[{self.stringify(typ.elementType)}]"
        of Function:
            result &= "fn "
            if typ.generics.len() > 0:
                result &= "["
                var i = 0
                for gen in typ.generics.keys():
                    result &= &"{gen}: {self.stringify(typ.generics[gen], verbose=true)}"
                    if i < typ.generics.len() - 1:
                        result &= ", "
                    inc(i)
                result &= "]"
            result &= "("
            for i, (argName, argType, argDefault, argIsVar) in typ.signature:
                if argName.len() > 0:
                    if argIsVar:
                        result &= &"{argName}: var {self.stringify(argType)}"
                    else:
                        result &= &"{argName}: {self.stringify(argType)}"
                else:
                    if argIsVar:
                        result &= &"#{i}: var {self.stringify(argType)}"
                    else:
                        result &= &"#{i}: {self.stringify(argType)}"
                if not argDefault.isNil():
                    result &= &" = {argDefault.kind}"
                if i < typ.signature.len() - 1:
                    result &= ", "
            result &= ")"
            if not typ.returnType.isNil():
                result &= &": {self.stringify(typ.returnType)}"
            if typ.pragmas.len() > 0:
                result &= " {"
                var i = 0
                for name, pragma in typ.pragmas:
                    result &= &"{name}"
                    if pragma.args.len() > 0:
                        result &= ": "
                        for j, arg in pragma.args:
                            result &= arg.token.lexeme
                            if j < pragma.args.high():
                                result &= ", "
                    if i < typ.pragmas.len() - 1:
                        result &= ", "
                    else:
                        result &= "}"
                    inc(i)
        of TypeKind.Lent:
            if typ.mutable:
                result &= "mut "
            result &= &"lent {self.stringify(typ.value)}"
        of Generic, Union:
            if typ.displayName.len() > 0 and not verbose:
                return typ.displayName
            for i, condition in typ.constraints:
                if i > 0:
                    result &= " | "
                if not condition.match:
                    result &= "~"
                result &= self.stringify(condition.kind)
        of Typevar:
            result &= &"typevar[{self.stringify(typ.wrapped)}]"


proc stringify*(self: TypeChecker, sig: TypeSignature, verbose = false): string =
    result &= "("
    for i, (argName, argType, argDefault, argIsVar) in sig:
        if argName.len() > 0:
            if argIsVar:
                result &= &"{argName}: var {self.stringify(argType, verbose)}"
            else:
                result &= &"{argName}: {self.stringify(argType, verbose)}"
        else:
            if argIsVar:
                result &= &"#{i}: var {self.stringify(argType, verbose)}"
            else:
                result &= &"#{i}: {self.stringify(argType, verbose)}"
        if not argDefault.isNil():
            result &= &" = {argDefault.kind}"
        if i < sig.len() - 1:
            result &= ", "
    result &= ")"


proc stringify*(self: TypeChecker, typ: TypedNode, verbose = false): string =
    ## Returns a string representation of the given typed node
    if typ.node.isConst():
        return self.stringify(TypedExpr(typ).kind, verbose)
    case typ.node.kind:
        of NodeKind.funDecl, varDecl, typeDecl:
            result = self.stringify(TypedDecl(typ).name.valueType, verbose)
        of binaryExpr, unaryExpr, identExpr, callExpr, lentExpr,
           mutExpr, constExpr, ptrExpr, refExpr, genericExpr, sliceExpr, arrayExpr:
            result = self.stringify(TypedExpr(typ).kind, verbose)
        else:
            # TODO
            return &"?: {typ[]}"


proc builtinMagic*(self: TypeChecker, fn: Name): string =
    ## Returns the name of the builtin operation implemented by the given
    ## function, if any (either via its type or a {.magic.} pragma)
    if fn.isNil():
        return ""
    result = fn.valueType.builtinOp
    if result.len() > 0 or fn.node.isNil():
        return
    for pragma in FunDecl(fn.node).pragmas:
        if pragma.name.token.lexeme == "magic":
            return pragma.args[0].token.lexeme[1..^2]


proc hasPragma*(pragmas: seq[Pragma], label: string): bool =
    for pragma in pragmas:
        if pragma.name.token.lexeme == label:
            return true
    false


proc markResolved*(self: TypeChecker, name: Name) =
    ## Marks a name as used, propagating that state to any visible aliases
    ## that refer to the same declaration node.
    if name.isNil():
        return
    name.resolved = true
    let resolvedType =
        if name.valueType.isNil():
            nil
        else:
            name.valueType.unwrapType()
    let parentEnum =
        if not resolvedType.isNil() and resolvedType.kind == TypeKind.EnumEntry:
            resolvedType.parent
        else:
            nil
    if name.node.isNil():
        if parentEnum.isNil():
            return
    for scope in self.names:
        for bucket in scope.values():
            for candidate in bucket:
                if candidate.node == name.node and candidate.kind == name.kind:
                    candidate.resolved = true
                elif not parentEnum.isNil() and
                     not candidate.valueType.isNil() and
                     cast[uint](candidate.valueType.unwrapType()) == cast[uint](parentEnum):
                    candidate.resolved = true
src/frontend/compiler/type_checking/user_defines.nim (new file, 428 lines)
@@ -0,0 +1,428 @@
# Copyright 2026 Mattia Giambirtone & All Contributors
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
#    http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.

import frontend/compiler/type_checking/types
import frontend/compiler/type_checking/type_utils
import frontend/compiler/type_checking/comparison
import frontend/compiler/comptime_eval

import std/strutils
import std/strformat
import std/tables
import std/sequtils
import std/parseutils


proc cloneUserDefines*(userDefines: TableRef[string, UserDefine]): TableRef[string, UserDefine]
proc checkDefineEnabled*(userDefines: TableRef[string, UserDefine], defineName: string): bool


proc staticIntValue*(self: TypeChecker, node: Expression, context: string): int =
    if node.isNil() or node.kind notin {intExpr, hexExpr, octExpr, binExpr}:
        self.error(&"{context} requires a constant integer literal", node)
    let lexeme = node.token.lexeme.split("'")[0]
    try:
        result = parseInt(lexeme)
    except ValueError:
        self.error(&"{context} requires an integer literal in range", node)


# builtinMagic moved to types.nim


proc findVarDefinePragma*(node: VarDecl): VarDefinePragmaKind =
    if node.isNil():
        return NoDefinePragma
    for pragma in node.pragmas:
        case pragma.name.token.lexeme:
            of "define":
                if result == BoolDefinePragma:
                    return ConflictingDefinePragmas
                if result == NoDefinePragma:
                    result = DefinePragma
            of "booldefine":
                if result == DefinePragma:
                    return ConflictingDefinePragmas
                if result == NoDefinePragma:
                    result = BoolDefinePragma
            else:
                discard


proc makeSyntheticToken*(self: TypeChecker, source: ASTNode, kind: TokenType, lexeme: string): Token =
    if source.isNil() or source.token.isNil():
        return Token(kind: kind, lexeme: lexeme, line: 1)
    Token(kind: kind,
          lexeme: lexeme,
          line: source.token.line,
          pos: source.token.pos,
          relPos: source.token.relPos)


proc escapeDefineString*(value: string): string =
    result = "\""
    for ch in value:
        case ch
            of '\\':
                result &= "\\\\"
            of '\"':
                result &= "\\\""
            of '\n':
                result &= "\\n"
            of '\r':
                result &= "\\r"
            of '\t':
                result &= "\\t"
            else:
                result.add(ch)
    result &= "\""


proc parseBoolDefineValue*(value: string): bool =
    case value.toLowerAscii()
        of "1", "true", "yes", "on":
            true
        of "0", "false", "no", "off":
            false
        else:
            raise newException(ValueError, "expected a boolean value")


const canonicalCheckDefines = ["checks", "boundChecks", "overflowChecks", "floatChecks"]
const canonicalTraceDefines = ["lineTrace", "stackTrace"]
const canonicalBuildModeDefines = ["debug", "release", "danger"]
const canonicalCompilerDefines = ["checks", "boundChecks", "overflowChecks", "floatChecks",
                                  "lineTrace", "stackTrace", "debug", "release", "danger"]


proc canonicalCompilerDefineName*(name: string): string =
    case name
        of "overflowCheck":
            "overflowChecks"
        of "floatCheck":
            "floatChecks"
        else:
            name


proc normalizeUserDefines*(userDefines: TableRef[string, UserDefine]): TableRef[string, UserDefine] =
    result = cloneUserDefines(userDefines)
    if result.isNil():
        return
    for (aliasName, canonicalName) in [("overflowCheck", "overflowChecks"),
                                       ("floatCheck", "floatChecks")]:
        if not result.hasKey(aliasName):
            continue
        if result.hasKey(canonicalName):
            let aliasDefine = result[aliasName]
            let canonicalDefine = result[canonicalName]
            if aliasDefine != canonicalDefine:
                raise newException(ValueError,
                                   &"conflicting defines '{aliasName}' and '{canonicalName}'")
        else:
            result[canonicalName] = result[aliasName]
        result.del(aliasName)


proc isCanonicalCheckDefine*(name: string): bool =
    canonicalCompilerDefineName(name) in canonicalCheckDefines


proc isCanonicalCompilerDefine*(name: string): bool =
    canonicalCompilerDefineName(name) in canonicalCompilerDefines


proc cloneUserDefines*(userDefines: TableRef[string, UserDefine]): TableRef[string, UserDefine] =
    result = newTable[string, UserDefine]()
    if userDefines.isNil():
        return
    for defineName, define in userDefines:
        result[defineName] = define


proc snapshotActiveDefines*(userDefines: TableRef[string, UserDefine]): TableRef[string, string] =
    result = newTable[string, string]()
    if userDefines.isNil():
        return
    for defineName in canonicalCompilerDefines:
        if userDefines.hasKey(defineName):
            result[defineName] = userDefines[defineName].value


proc boolUserDefine*(value: bool, explicit = false): UserDefine =
    UserDefine(explicit: explicit, hasValue: true, value: if value: "true" else: "false")


proc parseCanonicalBoolDefine*(defineName: string, define: UserDefine): bool =
    if not define.hasValue:
        return true
    try:
        parseBoolDefineValue(define.value)
    except ValueError:
        raise newException(ValueError,
                           &"invalid boolean value '{define.value}' for define '{defineName}'")


proc defaultBuildMode*(): CanonicalBuildMode {.inline.} =
    CanonicalBuildMode.DebugBuildMode


proc buildModeFromDefine*(defineName: string): CanonicalBuildMode =
    case defineName
        of "debug":
            CanonicalBuildMode.DebugBuildMode
        of "release":
            CanonicalBuildMode.ReleaseBuildMode
        of "danger":
            CanonicalBuildMode.DangerBuildMode
        else:
            raise newException(ValueError, &"'{defineName}' is not a build-mode define")


proc buildModeDefineName*(mode: CanonicalBuildMode): string =
    case mode
        of CanonicalBuildMode.DebugBuildMode:
            "debug"
        of CanonicalBuildMode.ReleaseBuildMode:
            "release"
        of CanonicalBuildMode.DangerBuildMode:
            "danger"


proc resolvedBuildMode*(userDefines: TableRef[string, UserDefine]): CanonicalBuildMode =
    if checkDefineEnabled(userDefines, "danger"):
        return CanonicalBuildMode.DangerBuildMode
    if checkDefineEnabled(userDefines, "release"):
        return CanonicalBuildMode.ReleaseBuildMode
    CanonicalBuildMode.DebugBuildMode


proc resolveBuildMode*(userDefines: TableRef[string, UserDefine],
                       overrides: CheckDefineOverrides): CanonicalBuildMode =
    result = defaultBuildMode()
    var enabledModes: seq[CanonicalBuildMode] = @[]
    if not userDefines.isNil():
        for defineName in canonicalBuildModeDefines:
            if not userDefines.hasKey(defineName):
                continue
            if not userDefines[defineName].explicit:
                continue
            let enabled = parseCanonicalBoolDefine(defineName, userDefines[defineName])
            if enabled:
                enabledModes.add(buildModeFromDefine(defineName))
    if enabledModes.len() > 1:
        let modes = enabledModes.mapIt(buildModeDefineName(it))
        let joinedModes = modes.join(", ")
        raise newException(ValueError,
                           &"conflicting build-mode defines: {joinedModes}")
    if enabledModes.len() == 1:
        result = enabledModes[0]
    if overrides.hasBuildModeOverride:
        result = overrides.buildMode


proc defaultCanonicalDefineEnabled*(defineName: string): bool =
    case canonicalCompilerDefineName(defineName)
        of "checks", "boundChecks", "overflowChecks", "floatChecks", "debug", "lineTrace", "stackTrace":
            true
        of "release", "danger":
            false
        else:
            false


proc checkDefineEnabled*(userDefines: TableRef[string, UserDefine], defineName: string): bool =
    let normalizedName = canonicalCompilerDefineName(defineName)
    if userDefines.isNil() or not userDefines.hasKey(normalizedName):
        return defaultCanonicalDefineEnabled(normalizedName)
    if isCanonicalCompilerDefine(normalizedName):
        return parseCanonicalBoolDefine(normalizedName, userDefines[normalizedName])
    true


proc resolveUserDefines*(userDefines: TableRef[string, UserDefine] = nil,
                         checkOverrides: CheckDefineOverrides = CheckDefineOverrides()): TableRef[string, UserDefine] =
    result = normalizeUserDefines(userDefines)

    let buildMode = resolveBuildMode(result, checkOverrides)
    let buildModeExplicit =
        checkOverrides.hasBuildModeOverride or
        (not userDefines.isNil() and canonicalBuildModeDefines.anyIt(
            userDefines.hasKey(it) and userDefines[it].explicit and parseCanonicalBoolDefine(it, userDefines[it])))

    var checks = buildMode != CanonicalBuildMode.DangerBuildMode
    let checksExplicit = checkOverrides.hasChecksOverride or
        (not result.isNil() and result.hasKey("checks") and result["checks"].explicit)
    if result.hasKey("checks"):
        checks = parseCanonicalBoolDefine("checks", result["checks"])
    if checkOverrides.hasChecksOverride:
        checks = checkOverrides.checks

    var lineTrace = buildMode == CanonicalBuildMode.DebugBuildMode
    let lineTraceExplicit = checkOverrides.hasLineTraceOverride or
        (not result.isNil() and result.hasKey("lineTrace") and result["lineTrace"].explicit)
    if result.hasKey("lineTrace"):
        lineTrace = parseCanonicalBoolDefine("lineTrace", result["lineTrace"])
    if checkOverrides.hasLineTraceOverride:
        lineTrace = checkOverrides.lineTrace

    var stackTrace = buildMode == CanonicalBuildMode.DebugBuildMode
    let stackTraceExplicit = checkOverrides.hasStackTraceOverride or
        (not result.isNil() and result.hasKey("stackTrace") and result["stackTrace"].explicit)
    if result.hasKey("stackTrace"):
        stackTrace = parseCanonicalBoolDefine("stackTrace", result["stackTrace"])
    if checkOverrides.hasStackTraceOverride:
        stackTrace = checkOverrides.stackTrace

    var boundChecks = checks
    let boundChecksExplicit = checkOverrides.hasBoundChecksOverride or
        (not result.isNil() and result.hasKey("boundChecks") and result["boundChecks"].explicit)
    if result.hasKey("boundChecks"):
        if result["boundChecks"].explicit:
            boundChecks = parseCanonicalBoolDefine("boundChecks", result["boundChecks"])
    if checkOverrides.hasBoundChecksOverride:
        boundChecks = checkOverrides.boundChecks

    var overflowChecks = checks
    let overflowChecksExplicit = checkOverrides.hasOverflowCheckOverride or
        (not result.isNil() and result.hasKey("overflowChecks") and result["overflowChecks"].explicit)
    if result.hasKey("overflowChecks"):
        if result["overflowChecks"].explicit:
            overflowChecks = parseCanonicalBoolDefine("overflowChecks", result["overflowChecks"])
    if checkOverrides.hasOverflowCheckOverride:
        overflowChecks = checkOverrides.overflowCheck

    var floatChecks = checks
    let floatChecksExplicit = checkOverrides.hasFloatCheckOverride or
        (not result.isNil() and result.hasKey("floatChecks") and result["floatChecks"].explicit)
    if result.hasKey("floatChecks"):
        if result["floatChecks"].explicit:
            floatChecks = parseCanonicalBoolDefine("floatChecks", result["floatChecks"])
    if checkOverrides.hasFloatCheckOverride:
        floatChecks = checkOverrides.floatCheck

    result["debug"] = boolUserDefine(buildMode == CanonicalBuildMode.DebugBuildMode,
                                     explicit = buildModeExplicit)
    result["release"] = boolUserDefine(buildMode == CanonicalBuildMode.ReleaseBuildMode,
                                       explicit = buildModeExplicit)
    result["danger"] = boolUserDefine(buildMode == CanonicalBuildMode.DangerBuildMode,
                                      explicit = buildModeExplicit)
    result["lineTrace"] = boolUserDefine(lineTrace, explicit = lineTraceExplicit)
    result["stackTrace"] = boolUserDefine(stackTrace, explicit = stackTraceExplicit)
    result["checks"] = boolUserDefine(checks, explicit = checksExplicit)
    result["boundChecks"] = boolUserDefine(boundChecks, explicit = boundChecksExplicit)
    result["overflowChecks"] = boolUserDefine(overflowChecks, explicit = overflowChecksExplicit)
    result["floatChecks"] = boolUserDefine(floatChecks, explicit = floatChecksExplicit)


proc defineLiteralExpr*(self: TypeChecker, source: ASTNode, typ: Type, defineName: string,
                        define: UserDefine, pragmaKind: VarDefinePragmaKind): Expression =
    let target = typ.unwrapType()
    case target.kind:
        of Boolean:
            var boolValue: bool
            if pragmaKind == BoolDefinePragma and not define.hasValue:
                boolValue = true
            elif define.hasValue:
                try:
                    boolValue = parseBoolDefineValue(define.value)
                except ValueError:
                    self.error(&"invalid boolean value '{define.value}' for define '{defineName}'", source)
            else:
                self.error(&"define '{defineName}' requires an explicit value", source)
            let token = self.makeSyntheticToken(source,
                                                if boolValue: TokenType.True else: TokenType.False,
                                                if boolValue: "true" else: "false")
            result =
                if boolValue:
                    newTrueExpr(token)
                else:
                    newFalseExpr(token)
        of Integer:
            if not define.hasValue:
                self.error(&"define '{defineName}' requires an explicit value", source)
            let raw = define.value
            try:
                discard parseBiggestInt(raw)
            except ValueError:
                self.error(&"invalid integer value '{raw}' for define '{defineName}'", source)
            result = newIntExpr(self.makeSyntheticToken(source, TokenType.Integer, raw))
        of Float:
            if not define.hasValue:
                self.error(&"define '{defineName}' requires an explicit value", source)
            let raw = define.value
            try:
                discard parseFloat(raw)
            except ValueError:
                self.error(&"invalid float value '{raw}' for define '{defineName}'", source)
            result = newFloatExpr(self.makeSyntheticToken(source, TokenType.Float, raw))
        of String:
            if not define.hasValue:
                self.error(&"define '{defineName}' requires an explicit value", source)
            let raw = define.value
            result = newStrExpr(self.makeSyntheticToken(source, TokenType.String, escapeDefineString(raw)))
        else:
            self.error(&"'{defineName}' can only use primitive bool, integer, float, or string types", source)
    result.file = self.file


proc comptimeContext*(self: TypeChecker): ComptimeEvalContext =
    result.resolveConstIdent = proc(expr: TypedIdentExpr): TypedExpr =
        let name = expr.name
        if name.kind != NameKind.Var or name.node.isNil() or name.node.kind != NodeKind.varDecl:
            self.error("when condition may only reference constant bindings", expr.node)
        let decl = VarDecl(name.node)
        if not decl.constant or decl.value.isNil():
            self.error(&"'{name.ident.token.lexeme}' is not available for compile-time evaluation", expr.node)
        if self.nameInits.hasKey(name.nameKey()):
            return self.nameInits[name.nameKey()]
        self.inferOrError(decl.value)

    result.builtinForCall = proc(call: TypedCallExpr): string =
        let builtin = self.builtinMagic(call.getCallableName())
        if builtin.len() == 0:
            self.error("when condition may only call built-in operations for now", call.node)
        builtin

    result.builtinForUnary = proc(node: UnaryExpr, arg: TypedExpr): string =
        if node.operator.lexeme == "not" and arg.kind.unwrapType().kind == Boolean:
            return "LogicalNot"
        var defaultExpr: TypedExpr
        let signature = @[("", arg.kind, defaultExpr, false)]
        self.builtinMagic(self.matchImpl(self, node.operator.lexeme, signature, @[arg], node))

    result.builtinForBinary = proc(node: BinaryExpr, a, b: TypedExpr): string =
        var defaultExpr: TypedExpr
        let signature = @[("", a.kind, defaultExpr, false), ("", b.kind, defaultExpr, false)]
        self.builtinMagic(self.matchImpl(self, node.operator.lexeme, signature, @[a, b], node))


proc evaluateComptime*(self: TypeChecker, expression: TypedExpr): ComptimeValue =
    try:
        result = self.comptimeContext().evalComptime(expression)
    except ComptimeEvalError as exc:
        self.error(exc.msg, exc.node)


proc evaluateComptimeBool*(self: TypeChecker, expression: TypedExpr): bool =
    try:
        result = self.comptimeContext().evalComptimeBool(expression)
    except ComptimeEvalError as exc:
        self.error(exc.msg, exc.node)


proc evaluateComptimeInt*(self: TypeChecker, expression: TypedExpr, context: string): BiggestInt =
    try:
        result = self.comptimeContext().evalComptimeInt(expression, context)
    except ComptimeEvalError as exc:
        self.error(exc.msg, exc.node)
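The define machinery in user_defines.nim boils down to two rules: boolean defines accept the usual spellings ("1"/"true"/"yes"/"on" and their negatives), and build modes resolve with danger taking precedence over release, which takes precedence over the default (debug). A standalone sketch of those two rules (simplified to a plain string table; `pickMode`/`parseDefineBool` are illustrative names, not the module's API):

```nim
import std/strutils, std/tables

proc parseDefineBool(value: string): bool =
    # Same accepted spellings as parseBoolDefineValue above
    case value.toLowerAscii()
    of "1", "true", "yes", "on": true
    of "0", "false", "no", "off": false
    else: raise newException(ValueError, "expected a boolean value")

proc pickMode(defines: Table[string, string]): string =
    # danger > release > default (debug), mirroring resolvedBuildMode
    if parseDefineBool(defines.getOrDefault("danger", "false")): "danger"
    elif parseDefineBool(defines.getOrDefault("release", "false")): "release"
    else: "debug"

assert parseDefineBool("Yes")
assert not parseDefineBool("off")
assert pickMode(initTable[string, string]()) == "debug"
assert pickMode({"release": "true"}.toTable) == "release"
assert pickMode({"release": "true", "danger": "on"}.toTable) == "danger"
```

Note that the real `resolveBuildMode` additionally rejects two *explicitly* enabled modes as a conflict instead of silently preferring one; this sketch only shows the precedence used by `resolvedBuildMode`.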
@@ -235,10 +235,11 @@ type
         genericArgs*: seq[GenericArg]
 
     TypedUnaryExpr* = ref object of TypedExpr
-        a*: TypedExpr
-
-    TypedBinaryExpr* = ref object of TypedUnaryExpr
-        b*: TypedExpr
+        operand*: TypedExpr
+
+    TypedBinaryExpr* = ref object of TypedExpr
+        left*: TypedExpr
+        right*: TypedExpr
 
     TypedIdentExpr* = ref object of TypedExpr
         name*: Name
@@ -517,11 +518,11 @@ proc newTypedIdentExpr*(node: IdentExpr, name: Name): TypedIdentExpr =
 proc newTypedNamedExpr*(node: Expression, name: Name): TypedIdentExpr =
     result = TypedIdentExpr(node: node, name: name, kind: name.valueType)
 
-proc newTypedUnaryExpr*(node: UnaryExpr, kind: Type, a: TypedExpr): TypedUnaryExpr =
-    result = TypedUnaryExpr(node: node, a: a, kind: kind)
+proc newTypedUnaryExpr*(node: UnaryExpr, kind: Type, operand: TypedExpr): TypedUnaryExpr =
+    result = TypedUnaryExpr(node: node, operand: operand, kind: kind)
 
-proc newTypedBinaryExpr*(node: UnaryExpr, kind: Type, a, b: TypedExpr): TypedBinaryExpr =
-    result = TypedBinaryExpr(node: node, a: a, b: b, kind: kind)
+proc newTypedBinaryExpr*(node: BinaryExpr, kind: Type, left, right: TypedExpr): TypedBinaryExpr =
+    result = TypedBinaryExpr(node: node, left: left, right: right, kind: kind)
 
 proc newTypedCallExpr*(node: CallExpr, callee: TypedExpr,
                        args: seq[TypedExpr], templateDecl: Name = nil,
@@ -134,7 +134,7 @@ type
|
||||
ident*: IdentExpr
|
||||
kind*: GenericParameterKind
|
||||
rawConstraint*: Expression
|
||||
constr*: Expression
|
||||
resolvedConstraint*: Expression
|
||||
|
||||
TypeGenerics* = OrderedTableRef[string, TypeGeneric]
|
||||
|
||||
@@ -194,15 +194,13 @@ type
     UnaryExpr* = ref object of Expression
         ## A unary expression
         operator*: Token
-        a*: Expression
+        operand*: Expression
 
-    BinaryExpr* = ref object of UnaryExpr
+    BinaryExpr* = ref object of Expression
         ## A binary expression
-
-        # Binary expressions can be seen here as unary
-        # expressions with an extra operand, so we just
-        # inherit from that and add it
-        b*: Expression
+        operator*: Token
+        left*: Expression
+        right*: Expression
 
     LambdaExpr* = ref object of Expression
         ## A lambda expression. This is basically
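The hunk above flattens the AST hierarchy: `BinaryExpr` stops inheriting from `UnaryExpr` and names its operands `left`/`right` instead of reusing an inherited `a` plus an extra `b`. One consequence worth noting (sketched below with a hypothetical `arity` helper, not code from this repository): an `of UnaryExpr` runtime check no longer accidentally matches binary nodes, so visitors must handle each kind explicitly.

```nim
type
    Expression = ref object of RootObj
    UnaryExpr = ref object of Expression
        operand: Expression
    BinaryExpr = ref object of Expression  # no longer a subtype of UnaryExpr
        left, right: Expression

proc arity(e: Expression): int =
    # With the flattened hierarchy, `e of UnaryExpr` is false
    # for binary nodes, so each kind is matched explicitly.
    if e of BinaryExpr: result = 2
    elif e of UnaryExpr: result = 1

assert arity(BinaryExpr(left: Expression(), right: Expression())) == 2
assert arity(UnaryExpr(operand: Expression())) == 1
```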
@@ -360,7 +358,7 @@ type
         name*: IdentExpr
         args*: seq[Expression]
 
-    Var* = ref object of Expression
+    VarQualifier* = ref object of Expression
         value*: Expression
 
     Ref* = ref object of Expression
@@ -612,20 +610,20 @@ proc newSliceExpr*(expression: Expression, elements: seq[Expression], token: Tok
     result.token = token
 
 
-proc newUnaryExpr*(operator: Token, a: Expression): UnaryExpr =
+proc newUnaryExpr*(operator: Token, operand: Expression): UnaryExpr =
     new(result)
     result.kind = NodeKind.unaryExpr
     result.operator = operator
-    result.a = a
+    result.operand = operand
     result.token = result.operator
 
 
-proc newBinaryExpr*(a: Expression, operator: Token, b: Expression): BinaryExpr =
+proc newBinaryExpr*(left: Expression, operator: Token, right: Expression): BinaryExpr =
     new(result)
     result.kind = NodeKind.binaryExpr
     result.operator = operator
-    result.a = a
-    result.b = b
+    result.left = left
+    result.right = right
     result.token = operator
 
 
@@ -855,10 +853,10 @@ func `$`*(self: ASTNode): string =
             result &= &"""Call({self.callee}, arguments=(positionals={self.arguments.positionals}, keyword={self.arguments.keyword}))"""
         of unaryExpr:
             var self = UnaryExpr(self)
-            result &= &"Unary(Operator('{self.operator.lexeme}'), {self.a})"
+            result &= &"Unary(Operator('{self.operator.lexeme}'), {self.operand})"
         of binaryExpr:
             var self = BinaryExpr(self)
-            result &= &"Binary({self.a}, Operator('{self.operator.lexeme}'), {self.b})"
+            result &= &"Binary({self.left}, Operator('{self.operator.lexeme}'), {self.right})"
         of assignExpr:
             var self = AssignExpr(self)
             result &= &"Assign(target={self.target}, value={self.value})"
@@ -962,7 +960,7 @@ func `$`*(self: ASTNode): string =
 func `$`*(self: Parameter): string = &"Parameter(name={self.ident}, type={self.valueType}, default={self.default})"
 func `$`*(self: TypeField): string = &"Field(name={self.ident}, type={self.valueType}, default={self.default}, private={self.isPrivate})"
 func `$`*(self: TypeGeneric): string =
-    &"Parameter(name={self.ident}, kind={self.kind}, rawConstraint={self.rawConstraint}, constraint={self.constr})"
+    &"Parameter(name={self.ident}, kind={self.kind}, rawConstraint={self.rawConstraint}, constraint={self.resolvedConstraint})"
 
 
 func `==`*(self, other: IdentExpr): bool {.inline.} = self.token == other.token
@@ -1001,10 +999,10 @@ func getRelativeBoundaries*(self: ASTNode): tuple[start, stop: int] =
             result = getRelativeBoundaries(ExprStmt(self).expression)
         of NodeKind.unaryExpr:
             var self = UnaryExpr(self)
-            result = (self.operator.relPos.start, getRelativeBoundaries(self.a).stop)
+            result = (self.operator.relPos.start, getRelativeBoundaries(self.operand).stop)
         of NodeKind.binaryExpr:
             var self = BinaryExpr(self)
-            result = (getRelativeBoundaries(self.a).start, getRelativeBoundaries(self.b).stop)
+            result = (getRelativeBoundaries(self.left).start, getRelativeBoundaries(self.right).stop)
         of NodeKind.assignExpr:
             var self = AssignExpr(self)
             result = (getRelativeBoundaries(self.target).start, getRelativeBoundaries(self.value).stop)
 
@@ -274,7 +274,7 @@ proc error(self: Lexer, message: string) =
 
 
 proc check(self: Lexer, s: string, distance: int = 0): bool =
-    ## Behaves Bike self.match(), without consuming the
+    ## Behaves like self.match(), without consuming the
     ## token. False is returned if we're at EOF
     ## regardless of what the token to check is.
     ## The distance is passed directly to self.peek()
@@ -172,7 +172,7 @@ func beginScope(self: Parser) {.inline.} =
 
 
 func endScope(self: Parser) {.inline.} =
-    ## Ends a new lexical scope
+    ## Ends the current lexical scope
     dec(self.scopeDepth)
 
 
@@ -288,7 +288,7 @@ func expect(self: Parser, kind: openarray[TokenType], message: string = "", toke
     ## an error is raised only if none of the
     ## given token kinds matches
     for k in kind:
-        if self.match(kind):
+        if self.match(k):
             return
     if message.len() == 0:
         self.error(&"""expecting any of the following tokens: {kind.join(", ")}, but got {self.peek().kind} instead""", token)
@@ -345,7 +345,7 @@ func expect(self: Parser, kind: openarray[string], message: string = "", token:
     ## an error is raised only if none of the
     ## given strings matches
     for k in kind:
-        if self.match(kind):
+        if self.match(k):
             return
     if message.len() == 0:
         self.error(&"""expecting any of the following tokens: {kind.join(", ")}, but got {self.peek().kind} instead""", token)
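Both `expect` hunks fix the same bug: the loop variable `k` was declared but dead, because the body kept calling `self.match(kind)` with the entire `openarray`, re-testing every candidate on each iteration. A standalone sketch of the corrected loop shape (using a hypothetical single-kind comparison as a stand-in for the parser's `match`, which is not shown here):

```nim
proc matchesAny(candidates: openArray[string], actual: string): bool =
    ## Each candidate kind is now tested individually; the first
    ## match wins. The result defaults to false if none matches.
    for k in candidates:
        if k == actual:  # was effectively: test the whole candidate list again
            return true

assert matchesAny(["Identifier", "Symbol"], "Symbol")
assert not matchesAny(["Identifier", "Symbol"], "Number")
```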
@@ -652,7 +652,7 @@ proc parseCmp(self: Parser): Expression =
     var right: Expression
     while self.check([Identifier, Symbol]) and self.operators.getPrecedence(self.peek().lexeme) == Compare:
         operator = self.step()
-        right = self.parseAdd()
+        right = self.parseBitwise()
         result = newBinaryExpr(result, operator, right)
         result.file = self.file
 
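The `parseCmp` fix restores the precedence ladder: each level must delegate to the immediately tighter level, so a comparison's right operand is parsed with `parseBitwise` rather than skipping straight to `parseAdd`, which would silently drop the bitwise tier from comparison operands. A toy ladder over integers illustrating the delegation pattern (the names and tiers are illustrative only, not peon's real grammar):

```nim
import std/strutils

var
    toks: seq[string]
    pos: int

proc peek(): string = (if pos < toks.len: toks[pos] else: "")

proc parseAtom(): int =
    result = toks[pos].parseInt()
    inc pos

proc parseBitwise(): int =
    result = parseAtom()
    while peek() == "or":       # stand-in for the bitwise tier
        inc pos
        result = result or parseAtom()

proc parseCmp(): int =
    result = parseBitwise()     # the fix: delegate to the next tier down
    while peek() == "<":
        inc pos
        result = int(result < parseBitwise())

toks = @["1", "<", "0", "or", "2"]  # parses as 1 < (0 or 2)
assert parseCmp() == 1
```

Had `parseCmp` called `parseAtom` directly here, the `or` after `0` would never be consumed as part of the right-hand operand.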
@@ -1122,12 +1122,12 @@ proc findLegacyAnyConstraint(self: Parser, constraint: Expression): Expression =
             return
         of binaryExpr: