Merge remote-tracking branch 'origin/trunk' into builtins-in-roc

Folkert 2022-03-05 20:55:15 +01:00
commit adf4ad22a5
GPG key ID: 1F17F6FFD112B97C
90 changed files with 3197 additions and 2191 deletions

View file

@@ -65,3 +65,6 @@ Mats Sigge <mats.sigge@gmail.com>
 Drew Lazzeri <dlazzeri1@gmail.com>
 Tom Dohrmann <erbse.13@gmx.de>
 Elijah Schow <elijah.schow@gmail.com>
+Derek Gustafson <degustaf@gmail.com>
+Philippe Vinchon <p.vinchon@gmail.com>
+Pierre-Henri Trivier <phtrivier@yahoo.fr>

View file

@@ -37,7 +37,7 @@ If you plan on using `nix-shell` regularly, check out [direnv](https://direnv.ne
 ### Editor
-The editor is a WIP and not ready yet to replace your favorite editor, although if you want to try it out on nix, read on.
+The editor is a :construction:WIP:construction: and not ready yet to replace your favorite editor, although if you want to try it out on nix, read on.
 `cargo run edit` should work from NixOS, if you use a nix-shell from inside another OS, follow the instructions below.
 #### Nvidia GPU
@@ -196,20 +196,24 @@ export CPPFLAGS="-I/usr/local/opt/llvm/include"
 ### LLVM installation on Windows
-Installing LLVM's prebuilt binaries doesn't seem to be enough for the `llvm-sys` crate that Roc depends on, so I had to build LLVM from source
-on Windows. After lots of help from [**@IanMacKenzie**](https://github.com/IanMacKenzie) (thank you, Ian!), here's what worked for me:
-1. I downloaded and installed [Build Tools for Visual Studio 2019](https://visualstudio.microsoft.com/thank-you-downloading-visual-studio/?sku=BuildTools&rel=16) (a full Visual Studio install should work tool; the Build Tools are just the CLI tools, which is all I wanted)
-1. In the installation configuration, under "additional components" I had to check both "C++ ATL for latest v142 build tools (x86 & x64)" and also "C++/CLI support for v142 build tools" [note: as of September 2021 this should no longer be necessary - the next time anyone tries this, please try it without this step and make a PR to delete this step if it's no longer needed!]
-1. I launched the "x64 Native Tools Command Prompt for Visual Studio 2019" application (note: not the similarly-named "x86" one!)
-1. Make sure [Python 2.7](https://www.python.org/) and [CMake 3.17](http://cmake.org/) are installed on your system.
-1. I followed most of the steps under LLVM's [building from source instructions](https://github.com/llvm/llvm-project#getting-the-source-code-and-building-llvm) up to the `cmake -G ...` command, which didn't work for me. Instead, at that point I did the following step.
-1. I ran `cmake -G "NMake Makefiles" -DCMAKE_BUILD_TYPE=Release ../llvm` to generate a NMake makefile.
-1. Once that completed, I ran `nmake` to build LLVM. (This took about 2 hours on my laptop.)
-1. Finally, I set an environment variable `LLVM_SYS_100_PREFIX` to point to the `build` directory where I ran the `cmake` command.
-Once all that was done, `cargo` ran successfully for Roc!
+**Warning** While `cargo build` works on windows, linking roc programs does not yet, see issue #2608. This also means the repl, the editor and many tests will not work on windows.
+Installing LLVM's prebuilt binaries doesn't seem to be enough for the `llvm-sys` crate that Roc depends on, so I had to follow the steps below:
+1. I downloaded and installed [Build Tools for Visual Studio 2019](https://visualstudio.microsoft.com/thank-you-downloading-visual-studio/?sku=BuildTools&rel=16) (a full Visual Studio install should work too; the Build Tools are just the CLI tools, which is all I wanted)
+1. Download the custom LLVM 7z archive [here](https://github.com/PLC-lang/llvm-package-windows/releases/tag/v12.0.1).
+1. [Download 7-zip](https://www.7-zip.org/) to be able to extract this archive.
+1. Extract the 7z file to where you want to permanently keep the folder.
+1. In powershell, set the `LLVM_SYS_120_PREFIX` environment variable (check [here](https://docs.microsoft.com/en-us/powershell/module/microsoft.powershell.core/about/about_environment_variables?view=powershell-7.2#saving-changes-to-environment-variables) to make this a permanent environment variable):
+```
+[Environment]::SetEnvironmentVariable(
+    "Path",
+    [Environment]::GetEnvironmentVariable("Path", "User") + ";C:\Users\anton\Downloads\LLVM-12.0.1-win64\bin",
+    "User"
+)
+```
+Once all that was done, `cargo build` ran successfully for Roc!
 ### Build speed on WSL/WSL2

Cargo.lock (generated)
View file

@ -1222,7 +1222,7 @@ checksum = "1d428afc93ad288f6dffc1fa5f4a78201ad2eec33c5a522e51c181009eb09061"
dependencies = [ dependencies = [
"byteorder", "byteorder",
"dynasm", "dynasm",
"memmap2 0.5.0", "memmap2 0.5.3",
] ]
[[package]] [[package]]
@ -2095,9 +2095,9 @@ dependencies = [
[[package]] [[package]]
name = "memmap2" name = "memmap2"
version = "0.5.0" version = "0.5.3"
source = "registry+https://github.com/rust-lang/crates.io-index" source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "4647a11b578fead29cdbb34d4adef8dd3dc35b876c9c6d5240d83f205abfe96e" checksum = "057a3db23999c867821a7a59feb06a578fcb03685e983dff90daf9e7d24ac08f"
dependencies = [ dependencies = [
"libc", "libc",
] ]
@ -3263,6 +3263,16 @@ dependencies = [
"libc", "libc",
] ]
[[package]]
name = "roc_alias_analysis"
version = "0.1.0"
dependencies = [
"morphic_lib",
"roc_collections",
"roc_module",
"roc_mono",
]
[[package]] [[package]]
name = "roc_ast" name = "roc_ast"
version = "0.1.0" version = "0.1.0"
@ -3286,6 +3296,7 @@ dependencies = [
"roc_unify", "roc_unify",
"snafu", "snafu",
"ven_graph", "ven_graph",
"winapi",
] ]
[[package]] [[package]]
@ -3346,6 +3357,7 @@ dependencies = [
"roc_problem", "roc_problem",
"roc_region", "roc_region",
"roc_types", "roc_types",
"static_assertions",
"ven_graph", "ven_graph",
] ]
@ -3415,6 +3427,7 @@ dependencies = [
name = "roc_constrain" name = "roc_constrain"
version = "0.1.0" version = "0.1.0"
dependencies = [ dependencies = [
"arrayvec 0.7.2",
"roc_builtins", "roc_builtins",
"roc_can", "roc_can",
"roc_collections", "roc_collections",
@ -3558,6 +3571,7 @@ dependencies = [
"bumpalo", "bumpalo",
"inkwell 0.1.0", "inkwell 0.1.0",
"morphic_lib", "morphic_lib",
"roc_alias_analysis",
"roc_builtins", "roc_builtins",
"roc_collections", "roc_collections",
"roc_error_macros", "roc_error_macros",
@ -3594,7 +3608,7 @@ dependencies = [
"bumpalo", "bumpalo",
"clap 3.0.0-beta.5", "clap 3.0.0-beta.5",
"iced-x86", "iced-x86",
"memmap2 0.5.0", "memmap2 0.5.3",
"object 0.26.2", "object 0.26.2",
"roc_build", "roc_build",
"roc_collections", "roc_collections",
@ -3822,13 +3836,6 @@ dependencies = [
[[package]] [[package]]
name = "roc_std" name = "roc_std"
version = "0.1.0" version = "0.1.0"
dependencies = [
"indoc",
"libc",
"pretty_assertions",
"quickcheck",
"quickcheck_macros",
]
[[package]] [[package]]
name = "roc_target" name = "roc_target"
@ -3927,7 +3934,7 @@ checksum = "61b3909d758bb75c79f23d4736fac9433868679d3ad2ea7a61e3c25cfda9a088"
[[package]] [[package]]
name = "rustyline" name = "rustyline"
version = "9.1.1" version = "9.1.1"
source = "git+https://github.com/rtfeldman/rustyline?tag=v9.1.1#7053ae0fe0ee710d38ed5845dd979113382994dc" source = "git+https://github.com/rtfeldman/rustyline?rev=e74333c#e74333c0d618896b88175bf06645108f996fe6d0"
dependencies = [ dependencies = [
"bitflags", "bitflags",
"cfg-if 1.0.0", "cfg-if 1.0.0",
@ -3950,7 +3957,7 @@ dependencies = [
[[package]] [[package]]
name = "rustyline-derive" name = "rustyline-derive"
version = "0.6.0" version = "0.6.0"
source = "git+https://github.com/rtfeldman/rustyline?tag=v9.1.1#7053ae0fe0ee710d38ed5845dd979113382994dc" source = "git+https://github.com/rtfeldman/rustyline?rev=e74333c#e74333c0d618896b88175bf06645108f996fe6d0"
dependencies = [ dependencies = [
"quote", "quote",
"syn", "syn",

View file

@ -15,6 +15,7 @@ members = [
"compiler/solve", "compiler/solve",
"compiler/fmt", "compiler/fmt",
"compiler/mono", "compiler/mono",
"compiler/alias_analysis",
"compiler/test_mono", "compiler/test_mono",
"compiler/load", "compiler/load",
"compiler/gen_llvm", "compiler/gen_llvm",
@ -38,7 +39,6 @@ members = [
"repl_eval", "repl_eval",
"repl_test", "repl_test",
"repl_wasm", "repl_wasm",
"roc_std",
"test_utils", "test_utils",
"utils", "utils",
"docs", "docs",
@ -50,6 +50,8 @@ exclude = [
# The tests will still correctly build them. # The tests will still correctly build them.
"cli_utils", "cli_utils",
"compiler/test_mono_macros", "compiler/test_mono_macros",
# `cargo build` would cause roc_std to be built with default features which errors on windows
"roc_std",
] ]
# Needed to be able to run `cargo run -p roc_cli --no-default-features` - # Needed to be able to run `cargo run -p roc_cli --no-default-features` -
# see www/build.sh for more. # see www/build.sh for more.

View file

@@ -1,4 +1,4 @@
-FROM rust:1.57.0-slim-bullseye # make sure to update nixpkgs-unstable in sources.json too so that it uses the same rust version > search for cargo on unstable here: https://search.nixos.org/packages
+FROM rust:1.58.0-slim-bullseye # make sure to update nixpkgs-unstable in sources.json too so that it uses the same rust version > search for cargo on unstable here: https://search.nixos.org/packages
 WORKDIR /earthbuild

 prep-debian:
@@ -93,7 +93,7 @@ test-rust:
     RUN --mount=type=cache,target=$SCCACHE_DIR \
        repl_test/test_wasm.sh && sccache --show-stats
     # run i386 (32-bit linux) cli tests
-    RUN echo "4" | cargo run --locked --release --features="target-x86" -- --backend=x86_32 examples/benchmarks/NQueens.roc
+    RUN echo "4" | cargo run --locked --release --features="target-x86" -- --target=x86_32 examples/benchmarks/NQueens.roc
     RUN --mount=type=cache,target=$SCCACHE_DIR \
        cargo test --locked --release --features with_sound --test cli_run i386 --features="i386-cli-run" && sccache --show-stats

View file

@@ -1623,8 +1623,7 @@ If you like, you can always annotate your functions as accepting open records. H
 always be the nicest choice. For example, let's say you have a `User` type alias, like so:

 ```coffee
-User :
-    {
+User : {
     email : Str,
     firstName : Str,
     lastName : Str,
@@ -1661,8 +1660,7 @@ Since open records have a type variable (like `*` in `{ email : Str }*` or `a` i
 type variable to the `User` type alias:

 ```coffee
-User a :
-    {
+User a : {
     email : Str,
     firstName : Str,
     lastName : Str,

View file

@@ -21,10 +21,14 @@ roc_target = { path = "../compiler/roc_target" }
 roc_error_macros = { path = "../error_macros" }
 arrayvec = "0.7.2"
 bumpalo = { version = "3.8.0", features = ["collections"] }
-libc = "0.2.106"
 page_size = "0.4.2"
 snafu = { version = "0.6.10", features = ["backtraces"] }
 ven_graph = { path = "../vendor/pathfinding" }
+libc = "0.2.106"

 [dev-dependencies]
 indoc = "1.0.3"
+
+[target.'cfg(windows)'.dependencies]
+winapi = { version = "0.3.9", features = ["memoryapi"]}

View file

@ -1,7 +1,7 @@
use bumpalo::{collections::Vec as BumpVec, Bump}; use bumpalo::{collections::Vec as BumpVec, Bump};
use roc_can::expected::{Expected, PExpected}; use roc_can::expected::{Expected, PExpected};
use roc_collections::all::{BumpMap, BumpMapDefault, Index, SendMap}; use roc_collections::all::{BumpMap, BumpMapDefault, HumanIndex, SendMap};
use roc_module::{ use roc_module::{
ident::{Lowercase, TagName}, ident::{Lowercase, TagName},
symbol::Symbol, symbol::Symbol,
@ -163,7 +163,7 @@ pub fn constrain_expr<'a>(
let elem_expected = Expected::ForReason( let elem_expected = Expected::ForReason(
Reason::ElemInList { Reason::ElemInList {
index: Index::zero_based(index), index: HumanIndex::zero_based(index),
}, },
list_elem_type.shallow_clone(), list_elem_type.shallow_clone(),
region, region,
@ -339,7 +339,7 @@ pub fn constrain_expr<'a>(
let reason = Reason::FnArg { let reason = Reason::FnArg {
name: opt_symbol, name: opt_symbol,
arg_index: Index::zero_based(index), arg_index: HumanIndex::zero_based(index),
}; };
let expected_arg = Expected::ForReason(reason, arg_type.shallow_clone(), region); let expected_arg = Expected::ForReason(reason, arg_type.shallow_clone(), region);
@ -538,7 +538,7 @@ pub fn constrain_expr<'a>(
name.clone(), name.clone(),
arity, arity,
AnnotationSource::TypedIfBranch { AnnotationSource::TypedIfBranch {
index: Index::zero_based(index), index: HumanIndex::zero_based(index),
num_branches, num_branches,
region: ann_source.region(), region: ann_source.region(),
}, },
@ -559,7 +559,7 @@ pub fn constrain_expr<'a>(
name, name,
arity, arity,
AnnotationSource::TypedIfBranch { AnnotationSource::TypedIfBranch {
index: Index::zero_based(branches.len()), index: HumanIndex::zero_based(branches.len()),
num_branches, num_branches,
region: ann_source.region(), region: ann_source.region(),
}, },
@ -596,7 +596,7 @@ pub fn constrain_expr<'a>(
body, body,
Expected::ForReason( Expected::ForReason(
Reason::IfBranch { Reason::IfBranch {
index: Index::zero_based(index), index: HumanIndex::zero_based(index),
total_branches: branches.len(), total_branches: branches.len(),
}, },
Type2::Variable(*expr_var), Type2::Variable(*expr_var),
@ -616,7 +616,7 @@ pub fn constrain_expr<'a>(
final_else_expr, final_else_expr,
Expected::ForReason( Expected::ForReason(
Reason::IfBranch { Reason::IfBranch {
index: Index::zero_based(branches.len()), index: HumanIndex::zero_based(branches.len()),
total_branches: branches.len() + 1, total_branches: branches.len() + 1,
}, },
Type2::Variable(*expr_var), Type2::Variable(*expr_var),
@ -691,7 +691,7 @@ pub fn constrain_expr<'a>(
when_branch, when_branch,
PExpected::ForReason( PExpected::ForReason(
PReason::WhenMatch { PReason::WhenMatch {
index: Index::zero_based(index), index: HumanIndex::zero_based(index),
}, },
cond_type.shallow_clone(), cond_type.shallow_clone(),
pattern_region, pattern_region,
@ -700,7 +700,7 @@ pub fn constrain_expr<'a>(
name.clone(), name.clone(),
*arity, *arity,
AnnotationSource::TypedWhenBranch { AnnotationSource::TypedWhenBranch {
index: Index::zero_based(index), index: HumanIndex::zero_based(index),
region: ann_source.region(), region: ann_source.region(),
}, },
typ.shallow_clone(), typ.shallow_clone(),
@ -733,14 +733,14 @@ pub fn constrain_expr<'a>(
when_branch, when_branch,
PExpected::ForReason( PExpected::ForReason(
PReason::WhenMatch { PReason::WhenMatch {
index: Index::zero_based(index), index: HumanIndex::zero_based(index),
}, },
cond_type.shallow_clone(), cond_type.shallow_clone(),
pattern_region, pattern_region,
), ),
Expected::ForReason( Expected::ForReason(
Reason::WhenBranch { Reason::WhenBranch {
index: Index::zero_based(index), index: HumanIndex::zero_based(index),
}, },
branch_type.shallow_clone(), branch_type.shallow_clone(),
// TODO: when_branch.value.region, // TODO: when_branch.value.region,
@ -1065,7 +1065,7 @@ pub fn constrain_expr<'a>(
let reason = Reason::LowLevelOpArg { let reason = Reason::LowLevelOpArg {
op: *op, op: *op,
arg_index: Index::zero_based(index), arg_index: HumanIndex::zero_based(index),
}; };
let expected_arg = let expected_arg =
Expected::ForReason(reason, arg_type.shallow_clone(), Region::zero()); Expected::ForReason(reason, arg_type.shallow_clone(), Region::zero());
@ -1681,7 +1681,7 @@ fn constrain_tag_pattern<'a>(
let expected = PExpected::ForReason( let expected = PExpected::ForReason(
PReason::TagArg { PReason::TagArg {
tag_name: tag_name.clone(), tag_name: tag_name.clone(),
index: Index::zero_based(index), index: HumanIndex::zero_based(index),
}, },
pattern_type, pattern_type,
region, region,

View file

@ -10,12 +10,10 @@
/// ///
/// Pages also use the node value 0 (all 0 bits) to mark nodes as unoccupied. /// Pages also use the node value 0 (all 0 bits) to mark nodes as unoccupied.
/// This is important for performance. /// This is important for performance.
use libc::{MAP_ANONYMOUS, MAP_PRIVATE, PROT_READ, PROT_WRITE};
use std::any::type_name; use std::any::type_name;
use std::ffi::c_void; use std::ffi::c_void;
use std::marker::PhantomData; use std::marker::PhantomData;
use std::mem::{align_of, size_of, MaybeUninit}; use std::mem::{align_of, size_of, MaybeUninit};
use std::ptr::null;
pub const NODE_BYTES: usize = 32; pub const NODE_BYTES: usize = 32;
@ -108,14 +106,32 @@ impl Pool {
// addresses from the OS which will be lazily translated into // addresses from the OS which will be lazily translated into
// physical memory one 4096-byte page at a time, once we actually // physical memory one 4096-byte page at a time, once we actually
// try to read or write in that page's address range. // try to read or write in that page's address range.
#[cfg(unix)]
{
use libc::{MAP_ANONYMOUS, MAP_PRIVATE, PROT_READ, PROT_WRITE};
libc::mmap( libc::mmap(
null::<c_void>() as *mut c_void, std::ptr::null_mut(),
bytes_to_mmap, bytes_to_mmap,
PROT_READ | PROT_WRITE, PROT_READ | PROT_WRITE,
MAP_PRIVATE | MAP_ANONYMOUS, MAP_PRIVATE | MAP_ANONYMOUS,
0, 0,
0, 0,
) )
}
#[cfg(windows)]
{
use winapi::um::memoryapi::VirtualAlloc;
use winapi::um::winnt::PAGE_READWRITE;
use winapi::um::winnt::{MEM_COMMIT, MEM_RESERVE};
VirtualAlloc(
std::ptr::null_mut(),
bytes_to_mmap,
MEM_COMMIT | MEM_RESERVE,
PAGE_READWRITE,
)
}
} as *mut [MaybeUninit<u8>; NODE_BYTES]; } as *mut [MaybeUninit<u8>; NODE_BYTES];
// This is our actual capacity, in nodes. // This is our actual capacity, in nodes.
@ -230,10 +246,24 @@ impl<T> std::ops::IndexMut<NodeId<T>> for Pool {
impl Drop for Pool { impl Drop for Pool {
fn drop(&mut self) { fn drop(&mut self) {
unsafe { unsafe {
#[cfg(unix)]
{
libc::munmap( libc::munmap(
self.nodes as *mut c_void, self.nodes as *mut c_void,
NODE_BYTES * self.capacity as usize, NODE_BYTES * self.capacity as usize,
); );
} }
#[cfg(windows)]
{
use winapi::um::memoryapi::VirtualFree;
use winapi::um::winnt::MEM_RELEASE;
VirtualFree(
self.nodes as *mut c_void,
NODE_BYTES * self.capacity as usize,
MEM_RELEASE,
);
}
}
} }
} }
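
The hunk above is what makes the `Pool` allocator build on Windows: the Unix `mmap`/`munmap` calls are kept behind `#[cfg(unix)]`, and `VirtualAlloc`/`VirtualFree` are used behind `#[cfg(windows)]`. A minimal, self-contained sketch of the same pattern follows; it assumes the `libc` dependency plus the `winapi = { version = "0.3.9", features = ["memoryapi"] }` target-specific dependency added in the Cargo.toml hunk earlier, and the `alloc_pages`/`free_pages` names are illustrative, not part of the commit.

```rust
use std::ffi::c_void;

/// Reserve and commit `bytes` of zeroed, page-backed memory.
/// Error handling is omitted: mmap reports failure with MAP_FAILED,
/// VirtualAlloc with a null pointer.
unsafe fn alloc_pages(bytes: usize) -> *mut c_void {
    #[cfg(unix)]
    {
        use libc::{MAP_ANONYMOUS, MAP_PRIVATE, PROT_READ, PROT_WRITE};

        libc::mmap(
            std::ptr::null_mut(),
            bytes,
            PROT_READ | PROT_WRITE,
            MAP_PRIVATE | MAP_ANONYMOUS,
            -1, // anonymous mappings take no file descriptor
            0,
        )
    }
    #[cfg(windows)]
    {
        use winapi::um::memoryapi::VirtualAlloc;
        use winapi::um::winnt::{MEM_COMMIT, MEM_RESERVE, PAGE_READWRITE};

        VirtualAlloc(
            std::ptr::null_mut(),
            bytes,
            MEM_COMMIT | MEM_RESERVE,
            PAGE_READWRITE,
        ) as *mut c_void
    }
}

/// Release a region previously returned by `alloc_pages`.
unsafe fn free_pages(ptr: *mut c_void, bytes: usize) {
    #[cfg(unix)]
    {
        libc::munmap(ptr, bytes);
    }
    #[cfg(windows)]
    {
        use winapi::um::memoryapi::VirtualFree;
        use winapi::um::winnt::MEM_RELEASE;

        // MEM_RELEASE frees the whole reservation, so the size argument must be 0.
        let _ = bytes;
        VirtualFree(ptr as *mut _, 0, MEM_RELEASE);
    }
}

fn main() {
    unsafe {
        let ptr = alloc_pages(16 * 4096);
        // Touching a byte forces the OS to back that page with physical memory.
        *(ptr as *mut u8) = 1;
        free_pages(ptr, 16 * 4096);
    }
}
```

On Unix the pages come from an anonymous private mapping; on Windows, `MEM_RESERVE | MEM_COMMIT` reserves address space and commits it in one call. Both paths hand back zeroed memory that is only faulted in as it is actually touched.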

View file

@ -1,133 +0,0 @@
use bumpalo::collections::Vec;
use bumpalo::Bump;
use roc_fmt::def::fmt_def;
use roc_fmt::module::fmt_module;
use roc_parse::ast::{Def, Module};
use roc_parse::module::module_defs;
use roc_parse::parser;
use roc_parse::parser::{Parser, SyntaxError};
use roc_region::all::Located;
use std::ffi::OsStr;
use std::path::Path;
use std::{fs, io};
#[derive(Debug)]
pub struct File<'a> {
path: &'a Path,
module_header: Module<'a>,
content: Vec<'a, Located<Def<'a>>>,
}
#[derive(Debug)]
pub enum ReadError<'a> {
Read(std::io::Error),
ParseDefs(SyntaxError<'a>),
ParseHeader(SyntaxError<'a>),
DoesntHaveRocExtension,
}
impl<'a> File<'a> {
pub fn read(path: &'a Path, arena: &'a Bump) -> Result<File<'a>, ReadError<'a>> {
if path.extension() != Some(OsStr::new("roc")) {
return Err(ReadError::DoesntHaveRocExtension);
}
let bytes = fs::read(path).map_err(ReadError::Read)?;
let allocation = arena.alloc(bytes);
let module_parse_state = parser::State::new(allocation);
let parsed_module = roc_parse::module::parse_header(arena, module_parse_state);
match parsed_module {
Ok((module, state)) => {
let parsed_defs = module_defs().parse(arena, state);
match parsed_defs {
Ok((_, defs, _)) => Ok(File {
path,
module_header: module,
content: defs,
}),
Err((_, error, _)) => Err(ReadError::ParseDefs(error)),
}
}
Err(error) => Err(ReadError::ParseHeader(SyntaxError::Header(error))),
}
}
pub fn fmt(&self) -> String {
let arena = Bump::new();
let mut formatted_file = String::new();
let mut module_header_buf = bumpalo::collections::String::new_in(&arena);
fmt_module(&mut module_header_buf, &self.module_header);
formatted_file.push_str(module_header_buf.as_str());
for def in &self.content {
let mut def_buf = bumpalo::collections::String::new_in(&arena);
fmt_def(&mut def_buf, &def.value, 0);
formatted_file.push_str(def_buf.as_str());
}
formatted_file
}
pub fn fmt_then_write_to(&self, write_path: &'a Path) -> io::Result<()> {
let formatted_file = self.fmt();
fs::write(write_path, formatted_file)
}
pub fn fmt_then_write_with_name(&self, new_name: &str) -> io::Result<()> {
self.fmt_then_write_to(
self.path
.with_file_name(new_name)
.with_extension("roc")
.as_path(),
)
}
pub fn fmt_then_write(&self) -> io::Result<()> {
self.fmt_then_write_to(self.path)
}
}
#[cfg(test)]
mod test_file {
use crate::lang::roc_file;
use bumpalo::Bump;
use std::path::Path;
#[test]
fn read_and_fmt_simple_roc_module() {
let simple_module_path = Path::new("./tests/modules/SimpleUnformatted.roc");
let arena = Bump::new();
let file = roc_file::File::read(simple_module_path, &arena)
.expect("Could not read SimpleUnformatted.roc in test_file test");
assert_eq!(
file.fmt(),
indoc!(
r#"
interface Simple
exposes [
v, x
]
imports []
v : Str
v = "Value!"
x : Int
x = 4"#
)
);
}
}

View file

@ -34,7 +34,7 @@ pub const FLAG_DEV: &str = "dev";
pub const FLAG_OPTIMIZE: &str = "optimize"; pub const FLAG_OPTIMIZE: &str = "optimize";
pub const FLAG_OPT_SIZE: &str = "opt-size"; pub const FLAG_OPT_SIZE: &str = "opt-size";
pub const FLAG_LIB: &str = "lib"; pub const FLAG_LIB: &str = "lib";
pub const FLAG_BACKEND: &str = "backend"; pub const FLAG_TARGET: &str = "target";
pub const FLAG_TIME: &str = "time"; pub const FLAG_TIME: &str = "time";
pub const FLAG_LINK: &str = "roc-linker"; pub const FLAG_LINK: &str = "roc-linker";
pub const FLAG_PRECOMPILED: &str = "precompiled-host"; pub const FLAG_PRECOMPILED: &str = "precompiled-host";
@ -42,7 +42,6 @@ pub const FLAG_VALGRIND: &str = "valgrind";
pub const FLAG_CHECK: &str = "check"; pub const FLAG_CHECK: &str = "check";
pub const ROC_FILE: &str = "ROC_FILE"; pub const ROC_FILE: &str = "ROC_FILE";
pub const ROC_DIR: &str = "ROC_DIR"; pub const ROC_DIR: &str = "ROC_DIR";
pub const BACKEND: &str = "BACKEND";
pub const DIRECTORY_OR_FILES: &str = "DIRECTORY_OR_FILES"; pub const DIRECTORY_OR_FILES: &str = "DIRECTORY_OR_FILES";
pub const ARGS_FOR_APP: &str = "ARGS_FOR_APP"; pub const ARGS_FOR_APP: &str = "ARGS_FOR_APP";
@ -76,12 +75,11 @@ pub fn build_app<'a>() -> App<'a> {
.required(false), .required(false),
) )
.arg( .arg(
Arg::new(FLAG_BACKEND) Arg::new(FLAG_TARGET)
.long(FLAG_BACKEND) .long(FLAG_TARGET)
.about("Choose a different backend") .about("Choose a different target")
// .requires(BACKEND) .default_value(Target::default().as_str())
.default_value(Backend::default().as_str()) .possible_values(Target::OPTIONS)
.possible_values(Backend::OPTIONS)
.required(false), .required(false),
) )
.arg( .arg(
@ -212,12 +210,11 @@ pub fn build_app<'a>() -> App<'a> {
.required(false), .required(false),
) )
.arg( .arg(
Arg::new(FLAG_BACKEND) Arg::new(FLAG_TARGET)
.long(FLAG_BACKEND) .long(FLAG_TARGET)
.about("Choose a different backend") .about("Choose a different target")
// .requires(BACKEND) .default_value(Target::default().as_str())
.default_value(Backend::default().as_str()) .possible_values(Target::OPTIONS)
.possible_values(Backend::OPTIONS)
.required(false), .required(false),
) )
.arg( .arg(
@ -273,12 +270,12 @@ pub fn build(matches: &ArgMatches, config: BuildConfig) -> io::Result<i32> {
use std::str::FromStr; use std::str::FromStr;
use BuildConfig::*; use BuildConfig::*;
let backend = match matches.value_of(FLAG_BACKEND) { let target = match matches.value_of(FLAG_TARGET) {
Some(name) => Backend::from_str(name).unwrap(), Some(name) => Target::from_str(name).unwrap(),
None => Backend::default(), None => Target::default(),
}; };
let target = backend.to_triple(); let triple = target.to_triple();
let arena = Bump::new(); let arena = Bump::new();
let filename = matches.value_of(ROC_FILE).unwrap(); let filename = matches.value_of(ROC_FILE).unwrap();
@ -306,10 +303,10 @@ pub fn build(matches: &ArgMatches, config: BuildConfig) -> io::Result<i32> {
let surgically_link = matches.is_present(FLAG_LINK); let surgically_link = matches.is_present(FLAG_LINK);
let precompiled = matches.is_present(FLAG_PRECOMPILED); let precompiled = matches.is_present(FLAG_PRECOMPILED);
if surgically_link && !roc_linker::supported(&link_type, &target) { if surgically_link && !roc_linker::supported(&link_type, &triple) {
panic!( panic!(
"Link type, {:?}, with target, {}, not supported by roc linker", "Link type, {:?}, with target, {}, not supported by roc linker",
link_type, target link_type, triple
); );
} }
@ -338,7 +335,7 @@ pub fn build(matches: &ArgMatches, config: BuildConfig) -> io::Result<i32> {
let target_valgrind = matches.is_present(FLAG_VALGRIND); let target_valgrind = matches.is_present(FLAG_VALGRIND);
let res_binary_path = build_file( let res_binary_path = build_file(
&arena, &arena,
&target, &triple,
src_dir, src_dir,
path, path,
opt_level, opt_level,
@ -377,7 +374,7 @@ pub fn build(matches: &ArgMatches, config: BuildConfig) -> io::Result<i32> {
Ok(outcome.status_code()) Ok(outcome.status_code())
} }
BuildAndRun { roc_file_arg_index } => { BuildAndRun { roc_file_arg_index } => {
let mut cmd = match target.architecture { let mut cmd = match triple.architecture {
Architecture::Wasm32 => { Architecture::Wasm32 => {
// If possible, report the generated executable name relative to the current dir. // If possible, report the generated executable name relative to the current dir.
let generated_filename = binary_path let generated_filename = binary_path
@ -398,7 +395,7 @@ pub fn build(matches: &ArgMatches, config: BuildConfig) -> io::Result<i32> {
_ => Command::new(&binary_path), _ => Command::new(&binary_path),
}; };
if let Architecture::Wasm32 = target.architecture { if let Architecture::Wasm32 = triple.architecture {
cmd.arg(binary_path); cmd.arg(binary_path);
} }
@ -503,43 +500,43 @@ fn run_with_wasmer(_wasm_path: &std::path::Path, _args: &[String]) {
println!("Running wasm files not support"); println!("Running wasm files not support");
} }
enum Backend { enum Target {
Host, Host,
X86_32, X86_32,
X86_64, X86_64,
Wasm32, Wasm32,
} }
impl Default for Backend { impl Default for Target {
fn default() -> Self { fn default() -> Self {
Backend::Host Target::Host
} }
} }
impl Backend { impl Target {
const fn as_str(&self) -> &'static str { const fn as_str(&self) -> &'static str {
match self { match self {
Backend::Host => "host", Target::Host => "host",
Backend::X86_32 => "x86_32", Target::X86_32 => "x86_32",
Backend::X86_64 => "x86_64", Target::X86_64 => "x86_64",
Backend::Wasm32 => "wasm32", Target::Wasm32 => "wasm32",
} }
} }
/// NOTE keep up to date! /// NOTE keep up to date!
const OPTIONS: &'static [&'static str] = &[ const OPTIONS: &'static [&'static str] = &[
Backend::Host.as_str(), Target::Host.as_str(),
Backend::X86_32.as_str(), Target::X86_32.as_str(),
Backend::X86_64.as_str(), Target::X86_64.as_str(),
Backend::Wasm32.as_str(), Target::Wasm32.as_str(),
]; ];
fn to_triple(&self) -> Triple { fn to_triple(&self) -> Triple {
let mut triple = Triple::unknown(); let mut triple = Triple::unknown();
match self { match self {
Backend::Host => Triple::host(), Target::Host => Triple::host(),
Backend::X86_32 => { Target::X86_32 => {
triple.architecture = Architecture::X86_32(X86_32Architecture::I386); triple.architecture = Architecture::X86_32(X86_32Architecture::I386);
triple.binary_format = BinaryFormat::Elf; triple.binary_format = BinaryFormat::Elf;
@ -548,13 +545,13 @@ impl Backend {
triple triple
} }
Backend::X86_64 => { Target::X86_64 => {
triple.architecture = Architecture::X86_64; triple.architecture = Architecture::X86_64;
triple.binary_format = BinaryFormat::Elf; triple.binary_format = BinaryFormat::Elf;
triple triple
} }
Backend::Wasm32 => { Target::Wasm32 => {
triple.architecture = Architecture::Wasm32; triple.architecture = Architecture::Wasm32;
triple.binary_format = BinaryFormat::Wasm; triple.binary_format = BinaryFormat::Wasm;
@ -564,21 +561,21 @@ impl Backend {
} }
} }
impl std::fmt::Display for Backend { impl std::fmt::Display for Target {
fn fmt(&self, f: &mut std::fmt::Formatter) -> std::fmt::Result { fn fmt(&self, f: &mut std::fmt::Formatter) -> std::fmt::Result {
write!(f, "{}", self.as_str()) write!(f, "{}", self.as_str())
} }
} }
impl std::str::FromStr for Backend { impl std::str::FromStr for Target {
type Err = (); type Err = ();
fn from_str(s: &str) -> Result<Self, Self::Err> { fn from_str(s: &str) -> Result<Self, Self::Err> {
match s { match s {
"host" => Ok(Backend::Host), "host" => Ok(Target::Host),
"x86_32" => Ok(Backend::X86_32), "x86_32" => Ok(Target::X86_32),
"x86_64" => Ok(Backend::X86_64), "x86_64" => Ok(Target::X86_64),
"wasm32" => Ok(Backend::Wasm32), "wasm32" => Ok(Target::Wasm32),
_ => Err(()), _ => Err(()),
} }
} }

View file

@ -194,7 +194,7 @@ mod cli_run {
) { ) {
assert_eq!(input_file, None, "Wasm does not support input files"); assert_eq!(input_file, None, "Wasm does not support input files");
let mut flags = flags.to_vec(); let mut flags = flags.to_vec();
flags.push("--backend=wasm32"); flags.push("--target=wasm32");
let compile_out = run_roc(&[&["build", file.to_str().unwrap()], flags.as_slice()].concat()); let compile_out = run_roc(&[&["build", file.to_str().unwrap()], flags.as_slice()].concat());
if !compile_out.stderr.is_empty() { if !compile_out.stderr.is_empty() {
@ -565,7 +565,7 @@ mod cli_run {
&file_name, &file_name,
benchmark.stdin, benchmark.stdin,
benchmark.executable_filename, benchmark.executable_filename,
&["--backend=x86_32"], &["--target=x86_32"],
benchmark.input_file.and_then(|file| Some(examples_dir("benchmarks").join(file))), benchmark.input_file.and_then(|file| Some(examples_dir("benchmarks").join(file))),
benchmark.expected_ending, benchmark.expected_ending,
benchmark.use_valgrind, benchmark.use_valgrind,
@ -575,7 +575,7 @@ mod cli_run {
&file_name, &file_name,
benchmark.stdin, benchmark.stdin,
benchmark.executable_filename, benchmark.executable_filename,
&["--backend=x86_32", "--optimize"], &["--target=x86_32", "--optimize"],
benchmark.input_file.and_then(|file| Some(examples_dir("benchmarks").join(file))), benchmark.input_file.and_then(|file| Some(examples_dir("benchmarks").join(file))),
benchmark.expected_ending, benchmark.expected_ending,
benchmark.use_valgrind, benchmark.use_valgrind,

View file

@@ -20,4 +20,6 @@ serde = { version = "1.0.130", features = ["derive"] }
 serde-xml-rs = "0.5.1"
 strip-ansi-escapes = "0.1.1"
 tempfile = "3.2.0"
+
+[target.'cfg(unix)'.dependencies]
 rlimit = "0.6.2"

View file

@ -1,11 +1,12 @@
use crate::helpers::{example_file, run_cmd, run_roc}; use crate::helpers::{example_file, run_cmd, run_roc};
use criterion::{black_box, measurement::Measurement, BenchmarkGroup}; use criterion::{black_box, measurement::Measurement, BenchmarkGroup};
use rlimit::{setrlimit, Resource}; use std::{path::Path, thread};
use std::path::Path;
const CFOLD_STACK_SIZE: usize = 8192 * 100000;
fn exec_bench_w_input<T: Measurement>( fn exec_bench_w_input<T: Measurement>(
file: &Path, file: &Path,
stdin_str: &str, stdin_str: &'static str,
executable_filename: &str, executable_filename: &str,
expected_ending: &str, expected_ending: &str,
bench_group_opt: Option<&mut BenchmarkGroup<T>>, bench_group_opt: Option<&mut BenchmarkGroup<T>>,
@ -31,7 +32,7 @@ fn exec_bench_w_input<T: Measurement>(
fn check_cmd_output( fn check_cmd_output(
file: &Path, file: &Path,
stdin_str: &str, stdin_str: &'static str,
executable_filename: &str, executable_filename: &str,
expected_ending: &str, expected_ending: &str,
) { ) {
@ -41,11 +42,16 @@ fn check_cmd_output(
.unwrap() .unwrap()
.to_string(); .to_string();
if cmd_str.contains("cfold") { let out = if cmd_str.contains("cfold") {
increase_stack_limit(); let child = thread::Builder::new()
} .stack_size(CFOLD_STACK_SIZE)
.spawn(move || run_cmd(&cmd_str, &[stdin_str], &[]))
.unwrap();
let out = run_cmd(&cmd_str, &[stdin_str], &[]); child.join().unwrap()
} else {
run_cmd(&cmd_str, &[stdin_str], &[])
};
if !&out.stdout.ends_with(expected_ending) { if !&out.stdout.ends_with(expected_ending) {
panic!( panic!(
@ -69,7 +75,20 @@ fn bench_cmd<T: Measurement>(
.to_string(); .to_string();
if cmd_str.contains("cfold") { if cmd_str.contains("cfold") {
increase_stack_limit(); #[cfg(unix)]
use rlimit::{setrlimit, Resource};
#[cfg(unix)]
setrlimit(
Resource::STACK,
CFOLD_STACK_SIZE as u64,
CFOLD_STACK_SIZE as u64,
)
.expect("Failed to increase stack limit.");
#[cfg(windows)]
println!("Skipping the cfold benchmark on windows, I can't adjust the stack size and use criterion at the same time.");
#[cfg(windows)]
return;
} }
if let Some(bench_group) = bench_group_opt { if let Some(bench_group) = bench_group_opt {
@ -85,12 +104,6 @@ fn bench_cmd<T: Measurement>(
} }
} }
fn increase_stack_limit() {
let new_stack_limit = 8192 * 100000;
setrlimit(Resource::STACK, new_stack_limit, new_stack_limit)
.expect("Failed to increase stack limit.");
}
pub fn bench_nqueens<T: Measurement>(bench_group_opt: Option<&mut BenchmarkGroup<T>>) { pub fn bench_nqueens<T: Measurement>(bench_group_opt: Option<&mut BenchmarkGroup<T>>) {
exec_bench_w_input( exec_bench_w_input(
&example_file("benchmarks", "NQueens.roc"), &example_file("benchmarks", "NQueens.roc"),
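
The `check_cmd_output` change above sidesteps the old Unix-only `setrlimit` approach for the deeply recursive cfold benchmark: the command is run on a thread spawned with an explicit `CFOLD_STACK_SIZE` stack, which works on Windows as well (the criterion path in `bench_cmd` still uses `setrlimit` behind `#[cfg(unix)]` and skips cfold on Windows). A small sketch of the thread-with-a-larger-stack pattern, where `run_deep_recursion` is a stand-in for the real `run_cmd` call:

```rust
use std::thread;

// Matches CFOLD_STACK_SIZE in the diff above: 8192 * 100000 bytes.
const STACK_SIZE: usize = 8192 * 100_000;

fn main() {
    // Stand-in for the real work (running the cfold benchmark executable).
    let run_deep_recursion = || (0u64..1_000_000).sum::<u64>();

    // A dedicated thread gets its stack size from the spawner, so no
    // process-wide rlimit tweaking is needed and the code stays portable.
    let child = thread::Builder::new()
        .stack_size(STACK_SIZE)
        .spawn(run_deep_recursion)
        .expect("failed to spawn benchmark thread");

    let result = child.join().expect("benchmark thread panicked");
    println!("{}", result);
}
```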

View file

@ -0,0 +1,13 @@
[package]
authors = ["The Roc Contributors"]
edition = "2018"
license = "UPL-1.0"
name = "roc_alias_analysis"
version = "0.1.0"
[dependencies]
morphic_lib = {path = "../../vendor/morphic_lib"}
roc_collections = {path = "../collections"}
roc_module = {path = "../module"}
roc_mono = {path = "../mono"}

View file

@ -8,11 +8,11 @@ use roc_collections::all::{MutMap, MutSet};
use roc_module::low_level::LowLevel; use roc_module::low_level::LowLevel;
use roc_module::symbol::Symbol; use roc_module::symbol::Symbol;
use crate::ir::{ use roc_mono::ir::{
Call, CallType, Expr, HigherOrderLowLevel, HostExposedLayouts, ListLiteralElement, Literal, Call, CallType, Expr, HigherOrderLowLevel, HostExposedLayouts, ListLiteralElement, Literal,
ModifyRc, OptLevel, Proc, Stmt, ModifyRc, OptLevel, Proc, Stmt,
}; };
use crate::layout::{Builtin, Layout, RawFunctionLayout, UnionLayout}; use roc_mono::layout::{Builtin, Layout, RawFunctionLayout, UnionLayout};
// just using one module for now // just using one module for now
pub const MOD_APP: ModName = ModName(b"UserApp"); pub const MOD_APP: ModName = ModName(b"UserApp");
@ -110,7 +110,7 @@ fn bytes_as_ascii(bytes: &[u8]) -> String {
pub fn spec_program<'a, I>( pub fn spec_program<'a, I>(
opt_level: OptLevel, opt_level: OptLevel,
entry_point: crate::ir::EntryPoint<'a>, entry_point: roc_mono::ir::EntryPoint<'a>,
procs: I, procs: I,
) -> Result<morphic_lib::Solutions> ) -> Result<morphic_lib::Solutions>
where where
@ -266,7 +266,7 @@ fn terrible_hack(builder: &mut FuncDefBuilder, block: BlockId, type_id: TypeId)
} }
fn build_entry_point( fn build_entry_point(
layout: crate::ir::ProcLayout, layout: roc_mono::ir::ProcLayout,
func_name: FuncName, func_name: FuncName,
host_exposed_functions: &[([u8; SIZE], &[Layout])], host_exposed_functions: &[([u8; SIZE], &[Layout])],
) -> Result<FuncDef> { ) -> Result<FuncDef> {
@ -363,7 +363,7 @@ fn proc_spec<'a>(proc: &Proc<'a>) -> Result<(FuncDef, MutSet<UnionLayout<'a>>)>
#[derive(Default)] #[derive(Default)]
struct Env<'a> { struct Env<'a> {
symbols: MutMap<Symbol, ValueId>, symbols: MutMap<Symbol, ValueId>,
join_points: MutMap<crate::ir::JoinPointId, morphic_lib::ContinuationId>, join_points: MutMap<roc_mono::ir::JoinPointId, morphic_lib::ContinuationId>,
type_names: MutSet<UnionLayout<'a>>, type_names: MutSet<UnionLayout<'a>>,
} }
@ -711,7 +711,7 @@ fn call_spec(
passed_function, passed_function,
.. ..
}) => { }) => {
use crate::low_level::HigherOrder::*; use roc_mono::low_level::HigherOrder::*;
let array = passed_function.specialization_id.to_bytes(); let array = passed_function.specialization_id.to_bytes();
let spec_var = CalleeSpecVar(&array); let spec_var = CalleeSpecVar(&array);
@ -1196,7 +1196,7 @@ fn lowlevel_spec(
block: BlockId, block: BlockId,
layout: &Layout, layout: &Layout,
op: &LowLevel, op: &LowLevel,
update_mode: crate::ir::UpdateModeId, update_mode: roc_mono::ir::UpdateModeId,
arguments: &[Symbol], arguments: &[Symbol],
) -> Result<ValueId> { ) -> Result<ValueId> {
use LowLevel::*; use LowLevel::*;
@ -1258,22 +1258,21 @@ fn lowlevel_spec(
builder.add_bag_get(block, bag) builder.add_bag_get(block, bag)
} }
ListSet => { ListReplaceUnsafe => {
let list = env.symbols[&arguments[0]]; let list = env.symbols[&arguments[0]];
let to_insert = env.symbols[&arguments[2]]; let to_insert = env.symbols[&arguments[2]];
let bag = builder.add_get_tuple_field(block, list, LIST_BAG_INDEX)?; let bag = builder.add_get_tuple_field(block, list, LIST_BAG_INDEX)?;
let cell = builder.add_get_tuple_field(block, list, LIST_CELL_INDEX)?; let cell = builder.add_get_tuple_field(block, list, LIST_CELL_INDEX)?;
// decrement the overwritten element let _unit1 = builder.add_touch(block, cell)?;
let overwritten = builder.add_bag_get(block, bag)?; let _unit2 = builder.add_update(block, update_mode_var, cell)?;
let _unit = builder.add_recursive_touch(block, overwritten)?;
let _unit = builder.add_update(block, update_mode_var, cell)?;
builder.add_bag_insert(block, bag, to_insert)?; builder.add_bag_insert(block, bag, to_insert)?;
with_new_heap_cell(builder, block, bag) let old_value = builder.add_bag_get(block, bag)?;
let new_list = with_new_heap_cell(builder, block, bag)?;
builder.add_make_tuple(block, &[new_list, old_value])
} }
ListSwap => { ListSwap => {
let list = env.symbols[&arguments[0]]; let list = env.symbols[&arguments[0]];

View file

@ -46,6 +46,10 @@ pub fn link(
operating_system: OperatingSystem::Darwin, operating_system: OperatingSystem::Darwin,
.. ..
} => link_macos(target, output_path, input_paths, link_type), } => link_macos(target, output_path, input_paths, link_type),
Triple {
operating_system: OperatingSystem::Windows,
..
} => link_windows(target, output_path, input_paths, link_type),
_ => panic!("TODO gracefully handle unsupported target: {:?}", target), _ => panic!("TODO gracefully handle unsupported target: {:?}", target),
} }
} }
@ -1049,6 +1053,15 @@ fn link_wasm32(
Ok((child, output_path)) Ok((child, output_path))
} }
fn link_windows(
_target: &Triple,
_output_path: PathBuf,
_input_paths: &[&str],
_link_type: LinkType,
) -> io::Result<(Child, PathBuf)> {
todo!("Add windows support to the surgical linker. See issue #2608.")
}
#[cfg(feature = "llvm")] #[cfg(feature = "llvm")]
pub fn module_to_dylib( pub fn module_to_dylib(
module: &inkwell::module::Module, module: &inkwell::module::Module,

View file

@ -41,6 +41,11 @@ pub fn target_triple_str(target: &Triple) -> &'static str {
operating_system: OperatingSystem::Darwin, operating_system: OperatingSystem::Darwin,
.. ..
} => "x86_64-unknown-darwin10", } => "x86_64-unknown-darwin10",
Triple {
architecture: Architecture::X86_64,
operating_system: OperatingSystem::Windows,
..
} => "x86_64-pc-windows-gnu",
_ => panic!("TODO gracefully handle unsupported target: {:?}", target), _ => panic!("TODO gracefully handle unsupported target: {:?}", target),
} }
} }

View file

@@ -7,7 +7,7 @@ To add a builtin:
 2. Make sure the function is public with the `pub` keyword and uses the C calling convention. This is really easy, just add `pub` and `callconv(.C)` to the function declaration like so: `pub fn atan(num: f64) callconv(.C) f64 { ... }`
 3. In `src/main.zig`, export the function. This is also organized by module. For example, for a `Num` function find the `Num` section and add: `comptime { exportNumFn(num.atan, "atan"); }`. The first argument is the function, the second is the name of it in LLVM.
 4. In `compiler/builtins/src/bitcode.rs`, add a constant for the new function. This is how we use it in Rust. Once again, this is organized by module, so just find the relevant area and add your new function.
-5. You can now your function in Rust using `call_bitcode_fn` in `llvm/src/build.rs`!
+5. You can now use your function in Rust using `call_bitcode_fn` in `llvm/src/build.rs`!
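
For step 4, the constant is just the Rust-side name of the symbol exported from Zig. An illustrative entry for the `atan` example above, following the `roc_builtins.<module>.<name>` pattern visible in the `bitcode.rs` hunk further down (the exact constant is an assumption, not taken from this commit):

```rust
// compiler/builtins/src/bitcode.rs (hypothetical entry for the atan example)
pub const NUM_ATAN: &str = "roc_builtins.num.atan";
```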
 ## How it works

View file

@@ -26,21 +26,19 @@ pub const RocDec = extern struct {
         return .{ .num = num * one_point_zero_i128 };
     }

-    // TODO: There's got to be a better way to do this other than converting to Str
     pub fn fromF64(num: f64) ?RocDec {
-        var digit_bytes: [19]u8 = undefined; // 19 = max f64 digits + '.' + '-'
-        var fbs = std.io.fixedBufferStream(digit_bytes[0..]);
-        std.fmt.formatFloatDecimal(num, .{}, fbs.writer()) catch
-            return null;
-        var dec = RocDec.fromStr(RocStr.init(&digit_bytes, fbs.pos));
-        if (dec) |d| {
-            return d;
-        } else {
-            return null;
-        }
+        var result: f64 = num * comptime @intToFloat(f64, one_point_zero_i128);
+
+        if (result > comptime @intToFloat(f64, math.maxInt(i128))) {
+            return null;
+        }
+
+        if (result < comptime @intToFloat(f64, math.minInt(i128))) {
+            return null;
+        }
+
+        var ret: RocDec = .{ .num = @floatToInt(i128, result) };
+        return ret;
     }

     pub fn fromStr(roc_str: RocStr) ?RocDec {
@@ -729,6 +727,11 @@ test "fromF64" {
     try expectEqual(RocDec{ .num = 25500000000000000000 }, dec.?);
 }

+test "fromF64 overflow" {
+    var dec = RocDec.fromF64(1e308);
+    try expectEqual(dec, null);
+}
+
 test "fromStr: empty" {
     var roc_str = RocStr.init("", 0);
     var dec = RocDec.fromStr(roc_str);
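
The new `fromF64` above replaces the old format-to-a-string round trip with a direct scale-and-range check. The same idea expressed in Rust, under the assumption that the Dec scale factor is 10^18 (18 decimal places); `dec_from_f64` is an illustrative name, not the commit's API:

```rust
// A Dec value is an i128 scaled by 10^18, i.e. 18 fractional decimal digits (assumption).
const ONE_POINT_ZERO: i128 = 1_000_000_000_000_000_000;

fn dec_from_f64(num: f64) -> Option<i128> {
    let result = num * ONE_POINT_ZERO as f64;

    // If the scaled value falls outside the i128 range, the conversion would overflow.
    if result > i128::MAX as f64 || result < i128::MIN as f64 {
        return None;
    }

    Some(result as i128)
}

fn main() {
    // Mirrors the existing "fromF64" test value and the new overflow test above.
    assert_eq!(dec_from_f64(25.5), Some(25_500_000_000_000_000_000));
    assert_eq!(dec_from_f64(1e308), None);
}
```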

View file

@ -1256,95 +1256,56 @@ pub fn listConcat(list_a: RocList, list_b: RocList, alignment: u32, element_widt
return output; return output;
} }
pub fn listSetInPlace( pub fn listReplaceInPlace(
bytes: ?[*]u8, list: RocList,
index: usize, index: usize,
element: Opaque, element: Opaque,
element_width: usize, element_width: usize,
dec: Dec, out_element: ?[*]u8,
) callconv(.C) ?[*]u8 { ) callconv(.C) RocList {
// INVARIANT: bounds checking happens on the roc side // INVARIANT: bounds checking happens on the roc side
// //
// at the time of writing, the function is implemented roughly as // at the time of writing, the function is implemented roughly as
// `if inBounds then LowLevelListGet input index item else input` // `if inBounds then LowLevelListReplace input index item else input`
// so we don't do a bounds check here. Hence, the list is also non-empty, // so we don't do a bounds check here. Hence, the list is also non-empty,
// because inserting into an empty list is always out of bounds // because inserting into an empty list is always out of bounds
return listReplaceInPlaceHelp(list, index, element, element_width, out_element);
return listSetInPlaceHelp(bytes, index, element, element_width, dec);
} }
pub fn listSet( pub fn listReplace(
bytes: ?[*]u8, list: RocList,
length: usize,
alignment: u32, alignment: u32,
index: usize, index: usize,
element: Opaque, element: Opaque,
element_width: usize, element_width: usize,
dec: Dec, out_element: ?[*]u8,
) callconv(.C) ?[*]u8 { ) callconv(.C) RocList {
// INVARIANT: bounds checking happens on the roc side // INVARIANT: bounds checking happens on the roc side
// //
// at the time of writing, the function is implemented roughly as // at the time of writing, the function is implemented roughly as
// `if inBounds then LowLevelListGet input index item else input` // `if inBounds then LowLevelListReplace input index item else input`
// so we don't do a bounds check here. Hence, the list is also non-empty, // so we don't do a bounds check here. Hence, the list is also non-empty,
// because inserting into an empty list is always out of bounds // because inserting into an empty list is always out of bounds
const ptr: [*]usize = @ptrCast([*]usize, @alignCast(@alignOf(usize), bytes)); return listReplaceInPlaceHelp(list.makeUnique(alignment, element_width), index, element, element_width, out_element);
if ((ptr - 1)[0] == utils.REFCOUNT_ONE) {
return listSetInPlaceHelp(bytes, index, element, element_width, dec);
} else {
return listSetImmutable(bytes, length, alignment, index, element, element_width, dec);
}
} }
inline fn listSetInPlaceHelp( inline fn listReplaceInPlaceHelp(
bytes: ?[*]u8, list: RocList,
index: usize, index: usize,
element: Opaque, element: Opaque,
element_width: usize, element_width: usize,
dec: Dec, out_element: ?[*]u8,
) ?[*]u8 { ) RocList {
// the element we will replace // the element we will replace
var element_at_index = (bytes orelse undefined) + (index * element_width); var element_at_index = (list.bytes orelse undefined) + (index * element_width);
// decrement its refcount // copy out the old element
dec(element_at_index); @memcpy(out_element orelse undefined, element_at_index, element_width);
// copy in the new element // copy in the new element
@memcpy(element_at_index, element orelse undefined, element_width); @memcpy(element_at_index, element orelse undefined, element_width);
return bytes; return list;
}
inline fn listSetImmutable(
old_bytes: ?[*]u8,
length: usize,
alignment: u32,
index: usize,
element: Opaque,
element_width: usize,
dec: Dec,
) ?[*]u8 {
const data_bytes = length * element_width;
var new_bytes = utils.allocateWithRefcount(data_bytes, alignment);
@memcpy(new_bytes, old_bytes orelse undefined, data_bytes);
// the element we will replace
var element_at_index = new_bytes + (index * element_width);
// decrement its refcount
dec(element_at_index);
// copy in the new element
@memcpy(element_at_index, element orelse undefined, element_width);
// consume RC token of original
utils.decref(old_bytes, data_bytes, alignment);
//return list;
return new_bytes;
} }
pub fn listFindUnsafe( pub fn listFindUnsafe(

View file

@@ -49,8 +49,8 @@ comptime {
     exportListFn(list.listConcat, "concat");
     exportListFn(list.listSublist, "sublist");
     exportListFn(list.listDropAt, "drop_at");
-    exportListFn(list.listSet, "set");
-    exportListFn(list.listSetInPlace, "set_in_place");
+    exportListFn(list.listReplace, "replace");
+    exportListFn(list.listReplaceInPlace, "replace_in_place");
     exportListFn(list.listSwap, "swap");
     exportListFn(list.listAny, "any");
     exportListFn(list.listAll, "all");

View file

@@ -50,12 +50,17 @@ fn main() {
     );

     // OBJECT FILES
+    #[cfg(windows)]
+    const BUILTINS_HOST_FILE: &str = "builtins-host.obj";
+    #[cfg(not(windows))]
+    const BUILTINS_HOST_FILE: &str = "builtins-host.o";
+
     generate_object_file(
         &bitcode_path,
         "BUILTINS_HOST_O",
         "object",
-        "builtins-host.o",
+        BUILTINS_HOST_FILE,
     );

     generate_object_file(
@@ -104,7 +109,7 @@ fn generate_object_file(
     println!("Moving zig object `{}` to: {}", zig_object, dest_obj);

     // we store this .o file in rust's `target` folder (for wasm we need to leave a copy here too)
-    run_command(&bitcode_path, "cp", &[src_obj, dest_obj]);
+    fs::copy(src_obj, dest_obj).expect("Failed to copy object file.");
 }

 fn generate_bc_file(

View file

@@ -354,8 +354,8 @@ pub const LIST_RANGE: &str = "roc_builtins.list.range";
 pub const LIST_REVERSE: &str = "roc_builtins.list.reverse";
 pub const LIST_SORT_WITH: &str = "roc_builtins.list.sort_with";
 pub const LIST_CONCAT: &str = "roc_builtins.list.concat";
-pub const LIST_SET: &str = "roc_builtins.list.set";
-pub const LIST_SET_IN_PLACE: &str = "roc_builtins.list.set_in_place";
+pub const LIST_REPLACE: &str = "roc_builtins.list.replace";
+pub const LIST_REPLACE_IN_PLACE: &str = "roc_builtins.list.replace_in_place";
 pub const LIST_ANY: &str = "roc_builtins.list.any";
 pub const LIST_ALL: &str = "roc_builtins.list.all";
 pub const LIST_FIND_UNSAFE: &str = "roc_builtins.list.find_unsafe";

View file

@ -1056,6 +1056,19 @@ pub fn types() -> MutMap<Symbol, (SolvedType, Region)> {
Box::new(result_type(flex(TVAR1), list_was_empty.clone())), Box::new(result_type(flex(TVAR1), list_was_empty.clone())),
); );
// replace : List elem, Nat, elem -> { list: List elem, value: elem }
add_top_level_function_type!(
Symbol::LIST_REPLACE,
vec![list_type(flex(TVAR1)), nat_type(), flex(TVAR1)],
Box::new(SolvedType::Record {
fields: vec![
("list".into(), RecordField::Required(list_type(flex(TVAR1)))),
("value".into(), RecordField::Required(flex(TVAR1))),
],
ext: Box::new(SolvedType::EmptyRecord),
}),
);
// set : List elem, Nat, elem -> List elem // set : List elem, Nat, elem -> List elem
add_top_level_function_type!( add_top_level_function_type!(
Symbol::LIST_SET, Symbol::LIST_SET,

View file

@@ -16,6 +16,7 @@ roc_types = { path = "../types" }
 roc_builtins = { path = "../builtins" }
 ven_graph = { path = "../../vendor/pathfinding" }
 bumpalo = { version = "3.8.0", features = ["collections"] }
+static_assertions = "1.1.0"

 [dev-dependencies]
 pretty_assertions = "1.0.0"

View file

@ -29,6 +29,8 @@ pub struct IntroducedVariables {
// but a variable can only have one name. Therefore // but a variable can only have one name. Therefore
// `ftv : SendMap<Variable, Lowercase>`. // `ftv : SendMap<Variable, Lowercase>`.
pub wildcards: Vec<Variable>, pub wildcards: Vec<Variable>,
pub lambda_sets: Vec<Variable>,
pub inferred: Vec<Variable>,
pub var_by_name: SendMap<Lowercase, Variable>, pub var_by_name: SendMap<Lowercase, Variable>,
pub name_by_var: SendMap<Variable, Lowercase>, pub name_by_var: SendMap<Variable, Lowercase>,
pub host_exposed_aliases: MutMap<Symbol, Variable>, pub host_exposed_aliases: MutMap<Symbol, Variable>,
@ -44,12 +46,22 @@ impl IntroducedVariables {
self.wildcards.push(var); self.wildcards.push(var);
} }
pub fn insert_inferred(&mut self, var: Variable) {
self.inferred.push(var);
}
fn insert_lambda_set(&mut self, var: Variable) {
self.lambda_sets.push(var);
}
pub fn insert_host_exposed_alias(&mut self, symbol: Symbol, var: Variable) { pub fn insert_host_exposed_alias(&mut self, symbol: Symbol, var: Variable) {
self.host_exposed_aliases.insert(symbol, var); self.host_exposed_aliases.insert(symbol, var);
} }
pub fn union(&mut self, other: &Self) { pub fn union(&mut self, other: &Self) {
self.wildcards.extend(other.wildcards.iter().cloned()); self.wildcards.extend(other.wildcards.iter().cloned());
self.lambda_sets.extend(other.lambda_sets.iter().cloned());
self.inferred.extend(other.inferred.iter().cloned());
self.var_by_name.extend(other.var_by_name.clone()); self.var_by_name.extend(other.var_by_name.clone());
self.name_by_var.extend(other.name_by_var.clone()); self.name_by_var.extend(other.name_by_var.clone());
self.host_exposed_aliases self.host_exposed_aliases
@ -280,7 +292,9 @@ fn can_annotation_help(
references, references,
); );
let closure = Type::Variable(var_store.fresh()); let lambda_set = var_store.fresh();
introduced_variables.insert_lambda_set(lambda_set);
let closure = Type::Variable(lambda_set);
Type::Function(args, Box::new(closure), Box::new(ret)) Type::Function(args, Box::new(closure), Box::new(ret))
} }
@ -326,6 +340,7 @@ fn can_annotation_help(
let (type_arguments, lambda_set_variables, actual) = let (type_arguments, lambda_set_variables, actual) =
instantiate_and_freshen_alias_type( instantiate_and_freshen_alias_type(
var_store, var_store,
introduced_variables,
&alias.type_variables, &alias.type_variables,
args, args,
&alias.lambda_set_variables, &alias.lambda_set_variables,
@ -612,6 +627,9 @@ fn can_annotation_help(
// Inference variables aren't bound to a rigid or a wildcard, so all we have to do is // Inference variables aren't bound to a rigid or a wildcard, so all we have to do is
// make a fresh unconstrained variable, and let the type solver fill it in for us 🤠 // make a fresh unconstrained variable, and let the type solver fill it in for us 🤠
let var = var_store.fresh(); let var = var_store.fresh();
introduced_variables.insert_inferred(var);
Type::Variable(var) Type::Variable(var)
} }
Malformed(string) => { Malformed(string) => {
@ -628,6 +646,7 @@ fn can_annotation_help(
pub fn instantiate_and_freshen_alias_type( pub fn instantiate_and_freshen_alias_type(
var_store: &mut VarStore, var_store: &mut VarStore,
introduced_variables: &mut IntroducedVariables,
type_variables: &[Loc<(Lowercase, Variable)>], type_variables: &[Loc<(Lowercase, Variable)>],
type_arguments: Vec<Type>, type_arguments: Vec<Type>,
lambda_set_variables: &[LambdaSet], lambda_set_variables: &[LambdaSet],
@ -657,6 +676,7 @@ pub fn instantiate_and_freshen_alias_type(
if let Type::Variable(var) = typ.0 { if let Type::Variable(var) = typ.0 {
let fresh = var_store.fresh(); let fresh = var_store.fresh();
substitutions.insert(var, Type::Variable(fresh)); substitutions.insert(var, Type::Variable(fresh));
introduced_variables.insert_lambda_set(fresh);
new_lambda_set_variables.push(LambdaSet(Type::Variable(fresh))); new_lambda_set_variables.push(LambdaSet(Type::Variable(fresh)));
} else { } else {
unreachable!("at this point there should be only vars in there"); unreachable!("at this point there should be only vars in there");
@ -681,8 +701,12 @@ pub fn freshen_opaque_def(
.map(|_| Type::Variable(var_store.fresh())) .map(|_| Type::Variable(var_store.fresh()))
.collect(); .collect();
// TODO this gets ignored; is that a problem
let mut introduced_variables = IntroducedVariables::default();
instantiate_and_freshen_alias_type( instantiate_and_freshen_alias_type(
var_store, var_store,
&mut introduced_variables,
&opaque.type_variables, &opaque.type_variables,
fresh_arguments, fresh_arguments,
&opaque.lambda_set_variables, &opaque.lambda_set_variables,

View file

@ -57,6 +57,7 @@ pub fn builtin_dependencies(symbol: Symbol) -> &'static [Symbol] {
Symbol::LIST_PRODUCT => &[Symbol::LIST_WALK, Symbol::NUM_MUL], Symbol::LIST_PRODUCT => &[Symbol::LIST_WALK, Symbol::NUM_MUL],
Symbol::LIST_SUM => &[Symbol::LIST_WALK, Symbol::NUM_ADD], Symbol::LIST_SUM => &[Symbol::LIST_WALK, Symbol::NUM_ADD],
Symbol::LIST_JOIN_MAP => &[Symbol::LIST_WALK, Symbol::LIST_CONCAT], Symbol::LIST_JOIN_MAP => &[Symbol::LIST_WALK, Symbol::LIST_CONCAT],
Symbol::LIST_SET => &[Symbol::LIST_REPLACE],
_ => &[], _ => &[],
} }
} }
@ -102,6 +103,7 @@ pub fn builtin_defs_map(symbol: Symbol, var_store: &mut VarStore) -> Option<Def>
STR_TO_I8 => str_to_num, STR_TO_I8 => str_to_num,
LIST_LEN => list_len, LIST_LEN => list_len,
LIST_GET => list_get, LIST_GET => list_get,
LIST_REPLACE => list_replace,
LIST_SET => list_set, LIST_SET => list_set,
LIST_APPEND => list_append, LIST_APPEND => list_append,
LIST_FIRST => list_first, LIST_FIRST => list_first,
@ -2304,6 +2306,91 @@ fn list_get(symbol: Symbol, var_store: &mut VarStore) -> Def {
) )
} }
/// List.replace : List elem, Nat, elem -> { list: List elem, value: elem }
fn list_replace(symbol: Symbol, var_store: &mut VarStore) -> Def {
let arg_list = Symbol::ARG_1;
let arg_index = Symbol::ARG_2;
let arg_elem = Symbol::ARG_3;
let bool_var = var_store.fresh();
let len_var = var_store.fresh();
let elem_var = var_store.fresh();
let list_arg_var = var_store.fresh();
let ret_record_var = var_store.fresh();
let ret_result_var = var_store.fresh();
let list_field = Field {
var: list_arg_var,
region: Region::zero(),
loc_expr: Box::new(Loc::at_zero(Expr::Var(arg_list))),
};
let value_field = Field {
var: elem_var,
region: Region::zero(),
loc_expr: Box::new(Loc::at_zero(Expr::Var(arg_elem))),
};
// Perform a bounds check. If it passes, run LowLevel::ListReplaceUnsafe.
// Otherwise, return a record with the unmodified list and the given element as `value`.
let body = If {
cond_var: bool_var,
branch_var: ret_result_var,
branches: vec![(
// if-condition
no_region(
// index < List.len list
RunLowLevel {
op: LowLevel::NumLt,
args: vec![
(len_var, Var(arg_index)),
(
len_var,
RunLowLevel {
op: LowLevel::ListLen,
args: vec![(list_arg_var, Var(arg_list))],
ret_var: len_var,
},
),
],
ret_var: bool_var,
},
),
// then-branch
no_region(
// List.replaceUnsafe list index elem
RunLowLevel {
op: LowLevel::ListReplaceUnsafe,
args: vec![
(list_arg_var, Var(arg_list)),
(len_var, Var(arg_index)),
(elem_var, Var(arg_elem)),
],
ret_var: ret_record_var,
},
),
)],
final_else: Box::new(
// else-branch
no_region(record(
vec![("list".into(), list_field), ("value".into(), value_field)],
var_store,
)),
),
};
defn(
symbol,
vec![
(list_arg_var, Symbol::ARG_1),
(len_var, Symbol::ARG_2),
(elem_var, Symbol::ARG_3),
],
var_store,
body,
ret_result_var,
)
}
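The builtin above is assembled as canonical AST, which can be hard to read at a glance. As a rough guide, the following plain-Rust sketch mirrors the behavior it encodes (the names `Replaced` and `list_replace` are illustrative, not part of the compiler): a bounds check, an in-place swap when the index is valid, and the inputs handed back untouched otherwise.

// Illustrative sketch only: the behavior the canonical AST above encodes.
struct Replaced<T> {
    list: Vec<T>,
    value: T,
}

fn list_replace<T>(mut list: Vec<T>, index: usize, mut value: T) -> Replaced<T> {
    if index < list.len() {
        // In-bounds: swap the new element in and hand the displaced one back.
        std::mem::swap(&mut list[index], &mut value);
    }
    // Out of bounds: the list and the given element come back unchanged.
    Replaced { list, value }
}

For instance, `list_replace(vec![1, 2, 3], 1, 9)` would yield list `[1, 9, 3]` and value `2`, while an out-of-range index returns both arguments unchanged.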
/// List.set : List elem, Nat, elem -> List elem /// List.set : List elem, Nat, elem -> List elem
/// ///
/// List.set : /// List.set :
@@ -2318,9 +2405,27 @@ fn list_set(symbol: Symbol, var_store: &mut VarStore) -> Def {
let bool_var = var_store.fresh(); let bool_var = var_store.fresh();
let len_var = var_store.fresh(); let len_var = var_store.fresh();
let elem_var = var_store.fresh(); let elem_var = var_store.fresh();
let replace_record_var = var_store.fresh();
let list_arg_var = var_store.fresh(); // Uniqueness type Attr differs between let list_arg_var = var_store.fresh(); // Uniqueness type Attr differs between
let list_ret_var = var_store.fresh(); // the arg list and the returned list let list_ret_var = var_store.fresh(); // the arg list and the returned list
let replace_function = (
var_store.fresh(),
Loc::at_zero(Expr::Var(Symbol::LIST_REPLACE)),
var_store.fresh(),
replace_record_var,
);
let replace_call = Expr::Call(
Box::new(replace_function),
vec![
(list_arg_var, Loc::at_zero(Var(arg_list))),
(len_var, Loc::at_zero(Var(arg_index))),
(elem_var, Loc::at_zero(Var(arg_elem))),
],
CalledVia::Space,
);
// Perform a bounds check. If it passes, run LowLevel::ListSet. // Perform a bounds check. If it passes, call List.replace and keep its `list` field.
// Otherwise, return the list unmodified. // Otherwise, return the list unmodified.
let body = If { let body = If {
@@ -2347,18 +2452,16 @@ fn list_set(symbol: Symbol, var_store: &mut VarStore) -> Def {
}, },
), ),
// then-branch // then-branch
no_region( no_region(Access {
// List.setUnsafe list index record_var: replace_record_var,
RunLowLevel { ext_var: var_store.fresh(),
op: LowLevel::ListSet, field_var: list_ret_var,
args: vec![ loc_expr: Box::new(no_region(
(list_arg_var, Var(arg_list)), // List.replaceUnsafe list index elem
(len_var, Var(arg_index)), replace_call,
(elem_var, Var(arg_elem)), )),
], field: "list".into(),
ret_var: list_ret_var, }),
},
),
)], )],
final_else: Box::new( final_else: Box::new(
// else-branch // else-branch

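With List.replace in place, the new List.set body above no longer runs its own low-level operation; it calls List.replace and projects the `list` field out of the record that call returns. Continuing the illustrative sketch from List.replace (same caveats: hypothetical names, plain Rust rather than the canonical AST):

// Sketch only: List.set as a projection of the List.replace result,
// mirroring the `Access { field: "list", .. }` node built above.
fn list_set<T>(list: Vec<T>, index: usize, value: T) -> Vec<T> {
    // Discard the displaced element; keep only the updated list.
    list_replace(list, index, value).list
}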
View file

@@ -1,175 +1,482 @@
use crate::expected::{Expected, PExpected}; use crate::expected::{Expected, PExpected};
use roc_collections::all::{MutSet, SendMap}; use roc_collections::soa::{Index, Slice};
use roc_module::{ident::TagName, symbol::Symbol}; use roc_module::ident::TagName;
use roc_module::symbol::Symbol;
use roc_region::all::{Loc, Region}; use roc_region::all::{Loc, Region};
use roc_types::subs::Variable;
use roc_types::types::{Category, PatternCategory, Type}; use roc_types::types::{Category, PatternCategory, Type};
use roc_types::{subs::Variable, types::VariableDetail};
/// A presence constraint is an additive constraint that defines the lower bound #[derive(Debug)]
/// of a type. For example, `Present(t1, IncludesTag(A, []))` means that the pub struct Constraints {
/// type `t1` must contain at least the tag `A`. The additive nature of these pub constraints: Vec<Constraint>,
/// constraints makes them behaviorally different from unification-based constraints. pub types: Vec<Type>,
#[derive(Debug, Clone, PartialEq)] pub variables: Vec<Variable>,
pub enum PresenceConstraint { pub def_types: Vec<(Symbol, Loc<Index<Type>>)>,
IncludesTag(TagName, Vec<Type>, Region, PatternCategory), pub let_constraints: Vec<LetConstraint>,
IsOpen, pub categories: Vec<Category>,
Pattern(Region, PatternCategory, PExpected<Type>), pub pattern_categories: Vec<PatternCategory>,
pub expectations: Vec<Expected<Type>>,
pub pattern_expectations: Vec<PExpected<Type>>,
pub includes_tags: Vec<IncludesTag>,
pub strings: Vec<&'static str>,
} }
impl Default for Constraints {
fn default() -> Self {
Self::new()
}
}
impl Constraints {
pub fn new() -> Self {
let constraints = Vec::new();
let mut types = Vec::new();
let variables = Vec::new();
let def_types = Vec::new();
let let_constraints = Vec::new();
let mut categories = Vec::with_capacity(16);
let mut pattern_categories = Vec::with_capacity(16);
let expectations = Vec::new();
let pattern_expectations = Vec::new();
let includes_tags = Vec::new();
let strings = Vec::new();
types.extend([Type::EmptyRec, Type::EmptyTagUnion]);
categories.extend([
Category::Record,
Category::ForeignCall,
Category::OpaqueArg,
Category::Lambda,
Category::ClosureSize,
Category::StrInterpolation,
Category::If,
Category::When,
Category::Float,
Category::Int,
Category::Num,
Category::List,
Category::Str,
Category::Character,
]);
pattern_categories.extend([
PatternCategory::Record,
PatternCategory::EmptyRecord,
PatternCategory::PatternGuard,
PatternCategory::PatternDefault,
PatternCategory::Set,
PatternCategory::Map,
PatternCategory::Str,
PatternCategory::Num,
PatternCategory::Int,
PatternCategory::Float,
PatternCategory::Character,
]);
Self {
constraints,
types,
variables,
def_types,
let_constraints,
categories,
pattern_categories,
expectations,
pattern_expectations,
includes_tags,
strings,
}
}
pub const EMPTY_RECORD: Index<Type> = Index::new(0);
pub const EMPTY_TAG_UNION: Index<Type> = Index::new(1);
pub const CATEGORY_RECORD: Index<Category> = Index::new(0);
pub const CATEGORY_FOREIGNCALL: Index<Category> = Index::new(1);
pub const CATEGORY_OPAQUEARG: Index<Category> = Index::new(2);
pub const CATEGORY_LAMBDA: Index<Category> = Index::new(3);
pub const CATEGORY_CLOSURESIZE: Index<Category> = Index::new(4);
pub const CATEGORY_STRINTERPOLATION: Index<Category> = Index::new(5);
pub const CATEGORY_IF: Index<Category> = Index::new(6);
pub const CATEGORY_WHEN: Index<Category> = Index::new(7);
pub const CATEGORY_FLOAT: Index<Category> = Index::new(8);
pub const CATEGORY_INT: Index<Category> = Index::new(9);
pub const CATEGORY_NUM: Index<Category> = Index::new(10);
pub const CATEGORY_LIST: Index<Category> = Index::new(11);
pub const CATEGORY_STR: Index<Category> = Index::new(12);
pub const CATEGORY_CHARACTER: Index<Category> = Index::new(13);
pub const PCATEGORY_RECORD: Index<PatternCategory> = Index::new(0);
pub const PCATEGORY_EMPTYRECORD: Index<PatternCategory> = Index::new(1);
pub const PCATEGORY_PATTERNGUARD: Index<PatternCategory> = Index::new(2);
pub const PCATEGORY_PATTERNDEFAULT: Index<PatternCategory> = Index::new(3);
pub const PCATEGORY_SET: Index<PatternCategory> = Index::new(4);
pub const PCATEGORY_MAP: Index<PatternCategory> = Index::new(5);
pub const PCATEGORY_STR: Index<PatternCategory> = Index::new(6);
pub const PCATEGORY_NUM: Index<PatternCategory> = Index::new(7);
pub const PCATEGORY_INT: Index<PatternCategory> = Index::new(8);
pub const PCATEGORY_FLOAT: Index<PatternCategory> = Index::new(9);
pub const PCATEGORY_CHARACTER: Index<PatternCategory> = Index::new(10);
#[inline(always)]
pub fn push_type(&mut self, typ: Type) -> Index<Type> {
match typ {
Type::EmptyRec => Self::EMPTY_RECORD,
Type::EmptyTagUnion => Self::EMPTY_TAG_UNION,
other => Index::push_new(&mut self.types, other),
}
}
#[inline(always)]
pub fn push_expected_type(&mut self, expected: Expected<Type>) -> Index<Expected<Type>> {
Index::push_new(&mut self.expectations, expected)
}
#[inline(always)]
pub fn push_category(&mut self, category: Category) -> Index<Category> {
match category {
Category::Record => Self::CATEGORY_RECORD,
Category::ForeignCall => Self::CATEGORY_FOREIGNCALL,
Category::OpaqueArg => Self::CATEGORY_OPAQUEARG,
Category::Lambda => Self::CATEGORY_LAMBDA,
Category::ClosureSize => Self::CATEGORY_CLOSURESIZE,
Category::StrInterpolation => Self::CATEGORY_STRINTERPOLATION,
Category::If => Self::CATEGORY_IF,
Category::When => Self::CATEGORY_WHEN,
Category::Float => Self::CATEGORY_FLOAT,
Category::Int => Self::CATEGORY_INT,
Category::Num => Self::CATEGORY_NUM,
Category::List => Self::CATEGORY_LIST,
Category::Str => Self::CATEGORY_STR,
Category::Character => Self::CATEGORY_CHARACTER,
other => Index::push_new(&mut self.categories, other),
}
}
#[inline(always)]
pub fn push_pattern_category(&mut self, category: PatternCategory) -> Index<PatternCategory> {
match category {
PatternCategory::Record => Self::PCATEGORY_RECORD,
PatternCategory::EmptyRecord => Self::PCATEGORY_EMPTYRECORD,
PatternCategory::PatternGuard => Self::PCATEGORY_PATTERNGUARD,
PatternCategory::PatternDefault => Self::PCATEGORY_PATTERNDEFAULT,
PatternCategory::Set => Self::PCATEGORY_SET,
PatternCategory::Map => Self::PCATEGORY_MAP,
PatternCategory::Str => Self::PCATEGORY_STR,
PatternCategory::Num => Self::PCATEGORY_NUM,
PatternCategory::Int => Self::PCATEGORY_INT,
PatternCategory::Float => Self::PCATEGORY_FLOAT,
PatternCategory::Character => Self::PCATEGORY_CHARACTER,
other => Index::push_new(&mut self.pattern_categories, other),
}
}
pub fn equal_types(
&mut self,
typ: Type,
expected: Expected<Type>,
category: Category,
region: Region,
) -> Constraint {
let type_index = Index::push_new(&mut self.types, typ);
let expected_index = Index::push_new(&mut self.expectations, expected);
let category_index = Self::push_category(self, category);
Constraint::Eq(type_index, expected_index, category_index, region)
}
pub fn equal_pattern_types(
&mut self,
typ: Type,
expected: PExpected<Type>,
category: PatternCategory,
region: Region,
) -> Constraint {
let type_index = Index::push_new(&mut self.types, typ);
let expected_index = Index::push_new(&mut self.pattern_expectations, expected);
let category_index = Self::push_pattern_category(self, category);
Constraint::Pattern(type_index, expected_index, category_index, region)
}
pub fn pattern_presence(
&mut self,
typ: Type,
expected: PExpected<Type>,
category: PatternCategory,
region: Region,
) -> Constraint {
let type_index = Index::push_new(&mut self.types, typ);
let expected_index = Index::push_new(&mut self.pattern_expectations, expected);
let category_index = Index::push_new(&mut self.pattern_categories, category);
Constraint::PatternPresence(type_index, expected_index, category_index, region)
}
pub fn is_open_type(&mut self, typ: Type) -> Constraint {
let type_index = Index::push_new(&mut self.types, typ);
Constraint::IsOpenType(type_index)
}
pub fn includes_tag<I>(
&mut self,
typ: Type,
tag_name: TagName,
types: I,
category: PatternCategory,
region: Region,
) -> Constraint
where
I: IntoIterator<Item = Type>,
{
let type_index = Index::push_new(&mut self.types, typ);
let category_index = Index::push_new(&mut self.pattern_categories, category);
let types_slice = Slice::extend_new(&mut self.types, types);
let includes_tag = IncludesTag {
type_index,
tag_name,
types: types_slice,
pattern_category: category_index,
region,
};
let includes_tag_index = Index::push_new(&mut self.includes_tags, includes_tag);
Constraint::IncludesTag(includes_tag_index)
}
fn variable_slice<I>(&mut self, it: I) -> Slice<Variable>
where
I: IntoIterator<Item = Variable>,
{
let start = self.variables.len();
self.variables.extend(it);
let length = self.variables.len() - start;
Slice::new(start as _, length as _)
}
fn def_types_slice<I>(&mut self, it: I) -> Slice<(Symbol, Loc<Index<Type>>)>
where
I: IntoIterator<Item = (Symbol, Loc<Type>)>,
{
let start = self.def_types.len();
for (symbol, loc_type) in it {
let Loc { region, value } = loc_type;
let type_index = Index::push_new(&mut self.types, value);
self.def_types.push((symbol, Loc::at(region, type_index)));
}
let length = self.def_types.len() - start;
Slice::new(start as _, length as _)
}
pub fn exists<I>(&mut self, flex_vars: I, defs_constraint: Constraint) -> Constraint
where
I: IntoIterator<Item = Variable>,
{
let defs_and_ret_constraint = Index::new(self.constraints.len() as _);
self.constraints.push(defs_constraint);
self.constraints.push(Constraint::True);
let let_constraint = LetConstraint {
rigid_vars: Slice::default(),
flex_vars: self.variable_slice(flex_vars),
def_types: Slice::default(),
defs_and_ret_constraint,
};
let let_index = Index::new(self.let_constraints.len() as _);
self.let_constraints.push(let_constraint);
Constraint::Let(let_index)
}
pub fn exists_many<I, C>(&mut self, flex_vars: I, defs_constraint: C) -> Constraint
where
I: IntoIterator<Item = Variable>,
C: IntoIterator<Item = Constraint>,
C::IntoIter: ExactSizeIterator,
{
let defs_constraint = self.and_constraint(defs_constraint);
let defs_and_ret_constraint = Index::new(self.constraints.len() as _);
self.constraints.push(defs_constraint);
self.constraints.push(Constraint::True);
let let_constraint = LetConstraint {
rigid_vars: Slice::default(),
flex_vars: self.variable_slice(flex_vars),
def_types: Slice::default(),
defs_and_ret_constraint,
};
let let_index = Index::new(self.let_constraints.len() as _);
self.let_constraints.push(let_constraint);
Constraint::Let(let_index)
}
pub fn let_constraint<I1, I2, I3>(
&mut self,
rigid_vars: I1,
flex_vars: I2,
def_types: I3,
defs_constraint: Constraint,
ret_constraint: Constraint,
) -> Constraint
where
I1: IntoIterator<Item = Variable>,
I2: IntoIterator<Item = Variable>,
I3: IntoIterator<Item = (Symbol, Loc<Type>)>,
{
let defs_and_ret_constraint = Index::new(self.constraints.len() as _);
self.constraints.push(defs_constraint);
self.constraints.push(ret_constraint);
let let_constraint = LetConstraint {
rigid_vars: self.variable_slice(rigid_vars),
flex_vars: self.variable_slice(flex_vars),
def_types: self.def_types_slice(def_types),
defs_and_ret_constraint,
};
let let_index = Index::new(self.let_constraints.len() as _);
self.let_constraints.push(let_constraint);
Constraint::Let(let_index)
}
pub fn and_constraint<I>(&mut self, constraints: I) -> Constraint
where
I: IntoIterator<Item = Constraint>,
I::IntoIter: ExactSizeIterator,
{
let mut it = constraints.into_iter();
match it.len() {
0 => Constraint::True,
1 => it.next().unwrap(),
_ => {
let start = self.constraints.len() as u32;
self.constraints.extend(it);
let end = self.constraints.len() as u32;
let slice = Slice::new(start, (end - start) as u16);
Constraint::And(slice)
}
}
}
pub fn lookup(
&mut self,
symbol: Symbol,
expected: Expected<Type>,
region: Region,
) -> Constraint {
Constraint::Lookup(
symbol,
Index::push_new(&mut self.expectations, expected),
region,
)
}
pub fn contains_save_the_environment(&self, constraint: &Constraint) -> bool {
match constraint {
Constraint::Eq(..) => false,
Constraint::Store(..) => false,
Constraint::Lookup(..) => false,
Constraint::Pattern(..) => false,
Constraint::True => false,
Constraint::SaveTheEnvironment => true,
Constraint::Let(index) => {
let let_constraint = &self.let_constraints[index.index()];
let offset = let_constraint.defs_and_ret_constraint.index();
let defs_constraint = &self.constraints[offset];
let ret_constraint = &self.constraints[offset + 1];
self.contains_save_the_environment(defs_constraint)
|| self.contains_save_the_environment(ret_constraint)
}
Constraint::And(slice) => {
let constraints = &self.constraints[slice.indices()];
constraints
.iter()
.any(|c| self.contains_save_the_environment(c))
}
Constraint::IsOpenType(_) => false,
Constraint::IncludesTag(_) => false,
Constraint::PatternPresence(_, _, _, _) => false,
}
}
pub fn store(
&mut self,
typ: Type,
variable: Variable,
filename: &'static str,
line_number: u32,
) -> Constraint {
let type_index = Index::push_new(&mut self.types, typ);
let string_index = Index::push_new(&mut self.strings, filename);
Constraint::Store(type_index, variable, string_index, line_number)
}
}
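Because Constraints::new() pre-seeds the types, categories, and pattern_categories vectors, the push_* helpers above can return one of the fixed indices for common cases instead of growing the vectors. A hedged usage sketch (assuming the roc_can and roc_types crates from this workspace are in scope; `demo` is an illustrative name):

use roc_can::constraint::Constraints;
use roc_types::types::{Category, Type};

fn demo() {
    let mut constraints = Constraints::new();

    // `Category::Int` is pre-seeded, so no new entry is pushed;
    // the returned index is the shared CATEGORY_INT slot.
    let idx = constraints.push_category(Category::Int);
    assert_eq!(idx, Constraints::CATEGORY_INT);

    // Likewise for the two pre-seeded types.
    let rec = constraints.push_type(Type::EmptyRec);
    assert_eq!(rec, Constraints::EMPTY_RECORD);
}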
static_assertions::assert_eq_size!([u8; 3 * 8], Constraint);
#[derive(Debug, Clone, PartialEq)] #[derive(Debug, Clone, PartialEq)]
pub enum Constraint { pub enum Constraint {
Eq(Type, Expected<Type>, Category, Region), Eq(Index<Type>, Index<Expected<Type>>, Index<Category>, Region),
Store(Type, Variable, &'static str, u32), Store(Index<Type>, Variable, Index<&'static str>, u32),
Lookup(Symbol, Expected<Type>, Region), Lookup(Symbol, Index<Expected<Type>>, Region),
Pattern(Region, PatternCategory, Type, PExpected<Type>), Pattern(
Index<Type>,
Index<PExpected<Type>>,
Index<PatternCategory>,
Region,
),
True, // Used for things that always unify, e.g. blanks and runtime errors True, // Used for things that always unify, e.g. blanks and runtime errors
SaveTheEnvironment, SaveTheEnvironment,
Let(Box<LetConstraint>), Let(Index<LetConstraint>),
And(Vec<Constraint>), And(Slice<Constraint>),
Present(Type, PresenceConstraint), /// Presence constraints
IsOpenType(Index<Type>), // Theory: this is always applied to a variable? If so, use that variable directly.
IncludesTag(Index<IncludesTag>),
PatternPresence(
Index<Type>,
Index<PExpected<Type>>,
Index<PatternCategory>,
Region,
),
} }
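The `static_assertions::assert_eq_size!([u8; 3 * 8], Constraint)` check a few lines up pins the whole enum to 24 bytes, which only works because the variants hold `Index` and `Slice` handles rather than owned `Box`/`Vec` payloads. A self-contained sketch with simplified stand-in types (not the real definitions) shows the same idea with `std::mem::size_of`:

// Simplified stand-ins for illustration; the real definitions live in
// roc_collections::soa and roc_can::constraint.
#[allow(dead_code)]
struct Index<T>(u32, std::marker::PhantomData<T>);

#[allow(dead_code)]
struct Slice<T> {
    start: u32,
    length: u16,
    _marker: std::marker::PhantomData<T>,
}

#[allow(dead_code)]
enum TinyConstraint {
    True,
    Let(Index<()>),             // 4-byte handle instead of an 8-byte Box
    And(Slice<TinyConstraint>), // 8-byte handle instead of a 24-byte Vec
}

fn main() {
    // The real enum is pinned to 3 * 8 = 24 bytes; this simplified one fits easily.
    assert!(std::mem::size_of::<TinyConstraint>() <= 3 * 8);
}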
#[derive(Debug, Clone, PartialEq)] #[derive(Debug, Clone, PartialEq)]
pub struct LetConstraint { pub struct LetConstraint {
pub rigid_vars: Vec<Variable>, pub rigid_vars: Slice<Variable>,
pub flex_vars: Vec<Variable>, pub flex_vars: Slice<Variable>,
pub def_types: SendMap<Symbol, Loc<Type>>, pub def_types: Slice<(Symbol, Loc<Index<Type>>)>,
pub defs_constraint: Constraint, pub defs_and_ret_constraint: Index<(Constraint, Constraint)>,
pub ret_constraint: Constraint,
} }
// VALIDATE #[derive(Debug, Clone, PartialEq)]
pub struct IncludesTag {
#[derive(Default, Clone)] pub type_index: Index<Type>,
struct Declared { pub tag_name: TagName,
pub rigid_vars: MutSet<Variable>, pub types: Slice<Type>,
pub flex_vars: MutSet<Variable>, pub pattern_category: Index<PatternCategory>,
} pub region: Region,
impl Constraint {
pub fn validate(&self) -> bool {
let mut unbound = Default::default();
validate_help(self, &Declared::default(), &mut unbound);
if !unbound.type_variables.is_empty() {
panic!("found unbound type variables {:?}", &unbound.type_variables);
}
if !unbound.lambda_set_variables.is_empty() {
panic!(
"found unbound lambda set variables {:?}",
&unbound.lambda_set_variables
);
}
if !unbound.recursion_variables.is_empty() {
panic!(
"found unbound recursion variables {:?}",
&unbound.recursion_variables
);
}
true
}
pub fn contains_save_the_environment(&self) -> bool {
match self {
Constraint::Eq(_, _, _, _) => false,
Constraint::Store(_, _, _, _) => false,
Constraint::Lookup(_, _, _) => false,
Constraint::Pattern(_, _, _, _) => false,
Constraint::True => false,
Constraint::SaveTheEnvironment => true,
Constraint::Let(boxed) => {
boxed.ret_constraint.contains_save_the_environment()
|| boxed.defs_constraint.contains_save_the_environment()
}
Constraint::And(cs) => cs.iter().any(|c| c.contains_save_the_environment()),
Constraint::Present(_, _) => false,
}
}
}
fn subtract(declared: &Declared, detail: &VariableDetail, accum: &mut VariableDetail) {
for var in &detail.type_variables {
if !(declared.rigid_vars.contains(var) || declared.flex_vars.contains(var)) {
accum.type_variables.insert(*var);
}
}
// lambda set variables are always flex
for var in &detail.lambda_set_variables {
if declared.rigid_vars.contains(var) {
panic!("lambda set variable {:?} is declared as rigid", var);
}
if !declared.flex_vars.contains(var) {
accum.lambda_set_variables.push(*var);
}
}
// recursion vars should be always rigid
for var in &detail.recursion_variables {
if declared.flex_vars.contains(var) {
panic!("recursion variable {:?} is declared as flex", var);
}
if !declared.rigid_vars.contains(var) {
accum.recursion_variables.insert(*var);
}
}
}
fn validate_help(constraint: &Constraint, declared: &Declared, accum: &mut VariableDetail) {
use Constraint::*;
match constraint {
True | SaveTheEnvironment | Lookup(_, _, _) => { /* nothing */ }
Store(typ, var, _, _) => {
subtract(declared, &typ.variables_detail(), accum);
if !declared.flex_vars.contains(var) {
accum.type_variables.insert(*var);
}
}
Constraint::Eq(typ, expected, _, _) => {
subtract(declared, &typ.variables_detail(), accum);
subtract(declared, &expected.get_type_ref().variables_detail(), accum);
}
Constraint::Pattern(_, _, typ, expected) => {
subtract(declared, &typ.variables_detail(), accum);
subtract(declared, &expected.get_type_ref().variables_detail(), accum);
}
Constraint::Let(letcon) => {
let mut declared = declared.clone();
declared
.rigid_vars
.extend(letcon.rigid_vars.iter().copied());
declared.flex_vars.extend(letcon.flex_vars.iter().copied());
validate_help(&letcon.defs_constraint, &declared, accum);
validate_help(&letcon.ret_constraint, &declared, accum);
}
Constraint::And(inner) => {
for c in inner {
validate_help(c, declared, accum);
}
}
Constraint::Present(typ, constr) => {
subtract(declared, &typ.variables_detail(), accum);
match constr {
PresenceConstraint::IncludesTag(_, tys, _, _) => {
for ty in tys {
subtract(declared, &ty.variables_detail(), accum);
}
}
PresenceConstraint::IsOpen => {}
PresenceConstraint::Pattern(_, _, expected) => {
subtract(declared, &typ.variables_detail(), accum);
subtract(declared, &expected.get_type_ref().variables_detail(), accum);
}
}
}
}
} }

View file

@@ -399,7 +399,7 @@ pub fn sort_can_defs(
) -> (Result<Vec<Declaration>, RuntimeError>, Output) { ) -> (Result<Vec<Declaration>, RuntimeError>, Output) {
let CanDefs { let CanDefs {
refs_by_symbol, refs_by_symbol,
can_defs_by_symbol, mut can_defs_by_symbol,
aliases, aliases,
} = defs; } = defs;
@@ -583,7 +583,7 @@ pub fn sort_can_defs(
&group, &group,
&env.closures, &env.closures,
&mut all_successors_with_self, &mut all_successors_with_self,
&can_defs_by_symbol, &mut can_defs_by_symbol,
&mut declarations, &mut declarations,
); );
} }
@@ -717,7 +717,7 @@ pub fn sort_can_defs(
group, group,
&env.closures, &env.closures,
&mut all_successors_with_self, &mut all_successors_with_self,
&can_defs_by_symbol, &mut can_defs_by_symbol,
&mut declarations, &mut declarations,
); );
} }
@@ -739,7 +739,7 @@ fn group_to_declaration(
group: &[Symbol], group: &[Symbol],
closures: &MutMap<Symbol, References>, closures: &MutMap<Symbol, References>,
successors: &mut dyn FnMut(&Symbol) -> ImSet<Symbol>, successors: &mut dyn FnMut(&Symbol) -> ImSet<Symbol>,
can_defs_by_symbol: &MutMap<Symbol, Def>, can_defs_by_symbol: &mut MutMap<Symbol, Def>,
declarations: &mut Vec<Declaration>, declarations: &mut Vec<Declaration>,
) { ) {
use Declaration::*; use Declaration::*;
@@ -759,15 +759,14 @@ fn group_to_declaration(
// Can bind multiple symbols. When not incorrectly recursive (which is guaranteed in this function), // Can bind multiple symbols. When not incorrectly recursive (which is guaranteed in this function),
// normally `someDef` would be inserted twice. We use the region of the pattern as a unique key // normally `someDef` would be inserted twice. We use the region of the pattern as a unique key
// for a definition, so every definition is only inserted (thus typechecked and emitted) once // for a definition, so every definition is only inserted (thus typechecked and emitted) once
let mut seen_pattern_regions: ImSet<Region> = ImSet::default(); let mut seen_pattern_regions: Vec<Region> = Vec::with_capacity(2);
for cycle in strongly_connected_components(group, filtered_successors) { for cycle in strongly_connected_components(group, filtered_successors) {
if cycle.len() == 1 { if cycle.len() == 1 {
let symbol = &cycle[0]; let symbol = &cycle[0];
if let Some(can_def) = can_defs_by_symbol.get(symbol) { match can_defs_by_symbol.remove(symbol) {
let mut new_def = can_def.clone(); Some(mut new_def) => {
// Determine recursivity of closures that are not tail-recursive // Determine recursivity of closures that are not tail-recursive
if let Closure(ClosureData { if let Closure(ClosureData {
recursive: recursive @ Recursive::NotRecursive, recursive: recursive @ Recursive::NotRecursive,
@@ -780,22 +779,24 @@ fn group_to_declaration(
let is_recursive = successors(symbol).contains(symbol); let is_recursive = successors(symbol).contains(symbol);
if !seen_pattern_regions.contains(&new_def.loc_pattern.region) { if !seen_pattern_regions.contains(&new_def.loc_pattern.region) {
seen_pattern_regions.push(new_def.loc_pattern.region);
if is_recursive { if is_recursive {
declarations.push(DeclareRec(vec![new_def.clone()])); declarations.push(DeclareRec(vec![new_def]));
} else { } else {
declarations.push(Declare(new_def.clone())); declarations.push(Declare(new_def));
} }
seen_pattern_regions.insert(new_def.loc_pattern.region);
} }
} }
None => roc_error_macros::internal_error!("def not available {:?}", symbol),
}
} else { } else {
let mut can_defs = Vec::new(); let mut can_defs = Vec::new();
// Topological sort gives us the reverse of the sorting we want! // Topological sort gives us the reverse of the sorting we want!
for symbol in cycle.into_iter().rev() { for symbol in cycle.into_iter().rev() {
if let Some(can_def) = can_defs_by_symbol.get(&symbol) { match can_defs_by_symbol.remove(&symbol) {
let mut new_def = can_def.clone(); Some(mut new_def) => {
// Determine recursivity of closures that are not tail-recursive // Determine recursivity of closures that are not tail-recursive
if let Closure(ClosureData { if let Closure(ClosureData {
recursive: recursive @ Recursive::NotRecursive, recursive: recursive @ Recursive::NotRecursive,
@@ -806,10 +807,12 @@ fn group_to_declaration(
} }
if !seen_pattern_regions.contains(&new_def.loc_pattern.region) { if !seen_pattern_regions.contains(&new_def.loc_pattern.region) {
can_defs.push(new_def.clone()); seen_pattern_regions.push(new_def.loc_pattern.region);
}
seen_pattern_regions.insert(new_def.loc_pattern.region); can_defs.push(new_def);
}
}
None => roc_error_macros::internal_error!("def not available {:?}", symbol),
} }
} }
@@ -861,6 +864,32 @@ fn pattern_to_vars_by_symbol(
} }
} }
fn single_can_def(
loc_can_pattern: Loc<Pattern>,
loc_can_expr: Loc<Expr>,
expr_var: Variable,
opt_loc_annotation: Option<Loc<crate::annotation::Annotation>>,
pattern_vars: SendMap<Symbol, Variable>,
) -> Def {
let def_annotation = opt_loc_annotation.map(|loc_annotation| Annotation {
signature: loc_annotation.value.typ,
introduced_variables: loc_annotation.value.introduced_variables,
aliases: loc_annotation.value.aliases,
region: loc_annotation.region,
});
Def {
expr_var,
loc_pattern: loc_can_pattern,
loc_expr: Loc {
region: loc_can_expr.region,
value: loc_can_expr.value,
},
pattern_vars,
annotation: def_annotation,
}
}
// TODO trim down these arguments! // TODO trim down these arguments!
#[allow(clippy::too_many_arguments)] #[allow(clippy::too_many_arguments)]
#[allow(clippy::cognitive_complexity)] #[allow(clippy::cognitive_complexity)]
@@ -884,25 +913,25 @@ fn canonicalize_pending_def<'a>(
AnnotationOnly(_, loc_can_pattern, loc_ann) => { AnnotationOnly(_, loc_can_pattern, loc_ann) => {
// annotation sans body cannot introduce new rigids that are visible in other annotations // annotation sans body cannot introduce new rigids that are visible in other annotations
// but the rigids can show up in type error messages, so still register them // but the rigids can show up in type error messages, so still register them
let ann = let type_annotation =
canonicalize_annotation(env, scope, &loc_ann.value, loc_ann.region, var_store); canonicalize_annotation(env, scope, &loc_ann.value, loc_ann.region, var_store);
// Record all the annotation's references in output.references.lookups // Record all the annotation's references in output.references.lookups
for symbol in ann.references { for symbol in type_annotation.references.iter() {
output.references.lookups.insert(symbol); output.references.lookups.insert(*symbol);
output.references.referenced_type_defs.insert(symbol); output.references.referenced_type_defs.insert(*symbol);
} }
aliases.extend(ann.aliases.clone()); aliases.extend(type_annotation.aliases.clone());
output.introduced_variables.union(&ann.introduced_variables); output
.introduced_variables
.union(&type_annotation.introduced_variables);
pattern_to_vars_by_symbol(&mut vars_by_symbol, &loc_can_pattern.value, expr_var); pattern_to_vars_by_symbol(&mut vars_by_symbol, &loc_can_pattern.value, expr_var);
let typ = ann.typ; let arity = type_annotation.typ.arity();
let arity = typ.arity();
let problem = match &loc_can_pattern.value { let problem = match &loc_can_pattern.value {
Pattern::Identifier(symbol) => RuntimeError::NoImplementationNamed { Pattern::Identifier(symbol) => RuntimeError::NoImplementationNamed {
@@ -960,6 +989,16 @@ fn canonicalize_pending_def<'a>(
} }
}; };
if let Pattern::Identifier(symbol) = loc_can_pattern.value {
let def = single_can_def(
loc_can_pattern,
loc_can_expr,
expr_var,
Some(Loc::at(loc_ann.region, type_annotation)),
vars_by_symbol.clone(),
);
can_defs_by_symbol.insert(symbol, def);
} else {
for (_, (symbol, _)) in scope.idents() { for (_, (symbol, _)) in scope.idents() {
if !vars_by_symbol.contains_key(symbol) { if !vars_by_symbol.contains_key(symbol) {
continue; continue;
@@ -980,38 +1019,39 @@ fn canonicalize_pending_def<'a>(
}, },
pattern_vars: vars_by_symbol.clone(), pattern_vars: vars_by_symbol.clone(),
annotation: Some(Annotation { annotation: Some(Annotation {
signature: typ.clone(), signature: type_annotation.typ.clone(),
introduced_variables: output.introduced_variables.clone(), introduced_variables: output.introduced_variables.clone(),
aliases: ann.aliases.clone(), aliases: type_annotation.aliases.clone(),
region: loc_ann.region, region: loc_ann.region,
}), }),
}, },
); );
} }
} }
}
Alias { .. } => unreachable!("Aliases are handled in a separate pass"), Alias { .. } => unreachable!("Aliases are handled in a separate pass"),
InvalidAlias { .. } => { InvalidAlias { .. } => {
// invalid aliases and opaques (shadowed, incorrect patterns) get ignored // invalid aliases and opaques (shadowed, incorrect patterns) get ignored
} }
TypedBody(loc_pattern, loc_can_pattern, loc_ann, loc_expr) => { TypedBody(_loc_pattern, loc_can_pattern, loc_ann, loc_expr) => {
let ann = let type_annotation =
canonicalize_annotation(env, scope, &loc_ann.value, loc_ann.region, var_store); canonicalize_annotation(env, scope, &loc_ann.value, loc_ann.region, var_store);
// Record all the annotation's references in output.references.lookups // Record all the annotation's references in output.references.lookups
for symbol in ann.references { for symbol in type_annotation.references.iter() {
output.references.lookups.insert(symbol); output.references.lookups.insert(*symbol);
output.references.referenced_type_defs.insert(symbol); output.references.referenced_type_defs.insert(*symbol);
} }
let typ = ann.typ; for (symbol, alias) in type_annotation.aliases.clone() {
for (symbol, alias) in ann.aliases.clone() {
aliases.insert(symbol, alias); aliases.insert(symbol, alias);
} }
output.introduced_variables.union(&ann.introduced_variables); output
.introduced_variables
.union(&type_annotation.introduced_variables);
// bookkeeping for tail-call detection. If we're assigning to an // bookkeeping for tail-call detection. If we're assigning to an
// identifier (e.g. `f = \x -> ...`), then this symbol can be tail-called. // identifier (e.g. `f = \x -> ...`), then this symbol can be tail-called.
@@ -1038,42 +1078,31 @@ fn canonicalize_pending_def<'a>(
// reset the tailcallable_symbol // reset the tailcallable_symbol
env.tailcallable_symbol = outer_identifier; env.tailcallable_symbol = outer_identifier;
// see below: a closure needs a fresh References!
let mut is_closure = false;
// First, make sure we are actually assigning an identifier instead of (for example) a tag. // First, make sure we are actually assigning an identifier instead of (for example) a tag.
// //
// If we're assigning (UserId userId) = ... then this is certainly not a closure declaration, // If we're assigning (UserId userId) = ... then this is certainly not a closure declaration,
// which also implies it's not a self tail call! // which also implies it's not a self tail call!
// //
// Only defs of the form (foo = ...) can be closure declarations or self tail calls. // Only defs of the form (foo = ...) can be closure declarations or self tail calls.
if let ( if let Pattern::Identifier(symbol) = loc_can_pattern.value {
&ast::Pattern::Identifier(_name), if let &Closure(ClosureData {
&Pattern::Identifier(ref defined_symbol),
&Closure(ClosureData {
function_type, function_type,
closure_type, closure_type,
closure_ext_var, closure_ext_var,
return_type, return_type,
name: ref symbol, name: ref closure_name,
ref arguments, ref arguments,
loc_body: ref body, loc_body: ref body,
ref captured_symbols, ref captured_symbols,
.. ..
}), }) = &loc_can_expr.value
) = ( {
&loc_pattern.value,
&loc_can_pattern.value,
&loc_can_expr.value,
) {
is_closure = true;
// Since everywhere in the code it'll be referred to by its defined name, // Since everywhere in the code it'll be referred to by its defined name,
// remove its generated name from the closure map. (We'll re-insert it later.) // remove its generated name from the closure map. (We'll re-insert it later.)
let references = env.closures.remove(symbol).unwrap_or_else(|| { let references = env.closures.remove(closure_name).unwrap_or_else(|| {
panic!( panic!(
"Tried to remove symbol {:?} from procedures, but it was not found: {:?}", "Tried to remove symbol {:?} from procedures, but it was not found: {:?}",
symbol, env.closures closure_name, env.closures
) )
}); });
@@ -1081,52 +1110,59 @@ fn canonicalize_pending_def<'a>(
// closures don't have a name, and therefore pick a fresh symbol. But in this // closures don't have a name, and therefore pick a fresh symbol. But in this
// case, the closure has a proper name (e.g. `foo` in `foo = \x y -> ...` // case, the closure has a proper name (e.g. `foo` in `foo = \x y -> ...`
// and we want to reference it by that name. // and we want to reference it by that name.
env.closures.insert(*defined_symbol, references); env.closures.insert(symbol, references);
// The closure is self tail recursive iff it tail calls itself (by defined name). // The closure is self tail recursive iff it tail calls itself (by defined name).
let is_recursive = match can_output.tail_call { let is_recursive = match can_output.tail_call {
Some(ref symbol) if symbol == defined_symbol => Recursive::TailRecursive, Some(tail_symbol) if tail_symbol == symbol => Recursive::TailRecursive,
_ => Recursive::NotRecursive, _ => Recursive::NotRecursive,
}; };
// Recursion doesn't count as referencing. (If it did, all recursive functions // Recursion doesn't count as referencing. (If it did, all recursive functions
// would result in circular def errors!) // would result in circular def errors!)
refs_by_symbol refs_by_symbol.entry(symbol).and_modify(|(_, refs)| {
.entry(*defined_symbol) refs.lookups = refs.lookups.without(&symbol);
.and_modify(|(_, refs)| {
refs.lookups = refs.lookups.without(defined_symbol);
}); });
// renamed_closure_def = Some(&defined_symbol); // renamed_closure_def = Some(&symbol);
loc_can_expr.value = Closure(ClosureData { loc_can_expr.value = Closure(ClosureData {
function_type, function_type,
closure_type, closure_type,
closure_ext_var, closure_ext_var,
return_type, return_type,
name: *defined_symbol, name: symbol,
captured_symbols: captured_symbols.clone(), captured_symbols: captured_symbols.clone(),
recursive: is_recursive, recursive: is_recursive,
arguments: arguments.clone(), arguments: arguments.clone(),
loc_body: body.clone(), loc_body: body.clone(),
}); });
// Functions' references don't count in defs.
// See 3d5a2560057d7f25813112dfa5309956c0f9e6a9 and its
// parent commit for the bug this fixed!
let refs = References::new();
refs_by_symbol.insert(symbol, (loc_can_pattern.region, refs));
} else {
let refs = can_output.references;
refs_by_symbol.insert(symbol, (loc_ann.region, refs));
} }
// Store the referenced locals in the refs_by_symbol map, so we can later figure out let def = single_can_def(
// which defined names reference each other. loc_can_pattern,
loc_can_expr,
expr_var,
Some(Loc::at(loc_ann.region, type_annotation)),
vars_by_symbol.clone(),
);
can_defs_by_symbol.insert(symbol, def);
} else {
for (_, (symbol, region)) in scope.idents() { for (_, (symbol, region)) in scope.idents() {
if !vars_by_symbol.contains_key(symbol) { if !vars_by_symbol.contains_key(symbol) {
continue; continue;
} }
let refs = let refs = can_output.references.clone();
// Functions' references don't count in defs.
// See 3d5a2560057d7f25813112dfa5309956c0f9e6a9 and its
// parent commit for the bug this fixed!
if is_closure {
References::new()
} else {
can_output.references.clone()
};
refs_by_symbol.insert(*symbol, (*region, refs)); refs_by_symbol.insert(*symbol, (*region, refs));
@@ -1143,15 +1179,16 @@ fn canonicalize_pending_def<'a>(
}, },
pattern_vars: vars_by_symbol.clone(), pattern_vars: vars_by_symbol.clone(),
annotation: Some(Annotation { annotation: Some(Annotation {
signature: typ.clone(), signature: type_annotation.typ.clone(),
introduced_variables: output.introduced_variables.clone(), introduced_variables: type_annotation.introduced_variables.clone(),
aliases: ann.aliases.clone(), aliases: type_annotation.aliases.clone(),
region: loc_ann.region, region: loc_ann.region,
}), }),
}, },
); );
} }
} }
}
// If we have a pattern, then the def has a body (that is, it's not a // If we have a pattern, then the def has a body (that is, it's not a
// standalone annotation), so we need to canonicalize the pattern and expr. // standalone annotation), so we need to canonicalize the pattern and expr.
Body(loc_pattern, loc_can_pattern, loc_expr) => { Body(loc_pattern, loc_can_pattern, loc_expr) => {
@@ -1181,42 +1218,31 @@ fn canonicalize_pending_def<'a>(
// reset the tailcallable_symbol // reset the tailcallable_symbol
env.tailcallable_symbol = outer_identifier; env.tailcallable_symbol = outer_identifier;
// see below: a closure needs a fresh References!
let mut is_closure = false;
// First, make sure we are actually assigning an identifier instead of (for example) a tag. // First, make sure we are actually assigning an identifier instead of (for example) a tag.
// //
// If we're assigning (UserId userId) = ... then this is certainly not a closure declaration, // If we're assigning (UserId userId) = ... then this is certainly not a closure declaration,
// which also implies it's not a self tail call! // which also implies it's not a self tail call!
// //
// Only defs of the form (foo = ...) can be closure declarations or self tail calls. // Only defs of the form (foo = ...) can be closure declarations or self tail calls.
if let ( if let Pattern::Identifier(symbol) = loc_can_pattern.value {
&ast::Pattern::Identifier(_name), if let &Closure(ClosureData {
&Pattern::Identifier(ref defined_symbol),
&Closure(ClosureData {
function_type, function_type,
closure_type, closure_type,
closure_ext_var, closure_ext_var,
return_type, return_type,
name: ref symbol, name: ref closure_name,
ref arguments, ref arguments,
loc_body: ref body, loc_body: ref body,
ref captured_symbols, ref captured_symbols,
.. ..
}), }) = &loc_can_expr.value
) = ( {
&loc_pattern.value,
&loc_can_pattern.value,
&loc_can_expr.value,
) {
is_closure = true;
// Since everywhere in the code it'll be referred to by its defined name, // Since everywhere in the code it'll be referred to by its defined name,
// remove its generated name from the closure map. (We'll re-insert it later.) // remove its generated name from the closure map. (We'll re-insert it later.)
let references = env.closures.remove(symbol).unwrap_or_else(|| { let references = env.closures.remove(closure_name).unwrap_or_else(|| {
panic!( panic!(
"Tried to remove symbol {:?} from procedures, but it was not found: {:?}", "Tried to remove symbol {:?} from procedures, but it was not found: {:?}",
symbol, env.closures closure_name, env.closures
) )
}); });
@@ -1224,20 +1250,18 @@ fn canonicalize_pending_def<'a>(
// closures don't have a name, and therefore pick a fresh symbol. But in this // closures don't have a name, and therefore pick a fresh symbol. But in this
// case, the closure has a proper name (e.g. `foo` in `foo = \x y -> ...` // case, the closure has a proper name (e.g. `foo` in `foo = \x y -> ...`
// and we want to reference it by that name. // and we want to reference it by that name.
env.closures.insert(*defined_symbol, references); env.closures.insert(symbol, references);
// The closure is self tail recursive iff it tail calls itself (by defined name). // The closure is self tail recursive iff it tail calls itself (by defined name).
let is_recursive = match can_output.tail_call { let is_recursive = match can_output.tail_call {
Some(ref symbol) if symbol == defined_symbol => Recursive::TailRecursive, Some(tail_symbol) if tail_symbol == symbol => Recursive::TailRecursive,
_ => Recursive::NotRecursive, _ => Recursive::NotRecursive,
}; };
// Recursion doesn't count as referencing. (If it did, all recursive functions // Recursion doesn't count as referencing. (If it did, all recursive functions
// would result in circular def errors!) // would result in circular def errors!)
refs_by_symbol refs_by_symbol.entry(symbol).and_modify(|(_, refs)| {
.entry(*defined_symbol) refs.lookups = refs.lookups.without(&symbol);
.and_modify(|(_, refs)| {
refs.lookups = refs.lookups.without(defined_symbol);
}); });
loc_can_expr.value = Closure(ClosureData { loc_can_expr.value = Closure(ClosureData {
@@ -1245,27 +1269,36 @@ fn canonicalize_pending_def<'a>(
closure_type, closure_type,
closure_ext_var, closure_ext_var,
return_type, return_type,
name: *defined_symbol, name: symbol,
captured_symbols: captured_symbols.clone(), captured_symbols: captured_symbols.clone(),
recursive: is_recursive, recursive: is_recursive,
arguments: arguments.clone(), arguments: arguments.clone(),
loc_body: body.clone(), loc_body: body.clone(),
}); });
}
// Store the referenced locals in the refs_by_symbol map, so we can later figure out
// which defined names reference each other.
for (symbol, region) in bindings_from_patterns(std::iter::once(&loc_can_pattern)) {
let refs =
// Functions' references don't count in defs. // Functions' references don't count in defs.
// See 3d5a2560057d7f25813112dfa5309956c0f9e6a9 and its // See 3d5a2560057d7f25813112dfa5309956c0f9e6a9 and its
// parent commit for the bug this fixed! // parent commit for the bug this fixed!
if is_closure { let refs = References::new();
References::new() refs_by_symbol.insert(symbol, (loc_pattern.region, refs));
} else { } else {
can_output.references.clone() let refs = can_output.references.clone();
}; refs_by_symbol.insert(symbol, (loc_pattern.region, refs));
}
let def = single_can_def(
loc_can_pattern,
loc_can_expr,
expr_var,
None,
vars_by_symbol.clone(),
);
can_defs_by_symbol.insert(symbol, def);
} else {
// Store the referenced locals in the refs_by_symbol map, so we can later figure out
// which defined names reference each other.
for (symbol, region) in bindings_from_patterns(std::iter::once(&loc_can_pattern)) {
let refs = can_output.references.clone();
refs_by_symbol.insert(symbol, (region, refs)); refs_by_symbol.insert(symbol, (region, refs));
can_defs_by_symbol.insert( can_defs_by_symbol.insert(
@@ -1284,6 +1317,7 @@ fn canonicalize_pending_def<'a>(
}, },
); );
} }
}
output.union(can_output); output.union(can_output);
} }

View file

@@ -161,13 +161,13 @@ where
} }
#[derive(Clone, Copy, PartialEq, Eq, Debug)] #[derive(Clone, Copy, PartialEq, Eq, Debug)]
pub struct Index(usize); pub struct HumanIndex(usize);
impl Index { impl HumanIndex {
pub const FIRST: Self = Index(0); pub const FIRST: Self = HumanIndex(0);
pub fn zero_based(i: usize) -> Self { pub fn zero_based(i: usize) -> Self {
Index(i) HumanIndex(i)
} }
pub fn to_zero_based(self) -> usize { pub fn to_zero_based(self) -> usize {
@@ -175,7 +175,7 @@ impl Index {
} }
pub fn one_based(i: usize) -> Self { pub fn one_based(i: usize) -> Self {
Index(i - 1) HumanIndex(i - 1)
} }
pub fn ordinal(self) -> std::string::String { pub fn ordinal(self) -> std::string::String {

View file

@@ -3,3 +3,4 @@
#![allow(clippy::large_enum_variant)] #![allow(clippy::large_enum_variant)]
pub mod all; pub mod all;
pub mod soa;

View file

@@ -0,0 +1,119 @@
use std::usize;
#[derive(PartialEq, Eq)]
pub struct Index<T> {
index: u32,
_marker: std::marker::PhantomData<T>,
}
impl<T> Clone for Index<T> {
fn clone(&self) -> Self {
Self {
index: self.index,
_marker: self._marker,
}
}
}
impl<T> Copy for Index<T> {}
impl<T> std::fmt::Debug for Index<T> {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
write!(f, "Index({})", self.index)
}
}
impl<T> Index<T> {
pub const fn new(index: u32) -> Self {
Self {
index,
_marker: std::marker::PhantomData,
}
}
pub const fn index(&self) -> usize {
self.index as usize
}
pub fn push_new(vector: &mut Vec<T>, value: T) -> Index<T> {
let index = Self::new(vector.len() as _);
vector.push(value);
index
}
}
#[derive(PartialEq, Eq)]
pub struct Slice<T> {
start: u32,
length: u16,
_marker: std::marker::PhantomData<T>,
}
impl<T> Clone for Slice<T> {
fn clone(&self) -> Self {
Self {
start: self.start,
length: self.length,
_marker: self._marker,
}
}
}
impl<T> Copy for Slice<T> {}
impl<T> std::fmt::Debug for Slice<T> {
fn fmt(&self, f: &mut std::fmt::Formatter<'_>) -> std::fmt::Result {
write!(f, "Slice(start = {}, length = {})", self.start, self.length)
}
}
impl<T> Default for Slice<T> {
fn default() -> Self {
Self {
start: Default::default(),
length: Default::default(),
_marker: Default::default(),
}
}
}
impl<T> Slice<T> {
pub const fn new(start: u32, length: u16) -> Self {
Self {
start,
length,
_marker: std::marker::PhantomData,
}
}
pub fn extend_new<I>(vector: &mut Vec<T>, values: I) -> Slice<T>
where
I: IntoIterator<Item = T>,
{
let start = vector.len() as u32;
vector.extend(values);
let end = vector.len() as u32;
Self::new(start, (end - start) as u16)
}
pub const fn len(&self) -> usize {
self.length as _
}
pub const fn is_empty(&self) -> bool {
self.length == 0
}
pub const fn indices(&self) -> std::ops::Range<usize> {
self.start as usize..(self.start as usize + self.length as usize)
}
pub fn into_iter(&self) -> impl Iterator<Item = Index<T>> {
self.indices().map(|i| Index::new(i as _))
}
}
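A short, self-contained usage sketch of these typed handles (the `demo` function and the sample strings are made up for illustration): values live in an ordinary `Vec`, while `Index` and `Slice` are small, type-tagged offsets back into it.

use roc_collections::soa::{Index, Slice};

fn demo() {
    let mut names: Vec<&str> = Vec::new();

    // Push one value and get a typed handle to it back.
    let first: Index<&str> = Index::push_new(&mut names, "alpha");
    assert_eq!(names[first.index()], "alpha");

    // Extend with a run of values and get a (start, length) handle to that run.
    let rest: Slice<&str> = Slice::extend_new(&mut names, ["beta", "gamma"]);
    assert_eq!(rest.len(), 2);
    assert_eq!(&names[rest.indices()], &["beta", "gamma"]);
}

This is the pattern Constraints relies on throughout: the vectors own the data once, and everything else passes 4- or 8-byte handles around.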

View file

@@ -14,3 +14,4 @@ roc_parse = { path = "../parse" }
roc_types = { path = "../types" } roc_types = { path = "../types" }
roc_can = { path = "../can" } roc_can = { path = "../can" }
roc_builtins = { path = "../builtins" } roc_builtins = { path = "../builtins" }
arrayvec = "0.7.2"

View file

@@ -1,8 +1,7 @@
use roc_can::constraint::Constraint::{self, *}; use arrayvec::ArrayVec;
use roc_can::constraint::LetConstraint; use roc_can::constraint::{Constraint, Constraints};
use roc_can::expected::Expected::{self, *}; use roc_can::expected::Expected::{self, *};
use roc_can::num::{FloatBound, FloatWidth, IntBound, IntWidth, NumericBound, SignDemand}; use roc_can::num::{FloatBound, FloatWidth, IntBound, IntWidth, NumericBound, SignDemand};
use roc_collections::all::SendMap;
use roc_module::ident::{Lowercase, TagName}; use roc_module::ident::{Lowercase, TagName};
use roc_module::symbol::Symbol; use roc_module::symbol::Symbol;
use roc_region::all::Region; use roc_region::all::Region;
@@ -12,8 +11,10 @@ use roc_types::types::Type::{self, *};
use roc_types::types::{AliasKind, Category}; use roc_types::types::{AliasKind, Category};
#[must_use] #[must_use]
#[inline(always)]
pub fn add_numeric_bound_constr( pub fn add_numeric_bound_constr(
constrs: &mut Vec<Constraint>, constraints: &mut Constraints,
num_constraints: &mut impl Extend<Constraint>,
num_type: Type, num_type: Type,
bound: impl TypedNumericBound, bound: impl TypedNumericBound,
region: Region, region: Region,
@@ -27,12 +28,12 @@ pub fn add_numeric_bound_constr(
0 => total_num_type, 0 => total_num_type,
1 => { 1 => {
let actual_type = Variable(range[0]); let actual_type = Variable(range[0]);
constrs.push(Eq( let expected = Expected::ForReason(Reason::NumericLiteralSuffix, actual_type, region);
total_num_type.clone(), let because_suffix =
Expected::ForReason(Reason::NumericLiteralSuffix, actual_type, region), constraints.equal_types(total_num_type.clone(), expected, category, region);
category,
region, num_constraints.extend([because_suffix]);
));
total_num_type total_num_type
} }
_ => RangedNumber(Box::new(total_num_type), range), _ => RangedNumber(Box::new(total_num_type), range),
@@ -41,6 +42,7 @@
#[inline(always)] #[inline(always)]
pub fn int_literal( pub fn int_literal(
constraints: &mut Constraints,
num_var: Variable, num_var: Variable,
precision_var: Variable, precision_var: Variable,
expected: Expected<Type>, expected: Expected<Type>,
@@ -49,31 +51,35 @@ ) -> Constraint {
) -> Constraint { ) -> Constraint {
let reason = Reason::IntLiteral; let reason = Reason::IntLiteral;
let mut constrs = Vec::with_capacity(3); // Always add the bound first; this improves the resolved type quality in case it's an alias like "U8".
// Always add the bound first; this improves the resolved type quality in case it's an alias let mut constrs = ArrayVec::<_, 3>::new();
// like "U8".
let num_type = add_numeric_bound_constr( let num_type = add_numeric_bound_constr(
constraints,
&mut constrs, &mut constrs,
Variable(num_var), Variable(num_var),
bound, bound,
region, region,
Category::Num, Category::Num,
); );
constrs.extend(vec![
Eq( constrs.extend([
constraints.equal_types(
num_type.clone(), num_type.clone(),
ForReason(reason, num_int(Type::Variable(precision_var)), region), ForReason(reason, num_int(Type::Variable(precision_var)), region),
Category::Int, Category::Int,
region, region,
), ),
Eq(num_type, expected, Category::Int, region), constraints.equal_types(num_type, expected, Category::Int, region),
]); ]);
exists(vec![num_var], And(constrs)) // TODO the precision_var is not part of the exists here; for float it is. Which is correct?
let and_constraint = constraints.and_constraint(constrs);
constraints.exists([num_var], and_constraint)
} }
#[inline(always)] #[inline(always)]
pub fn float_literal( pub fn float_literal(
constraints: &mut Constraints,
num_var: Variable, num_var: Variable,
precision_var: Variable, precision_var: Variable,
expected: Expected<Type>, expected: Expected<Type>,
@@ -82,29 +88,33 @@ ) -> Constraint {
) -> Constraint { ) -> Constraint {
let reason = Reason::FloatLiteral; let reason = Reason::FloatLiteral;
let mut constrs = Vec::with_capacity(3); let mut constrs = ArrayVec::<_, 3>::new();
let num_type = add_numeric_bound_constr( let num_type = add_numeric_bound_constr(
constraints,
&mut constrs, &mut constrs,
Variable(num_var), Variable(num_var),
bound, bound,
region, region,
Category::Float, Category::Float,
); );
constrs.extend(vec![
Eq( constrs.extend([
constraints.equal_types(
num_type.clone(), num_type.clone(),
ForReason(reason, num_float(Type::Variable(precision_var)), region), ForReason(reason, num_float(Type::Variable(precision_var)), region),
Category::Float, Category::Float,
region, region,
), ),
Eq(num_type, expected, Category::Float, region), constraints.equal_types(num_type, expected, Category::Float, region),
]); ]);
exists(vec![num_var, precision_var], And(constrs)) let and_constraint = constraints.and_constraint(constrs);
constraints.exists([num_var, precision_var], and_constraint)
} }
#[inline(always)] #[inline(always)]
pub fn num_literal( pub fn num_literal(
constraints: &mut Constraints,
num_var: Variable, num_var: Variable,
expected: Expected<Type>, expected: Expected<Type>,
region: Region, region: Region,
@@ -112,23 +122,20 @@ ) -> Constraint {
) -> Constraint { ) -> Constraint {
let open_number_type = crate::builtins::num_num(Type::Variable(num_var)); let open_number_type = crate::builtins::num_num(Type::Variable(num_var));
let mut constrs = Vec::with_capacity(3); let mut constrs = ArrayVec::<_, 2>::new();
let num_type = let num_type = add_numeric_bound_constr(
add_numeric_bound_constr(&mut constrs, open_number_type, bound, region, Category::Num); constraints,
constrs.extend(vec![Eq(num_type, expected, Category::Num, region)]); &mut constrs,
open_number_type,
bound,
region,
Category::Num,
);
exists(vec![num_var], And(constrs)) constrs.extend([constraints.equal_types(num_type, expected, Category::Num, region)]);
}
#[inline(always)] let and_constraint = constraints.and_constraint(constrs);
pub fn exists(flex_vars: Vec<Variable>, constraint: Constraint) -> Constraint { constraints.exists([num_var], and_constraint)
Let(Box::new(LetConstraint {
rigid_vars: Vec::new(),
flex_vars,
def_types: SendMap::default(),
defs_constraint: constraint,
ret_constraint: Constraint::True,
}))
} }
#[inline(always)] #[inline(always)]

File diff suppressed because it is too large

View file

@@ -1,6 +1,5 @@
use crate::expr::constrain_decls;
use roc_builtins::std::StdLib; use roc_builtins::std::StdLib;
use roc_can::constraint::{Constraint, LetConstraint}; use roc_can::constraint::{Constraint, Constraints};
use roc_can::def::Declaration; use roc_can::def::Declaration;
use roc_collections::all::{MutMap, MutSet, SendMap}; use roc_collections::all::{MutMap, MutSet, SendMap};
use roc_module::symbol::{ModuleId, Symbol}; use roc_module::symbol::{ModuleId, Symbol};
@@ -17,13 +16,12 @@ pub enum ExposedModuleTypes {
Valid(MutMap<Symbol, SolvedType>, MutMap<Symbol, Alias>), Valid(MutMap<Symbol, SolvedType>, MutMap<Symbol, Alias>),
} }
pub struct ConstrainedModule { pub fn constrain_module(
pub unused_imports: MutMap<ModuleId, Region>, constraints: &mut Constraints,
pub constraint: Constraint, declarations: &[Declaration],
} home: ModuleId,
) -> Constraint {
pub fn constrain_module(declarations: &[Declaration], home: ModuleId) -> Constraint { crate::expr::constrain_decls(constraints, home, declarations)
constrain_decls(home, declarations)
} }
#[derive(Debug, Clone)] #[derive(Debug, Clone)]
@@ -33,11 +31,11 @@ pub struct Import {
} }
pub fn constrain_imported_values( pub fn constrain_imported_values(
constraints: &mut Constraints,
imports: Vec<Import>, imports: Vec<Import>,
body_con: Constraint, body_con: Constraint,
var_store: &mut VarStore, var_store: &mut VarStore,
) -> (Vec<Variable>, Constraint) { ) -> (Vec<Variable>, Constraint) {
use Constraint::*;
let mut def_types = SendMap::default(); let mut def_types = SendMap::default();
let mut rigid_vars = Vec::new(); let mut rigid_vars = Vec::new();
@@ -84,24 +82,19 @@ pub fn constrain_imported_values(
( (
rigid_vars.clone(), rigid_vars.clone(),
Let(Box::new(LetConstraint { constraints.let_constraint(rigid_vars, [], def_types, Constraint::True, body_con),
rigid_vars,
flex_vars: Vec::new(),
def_types,
defs_constraint: True,
ret_constraint: body_con,
})),
) )
} }
/// Run pre_constrain_imports to get imported_symbols and imported_aliases. /// Run pre_constrain_imports to get imported_symbols and imported_aliases.
pub fn constrain_imports( pub fn constrain_imports(
constraints: &mut Constraints,
imported_symbols: Vec<Import>, imported_symbols: Vec<Import>,
constraint: Constraint, constraint: Constraint,
var_store: &mut VarStore, var_store: &mut VarStore,
) -> Constraint { ) -> Constraint {
let (_introduced_rigids, constraint) = let (_introduced_rigids, constraint) =
constrain_imported_values(imported_symbols, constraint, var_store); constrain_imported_values(constraints, imported_symbols, constraint, var_store);
// TODO determine what to do with those rigids // TODO determine what to do with those rigids
// for var in introduced_rigids { // for var in introduced_rigids {

View file

@@ -1,10 +1,10 @@
use crate::builtins; use crate::builtins;
use crate::expr::{constrain_expr, Env}; use crate::expr::{constrain_expr, Env};
use roc_can::constraint::{Constraint, PresenceConstraint}; use roc_can::constraint::{Constraint, Constraints};
use roc_can::expected::{Expected, PExpected}; use roc_can::expected::{Expected, PExpected};
use roc_can::pattern::Pattern::{self, *}; use roc_can::pattern::Pattern::{self, *};
use roc_can::pattern::{DestructType, RecordDestruct}; use roc_can::pattern::{DestructType, RecordDestruct};
use roc_collections::all::{Index, SendMap}; use roc_collections::all::{HumanIndex, SendMap};
use roc_module::ident::Lowercase; use roc_module::ident::Lowercase;
use roc_module::symbol::Symbol; use roc_module::symbol::Symbol;
use roc_region::all::{Loc, Region}; use roc_region::all::{Loc, Region};
@ -27,7 +27,7 @@ pub struct PatternState {
/// definition has an annotation, we instead now add `x => Int`. /// definition has an annotation, we instead now add `x => Int`.
pub fn headers_from_annotation( pub fn headers_from_annotation(
pattern: &Pattern, pattern: &Pattern,
annotation: &Loc<Type>, annotation: &Loc<&Type>,
) -> Option<SendMap<Symbol, Loc<Type>>> { ) -> Option<SendMap<Symbol, Loc<Type>>> {
let mut headers = SendMap::default(); let mut headers = SendMap::default();
// Check that the annotation structurally agrees with the pattern, preventing e.g. `{ x, y } : Int` // Check that the annotation structurally agrees with the pattern, preventing e.g. `{ x, y } : Int`
@ -44,12 +44,13 @@ pub fn headers_from_annotation(
fn headers_from_annotation_help( fn headers_from_annotation_help(
pattern: &Pattern, pattern: &Pattern,
annotation: &Loc<Type>, annotation: &Loc<&Type>,
headers: &mut SendMap<Symbol, Loc<Type>>, headers: &mut SendMap<Symbol, Loc<Type>>,
) -> bool { ) -> bool {
match pattern { match pattern {
Identifier(symbol) | Shadowed(_, _, symbol) => { Identifier(symbol) | Shadowed(_, _, symbol) => {
headers.insert(*symbol, annotation.clone()); let typ = Loc::at(annotation.region, annotation.value.clone());
headers.insert(*symbol, typ);
true true
} }
Underscore Underscore
@ -106,7 +107,7 @@ fn headers_from_annotation_help(
.all(|(arg_pattern, arg_type)| { .all(|(arg_pattern, arg_type)| {
headers_from_annotation_help( headers_from_annotation_help(
&arg_pattern.1.value, &arg_pattern.1.value,
&Loc::at(annotation.region, arg_type.clone()), &Loc::at(annotation.region, arg_type),
headers, headers,
) )
}) })
@ -135,12 +136,13 @@ fn headers_from_annotation_help(
&& type_arguments.len() == pat_type_arguments.len() && type_arguments.len() == pat_type_arguments.len()
&& lambda_set_variables.len() == pat_lambda_set_variables.len() => && lambda_set_variables.len() == pat_lambda_set_variables.len() =>
{ {
headers.insert(*opaque, annotation.clone()); let typ = Loc::at(annotation.region, annotation.value.clone());
headers.insert(*opaque, typ);
let (_, argument_pat) = &**argument; let (_, argument_pat) = &**argument;
headers_from_annotation_help( headers_from_annotation_help(
&argument_pat.value, &argument_pat.value,
&Loc::at(annotation.region, (**actual).clone()), &Loc::at(annotation.region, actual),
headers, headers,
) )
} }
@ -153,6 +155,7 @@ fn headers_from_annotation_help(
/// initialize the Vecs in PatternState using with_capacity /// initialize the Vecs in PatternState using with_capacity
/// based on its knowledge of their lengths. /// based on its knowledge of their lengths.
pub fn constrain_pattern( pub fn constrain_pattern(
constraints: &mut Constraints,
env: &Env, env: &Env,
pattern: &Pattern, pattern: &Pattern,
region: Region, region: Region,
@ -167,20 +170,18 @@ pub fn constrain_pattern(
// A -> "" // A -> ""
// _ -> "" // _ -> ""
// so, we know that "x" (in this case, a tag union) must be open. // so, we know that "x" (in this case, a tag union) must be open.
state.constraints.push(Constraint::Present( state
expected.get_type(), .constraints
PresenceConstraint::IsOpen, .push(constraints.is_open_type(expected.get_type()));
));
} }
UnsupportedPattern(_) | MalformedPattern(_, _) | OpaqueNotInScope(..) => { UnsupportedPattern(_) | MalformedPattern(_, _) | OpaqueNotInScope(..) => {
// Erroneous patterns don't add any constraints. // Erroneous patterns don't add any constraints.
} }
Identifier(symbol) | Shadowed(_, _, symbol) => { Identifier(symbol) | Shadowed(_, _, symbol) => {
state.constraints.push(Constraint::Present( state
expected.get_type_ref().clone(), .constraints
PresenceConstraint::IsOpen, .push(constraints.is_open_type(expected.get_type_ref().clone()));
));
state.headers.insert( state.headers.insert(
*symbol, *symbol,
Loc { Loc {
@ -196,6 +197,7 @@ pub fn constrain_pattern(
let num_type = builtins::num_num(Type::Variable(var)); let num_type = builtins::num_num(Type::Variable(var));
let num_type = builtins::add_numeric_bound_constr( let num_type = builtins::add_numeric_bound_constr(
constraints,
&mut state.constraints, &mut state.constraints,
num_type, num_type,
bound, bound,
@ -203,11 +205,11 @@ pub fn constrain_pattern(
Category::Num, Category::Num,
); );
state.constraints.push(Constraint::Pattern( state.constraints.push(constraints.equal_pattern_types(
region,
PatternCategory::Num,
num_type, num_type,
expected, expected,
PatternCategory::Num,
region,
)); ));
} }
@ -215,6 +217,7 @@ pub fn constrain_pattern(
// First constraint on the free num var; this improves the resolved type quality in // First constraint on the free num var; this improves the resolved type quality in
// case the bound is an alias. // case the bound is an alias.
let num_type = builtins::add_numeric_bound_constr( let num_type = builtins::add_numeric_bound_constr(
constraints,
&mut state.constraints, &mut state.constraints,
Type::Variable(num_var), Type::Variable(num_var),
bound, bound,
@ -225,7 +228,7 @@ pub fn constrain_pattern(
// Link the free num var with the int var and our expectation. // Link the free num var with the int var and our expectation.
let int_type = builtins::num_int(Type::Variable(precision_var)); let int_type = builtins::num_int(Type::Variable(precision_var));
state.constraints.push(Constraint::Eq( state.constraints.push(constraints.equal_types(
num_type, // TODO check me if something breaks! num_type, // TODO check me if something breaks!
Expected::NoExpectation(int_type), Expected::NoExpectation(int_type),
Category::Int, Category::Int,
@ -233,11 +236,11 @@ pub fn constrain_pattern(
)); ));
// Also constrain the pattern against the num var, again to reuse aliases if they're present. // Also constrain the pattern against the num var, again to reuse aliases if they're present.
state.constraints.push(Constraint::Pattern( state.constraints.push(constraints.equal_pattern_types(
region,
PatternCategory::Int,
Type::Variable(num_var), Type::Variable(num_var),
expected, expected,
PatternCategory::Int,
region,
)); ));
} }
@ -245,6 +248,7 @@ pub fn constrain_pattern(
// First constraint on the free num var; this improves the resolved type quality in // First constraint on the free num var; this improves the resolved type quality in
// case the bound is an alias. // case the bound is an alias.
let num_type = builtins::add_numeric_bound_constr( let num_type = builtins::add_numeric_bound_constr(
constraints,
&mut state.constraints, &mut state.constraints,
Type::Variable(num_var), Type::Variable(num_var),
bound, bound,
@ -255,7 +259,7 @@ pub fn constrain_pattern(
// Link the free num var with the float var and our expectation. // Link the free num var with the float var and our expectation.
let float_type = builtins::num_float(Type::Variable(precision_var)); let float_type = builtins::num_float(Type::Variable(precision_var));
state.constraints.push(Constraint::Eq( state.constraints.push(constraints.equal_types(
num_type.clone(), // TODO check me if something breaks! num_type.clone(), // TODO check me if something breaks!
Expected::NoExpectation(float_type), Expected::NoExpectation(float_type),
Category::Float, Category::Float,
@ -263,29 +267,29 @@ pub fn constrain_pattern(
)); ));
// Also constrain the pattern against the num var, again to reuse aliases if they're present. // Also constrain the pattern against the num var, again to reuse aliases if they're present.
state.constraints.push(Constraint::Pattern( state.constraints.push(constraints.equal_pattern_types(
region,
PatternCategory::Float,
num_type, // TODO check me if something breaks! num_type, // TODO check me if something breaks!
expected, expected,
PatternCategory::Float,
region,
)); ));
} }
StrLiteral(_) => { StrLiteral(_) => {
state.constraints.push(Constraint::Pattern( state.constraints.push(constraints.equal_pattern_types(
region,
PatternCategory::Str,
builtins::str_type(), builtins::str_type(),
expected, expected,
PatternCategory::Str,
region,
)); ));
} }
SingleQuote(_) => { SingleQuote(_) => {
state.constraints.push(Constraint::Pattern( state.constraints.push(constraints.equal_pattern_types(
region,
PatternCategory::Character,
builtins::num_u32(), builtins::num_u32(),
expected, expected,
PatternCategory::Character,
region,
)); ));
} }
@ -322,36 +326,39 @@ pub fn constrain_pattern(
let field_type = match typ { let field_type = match typ {
DestructType::Guard(guard_var, loc_guard) => { DestructType::Guard(guard_var, loc_guard) => {
state.constraints.push(Constraint::Present( state.constraints.push(constraints.pattern_presence(
Type::Variable(*guard_var), Type::Variable(*guard_var),
PresenceConstraint::Pattern(
region,
PatternCategory::PatternGuard,
PExpected::ForReason( PExpected::ForReason(
PReason::PatternGuard, PReason::PatternGuard,
pat_type.clone(), pat_type.clone(),
loc_guard.region, loc_guard.region,
), ),
), PatternCategory::PatternGuard,
region,
)); ));
state.vars.push(*guard_var); state.vars.push(*guard_var);
constrain_pattern(env, &loc_guard.value, loc_guard.region, expected, state); constrain_pattern(
constraints,
env,
&loc_guard.value,
loc_guard.region,
expected,
state,
);
RecordField::Demanded(pat_type) RecordField::Demanded(pat_type)
} }
DestructType::Optional(expr_var, loc_expr) => { DestructType::Optional(expr_var, loc_expr) => {
state.constraints.push(Constraint::Present( state.constraints.push(constraints.pattern_presence(
Type::Variable(*expr_var), Type::Variable(*expr_var),
PresenceConstraint::Pattern(
region,
PatternCategory::PatternDefault,
PExpected::ForReason( PExpected::ForReason(
PReason::OptionalField, PReason::OptionalField,
pat_type.clone(), pat_type.clone(),
loc_expr.region, loc_expr.region,
), ),
), PatternCategory::PatternDefault,
region,
)); ));
state.vars.push(*expr_var); state.vars.push(*expr_var);
@ -362,8 +369,13 @@ pub fn constrain_pattern(
loc_expr.region, loc_expr.region,
); );
let expr_con = let expr_con = constrain_expr(
constrain_expr(env, loc_expr.region, &loc_expr.value, expr_expected); constraints,
env,
loc_expr.region,
&loc_expr.value,
expr_expected,
);
state.constraints.push(expr_con); state.constraints.push(expr_con);
RecordField::Optional(pat_type) RecordField::Optional(pat_type)
@ -381,16 +393,18 @@ pub fn constrain_pattern(
let record_type = Type::Record(field_types, Box::new(ext_type)); let record_type = Type::Record(field_types, Box::new(ext_type));
let whole_con = Constraint::Eq( let whole_con = constraints.equal_types(
Type::Variable(*whole_var), Type::Variable(*whole_var),
Expected::NoExpectation(record_type), Expected::NoExpectation(record_type),
Category::Storage(std::file!(), std::line!()), Category::Storage(std::file!(), std::line!()),
region, region,
); );
let record_con = Constraint::Present( let record_con = constraints.pattern_presence(
Type::Variable(*whole_var), Type::Variable(*whole_var),
PresenceConstraint::Pattern(region, PatternCategory::Record, expected), expected,
PatternCategory::Record,
region,
); );
state.constraints.push(whole_con); state.constraints.push(whole_con);
@ -412,29 +426,36 @@ pub fn constrain_pattern(
let expected = PExpected::ForReason( let expected = PExpected::ForReason(
PReason::TagArg { PReason::TagArg {
tag_name: tag_name.clone(), tag_name: tag_name.clone(),
index: Index::zero_based(index), index: HumanIndex::zero_based(index),
}, },
pattern_type, pattern_type,
region, region,
); );
constrain_pattern(env, &loc_pattern.value, loc_pattern.region, expected, state); constrain_pattern(
constraints,
env,
&loc_pattern.value,
loc_pattern.region,
expected,
state,
);
} }
let pat_category = PatternCategory::Ctor(tag_name.clone()); let pat_category = PatternCategory::Ctor(tag_name.clone());
let whole_con = Constraint::Present( let whole_con = constraints.includes_tag(
expected.clone().get_type(), expected.clone().get_type(),
PresenceConstraint::IncludesTag(
tag_name.clone(), tag_name.clone(),
argument_types.clone(), argument_types.clone(),
region,
pat_category.clone(), pat_category.clone(),
), region,
); );
let tag_con = Constraint::Present( let tag_con = constraints.pattern_presence(
Type::Variable(*whole_var), Type::Variable(*whole_var),
PresenceConstraint::Pattern(region, pat_category, expected), expected,
pat_category,
region,
); );
state.vars.push(*whole_var); state.vars.push(*whole_var);
@ -466,6 +487,7 @@ pub fn constrain_pattern(
// First, add a constraint for the argument "who" // First, add a constraint for the argument "who"
let arg_pattern_expected = PExpected::NoExpectation(arg_pattern_type.clone()); let arg_pattern_expected = PExpected::NoExpectation(arg_pattern_type.clone());
constrain_pattern( constrain_pattern(
constraints,
env, env,
&loc_arg_pattern.value, &loc_arg_pattern.value,
loc_arg_pattern.region, loc_arg_pattern.region,
@ -474,7 +496,7 @@ pub fn constrain_pattern(
); );
// Next, link `whole_var` to the opaque type of "@Id who" // Next, link `whole_var` to the opaque type of "@Id who"
let whole_con = Constraint::Eq( let whole_con = constraints.equal_types(
Type::Variable(*whole_var), Type::Variable(*whole_var),
Expected::NoExpectation(opaque_type), Expected::NoExpectation(opaque_type),
Category::Storage(std::file!(), std::line!()), Category::Storage(std::file!(), std::line!()),
@ -484,7 +506,7 @@ pub fn constrain_pattern(
// Link the entire wrapped opaque type (with the now-constrained argument) to the type // Link the entire wrapped opaque type (with the now-constrained argument) to the type
// variables of the opaque type // variables of the opaque type
// TODO: better expectation here // TODO: better expectation here
let link_type_variables_con = Constraint::Eq( let link_type_variables_con = constraints.equal_types(
(**specialized_def_type).clone(), (**specialized_def_type).clone(),
Expected::NoExpectation(arg_pattern_type), Expected::NoExpectation(arg_pattern_type),
Category::OpaqueWrap(*opaque), Category::OpaqueWrap(*opaque),
@ -492,9 +514,11 @@ pub fn constrain_pattern(
); );
// Next, link `whole_var` (the type of "@Id who") to the expected type // Next, link `whole_var` (the type of "@Id who") to the expected type
let opaque_pattern_con = Constraint::Present( let opaque_pattern_con = constraints.pattern_presence(
Type::Variable(*whole_var), Type::Variable(*whole_var),
PresenceConstraint::Pattern(region, PatternCategory::Opaque(*opaque), expected), expected,
PatternCategory::Opaque(*opaque),
region,
); );
state state
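Throughout the `constrain_pattern` changes above, nothing is returned directly: fresh variables, symbol headers, and constraints are all pushed into a mutable `PatternState`, and the new `constraints.*` builder methods only change how each pushed constraint is created. Below is a toy version of that accumulator shape, with simplified stand-in types rather than roc_constrain's real ones.

```rust
// Toy accumulator: constrain_pattern-style functions push what they learn into
// a mutable state instead of returning a constraint. Simplified stand-ins only.
use std::collections::HashMap;

type Variable = u32;

#[derive(Default, Debug)]
struct PatternState {
    headers: HashMap<&'static str, String>, // bound symbol -> its annotated type
    vars: Vec<Variable>,                    // fresh variables pushed by other pattern arms
    constraints: Vec<String>,               // stand-in for real constraint values
}

fn constrain_identifier(symbol: &'static str, expected: &str, state: &mut PatternState) {
    // Mirrors the Identifier arm above: insert a header for the bound symbol
    // and record that the matched type must be open.
    state.headers.insert(symbol, expected.to_string());
    state.constraints.push(format!("IsOpen({expected})"));
}

fn main() {
    let mut state = PatternState::default();
    constrain_identifier("x", "Num *", &mut state);
    constrain_identifier("y", "Str", &mut state);
    assert_eq!(state.constraints.len(), 2);
    println!("{state:?}");
}
```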


@ -1,4 +1,4 @@
use roc_collections::all::{Index, MutMap}; use roc_collections::all::{HumanIndex, MutMap};
use roc_module::ident::{Lowercase, TagIdIntType, TagName}; use roc_module::ident::{Lowercase, TagIdIntType, TagName};
use roc_region::all::Region; use roc_region::all::Region;
use roc_std::RocDec; use roc_std::RocDec;
@ -70,7 +70,7 @@ pub enum Error {
Redundant { Redundant {
overall_region: Region, overall_region: Region,
branch_region: Region, branch_region: Region,
index: Index, index: HumanIndex,
}, },
} }
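The `Index` to `HumanIndex` rename above touches every place that records which branch or argument an error report refers to. A plausible reading of the new name is that the value is stored zero-based for indexing but meant to be rendered one-based in human-facing messages; the sketch below illustrates that split with an invented `ordinal` helper, so treat the exact API as an assumption rather than roc_collections' real type.

```rust
// Illustrative newtype only: stored zero-based, displayed one-based.
#[derive(Clone, Copy, Debug, PartialEq, Eq)]
struct HumanIndex(usize);

impl HumanIndex {
    fn zero_based(i: usize) -> Self {
        HumanIndex(i)
    }

    fn to_zero_based(self) -> usize {
        self.0
    }

    /// One-based value for human-facing output ("the 3rd branch is redundant").
    fn ordinal(self) -> usize {
        self.0 + 1
    }
}

fn main() {
    let index = HumanIndex::zero_based(2);
    assert_eq!(index.to_zero_based(), 2);
    println!("branch #{} is redundant", index.ordinal());
}
```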


@ -293,6 +293,9 @@ impl<'a> Formattable for TypeAnnotation<'a> {
SpaceBefore(ann, spaces) => { SpaceBefore(ann, spaces) => {
buf.newline(); buf.newline();
buf.indent(indent);
fmt_comments_only(buf, spaces.iter(), NewlineAt::Bottom, indent); fmt_comments_only(buf, spaces.iter(), NewlineAt::Bottom, indent);
ann.format_with_options(buf, parens, Newlines::No, indent) ann.format_with_options(buf, parens, Newlines::No, indent)
} }


@ -55,7 +55,6 @@ impl<'a> Buf<'a> {
pub fn push_str_allow_spaces(&mut self, s: &str) { pub fn push_str_allow_spaces(&mut self, s: &str) {
debug_assert!(!self.beginning_of_line); debug_assert!(!self.beginning_of_line);
debug_assert!(!s.contains('\n'));
self.flush_spaces(); self.flush_spaces();


@ -2984,6 +2984,18 @@ mod test_fmt {
)); ));
} }
#[test]
fn multiline_higher_order_function() {
expr_formats_same(indoc!(
r#"
foo :
(Str -> Bool) -> Bool
42
"#
));
}
#[test] #[test]
/// Test that everything under examples/ is formatted correctly /// Test that everything under examples/ is formatted correctly
/// If this test fails on your diff, it probably means you need to re-format the examples. /// If this test fails on your diff, it probably means you need to re-format the examples.


@ -7,6 +7,7 @@ license = "UPL-1.0"
edition = "2018" edition = "2018"
[dependencies] [dependencies]
roc_alias_analysis = { path = "../alias_analysis" }
roc_collections = { path = "../collections" } roc_collections = { path = "../collections" }
roc_module = { path = "../module" } roc_module = { path = "../module" }
roc_builtins = { path = "../builtins" } roc_builtins = { path = "../builtins" }


@ -13,7 +13,7 @@ use crate::llvm::build_list::{
self, allocate_list, empty_polymorphic_list, list_all, list_any, list_append, list_concat, self, allocate_list, empty_polymorphic_list, list_all, list_any, list_append, list_concat,
list_contains, list_drop_at, list_find_unsafe, list_get_unsafe, list_join, list_keep_errs, list_contains, list_drop_at, list_find_unsafe, list_get_unsafe, list_join, list_keep_errs,
list_keep_if, list_keep_oks, list_len, list_map, list_map2, list_map3, list_map4, list_keep_if, list_keep_oks, list_len, list_map, list_map2, list_map3, list_map4,
list_map_with_index, list_prepend, list_range, list_repeat, list_reverse, list_set, list_map_with_index, list_prepend, list_range, list_repeat, list_replace_unsafe, list_reverse,
list_single, list_sort_with, list_sublist, list_swap, list_single, list_sort_with, list_sublist, list_swap,
}; };
use crate::llvm::build_str::{ use crate::llvm::build_str::{
@ -710,7 +710,7 @@ fn promote_to_main_function<'a, 'ctx, 'env>(
top_level: ProcLayout<'a>, top_level: ProcLayout<'a>,
) -> (&'static str, FunctionValue<'ctx>) { ) -> (&'static str, FunctionValue<'ctx>) {
let it = top_level.arguments.iter().copied(); let it = top_level.arguments.iter().copied();
let bytes = roc_mono::alias_analysis::func_name_bytes_help(symbol, it, &top_level.result); let bytes = roc_alias_analysis::func_name_bytes_help(symbol, it, &top_level.result);
let func_name = FuncName(&bytes); let func_name = FuncName(&bytes);
let func_solutions = mod_solutions.func_solutions(func_name).unwrap(); let func_solutions = mod_solutions.func_solutions(func_name).unwrap();
@ -4045,7 +4045,7 @@ pub fn build_proc_headers<'a, 'ctx, 'env>(
// Populate Procs further and get the low-level Expr from the canonical Expr // Populate Procs further and get the low-level Expr from the canonical Expr
let mut headers = Vec::with_capacity_in(procedures.len(), env.arena); let mut headers = Vec::with_capacity_in(procedures.len(), env.arena);
for ((symbol, layout), proc) in procedures { for ((symbol, layout), proc) in procedures {
let name_bytes = roc_mono::alias_analysis::func_name_bytes(&proc); let name_bytes = roc_alias_analysis::func_name_bytes(&proc);
let func_name = FuncName(&name_bytes); let func_name = FuncName(&name_bytes);
let func_solutions = mod_solutions.func_solutions(func_name).unwrap(); let func_solutions = mod_solutions.func_solutions(func_name).unwrap();
@ -4110,7 +4110,7 @@ fn build_procedures_help<'a, 'ctx, 'env>(
let it = procedures.iter().map(|x| x.1); let it = procedures.iter().map(|x| x.1);
let solutions = match roc_mono::alias_analysis::spec_program(opt_level, entry_point, it) { let solutions = match roc_alias_analysis::spec_program(opt_level, entry_point, it) {
Err(e) => panic!("Error in alias analysis: {}", e), Err(e) => panic!("Error in alias analysis: {}", e),
Ok(solutions) => solutions, Ok(solutions) => solutions,
}; };
@ -4118,7 +4118,7 @@ fn build_procedures_help<'a, 'ctx, 'env>(
let solutions = env.arena.alloc(solutions); let solutions = env.arena.alloc(solutions);
let mod_solutions = solutions let mod_solutions = solutions
.mod_solutions(roc_mono::alias_analysis::MOD_APP) .mod_solutions(roc_alias_analysis::MOD_APP)
.unwrap(); .unwrap();
// Add all the Proc headers to the module. // Add all the Proc headers to the module.
@ -4470,11 +4470,8 @@ pub fn build_proc<'a, 'ctx, 'env>(
// * roc__mainForHost_1_Update_result_size() -> i64 // * roc__mainForHost_1_Update_result_size() -> i64
let it = top_level.arguments.iter().copied(); let it = top_level.arguments.iter().copied();
let bytes = roc_mono::alias_analysis::func_name_bytes_help( let bytes =
symbol, roc_alias_analysis::func_name_bytes_help(symbol, it, &top_level.result);
it,
&top_level.result,
);
let func_name = FuncName(&bytes); let func_name = FuncName(&bytes);
let func_solutions = mod_solutions.func_solutions(func_name).unwrap(); let func_solutions = mod_solutions.func_solutions(func_name).unwrap();
@ -5666,12 +5663,12 @@ fn run_low_level<'a, 'ctx, 'env>(
wrapper_struct, wrapper_struct,
) )
} }
ListSet => { ListReplaceUnsafe => {
let list = load_symbol(scope, &args[0]); let list = load_symbol(scope, &args[0]);
let index = load_symbol(scope, &args[1]); let index = load_symbol(scope, &args[1]);
let (element, element_layout) = load_symbol_and_layout(scope, &args[2]); let (element, element_layout) = load_symbol_and_layout(scope, &args[2]);
list_set( list_replace_unsafe(
env, env,
layout_ids, layout_ids,
list, list,


@ -291,52 +291,70 @@ pub fn list_drop_at<'a, 'ctx, 'env>(
) )
} }
/// List.set : List elem, Nat, elem -> List elem /// List.replace_unsafe : List elem, Nat, elem -> { list: List elem, value: elem }
pub fn list_set<'a, 'ctx, 'env>( pub fn list_replace_unsafe<'a, 'ctx, 'env>(
env: &Env<'a, 'ctx, 'env>, env: &Env<'a, 'ctx, 'env>,
layout_ids: &mut LayoutIds<'a>, _layout_ids: &mut LayoutIds<'a>,
list: BasicValueEnum<'ctx>, list: BasicValueEnum<'ctx>,
index: IntValue<'ctx>, index: IntValue<'ctx>,
element: BasicValueEnum<'ctx>, element: BasicValueEnum<'ctx>,
element_layout: &Layout<'a>, element_layout: &Layout<'a>,
update_mode: UpdateMode, update_mode: UpdateMode,
) -> BasicValueEnum<'ctx> { ) -> BasicValueEnum<'ctx> {
let dec_element_fn = build_dec_wrapper(env, layout_ids, element_layout); let element_type = basic_type_from_layout(env, element_layout);
let element_ptr = env
.builder
.build_alloca(element_type, "output_element_as_opaque");
let (length, bytes) = load_list( // Assume the bounds have already been checked earlier
env.builder, // (e.g. by List.replace or List.set, which wrap List.#replaceUnsafe)
list.into_struct_value(), let new_list = match update_mode {
env.context.i8_type().ptr_type(AddressSpace::Generic), UpdateMode::InPlace => call_list_bitcode_fn(
);
let new_bytes = match update_mode {
UpdateMode::InPlace => call_bitcode_fn(
env, env,
&[ &[
bytes.into(), pass_list_cc(env, list),
index.into(), index.into(),
pass_element_as_opaque(env, element, *element_layout), pass_element_as_opaque(env, element, *element_layout),
layout_width(env, element_layout), layout_width(env, element_layout),
dec_element_fn.as_global_value().as_pointer_value().into(), pass_as_opaque(env, element_ptr),
], ],
bitcode::LIST_SET_IN_PLACE, bitcode::LIST_REPLACE_IN_PLACE,
), ),
UpdateMode::Immutable => call_bitcode_fn( UpdateMode::Immutable => call_list_bitcode_fn(
env, env,
&[ &[
bytes.into(), pass_list_cc(env, list),
length.into(),
env.alignment_intvalue(element_layout), env.alignment_intvalue(element_layout),
index.into(), index.into(),
pass_element_as_opaque(env, element, *element_layout), pass_element_as_opaque(env, element, *element_layout),
layout_width(env, element_layout), layout_width(env, element_layout),
dec_element_fn.as_global_value().as_pointer_value().into(), pass_as_opaque(env, element_ptr),
], ],
bitcode::LIST_SET, bitcode::LIST_REPLACE,
), ),
}; };
store_list(env, new_bytes.into_pointer_value(), length) // Load the element and returned list into a struct.
let old_element = env.builder.build_load(element_ptr, "load_element");
let result = env
.context
.struct_type(
&[super::convert::zig_list_type(env).into(), element_type],
false,
)
.const_zero();
let result = env
.builder
.build_insert_value(result, new_list, 0, "insert_list")
.unwrap();
env.builder
.build_insert_value(result, old_element, 1, "insert_value")
.unwrap()
.into_struct_value()
.into()
} }
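The rewritten `list_replace_unsafe` above passes an out-pointer for the displaced element into the bitcode call, then packs the returned list and that element into a two-field struct. The sketch below is a plain-Rust illustration of the resulting value semantics (the caller gets back both the updated list and the old element); it is not the LLVM lowering and not Roc's actual builtin body.

```rust
// Plain-Rust analogue of the value produced above: replacement hands back both
// the updated list and the element it displaced.
struct ReplaceResult<T> {
    list: Vec<T>,
    value: T,
}

fn replace_unsafe<T>(mut list: Vec<T>, index: usize, element: T) -> ReplaceResult<T> {
    // Like the bitcode call, this assumes the caller already bounds-checked `index`.
    let value = std::mem::replace(&mut list[index], element);
    ReplaceResult { list, value }
}

fn main() {
    let result = replace_unsafe(vec![1, 2, 3], 1, 99);
    assert_eq!(result.list, vec![1, 99, 3]);
    assert_eq!(result.value, 2);
}
```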
fn bounds_check_comparison<'ctx>( fn bounds_check_comparison<'ctx>(


@ -260,14 +260,14 @@ impl<'a> LowLevelCall<'a> {
_ => internal_error!("invalid storage for List"), _ => internal_error!("invalid storage for List"),
}, },
ListGetUnsafe | ListSet | ListSingle | ListRepeat | ListReverse | ListConcat ListGetUnsafe | ListSingle | ListRepeat | ListReverse | ListConcat | ListContains
| ListContains | ListAppend | ListPrepend | ListJoin | ListRange | ListMap | ListAppend | ListPrepend | ListJoin | ListRange | ListMap | ListMap2 | ListMap3
| ListMap2 | ListMap3 | ListMap4 | ListMapWithIndex | ListKeepIf | ListWalk | ListMap4 | ListMapWithIndex | ListKeepIf | ListWalk | ListWalkUntil
| ListWalkUntil | ListWalkBackwards | ListKeepOks | ListKeepErrs | ListSortWith | ListWalkBackwards | ListKeepOks | ListKeepErrs | ListSortWith | ListSublist
| ListSublist | ListDropAt | ListSwap | ListAny | ListAll | ListFindUnsafe | ListDropAt | ListSwap | ListAny | ListAll | ListFindUnsafe | DictSize | DictEmpty
| DictSize | DictEmpty | DictInsert | DictRemove | DictContains | DictGetUnsafe | DictInsert | DictRemove | DictContains | DictGetUnsafe | DictKeys | DictValues
| DictKeys | DictValues | DictUnion | DictIntersection | DictDifference | DictWalk | DictUnion | DictIntersection | DictDifference | DictWalk | SetFromList
| SetFromList | SetToDict => { | SetToDict | ListReplaceUnsafe => {
todo!("{:?}", self.lowlevel); todo!("{:?}", self.lowlevel);
} }


@ -5,14 +5,14 @@ use crossbeam::deque::{Injector, Stealer, Worker};
use crossbeam::thread; use crossbeam::thread;
use parking_lot::Mutex; use parking_lot::Mutex;
use roc_builtins::std::StdLib; use roc_builtins::std::StdLib;
use roc_can::constraint::Constraint; use roc_can::constraint::{Constraint as ConstraintSoa, Constraints};
use roc_can::def::Declaration; use roc_can::def::Declaration;
use roc_can::module::{canonicalize_module_defs, Module}; use roc_can::module::{canonicalize_module_defs, Module};
use roc_collections::all::{default_hasher, BumpMap, MutMap, MutSet}; use roc_collections::all::{default_hasher, BumpMap, MutMap, MutSet};
use roc_constrain::module::{ use roc_constrain::module::{
constrain_imports, pre_constrain_imports, ConstrainableImports, Import, constrain_imports, constrain_module, pre_constrain_imports, ConstrainableImports,
ExposedModuleTypes, Import, SubsByModule,
}; };
use roc_constrain::module::{constrain_module, ExposedModuleTypes, SubsByModule};
use roc_module::ident::{Ident, ModuleName, QualifiedModuleName}; use roc_module::ident::{Ident, ModuleName, QualifiedModuleName};
use roc_module::symbol::{ use roc_module::symbol::{
IdentIds, Interns, ModuleId, ModuleIds, PQModuleName, PackageModuleIds, PackageQualified, IdentIds, Interns, ModuleId, ModuleIds, PQModuleName, PackageModuleIds, PackageQualified,
@ -294,6 +294,7 @@ fn start_phase<'a>(
module, module,
ident_ids, ident_ids,
module_timing, module_timing,
constraints,
constraint, constraint,
var_store, var_store,
imported_modules, imported_modules,
@ -306,6 +307,7 @@ fn start_phase<'a>(
module, module,
ident_ids, ident_ids,
module_timing, module_timing,
constraints,
constraint, constraint,
var_store, var_store,
imported_modules, imported_modules,
@ -456,7 +458,8 @@ struct ConstrainedModule {
module: Module, module: Module,
declarations: Vec<Declaration>, declarations: Vec<Declaration>,
imported_modules: MutMap<ModuleId, Region>, imported_modules: MutMap<ModuleId, Region>,
constraint: Constraint, constraints: Constraints,
constraint: ConstraintSoa,
ident_ids: IdentIds, ident_ids: IdentIds,
var_store: VarStore, var_store: VarStore,
dep_idents: MutMap<ModuleId, IdentIds>, dep_idents: MutMap<ModuleId, IdentIds>,
@ -567,7 +570,7 @@ enum Msg<'a> {
}, },
FinishedAllTypeChecking { FinishedAllTypeChecking {
solved_subs: Solved<Subs>, solved_subs: Solved<Subs>,
exposed_vars_by_symbol: MutMap<Symbol, Variable>, exposed_vars_by_symbol: Vec<(Symbol, Variable)>,
exposed_aliases_by_symbol: MutMap<Symbol, Alias>, exposed_aliases_by_symbol: MutMap<Symbol, Alias>,
exposed_values: Vec<Symbol>, exposed_values: Vec<Symbol>,
dep_idents: MutMap<ModuleId, IdentIds>, dep_idents: MutMap<ModuleId, IdentIds>,
@ -793,7 +796,8 @@ enum BuildTask<'a> {
ident_ids: IdentIds, ident_ids: IdentIds,
imported_symbols: Vec<Import>, imported_symbols: Vec<Import>,
module_timing: ModuleTiming, module_timing: ModuleTiming,
constraint: Constraint, constraints: Constraints,
constraint: ConstraintSoa,
var_store: VarStore, var_store: VarStore,
declarations: Vec<Declaration>, declarations: Vec<Declaration>,
dep_idents: MutMap<ModuleId, IdentIds>, dep_idents: MutMap<ModuleId, IdentIds>,
@ -1100,7 +1104,7 @@ fn load<'a>(
) -> Result<LoadResult<'a>, LoadingProblem<'a>> { ) -> Result<LoadResult<'a>, LoadingProblem<'a>> {
// When compiling to wasm, we cannot spawn extra threads // When compiling to wasm, we cannot spawn extra threads
// so we have a single-threaded implementation // so we have a single-threaded implementation
if cfg!(target_family = "wasm") { if true || cfg!(target_family = "wasm") {
load_single_threaded( load_single_threaded(
arena, arena,
load_start, load_start,
@ -2318,7 +2322,7 @@ fn finish(
solved: Solved<Subs>, solved: Solved<Subs>,
exposed_values: Vec<Symbol>, exposed_values: Vec<Symbol>,
exposed_aliases_by_symbol: MutMap<Symbol, Alias>, exposed_aliases_by_symbol: MutMap<Symbol, Alias>,
exposed_vars_by_symbol: MutMap<Symbol, Variable>, exposed_vars_by_symbol: Vec<(Symbol, Variable)>,
dep_idents: MutMap<ModuleId, IdentIds>, dep_idents: MutMap<ModuleId, IdentIds>,
documentation: MutMap<ModuleId, ModuleDocumentation>, documentation: MutMap<ModuleId, ModuleDocumentation>,
) -> LoadedModule { ) -> LoadedModule {
@ -2548,6 +2552,7 @@ fn load_module<'a>(
Loc::at_zero(ExposedName::new("isEmpty")), Loc::at_zero(ExposedName::new("isEmpty")),
Loc::at_zero(ExposedName::new("get")), Loc::at_zero(ExposedName::new("get")),
Loc::at_zero(ExposedName::new("set")), Loc::at_zero(ExposedName::new("set")),
Loc::at_zero(ExposedName::new("replace")),
Loc::at_zero(ExposedName::new("append")), Loc::at_zero(ExposedName::new("append")),
Loc::at_zero(ExposedName::new("map")), Loc::at_zero(ExposedName::new("map")),
Loc::at_zero(ExposedName::new("len")), Loc::at_zero(ExposedName::new("len")),
@ -2637,6 +2642,7 @@ fn load_module<'a>(
get : List a, Nat -> Result a [ OutOfBounds ]* get : List a, Nat -> Result a [ OutOfBounds ]*
set : List a, Nat, a -> List a set : List a, Nat, a -> List a
replace : List a, Nat, a -> { list : List a, value : a }
append : List a, a -> List a append : List a, a -> List a
prepend : List a, a -> List a prepend : List a, a -> List a
len : List a -> Nat len : List a -> Nat
@ -4271,7 +4277,8 @@ impl<'a> BuildTask<'a> {
module: Module, module: Module,
ident_ids: IdentIds, ident_ids: IdentIds,
module_timing: ModuleTiming, module_timing: ModuleTiming,
constraint: Constraint, constraints: Constraints,
constraint: ConstraintSoa,
var_store: VarStore, var_store: VarStore,
imported_modules: MutMap<ModuleId, Region>, imported_modules: MutMap<ModuleId, Region>,
exposed_types: &mut SubsByModule, exposed_types: &mut SubsByModule,
@ -4301,6 +4308,7 @@ impl<'a> BuildTask<'a> {
module, module,
ident_ids, ident_ids,
imported_symbols, imported_symbols,
constraints,
constraint, constraint,
var_store, var_store,
declarations, declarations,
@ -4317,7 +4325,8 @@ fn run_solve<'a>(
ident_ids: IdentIds, ident_ids: IdentIds,
mut module_timing: ModuleTiming, mut module_timing: ModuleTiming,
imported_symbols: Vec<Import>, imported_symbols: Vec<Import>,
constraint: Constraint, mut constraints: Constraints,
constraint: ConstraintSoa,
mut var_store: VarStore, mut var_store: VarStore,
decls: Vec<Declaration>, decls: Vec<Declaration>,
dep_idents: MutMap<ModuleId, IdentIds>, dep_idents: MutMap<ModuleId, IdentIds>,
@ -4328,7 +4337,12 @@ fn run_solve<'a>(
// Finish constraining the module by wrapping the existing Constraint // Finish constraining the module by wrapping the existing Constraint
// in the ones we just computed. We can do this off the main thread. // in the ones we just computed. We can do this off the main thread.
let constraint = constrain_imports(imported_symbols, constraint, &mut var_store); let constraint = constrain_imports(
&mut constraints,
imported_symbols,
constraint,
&mut var_store,
);
let constrain_end = SystemTime::now(); let constrain_end = SystemTime::now();
@ -4341,25 +4355,25 @@ fn run_solve<'a>(
.. ..
} = module; } = module;
if false { // TODO
debug_assert!(constraint.validate(), "{:?}", &constraint); // if false { debug_assert!(constraint.validate(), "{:?}", &constraint); }
}
let (solved_subs, solved_env, problems) = let (solved_subs, solved_env, problems) =
roc_solve::module::run_solve(aliases, rigid_variables, constraint, var_store); roc_solve::module::run_solve(&constraints, constraint, rigid_variables, var_store);
let mut exposed_vars_by_symbol: MutMap<Symbol, Variable> = solved_env.vars_by_symbol.clone(); let exposed_vars_by_symbol: Vec<_> = solved_env
exposed_vars_by_symbol.retain(|k, _| exposed_symbols.contains(k)); .vars_by_symbol()
.filter(|(k, _)| exposed_symbols.contains(k))
.collect();
let solved_types = let solved_types = roc_solve::module::make_solved_types(&solved_subs, &exposed_vars_by_symbol);
roc_solve::module::make_solved_types(&solved_env, &solved_subs, &exposed_vars_by_symbol);
let solved_module = SolvedModule { let solved_module = SolvedModule {
exposed_vars_by_symbol, exposed_vars_by_symbol,
exposed_symbols: exposed_symbols.into_iter().collect::<Vec<_>>(), exposed_symbols: exposed_symbols.into_iter().collect::<Vec<_>>(),
solved_types, solved_types,
problems, problems,
aliases: solved_env.aliases, aliases,
}; };
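The `exposed_vars_by_symbol` change above swaps a retained `MutMap` for a `Vec<(Symbol, Variable)>` built by filtering the solved environment's pairs against the exposed-symbol set. Here is a standalone version of that filter, with stand-in `Symbol` and `Variable` types rather than roc_module's real ones.

```rust
// Standalone sketch of collecting only the exposed (symbol, variable) pairs
// into a plain Vec of tuples.
use std::collections::HashSet;

type Symbol = &'static str;
type Variable = u32;

fn exposed_vars_by_symbol(
    vars_by_symbol: impl Iterator<Item = (Symbol, Variable)>,
    exposed_symbols: &HashSet<Symbol>,
) -> Vec<(Symbol, Variable)> {
    vars_by_symbol
        .filter(|(symbol, _)| exposed_symbols.contains(symbol))
        .collect()
}

fn main() {
    let exposed: HashSet<Symbol> = ["main", "helper"].into_iter().collect();
    let all = vec![("main", 1u32), ("internal", 2), ("helper", 3)];
    let exposed_vars = exposed_vars_by_symbol(all.into_iter(), &exposed);
    assert_eq!(exposed_vars, vec![("main", 1), ("helper", 3)]);
}
```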
// Record the final timings // Record the final timings
@ -4490,7 +4504,9 @@ fn canonicalize_and_constrain<'a>(
} }
}; };
let constraint = constrain_module(&module_output.declarations, module_id); let mut constraints = Constraints::new();
let constraint =
constrain_module(&mut constraints, &module_output.declarations, module_id);
let module = Module { let module = Module {
module_id, module_id,
@ -4506,6 +4522,7 @@ fn canonicalize_and_constrain<'a>(
declarations: module_output.declarations, declarations: module_output.declarations,
imported_modules, imported_modules,
var_store, var_store,
constraints,
constraint, constraint,
ident_ids: module_output.ident_ids, ident_ids: module_output.ident_ids,
dep_idents, dep_idents,
@ -4988,6 +5005,7 @@ fn run_task<'a>(
module, module,
module_timing, module_timing,
imported_symbols, imported_symbols,
constraints,
constraint, constraint,
var_store, var_store,
ident_ids, ident_ids,
@ -4999,6 +5017,7 @@ fn run_task<'a>(
ident_ids, ident_ids,
module_timing, module_timing,
imported_symbols, imported_symbols,
constraints,
constraint, constraint,
var_store, var_store,
declarations, declarations,


@ -25,9 +25,9 @@ pub enum LowLevel {
StrToNum, StrToNum,
ListLen, ListLen,
ListGetUnsafe, ListGetUnsafe,
ListSet,
ListSingle, ListSingle,
ListRepeat, ListRepeat,
ListReplaceUnsafe,
ListReverse, ListReverse,
ListConcat, ListConcat,
ListContains, ListContains,
@ -229,7 +229,7 @@ impl LowLevelWrapperType {
Symbol::STR_TO_I8 => WrapperIsRequired, Symbol::STR_TO_I8 => WrapperIsRequired,
Symbol::LIST_LEN => CanBeReplacedBy(ListLen), Symbol::LIST_LEN => CanBeReplacedBy(ListLen),
Symbol::LIST_GET => WrapperIsRequired, Symbol::LIST_GET => WrapperIsRequired,
Symbol::LIST_SET => WrapperIsRequired, Symbol::LIST_REPLACE => WrapperIsRequired,
Symbol::LIST_SINGLE => CanBeReplacedBy(ListSingle), Symbol::LIST_SINGLE => CanBeReplacedBy(ListSingle),
Symbol::LIST_REPEAT => CanBeReplacedBy(ListRepeat), Symbol::LIST_REPEAT => CanBeReplacedBy(ListRepeat),
Symbol::LIST_REVERSE => CanBeReplacedBy(ListReverse), Symbol::LIST_REVERSE => CanBeReplacedBy(ListReverse),


@ -1142,6 +1142,7 @@ define_builtins! {
55 LIST_SORT_ASC: "sortAsc" 55 LIST_SORT_ASC: "sortAsc"
56 LIST_SORT_DESC: "sortDesc" 56 LIST_SORT_DESC: "sortDesc"
57 LIST_SORT_DESC_COMPARE: "#sortDescCompare" 57 LIST_SORT_DESC_COMPARE: "#sortDescCompare"
58 LIST_REPLACE: "replace"
} }
5 RESULT: "Result" => { 5 RESULT: "Result" => {
0 RESULT_RESULT: "Result" // the Result.Result type alias 0 RESULT_RESULT: "Result" // the Result.Result type alias


@ -934,7 +934,7 @@ pub fn lowlevel_borrow_signature(arena: &Bump, op: LowLevel) -> &[bool] {
// - other refcounted arguments are Borrowed // - other refcounted arguments are Borrowed
match op { match op {
ListLen | StrIsEmpty | StrCountGraphemes => arena.alloc_slice_copy(&[borrowed]), ListLen | StrIsEmpty | StrCountGraphemes => arena.alloc_slice_copy(&[borrowed]),
ListSet => arena.alloc_slice_copy(&[owned, irrelevant, irrelevant]), ListReplaceUnsafe => arena.alloc_slice_copy(&[owned, irrelevant, irrelevant]),
ListGetUnsafe => arena.alloc_slice_copy(&[borrowed, irrelevant]), ListGetUnsafe => arena.alloc_slice_copy(&[borrowed, irrelevant]),
ListConcat => arena.alloc_slice_copy(&[owned, owned]), ListConcat => arena.alloc_slice_copy(&[owned, owned]),
StrConcat => arena.alloc_slice_copy(&[owned, borrowed]), StrConcat => arena.alloc_slice_copy(&[owned, borrowed]),
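The borrow-signature line above gives `ListReplaceUnsafe` the flags `[owned, irrelevant, irrelevant]`: the list argument is taken by ownership, presumably so the op can update it in place or reuse its storage, while the index and element positions carry no borrow decision here. The sketch below shows the general shape of such per-argument signatures; the string keys and the enum are simplifications, not roc_mono's actual representation.

```rust
// Simplified per-argument borrow signatures: one flag per argument saying how
// the lowlevel op treats it. Not roc_mono's actual representation.
#[derive(Clone, Copy, Debug)]
enum Ownership {
    Owned,      // the op consumes the value and may reuse or mutate its storage
    Borrowed,   // the op only reads the value
    Irrelevant, // arguments with no refcounting consequences (e.g. integers)
}

use Ownership::*;

fn borrow_signature(op: &str) -> &'static [Ownership] {
    match op {
        // Mirrors the diff: the list is owned, index and element are irrelevant.
        "ListReplaceUnsafe" => &[Owned, Irrelevant, Irrelevant],
        "ListGetUnsafe" => &[Borrowed, Irrelevant],
        _ => &[],
    }
}

fn main() {
    assert!(matches!(
        borrow_signature("ListReplaceUnsafe"),
        [Owned, Irrelevant, Irrelevant]
    ));
}
```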


@ -1,5 +1,5 @@
use crate::ir::DestructType; use crate::ir::DestructType;
use roc_collections::all::Index; use roc_collections::all::HumanIndex;
use roc_exhaustive::{ use roc_exhaustive::{
is_useful, Context, Ctor, Error, Guard, Literal, Pattern, RenderAs, TagId, Union, is_useful, Context, Ctor, Error, Guard, Literal, Pattern, RenderAs, TagId, Union,
}; };
@ -189,7 +189,7 @@ fn to_nonredundant_rows(
return Err(Error::Redundant { return Err(Error::Redundant {
overall_region, overall_region,
branch_region: region, branch_region: region,
index: Index::zero_based(checked_rows.len()), index: HumanIndex::zero_based(checked_rows.len()),
}); });
} }
} }


@ -2,7 +2,6 @@
// See github.com/rtfeldman/roc/issues/800 for discussion of the large_enum_variant check. // See github.com/rtfeldman/roc/issues/800 for discussion of the large_enum_variant check.
#![allow(clippy::large_enum_variant, clippy::upper_case_acronyms)] #![allow(clippy::large_enum_variant, clippy::upper_case_acronyms)]
pub mod alias_analysis;
pub mod borrow; pub mod borrow;
pub mod code_gen_help; pub mod code_gen_help;
pub mod inc_dec; pub mod inc_dec;


@ -711,9 +711,8 @@ where
let cur_indent = INDENT.with(|i| *i.borrow()); let cur_indent = INDENT.with(|i| *i.borrow());
println!( println!(
"@{:>5}:{:<5}: {}{:<50}", "{:>5?}: {}{:<50}",
state.line, state.pos(),
state.column,
&indent_text[..cur_indent * 2], &indent_text[..cur_indent * 2],
self.message self.message
); );
@ -728,9 +727,8 @@ where
}; };
println!( println!(
"@{:>5}:{:<5}: {}{:<50} {:<15} {:?}", "{:<5?}: {}{:<50} {:<15} {:?}",
state.line, state.pos(),
state.column,
&indent_text[..cur_indent * 2], &indent_text[..cur_indent * 2],
self.message, self.message,
format!("{:?}", progress), format!("{:?}", progress),
@ -1217,7 +1215,11 @@ macro_rules! collection_trailing_sep_e {
$indent_problem $indent_problem
) )
), ),
$crate::blankspace::space0_e($min_indent, $indent_problem) $crate::blankspace::space0_e(
// we use min_indent=0 because we want to parse incorrectly indented closing braces
// and later fix these up in the formatter.
0 /* min_indent */,
$indent_problem)
).parse(arena, state)?; ).parse(arena, state)?;
let (_,_, state) = let (_,_, state) =
@ -1404,21 +1406,6 @@ where
} }
} }
pub fn check_indent<'a, TE, E>(min_indent: u32, to_problem: TE) -> impl Parser<'a, (), E>
where
TE: Fn(Position) -> E,
E: 'a,
{
move |_arena, state: State<'a>| {
dbg!(state.indent_column, min_indent);
if state.indent_column < min_indent {
Err((NoProgress, to_problem(state.pos()), state))
} else {
Ok((NoProgress, (), state))
}
}
}
#[macro_export] #[macro_export]
macro_rules! word1_check_indent { macro_rules! word1_check_indent {
($word:expr, $word_problem:expr, $min_indent:expr, $indent_problem:expr) => { ($word:expr, $word_problem:expr, $min_indent:expr, $indent_problem:expr) => {


@ -0,0 +1,74 @@
Defs(
[
@0-58 Body(
@0-7 Malformed(
"my_list",
),
@10-58 List(
Collection {
items: [
@16-17 SpaceBefore(
Num(
"0",
),
[
Newline,
],
),
@23-48 SpaceBefore(
List(
Collection {
items: [
@33-34 SpaceBefore(
Var {
module_name: "",
ident: "a",
},
[
Newline,
],
),
@44-45 SpaceBefore(
Var {
module_name: "",
ident: "b",
},
[
Newline,
],
),
],
final_comments: [
Newline,
],
},
),
[
Newline,
],
),
@54-55 SpaceBefore(
Num(
"1",
),
[
Newline,
],
),
],
final_comments: [
Newline,
],
},
),
),
],
@59-61 SpaceBefore(
Num(
"42",
),
[
Newline,
],
),
)


@ -0,0 +1,9 @@
my_list = [
0,
[
a,
b,
],
1,
]
42


@ -0,0 +1,42 @@
Defs(
[
@0-26 Body(
@0-7 Malformed(
"my_list",
),
@10-26 List(
Collection {
items: [
@16-17 SpaceBefore(
Num(
"0",
),
[
Newline,
],
),
@23-24 SpaceBefore(
Num(
"1",
),
[
Newline,
],
),
],
final_comments: [
Newline,
],
},
),
),
],
@27-29 SpaceBefore(
Num(
"42",
),
[
Newline,
],
),
)


@ -0,0 +1,5 @@
my_list = [
0,
1
]
42


@ -0,0 +1,42 @@
Defs(
[
@0-27 Body(
@0-7 Malformed(
"my_list",
),
@10-27 List(
Collection {
items: [
@16-17 SpaceBefore(
Num(
"0",
),
[
Newline,
],
),
@23-24 SpaceBefore(
Num(
"1",
),
[
Newline,
],
),
],
final_comments: [
Newline,
],
},
),
),
],
@28-30 SpaceBefore(
Num(
"42",
),
[
Newline,
],
),
)


@ -0,0 +1,5 @@
my_list = [
0,
1,
]
42


@ -75,19 +75,32 @@ mod test_parse {
let mut base = std::path::PathBuf::from("tests"); let mut base = std::path::PathBuf::from("tests");
base.push("snapshots"); base.push("snapshots");
let pass_or_fail_names = list(&base); let pass_or_fail_names = list(&base);
let mut extra_test_files = std::collections::HashSet::new();
for res in pass_or_fail_names { for res in pass_or_fail_names {
assert!(res == "pass" || res == "fail"); assert!(res == "pass" || res == "fail");
let res_dir = base.join(&res); let res_dir = base.join(&res);
for file in list(&res_dir) { for file in list(&res_dir) {
if let Some(file) = file.strip_suffix(".roc") { let test = if let Some(test) = file.strip_suffix(".roc") {
assert!(tests.contains(format!("{}/{}", &res, file).as_str()), "{}", file); test
} else if let Some(file) = file.strip_suffix(".result-ast") { } else if let Some(test) = file.strip_suffix(".result-ast") {
assert!(tests.contains(format!("{}/{}", &res, file).as_str()), "{}", file); test
} else { } else {
panic!("unexpected test file found: {}", file); panic!("unexpected file found in tests/snapshots: {}", file);
};
let test_name = format!("{}/{}", &res, test);
if !tests.contains(test_name.as_str()) {
extra_test_files.insert(test_name);
} }
} }
} }
if extra_test_files.len() > 0 {
eprintln!("Found extra test files:");
for file in extra_test_files {
eprintln!("{}", file);
}
panic!("Add entries for these in the `snapshot_tests!` macro in test_parse.rs");
}
} }
$( $(
@ -109,6 +122,7 @@ mod test_parse {
snapshot_tests! { snapshot_tests! {
fail/type_argument_no_arrow.expr, fail/type_argument_no_arrow.expr,
fail/type_double_comma.expr, fail/type_double_comma.expr,
pass/list_closing_indent_not_enough.expr,
pass/add_var_with_spaces.expr, pass/add_var_with_spaces.expr,
pass/add_with_spaces.expr, pass/add_with_spaces.expr,
pass/annotated_record_destructure.expr, pass/annotated_record_destructure.expr,
@ -154,6 +168,8 @@ mod test_parse {
pass/int_with_underscore.expr, pass/int_with_underscore.expr,
pass/interface_with_newline.header, pass/interface_with_newline.header,
pass/lowest_float.expr, pass/lowest_float.expr,
pass/list_closing_same_indent_no_trailing_comma.expr,
pass/list_closing_same_indent_with_trailing_comma.expr,
pass/lowest_int.expr, pass/lowest_int.expr,
pass/malformed_ident_due_to_underscore.expr, pass/malformed_ident_due_to_underscore.expr,
pass/malformed_pattern_field_access.expr, // See https://github.com/rtfeldman/roc/issues/399 pass/malformed_pattern_field_access.expr, // See https://github.com/rtfeldman/roc/issues/399
@ -278,15 +294,11 @@ mod test_parse {
let result = func(&input); let result = func(&input);
let actual_result = if should_pass { let actual_result = if should_pass {
eprintln!("The source code for this test did not successfully parse!\n"); result.expect("The source code for this test did not successfully parse!")
result.unwrap()
} else { } else {
eprintln!( result.expect_err(
"The source code for this test successfully parsed, but it was not expected to!\n" "The source code for this test successfully parsed, but it was not expected to!",
); )
result.unwrap_err()
}; };
if std::env::var("ROC_PARSER_SNAPSHOT_TEST_OVERWRITE").is_ok() { if std::env::var("ROC_PARSER_SNAPSHOT_TEST_OVERWRITE").is_ok() {


@ -1,5 +1,5 @@
use crate::solve; use crate::solve;
use roc_can::constraint::Constraint; use roc_can::constraint::{Constraint as ConstraintSoa, Constraints};
use roc_collections::all::MutMap; use roc_collections::all::MutMap;
use roc_module::ident::Lowercase; use roc_module::ident::Lowercase;
use roc_module::symbol::Symbol; use roc_module::symbol::Symbol;
@ -12,20 +12,17 @@ pub struct SolvedModule {
pub solved_types: MutMap<Symbol, SolvedType>, pub solved_types: MutMap<Symbol, SolvedType>,
pub aliases: MutMap<Symbol, Alias>, pub aliases: MutMap<Symbol, Alias>,
pub exposed_symbols: Vec<Symbol>, pub exposed_symbols: Vec<Symbol>,
pub exposed_vars_by_symbol: MutMap<Symbol, Variable>, pub exposed_vars_by_symbol: Vec<(Symbol, Variable)>,
pub problems: Vec<solve::TypeError>, pub problems: Vec<solve::TypeError>,
} }
pub fn run_solve( pub fn run_solve(
aliases: MutMap<Symbol, Alias>, constraints: &Constraints,
constraint: ConstraintSoa,
rigid_variables: MutMap<Variable, Lowercase>, rigid_variables: MutMap<Variable, Lowercase>,
constraint: Constraint,
var_store: VarStore, var_store: VarStore,
) -> (Solved<Subs>, solve::Env, Vec<solve::TypeError>) { ) -> (Solved<Subs>, solve::Env, Vec<solve::TypeError>) {
let env = solve::Env { let env = solve::Env::default();
vars_by_symbol: MutMap::default(),
aliases,
};
let mut subs = Subs::new_from_varstore(var_store); let mut subs = Subs::new_from_varstore(var_store);
@ -38,46 +35,17 @@ pub fn run_solve(
let mut problems = Vec::new(); let mut problems = Vec::new();
// Run the solver to populate Subs. // Run the solver to populate Subs.
let (solved_subs, solved_env) = solve::run(&env, &mut problems, subs, &constraint); let (solved_subs, solved_env) = solve::run(constraints, &env, &mut problems, subs, &constraint);
(solved_subs, solved_env, problems) (solved_subs, solved_env, problems)
} }
pub fn make_solved_types( pub fn make_solved_types(
solved_env: &solve::Env,
solved_subs: &Solved<Subs>, solved_subs: &Solved<Subs>,
exposed_vars_by_symbol: &MutMap<Symbol, Variable>, exposed_vars_by_symbol: &[(Symbol, Variable)],
) -> MutMap<Symbol, SolvedType> { ) -> MutMap<Symbol, SolvedType> {
let mut solved_types = MutMap::default(); let mut solved_types = MutMap::default();
for (symbol, alias) in solved_env.aliases.iter() {
let mut args = Vec::with_capacity(alias.type_variables.len());
for loc_named_var in alias.type_variables.iter() {
let (name, var) = &loc_named_var.value;
args.push((name.clone(), SolvedType::new(solved_subs, *var)));
}
let mut lambda_set_variables = Vec::with_capacity(alias.lambda_set_variables.len());
for set in alias.lambda_set_variables.iter() {
lambda_set_variables.push(roc_types::solved_types::SolvedLambdaSet(
SolvedType::from_type(solved_subs, &set.0),
));
}
let solved_type = SolvedType::from_type(solved_subs, &alias.typ);
let solved_alias = SolvedType::Alias(
*symbol,
args,
lambda_set_variables,
Box::new(solved_type),
alias.kind,
);
solved_types.insert(*symbol, solved_alias);
}
// exposed_vars_by_symbol contains the Variables for all the Symbols // exposed_vars_by_symbol contains the Variables for all the Symbols
// this module exposes. We want to convert those into flat SolvedType // this module exposes. We want to convert those into flat SolvedType
// annotations which are decoupled from our Subs, because that's how // annotations which are decoupled from our Subs, because that's how

View file

@ -1,7 +1,9 @@
use bumpalo::Bump;
use roc_can::constraint::Constraint::{self, *}; use roc_can::constraint::Constraint::{self, *};
use roc_can::constraint::PresenceConstraint; use roc_can::constraint::{Constraints, LetConstraint};
use roc_can::expected::{Expected, PExpected}; use roc_can::expected::{Expected, PExpected};
use roc_collections::all::MutMap; use roc_collections::all::MutMap;
use roc_collections::soa::{Index, Slice};
use roc_module::ident::TagName; use roc_module::ident::TagName;
use roc_module::symbol::{ModuleId, Symbol}; use roc_module::symbol::{ModuleId, Symbol};
use roc_region::all::{Loc, Region}; use roc_region::all::{Loc, Region};
@ -12,7 +14,7 @@ use roc_types::subs::{
}; };
use roc_types::types::Type::{self, *}; use roc_types::types::Type::{self, *};
use roc_types::types::{ use roc_types::types::{
gather_fields_unsorted_iter, Alias, AliasKind, Category, ErrorType, PatternCategory, gather_fields_unsorted_iter, AliasKind, Category, ErrorType, PatternCategory,
}; };
use roc_unify::unify::{unify, Mode, Unified::*}; use roc_unify::unify::{unify, Mode, Unified::*};
use std::collections::hash_map::Entry; use std::collections::hash_map::Entry;
@ -78,8 +80,37 @@ pub enum TypeError {
#[derive(Clone, Debug, Default)] #[derive(Clone, Debug, Default)]
pub struct Env { pub struct Env {
pub vars_by_symbol: MutMap<Symbol, Variable>, symbols: Vec<Symbol>,
pub aliases: MutMap<Symbol, Alias>, variables: Vec<Variable>,
}
impl Env {
pub fn vars_by_symbol(&self) -> impl Iterator<Item = (Symbol, Variable)> + '_ {
let it1 = self.symbols.iter().copied();
let it2 = self.variables.iter().copied();
it1.zip(it2)
}
fn get_var_by_symbol(&self, symbol: &Symbol) -> Option<Variable> {
self.symbols
.iter()
.position(|s| s == symbol)
.map(|index| self.variables[index])
}
fn insert_symbol_var_if_vacant(&mut self, symbol: Symbol, var: Variable) {
match self.symbols.iter().position(|s| *s == symbol) {
None => {
// symbol is not in vars_by_symbol yet; insert it
self.symbols.push(symbol);
self.variables.push(var);
}
Some(_) => {
// do nothing
}
}
}
} }
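The new `Env` above stores symbols and variables as two parallel vectors kept in lockstep: iteration zips them, lookup is a linear scan by position, and insertion only happens if the symbol is not already present. The standalone copy below mirrors that shape with stand-in `Symbol`/`Variable` types and adds a small usage example.

```rust
// Parallel-vector environment, mirroring the Env impl above with stand-in types.
type Symbol = &'static str;
type Variable = u32;

#[derive(Clone, Debug, Default)]
struct Env {
    symbols: Vec<Symbol>,
    variables: Vec<Variable>,
}

impl Env {
    fn vars_by_symbol(&self) -> impl Iterator<Item = (Symbol, Variable)> + '_ {
        self.symbols.iter().copied().zip(self.variables.iter().copied())
    }

    fn get_var_by_symbol(&self, symbol: &Symbol) -> Option<Variable> {
        self.symbols
            .iter()
            .position(|s| s == symbol)
            .map(|index| self.variables[index])
    }

    fn insert_symbol_var_if_vacant(&mut self, symbol: Symbol, var: Variable) {
        if !self.symbols.contains(&symbol) {
            self.symbols.push(symbol);
            self.variables.push(var);
        }
    }
}

fn main() {
    let mut env = Env::default();
    env.insert_symbol_var_if_vacant("x", 10);
    env.insert_symbol_var_if_vacant("x", 99); // ignored: "x" is already present
    env.insert_symbol_var_if_vacant("y", 20);
    assert_eq!(env.get_var_by_symbol(&"x"), Some(10));
    assert_eq!(env.vars_by_symbol().collect::<Vec<_>>(), vec![("x", 10), ("y", 20)]);
}
```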
const DEFAULT_POOLS: usize = 8; const DEFAULT_POOLS: usize = 8;
@ -140,18 +171,20 @@ struct State {
} }
pub fn run( pub fn run(
constraints: &Constraints,
env: &Env, env: &Env,
problems: &mut Vec<TypeError>, problems: &mut Vec<TypeError>,
mut subs: Subs, mut subs: Subs,
constraint: &Constraint, constraint: &Constraint,
) -> (Solved<Subs>, Env) { ) -> (Solved<Subs>, Env) {
let env = run_in_place(env, problems, &mut subs, constraint); let env = run_in_place(constraints, env, problems, &mut subs, constraint);
(Solved(subs), env) (Solved(subs), env)
} }
/// Modify an existing subs in-place instead /// Modify an existing subs in-place instead
pub fn run_in_place( pub fn run_in_place(
constraints: &Constraints,
env: &Env, env: &Env,
problems: &mut Vec<TypeError>, problems: &mut Vec<TypeError>,
subs: &mut Subs, subs: &mut Subs,
@ -163,7 +196,12 @@ pub fn run_in_place(
mark: Mark::NONE.next(), mark: Mark::NONE.next(),
}; };
let rank = Rank::toplevel(); let rank = Rank::toplevel();
let arena = Bump::new();
let state = solve( let state = solve(
&arena,
constraints,
env, env,
state, state,
rank, rank,
@ -177,23 +215,34 @@ pub fn run_in_place(
state.env state.env
} }
enum After {
CheckForInfiniteTypes(LocalDefVarsVec<(Symbol, Loc<Variable>)>),
}
enum Work<'a> { enum Work<'a> {
Constraint { Constraint {
env: Env, env: &'a Env,
rank: Rank, rank: Rank,
constraint: &'a Constraint, constraint: &'a Constraint,
after: Option<After>,
}, },
/// Something to be done after a constraint and all its dependencies are fully solved. CheckForInfiniteTypes(LocalDefVarsVec<(Symbol, Loc<Variable>)>),
After(After), /// The ret_con part of a let constraint that does NOT introduce rigid and/or flex variables
LetConNoVariables {
env: &'a Env,
rank: Rank,
let_con: &'a LetConstraint,
},
/// The ret_con part of a let constraint that introduces rigid and/or flex variables
///
/// These introduced variables must be generalized, hence this variant
/// is more complex than `LetConNoVariables`.
LetConIntroducesVariables {
env: &'a Env,
rank: Rank,
let_con: &'a LetConstraint,
},
} }
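The reworked `Work` enum above is what lets the solver run as a loop over an explicit stack instead of recursing: deferred steps such as `CheckForInfiniteTypes` are pushed onto the stack before the constraint whose completion they depend on, so they only pop after that constraint's entire subtree has been handled. Below is a toy version of the pattern, not the solver's real `Work`/`Constraint` types.

```rust
// Toy work-stack solver: deferred steps pop only after everything pushed after
// them has been processed.
enum Constraint {
    True,
    And(Vec<Constraint>),
    Check(&'static str),
}

enum Work<'a> {
    Constraint(&'a Constraint),
    // A deferred step: runs only after everything pushed above it has been popped.
    AfterAll(&'static str),
}

fn solve(root: &Constraint) {
    let mut stack = vec![Work::Constraint(root)];
    while let Some(work) = stack.pop() {
        match work {
            Work::AfterAll(label) => println!("deferred: {label}"),
            Work::Constraint(Constraint::True) => {}
            Work::Constraint(Constraint::Check(label)) => println!("check: {label}"),
            Work::Constraint(Constraint::And(parts)) => {
                // Push the deferred step first, then the children in reverse so
                // they pop in source order and all finish before it does.
                stack.push(Work::AfterAll("group finished"));
                for part in parts.iter().rev() {
                    stack.push(Work::Constraint(part));
                }
            }
        }
    }
}

fn main() {
    use Constraint::*;
    solve(&And(vec![Check("a"), And(vec![Check("b")]), Check("c")]));
    // Prints a, then b, then the inner group's deferred step, then c,
    // then the outer group's deferred step.
}
```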
#[allow(clippy::too_many_arguments)] #[allow(clippy::too_many_arguments)]
fn solve( fn solve(
arena: &Bump,
constraints: &Constraints,
env: &Env, env: &Env,
mut state: State, mut state: State,
rank: Rank, rank: Rank,
@ -203,48 +252,174 @@ fn solve(
subs: &mut Subs, subs: &mut Subs,
constraint: &Constraint, constraint: &Constraint,
) -> State { ) -> State {
let mut stack = vec![Work::Constraint { let initial = Work::Constraint {
env: env.clone(), env,
rank, rank,
constraint, constraint,
after: None, };
}];
let mut stack = vec![initial];
while let Some(work_item) = stack.pop() { while let Some(work_item) = stack.pop() {
let (env, rank, constraint) = match work_item { let (env, rank, constraint) = match work_item {
Work::After(After::CheckForInfiniteTypes(def_vars)) => {
for (symbol, loc_var) in def_vars.iter() {
check_for_infinite_type(subs, problems, *symbol, *loc_var);
}
// No constraint to be solved
continue;
}
Work::Constraint { Work::Constraint {
env, env,
rank, rank,
constraint, constraint,
after,
} => { } => {
// Push the `after` on first so that we look at it immediately after finishing all // the default case; actually solve this constraint
// the children of this constraint.
if let Some(after) = after {
stack.push(Work::After(after));
}
(env, rank, constraint) (env, rank, constraint)
} }
Work::CheckForInfiniteTypes(def_vars) => {
// after a LetCon, we must check if any of the variables that we introduced
// loop back to themselves after solving the ret_constraint
for (symbol, loc_var) in def_vars.iter() {
check_for_infinite_type(subs, problems, *symbol, *loc_var);
}
continue;
}
Work::LetConNoVariables { env, rank, let_con } => {
// NOTE be extremely careful with shadowing here
let offset = let_con.defs_and_ret_constraint.index();
let ret_constraint = &constraints.constraints[offset + 1];
// Add a variable for each def to local_def_vars.
let local_def_vars = LocalDefVarsVec::from_def_types(
constraints,
rank,
pools,
cached_aliases,
subs,
let_con.def_types,
);
let mut new_env = env.clone();
for (symbol, loc_var) in local_def_vars.iter() {
new_env.insert_symbol_var_if_vacant(*symbol, loc_var.value);
}
stack.push(Work::CheckForInfiniteTypes(local_def_vars));
stack.push(Work::Constraint {
env: arena.alloc(new_env),
rank,
constraint: ret_constraint,
});
continue;
}
Work::LetConIntroducesVariables { env, rank, let_con } => {
// NOTE be extremely careful with shadowing here
let offset = let_con.defs_and_ret_constraint.index();
let ret_constraint = &constraints.constraints[offset + 1];
let next_rank = rank.next();
let mark = state.mark;
let saved_env = state.env;
let young_mark = mark;
let visit_mark = young_mark.next();
let final_mark = visit_mark.next();
// Add a variable for each def to local_def_vars.
let local_def_vars = LocalDefVarsVec::from_def_types(
constraints,
next_rank,
pools,
cached_aliases,
subs,
let_con.def_types,
);
debug_assert_eq!(
{
let offenders = pools
.get(next_rank)
.iter()
.filter(|var| {
subs.get_rank(**var).into_usize() > next_rank.into_usize()
})
.collect::<Vec<_>>();
let result = offenders.len();
if result > 0 {
dbg!(&subs, &offenders, &let_con.def_types);
}
result
},
0
);
// pop pool
generalize(subs, young_mark, visit_mark, next_rank, pools);
pools.get_mut(next_rank).clear();
// check that things went well
debug_assert!({
// NOTE the `subs.redundant` check is added for the uniqueness
// inference, and does not come from elm. It's unclear whether this is
// a bug with uniqueness inference (something is redundant that
// shouldn't be) or whether it just never came up in elm.
let rigid_vars = &constraints.variables[let_con.rigid_vars.indices()];
let failing: Vec<_> = rigid_vars
.iter()
.filter(|&var| !subs.redundant(*var) && subs.get_rank(*var) != Rank::NONE)
.collect();
if !failing.is_empty() {
println!("Rigids {:?}", &rigid_vars);
println!("Failing {:?}", failing);
}
failing.is_empty()
});
let mut new_env = env.clone();
for (symbol, loc_var) in local_def_vars.iter() {
new_env.insert_symbol_var_if_vacant(*symbol, loc_var.value);
}
// Note that this vars_by_symbol is the one returned by the
// previous call to solve()
let state_for_ret_con = State {
env: saved_env,
mark: final_mark,
};
// Now solve the body, using the new vars_by_symbol which includes
// the assignments' name-to-variable mappings.
stack.push(Work::CheckForInfiniteTypes(local_def_vars));
stack.push(Work::Constraint {
env: arena.alloc(new_env),
rank,
constraint: ret_constraint,
});
state = state_for_ret_con;
continue;
}
}; };
state = match constraint { state = match constraint {
True => state, True => state,
SaveTheEnvironment => { SaveTheEnvironment => {
// NOTE deviation: elm only copies the env into the state on SaveTheEnvironment
let mut copy = state; let mut copy = state;
copy.env = env; copy.env = env.clone();
copy copy
} }
Eq(typ, expectation, category, region) => { Eq(type_index, expectation_index, category_index, region) => {
let typ = &constraints.types[type_index.index()];
let expectation = &constraints.expectations[expectation_index.index()];
let category = &constraints.categories[category_index.index()];
let actual = type_to_var(subs, rank, pools, cached_aliases, typ); let actual = type_to_var(subs, rank, pools, cached_aliases, typ);
let expected = type_to_var( let expected = type_to_var(
subs, subs,
@ -283,7 +458,9 @@ fn solve(
} }
} }
} }
Store(source, target, _filename, _linenr) => { Store(source_index, target, _filename, _linenr) => {
let source = &constraints.types[source_index.index()];
// a special version of Eq that is used to store types in the AST. // a special version of Eq that is used to store types in the AST.
// IT DOES NOT REPORT ERRORS! // IT DOES NOT REPORT ERRORS!
let actual = type_to_var(subs, rank, pools, cached_aliases, source); let actual = type_to_var(subs, rank, pools, cached_aliases, source);
@ -311,8 +488,8 @@ fn solve(
} }
} }
} }
Lookup(symbol, expectation, region) => { Lookup(symbol, expectation_index, region) => {
match env.vars_by_symbol.get(symbol) { match env.get_var_by_symbol(symbol) {
Some(var) => { Some(var) => {
// Deep copy the vars associated with this symbol before unifying them. // Deep copy the vars associated with this symbol before unifying them.
// Otherwise, suppose we have this: // Otherwise, suppose we have this:
@ -335,7 +512,9 @@ fn solve(
// then we copy from that module's Subs into our own. If the value // then we copy from that module's Subs into our own. If the value
// is being looked up in this module, then we use our Subs as both // is being looked up in this module, then we use our Subs as both
// the source and destination. // the source and destination.
let actual = deep_copy_var(subs, rank, pools, *var); let actual = deep_copy_var_in(subs, rank, pools, var, arena);
let expectation = &constraints.expectations[expectation_index.index()];
let expected = type_to_var( let expected = type_to_var(
subs, subs,
rank, rank,
@ -343,6 +522,7 @@ fn solve(
cached_aliases, cached_aliases,
expectation.get_type_ref(), expectation.get_type_ref(),
); );
match unify(subs, actual, expected, Mode::EQ) { match unify(subs, actual, expected, Mode::EQ) {
Success(vars) => { Success(vars) => {
introduce(subs, rank, pools, &vars); introduce(subs, rank, pools, &vars);
@ -383,20 +563,24 @@ fn solve(
} }
} }
} }
And(sub_constraints) => { And(slice) => {
for sub_constraint in sub_constraints.iter().rev() { let it = constraints.constraints[slice.indices()].iter().rev();
for sub_constraint in it {
stack.push(Work::Constraint { stack.push(Work::Constraint {
env: env.clone(), env,
rank, rank,
constraint: sub_constraint, constraint: sub_constraint,
after: None,
}) })
} }
state state
} }
Pattern(region, category, typ, expectation) Pattern(type_index, expectation_index, category_index, region)
| Present(typ, PresenceConstraint::Pattern(region, category, expectation)) => { | PatternPresence(type_index, expectation_index, category_index, region) => {
let typ = &constraints.types[type_index.index()];
let expectation = &constraints.pattern_expectations[expectation_index.index()];
let category = &constraints.pattern_categories[category_index.index()];
let actual = type_to_var(subs, rank, pools, cached_aliases, typ); let actual = type_to_var(subs, rank, pools, cached_aliases, typ);
let expected = type_to_var( let expected = type_to_var(
subs, subs,
@ -407,7 +591,7 @@ fn solve(
); );
let mode = match constraint { let mode = match constraint {
Present(_, _) => Mode::PRESENT, PatternPresence(..) => Mode::PRESENT,
_ => Mode::EQ, _ => Mode::EQ,
}; };
@ -440,76 +624,43 @@ fn solve(
} }
} }
} }
Let(let_con) => { Let(index) => {
match &let_con.ret_constraint { let let_con = &constraints.let_constraints[index.index()];
True if let_con.rigid_vars.is_empty() => {
introduce(subs, rank, pools, &let_con.flex_vars); let offset = let_con.defs_and_ret_constraint.index();
let defs_constraint = &constraints.constraints[offset];
let ret_constraint = &constraints.constraints[offset + 1];
let flex_vars = &constraints.variables[let_con.flex_vars.indices()];
let rigid_vars = &constraints.variables[let_con.rigid_vars.indices()];
if matches!(&ret_constraint, True) && let_con.rigid_vars.is_empty() {
introduce(subs, rank, pools, flex_vars);
// If the return expression is guaranteed to solve, // If the return expression is guaranteed to solve,
// solve the assignments themselves and move on. // solve the assignments themselves and move on.
stack.push(Work::Constraint { stack.push(Work::Constraint {
env, env,
rank, rank,
constraint: &let_con.defs_constraint, constraint: defs_constraint,
after: None,
}); });
state state
} } else if let_con.rigid_vars.is_empty() && let_con.flex_vars.is_empty() {
ret_con if let_con.rigid_vars.is_empty() && let_con.flex_vars.is_empty() => { // items are popped from the stack in reverse order. That means that we'll
// TODO: make into `WorkItem` with `After` // first solve the defs_constraint, and then (eventually) the ret_constraint.
let state = solve( //
&env, // Note that the LetConSimple gets the current env and rank,
state, // and not the env/rank from after solving the defs_constraint
rank, stack.push(Work::LetConNoVariables { env, rank, let_con });
pools,
problems,
cached_aliases,
subs,
&let_con.defs_constraint,
);
// Add a variable for each def to new_vars_by_env.
let mut local_def_vars =
LocalDefVarsVec::with_length(let_con.def_types.len());
for (symbol, loc_type) in let_con.def_types.iter() {
let var =
type_to_var(subs, rank, pools, cached_aliases, &loc_type.value);
local_def_vars.push((
*symbol,
Loc {
value: var,
region: loc_type.region,
},
));
}
let mut new_env = env.clone();
for (symbol, loc_var) in local_def_vars.iter() {
match new_env.vars_by_symbol.entry(*symbol) {
Entry::Occupied(_) => {
// keep the existing value
}
Entry::Vacant(vacant) => {
vacant.insert(loc_var.value);
}
}
}
stack.push(Work::Constraint { stack.push(Work::Constraint {
env: new_env, env,
rank, rank,
constraint: ret_con, constraint: defs_constraint,
after: Some(After::CheckForInfiniteTypes(local_def_vars)),
}); });
state state
} } else {
ret_con => {
let rigid_vars = &let_con.rigid_vars;
let flex_vars = &let_con.flex_vars;
// work in the next pool to localize header // work in the next pool to localize header
let next_rank = rank.next(); let next_rank = rank.next();
@ -538,127 +689,24 @@ fn solve(
// run solver in next pool // run solver in next pool
// Add a variable for each def to local_def_vars. // items are popped from the stack in reverse order. That means that we'll
let mut local_def_vars = // first solve the defs_constraint, and then (eventually) the ret_constraint.
LocalDefVarsVec::with_length(let_con.def_types.len()); //
// Note that the LetConSimple gets the current env and rank,
for (symbol, loc_type) in let_con.def_types.iter() { // and not the env/rank from after solving the defs_constraint
let def_type = &loc_type.value; stack.push(Work::LetConIntroducesVariables { env, rank, let_con });
let var = type_to_var(subs, next_rank, pools, cached_aliases, def_type);
local_def_vars.push((
*symbol,
Loc {
value: var,
region: loc_type.region,
},
));
}
// Solve the assignments' constraints first.
// TODO: make into `WorkItem` with `After`
let State {
env: saved_env,
mark,
} = solve(
&env,
state,
next_rank,
pools,
problems,
cached_aliases,
subs,
&let_con.defs_constraint,
);
let young_mark = mark;
let visit_mark = young_mark.next();
let final_mark = visit_mark.next();
debug_assert_eq!(
{
let offenders = pools
.get(next_rank)
.iter()
.filter(|var| {
let current_rank =
subs.get_rank(roc_types::subs::Variable::clone(var));
current_rank.into_usize() > next_rank.into_usize()
})
.collect::<Vec<_>>();
let result = offenders.len();
if result > 0 {
dbg!(&subs, &offenders, &let_con.def_types);
}
result
},
0
);
// pop pool
generalize(subs, young_mark, visit_mark, next_rank, pools);
pools.get_mut(next_rank).clear();
// check that things went well
debug_assert!({
// NOTE the `subs.redundant` check is added for the uniqueness
// inference, and does not come from elm. It's unclear whether this is
// a bug with uniqueness inference (something is redundant that
// shouldn't be) or that it just never came up in elm.
let failing: Vec<_> = rigid_vars
.iter()
.filter(|&var| {
!subs.redundant(*var) && subs.get_rank(*var) != Rank::NONE
})
.collect();
if !failing.is_empty() {
println!("Rigids {:?}", &rigid_vars);
println!("Failing {:?}", failing);
}
failing.is_empty()
});
let mut new_env = env.clone();
for (symbol, loc_var) in local_def_vars.iter() {
match new_env.vars_by_symbol.entry(*symbol) {
Entry::Occupied(_) => {
// keep the existing value
}
Entry::Vacant(vacant) => {
vacant.insert(loc_var.value);
}
}
}
// Note that this vars_by_symbol is the one returned by the
// previous call to solve()
let state_for_ret_con = State {
env: saved_env,
mark: final_mark,
};
// Now solve the body, using the new vars_by_symbol which includes
// the assignments' name-to-variable mappings.
stack.push(Work::Constraint { stack.push(Work::Constraint {
env: new_env, env,
rank, rank: next_rank,
constraint: ret_con, constraint: defs_constraint,
after: Some(After::CheckForInfiniteTypes(local_def_vars)),
}); });
state_for_ret_con state
} }
} }
} IsOpenType(type_index) => {
Present(typ, PresenceConstraint::IsOpen) => { let typ = &constraints.types[type_index.index()];
let actual = type_to_var(subs, rank, pools, cached_aliases, typ); let actual = type_to_var(subs, rank, pools, cached_aliases, typ);
let mut new_desc = subs.get(actual); let mut new_desc = subs.get(actual);
match new_desc.content { match new_desc.content {
@ -680,13 +728,24 @@ fn solve(
} }
} }
} }
Present( IncludesTag(index) => {
typ, let includes_tag = &constraints.includes_tags[index.index()];
PresenceConstraint::IncludesTag(tag_name, tys, region, pattern_category),
) => { let roc_can::constraint::IncludesTag {
type_index,
tag_name,
types,
pattern_category,
region,
} = includes_tag;
let typ = &constraints.types[type_index.index()];
let tys = &constraints.types[types.indices()];
let pattern_category = &constraints.pattern_categories[pattern_category.index()];
let actual = type_to_var(subs, rank, pools, cached_aliases, typ); let actual = type_to_var(subs, rank, pools, cached_aliases, typ);
let tag_ty = Type::TagUnion( let tag_ty = Type::TagUnion(
vec![(tag_name.clone(), tys.clone())], vec![(tag_name.clone(), tys.to_vec())],
Box::new(Type::EmptyTagUnion), Box::new(Type::EmptyTagUnion),
); );
let includes = type_to_var(subs, rank, pools, cached_aliases, &tag_ty); let includes = type_to_var(subs, rank, pools, cached_aliases, &tag_ty);
@ -756,24 +815,49 @@ impl<T> LocalDefVarsVec<T> {
} }
} }
impl LocalDefVarsVec<(Symbol, Loc<Variable>)> {
fn from_def_types(
constraints: &Constraints,
rank: Rank,
pools: &mut Pools,
cached_aliases: &mut MutMap<Symbol, Variable>,
subs: &mut Subs,
def_types_slice: Slice<(Symbol, Loc<Index<Type>>)>,
) -> Self {
let def_types = &constraints.def_types[def_types_slice.indices()];
let mut local_def_vars = Self::with_length(def_types.len());
for (symbol, loc_type_index) in def_types.iter() {
let typ = &constraints.types[loc_type_index.value.index()];
let var = type_to_var(subs, rank, pools, cached_aliases, typ);
local_def_vars.push((
*symbol,
Loc {
value: var,
region: loc_type_index.region,
},
));
}
local_def_vars
}
}
use std::cell::RefCell; use std::cell::RefCell;
std::thread_local! { std::thread_local! {
/// Scratchpad arena so we don't need to allocate a new one all the time /// Scratchpad arena so we don't need to allocate a new one all the time
static SCRATCHPAD: RefCell<bumpalo::Bump> = RefCell::new(bumpalo::Bump::with_capacity(4 * 1024)); static SCRATCHPAD: RefCell<Option<bumpalo::Bump>> = RefCell::new(Some(bumpalo::Bump::with_capacity(4 * 1024)));
} }
fn take_scratchpad() -> bumpalo::Bump { fn take_scratchpad() -> bumpalo::Bump {
let mut result = bumpalo::Bump::new(); SCRATCHPAD.with(|f| f.take().unwrap())
SCRATCHPAD.with(|f| {
result = f.replace(bumpalo::Bump::new());
});
result
} }
fn put_scratchpad(scratchpad: bumpalo::Bump) { fn put_scratchpad(scratchpad: bumpalo::Bump) {
SCRATCHPAD.with(|f| { SCRATCHPAD.with(|f| {
f.replace(scratchpad); f.replace(Some(scratchpad));
}); });
} }
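The scratchpad now sits in the thread-local as an `Option<bumpalo::Bump>`, so `take_scratchpad` can move the arena out with `take()` instead of swapping in a throwaway `Bump::new()` as before. Below is a minimal sketch of that take/reset/put cycle, assuming bumpalo with the `collections` feature (as in this repo's Cargo.toml); the `with_scratchpad` helper is illustrative only, not part of the codebase.

```rust
// Sketch only: `with_scratchpad` is illustrative, not part of the compiler.
// Storing Option<Bump> lets `take()` move the arena out without constructing
// a throwaway Bump::new() the way the old take_scratchpad did.
use bumpalo::Bump;
use std::cell::RefCell;

thread_local! {
    static SCRATCHPAD: RefCell<Option<Bump>> =
        RefCell::new(Some(Bump::with_capacity(4 * 1024)));
}

fn with_scratchpad<T>(work: impl FnOnce(&Bump) -> T) -> T {
    // Move the arena out of the thread-local for the duration of `work`...
    let mut arena = SCRATCHPAD.with(|f| f.borrow_mut().take().unwrap());
    let result = work(&arena);
    // ...then reset it and put it back, keeping its allocation for reuse.
    arena.reset();
    SCRATCHPAD.with(|f| *f.borrow_mut() = Some(arena));
    result
}

fn main() {
    let len = with_scratchpad(|arena| {
        bumpalo::collections::Vec::from_iter_in(0..5, arena).len()
    });
    assert_eq!(len, 5);
}
```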
@ -956,7 +1040,7 @@ fn type_to_variable<'a>(
return reserved; return reserved;
} else { } else {
// for any other rank, we need to copy; it takes care of adjusting the rank // for any other rank, we need to copy; it takes care of adjusting the rank
return deep_copy_var(subs, rank, pools, reserved); return deep_copy_var_in(subs, rank, pools, reserved, arena);
} }
} }
@ -1024,6 +1108,7 @@ fn type_to_variable<'a>(
} }
} }
#[inline(always)]
fn alias_to_var<'a>( fn alias_to_var<'a>(
subs: &mut Subs, subs: &mut Subs,
rank: Rank, rank: Rank,
@ -1053,6 +1138,7 @@ fn alias_to_var<'a>(
} }
} }
#[inline(always)]
fn roc_result_to_var<'a>( fn roc_result_to_var<'a>(
subs: &mut Subs, subs: &mut Subs,
rank: Rank, rank: Rank,
@ -1801,10 +1887,14 @@ fn instantiate_rigids_help(subs: &mut Subs, max_rank: Rank, initial: Variable) {
} }
} }
fn deep_copy_var(subs: &mut Subs, rank: Rank, pools: &mut Pools, var: Variable) -> Variable { fn deep_copy_var_in(
let mut arena = take_scratchpad(); subs: &mut Subs,
rank: Rank,
let mut visited = bumpalo::collections::Vec::with_capacity_in(4 * 1024, &arena); pools: &mut Pools,
var: Variable,
arena: &Bump,
) -> Variable {
let mut visited = bumpalo::collections::Vec::with_capacity_in(256, arena);
let copy = deep_copy_var_help(subs, rank, pools, &mut visited, var); let copy = deep_copy_var_help(subs, rank, pools, &mut visited, var);
@ -1820,9 +1910,6 @@ fn deep_copy_var(subs: &mut Subs, rank: Rank, pools: &mut Pools, var: Variable)
} }
} }
arena.reset();
put_scratchpad(arena);
copy copy
} }
@ -2061,6 +2148,7 @@ fn deep_copy_var_help(
} }
} }
#[inline(always)]
fn register(subs: &mut Subs, rank: Rank, pools: &mut Pools, content: Content) -> Variable { fn register(subs: &mut Subs, rank: Rank, pools: &mut Pools, content: Content) -> Variable {
let descriptor = Descriptor { let descriptor = Descriptor {
content, content,
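The larger refactor in this file replaces the recursive `Let` handling in `solve` with an explicit `Work` stack (`LetConNoVariables`, `LetConIntroducesVariables`, `CheckForInfiniteTypes`). Here is a standalone sketch of the ordering trick the new comments describe, using illustrative names rather than the solver's real work items: because the stack pops in reverse push order, follow-up work is pushed before the constraint it must run after.

```rust
// Sketch only: illustrative names, not the solver's actual work items.
enum Work {
    Solve(&'static str),
    CheckForInfiniteTypes(&'static str),
}

fn main() {
    let mut stack = Vec::new();

    // For a Let constraint: push the infinite-type check first, then the
    // ret_constraint. Items pop in reverse push order, so the ret_constraint
    // is solved before the check runs.
    stack.push(Work::CheckForInfiniteTypes("def_vars"));
    stack.push(Work::Solve("ret_constraint"));

    let mut order = Vec::new();
    while let Some(item) = stack.pop() {
        match item {
            Work::Solve(name) => order.push(format!("solve {name}")),
            Work::CheckForInfiniteTypes(vars) => order.push(format!("check {vars}")),
        }
    }

    assert_eq!(order, ["solve ret_constraint", "check def_vars"]);
}
```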

View file

@ -5516,4 +5516,24 @@ mod solve_expr {
r#"Id [ A, B, C { a : Str }e ] -> Str"#, r#"Id [ A, B, C { a : Str }e ] -> Str"#,
) )
} }
#[test]
fn lambda_set_within_alias_is_quantified() {
infer_eq_without_problem(
indoc!(
r#"
app "test" provides [ effectAlways ] to "./platform"
Effect a : [ @Effect ({} -> a) ]
effectAlways : a -> Effect a
effectAlways = \x ->
inner = \{} -> x
@Effect inner
"#
),
r#"a -> Effect a"#,
)
}
} }

View file

@ -1763,6 +1763,97 @@ fn get_int_list_oob() {
); );
} }
#[test]
#[cfg(any(feature = "gen-llvm"))]
fn replace_unique_int_list() {
assert_evals_to!(
indoc!(
r#"
record = List.replace [ 12, 9, 7, 1, 5 ] 2 33
record.list
"#
),
RocList::from_slice(&[12, 9, 33, 1, 5]),
RocList<i64>
);
}
#[test]
#[cfg(any(feature = "gen-llvm"))]
fn replace_unique_int_list_out_of_bounds() {
assert_evals_to!(
indoc!(
r#"
record = List.replace [ 12, 9, 7, 1, 5 ] 5 33
record.value
"#
),
33,
i64
);
}
#[test]
#[cfg(any(feature = "gen-llvm"))]
fn replace_unique_int_list_get_old_value() {
assert_evals_to!(
indoc!(
r#"
record = List.replace [ 12, 9, 7, 1, 5 ] 2 33
record.value
"#
),
7,
i64
);
}
#[test]
#[cfg(any(feature = "gen-llvm"))]
fn replace_unique_get_large_value() {
assert_evals_to!(
indoc!(
r#"
list : List { a : U64, b: U64, c: U64, d: U64 }
list = [ { a: 1, b: 2, c: 3, d: 4 }, { a: 5, b: 6, c: 7, d: 8 }, { a: 9, b: 10, c: 11, d: 12 } ]
record = List.replace list 1 { a: 13, b: 14, c: 15, d: 16 }
record.value
"#
),
(5, 6, 7, 8),
(u64, u64, u64, u64)
);
}
#[test]
#[cfg(any(feature = "gen-llvm"))]
fn replace_shared_int_list() {
assert_evals_to!(
indoc!(
r#"
wrapper = \shared ->
# This should not mutate the original
replaced = (List.replace shared 1 7.7).list
x =
when List.get replaced 1 is
Ok num -> num
Err _ -> 0
y =
when List.get shared 1 is
Ok num -> num
Err _ -> 0
{ x, y }
wrapper [ 2.1, 4.3 ]
"#
),
(7.7, 4.3),
(f64, f64)
);
}
#[test] #[test]
#[cfg(any(feature = "gen-llvm"))] #[cfg(any(feature = "gen-llvm"))]
fn get_set_unique_int_list() { fn get_set_unique_int_list() {

View file

@ -382,7 +382,7 @@ fn write_content(env: &Env, content: &Content, subs: &Subs, buf: &mut String, pa
_ => write_parens!(write_parens, buf, { _ => write_parens!(write_parens, buf, {
write_symbol(env, *symbol, buf); write_symbol(env, *symbol, buf);
for var_index in args.into_iter() { for var_index in args.named_type_arguments() {
let var = subs[var_index]; let var = subs[var_index];
buf.push(' '); buf.push(' ');
write_content( write_content(

View file

@ -2,7 +2,7 @@ use crate::pretty_print::Parens;
use crate::subs::{ use crate::subs::{
GetSubsSlice, RecordFields, Subs, UnionTags, VarStore, Variable, VariableSubsSlice, GetSubsSlice, RecordFields, Subs, UnionTags, VarStore, Variable, VariableSubsSlice,
}; };
use roc_collections::all::{ImMap, ImSet, Index, MutSet, SendMap}; use roc_collections::all::{HumanIndex, ImMap, ImSet, MutSet, SendMap};
use roc_error_macros::internal_error; use roc_error_macros::internal_error;
use roc_module::called_via::CalledVia; use roc_module::called_via::CalledVia;
use roc_module::ident::{ForeignSymbol, Ident, Lowercase, TagName}; use roc_module::ident::{ForeignSymbol, Ident, Lowercase, TagName};
@ -1203,14 +1203,14 @@ pub struct TagUnionStructure<'a> {
pub enum PReason { pub enum PReason {
TypedArg { TypedArg {
opt_name: Option<Symbol>, opt_name: Option<Symbol>,
index: Index, index: HumanIndex,
}, },
WhenMatch { WhenMatch {
index: Index, index: HumanIndex,
}, },
TagArg { TagArg {
tag_name: TagName, tag_name: TagName,
index: Index, index: HumanIndex,
}, },
PatternGuard, PatternGuard,
OptionalField, OptionalField,
@ -1219,12 +1219,12 @@ pub enum PReason {
#[derive(Debug, Clone, Copy, PartialEq, Eq)] #[derive(Debug, Clone, Copy, PartialEq, Eq)]
pub enum AnnotationSource { pub enum AnnotationSource {
TypedIfBranch { TypedIfBranch {
index: Index, index: HumanIndex,
num_branches: usize, num_branches: usize,
region: Region, region: Region,
}, },
TypedWhenBranch { TypedWhenBranch {
index: Index, index: HumanIndex,
region: Region, region: Region,
}, },
TypedBody { TypedBody {
@ -1246,7 +1246,7 @@ impl AnnotationSource {
pub enum Reason { pub enum Reason {
FnArg { FnArg {
name: Option<Symbol>, name: Option<Symbol>,
arg_index: Index, arg_index: HumanIndex,
}, },
FnCall { FnCall {
name: Option<Symbol>, name: Option<Symbol>,
@ -1254,28 +1254,28 @@ pub enum Reason {
}, },
LowLevelOpArg { LowLevelOpArg {
op: LowLevel, op: LowLevel,
arg_index: Index, arg_index: HumanIndex,
}, },
ForeignCallArg { ForeignCallArg {
foreign_symbol: ForeignSymbol, foreign_symbol: ForeignSymbol,
arg_index: Index, arg_index: HumanIndex,
}, },
FloatLiteral, FloatLiteral,
IntLiteral, IntLiteral,
NumLiteral, NumLiteral,
StrInterpolation, StrInterpolation,
WhenBranch { WhenBranch {
index: Index, index: HumanIndex,
}, },
WhenGuard, WhenGuard,
ExpectCondition, ExpectCondition,
IfCondition, IfCondition,
IfBranch { IfBranch {
index: Index, index: HumanIndex,
total_branches: usize, total_branches: usize,
}, },
ElemInList { ElemInList {
index: Index, index: HumanIndex,
}, },
RecordUpdateValue(Lowercase), RecordUpdateValue(Lowercase),
RecordUpdateKeys(Symbol, SendMap<Lowercase, Region>), RecordUpdateKeys(Symbol, SendMap<Lowercase, Region>),

View file

@ -1,4 +1,7 @@
## :construction: Work In Progress :construction:
The editor is a work in progress, only a limited subset of Roc expressions are currently supported. The editor is a work in progress, only a limited subset of Roc expressions are currently supported.
Unlike most editors, we use projectional or structural editing to edit the [Abstract Syntax Tree](https://en.wikipedia.org/wiki/Abstract_syntax_tree) directly. This will allow for cool features like excellent auto-complete, refactoring and never needing to format your code. Unlike most editors, we use projectional or structural editing to edit the [Abstract Syntax Tree](https://en.wikipedia.org/wiki/Abstract_syntax_tree) directly. This will allow for cool features like excellent auto-complete, refactoring and never needing to format your code.
@ -68,6 +71,7 @@ Important folders/files outside the editor folder:
- ast/src/lang/core/ast.rs - ast/src/lang/core/ast.rs
- ast/src/lang/env.rs - ast/src/lang/env.rs
## Contributing ## Contributing
We welcome new contributors :heart: and are happy to help you get started. We welcome new contributors :heart: and are happy to help you get started.

View file

@ -3,7 +3,7 @@
To run, go to the project home directory and run: To run, go to the project home directory and run:
```bash ```bash
$ cargo run -- build --backend=wasm32 examples/hello-web/Hello.roc $ cargo run -- build --target=wasm32 examples/hello-web/Hello.roc
``` ```
Then `cd` into the example directory and run any web server that can handle WebAssembly. Then `cd` into the example directory and run any web server that can handle WebAssembly.

View file

@ -1,2 +1,2 @@
Windows is not yet supported, we have a big project in the works that will make it easier to achieve this. Windows is not yet supported, we have a big project in the works (see issue #2608) that will allow this.
Until then we recommend using Ubuntu through the "Windows Subsystem for Linux". Until then we recommend using Ubuntu through the "Windows Subsystem for Linux".

View file

@ -24,7 +24,7 @@ roc_collections = { path = "../compiler/collections" }
bumpalo = { version = "3.8.0", features = ["collections"] } bumpalo = { version = "3.8.0", features = ["collections"] }
clap = { version = "3.0.0-beta.5", default-features = false, features = ["std", "color", "suggestions"] } clap = { version = "3.0.0-beta.5", default-features = false, features = ["std", "color", "suggestions"] }
iced-x86 = { version = "1.15.0", default-features = false, features = ["std", "decoder", "op_code_info", "instr_info"] } iced-x86 = { version = "1.15.0", default-features = false, features = ["std", "decoder", "op_code_info", "instr_info"] }
memmap2 = "0.5.0" memmap2 = "0.5.3"
object = { version = "0.26.2", features = ["read", "write"] } object = { version = "0.26.2", features = ["read", "write"] }
serde = { version = "1.0.130", features = ["derive"] } serde = { version = "1.0.130", features = ["derive"] }
bincode = "1.3.3" bincode = "1.3.3"

View file

@ -20,7 +20,6 @@ use std::io;
use std::io::{BufReader, BufWriter}; use std::io::{BufReader, BufWriter};
use std::mem; use std::mem;
use std::os::raw::c_char; use std::os::raw::c_char;
use std::os::unix::fs::PermissionsExt;
use std::path::Path; use std::path::Path;
use std::process::Command; use std::process::Command;
use std::time::{Duration, SystemTime}; use std::time::{Duration, SystemTime};
@ -367,9 +366,7 @@ fn preprocess_impl(
Some(section) => { Some(section) => {
let file_offset = match section.compressed_file_range() { let file_offset = match section.compressed_file_range() {
Ok( Ok(
range range @ CompressedFileRange {
@
CompressedFileRange {
format: CompressionFormat::None, format: CompressionFormat::None,
.. ..
}, },
@ -494,9 +491,7 @@ fn preprocess_impl(
for sec in text_sections { for sec in text_sections {
let (file_offset, compressed) = match sec.compressed_file_range() { let (file_offset, compressed) = match sec.compressed_file_range() {
Ok( Ok(
range range @ CompressedFileRange {
@
CompressedFileRange {
format: CompressionFormat::None, format: CompressionFormat::None,
.. ..
}, },
@ -626,9 +621,7 @@ fn preprocess_impl(
}; };
let dyn_offset = match dyn_sec.compressed_file_range() { let dyn_offset = match dyn_sec.compressed_file_range() {
Ok( Ok(
range range @ CompressedFileRange {
@
CompressedFileRange {
format: CompressionFormat::None, format: CompressionFormat::None,
.. ..
}, },
@ -714,9 +707,7 @@ fn preprocess_impl(
}; };
let symtab_offset = match symtab_sec.compressed_file_range() { let symtab_offset = match symtab_sec.compressed_file_range() {
Ok( Ok(
range range @ CompressedFileRange {
@
CompressedFileRange {
format: CompressionFormat::None, format: CompressionFormat::None,
.. ..
}, },
@ -738,9 +729,7 @@ fn preprocess_impl(
}; };
let dynsym_offset = match dynsym_sec.compressed_file_range() { let dynsym_offset = match dynsym_sec.compressed_file_range() {
Ok( Ok(
range range @ CompressedFileRange {
@
CompressedFileRange {
format: CompressionFormat::None, format: CompressionFormat::None,
.. ..
}, },
@ -759,9 +748,7 @@ fn preprocess_impl(
{ {
match sec.compressed_file_range() { match sec.compressed_file_range() {
Ok( Ok(
range range @ CompressedFileRange {
@
CompressedFileRange {
format: CompressionFormat::None, format: CompressionFormat::None,
.. ..
}, },
@ -1627,9 +1614,14 @@ fn surgery_impl(
let flushing_data_duration = flushing_data_start.elapsed().unwrap(); let flushing_data_duration = flushing_data_start.elapsed().unwrap();
// Make sure the final executable has permission to execute. // Make sure the final executable has permission to execute.
// TODO windows alternative?
#[cfg(target_family = "unix")]
{
use std::os::unix::fs::PermissionsExt;
let mut perms = fs::metadata(out_filename)?.permissions(); let mut perms = fs::metadata(out_filename)?.permissions();
perms.set_mode(perms.mode() | 0o111); perms.set_mode(perms.mode() | 0o111);
fs::set_permissions(out_filename, perms)?; fs::set_permissions(out_filename, perms)?;
}
let total_duration = total_start.elapsed().unwrap(); let total_duration = total_start.elapsed().unwrap();

View file

@ -5,10 +5,10 @@
"homepage": "https://github.com/nmattia/niv", "homepage": "https://github.com/nmattia/niv",
"owner": "nmattia", "owner": "nmattia",
"repo": "niv", "repo": "niv",
"rev": "5830a4dd348d77e39a0f3c4c762ff2663b602d4c", "rev": "9cb7ef336bb71fd1ca84fc7f2dff15ef4b033f2a",
"sha256": "1d3lsrqvci4qz2hwjrcnd8h5vfkg8aypq3sjd4g3izbc8frwz5sm", "sha256": "1ajyqr8zka1zlb25jx1v4xys3zqmdy3prbm1vxlid6ah27a8qnzh",
"type": "tarball", "type": "tarball",
"url": "https://github.com/nmattia/niv/archive/5830a4dd348d77e39a0f3c4c762ff2663b602d4c.tar.gz", "url": "https://github.com/nmattia/niv/archive/9cb7ef336bb71fd1ca84fc7f2dff15ef4b033f2a.tar.gz",
"url_template": "https://github.com/<owner>/<repo>/archive/<rev>.tar.gz" "url_template": "https://github.com/<owner>/<repo>/archive/<rev>.tar.gz"
}, },
"nixpkgs": { "nixpkgs": {
@ -17,10 +17,10 @@
"homepage": "", "homepage": "",
"owner": "NixOS", "owner": "NixOS",
"repo": "nixpkgs", "repo": "nixpkgs",
"rev": "fe6f208d68ac254873b659db9676d44dea9b0555", "rev": "ed02c2ba0384b2800db41333045a6fb781f12aac",
"sha256": "0ybvy1zx97k811bz73xmgsb41d33i2kr2dfqcxzq9m9h958178nq", "sha256": "040rawxqbpblxpsq73qxlk25my2cm0g3gx1pksiacsj15q5fi84q",
"type": "tarball", "type": "tarball",
"url": "https://github.com/NixOS/nixpkgs/archive/fe6f208d68ac254873b659db9676d44dea9b0555.tar.gz", "url": "https://github.com/NixOS/nixpkgs/archive/ed02c2ba0384b2800db41333045a6fb781f12aac.tar.gz",
"url_template": "https://github.com/<owner>/<repo>/archive/<rev>.tar.gz" "url_template": "https://github.com/<owner>/<repo>/archive/<rev>.tar.gz"
}, },
"nixpkgs-unstable": { "nixpkgs-unstable": {
@ -29,10 +29,10 @@
"homepage": "", "homepage": "",
"owner": "NixOS", "owner": "NixOS",
"repo": "nixpkgs", "repo": "nixpkgs",
"rev": "ea171bc81fcb3c6f21deeb46dbc10000087777ef", "rev": "684c73c9e6ac8f4d0c6dea3251292e758ac375b5",
"sha256": "15hh28c98kb6pf7wgydc07bx2ivq04a2cay5mhwnqk5cpa8dbiap", "sha256": "0hl2nzizn4pwd3sn9gxkngzn88k9in01xm14afpj7716j8y0j2qa",
"type": "tarball", "type": "tarball",
"url": "https://github.com/NixOS/nixpkgs/archive/ea171bc81fcb3c6f21deeb46dbc10000087777ef.tar.gz", "url": "https://github.com/NixOS/nixpkgs/archive/684c73c9e6ac8f4d0c6dea3251292e758ac375b5.tar.gz",
"url_template": "https://github.com/<owner>/<repo>/archive/<rev>.tar.gz" "url_template": "https://github.com/<owner>/<repo>/archive/<rev>.tar.gz"
} }
} }

View file

@ -18,8 +18,8 @@ bumpalo = { version = "3.8.0", features = ["collections"] }
const_format = "0.2.22" const_format = "0.2.22"
inkwell = {path = "../vendor/inkwell"} inkwell = {path = "../vendor/inkwell"}
libloading = "0.7.1" libloading = "0.7.1"
rustyline = {git = "https://github.com/rtfeldman/rustyline", tag = "v9.1.1"} rustyline = {git = "https://github.com/rtfeldman/rustyline", rev = "e74333c"}
rustyline-derive = {git = "https://github.com/rtfeldman/rustyline", tag = "v9.1.1"} rustyline-derive = {git = "https://github.com/rtfeldman/rustyline", rev = "e74333c"}
target-lexicon = "0.12.2" target-lexicon = "0.12.2"
# TODO: make llvm optional # TODO: make llvm optional
@ -31,7 +31,7 @@ roc_load = {path = "../compiler/load"}
roc_mono = {path = "../compiler/mono"} roc_mono = {path = "../compiler/mono"}
roc_parse = {path = "../compiler/parse"} roc_parse = {path = "../compiler/parse"}
roc_repl_eval = {path = "../repl_eval"} roc_repl_eval = {path = "../repl_eval"}
roc_std = {path = "../roc_std"} roc_std = {path = "../roc_std", default-features = false}
roc_target = {path = "../compiler/roc_target"} roc_target = {path = "../compiler/roc_target"}
roc_types = {path = "../compiler/types"} roc_types = {path = "../compiler/types"}

View file

@ -519,11 +519,11 @@ fn four_element_record() {
); );
} }
// #[test] #[test]
// fn multiline_string() { fn multiline_string() {
// // If a string contains newlines, format it as a multiline string in the output // If a string contains newlines, format it as a multiline string in the output
// expect_success(r#""\n\nhi!\n\n""#, "\"\"\"\n\nhi!\n\n\"\"\""); expect_success(r#""\n\nhi!\n\n""#, "\"\n\nhi!\n\n\" : Str");
// } }
#[test] #[test]
fn list_of_3_field_records() { fn list_of_3_field_records() {

View file

@ -1,3 +1,4 @@
use std::env;
use std::ffi::OsStr; use std::ffi::OsStr;
use std::path::Path; use std::path::Path;
use std::process::Command; use std::process::Command;
@ -11,6 +12,15 @@ fn main() {
println!("cargo:rerun-if-changed=build.rs"); println!("cargo:rerun-if-changed=build.rs");
println!("cargo:rerun-if-changed=src/{}.c", PLATFORM_FILENAME); println!("cargo:rerun-if-changed=src/{}.c", PLATFORM_FILENAME);
// When we build on Netlify, zig is not installed (but also not used,
// since all we're doing is generating docs), so we can skip the steps
// that require having zig installed.
if env::var_os("NO_ZIG_INSTALLED").is_some() {
// We still need to do the other things before this point, because
// setting the env vars is needed for other parts of the build.
return;
}
std::fs::create_dir_all("./data").unwrap(); std::fs::create_dir_all("./data").unwrap();
// Build a pre-linked binary with platform, builtins and all their libc dependencies // Build a pre-linked binary with platform, builtins and all their libc dependencies

View file

@ -25,7 +25,7 @@ python3 -m http.server
``` ```
### 3. Open your browser ### 3. Open your browser
You should be able to find the Roc REPL at http://127.0.0.1:8000 (or wherever your web server said when it started up.) You should be able to find the Roc REPL at http://127.0.0.1:8000/repl (or whatever port your web server mentioned when it started up.)
**Warning:** This is work in progress! Not all language features are implemented yet, error messages don't look nice yet, up/down arrows don't work for history, etc. **Warning:** This is work in progress! Not all language features are implemented yet, error messages don't look nice yet, up/down arrows don't work for history, etc.

View file

@ -1,6 +1,6 @@
#!/usr/bin/env bash #!/usr/bin/env bash
set -eux set -euxo pipefail
if [[ ! -d repl_www ]] if [[ ! -d repl_www ]]
then then
@ -14,9 +14,13 @@ then
cargo install wasm-pack cargo install wasm-pack
fi fi
WWW_DIR="repl_www/build" # output directory is first argument or default
mkdir -p $WWW_DIR WWW_ROOT="${1:-repl_www/build}"
cp repl_www/public/* $WWW_DIR mkdir -p $WWW_ROOT
# End up with `repl/index.html` and everything else at the root directory.
# We want all the assets to be at the root because the files auto-generated by `cargo` expect them to be there.
cp -r repl_www/public/* $WWW_ROOT
# When debugging the REPL, use `REPL_DEBUG=1 repl_www/build.sh` # When debugging the REPL, use `REPL_DEBUG=1 repl_www/build.sh`
if [ -n "${REPL_DEBUG:-}" ] if [ -n "${REPL_DEBUG:-}" ]
@ -28,10 +32,10 @@ else
wasm-pack build --target web repl_wasm wasm-pack build --target web repl_wasm
fi fi
cp repl_wasm/pkg/*.wasm $WWW_DIR cp repl_wasm/pkg/*.wasm $WWW_ROOT
# Copy the JS from wasm_bindgen, replacing its invalid `import` statement with a `var`. # Copy the JS from wasm_bindgen, replacing its invalid `import` statement with a `var`.
# The JS import from the invalid path 'env', seems to be generated when there are unresolved symbols. # The JS import from the invalid path 'env', seems to be generated when there are unresolved symbols.
BINDGEN_FILE="roc_repl_wasm.js" BINDGEN_FILE="roc_repl_wasm.js"
echo 'var __wbg_star0 = { now: Date.now };' > $WWW_DIR/$BINDGEN_FILE echo 'var __wbg_star0 = { now: Date.now };' > $WWW_ROOT/$BINDGEN_FILE
grep -v '^import' repl_wasm/pkg/$BINDGEN_FILE >> $WWW_DIR/$BINDGEN_FILE grep -v '^import' repl_wasm/pkg/$BINDGEN_FILE >> $WWW_ROOT/$BINDGEN_FILE

View file

@ -1,97 +0,0 @@
<html>
<head>
<style>
body {
background-color: #222;
color: #ccc;
font-family: sans-serif;
font-size: 18px;
}
.body-wrapper {
display: flex;
flex-direction: column;
max-width: 900px;
height: 100%;
margin: 0 auto;
padding: 0 24px;
}
h1 {
margin: 32px auto;
color: #eee;
text-align: center;
}
li {
margin: 8px;
}
section.history {
flex: 1;
}
.scroll-wrap {
position: relative;
height: 100%;
}
.scroll {
position: absolute;
top: 0;
bottom: 0;
left: 0;
right: 0;
overflow: auto;
}
#history-text {
margin: 16px 0;
padding: 8px;
}
#history-text .input {
margin-bottom: 8px;
}
#history-text .output {
margin-bottom: 16px;
}
#history-text .output-ok {
color: #0f8;
}
#history-text .output-error {
color: #f00;
}
.code {
font-family: "Courier New", Courier, monospace;
background-color: #111;
color: #fff;
}
section.source {
display: flex;
flex-direction: column;
}
section.source input {
height: 32px;
padding: 8px;
margin-bottom: 16px;
}
</style>
<title>Roc REPL</title>
</head>
<body>
<div class="body-wrapper">
<section class="text">
<h1>The rockin' Roc REPL</h1>
</section>
<section class="history">
<div class="scroll-wrap">
<div id="history-text" class="scroll code"></div>
</div>
</section>
<section class="source">
<input
id="source-input"
class="code"
placeholder="Type some Roc code and press Enter"
/>
</section>
</div>
<script type="module" src="repl.js"></script>
</body>
</html>

72
repl_www/public/repl.css Normal file
View file

@ -0,0 +1,72 @@
html {
height: 100%;
}
body {
height: 100%;
background-color: #222;
color: #ccc;
font-family: sans-serif;
font-size: 18px;
}
.body-wrapper {
display: flex;
flex-direction: column;
max-width: 900px;
height: 100%;
margin: 0 auto;
padding: 0 24px;
}
h1 {
margin: 32px auto;
color: #eee;
text-align: center;
}
li {
margin: 8px;
}
section.history {
flex: 1;
}
.scroll-wrap {
position: relative;
height: 100%;
}
.scroll {
position: absolute;
top: 0;
bottom: 0;
left: 0;
right: 0;
overflow: auto;
}
#history-text {
margin: 16px 0;
padding: 8px;
}
#history-text .input {
margin-bottom: 8px;
}
#history-text .output {
margin-bottom: 16px;
}
#history-text .output-ok {
color: #0f8;
}
#history-text .output-error {
color: #f00;
}
.code {
font-family: "Courier New", Courier, monospace;
background-color: #111;
color: #fff;
}
section.source {
display: flex;
flex-direction: column;
}
section.source textarea {
height: 32px;
padding: 8px;
margin-bottom: 16px;
}

View file

@ -2,8 +2,8 @@
window.js_create_app = js_create_app; window.js_create_app = js_create_app;
window.js_run_app = js_run_app; window.js_run_app = js_run_app;
window.js_get_result_and_memory = js_get_result_and_memory; window.js_get_result_and_memory = js_get_result_and_memory;
import * as roc_repl_wasm from "./roc_repl_wasm.js"; import * as roc_repl_wasm from "/roc_repl_wasm.js";
import { getMockWasiImports } from "./wasi.js"; import { getMockWasiImports } from "/wasi.js";
// ---------------------------------------------------------------------------- // ----------------------------------------------------------------------------
// REPL state // REPL state

View file

@ -0,0 +1,29 @@
<!DOCTYPE html>
<html>
<head>
<title>Roc REPL</title>
<link rel="stylesheet" href="/repl.css" />
</head>
<body>
<div class="body-wrapper">
<section class="text">
<h1>The rockin' Roc REPL</h1>
</section>
<section class="history">
<div class="scroll-wrap">
<div id="history-text" class="scroll code"></div>
</div>
</section>
<section class="source">
<textarea autofocus id="source-input" class="code" placeholder="Type some Roc code and press Enter"></textarea>
</section>
</div>
<script type="module" src="/repl.js"></script>
</body>
</html>

View file

@ -1,5 +1,5 @@
use roc_can::expected::{Expected, PExpected}; use roc_can::expected::{Expected, PExpected};
use roc_collections::all::{Index, MutSet, SendMap}; use roc_collections::all::{HumanIndex, MutSet, SendMap};
use roc_module::called_via::{BinOp, CalledVia}; use roc_module::called_via::{BinOp, CalledVia};
use roc_module::ident::{Ident, IdentStr, Lowercase, TagName}; use roc_module::ident::{Ident, IdentStr, Lowercase, TagName};
use roc_module::symbol::Symbol; use roc_module::symbol::Symbol;
@ -350,7 +350,7 @@ fn to_expr_report<'b>(
num_branches, num_branches,
.. ..
} if num_branches == 2 => alloc.concat(vec![ } if num_branches == 2 => alloc.concat(vec![
alloc.keyword(if index == Index::FIRST { alloc.keyword(if index == HumanIndex::FIRST {
"then" "then"
} else { } else {
"else" "else"
@ -1384,7 +1384,7 @@ fn to_pattern_report<'b>(
} }
} }
PReason::WhenMatch { index } => { PReason::WhenMatch { index } => {
if index == Index::FIRST { if index == HumanIndex::FIRST {
let doc = alloc.stack(vec![ let doc = alloc.stack(vec![
alloc alloc
.text("The 1st pattern in this ") .text("The 1st pattern in this ")

View file

@ -1,7 +1,7 @@
extern crate bumpalo; extern crate bumpalo;
use self::bumpalo::Bump; use self::bumpalo::Bump;
use roc_can::constraint::Constraint; use roc_can::constraint::{Constraint, Constraints};
use roc_can::env::Env; use roc_can::env::Env;
use roc_can::expected::Expected; use roc_can::expected::Expected;
use roc_can::expr::{canonicalize_expr, Expr, Output}; use roc_can::expr::{canonicalize_expr, Expr, Output};
@ -28,14 +28,12 @@ pub fn test_home() -> ModuleId {
pub fn infer_expr( pub fn infer_expr(
subs: Subs, subs: Subs,
problems: &mut Vec<solve::TypeError>, problems: &mut Vec<solve::TypeError>,
constraints: &Constraints,
constraint: &Constraint, constraint: &Constraint,
expr_var: Variable, expr_var: Variable,
) -> (Content, Subs) { ) -> (Content, Subs) {
let env = solve::Env { let env = solve::Env::default();
aliases: MutMap::default(), let (solved, _) = solve::run(constraints, &env, problems, subs, constraint);
vars_by_symbol: MutMap::default(),
};
let (solved, _) = solve::run(&env, problems, subs, constraint);
let content = solved let content = solved
.inner() .inner()
@ -99,6 +97,7 @@ pub struct CanExprOut {
pub var_store: VarStore, pub var_store: VarStore,
pub var: Variable, pub var: Variable,
pub constraint: Constraint, pub constraint: Constraint,
pub constraints: Constraints,
} }
#[derive(Debug)] #[derive(Debug)]
@ -155,9 +154,11 @@ pub fn can_expr_with<'a>(
&loc_expr.value, &loc_expr.value,
); );
let mut constraints = Constraints::new();
let constraint = constrain_expr( let constraint = constrain_expr(
&mut constraints,
&roc_constrain::expr::Env { &roc_constrain::expr::Env {
rigids: ImMap::default(), rigids: MutMap::default(),
home, home,
}, },
loc_expr.region, loc_expr.region,
@ -177,7 +178,7 @@ pub fn can_expr_with<'a>(
//load builtin values //load builtin values
let (_introduced_rigids, constraint) = let (_introduced_rigids, constraint) =
constrain_imported_values(imports, constraint, &mut var_store); constrain_imported_values(&mut constraints, imports, constraint, &mut var_store);
let mut all_ident_ids = MutMap::default(); let mut all_ident_ids = MutMap::default();
@ -203,6 +204,7 @@ pub fn can_expr_with<'a>(
interns, interns,
var, var,
constraint, constraint,
constraints,
}) })
} }

View file

@ -62,6 +62,7 @@ mod test_reporting {
output, output,
var_store, var_store,
var, var,
constraints,
constraint, constraint,
home, home,
interns, interns,
@ -79,7 +80,8 @@ mod test_reporting {
} }
let mut unify_problems = Vec::new(); let mut unify_problems = Vec::new();
let (_content, mut subs) = infer_expr(subs, &mut unify_problems, &constraint, var); let (_content, mut subs) =
infer_expr(subs, &mut unify_problems, &constraints, &constraint, var);
name_all_type_vars(var, &mut subs); name_all_type_vars(var, &mut subs);
@ -4490,32 +4492,6 @@ mod test_reporting {
) )
} }
#[test]
fn record_type_indent_end() {
report_problem_as(
indoc!(
r#"
f : { a: Int
}
"#
),
indoc!(
r#"
NEED MORE INDENTATION
I am partway through parsing a record type, but I got stuck here:
1 f : { a: Int
2 }
^
I need this curly brace to be indented more. Try adding more spaces
before it!
"#
),
)
}
#[test] #[test]
fn record_type_keyword_field_name() { fn record_type_keyword_field_name() {
report_problem_as( report_problem_as(
@ -5451,36 +5427,6 @@ mod test_reporting {
) )
} }
#[test]
fn list_bad_indent() {
report_problem_as(
indoc!(
r#"
x = [ 1, 2,
]
x
"#
),
indoc!(
r#"
UNFINISHED LIST
I cannot find the end of this list:
1 x = [ 1, 2,
^
You could change it to something like [ 1, 2, 3 ] or even just [].
Anything where there is an open and a close square bracket, and where
the elements of the list are separated by commas.
Note: I may be confused by indentation
"#
),
)
}
#[test] #[test]
fn number_double_dot() { fn number_double_dot() {
report_problem_as( report_problem_as(
@ -6469,38 +6415,6 @@ I need all branches in an `if` to have the same type!
) )
} }
#[test]
fn outdented_alias() {
report_problem_as(
indoc!(
r#"
Box item : [
Box item,
Items item item
]
4
"#
),
indoc!(
r#"
NEED MORE INDENTATION
I am partway through parsing a tag union type, but I got stuck here:
1 Box item : [
2 Box item,
3 Items item item
4 ]
^
I need this square bracket to be indented more. Try adding more spaces
before it!
"#
),
)
}
#[test] #[test]
fn outdented_in_parens() { fn outdented_in_parens() {
report_problem_as( report_problem_as(
@ -6532,36 +6446,6 @@ I need all branches in an `if` to have the same type!
) )
} }
#[test]
fn outdented_record() {
report_problem_as(
indoc!(
r#"
Box : {
id: Str
}
4
"#
),
indoc!(
r#"
NEED MORE INDENTATION
I am partway through parsing a record type, but I got stuck here:
1 Box : {
2 id: Str
3 }
^
I need this curly brace to be indented more. Try adding more spaces
before it!
"#
),
)
}
#[test] #[test]
fn backpassing_type_error() { fn backpassing_type_error() {
report_problem_as( report_problem_as(
@ -8558,4 +8442,48 @@ I need all branches in an `if` to have the same type!
), ),
) )
} }
#[test]
fn let_polymorphism_with_scoped_type_variables() {
report_problem_as(
indoc!(
r#"
f : a -> a
f = \x ->
y : a -> a
y = \z -> z
n = y 1u8
x1 = y x
(\_ -> x1) n
f
"#
),
indoc!(
r#"
TYPE MISMATCH
The 1st argument to `y` is not what I expect:
6 n = y 1u8
^^^
This argument is an integer of type:
U8
But `y` needs the 1st argument to be:
a
Tip: The type annotation uses the type variable `a` to say that this
definition can produce any type of value. But in the body I see that
it will only produce a `U8` value of a single specific type. Maybe
change the type annotation to be more specific? Maybe change the code
to be more general?
"#
),
)
}
} }

View file

@ -183,8 +183,7 @@ is zero-configuration like `elm-format`) formats multi-line record literals (and
record types) with a comma at the end of each line, like so: record types) with a comma at the end of each line, like so:
```elm ```elm
user = user = {
{
firstName: "Sam", firstName: "Sam",
lastName: "Sample", lastName: "Sample",
email: "sam@example.com", email: "sam@example.com",
@ -456,22 +455,19 @@ The key is that each of the error types is a type alias for a Roc *tag union*.
Here's how those look: Here's how those look:
```elm ```elm
Http.Err a : Http.Err a : [
[
PageNotFound, PageNotFound,
Timeout, Timeout,
BadPayload Str, BadPayload Str,
]a ]a
File.ReadErr a : File.ReadErr a : [
[
FileNotFound, FileNotFound,
Corrupted, Corrupted,
BadFormat, BadFormat,
]a ]a
File.WriteErr a : File.WriteErr a : [
[
FileNotFound, FileNotFound,
DiskFull, DiskFull,
]a ]a
@ -758,86 +754,6 @@ Elm does permit overriding open imports - e.g. if you have
`import Foo exposing (bar)`, or `import Foo exposing (..)`, you can still define `import Foo exposing (bar)`, or `import Foo exposing (..)`, you can still define
`bar = ...` in the module. Roc treats this as shadowing and does not allow it. `bar = ...` in the module. Roc treats this as shadowing and does not allow it.
## Function equality
In Elm, if you write `(\val -> val) == (\val -> val)`, you currently get a runtime exception
which links to [the `==` docs](https://package.elm-lang.org/packages/elm/core/latest/Basics#==),
which explain why this is the current behavior and what the better version will look like.
> OCaml also has the "runtime exception if you compare functions for structural equality"
> behavior, but unlike Elm, in OCaml this appears to be the long-term design.
In Roc, function equality is a compile error, tracked explicitly in the type system.
Here's the type of Roc's equality function:
```elm
'val, 'val -> Bool
```
Whenever a named type variable in Roc has a `'` at the beginning, that means
it is a *functionless* type - a type which cannot involve functions.
If there are any functions in that type, you get a type mismatch. This is true
whether `val` itself is a function, or if it's a type that wraps a function,
like `{ predicate: (Str -> Bool) }` or `List (Bool -> Bool)`.
So if you write `(\a -> a) == (\a -> a)` in Roc, you'll get a type mismatch.
If you wrap both sides of that `==` in a record or list, you'll still get a
type mismatch.
If a named type variable has a `'` anywhere in a given type, then it must have a `'`
everywhere in that type. So it would be an error to have a type like `x, 'x -> Bool`
because `x` has a `'` in one place but not everywhere.
## Standard Data Structures
Elm has `List`, `Array`, `Set`, and `Dict` in the standard library.
Roc has all of these except `Array`, and there are some differences in how they work:
* `List` in Roc uses the term "list" the way Python does: to mean an ordered sequence of elements. Roc's `List` is more like an array, in that all the elements are sequential in memory and can be accessed in constant time. It still uses the `[` `]` syntax for list literals. Also there is no `::` operator because "cons" is not an efficient operation on an array like it is in a linked list.
* `Set` in Roc is like `Set` in Elm: it's shorthand for a `Dict` with keys but no value, and it has a slightly different API.
* `Dict` in Roc is like `Dict` in Elm, except it's backed by hashing rather than ordering. Roc silently computes hash values for any value that can be used with `==`, so instead of a `comparable` constraint on `Set` elements and `Dict` keys, in Roc they instead have the *functionless* constraint indicated with a `'`.
Roc also has a literal syntax for dictionaries and sets. Here's how to write a `Dict` literal:
```elm
{: "Sam" => True, "Ali" => False, firstName => False :}
```
This expression has the type `Dict Str Bool`, and the `firstName` variable would
necessarily be a `Str` as well.
The `Dict` literal syntax exists for two reasons. First, Roc doesn't have tuples;
without tuples, initializing the above `Dict` would involve an API that looked
something like one of these:
```elm
Dict.fromList [ { k: "Sam", v: True }, { k: "Ali", v: False }, { k: firstName, v: False } ]
Dict.fromList [ KV "Sam" True, KV "Ali" False, KV firstName False ]
```
This works, but is not nearly as nice to read.
Additionally, `Dict` literals can compile directly to efficient initialization code,
without relying on the compiler to (hopefully) optimize away the intermediate
`List` involved in `fromList`.
`{::}` is an empty `Dict`.
You can write a `Set` literal like this:
```elm
[: "Sam", "Ali", firstName :]
```
The `Set` literal syntax is partly for the initialization benefit, and also
for symmetry with the `Dict` literal syntax.
`[::]` is an empty `Set`.
Roc does not have syntax for pattern matching on data structures - not even `[` `]` like Elm does.
## Operators ## Operators
In Elm, operators are functions. In Roc, all operators are syntax sugar. In Elm, operators are functions. In Roc, all operators are syntax sugar.
@ -1318,51 +1234,6 @@ If you put these into a hypothetical Roc REPL, here's what you'd see:
28 : Int * 28 : Int *
``` ```
## Phantom Types
[Phantom types](https://medium.com/@ckoster22/advanced-types-in-elm-phantom-types-808044c5946d)
exist in Elm but not in Roc. This is because phantom types can't be defined
using type aliases (in fact, there is a custom error message in Elm if you
try to do this), and Roc only has type aliases. However, in Roc, you can achieve
the same API and runtime performance characteristics as if you had phantom types,
by using *phantom values* instead.
A phantom value is one which affects types, but which holds no information at runtime.
As an example, let's say I wanted to define a [units library](https://package.elm-lang.org/packages/ianmackenzie/elm-units/latest/) -
a classic example of phantom types. I could do that in Roc like this:
```
Quantity units data : [ Quantity units data ]
km : Num a -> Quantity [ Km ] (Num a)
km = \num ->
Quantity Km num
cm : Num a -> Quantity [ Cm ] (Num a)
cm = \num ->
Quantity Cm num
mm : Num a -> Quantity [ Mm ] (Num a)
mm = \num ->
Quantity Mm num
add : Quantity u (Num a), Quantity u (Num a) -> Quantity u (Num a)
add = \Quantity units a, Quantity _ b ->
Quantity units (a + b)
```
From a performance perspective, it's relevant here that `[ Km ]`, `[ Cm ]`, and `[ Mm ]`
are all unions containing a single tag. That means they hold no information at runtime
(they would always destructure to the same tag), which means they can be "unboxed" away -
that is, discarded prior to code generation.
During code generation, Roc treats `Quantity [ Km ] Int` as equivalent to `Quantity Int`.
Then, because `Quantity Int` is an alias for `[ Quantity Int ]`, it will unbox again
and reduce that all the way down to `Int`.
This means that, just like phantom *types*, phantom *values* affect type checking
only, and have no runtime overhead. Rust has a related concept called [phantom data](https://doc.rust-lang.org/nomicon/phantom-data.html).
## Standard library ## Standard library
`elm/core` has these modules: `elm/core` has these modules:

View file

@ -11,9 +11,16 @@ cd $SCRIPT_RELATIVE_DIR
rm -rf build/ rm -rf build/
cp -r public/ build/ cp -r public/ build/
# grab the source code and copy it to Netlify's server; if it's not there, fail the build.
pushd build pushd build
# grab the source code and copy it to Netlify's server; if it's not there, fail the build.
wget https://github.com/rtfeldman/elm-css/files/8037422/roc-source-code.zip wget https://github.com/rtfeldman/elm-css/files/8037422/roc-source-code.zip
# grab the pre-compiled REPL and copy it to Netlify's server; if it's not there, fail the build.
wget https://github.com/brian-carroll/mock-repl/files/8167902/roc_repl_wasm.tar.gz
tar xzvf roc_repl_wasm.tar.gz
rm roc_repl_wasm.tar.gz
cp -r ../../repl_www/public/* .
popd popd
# pushd .. # pushd ..

16
www/netlify.sh Normal file
View file

@ -0,0 +1,16 @@
#!/bin/bash
# Runs on every Netlify build, to set up the Netlify server.
set -euxo pipefail
rustup update
rustup default stable
# TODO remove this once we actually build the web repl!
SCRIPT_DIR=$( cd -- "$( dirname -- "${BASH_SOURCE[0]}" )" &> /dev/null && pwd )
REPL_WASM_DATA=${SCRIPT_DIR}/../repl_wasm/data/
mkdir -p ${REPL_WASM_DATA}
touch ${REPL_WASM_DATA}/pre_linked_binary.o
bash build.sh

View file

@ -5,14 +5,19 @@
# https://docs.netlify.com/routing/headers/#syntax-for-the-netlify-configuration-file # https://docs.netlify.com/routing/headers/#syntax-for-the-netlify-configuration-file
[build] [build]
publish = "build/" publish = "build/"
command = "bash build.sh" command = "bash netlify.sh"
# Always build on push - see https://answers.netlify.com/t/builds-cancelled-for-a-new-branch-due-to-no-content-change/17169/2
ignore = "/bin/false"
[[headers]] [[headers]]
for = "/*" for = "/*"
[headers.values] [headers.values]
X-Frame-Options = "DENY" X-Frame-Options = "DENY"
X-XSS-Protection = "1; mode=block" X-XSS-Protection = "1; mode=block"
Content-Security-Policy = "default-src 'self'; img-src *;" # unsafe-eval is needed for wasm compilation in the repl to work on Safari and Chrome;
# otherwise they block it.
# TODO figure out how to tell Netlify to apply that policy only to the repl, not to everything.
Content-Security-Policy = "default-src 'self'; img-src *; script-src 'self' 'unsafe-eval';"
X-Content-Type-Options = "nosniff" X-Content-Type-Options = "nosniff"
# Redirect roc-lang.org/authors to the AUTHORS file in this repo # Redirect roc-lang.org/authors to the AUTHORS file in this repo

View file

@ -1,4 +1,4 @@
<!doctype html> <!DOCTYPE html>
<html lang="en"> <html lang="en">
@ -15,6 +15,7 @@
font-family: sans-serif; font-family: sans-serif;
line-height: 145%; line-height: 145%;
} }
li { li {
margin-bottom: 0.5rem; margin-bottom: 0.5rem;
} }