forked from GitHub/gf-core

Compare commits


118 Commits
wasm...lpgf

Author SHA1 Message Date
John J. Camilleri
b47eb18f86 Minor additions to LPGF readme 2021-08-27 09:32:43 +02:00
John J. Camilleri
1ce1cea068 Add bounds check (probably unnecessary) 2021-08-10 16:45:42 +02:00
John J. Camilleri
48dba4ade5 Pass all missing test cases, including Phrasebook, except PhrasebookSnd 2021-08-10 11:46:28 +02:00
John J. Camilleri
b96fa7e08a Add more unit tests for missing lin functions 2021-08-10 11:32:51 +02:00
John J. Camilleri
42bdee1e5f Remove workarounds for bugs in canonical format 2021-08-04 10:58:49 +02:00
John J. Camilleri
e2ed512bbb Merge branch 'master' into lpgf 2021-08-01 09:38:42 +02:00
John J. Camilleri
1b8a9b37b0 Remove auto/wagen variant from PhrasebookGer 2021-07-22 22:22:56 +02:00
John J. Camilleri
e681e4dbb0 Remove --show-details flag in CI/cabal 2021-07-08 15:05:47 +02:00
John J. Camilleri
639f1f043a Cabal file fixes. Fix tests in CI with cabal too. 2021-07-08 14:26:41 +02:00
John J. Camilleri
c02a3e0617 Merge branch 'master' into lpgf 2021-07-08 13:57:18 +02:00
John J. Camilleri
d6e26e0577 Do not run lpgf tests in CI, since they require RGL 2021-07-08 13:45:26 +02:00
John J. Camilleri
89a01d81cc Add ghc-prim to build-depends for LPGF testsuite 2021-07-08 12:46:49 +02:00
John J. Camilleri
2315641e77 Don't build benchmarks in CI 2021-07-08 12:33:39 +02:00
John J. Camilleri
7dc396e841 Merge branch 'master' into lpgf
# Conflicts:
#	gf.cabal
2021-07-08 12:25:46 +02:00
John J. Camilleri
d6416089d6 Move some definitions into LPGF.Internal, clean up public API. 2021-07-08 11:34:29 +02:00
John J. Camilleri
7b0637850c Add explicit exports to LPGF module 2021-07-07 13:25:41 +02:00
John J. Camilleri
2b8d792e09 Updates to cabal file 2021-07-07 12:38:01 +02:00
John J. Camilleri
045def61d8 Merge branch 'master' into lpgf
# Conflicts:
#	gf.cabal
#	src/compiler/GF/Grammar/Canonical.hs
2021-07-07 09:42:44 +02:00
John J. Camilleri
2be54ffb12 Update ConcreteNew, RConcrete in gf.cabal 2021-07-07 08:55:35 +02:00
John J. Camilleri
4bd26eae6d Merge branch 'master' into lpgf
# Conflicts:
#	gf.cabal
#	src/compiler/GF/Compile/GrammarToCanonical.hs
#	src/compiler/GF/Grammar/Canonical.hs
#	src/compiler/GF/Infra/Option.hs
2021-07-07 08:36:09 +02:00
John J. Camilleri
c1af40532c Add some stuff to gitignore 2021-04-30 13:06:50 +02:00
John J. Camilleri
d0c27cdaae Make GF.Grammar.Canonical.Id a type synonym for GF.Infra.Ident.RawIdent
This avoids a lot of conversion back and forth between Strings and ByteStrings
2021-04-06 22:15:07 +02:00
John J. Camilleri
f7df62a445 Add support for literals 2021-03-22 09:12:34 +01:00
John J. Camilleri
2d066853f1 Add unit test for literals (fails) 2021-03-22 08:37:59 +01:00
John J. Camilleri
f900ea3885 Don't process impossible values at all (not even for finding their types) 2021-03-16 16:59:58 +01:00
John J. Camilleri
d9c37fc093 Minors to LPGF readme 2021-03-12 13:47:36 +01:00
John J. Camilleri
c9f0867491 Remove state Map from compilation 2021-03-12 13:46:50 +01:00
John J. Camilleri
6c6a201d96 Introduce state with Map for caching compilation, but results are worse 2021-03-12 13:39:56 +01:00
John J. Camilleri
8f5033e4ce Add notes on profiling 2021-03-09 08:36:35 +01:00
John J. Camilleri
126b61ea03 Merge branch 'master' into lpgf 2021-03-08 13:52:34 +01:00
John J. Camilleri
99abb9b2a5 Add Phrasebook benchmark snippet to LPGF README 2021-03-08 13:37:02 +01:00
John J. Camilleri
3e9d12854a Switch to 10000-tree Phrasebook treebank. All errors to do with missing functions, plus variants in German. 2021-03-08 11:19:06 +01:00
John J. Camilleri
fd07946a50 Remove commented line 2021-03-08 10:42:16 +01:00
John J. Camilleri
c76efcf916 Use C runtime in mkTreebank script 2021-03-08 10:17:03 +01:00
John J. Camilleri
785d6069e2 Fix lin2string and pass all unittests and Phrasebook 2021-03-08 09:53:10 +01:00
John J. Camilleri
0f4b349b0b Remove old commented code 2021-03-05 16:51:59 +01:00
John J. Camilleri
dbf369aae5 Make removal of record fields recursive. Latest results with Phrasebook:
Bul ✓
Cat ✗
Chi ✓
Dan ✓
Dut ✓
Eng ✓
Est ✓
Fin ✓
Fre ✗
Ger ✓
Hin ✓
Ita ✗
Jpn ✓
Lav ✓
Nor ✓
Pol ✓
Ron ✓
Snd ✗
Spa ✓
Swe ✓
Tha ✓
Urd ✓

Passed 18 | Failed 4 | Total 22
2021-03-05 16:48:05 +01:00
John J. Camilleri
0d4659fe8c Add workaround for missing param defs. Add links to gf-core issues in workaround comments. 2021-03-05 13:23:00 +01:00
John J. Camilleri
575a746a3e Add LPGF function for catching errors. Manual fixes to Phrasebook treebank. 2021-03-05 12:05:25 +01:00
John J. Camilleri
70581c2d8c Improve base case in table handling, cleanup. Add run-phrasebook script, current output:
Bul ✗
Cat ✗
Chi ✓
Dan ✓
Dut ✓
Eng ✓
Est ✓
Fin ✗
Fre ✗
Ger ✓
Hin ✓
Ita ✗
Jpn ✓
Lav ✓
Nor ✓
Pol ✓
Ron ✓
Snd ✗
Spa ✗
Swe ✓
Tha ✓
Urd ✓

Passed 15 | Failed 7 | Total 22
2021-03-04 17:09:35 +01:00
John J. Camilleri
bca1e2286d New handling of tables, works for all tests but Phrasebook still fails 2021-03-04 16:42:56 +01:00
John J. Camilleri
94f76b9e36 Add more tests to Params5 which cause it to fail again
Originally found in PhrasebookFre
2021-03-04 13:38:55 +01:00
John J. Camilleri
f5886bf447 Add more complex param/table unit tests and pass them. Still fails on Phrasebook though. 2021-03-04 12:37:12 +01:00
John J. Camilleri
0ba0438dc7 Add a little colour to benchmark output 2021-03-04 10:20:57 +01:00
John J. Camilleri
30b016032d Also store Pre prefixes in token map. Introduce IntMapBuilder data structure.
Storing of prefixes uses show/read, which isn't a great solution but avoids having yet another token map.
2021-03-04 09:58:17 +01:00
John J. Camilleri
4082c006c3 Extract token strings and put them in map which linfuns refer to by index, to reduce LPGF sizes. 2021-03-04 00:16:12 +01:00
John J. Camilleri
adc162b374 Pass all unit tests and Foods again, with new strategy. Cleanup. 2021-03-03 15:21:32 +01:00
John J. Camilleri
3beed2c49e Replace list comprehension lookups with maps. Halfway through transitioning to new strategy for tables/params, see testsuite/lpgf/README.md. 2021-03-03 13:26:03 +01:00
John J. Camilleri
a8e3dc8855 Improve mkTreebank script. Add 100-tree Phrasebook treebank. Improve output in testsuite. 2021-03-03 11:01:31 +01:00
John J. Camilleri
997d7c1694 Use ErrorMonad instead of IOE
It probably ends up being the same thing, but the code is a little cleaner for it.
2021-03-03 09:36:48 +01:00
John J. Camilleri
4c09e4a340 Remove LF prefix from constructors. Pass all unit tests and Foods again, but improvements/cleanup still necessary. 2021-03-03 09:19:52 +01:00
John J. Camilleri
33e0e98aec Add 1-tree treebank for Phrasebook in a few languages 2021-02-28 00:34:46 +01:00
John J. Camilleri
83bc3c9c6e More work on params: pass all tests except params1 (!) 2021-02-27 23:13:02 +01:00
John J. Camilleri
f42b5ec9ef More work on params, but Foods fails now 2021-02-26 20:25:05 +01:00
John J. Camilleri
4771d9c356 WIP params 2021-02-26 17:18:21 +01:00
John J. Camilleri
9785f8351d Reduce Params2 further 2021-02-26 11:52:12 +01:00
John J. Camilleri
6a5d735904 Reduce Params2 unittest (still fails) 2021-02-26 10:26:11 +01:00
John J. Camilleri
8324ad8801 Add pretty-printing of LPGF grammars, to help debugging 2021-02-26 10:13:33 +01:00
John J. Camilleri
20290be616 Add Params2 unit test, from problem uncovered in PhrasebookGer 2021-02-22 10:52:37 +01:00
John J. Camilleri
b4a393ac09 Pass missing unit test 2021-02-21 14:22:46 +01:00
John J. Camilleri
9942908df9 Add unit test for missing lins 2021-02-21 14:05:31 +01:00
John J. Camilleri
dca2ebaf72 Add Phrasebook to testsuite. Move grammars into subfolders. Add run-bench script. 2021-02-20 13:22:29 +01:00
John J. Camilleri
5ad5789b31 Filter out record fields which don't exist in lintype
This is to work around an inconsistency in the canonical representation
2021-02-19 15:19:40 +01:00
John J. Camilleri
9f3f4139b1 Grammar and languages to run in testsuite can be specified by command line options, see README 2021-02-19 11:14:55 +01:00
John J. Camilleri
505c12c528 Rename run.hs to test.hs 2021-02-19 09:33:35 +01:00
John J. Camilleri
023b50557e Write LPGF dump to file when DEBUG is set, rather than console 2021-02-19 09:31:26 +01:00
John J. Camilleri
2b0493eece Tweak memory reporting and strictness in benchmark 2021-02-19 09:18:01 +01:00
John J. Camilleri
51e543878b Add support for wildcards when specifying modules names in benchmark compilation 2021-02-18 21:34:23 +01:00
John J. Camilleri
625386a14f Force evaluation in benchmark linearisation
BangPatterns only does WHNF which is not sufficient, previous benchmark results are thus wrong
2021-02-18 21:01:30 +01:00
John J. Camilleri
5240749fad Make grammar and trees files command line arguments into benchmark script 2021-02-18 15:27:25 +01:00
John J. Camilleri
e6079523f1 Remove ParamAliasDefs by inlining their definitions 2021-02-18 14:45:10 +01:00
John J. Camilleri
866a2101e1 When projecting a non-existent field, return Prelude.False
This seems to be GF's own behaviour, as exhibited by the canonical version of PhrasebookTha:

    NNumeral Numeral_0 = {s = Numeral_0.s; hasC = <>.hasC};
2021-02-18 14:42:39 +01:00
John J. Camilleri
d8557e8433 Enable debug output to files with envvar DEBUG=1 2021-02-18 14:40:03 +01:00
John J. Camilleri
7a5bc2dab3 Separate compile/run in benchmark 2021-02-17 16:57:06 +01:00
John J. Camilleri
9a263450f5 Add PFG2 linearisation to benchmark 2021-02-17 15:30:11 +01:00
John J. Camilleri
8e1fa4981f Add memory stats to benchmark 2021-02-17 15:02:39 +01:00
John J. Camilleri
b4fce5db59 Use envvars in benchmark for controlling PGF/LPGF. Add readme. 2021-02-17 11:44:00 +01:00
John J. Camilleri
6a7ead0f84 Add benchmark for comparing PGF and LPGF 2021-02-17 10:04:36 +01:00
John J. Camilleri
d3988f93d5 writePGF et al. functions return path[s] of written files 2021-02-17 10:03:52 +01:00
John J. Camilleri
236dbdbba3 Minor tidying 2021-02-17 00:15:44 +01:00
John J. Camilleri
768c3d9b2d Include return types for params, records, pre 2021-02-17 00:04:37 +01:00
John J. Camilleri
29114ce606 Improve binary format, reducing Foods.lpgf from 300 to 73KB (4x smaller!) 2021-02-16 23:30:21 +01:00
John J. Camilleri
5be21dba1c Add and pass FoodsJpn 2021-02-16 22:49:37 +01:00
John J. Camilleri
d5cf00f711 Add and pass all Foods languages, except Jpn 2021-02-16 22:41:28 +01:00
John J. Camilleri
312cfeb69d Add Afr, Amh, Cat, Cze, Dut, Ger foods grammars to testsuite 2021-02-16 22:33:26 +01:00
John J. Camilleri
2d03b9ee0c Finish type passing in val2lin, generalise projection case and pass FoodsFre testsuite. 2021-02-16 21:07:24 +01:00
John J. Camilleri
4c06c3f825 Add case for when pre is not followed by anything 2021-02-16 21:01:01 +01:00
John J. Camilleri
7227ede24b WIP return type from val2lin for use in projection case 2021-02-16 17:18:01 +01:00
John J. Camilleri
398b294734 Use Data.Text instead of String. Rename Abstr to Abstract, Concr to Concrete. 2021-02-16 16:04:40 +01:00
John J. Camilleri
d394cacddf Add support for CAPIT and ALL_CAPIT 2021-02-16 15:17:54 +01:00
John J. Camilleri
21f14c2aa1 Add support for SOFT_SPACE 2021-02-16 14:57:33 +01:00
John J. Camilleri
23e49cddb7 Add support for SOFT_BIND (which PGF runtime doesn't support) 2021-02-16 14:51:29 +01:00
John J. Camilleri
4d1217b06d Add support for pre 2021-02-15 21:57:05 +01:00
John J. Camilleri
4f0abe5540 Add FoodsFre, fails because pre is not implemented
Also an unhandled Projection case
2021-02-15 01:14:34 +01:00
John J. Camilleri
109822675b Pass test with FoodsFin, by forcibly resorting record fields to make s first 2021-02-15 00:43:53 +01:00
John J. Camilleri
d563abb928 Minors 2021-02-13 00:59:15 +01:00
John J. Camilleri
a58a6c8a59 Add FoodsFin to testsuite (fails) 2021-02-13 00:16:03 +01:00
John J. Camilleri
98f6136ebd Add support for BIND 2021-02-13 00:14:35 +01:00
John J. Camilleri
8cfaa69b6e Handle record tables, pass FoodSwe in testsuite 2021-02-12 23:51:16 +01:00
John J. Camilleri
a12f58e7b0 Add test case for selection using records (fails) 2021-02-10 13:55:38 +01:00
John J. Camilleri
d5f68970b9 Add FoodsSwe (fails) 2021-02-09 10:54:51 +01:00
John J. Camilleri
9c2d8eb0b2 Add FoodsChi, FoodsHeb to LPGF testsuite 2021-02-09 10:14:40 +01:00
John J. Camilleri
34f0fc0ba7 Fix bug in dynamic parameter handling, compile FoodsBul successfully 2021-02-03 15:41:27 +01:00
John J. Camilleri
42b9e7036e Support dynamic param values 2021-02-03 13:16:10 +01:00
John J. Camilleri
132f693713 Minor cleanup 2021-02-03 09:44:15 +01:00
John J. Camilleri
153bffdad7 Support nested parameters, but fails with non-static values (see FoodsBull, ASg kind.g). 2021-02-03 00:11:22 +01:00
John J. Camilleri
d09838e97e Separate .trees and .treebank, and add a script for making the latter from the former 2021-02-02 21:46:38 +01:00
John J. Camilleri
c94bffe435 Generalise testsuite script to use treebank files, add FoodEng 2021-02-02 21:22:36 +01:00
John J. Camilleri
2a5850023b Correctly handle projection, but only in limited cases 2021-02-01 13:08:39 +01:00
John J. Camilleri
fe15aa0c00 Use canonical GF in LPGF compiler
Still contains some hardcoded values, missing cases.

I notice now that LPGF and Canonical GF are almost identical, so maybe we don't need a new LPGF format,
just a linearization-only runtime which works on canonical grammars.
The argument for keeping LGPF is that it would be optimized for size and speed.
2021-02-01 12:28:06 +01:00
John J. Camilleri
cead0cc4c1 Add selection and projection cases but not working 2021-01-26 09:55:07 +01:00
John J. Camilleri
6f622b496b Rename Zero grammar to Walking 2021-01-26 09:35:21 +01:00
John J. Camilleri
270e7f021f Add binary instances 2021-01-25 14:42:00 +01:00
John J. Camilleri
32b0860925 Make LPGF testsuite work (but still fails)
stack test :lpgf
2021-01-25 13:41:33 +01:00
John J. Camilleri
f24c50339b Strip down format. More early work on compiler. Add testsuite (doesn't work yet). 2021-01-25 12:10:30 +01:00
John J. Camilleri
cd5881d83a Early work on LPGF compiler 2021-01-22 15:17:36 +01:00
John J. Camilleri
93b81b9f13 Add first version of LPGF datatype, with linearization function and some hardcoded examples 2021-01-22 14:07:41 +01:00
John J. Camilleri
8ad9cf1e09 Add flag and stubs for compiling to LPGF format 2021-01-19 17:21:13 +01:00
274 changed files with 314672 additions and 1269 deletions


@@ -18,7 +18,7 @@ jobs:
ghc:
- "8.6.5"
- "8.8.3"
- "8.10.7"
- "8.10.1"
exclude:
- os: macos-latest
ghc: 8.8.3
@@ -33,7 +33,7 @@ jobs:
- uses: actions/checkout@v2
if: github.event.action == 'opened' || github.event.action == 'synchronize' || github.event.ref == 'refs/heads/master'
- uses: haskell/actions/setup@v1.2.9
- uses: haskell/actions/setup@v1
id: setup-haskell-cabal
name: Setup Haskell
with:
@@ -53,12 +53,12 @@ jobs:
- name: Build
run: |
cabal configure --enable-tests --enable-benchmarks --test-show-details=direct
cabal build all
cabal configure --enable-tests --test-show-details=direct
cabal build
# - name: Test
# run: |
# cabal test all
- name: Test
run: |
PATH="$PWD/dist/build/gf:$PATH" cabal test gf-tests
stack:
name: stack / ghc ${{ matrix.ghc }}
@@ -66,38 +66,30 @@ jobs:
strategy:
matrix:
stack: ["latest"]
ghc: ["7.10.3","8.0.2", "8.2.2", "8.4.4", "8.6.5", "8.8.4", "8.10.7", "9.0.2"]
ghc: ["7.10.3","8.0.2", "8.2.2", "8.4.4", "8.6.5", "8.8.4"]
# ghc: ["8.8.3"]
steps:
- uses: actions/checkout@v2
if: github.event.action == 'opened' || github.event.action == 'synchronize' || github.event.ref == 'refs/heads/master'
- uses: haskell/actions/setup@v1.2.9
- uses: haskell/actions/setup@v1
name: Setup Haskell Stack
with:
ghc-version: ${{ matrix.ghc }}
stack-version: 'latest'
enable-stack: true
# Fix linker errrors on ghc-7.10.3 for ubuntu (see https://github.com/commercialhaskell/stack/blob/255cd830627870cdef34b5e54d670ef07882523e/doc/faq.md#i-get-strange-ld-errors-about-recompiling-with--fpic)
- run: sed -i.bak 's/"C compiler link flags", "/&-no-pie /' /home/runner/.ghcup/ghc/7.10.3/lib/ghc-7.10.3/settings
if: matrix.ghc == '7.10.3'
- uses: actions/cache@v1
name: Cache ~/.stack
with:
path: ~/.stack
key: ${{ runner.os }}-${{ matrix.ghc }}-stack--${{ hashFiles(format('stack-ghc{0}', matrix.ghc)) }}
restore-keys: |
${{ runner.os }}-${{ matrix.ghc }}-stack
key: ${{ runner.os }}-${{ matrix.ghc }}-stack
- name: Build
run: |
stack build --system-ghc --stack-yaml stack-ghc${{ matrix.ghc }}.yaml
# stack build --system-ghc --test --bench --no-run-tests --no-run-benchmarks
stack build --system-ghc --stack-yaml stack-ghc${{ matrix.ghc }}.yaml --test --no-run-tests
- name: Test
run: |
stack test --system-ghc --stack-yaml stack-ghc${{ matrix.ghc }}.yaml
stack test --system-ghc --stack-yaml stack-ghc${{ matrix.ghc }}.yaml gf:test:gf-tests

.gitignore

@@ -5,6 +5,7 @@
 *.jar
 *.gfo
 *.pgf
+*.lpgf
 debian/.debhelper
 debian/debhelper-build-stamp
 debian/gf
@@ -48,7 +49,7 @@ src/runtime/java/.libs/
 src/runtime/python/build/
 .cabal-sandbox
 cabal.sandbox.config
-.stack-work
+.stack-work*
 DATA_DIR
 stack*.yaml.lock
@@ -73,3 +74,10 @@ doc/icfp-2012.html
 download/*.html
 gf-book/index.html
 src/www/gf-web-api.html
+DEBUG/
+PROF/
+*.aux
+*.hp
+*.prof
+*.ps

.travis.yml (new file)

@@ -0,0 +1,14 @@
+sudo: required
+language: c
+services:
+  - docker
+before_install:
+  - docker pull odanoburu/gf-src:3.9
+script:
+  - |
+    docker run --mount src="$(pwd)",target=/home/gfer,type=bind odanoburu/gf-src:3.9 /bin/bash -c "cd /home/gfer/src/runtime/c &&
+    autoreconf -i && ./configure && make && make install ; cd /home/gfer ; cabal install -fserver -fc-runtime --extra-lib-dirs='/usr/local/lib'"


@@ -1,11 +0,0 @@
### New since 3.11 (WIP)
- Added a changelog!
### 3.11
See <https://www.grammaticalframework.org/download/release-3.11.html>
### 3.10
See <https://www.grammaticalframework.org/download/release-3.10.html>


@@ -65,6 +65,6 @@ bintar:
# Make a source tar.gz distribution using git to make sure that everything is included.
# We put the distribution in dist/ so it is removed on `make clean`
# sdist:
# test -d dist || mkdir dist
# git archive --format=tar.gz --output=dist/gf-${VERSION}.tar.gz HEAD
sdist:
test -d dist || mkdir dist
git archive --format=tar.gz --output=dist/gf-${VERSION}.tar.gz HEAD


@@ -1,4 +1,4 @@
![GF Logo](https://www.grammaticalframework.org/doc/Logos/gf1.svg)
![GF Logo](doc/Logos/gf1.svg)
# Grammatical Framework (GF)
@@ -39,7 +39,7 @@ or:
stack install
```
For more information, including links to precompiled binaries, see the [download page](https://www.grammaticalframework.org/download/index.html).
For more information, including links to precompiled binaries, see the [download page](http://www.grammaticalframework.org/download/index.html).
## About this repository


@@ -47,14 +47,11 @@ but the generated _artifacts_ must be manually attached to the release as _asset
In order to do this you will need to be added the [GF maintainers](https://hackage.haskell.org/package/gf/maintainers/) on Hackage.
1. Run `stack sdist --test-tarball` and address any issues.
1. Run `make sdist`
2. Upload the package, either:
1. **Manually**: visit <https://hackage.haskell.org/upload> and upload the file generated by the previous command.
2. **via Stack**: `stack upload . --candidate`
3. After testing the candidate, publish it:
1. **Manually**: visit <https://hackage.haskell.org/package/gf-X.Y.Z/candidate/publish>
1. **via Stack**: `stack upload .`
4. If the documentation-building fails on the Hackage server, do:
1. **Manually**: visit <https://hackage.haskell.org/upload> and upload the file `dist/gf-X.Y.tar.gz`
2. **via Cabal (≥2.4)**: `cabal upload dist/gf-X.Y.tar.gz`
3. If the documentation-building fails on the Hackage server, do:
```
cabal v2-haddock --builddir=dist/docs --haddock-for-hackage --enable-doc
cabal upload --documentation dist/docs/*-docs.tar.gz


@@ -17,10 +17,9 @@ instructions inside.
==Visual Studio Code==
- [Grammatical Framework Language Server https://marketplace.visualstudio.com/items?itemName=anka-213.gf-vscode] by Andreas Källberg.
This provides syntax highlighting and a client for the Grammatical Framework language server. Follow the installation instructions in the link.
- [Grammatical Framework https://marketplace.visualstudio.com/items?itemName=GrammaticalFramework.gf-vscode] is a simpler extension
without any external dependencies which provides only syntax highlighting.
[Grammatical Framework Language Server https://marketplace.visualstudio.com/items?itemName=anka-213.gf-vscode] by Andreas Källberg.
This provides syntax highlighting and a client for the Grammatical Framework language server. Follow the installation instructions in the link.
==Eclipse==


@@ -7,6 +7,7 @@ title: "Grammatical Framework: Authors and Acknowledgements"
The current maintainers of GF are
[Krasimir Angelov](http://www.chalmers.se/cse/EN/organization/divisions/computing-science/people/angelov-krasimir),
[Thomas Hallgren](http://www.cse.chalmers.se/~hallgren/),
[Aarne Ranta](http://www.cse.chalmers.se/~aarne/),
[John J. Camilleri](http://johnjcamilleri.com), and
[Inari Listenmaa](https://inariksit.github.io/).
@@ -21,7 +22,6 @@ and
The following people have contributed code to some of the versions:
- [Thomas Hallgren](http://www.cse.chalmers.se/~hallgren/) (University of Gothenburg)
- Grégoire Détrez (University of Gothenburg)
- Ramona Enache (University of Gothenburg)
- [Björn Bringert](http://www.cse.chalmers.se/alumni/bringert) (University of Gothenburg)


@@ -53,39 +53,26 @@ You will probably need to update the `PATH` environment variable to include your
For more information, see [Using GF on Windows](https://www.grammaticalframework.org/~inari/gf-windows.html) (latest updated for Windows 10).
## Installing from Hackage
_Instructions applicable for macOS, Linux, and WSL2 on Windows._
<!--## Installing the latest Hackage release (macOS, Linux, and WSL2 on Windows)
[GF is on Hackage](http://hackage.haskell.org/package/gf), so under
normal circumstances the procedure is fairly simple:
```
cabal update
cabal install gf-3.11
```
1. Install ghcup https://www.haskell.org/ghcup/
2. `ghcup install ghc 8.10.4`
3. `ghcup set ghc 8.10.4`
4. `cabal update`
5. On Linux: install some C libraries from your Linux distribution (see note below)
6. `cabal install gf-3.11`
You can also download the source code release from [GitHub](https://github.com/GrammaticalFramework/gf-core/releases),
and follow the instructions below under **Installing from the latest developer source code**.
### Notes
**GHC version**
The GF source code is known to be compilable with GHC versions 7.10 through to 8.10.
**Obtaining Haskell**
There are various ways of obtaining Haskell, including:
- ghcup
1. Install from https://www.haskell.org/ghcup/
2. `ghcup install ghc 8.10.4`
3. `ghcup set ghc 8.10.4`
- Haskell Platform https://www.haskell.org/platform/
- Stack https://haskellstack.org/
**Installation location**
The above steps install GF for a single user.
The above steps installs GF for a single user.
The executables are put in `$HOME/.cabal/bin` (or on macOS in `$HOME/Library/Haskell/bin`),
so you might want to add this directory to your path (in `.bash_profile` or similar):
@@ -97,34 +84,32 @@ PATH=$HOME/.cabal/bin:$PATH
GF uses [`haskeline`](http://hackage.haskell.org/package/haskeline), which
on Linux depends on some non-Haskell libraries that won't be installed
automatically by Cabal, and therefore need to be installed manually.
automatically by cabal, and therefore need to be installed manually.
Here is one way to do this:
- On Ubuntu: `sudo apt-get install libghc-haskeline-dev`
- On Fedora: `sudo dnf install ghc-haskeline-devel`
## Installing from source code
**GHC version**
**Obtaining**
The GF source code has been updated to compile with GHC versions 7.10 through to 8.8.
-->
## Installing from the latest developer source code
To obtain the source code for the **release**,
download it from [GitHub](https://github.com/GrammaticalFramework/gf-core/releases).
If you haven't already, clone the repository with:
Alternatively, to obtain the **latest version** of the source code:
1. If you haven't already, clone the repository with:
```
git clone https://github.com/GrammaticalFramework/gf-core.git
```
2. If you've already cloned the repository previously, update with:
If you've already cloned the repository previously, update with:
```
git pull
```
Then install with:
**Installing**
You can then install with:
```
cabal install
```

gf.cabal

@@ -8,17 +8,12 @@ license-file: LICENSE
category: Natural Language Processing, Compiler
synopsis: Grammatical Framework
description: GF, Grammatical Framework, is a programming language for multilingual grammar applications
maintainer: John J. Camilleri <john@digitalgrammars.com>
homepage: https://www.grammaticalframework.org/
bug-reports: https://github.com/GrammaticalFramework/gf-core/issues
tested-with: GHC==7.10.3, GHC==8.0.2, GHC==8.10.4, GHC==9.0.2
tested-with: GHC==7.10.3, GHC==8.0.2, GHC==8.10.4
data-dir: src
extra-source-files:
README.md
CHANGELOG.md
WebSetup.hs
doc/Logos/gf0.png
extra-source-files: WebSetup.hs
data-files:
www/*.html
www/*.css
@@ -46,7 +41,7 @@ data-files:
custom-setup
setup-depends:
base >= 4.9.1 && < 4.16,
base >= 4.9.1 && < 4.15,
Cabal >= 1.22.0.0,
directory >= 1.3.0 && < 1.4,
filepath >= 1.4.1 && < 1.5,
@@ -81,14 +76,17 @@ library
build-depends:
-- GHC 8.0.2 to GHC 8.10.4
array >= 0.5.1 && < 0.6,
base >= 4.9.1 && < 4.16,
base >= 4.9.1 && < 4.15,
bytestring >= 0.10.8 && < 0.11,
containers >= 0.5.7 && < 0.7,
exceptions >= 0.8.3 && < 0.11,
ghc-prim >= 0.5.0 && < 0.7.1,
ghc-prim >= 0.5.0 && < 0.7,
hashable >= 1.2.6 && < 1.4,
mtl >= 2.2.1 && < 2.3,
pretty >= 1.1.3 && < 1.2,
random >= 1.1 && < 1.3,
text >= 1.2.2 && < 1.3,
unordered-containers >= 0.2.8 && < 0.3,
utf8-string >= 1.0.1.1 && < 1.1,
-- We need transformers-compat >= 0.6.3, but that is only in newer snapshots where it is redundant.
transformers-compat >= 0.5.1.4 && < 0.7
@@ -111,14 +109,16 @@ library
--ghc-options: -fwarn-unused-imports
--if impl(ghc>=7.8)
-- ghc-options: +RTS -A20M -RTS
-- ghc-prof-options: -fprof-auto
ghc-prof-options: -fprof-auto
exposed-modules:
LPGF
PGF
PGF.Internal
PGF.Haskell
other-modules:
LPGF.Internal
PGF.Data
PGF.Macros
PGF.Binary
@@ -211,6 +211,7 @@ library
GF.Compile.Export
GF.Compile.GenerateBC
GF.Compile.GeneratePMCFG
GF.Compile.GrammarToLPGF
GF.Compile.GrammarToPGF
GF.Compile.Multi
GF.Compile.Optimize
@@ -239,6 +240,7 @@ library
GF.Data.ErrM
GF.Data.Graph
GF.Data.Graphviz
GF.Data.IntMapBuilder
GF.Data.Relation
GF.Data.Str
GF.Data.Utilities
@@ -302,14 +304,14 @@ library
build-depends:
cgi >= 3001.3.0.2 && < 3001.6,
httpd-shed >= 0.4.0 && < 0.5,
network>=2.3 && <3.2
network>=2.3 && <2.7
if flag(network-uri)
build-depends:
network-uri >= 2.6.1.0 && < 2.7,
network>=2.6 && <3.2
network>=2.6 && <2.7
else
build-depends:
network >= 2.5 && <3.2
network >= 2.5 && <2.6
cpp-options: -DSERVER_MODE
other-modules:
@@ -375,7 +377,7 @@ executable gf
if impl(ghc<7.8)
ghc-options: -with-rtsopts=-K64M
-- ghc-prof-options: -auto-all
ghc-prof-options: -auto-all
if impl(ghc>=8.2)
ghc-options: -fhide-source-paths
@@ -400,10 +402,422 @@ test-suite gf-tests
main-is: run.hs
hs-source-dirs: testsuite
build-depends:
base >= 4.9.1 && < 4.16,
base >= 4.9.1 && < 4.15,
Cabal >= 1.8,
directory >= 1.3.0 && < 1.4,
filepath >= 1.4.1 && < 1.5,
process >= 1.4.3 && < 1.7
build-tool-depends: gf:gf
default-language: Haskell2010
test-suite lpgf
type: exitcode-stdio-1.0
main-is: test.hs
hs-source-dirs:
src/compiler
src/runtime/haskell
testsuite/lpgf
other-modules:
Data.Binary
Data.Binary.Builder
Data.Binary.Get
Data.Binary.IEEE754
Data.Binary.Put
GF
GF.Command.Abstract
GF.Command.CommandInfo
GF.Command.Commands
GF.Command.CommonCommands
GF.Command.Help
GF.Command.Importing
GF.Command.Interpreter
GF.Command.Messages
GF.Command.Parse
GF.Command.SourceCommands
GF.Command.TreeOperations
GF.Compile
GF.Compile.CFGtoPGF
GF.Compile.CheckGrammar
GF.Compile.Compute.Concrete
GF.Compile.Compute.Predef
GF.Compile.Compute.Value
GF.Compile.ConcreteToHaskell
GF.Compile.ExampleBased
GF.Compile.Export
GF.Compile.GenerateBC
GF.Compile.GeneratePMCFG
GF.Compile.GetGrammar
GF.Compile.GrammarToCanonical
GF.Compile.GrammarToLPGF
GF.Compile.GrammarToPGF
GF.Compile.Multi
GF.Compile.Optimize
GF.Compile.PGFtoHaskell
GF.Compile.PGFtoJava
GF.Compile.PGFtoJS
GF.Compile.PGFtoJSON
GF.Compile.PGFtoProlog
GF.Compile.PGFtoPython
GF.Compile.ReadFiles
GF.Compile.Rename
GF.Compile.SubExOpt
GF.Compile.Tags
GF.Compile.ToAPI
GF.Compile.TypeCheck.Abstract
GF.Compile.TypeCheck.Concrete
GF.Compile.TypeCheck.ConcreteNew
GF.Compile.TypeCheck.Primitives
GF.Compile.TypeCheck.TC
GF.Compile.Update
GF.CompileInParallel
GF.CompileOne
GF.Compiler
GF.Data.BacktrackM
GF.Data.ErrM
GF.Data.Graph
GF.Data.Graphviz
GF.Data.IntMapBuilder
GF.Data.Operations
GF.Data.Relation
GF.Data.Str
GF.Data.Utilities
GF.Data.XML
GF.Grammar
GF.Grammar.Analyse
GF.Grammar.Binary
GF.Grammar.BNFC
GF.Grammar.Canonical
GF.Grammar.CanonicalJSON
GF.Grammar.CFG
GF.Grammar.EBNF
GF.Grammar.Grammar
GF.Grammar.Lexer
GF.Grammar.Lockfield
GF.Grammar.Lookup
GF.Grammar.Macros
GF.Grammar.Parser
GF.Grammar.PatternMatch
GF.Grammar.Predef
GF.Grammar.Printer
GF.Grammar.ShowTerm
GF.Grammar.Unify
GF.Grammar.Values
GF.Haskell
GF.Infra.BuildInfo
GF.Infra.CheckM
GF.Infra.Concurrency
GF.Infra.Dependencies
GF.Infra.GetOpt
GF.Infra.Ident
GF.Infra.Location
GF.Infra.Option
GF.Infra.SIO
GF.Infra.UseIO
GF.Interactive
GF.JavaScript.AbsJS
GF.JavaScript.PrintJS
GF.Main
GF.Quiz
GF.Speech.CFGToFA
GF.Speech.FiniteState
GF.Speech.GSL
GF.Speech.JSGF
GF.Speech.PGFToCFG
GF.Speech.PrRegExp
GF.Speech.RegExp
GF.Speech.SISR
GF.Speech.SLF
GF.Speech.SRG
GF.Speech.SRGS_ABNF
GF.Speech.SRGS_XML
GF.Speech.VoiceXML
GF.Support
GF.System.Catch
GF.System.Concurrency
GF.System.Console
GF.System.Directory
GF.System.Process
GF.System.Signal
GF.Text.Clitics
GF.Text.Coding
GF.Text.Lexing
GF.Text.Pretty
GF.Text.Transliterations
LPGF
LPGF.Internal
PGF
PGF.Binary
PGF.ByteCode
PGF.CId
PGF.Data
PGF.Expr
PGF.Forest
PGF.Generate
PGF.Internal
PGF.Linearize
PGF.Macros
PGF.Morphology
PGF.OldBinary
PGF.Optimize
PGF.Paraphrase
PGF.Parse
PGF.Printer
PGF.Probabilistic
PGF.Tree
PGF.TrieMap
PGF.Type
PGF.TypeCheck
PGF.Utilities
PGF.VisualizeTree
Paths_gf
if flag(interrupt)
cpp-options: -DUSE_INTERRUPT
other-modules: GF.System.UseSignal
else
other-modules: GF.System.NoSignal
build-depends:
ansi-terminal >= 0.6.3 && < 0.12,
array >= 0.5.1 && < 0.6,
base >=4.6 && < 5,
bytestring >= 0.10.8 && < 0.11,
containers >= 0.5.7 && < 0.7,
directory >= 1.3.0 && < 1.4,
filepath >= 1.4.1 && < 1.5,
ghc-prim >= 0.5.0 && < 0.7,
hashable >= 1.2.6 && < 1.4,
haskeline >= 0.7.3 && < 0.9,
json >= 0.9.1 && < 0.11,
mtl >= 2.2.1 && < 2.3,
parallel >= 3.2.1.1 && < 3.3,
pretty >= 1.1.3 && < 1.2,
process >= 1.4.3 && < 1.7,
random >= 1.1 && < 1.3,
text >= 1.2.2 && < 1.3,
time >= 1.6.0 && < 1.10,
transformers-compat >= 0.5.1.4 && < 0.7,
unordered-containers >= 0.2.8 && < 0.3,
utf8-string >= 1.0.1.1 && < 1.1
if impl(ghc<8.0)
build-depends:
fail >= 4.9.0 && < 4.10
if os(windows)
build-depends:
Win32 >= 2.3.1.1 && < 2.7
else
build-depends:
unix >= 2.7.2 && < 2.8,
terminfo >=0.4.0 && < 0.5
default-language: Haskell2010
benchmark lpgf-bench
type: exitcode-stdio-1.0
main-is: bench.hs
hs-source-dirs:
src/compiler
src/runtime/haskell
testsuite/lpgf
other-modules:
Data.Binary
Data.Binary.Builder
Data.Binary.Get
Data.Binary.IEEE754
Data.Binary.Put
GF
GF.Command.Abstract
GF.Command.CommandInfo
GF.Command.Commands
GF.Command.CommonCommands
GF.Command.Help
GF.Command.Importing
GF.Command.Interpreter
GF.Command.Messages
GF.Command.Parse
GF.Command.SourceCommands
GF.Command.TreeOperations
GF.Compile
GF.Compile.CFGtoPGF
GF.Compile.CheckGrammar
GF.Compile.Compute.Concrete
GF.Compile.Compute.Predef
GF.Compile.Compute.Value
GF.Compile.ConcreteToHaskell
GF.Compile.ExampleBased
GF.Compile.Export
GF.Compile.GenerateBC
GF.Compile.GeneratePMCFG
GF.Compile.GetGrammar
GF.Compile.GrammarToCanonical
GF.Compile.GrammarToLPGF
GF.Compile.GrammarToPGF
GF.Compile.Multi
GF.Compile.Optimize
GF.Compile.PGFtoHaskell
GF.Compile.PGFtoJS
GF.Compile.PGFtoJSON
GF.Compile.PGFtoJava
GF.Compile.PGFtoProlog
GF.Compile.PGFtoPython
GF.Compile.ReadFiles
GF.Compile.Rename
GF.Compile.SubExOpt
GF.Compile.Tags
GF.Compile.ToAPI
GF.Compile.TypeCheck.Abstract
GF.Compile.TypeCheck.Concrete
GF.Compile.TypeCheck.ConcreteNew
GF.Compile.TypeCheck.Primitives
GF.Compile.TypeCheck.TC
GF.Compile.Update
GF.CompileInParallel
GF.CompileOne
GF.Compiler
GF.Data.BacktrackM
GF.Data.ErrM
GF.Data.Graph
GF.Data.Graphviz
GF.Data.IntMapBuilder
GF.Data.Operations
GF.Data.Relation
GF.Data.Str
GF.Data.Utilities
GF.Data.XML
GF.Grammar
GF.Grammar.Analyse
GF.Grammar.BNFC
GF.Grammar.Binary
GF.Grammar.CFG
GF.Grammar.Canonical
GF.Grammar.CanonicalJSON
GF.Grammar.EBNF
GF.Grammar.Grammar
GF.Grammar.Lexer
GF.Grammar.Lockfield
GF.Grammar.Lookup
GF.Grammar.Macros
GF.Grammar.Parser
GF.Grammar.PatternMatch
GF.Grammar.Predef
GF.Grammar.Printer
GF.Grammar.ShowTerm
GF.Grammar.Unify
GF.Grammar.Values
GF.Haskell
GF.Infra.BuildInfo
GF.Infra.CheckM
GF.Infra.Concurrency
GF.Infra.Dependencies
GF.Infra.GetOpt
GF.Infra.Ident
GF.Infra.Location
GF.Infra.Option
GF.Infra.SIO
GF.Infra.UseIO
GF.Interactive
GF.JavaScript.AbsJS
GF.JavaScript.PrintJS
GF.Main
GF.Quiz
GF.Speech.CFGToFA
GF.Speech.FiniteState
GF.Speech.GSL
GF.Speech.JSGF
GF.Speech.PGFToCFG
GF.Speech.PrRegExp
GF.Speech.RegExp
GF.Speech.SISR
GF.Speech.SLF
GF.Speech.SRG
GF.Speech.SRGS_ABNF
GF.Speech.SRGS_XML
GF.Speech.VoiceXML
GF.Support
GF.System.Catch
GF.System.Concurrency
GF.System.Console
GF.System.Directory
GF.System.Process
GF.System.Signal
GF.Text.Clitics
GF.Text.Coding
GF.Text.Lexing
GF.Text.Pretty
GF.Text.Transliterations
LPGF
LPGF.Internal
PGF
PGF.Binary
PGF.ByteCode
PGF.CId
PGF.Data
PGF.Expr
PGF.Forest
PGF.Generate
PGF.Internal
PGF.Linearize
PGF.Macros
PGF.Morphology
PGF.OldBinary
PGF.Optimize
PGF.Paraphrase
PGF.Parse
PGF.Printer
PGF.Probabilistic
PGF.Tree
PGF.TrieMap
PGF.Type
PGF.TypeCheck
PGF.Utilities
PGF.VisualizeTree
PGF2
PGF2.Expr
PGF2.Type
PGF2.FFI
Paths_gf
if flag(interrupt)
cpp-options: -DUSE_INTERRUPT
other-modules: GF.System.UseSignal
else
other-modules: GF.System.NoSignal
hs-source-dirs:
src/runtime/haskell-bind
other-modules:
PGF2
PGF2.FFI
PGF2.Expr
PGF2.Type
build-tools: hsc2hs
extra-libraries: pgf gu
c-sources: src/runtime/haskell-bind/utils.c
cc-options: -std=c99
build-depends:
ansi-terminal,
array,
base>=4.6 && <5,
bytestring,
containers,
deepseq,
directory,
filepath,
ghc-prim,
hashable,
haskeline,
json,
mtl,
parallel>=3,
pretty,
process,
random,
terminfo,
text,
time,
transformers-compat,
unix,
unordered-containers,
utf8-string
default-language: Haskell2010

View File

@@ -8,7 +8,7 @@
<meta name="viewport" content="width=device-width, initial-scale=1, shrink-to-fit=no">
<link rel="stylesheet" href="https://stackpath.bootstrapcdn.com/bootstrap/4.1.3/css/bootstrap.min.css" integrity="sha384-MCw98/SFnGE8fJT3GXwEOngsV7Zt27NXFoaoApmYm81iuXoPkFOJwJ8ERdknLPMO" crossorigin="anonymous">
<link rel="stylesheet" href="https://use.fontawesome.com/releases/v5.15.4/css/all.css" crossorigin="anonymous">
<link rel="stylesheet" href="https://use.fontawesome.com/releases/v5.4.2/css/all.css" integrity="sha384-/rXc/GQVaYpyDdyxK+ecHPVYJSN9bmVFBvjA/9eOB+pb3F2w2N6fc5qB9Ew5yIns" crossorigin="anonymous">
<link rel="alternate" href="https://github.com/GrammaticalFramework/gf-core/" title="GF GitHub repository">
</head>
@@ -85,27 +85,10 @@
<div class="col-sm-6 col-md-3 mb-4">
<h3>Contribute</h3>
<ul class="mb-2">
<li>
<a href="https://web.libera.chat/?channels=#gf">
<i class="fas fa-hashtag"></i>
IRC
</a>
/
<a href="https://discord.gg/EvfUsjzmaz">
<i class="fab fa-discord"></i>
Discord
</a>
</li>
<li>
<a href="https://stackoverflow.com/questions/tagged/gf">
<i class="fab fa-stack-overflow"></i>
Stack Overflow
</a>
</li>
<li><a href="https://groups.google.com/group/gf-dev">Mailing List</a></li>
<li><a href="http://groups.google.com/group/gf-dev">Mailing List</a></li>
<li><a href="https://github.com/GrammaticalFramework/gf-core/issues">Issue Tracker</a></li>
<li><a href="//school.grammaticalframework.org/2020/">Summer School</a></li>
<li><a href="doc/gf-people.html">Authors</a></li>
<li><a href="//school.grammaticalframework.org/2020/">Summer School</a></li>
</ul>
<a href="https://github.com/GrammaticalFramework/" class="btn btn-primary ml-3">
<i class="fab fa-github mr-1"></i>
@@ -171,7 +154,7 @@ least one, it may help you to get a first idea of what GF is.
<div class="row">
<div class="col-md-6">
<h2>Applications & availability</h2>
<h2>Applications & Availability</h2>
<p>
GF can be used for building
<a href="//cloud.grammaticalframework.org/translator/">translation systems</a>,
@@ -236,28 +219,19 @@ least one, it may help you to get a first idea of what GF is.
or <a href="https://www.grammaticalframework.org/irc/?C=M;O=D">browse the channel logs</a>.
</p>
<p>
There is also a <a href="https://discord.gg/EvfUsjzmaz">GF server on Discord</a>.
</p>
<p>
For bug reports and feature requests, please create an issue in the
<a href="https://github.com/GrammaticalFramework/gf-core/issues">GF Core</a> or
<a href="https://github.com/GrammaticalFramework/gf-rgl/issues">RGL</a> repository.
For programming questions, consider asking them on <a href="https://stackoverflow.com/questions/tagged/gf">Stack Overflow with the <code>gf</code> tag</a>.
If you have a more general question to the community, we recommend you ask it on the <a href="http://groups.google.com/group/gf-dev">mailing list</a>.
If you have a larger question which the community may benefit from, we recommend you ask it on the <a href="http://groups.google.com/group/gf-dev">mailing list</a>.
</p>
</div>
<div class="col-md-6">
<h2>News</h2>
<dt class="col-sm-3 text-center text-nowrap">2021-07-25</dt>
<dd class="col-sm-9">
<strong>GF 3.11 released.</strong>
<a href="download/release-3.11.html">Release notes</a>
</dd>
<dl class="row">
<dt class="col-sm-3 text-center text-nowrap">2021-07-25</dt>
<dd class="col-sm-9">
<strong>GF 3.11 released.</strong>
<a href="download/release-3.11.html">Release notes</a>
</dd>
<dt class="col-sm-3 text-center text-nowrap">2021-05-05</dt>
<dd class="col-sm-9">
<a href="https://cloud.grammaticalframework.org/wordnet/">GF WordNet</a> now supports languages for which there are no other WordNets. New additions: Afrikaans, German, Korean, Maltese, Polish, Somali, Swahili.
@@ -270,6 +244,34 @@ least one, it may help you to get a first idea of what GF is.
<dd class="col-sm-9">
<a href="https://www.mitpressjournals.org/doi/pdf/10.1162/COLI_a_00378">Abstract Syntax as Interlingua</a>: Scaling Up the Grammatical Framework from Controlled Languages to Robust Pipelines. A paper in Computational Linguistics (2020) summarizing much of the development in GF in the past ten years.
</dd>
<dt class="col-sm-3 text-center text-nowrap">2018-12-03</dt>
<dd class="col-sm-9">
<a href="//school.grammaticalframework.org/2018/">Sixth GF Summer School</a> in Stellenbosch (South Africa), 3-14 December 2018
</dd>
<dt class="col-sm-3 text-center text-nowrap">2018-12-02</dt>
<dd class="col-sm-9">
<strong>GF 3.10 released.</strong>
<a href="download/release-3.10.html">Release notes</a>
</dd>
<dt class="col-sm-3 text-center text-nowrap">2018-07-25</dt>
<dd class="col-sm-9">
The GF repository has been split in two:
<a href="https://github.com/GrammaticalFramework/gf-core">gf-core</a> and
<a href="https://github.com/GrammaticalFramework/gf-rgl">gf-rgl</a>.
The original <a href="https://github.com/GrammaticalFramework/GF">GF</a> repository is now archived.
</dd>
<dt class="col-sm-3 text-center text-nowrap">2017-08-11</dt>
<dd class="col-sm-9">
<strong>GF 3.9 released.</strong>
<a href="download/release-3.9.html">Release notes</a>
</dd>
<dt class="col-sm-3 text-center text-nowrap">2017-06-29</dt>
<dd class="col-sm-9">
GF is moving to <a href="https://github.com/GrammaticalFramework/GF/">GitHub</a>.</dd>
<dt class="col-sm-3 text-center text-nowrap">2017-03-13</dt>
<dd class="col-sm-9">
<a href="//school.grammaticalframework.org/2017/">GF Summer School</a> in Riga (Latvia), 14-25 August 2017
</dd>
</dl>
<h2>Projects</h2>
@@ -339,7 +341,7 @@ least one, it may help you to get a first idea of what GF is.
Libraries are at the heart of modern software engineering. In natural language
applications, libraries are a way to cope with thousands of details involved in
syntax, lexicon, and inflection. The
<a href="lib/doc/synopsis/index.html">GF resource grammar library</a> (RGL) has
<a href="lib/doc/synopsis/index.html">GF resource grammar library</a> has
support for an increasing number of languages, currently including
Afrikaans,
Amharic (partial),

View File

@@ -4,7 +4,6 @@ module GF.Command.Commands (
options,flags,
) where
import Prelude hiding (putStrLn,(<>)) -- GHC 8.4.1 clash with Text.PrettyPrint
import System.Info(os)
import PGF
@@ -883,15 +882,11 @@ pgfCommands = Map.fromList [
Right ty -> ty
Nothing -> error ("Can't parse '"++str++"' as a type")
optViewFormat opts = valStrOpts "format" "png" opts
optViewGraph opts = valStrOpts "view" open_cmd opts
optViewGraph opts = valStrOpts "view" "open" opts
optNum opts = valIntOpts "number" 1 opts
optNumInf opts = valIntOpts "number" 1000000000 opts ---- 10^9
takeOptNum opts = take (optNumInf opts)
open_cmd | os == "linux" = "xdg-open"
| os == "mingw32" = "start"
| otherwise = "open"
returnFromExprs es = return $ case es of
[] -> pipeMessage "no trees found"
_ -> fromExprs es
@@ -1027,4 +1022,4 @@ stanzas = map unlines . chop . lines where
#if !(MIN_VERSION_base(4,9,0))
errorWithoutStackTrace = error
#endif
#endif

View File

@@ -1,6 +1,7 @@
module GF.Compile (compileToPGF, link, batchCompile, srcAbsName) where
module GF.Compile (compileToPGF, compileToLPGF, link, linkl, batchCompile, srcAbsName) where
import GF.Compile.GrammarToPGF(mkCanon2pgf)
import GF.Compile.GrammarToLPGF(mkCanon2lpgf)
import GF.Compile.ReadFiles(ModEnv,getOptionsFromFile,getAllFiles,
importsOfModule)
import GF.CompileOne(compileOne)
@@ -14,7 +15,7 @@ import GF.Infra.UseIO(IOE,FullPath,liftIO,getLibraryDirectory,putIfVerb,
justModuleName,extendPathEnv,putStrE,putPointE)
import GF.Data.Operations(raise,(+++),err)
import Control.Monad(foldM,when,(<=<),filterM,liftM)
import Control.Monad(foldM,when,(<=<),filterM)
import GF.System.Directory(doesFileExist,getModificationTime)
import System.FilePath((</>),isRelative,dropFileName)
import qualified Data.Map as Map(empty,insert,elems) --lookup
@@ -24,12 +25,16 @@ import GF.Text.Pretty(render,($$),(<+>),nest)
import PGF.Internal(optimizePGF)
import PGF(PGF,defaultProbabilities,setProbabilities,readProbabilitiesFromFile)
import LPGF(LPGF)
-- | Compiles a number of source files and builds a 'PGF' structure for them.
-- This is a composition of 'link' and 'batchCompile'.
compileToPGF :: Options -> [FilePath] -> IOE PGF
compileToPGF opts fs = link opts . snd =<< batchCompile opts fs
compileToLPGF :: Options -> [FilePath] -> IOE LPGF
compileToLPGF opts fs = linkl opts . snd =<< batchCompile opts fs
-- | Link a grammar into a 'PGF' that can be used to 'PGF.linearize' and
-- 'PGF.parse' with the "PGF" run-time system.
link :: Options -> (ModuleName,Grammar) -> IOE PGF
@@ -39,9 +44,17 @@ link opts (cnc,gr) =
pgf <- mkCanon2pgf opts gr abs
probs <- liftIO (maybe (return . defaultProbabilities) readProbabilitiesFromFile (flag optProbsFile opts) pgf)
when (verbAtLeast opts Normal) $ putStrE "OK"
return $ setProbabilities probs
return $ setProbabilities probs
$ if flag optOptimizePGF opts then optimizePGF pgf else pgf
-- | Link a grammar into a 'LPGF' that can be used for linearization only.
linkl :: Options -> (ModuleName,Grammar) -> IOE LPGF
linkl opts (cnc,gr) =
putPointE Normal opts "linking ... " $ do
let abs = srcAbsName gr cnc
lpgf <- mkCanon2lpgf opts gr abs
return lpgf
-- | Returns the name of the abstract syntax corresponding to the named concrete syntax
srcAbsName gr cnc = err (const cnc) id $ abstractOfConcrete gr cnc

View File

@@ -0,0 +1,429 @@
module GF.Compile.GrammarToLPGF (mkCanon2lpgf) where
import LPGF.Internal (LPGF (..))
import qualified LPGF.Internal as L
import PGF.CId
import GF.Grammar.Grammar
import qualified GF.Grammar.Canonical as C
import GF.Compile.GrammarToCanonical (grammar2canonical)
import GF.Data.Operations (ErrorMonad (..))
import qualified GF.Data.IntMapBuilder as IntMapBuilder
import GF.Infra.Ident (rawIdentS, showRawIdent)
import GF.Infra.Option (Options)
import GF.Infra.UseIO (IOE)
import GF.Text.Pretty (pp, render)
import Control.Applicative ((<|>))
import Control.Monad (when, unless, forM, forM_)
import qualified Control.Monad.State.Strict as CMS
import Data.Either (lefts, rights)
import Data.List (elemIndex)
import qualified Data.List as L
import qualified Data.Map.Strict as Map
import Data.Maybe (fromJust, isJust)
import Data.Text (Text)
import qualified Data.Text as T
import System.Environment (lookupEnv)
import System.FilePath ((</>), (<.>))
import Text.Printf (printf)
import qualified Debug.Trace
trace x = Debug.Trace.trace ("> " ++ show x) (return ())
mkCanon2lpgf :: Options -> SourceGrammar -> ModuleName -> IOE LPGF
mkCanon2lpgf opts gr am = do
debug <- isJust <$> lookupEnv "DEBUG"
when debug $ do
ppCanonical debugDir canon
dumpCanonical debugDir canon
(an,abs) <- mkAbstract ab
cncs <- mapM (mkConcrete debug ab) cncs
let lpgf = LPGF {
L.absname = an,
L.abstract = abs,
L.concretes = Map.fromList cncs
}
when debug $ ppLPGF debugDir lpgf
return lpgf
where
canon@(C.Grammar ab cncs) = grammar2canonical opts am gr
mkAbstract :: (ErrorMonad err) => C.Abstract -> err (CId, L.Abstract)
mkAbstract (C.Abstract modId flags cats funs) = return (mdi2i modId, L.Abstract {})
mkConcrete :: (ErrorMonad err) => Bool -> C.Abstract -> C.Concrete -> err (CId, L.Concrete)
mkConcrete debug (C.Abstract _ _ _ funs) (C.Concrete modId absModId flags params0 lincats0 lindefs0) = do
let
-- Some transformations on canonical grammar
params :: [C.ParamDef]
params = inlineParamAliases params0
lincats :: [C.LincatDef]
lincats = s:i:f:lincats0
where
ss = C.RecordType [C.RecordRow (C.LabelId (rawIdentS "s")) C.StrType]
s = C.LincatDef (C.CatId (rawIdentS "String")) ss
i = C.LincatDef (C.CatId (rawIdentS "Int")) ss
f = C.LincatDef (C.CatId (rawIdentS "Float")) ss
lindefs :: [C.LinDef]
lindefs =
[ C.LinDef funId varIds linValue
| (C.LinDef funId varIds linValue) <- lindefs0
, let Right linType = lookupLinType funId
]
-- Builds maps for lookups
paramValueMap :: Map.Map C.ParamId C.ParamDef -- constructor -> definition
paramValueMap = Map.fromList [ (v,d) | d@(C.ParamDef _ vs) <- params, (C.Param v _) <- vs ]
lincatMap :: Map.Map C.CatId C.LincatDef
lincatMap = Map.fromList [ (cid,d) | d@(C.LincatDef cid _) <- lincats ]
funMap :: Map.Map C.FunId C.FunDef
funMap = Map.fromList [ (fid,d) | d@(C.FunDef fid _) <- funs ]
-- | Lookup paramdef
lookupParamDef :: C.ParamId -> Either String C.ParamDef
lookupParamDef pid = m2e (printf "Cannot find param definition: %s" (show pid)) (Map.lookup pid paramValueMap)
-- | Lookup lintype for a function
lookupLinType :: C.FunId -> Either String C.LinType
lookupLinType funId = do
fun <- m2e (printf "Cannot find type for: %s" (show funId)) (Map.lookup funId funMap)
let (C.FunDef _ (C.Type _ (C.TypeApp catId _))) = fun
lincat <- m2e (printf "Cannot find lincat for: %s" (show catId)) (Map.lookup catId lincatMap)
let (C.LincatDef _ lt) = lincat
return lt
-- | Lookup lintype for a function's argument
lookupLinTypeArg :: C.FunId -> Int -> Either String C.LinType
lookupLinTypeArg funId argIx = do
fun <- m2e (printf "Cannot find type for: %s" (show funId)) (Map.lookup funId funMap)
let (C.FunDef _ (C.Type args _)) = fun
let (C.TypeBinding _ (C.Type _ (C.TypeApp catId _))) = args !! argIx
lincat <- m2e (printf "Cannot find lincat for: %s" (show catId)) (Map.lookup catId lincatMap)
let (C.LincatDef _ lt) = lincat
return lt
-- Code generation
-- | Main code generation function
mkLin :: C.LinDef -> CodeGen (CId, L.LinFun)
mkLin (C.LinDef funId varIds linValue) = do
-- when debug $ trace funId
(lf, _) <- val2lin linValue
return (fi2i funId, lf)
where
val2lin :: C.LinValue -> CodeGen (L.LinFun, Maybe C.LinType)
val2lin lv = case lv of
C.ConcatValue v1 v2 -> do
(v1',t1) <- val2lin v1
(v2',t2) <- val2lin v2
return (L.Concat v1' v2', t1 <|> t2) -- t1 else t2
C.LiteralValue ll -> case ll of
C.FloatConstant f -> return (L.Token $ T.pack $ show f, Just C.FloatType)
C.IntConstant i -> return (L.Token $ T.pack $ show i, Just C.IntType)
C.StrConstant s -> return (L.Token $ T.pack s, Just C.StrType)
C.ErrorValue err -> return (L.Error err, Nothing)
C.ParamConstant (C.Param pid lvs) -> do
let
collectProjections :: C.LinValue -> CodeGen [L.LinFun]
collectProjections (C.ParamConstant (C.Param pid lvs)) = do
def <- lookupParamDef pid
let (C.ParamDef tpid defpids) = def
pidIx <- eitherElemIndex pid [ p | C.Param p _ <- defpids ]
rest <- mapM collectProjections lvs
return $ L.Ix (pidIx+1) : concat rest
collectProjections lv = do
(lf,_) <- val2lin lv
return [lf]
lfs <- collectProjections lv
let term = L.Tuple lfs
def <- lookupParamDef pid
let (C.ParamDef tpid _) = def
return (term, Just $ C.ParamType (C.ParamTypeId tpid))
C.PredefValue (C.PredefId pid) -> case showRawIdent pid of
"BIND" -> return (L.Bind, Nothing)
"SOFT_BIND" -> return (L.Bind, Nothing)
"SOFT_SPACE" -> return (L.Space, Nothing)
"CAPIT" -> return (L.Capit, Nothing)
"ALL_CAPIT" -> return (L.AllCapit, Nothing)
x -> Left $ printf "Unknown predef function: %s" x
C.RecordValue rrvs -> do
ts <- sequence [ val2lin lv | C.RecordRow lid lv <- rrvs ]
return (L.Tuple (map fst ts), Just $ C.RecordType [ C.RecordRow lid lt | (C.RecordRow lid _, (_, Just lt)) <- zip rrvs ts])
C.TableValue lt trvs -> do
-- group the rows by "left-most" value
let
groupRow :: C.TableRowValue -> C.TableRowValue -> Bool
groupRow (C.TableRow p1 _) (C.TableRow p2 _) = groupPattern p1 p2
groupPattern :: C.LinPattern -> C.LinPattern -> Bool
groupPattern p1 p2 = case (p1,p2) of
(C.ParamPattern (C.Param pid1 _), C.ParamPattern (C.Param pid2 _)) -> pid1 == pid2 -- compare only constructors
(C.RecordPattern (C.RecordRow lid1 patt1:_), C.RecordPattern (C.RecordRow lid2 patt2:_)) -> groupPattern patt1 patt2 -- lid1 == lid2 necessarily
_ -> error $ printf "Mismatched patterns in grouping:\n%s\n%s" (show p1) (show p2)
grps :: [[C.TableRowValue]]
grps = L.groupBy groupRow trvs
-- remove one level of depth and recurse
let
handleGroup :: [C.TableRowValue] -> CodeGen (L.LinFun, Maybe C.LinType)
handleGroup [C.TableRow patt lv] =
case reducePattern patt of
Just patt' -> do
(lf,lt) <- handleGroup [C.TableRow patt' lv]
return (L.Tuple [lf],lt)
Nothing -> val2lin lv
handleGroup rows = do
let rows' = map reduceRow rows
val2lin (C.TableValue lt rows') -- lt is wrong here, but is unused
reducePattern :: C.LinPattern -> Maybe C.LinPattern
reducePattern patt =
case patt of
C.ParamPattern (C.Param _ []) -> Nothing
C.ParamPattern (C.Param _ patts) -> Just $ C.ParamPattern (C.Param pid' patts')
where
C.ParamPattern (C.Param pid1 patts1) = head patts
pid' = pid1
patts' = patts1 ++ tail patts
C.RecordPattern [] -> Nothing
C.RecordPattern (C.RecordRow lid patt:rrs) ->
case reducePattern patt of
Just patt' -> Just $ C.RecordPattern (C.RecordRow lid patt':rrs)
Nothing -> if null rrs then Nothing else Just $ C.RecordPattern rrs
_ -> error $ printf "Unhandled pattern in reducing: %s" (show patt)
reduceRow :: C.TableRowValue -> C.TableRowValue
reduceRow (C.TableRow patt lv) =
let Just patt' = reducePattern patt
in C.TableRow patt' lv
-- ts :: [(L.LinFun, Maybe C.LinType)]
ts <- mapM handleGroup grps
-- return
let typ = case ts of
(_, Just tst):_ -> Just $ C.TableType lt tst
_ -> Nothing
return (L.Tuple (map fst ts), typ)
-- TODO TuplePattern, WildPattern?
C.TupleValue lvs -> do
ts <- mapM val2lin lvs
return (L.Tuple (map fst ts), Just $ C.TupleType (map (fromJust.snd) ts))
C.VariantValue [] -> return (L.Empty, Nothing) -- TODO Just C.StrType ?
C.VariantValue (vr:_) -> val2lin vr -- NOTE variants not supported, just pick first
C.VarValue (C.VarValueId (C.Unqual v)) -> do
ix <- eitherElemIndex (C.VarId v) varIds
lt <- lookupLinTypeArg funId ix
return (L.Argument (ix+1), Just lt)
C.PreValue pts df -> do
pts' <- forM pts $ \(pfxs, lv) -> do
(lv', _) <- val2lin lv
return (map T.pack pfxs, lv')
(df', lt) <- val2lin df
return (L.Pre pts' df', lt)
C.Projection v1 lblId -> do
(v1', mtyp) <- val2lin v1
-- find label index in argument type
let Just (C.RecordType rrs) = mtyp
let rrs' = [ lid | C.RecordRow lid _ <- rrs ]
-- lblIx <- eitherElemIndex lblId rrs'
let
lblIx = case eitherElemIndex lblId rrs' of
Right x -> x
Left _ -> 0 -- corresponds to Prelude.False
-- lookup lintype for record row
let C.RecordRow _ lt = rrs !! lblIx
return (L.Projection v1' (L.Ix (lblIx+1)), Just lt)
C.Selection v1 v2 -> do
(v1', t1) <- val2lin v1
(v2', t2) <- val2lin v2
let Just (C.TableType t11 t12) = t1 -- t11 == t2
return (L.Projection v1' v2', Just t12)
-- C.CommentedValue cmnt lv -> val2lin lv
C.CommentedValue cmnt lv -> case cmnt of
"impossible" -> return (L.Empty, Nothing)
-- "impossible" -> val2lin lv >>= \(_, typ) -> return (L.Empty, typ)
_ -> val2lin lv
v -> Left $ printf "val2lin not implemented for: %s" (show v)
-- Invoke code generation
let es = map mkLin lindefs
unless (null $ lefts es) (raise $ unlines (lefts es))
let maybeOptimise = if debug then id else extractStrings
let concr = maybeOptimise $ L.Concrete {
L.toks = IntMapBuilder.emptyIntMap,
L.lins = Map.fromList (rights es)
}
return (mdi2i modId, concr)
type CodeGen a = Either String a
-- | Remove ParamAliasDefs by inlining their definitions
inlineParamAliases :: [C.ParamDef] -> [C.ParamDef]
inlineParamAliases defs = if null aliases then defs else map rp' pdefs
where
(aliases,pdefs) = L.partition isParamAliasDef defs
rp' :: C.ParamDef -> C.ParamDef
rp' (C.ParamDef pid pids) = C.ParamDef pid (map rp'' pids)
rp' (C.ParamAliasDef _ _) = error "inlineParamAliases called on ParamAliasDef" -- impossible
rp'' :: C.ParamValueDef -> C.ParamValueDef
rp'' (C.Param pid pids) = C.Param pid (map rp''' pids)
rp''' :: C.ParamId -> C.ParamId
rp''' pid = case L.find (\(C.ParamAliasDef p _) -> p == pid) aliases of
Just (C.ParamAliasDef _ (C.ParamType (C.ParamTypeId p))) -> p
_ -> pid
isParamAliasDef :: C.ParamDef -> Bool
isParamAliasDef (C.ParamAliasDef _ _) = True
isParamAliasDef _ = False
isParamType :: C.LinType -> Bool
isParamType (C.ParamType _) = True
isParamType _ = False
isRecordType :: C.LinType -> Bool
isRecordType (C.RecordType _) = True
isRecordType _ = False
-- | Find all token strings, put them in a map and replace with token indexes
extractStrings :: L.Concrete -> L.Concrete
extractStrings concr = L.Concrete { L.toks = toks', L.lins = lins' }
where
imb = IntMapBuilder.fromIntMap (L.toks concr)
(lins',imb') = CMS.runState (go0 (L.lins concr)) imb
toks' = IntMapBuilder.toIntMap imb'
go0 :: Map.Map CId L.LinFun -> CMS.State (IntMapBuilder.IMB Text) (Map.Map CId L.LinFun)
go0 mp = do
xs <- mapM (\(cid,lin) -> go lin >>= \lin' -> return (cid,lin')) (Map.toList mp)
return $ Map.fromList xs
go :: L.LinFun -> CMS.State (IntMapBuilder.IMB Text) L.LinFun
go lf = case lf of
L.Token str -> do
imb <- CMS.get
let (ix,imb') = IntMapBuilder.insert' str imb
CMS.put imb'
return $ L.TokenIx ix
L.Pre pts df -> do
-- pts' <- mapM (\(pfxs,lv) -> go lv >>= \lv' -> return (pfxs,lv')) pts
pts' <- forM pts $ \(pfxs,lv) -> do
imb <- CMS.get
let str = T.pack $ show pfxs
let (ix,imb') = IntMapBuilder.insert' str imb
CMS.put imb'
lv' <- go lv
return (ix,lv')
df' <- go df
return $ L.PreIx pts' df'
L.Concat s t -> do
s' <- go s
t' <- go t
return $ L.Concat s' t'
L.Tuple ts -> do
ts' <- mapM go ts
return $ L.Tuple ts'
L.Projection t u -> do
t' <- go t
u' <- go u
return $ L.Projection t' u'
t -> return t
-- | Convert Maybe to Either value with error
m2e :: String -> Maybe a -> Either String a
m2e err = maybe (Left err) Right
-- | Wrap elemIndex into Either value
eitherElemIndex :: (Eq a, Show a) => a -> [a] -> Either String Int
eitherElemIndex x xs = m2e (printf "Cannot find: %s in %s" (show x) (show xs)) (elemIndex x xs)
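The two helpers above are small enough to exercise in isolation. The following standalone sketch copies their definitions verbatim from this module and shows the error strings they produce on a miss:

```haskell
import Data.List (elemIndex)
import Text.Printf (printf)

-- | Convert a Maybe to an Either, supplying an error message for Nothing
m2e :: String -> Maybe a -> Either String a
m2e err = maybe (Left err) Right

-- | Wrap elemIndex so a missing element yields a descriptive Left
eitherElemIndex :: (Eq a, Show a) => a -> [a] -> Either String Int
eitherElemIndex x xs =
  m2e (printf "Cannot find: %s in %s" (show x) (show xs)) (elemIndex x xs)

main :: IO ()
main = do
  print (eitherElemIndex 'b' "abc")   -- Right 1
  print (eitherElemIndex 'z' "abc")   -- Left "Cannot find: 'z' in \"abc\""
```

Returning `Either String` rather than throwing lets `mkLin` collect all failed lin functions via `lefts` before raising a single combined error.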
mdi2s :: C.ModId -> String
mdi2s (C.ModId i) = showRawIdent i
mdi2i :: C.ModId -> CId
mdi2i (C.ModId i) = mkCId (showRawIdent i)
fi2i :: C.FunId -> CId
fi2i (C.FunId i) = mkCId (showRawIdent i)
-- Debugging
debugDir :: FilePath
debugDir = "DEBUG"
-- | Pretty-print canonical grammars to file
ppCanonical :: FilePath -> C.Grammar -> IO ()
ppCanonical path (C.Grammar ab cncs) = do
let (C.Abstract modId flags cats funs) = ab
writeFile (path </> mdi2s modId <.> "canonical.gf") (render $ pp ab)
forM_ cncs $ \cnc@(C.Concrete modId absModId flags params lincats lindefs) ->
writeFile' (path </> mdi2s modId <.> "canonical.gf") (render $ pp cnc)
-- | Dump canonical grammars to file
dumpCanonical :: FilePath -> C.Grammar -> IO ()
dumpCanonical path (C.Grammar ab cncs) = do
let (C.Abstract modId flags cats funs) = ab
let body = unlines $ map show cats ++ [""] ++ map show funs
writeFile' (path </> mdi2s modId <.> "canonical.dump") body
forM_ cncs $ \(C.Concrete modId absModId flags params lincats lindefs) -> do
let body = unlines $ concat [
map show params,
[""],
map show lincats,
[""],
map show lindefs
]
writeFile' (path </> mdi2s modId <.> "canonical.dump") body
-- | Pretty-print LPGF to file
ppLPGF :: FilePath -> LPGF -> IO ()
ppLPGF path lpgf =
forM_ (Map.toList $ L.concretes lpgf) $ \(cid,concr) ->
writeFile' (path </> showCId cid <.> "lpgf.txt") (L.render $ L.pp concr)
-- | Dump LPGF to file
dumpLPGF :: FilePath -> LPGF -> IO ()
dumpLPGF path lpgf =
forM_ (Map.toList $ L.concretes lpgf) $ \(cid,concr) -> do
let body = unlines $ map show (Map.toList $ L.lins concr)
writeFile' (path </> showCId cid <.> "lpgf.dump") body
-- | Write a file and report it to console
writeFile' :: FilePath -> String -> IO ()
writeFile' p b = do
writeFile p b
putStrLn $ "Wrote " ++ p

View File

@@ -1,9 +1,11 @@
module GF.Compiler (mainGFC, linkGrammars, writePGF, writeOutputs) where
module GF.Compiler (mainGFC, linkGrammars, writePGF, writeLPGF, writeOutputs) where
import PGF
import PGF.Internal(concretes,optimizePGF,unionPGF)
import PGF.Internal(putSplitAbs,encodeFile,runPut)
import GF.Compile as S(batchCompile,link,srcAbsName)
import LPGF(LPGF)
import qualified LPGF.Internal as LPGF
import GF.Compile as S(batchCompile,link,linkl,srcAbsName)
import GF.CompileInParallel as P(parallelBatchCompile)
import GF.Compile.Export
import GF.Compile.ConcreteToHaskell(concretes2haskell)
@@ -11,7 +13,8 @@ import GF.Compile.GrammarToCanonical--(concretes2canonical)
import GF.Compile.CFGtoPGF
import GF.Compile.GetGrammar
import GF.Grammar.BNFC
import GF.Grammar.CFG
import GF.Grammar.CFG hiding (Grammar)
import GF.Grammar.Grammar (Grammar, ModuleName)
--import GF.Infra.Ident(showIdent)
import GF.Infra.UseIO
@@ -23,10 +26,11 @@ import GF.Text.Pretty(render,render80)
import Data.Maybe
import qualified Data.Map as Map
import qualified Data.Set as Set
import Data.Time(UTCTime)
import qualified Data.ByteString.Lazy as BSL
import GF.Grammar.CanonicalJSON (encodeJSON)
import System.FilePath
import Control.Monad(when,unless,forM_)
import Control.Monad(when,unless,forM,void)
-- | Compile the given GF grammar files. The result is a number of @.gfo@ files
-- and, depending on the options, a @.pgf@ file. (@gf -batch@, @gf -make@)
@@ -47,7 +51,7 @@ mainGFC opts fs = do
extensionIs ext = (== ext) . takeExtension
compileSourceFiles :: Options -> [FilePath] -> IOE ()
compileSourceFiles opts fs =
compileSourceFiles opts fs =
do output <- batchCompile opts fs
exportCanonical output
unless (flag optStopAfterPhase opts == Compile) $
@@ -93,6 +97,10 @@ compileSourceFiles opts fs =
-- If a @.pgf@ file by the same name already exists and it is newer than the
-- source grammar files (as indicated by the 'UTCTime' argument), it is not
-- recreated. Calls 'writePGF' and 'writeOutputs'.
linkGrammars :: Options -> (UTCTime,[(ModuleName, Grammar)]) -> IOE ()
linkGrammars opts (_,cnc_grs) | FmtLPGF `elem` flag optOutputFormats opts = do
lpgf <- linkl opts (head cnc_grs)
void $ writeLPGF opts lpgf
linkGrammars opts (t_src,~cnc_grs@(~(cnc,gr):_)) =
do let abs = render (srcAbsName gr cnc)
pgfFile = outputPath opts (grammarName' opts abs<.>"pgf")
@@ -145,7 +153,7 @@ unionPGFFiles opts fs =
pgfFile = outputPath opts (grammarName opts pgf <.> "pgf")
if pgfFile `elem` fs
then putStrLnE $ "Refusing to overwrite " ++ pgfFile
else writePGF opts pgf
else void $ writePGF opts pgf
writeOutputs opts pgf
readPGFVerbose f =
@@ -155,33 +163,46 @@ unionPGFFiles opts fs =
-- Calls 'exportPGF'.
writeOutputs :: Options -> PGF -> IOE ()
writeOutputs opts pgf = do
sequence_ [writeOutput opts name str
sequence_ [writeOutput opts name str
| fmt <- flag optOutputFormats opts,
(name,str) <- exportPGF opts fmt pgf]
-- | Write the result of compiling a grammar (e.g. with 'compileToPGF' or
-- 'link') to a @.pgf@ file.
-- A split PGF file is output if the @-split-pgf@ option is used.
writePGF :: Options -> PGF -> IOE ()
writePGF :: Options -> PGF -> IOE [FilePath]
writePGF opts pgf =
if flag optSplitPGF opts then writeSplitPGF else writeNormalPGF
where
writeNormalPGF =
do let outfile = outputPath opts (grammarName opts pgf <.> "pgf")
writing opts outfile $ encodeFile outfile pgf
return [outfile]
writeSplitPGF =
do let outfile = outputPath opts (grammarName opts pgf <.> "pgf")
writing opts outfile $ BSL.writeFile outfile (runPut (putSplitAbs pgf))
--encodeFile_ outfile (putSplitAbs pgf)
forM_ (Map.toList (concretes pgf)) $ \cnc -> do
outfiles <- forM (Map.toList (concretes pgf)) $ \cnc -> do
let outfile = outputPath opts (showCId (fst cnc) <.> "pgf_c")
writing opts outfile $ encodeFile outfile cnc
return outfile
return (outfile:outfiles)
writeOutput :: Options -> FilePath-> String -> IOE ()
writeOutput opts file str = writing opts path $ writeUTF8File path str
where path = outputPath opts file
writeLPGF :: Options -> LPGF -> IOE FilePath
writeLPGF opts lpgf = do
let
grammarName = fromMaybe (showCId (LPGF.absname lpgf)) (flag optName opts)
outfile = outputPath opts (grammarName <.> "lpgf")
writing opts outfile $ liftIO $ LPGF.encodeFile outfile lpgf
return outfile
writeOutput :: Options -> FilePath-> String -> IOE FilePath
writeOutput opts file str = do
let outfile = outputPath opts file
writing opts outfile $ writeUTF8File outfile str
return outfile
-- * Useful helper functions

View File

@@ -0,0 +1,61 @@
-- | In order to build an IntMap in one pass, we need a map data structure with
-- fast lookup in both keys and values.
-- This is achieved by keeping a separate reversed map of values to keys during building.
module GF.Data.IntMapBuilder where
import Data.IntMap (IntMap)
import qualified Data.IntMap as IntMap
import Data.Hashable (Hashable)
import Data.HashMap.Strict (HashMap)
import qualified Data.HashMap.Strict as HashMap
import Data.Tuple (swap)
import Prelude hiding (lookup)
data IMB a = IMB {
intMap :: IntMap a,
valMap :: HashMap a Int
}
-- | An empty IMB
empty :: (Eq a, Hashable a) => IMB a
empty = IMB {
intMap = IntMap.empty,
valMap = HashMap.empty
}
-- | An empty IntMap
emptyIntMap :: IntMap a
emptyIntMap = IntMap.empty
-- | Lookup a value
lookup :: (Eq a, Hashable a) => a -> IMB a -> Maybe Int
lookup a IMB { valMap = vm } = HashMap.lookup a vm
-- | Insert without any lookup
insert :: (Eq a, Hashable a) => a -> IMB a -> (Int, IMB a)
insert a IMB { intMap = im, valMap = vm } =
let
ix = IntMap.size im
im' = IntMap.insert ix a im
vm' = HashMap.insert a ix vm
imb' = IMB { intMap = im', valMap = vm' }
in
(ix, imb')
-- | Insert only when lookup fails
insert' :: (Eq a, Hashable a) => a -> IMB a -> (Int, IMB a)
insert' a imb =
case lookup a imb of
Just ix -> (ix, imb)
Nothing -> insert a imb
-- | Build IMB from existing IntMap
fromIntMap :: (Eq a, Hashable a) => IntMap a -> IMB a
fromIntMap im = IMB {
intMap = im,
valMap = HashMap.fromList (map swap (IntMap.toList im))
}
-- | Get IntMap from IMB
toIntMap :: (Eq a, Hashable a) => IMB a -> IntMap a
toIntMap = intMap
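The `IMB` builder above exists to intern values: repeated inserts of the same value must return the same index without scanning the `IntMap`. A self-contained sketch of the same idea, using `Data.Map` from `containers` in place of `HashMap` so it runs without extra dependencies (the names `Builder`, `internB`, `internAll` are illustrative, not part of the module):

```haskell
import qualified Data.IntMap as IntMap
import qualified Data.Map.Strict as Map
import Data.List (mapAccumL)

-- Builder state: forward IntMap plus a reversed map for fast value lookup.
data Builder a = Builder (IntMap.IntMap a) (Map.Map a Int)

emptyB :: Builder a
emptyB = Builder IntMap.empty Map.empty

-- Insert only when lookup fails (like insert' above), returning the index.
internB :: Ord a => a -> Builder a -> (Int, Builder a)
internB a b@(Builder im vm) =
  case Map.lookup a vm of
    Just ix -> (ix, b)
    Nothing ->
      let ix = IntMap.size im
      in (ix, Builder (IntMap.insert ix a im) (Map.insert a ix vm))

-- Intern a whole list in one left-to-right pass.
internAll :: Ord a => [a] -> (Builder a, [Int])
internAll = mapAccumL (\b a -> let (ix, b') = internB a b in (b', ix)) emptyB
```

Interning `["the","cat","saw","the","cat"]` yields indices `[0,1,2,0,1]` with only three distinct entries stored.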

View File

@@ -31,7 +31,7 @@ data TypeApp = TypeApp CatId [Type] deriving Show
data TypeBinding = TypeBinding VarId Type deriving Show
--------------------------------------------------------------------------------
-- ** Concreate syntax
-- ** Concrete syntax
-- | Concrete Syntax
data Concrete = Concrete ModId ModId Flags [ParamDef] [LincatDef] [LinDef]
@@ -103,9 +103,9 @@ data TableRow rhs = TableRow LinPattern rhs
-- *** Identifiers in Concrete Syntax
newtype PredefId = PredefId Id deriving (Eq,Ord,Show)
newtype LabelId = LabelId Id deriving (Eq,Ord,Show)
data VarValueId = VarValueId QualId deriving (Eq,Ord,Show)
newtype PredefId = PredefId Id deriving (Eq,Ord,Show)
newtype LabelId = LabelId Id deriving (Eq,Ord,Show)
newtype VarValueId = VarValueId QualId deriving (Eq,Ord,Show)
-- | Name of param type or param value
newtype ParamId = ParamId QualId deriving (Eq,Ord,Show)
@@ -116,9 +116,9 @@ newtype ParamId = ParamId QualId deriving (Eq,Ord,Show)
newtype ModId = ModId Id deriving (Eq,Ord,Show)
newtype CatId = CatId Id deriving (Eq,Ord,Show)
newtype FunId = FunId Id deriving (Eq,Show)
newtype FunId = FunId Id deriving (Eq,Ord,Show)
data VarId = Anonymous | VarId Id deriving Show
data VarId = Anonymous | VarId Id deriving (Eq,Show)
newtype Flags = Flags [(FlagName,FlagValue)] deriving Show
type FlagName = Id

View File

@@ -87,7 +87,8 @@ data Verbosity = Quiet | Normal | Verbose | Debug
data Phase = Preproc | Convert | Compile | Link
deriving (Show,Eq,Ord)
data OutputFormat = FmtPGFPretty
data OutputFormat = FmtLPGF
| FmtPGFPretty
| FmtCanonicalGF
| FmtCanonicalJson
| FmtJavaScript
@@ -335,7 +336,7 @@ optDescr =
Option ['f'] ["output-format"] (ReqArg outFmt "FMT")
(unlines ["Output format. FMT can be one of:",
"Canonical GF grammar: canonical_gf, canonical_json, (and haskell with option --haskell=concrete)",
"Multiple concrete: pgf (default), json, js, pgf_pretty, prolog, python, ...", -- gar,
"Multiple concrete: pgf (default), lpgf, json, js, pgf_pretty, prolog, python, ...", -- gar,
"Single concrete only: bnf, ebnf, fa, gsl, jsgf, regexp, slf, srgs_xml, srgs_abnf, vxml, ....", -- cf, lbnf,
"Abstract only: haskell, ..."]), -- prolog_abs,
Option [] ["sisr"] (ReqArg sisrFmt "FMT")
@@ -477,7 +478,8 @@ outputFormats = map fst outputFormatsExpl
outputFormatsExpl :: [((String,OutputFormat),String)]
outputFormatsExpl =
[(("pgf_pretty", FmtPGFPretty),"human-readable pgf"),
[(("lpgf", FmtLPGF),"Linearisation-only PGF"),
(("pgf_pretty", FmtPGFPretty),"Human-readable PGF"),
(("canonical_gf", FmtCanonicalGF),"Canonical GF source files"),
(("canonical_json", FmtCanonicalJson),"Canonical JSON source files"),
(("js", FmtJavaScript),"JavaScript (whole grammar)"),

View File

@@ -16,7 +16,6 @@ import Data.Version
import System.Directory
import System.Environment (getArgs)
import System.Exit
import GHC.IO.Encoding
-- import GF.System.Console (setConsoleEncoding)
-- | Run the GF main program, taking arguments from the command line.
@@ -24,7 +23,6 @@ import GHC.IO.Encoding
-- Run @gf --help@ for usage info.
main :: IO ()
main = do
setLocaleEncoding utf8
-- setConsoleEncoding
uncurry mainOpts =<< getOptions

View File

@@ -38,7 +38,7 @@ decodeUnicode :: TextEncoding -> ByteString -> String
decodeUnicode enc bs = unsafePerformIO $ decodeUnicodeIO enc bs
decodeUnicodeIO enc (PS fptr l len) = do
let bbuf = (emptyBuffer fptr len ReadBuffer) { bufL=l, bufR=l+len }
let bbuf = Buffer{bufRaw=fptr, bufState=ReadBuffer, bufSize=len, bufL=l, bufR=l+len}
cbuf <- newCharBuffer 128 WriteBuffer
case enc of
TextEncoding {mkTextDecoder=mk} -> do decoder <- mk

View File

@@ -1,5 +1,3 @@
module Main where
import qualified GF
main = GF.main

View File

@@ -18,24 +18,12 @@ gu_exn_is_raised(GuExn* err) {
return err && (err->state == GU_EXN_RAISED);
}
GU_API_DECL void
gu_exn_clear(GuExn* err) {
err->caught = NULL;
err->state = GU_EXN_OK;
}
GU_API bool
gu_exn_caught_(GuExn* err, const char* type)
{
return (err->caught && strcmp(err->caught, type) == 0);
}
GU_API_DECL void*
gu_exn_caught_data(GuExn* err)
{
return err->data.data;
}
GU_API void
gu_exn_block(GuExn* err)
{

View File

@@ -71,13 +71,11 @@ gu_new_exn(GuPool* pool);
GU_API_DECL bool
gu_exn_is_raised(GuExn* err);
// static inline void
// gu_exn_clear(GuExn* err) {
// err->caught = NULL;
// err->state = GU_EXN_OK;
// }
GU_API_DECL void
gu_exn_clear(GuExn* err);
static inline void
gu_exn_clear(GuExn* err) {
err->caught = NULL;
err->state = GU_EXN_OK;
}
#define gu_exn_caught(err, type) \
(err->caught && strcmp(err->caught, #type) == 0)
@@ -85,13 +83,11 @@ gu_exn_clear(GuExn* err);
GU_API_DECL bool
gu_exn_caught_(GuExn* err, const char* type);
// static inline const void*
// gu_exn_caught_data(GuExn* err)
// {
// return err->data.data;
// }
GU_API_DECL void*
gu_exn_caught_data(GuExn* err);
static inline const void*
gu_exn_caught_data(GuExn* err)
{
return err->data.data;
}
/// Temporarily block a raised exception.
GU_API_DECL void

View File

@@ -8,42 +8,6 @@
//#define PGF_JIT_DEBUG
#ifdef EMSCRIPTEN
PGF_INTERNAL PgfJitState*
pgf_new_jit(PgfReader* rdr)
{
return NULL;
}
PGF_INTERNAL PgfEvalGates*
pgf_jit_gates(PgfReader* rdr)
{
return NULL;
}
PGF_INTERNAL void
pgf_jit_predicate(PgfReader* rdr, PgfAbstr* abstr,
PgfAbsCat* abscat)
{
size_t n_funs = pgf_read_len(rdr);
gu_return_on_exn(rdr->err, );
for (size_t i = 0; i < n_funs; i++) {
gu_in_f64be(rdr->in, rdr->err); // ignore
gu_return_on_exn(rdr->err, );
PgfCId name = pgf_read_cid(rdr, rdr->tmp_pool);
gu_return_on_exn(rdr->err, );
}
}
PGF_INTERNAL void
pgf_jit_done(PgfReader* rdr, PgfAbstr* abstr)
{
}
#else
struct PgfJitState {
jit_state jit;
@@ -1365,5 +1329,3 @@ pgf_jit_done(PgfReader* rdr, PgfAbstr* abstr)
jit_flush_code(rdr->jit_state->buf, jit_get_ip().ptr);
}
#endif

View File

@@ -44,7 +44,6 @@ typedef struct {
PgfParseState *before;
PgfParseState *after;
PgfToken prefix;
bool prefix_bind;
PgfTokenProb* tp;
PgfExprEnum en; // enumeration for the generated trees/tokens
#ifdef PGF_COUNTS_DEBUG
@@ -1010,7 +1009,6 @@ pgf_new_parse_state(PgfParsing* ps, size_t start_offset,
(start_offset == end_offset);
state->start_offset = start_offset;
state->end_offset = end_offset;
state->viterbi_prob = viterbi_prob;
state->lexicon_idx =
gu_new_buf(PgfLexiconIdxEntry, ps->pool);
@@ -1383,30 +1381,20 @@ pgf_parsing_symbol(PgfParsing* ps, PgfItem* item, PgfSymbol sym)
break;
}
case PGF_SYMBOL_BIND: {
if (!ps->prefix_bind && ps->prefix != NULL && *(ps->sentence + ps->before->end_offset) == 0) {
PgfProductionApply* papp = gu_variant_data(item->prod);
ps->tp = gu_new(PgfTokenProb, ps->out_pool);
ps->tp->tok = NULL;
ps->tp->cat = item->conts->ccat->cnccat->abscat->name;
ps->tp->fun = papp->fun->absfun->name;
ps->tp->prob = item->inside_prob + item->conts->outside_prob;
} else {
if (ps->before->start_offset == ps->before->end_offset &&
ps->before->needs_bind) {
PgfParseState* state =
pgf_new_parse_state(ps, ps->before->end_offset, BIND_HARD,
item->inside_prob+item->conts->outside_prob);
if (state != NULL) {
pgf_item_advance(item, ps->pool);
gu_buf_heap_push(state->agenda, pgf_item_prob_order, &item);
} else {
pgf_item_free(ps, item);
}
} else {
pgf_item_free(ps, item);
}
}
if (ps->before->start_offset == ps->before->end_offset &&
ps->before->needs_bind) {
PgfParseState* state =
pgf_new_parse_state(ps, ps->before->end_offset, BIND_HARD,
item->inside_prob+item->conts->outside_prob);
if (state != NULL) {
pgf_item_advance(item, ps->pool);
gu_buf_heap_push(state->agenda, pgf_item_prob_order, &item);
} else {
pgf_item_free(ps, item);
}
} else {
pgf_item_free(ps, item);
}
break;
}
case PGF_SYMBOL_SOFT_BIND:
@@ -2349,8 +2337,7 @@ pgf_parser_completions_next(GuEnum* self, void* to, GuPool* pool)
PGF_API GuEnum*
pgf_complete(PgfConcr* concr, PgfType* type, GuString sentence,
GuString prefix, bool prefix_bind,
GuExn *err, GuPool* pool)
GuString prefix, GuExn *err, GuPool* pool)
{
if (concr->sequences == NULL ||
concr->cnccats == NULL) {
@@ -2390,7 +2377,6 @@ pgf_complete(PgfConcr* concr, PgfType* type, GuString sentence,
// Now begin enumerating the completions
ps->en.next = pgf_parser_completions_next;
ps->prefix = prefix;
ps->prefix_bind = prefix_bind;
ps->tp = NULL;
return &ps->en;
}

View File

@@ -251,8 +251,7 @@ typedef struct {
PGF_API_DECL GuEnum*
pgf_complete(PgfConcr* concr, PgfType* type, GuString string,
GuString prefix, bool prefix_bind,
GuExn* err, GuPool* pool);
GuString prefix, GuExn* err, GuPool* pool);
typedef struct PgfLiteralCallback PgfLiteralCallback;

View File

@@ -1326,7 +1326,7 @@ pgf_read_concretes(PgfReader* rdr, PgfAbstr* abstr, bool with_content)
PGF_INTERNAL PgfPGF*
pgf_read_pgf(PgfReader* rdr) {
PgfPGF* pgf = gu_new(PgfPGF, rdr->opool);
pgf->major_version = gu_in_u16be(rdr->in, rdr->err);
gu_return_on_exn(rdr->err, NULL);
@@ -1335,7 +1335,7 @@ pgf_read_pgf(PgfReader* rdr) {
pgf->gflags = pgf_read_flags(rdr);
gu_return_on_exn(rdr->err, NULL);
pgf_read_abstract(rdr, &pgf->abstract);
gu_return_on_exn(rdr->err, NULL);

View File

@@ -1026,10 +1026,7 @@ complete lang (Type ctype _) sent pfx =
touchConcr lang
return []
else do
p_tok <- (#peek PgfTokenProb, tok) cmpEntry
tok <- if p_tok == nullPtr
then return "&+"
else peekUtf8CString p_tok
tok <- peekUtf8CString =<< (#peek PgfTokenProb, tok) cmpEntry
cat <- peekUtf8CString =<< (#peek PgfTokenProb, cat) cmpEntry
fun <- peekUtf8CString =<< (#peek PgfTokenProb, fun) cmpEntry
prob <- (#peek PgfTokenProb, prob) cmpEntry

View File

@@ -26,7 +26,7 @@ library
PGF2.Expr,
PGF2.Type
build-depends:
base >= 4.9.1 && < 4.16,
base >= 4.9.1 && < 4.15,
containers >= 0.5.7 && < 0.7,
pretty >= 1.1.3 && < 1.2
default-language: Haskell2010

View File

@@ -1,6 +1,4 @@
{-# LANGUAGE CPP, MagicHash #-}
-- This module makes profiling a lot slower, so don't add automatic cost centres
{-# OPTIONS_GHC -fno-prof-auto #-}
-- for unboxed shifts
-----------------------------------------------------------------------------

234
src/runtime/haskell/LPGF.hs Normal file
View File

@@ -0,0 +1,234 @@
{-# LANGUAGE LambdaCase #-}
{-# LANGUAGE OverloadedStrings #-}
{-# LANGUAGE ScopedTypeVariables #-}
-- | Linearisation-only portable grammar format.
--
-- LPGF is an output format from the GF compiler, intended as a smaller and faster alternative to PGF.
-- This API allows LPGF files to be used in Haskell programs.
--
-- The implementation closely follows the description in Section 2 of Angelov, Bringert, Ranta (2009):
-- "PGF: A Portable Run-Time Format for Type-Theoretical Grammars".
-- http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.640.6330&rep=rep1&type=pdf
module LPGF (
-- * LPGF
LPGF,
showLPGF,
readLPGF,
-- * Identifiers
CId,
mkCId,
showCId,
readCId,
-- * Abstract syntax
Abstract,
abstractName,
-- ** Categories
-- ** Functions
-- ** Expressions
Expr,
PGF.showExpr,
PGF.readExpr,
-- ** Types
-- ** Type checking
-- * Concrete syntax
Language,
PGF.showLanguage,
PGF.readLanguage,
languages,
Concrete,
LPGF.concretes,
-- ** Linearization
linearize,
linearizeText,
linearizeConcrete,
linearizeConcreteText
) where
import LPGF.Internal
import PGF (Language)
import PGF.CId
import PGF.Expr (Expr, Literal (..))
import PGF.Tree (Tree (..), expr2tree, prTree)
import qualified PGF
import Data.Binary (decodeFile)
import Data.Either (isLeft)
import qualified Data.IntMap as IntMap
import qualified Data.Map.Strict as Map
import Data.Text (Text)
import qualified Data.Text as T
import Numeric (showFFloat)
import Text.Printf (printf)
import Prelude hiding ((!!))
import qualified Prelude
-- | The abstract language name is the name of the top-level abstract module.
abstractName :: LPGF -> CId
abstractName = absname
-- | List of all languages available in the given grammar.
languages :: LPGF -> [Language]
languages = Map.keys . LPGF.Internal.concretes
-- | Map of all languages and their corresponding concrete syntaxes.
concretes :: LPGF -> Map.Map Language Concrete
concretes = LPGF.Internal.concretes
-- | Reads file in LPGF and produces 'LPGF' term.
-- The file is usually produced with:
--
-- > $ gf --make --output-format=lpgf <grammar file name>
readLPGF :: FilePath -> IO LPGF
readLPGF = Data.Binary.decodeFile
-- | Produce pretty-printed representation of an LPGF.
showLPGF :: LPGF -> String
showLPGF = render . pp
-- | Main linearize function, to 'String'
linearize :: LPGF -> Language -> Expr -> String
linearize lpgf lang expr = T.unpack $ linearizeText lpgf lang expr
-- | Main linearize function, to 'Data.Text.Text'
linearizeText :: LPGF -> Language -> Expr -> Text
linearizeText lpgf lang =
case Map.lookup lang (LPGF.Internal.concretes lpgf) of
Just concr -> linearizeConcreteText concr
Nothing -> error $ printf "Unknown language: %s" (showCId lang)
-- | Language-specific linearize function, to 'String'
linearizeConcrete :: Concrete -> Expr -> String
linearizeConcrete concr expr = T.unpack $ linearizeConcreteText concr expr
-- | Language-specific linearize function, to 'Data.Text.Text'
linearizeConcreteText :: Concrete -> Expr -> Text
linearizeConcreteText concr expr = lin2string $ lin (expr2tree expr)
where
lin :: Tree -> LinFun
lin = \case
Fun f as ->
case Map.lookup f (lins concr) of
Just t -> eval cxt t
where cxt = Context { cxToks = toks concr, cxArgs = map lin as }
_ -> Missing f
Lit l -> Tuple [Token (T.pack s)]
where
s = case l of
LStr s -> s
LInt i -> show i
LFlt f -> showFFloat (Just 6) f ""
x -> error $ printf "Cannot lin: %s" (prTree x)
-- | Evaluation context
data Context = Context {
cxArgs :: [LinFun], -- ^ sequence of argument terms
cxToks :: IntMap.IntMap Text -- ^ token map
}
-- | Operational semantics
eval :: Context -> LinFun -> LinFun
eval cxt t = case t of
Error err -> error err
Pre pts df -> Pre pts' df'
where
pts' = [(pfxs, eval cxt t) | (pfxs, t) <- pts]
df' = eval cxt df
Concat s t -> Concat v w
where
v = eval cxt s
w = eval cxt t
Tuple ts -> Tuple vs
where vs = map (eval cxt) ts
Projection t u ->
case (eval cxt t, eval cxt u) of
(Missing f, _) -> Missing f
(Tuple vs, Missing _) | not (null vs) -> vs !! 0 -- cannot know how deep to unpack; this gives best results with current testsuite
(_, Missing f) -> Missing f
(Tuple vs, Ix i) -> vs !! (i-1)
(t', tv@(Tuple _)) -> eval cxt $ foldl Projection t' (flattenTuple tv)
(t',u') -> error $ printf "Incompatible projection:\n- %s\n⇓ %s\n- %s\n⇓ %s" (show t) (show t') (show u) (show u')
Argument i -> cxArgs cxt !! (i-1)
PreIx pts df -> Pre pts' df'
where
pts' = [(pfxs, eval cxt t) | (ix, t) <- pts, let pfxs = maybe [] (read . T.unpack) $ IntMap.lookup ix (cxToks cxt)]
df' = eval cxt df
TokenIx i -> maybe Empty Token $ IntMap.lookup i (cxToks cxt)
_ -> t
flattenTuple :: LinFun -> [LinFun]
flattenTuple = \case
Tuple vs -> concatMap flattenTuple vs
lf -> [lf]
-- | Turn concrete syntax terms into an actual string.
-- This is done in two passes, first to flatten concats & evaluate pre's, then to
-- apply BIND and other predefs.
lin2string :: LinFun -> Text
lin2string lf = T.unwords $ join $ flatten [lf]
where
-- Process bind et al into final token list
join :: [Either LinFun Text] -> [Text]
join elt = case elt of
Right tok:Left Bind:ls ->
case join ls of
next:ls' -> tok `T.append` next : ls'
_ -> []
Right tok:ls -> tok : join ls
Left Space:ls -> join ls
Left Capit:ls ->
case join ls of
next:ls' -> T.toUpper (T.take 1 next) `T.append` T.drop 1 next : ls'
_ -> []
Left AllCapit:ls ->
case join ls of
next:ls' -> T.toUpper next : ls'
_ -> []
Left (Missing cid):ls -> join (Right (T.pack (printf "[%s]" (show cid))) : ls)
[] -> []
x -> error $ printf "Unhandled term in lin2string: %s" (show x)
-- Process concats, tuples, pre into flat list
flatten :: [LinFun] -> [Either LinFun Text]
flatten [] = []
flatten (l:ls) = case l of
Empty -> flatten ls
Token "" -> flatten ls
Token tok -> Right tok : flatten ls
Concat l1 l2 -> flatten (l1 : l2 : ls)
Tuple [l] -> flatten (l:ls)
Tuple (l:_) -> flatten (l:ls) -- unselected table, just choose first option (see e.g. FoodsJpn)
Pre pts df ->
let
f = flatten ls
ch = case dropWhile isLeft f of
Right next:_ ->
let matches = [ l | (pfxs, l) <- pts, any (`T.isPrefixOf` next) pfxs ]
in if null matches then df else head matches
_ -> df
in flatten (ch:ls)
x -> Left x : flatten ls
-- | List indexing with more verbose error messages
(!!) :: (Show a) => [a] -> Int -> a
(!!) xs i
| i < 0 = error $ printf "!!: index %d too small for list: %s" i (show xs)
| i > length xs - 1 = error $ printf "!!: index %d too large for list: %s" i (show xs)
| otherwise = xs Prelude.!! i
isIx :: LinFun -> Bool
isIx (Ix _) = True
isIx _ = False
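The `Tuple`/`Projection`/`Ix` cases of `eval` above implement the record-selection semantics of the PGF paper: a projection of a tuple by a parameter value picks a component by 1-based index. A stripped-down, self-contained sketch of just those cases, on a hypothetical `Mini` type rather than the real `LinFun` (and without the `Missing`/flattening fallbacks):

```haskell
-- Minimal subset of linearisation terms covering tuple projection.
data Mini
  = MTok String
  | MIx Int             -- parameter value, used as a 1-based tuple index
  | MTuple [Mini]
  | MProj Mini Mini     -- select a component of a tuple
  deriving (Eq, Show)

-- Evaluate projections away; all other forms are already values.
evalM :: Mini -> Mini
evalM (MProj t u) =
  case (evalM t, evalM u) of
    (MTuple vs, MIx i) -> vs !! (i - 1)   -- 1-based, as in the paper
    (t', u')           -> error ("bad projection: " ++ show (t', u'))
evalM (MTuple ts) = MTuple (map evalM ts)
evalM t = t
```

For example, projecting the plural component: `evalM (MProj (MTuple [MTok "sg", MTok "pl"]) (MIx 2))` reduces to `MTok "pl"`.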

View File

@@ -0,0 +1,227 @@
{-# LANGUAGE LambdaCase #-}
module LPGF.Internal where
import PGF.CId
import PGF ()
import Control.Monad (liftM, liftM2, forM_)
import qualified Control.Monad.Writer as CMW
import Data.Binary (Binary, put, get, putWord8, getWord8, encodeFile)
import qualified Data.IntMap as IntMap
import qualified Data.Map.Strict as Map
import Data.Text (Text)
import qualified Data.Text as T
import qualified Data.Text.Encoding as TE
-- | Linearisation-only PGF
data LPGF = LPGF {
absname :: CId,
abstract :: Abstract,
concretes :: Map.Map CId Concrete
} deriving (Show)
-- | Abstract syntax (currently empty)
data Abstract = Abstract {
} deriving (Show)
-- | Concrete syntax
data Concrete = Concrete {
toks :: IntMap.IntMap Text, -- ^ all strings are stored exactly once here
-- lincats :: Map.Map CId LinType, -- ^ a linearization type for each category
lins :: Map.Map CId LinFun -- ^ a linearization function for each function
} deriving (Show)
-- | Abstract function type
-- data Type = Type [CId] CId
-- deriving (Show)
-- -- | Linearisation type
-- data LinType =
-- StrType
-- | IxType Int
-- | ProductType [LinType]
-- deriving (Show)
-- | Linearisation function
data LinFun =
-- Additions
Error String -- ^ a runtime error, should probably not be supported at all
| Bind -- ^ join adjacent tokens
| Space -- ^ space between adjacent tokens
| Capit -- ^ capitalise next character
| AllCapit -- ^ capitalise next word
| Pre [([Text], LinFun)] LinFun
| Missing CId -- ^ missing definition (inserted at runtime)
-- From original definition in paper
| Empty
| Token Text
| Concat LinFun LinFun
| Ix Int
| Tuple [LinFun]
| Projection LinFun LinFun
| Argument Int
-- For reducing LPGF file when stored
| PreIx [(Int, LinFun)] LinFun -- ^ index into `toks` map (stored text must be 'read' back into a prefix list)
| TokenIx Int -- ^ index into `toks` map
deriving (Show, Read)
instance Binary LPGF where
put lpgf = do
put (absname lpgf)
put (abstract lpgf)
put (concretes lpgf)
get = do
an <- get
abs <- get
concs <- get
return $ LPGF {
absname = an,
abstract = abs,
concretes = concs
}
instance Binary Abstract where
put abs = return ()
get = return $ Abstract {}
instance Binary Concrete where
put concr = do
put (toks concr)
put (lins concr)
get = do
ts <- get
ls <- get
return $ Concrete {
toks = ts,
lins = ls
}
instance Binary LinFun where
put = \case
Error e -> putWord8 0 >> put e
Bind -> putWord8 1
Space -> putWord8 2
Capit -> putWord8 3
AllCapit -> putWord8 4
Pre ps d -> putWord8 5 >> put (ps,d)
Missing f -> putWord8 13 >> put f
Empty -> putWord8 6
Token t -> putWord8 7 >> put t
Concat l1 l2 -> putWord8 8 >> put (l1,l2)
Ix i -> putWord8 9 >> put i
Tuple ls -> putWord8 10 >> put ls
Projection l1 l2 -> putWord8 11 >> put (l1,l2)
Argument i -> putWord8 12 >> put i
PreIx ps d -> putWord8 15 >> put (ps,d)
TokenIx i -> putWord8 14 >> put i
get = do
tag <- getWord8
case tag of
0 -> liftM Error get
1 -> return Bind
2 -> return Space
3 -> return Capit
4 -> return AllCapit
5 -> liftM2 Pre get get
13 -> liftM Missing get
6 -> return Empty
7 -> liftM Token get
8 -> liftM2 Concat get get
9 -> liftM Ix get
10 -> liftM Tuple get
11 -> liftM2 Projection get get
12 -> liftM Argument get
15 -> liftM2 PreIx get get
14 -> liftM TokenIx get
_ -> fail "Failed to decode LPGF binary format"
instance Binary Text where
put = put . TE.encodeUtf8
get = liftM TE.decodeUtf8 get
encodeFile :: FilePath -> LPGF -> IO ()
encodeFile = Data.Binary.encodeFile
------------------------------------------------------------------------------
-- Utilities
-- | Helper for building concat trees
mkConcat :: [LinFun] -> LinFun
mkConcat [] = Empty
mkConcat [x] = x
mkConcat xs = foldl1 Concat xs
-- | Helper for unfolding concat trees
unConcat :: LinFun -> [LinFun]
unConcat (Concat l1 l2) = concatMap unConcat [l1, l2]
unConcat lf = [lf]
------------------------------------------------------------------------------
-- Pretty-printing
type Doc = CMW.Writer [String] ()
render :: Doc -> String
render = unlines . CMW.execWriter
class PP a where
pp :: a -> Doc
instance PP LPGF where
pp (LPGF _ _ cncs) = mapM_ pp cncs
instance PP Concrete where
pp (Concrete toks lins) = do
forM_ (IntMap.toList toks) $ \(i,tok) ->
CMW.tell [show i ++ " " ++ T.unpack tok]
CMW.tell [""]
forM_ (Map.toList lins) $ \(cid,lin) -> do
CMW.tell ["# " ++ showCId cid]
pp lin
CMW.tell [""]
instance PP LinFun where
pp = pp' 0
where
pp' n = \case
Pre ps d -> do
p "Pre"
CMW.tell [ replicate (2*(n+1)) ' ' ++ show p | p <- ps ]
pp' (n+1) d
c@(Concat l1 l2) -> do
let ts = unConcat c
if any isDeep ts
then do
p "Concat"
mapM_ (pp' (n+1)) ts
else
p $ "Concat " ++ show ts
Tuple ls | any isDeep ls -> do
p "Tuple"
mapM_ (pp' (n+1)) ls
Projection l1 l2 | isDeep l1 || isDeep l2 -> do
p "Projection"
pp' (n+1) l1
pp' (n+1) l2
t -> p $ show t
where
p :: String -> Doc
p t = CMW.tell [ replicate (2*n) ' ' ++ t ]
isDeep = not . isTerm
isTerm = \case
Pre _ _ -> False
Concat _ _ -> False
Tuple _ -> False
Projection _ _ -> False
_ -> True
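The `Binary LinFun` instance above uses one tag byte per constructor, written with `putWord8` and dispatched on with `getWord8`. The same pattern on a toy sum type (hypothetical type `T`, not part of LPGF), round-tripped through `Data.Binary.encode`/`decode`:

```haskell
{-# LANGUAGE LambdaCase #-}
import Data.Binary (Binary (..), putWord8, getWord8, encode, decode)

-- Toy sum type serialised with one tag byte per constructor,
-- mirroring the Binary LinFun instance.
data T = A | B Int | C String T
  deriving (Eq, Show)

instance Binary T where
  put = \case
    A     -> putWord8 0
    B i   -> putWord8 1 >> put i
    C s t -> putWord8 2 >> put (s, t)
  get = getWord8 >>= \case
    0 -> pure A
    1 -> B <$> get
    2 -> uncurry C <$> get
    _ -> fail "unknown tag"
```

Keeping the tag values stable is what makes the on-disk LPGF format forward-readable; reordering constructors without renumbering tags would silently corrupt decoding.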

View File

@@ -15,7 +15,7 @@ library
default-language: Haskell2010
build-depends:
array >= 0.5.1 && < 0.6,
base >= 4.9.1 && < 4.16,
base >= 4.9.1 && < 4.15,
bytestring >= 0.10.8 && < 0.11,
containers >= 0.5.7 && < 0.7,
ghc-prim >= 0.5.0 && < 0.7,

View File

@@ -1 +0,0 @@
.libs/

View File

@@ -0,0 +1,4 @@
# Deprecation notice
As of June 2019, this JavaScript version of the GF runtime is considered deprecated,
in favour of the TypeScript version in <https://github.com/GrammaticalFramework/gf-typescript>.

View File

@@ -1,48 +0,0 @@
FROM emscripten/emsdk:latest
RUN apt update
RUN apt install -y autoconf automake libtool make
WORKDIR /tmp/c
COPY gu/*.c gu/*.h /tmp/c/gu/
COPY pgf/*.c pgf/*.h /tmp/c/pgf/
COPY pgf/lightning/i386/*.h /tmp/c/pgf/lightning/i386/
COPY pgf/lightning/*.h /tmp/c/pgf/lightning/
COPY \
Makefile.am \
configure.ac \
lib*.pc.in \
/tmp/c/
RUN autoreconf -i
RUN emconfigure ./configure
RUN emmake make
RUN emcc .libs/libgu.a .libs/libpgf.a -o pgf.js \
-sALLOW_MEMORY_GROWTH \
-sEXPORTED_FUNCTIONS="\
_pgf_read,\
_pgf_abstract_name,\
_pgf_read_expr,\
_pgf_print_expr,\
_pgf_expr_arity,\
_gu_new_pool,\
_gu_new_exn,\
_gu_data_in,\
_gu_exn_is_raised,\
_gu_exn_caught_,\
_gu_exn_caught_data,\
_gu_exn_clear,\
_gu_new_string_buf,\
_gu_string_buf_out,\
_gu_string_buf_data,\
_malloc,\
_free\
"\
-sEXPORTED_RUNTIME_METHODS="\
ccall,\
FS,\
getValue,\
AsciiToString,\
stringToUTF8,\
UTF8ToString,\
allocateUTF8\
"

View File

@@ -1,11 +0,0 @@
# JavaScript runtime using Web Assembly
This folder contains very early work experimenting with a pure JavaScript runtime,
compiled to Web Assembly (WASM) using [Emscripten](https://emscripten.org/).
1. Compile the WASM files (inside Docker) using `build-wasm.sh`, placing them in `.libs/`
2. Test in Node.js by running `node test-node.js [path to PGF]`
3. Test in a web browser
1. Start a server with `npx serve -l 41296`
2. Browse to `http://localhost:41296/test-web.html`
3. Check JavaScript console

View File

@@ -1,10 +0,0 @@
#! /usr/bin/env bash
set -e
# Build inside Docker image
IMAGE="gf/build-c-runtime-wasm"
docker build ../c --file Dockerfile --tag $IMAGE
# Copy built files from container to host
mkdir -p .libs
docker run --rm --volume "$PWD":/tmp/host $IMAGE bash -c "cp pgf.js pgf.wasm /tmp/host/.libs/"

View File

@@ -0,0 +1,62 @@
abstract Editor = {
cat Adjective ;
Noun ;
Verb ;
Determiner ;
Sentence ;
fun Available : Adjective ;
Next : Adjective ;
Previous : Adjective ;
fun Bulgarian : Noun ;
Danish : Noun ;
English : Noun ;
Finnish : Noun ;
French : Noun ;
German : Noun ;
Italian : Noun ;
Norwegian : Noun ;
Russian : Noun ;
Spanish : Noun ;
Swedish : Noun ;
fun Float_N : Noun ;
Integer_N : Noun ;
String_N : Noun ;
Language : Noun ;
Node : Noun ;
Page : Noun ;
Refinement : Noun ;
Tree : Noun ;
Wrapper : Noun ;
fun Copy : Verb ;
Cut : Verb ;
Delete : Verb ;
Enter : Verb ;
Parse : Verb ;
Paste : Verb ;
Redo : Verb ;
Refine : Verb ;
Replace : Verb ;
Select : Verb ;
Show : Verb ;
Undo : Verb ;
Wrap : Verb ;
fun DefPlDet : Determiner ;
DefSgDet : Determiner ;
IndefPlDet : Determiner ;
IndefSgDet : Determiner ;
fun Command : Verb -> Determiner -> Noun -> Sentence ;
CommandAdj : Verb -> Determiner -> Adjective -> Noun -> Sentence ;
ErrorMessage : Adjective -> Noun -> Sentence ;
Label : Noun -> Sentence ;
RandomlyCommand : Verb -> Determiner -> Noun -> Sentence ;
SingleWordCommand : Verb -> Sentence ;
}

View File

@@ -0,0 +1,63 @@
--# -path=alltenses
concrete EditorEng of Editor = open GrammarEng, ParadigmsEng in {
lincat Adjective = A ;
Noun = N ;
Verb = V ;
Determiner = Det ;
Sentence = Utt ;
lin Available = mkA "available" ;
Next = mkA "next" ;
Previous = mkA "previous" ;
lin Bulgarian = mkN "Bulgarian" ;
Danish = mkN "Danish" ;
English = mkN "English" ;
Finnish = mkN "Finnish" ;
French = mkN "French" ;
German = mkN "German" ;
Italian = mkN "Italian" ;
Norwegian = mkN "Norwegian" ;
Russian = mkN "Russian" ;
Spanish = mkN "Spanish" ;
Swedish = mkN "Swedish" ;
lin Float_N = mkN "float" ;
Integer_N = mkN "integer" ;
String_N = mkN "string" ;
Language = mkN "language" ;
Node = mkN "node" ;
Page = mkN "page" ;
Refinement = mkN "refinement" ;
Tree = mkN "tree" ;
Wrapper = mkN "wrapper" ;
lin Copy = mkV "copy" ;
Cut = mkV "cut" ;
Delete = mkV "delete" ;
Enter = mkV "enter" ;
Parse = mkV "parse" ;
Paste = mkV "paste" ;
Redo = mkV "redo" ;
Refine = mkV "refine" ;
Replace = mkV "replace" ;
Select = mkV "select" ;
Show = mkV "show" ;
Undo = mkV "undo" ;
Wrap = mkV "wrap" ;
lin DefPlDet = DetQuant DefArt NumPl ;
DefSgDet = DetQuant DefArt NumSg ;
IndefPlDet = DetQuant IndefArt NumPl ;
IndefSgDet = DetQuant IndefArt NumSg ;
lin Command v d n = UttImpSg PPos (ImpVP (ComplSlash (SlashV2a (mkV2 v)) (DetCN d (UseN n)))) ;
CommandAdj v d a n = UttImpSg PPos (ImpVP (ComplSlash (SlashV2a (mkV2 v)) (DetCN d (AdjCN (PositA a) (UseN n))))) ;
ErrorMessage a n = UttNP (DetCN (DetQuant no_Quant NumPl) (AdjCN (PositA a) (UseN n))) ;
Label n = UttNP (MassNP (UseN n)) ;
RandomlyCommand v d n = UttImpSg PPos (ImpVP (AdvVP (ComplSlash (SlashV2a (mkV2 v)) (DetCN d (UseN n))) (PrepNP (mkPrep "at") (MassNP (UseN (mkN "random")))))) ;
SingleWordCommand v = UttImpSg PPos (ImpVP (UseV v)) ;
}

View File

@@ -0,0 +1,17 @@
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
<html xmlns="http://www.w3.org/1999/xhtml" xml:lang="en" lang="en">
<head>
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8" />
<link rel="stylesheet" type="text/css" href="style.css" />
<script type="text/javascript" src="gflib.js"></script>
<script type="text/javascript" src="editorGrammar.js"></script>
<script type="text/javascript" src="grammar.js"></script>
<script type="text/javascript" src="gfjseditor.js"></script>
<title>Web-based Syntax Editor</title>
</head>
<body onload="mkEditor('editor', Foods)" onkeydown="return hotKeys(event)">
<div id="editor">
</div>
</body>
</html>

File diff suppressed because one or more lines are too long

Binary file not shown.


File diff suppressed because it is too large Load Diff

View File

@@ -0,0 +1,54 @@
/* Output */
function sayText(text) {
document.voice_output_text = text;
activateForm("voice_output");
}
/* XHTML+Voice Utilities */
function activateForm(formid) {
var form = document.getElementById(formid);
var e = document.createEvent("UIEvents");
e.initEvent("DOMActivate","true","true");
form.dispatchEvent(e);
}
/* DOM utilities */
/* Gets the head element of the document. */
function getHeadElement() {
var hs = document.getElementsByTagName("head");
if (hs.length == 0) {
var head = document.createElement("head");
document.documentElement.insertBefore(head, document.documentElement.firstChild);
return head;
} else {
return hs[0];
}
}
/* Gets the body element of the document. */
function getBodyElement() {
var bs = document.getElementsByTagName("body");
if (bs.length == 0) {
var body = document.createElement("body");
document.documentElement.appendChild(body);
return body;
} else {
return bs[0];
}
}
/* Removes all the children of a node */
function removeChildren(node) {
while (node.hasChildNodes()) {
node.removeChild(node.firstChild);
}
}
function setText(node, text) {
removeChildren(node);
node.appendChild(document.createTextNode(text));
}

File diff suppressed because it is too large Load Diff

File diff suppressed because one or more lines are too long

View File

@@ -1,543 +0,0 @@
/**
* This module is the high-level JavaScript wrapper around the WASM-compiled version.
*/
async function mkAPI() {
const sizeof_GuMapItor = 4;
const offsetof_GuMapItor_fn = 0;
var asm = null;
var wasmTable = null;
var freeTableIndexes = [];
function setErrNo(value) {
HEAP32[asm.__errno_location() >> 2] = value;
return value;
}
function abortOnCannotGrowMemory(requestedSize) {
abort('Cannot enlarge memory arrays to size ' + requestedSize + ' bytes (OOM). Either (1) compile with -s INITIAL_MEMORY=X with X higher than the current value ' + HEAP8.length + ', (2) compile with -s ALLOW_MEMORY_GROWTH=1 which allows increasing the size at runtime, or (3) if you want malloc to return NULL (0) instead of this abort, compile with -s ABORTING_MALLOC=0 ');
}
function _emscripten_resize_heap(requestedSize) {
var oldSize = HEAPU8.length;
requestedSize = requestedSize >>> 0;
abortOnCannotGrowMemory(requestedSize);
}
var tempRet0 = 0;
var urlData = {};
var fdData = {};
var fdMax = 0;
var asmLibraryArg = {
"__syscall_fcntl64":
function (fd, cmd, varargs) {
setErrNo(134);
return -1;
},
"__syscall_ioctl":
function (fd, op, varargs) {
setErrNo(134);
return -1;
},
"__syscall_open":
function (pathPtr, flags, varargs) {
const path = UTF8ToString(pathPtr);
const data = urlData[path];
if (data == null) {
setErrNo(129);
return -1;
}
fdMax++;
fdData[fdMax] = {data: data, pos: 0};
delete urlData[path];
return fdMax;
},
"_munmap_js":
function (addr, len, prot, flags, fd, offset) {
setErrNo(134);
return -1;
},
"abort":
function () {
console.log('native code called abort()');
},
"emscripten_memcpy_big":
function (dest, src, num) {
HEAPU8.copyWithin(dest, src, src + num);
},
"emscripten_resize_heap":
function _emscripten_resize_heap(requestedSize) {
var oldSize = HEAPU8.length;
requestedSize = requestedSize >>> 0;
abortOnCannotGrowMemory(requestedSize);
},
"fd_close":
function (fd) {
delete fdData[fd];
return 0;
},
"fd_read":
function (fd, iov, iovcnt, pnum) {
const info = fdData[fd];
if (info == null) {
setErrNo(121);
return -1;
}
let num = 0;
for (let i = 0; i < iovcnt; i++) {
const ptr = HEAP32[(((iov)+(i*8))>>2)];
const len = HEAP32[(((iov)+(i*8 + 4))>>2)];
let cnt = 0;
while (cnt < len && info.pos < info.data.length) {
HEAP8[ptr+cnt] = info.data[info.pos];
info.pos++
cnt++;
}
num += cnt;
if (cnt < len) break; // nothing more to read
}
HEAP32[((pnum)>>2)] = num;
return 0;
},
"fd_seek":
function (fd, offset_low, offset_high, whence, newOffset) {
setErrNo(134);
return -1;
},
"fd_write":
function _fd_write(fd, iov, iovcnt, pnum) {
setErrNo(134);
return -1;
},
"setTempRet0":
function (value) {
tempRet0 = value;
},
"__assert_fail":
function (condition, filename, line, func) {
abort('Assertion failed: ' + UTF8ToString(condition) + ', at: ' + [filename ? UTF8ToString(filename) : 'unknown filename', line, func ? UTF8ToString(func) : 'unknown function']);
}
};
// Wraps a JS function as a wasm function with a given signature.
function convertJsFunctionToWasm(func, sig) {
// If the type reflection proposal is available, use the new
// "WebAssembly.Function" constructor.
// Otherwise, construct a minimal wasm module importing the JS function and
// re-exporting it.
if (typeof WebAssembly.Function == "function") {
var typeNames = {
'i': 'i32',
'j': 'i64',
'f': 'f32',
'd': 'f64'
};
var type = {
parameters: [],
results: sig[0] == 'v' ? [] : [typeNames[sig[0]]]
};
for (var i = 1; i < sig.length; ++i) {
type.parameters.push(typeNames[sig[i]]);
}
return new WebAssembly.Function(type, func);
}
// The module is static, with the exception of the type section, which is
// generated based on the signature passed in.
var typeSection = [
0x01, // id: section,
0x00, // length: 0 (placeholder)
0x01, // count: 1
0x60, // form: func
];
var sigRet = sig.slice(0, 1);
var sigParam = sig.slice(1);
var typeCodes = {
'i': 0x7f, // i32
'j': 0x7e, // i64
'f': 0x7d, // f32
'd': 0x7c, // f64
};
// Parameters, length + signatures
typeSection.push(sigParam.length);
for (var i = 0; i < sigParam.length; ++i) {
typeSection.push(typeCodes[sigParam[i]]);
}
// Return values, length + signatures
// With no multi-return in MVP, either 0 (void) or 1 (anything else)
if (sigRet == 'v') {
typeSection.push(0x00);
} else {
typeSection = typeSection.concat([0x01, typeCodes[sigRet]]);
}
// Write the overall length of the type section back into the section header
// (excepting the 2 bytes for the section id and length)
typeSection[1] = typeSection.length - 2;
// Rest of the module is static
var bytes = new Uint8Array([
0x00, 0x61, 0x73, 0x6d, // magic ("\0asm")
0x01, 0x00, 0x00, 0x00, // version: 1
].concat(typeSection, [
0x02, 0x07, // import section
// (import "e" "f" (func 0 (type 0)))
0x01, 0x01, 0x65, 0x01, 0x66, 0x00, 0x00,
0x07, 0x05, // export section
// (export "f" (func 0 (type 0)))
0x01, 0x01, 0x66, 0x00, 0x00,
]));
// We can compile this wasm module synchronously because it is very small.
// This accepts an import (at "e.f"), that it reroutes to an export (at "f")
var module = new WebAssembly.Module(bytes);
var instance = new WebAssembly.Instance(module, {
'e': {'f': func}
});
var wrappedFunc = instance.exports['f'];
return wrappedFunc;
}
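The comments above describe the fallback path: compile a tiny one-function module whose type section is generated from `sig`, import the JS function at `"e.f"`, and re-export it at `"f"`. A standalone sketch of that trick for signature `"iii"` (i32 add(i32, i32)), with the byte layout copied from the section encoding above — illustrative only, not part of the runtime:

```javascript
// Minimal wasm module: import a JS function at "e.f", re-export it at "f".
const bytes = new Uint8Array([
  0x00, 0x61, 0x73, 0x6d,                    // magic ("\0asm")
  0x01, 0x00, 0x00, 0x00,                    // version: 1
  0x01, 0x07,                                // type section, 7 bytes
  0x01, 0x60, 0x02, 0x7f, 0x7f,              // 1 type: func (i32, i32)
  0x01, 0x7f,                                //   -> i32
  0x02, 0x07,                                // import section, 7 bytes
  0x01, 0x01, 0x65, 0x01, 0x66, 0x00, 0x00,  // (import "e" "f" (func (type 0)))
  0x07, 0x05,                                // export section, 5 bytes
  0x01, 0x01, 0x66, 0x00, 0x00,              // (export "f" (func 0))
]);
const add = (a, b) => a + b;
// Synchronous compile is fine here: the module is a few dozen bytes.
const instance = new WebAssembly.Instance(new WebAssembly.Module(bytes), {
  e: { f: add },
});
const wrapped = instance.exports.f;          // a true wasm function wrapping `add`
console.log(wrapped(2, 3));                  // 5
```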
function addFunction(func, sig) {
func = convertJsFunctionToWasm(func, sig);
let index;
// Reuse a free index if there is one, otherwise grow.
if (freeTableIndexes.length) {
index = freeTableIndexes.pop();
} else {
// Grow the table
try {
wasmTable.grow(1);
} catch (err) {
if (!(err instanceof RangeError)) {
throw err;
}
throw 'Unable to grow wasm table. Set ALLOW_TABLE_GROWTH.';
}
index = wasmTable.length - 1;
}
wasmTable.set(index, func);
return index;
}
function removeFunction(index) {
freeTableIndexes.push(index);
}
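`addFunction`/`removeFunction` manage table slots with a free list: indexes released by `removeFunction` are recycled before the table is grown again. A hypothetical miniature of that index-recycling pattern, with a plain array standing in for `wasmTable`:

```javascript
// Free-list index allocation: pop a recycled index if one exists,
// otherwise append at the end (the "grow" case).
const table = [];
const freeIndexes = [];
function alloc(fn) {
  const index = freeIndexes.length ? freeIndexes.pop() : table.length;
  table[index] = fn;
  return index;
}
function free(index) {
  table[index] = undefined;
  freeIndexes.push(index);
}
const a = alloc(() => 1); // 0: table grows
free(a);                  // index 0 goes onto the free list
const b = alloc(() => 2); // 0 again: the slot is reused, no growth
```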
const response = await fetch("pgf.wasm", { credentials: 'same-origin' });
const info = {
'env': asmLibraryArg,
'wasi_snapshot_preview1': asmLibraryArg,
};
// Suppress closure warning here since the upstream definition for
// instantiateStreaming only allows Promise<Response> rather than
// an actual Response.
// TODO(https://github.com/google/closure-compiler/pull/3913): Remove if/when upstream closure is fixed.
/** @suppress {checkTypes} */
const result = await WebAssembly.instantiateStreaming(response, info);
asm = result["instance"].exports;
wasmTable = asm['__indirect_function_table'];
const buf = asm['memory'].buffer;
const HEAP8 = new Int8Array(buf);
const HEAP16 = new Int16Array(buf);
const HEAP32 = new Int32Array(buf);
const HEAPU8 = new Uint8Array(buf);
const HEAPU16 = new Uint16Array(buf);
const HEAPU32 = new Uint32Array(buf);
const HEAPF32 = new Float32Array(buf);
const HEAPF64 = new Float64Array(buf);
// Returns the number of bytes the given JavaScript string takes if encoded as a UTF-8 byte array, EXCLUDING the null terminator byte.
function lengthBytesUTF8(str) {
var len = 0;
for (var i = 0; i < str.length; ++i) {
// Gotcha: charCodeAt returns a 16-bit word that is a UTF-16 encoded code unit, not a Unicode code point of the character! So decode UTF16->UTF32->UTF8.
// See http://unicode.org/faq/utf_bom.html#utf16-3
var u = str.charCodeAt(i); // possibly a lead surrogate
if (u >= 0xD800 && u <= 0xDFFF) u = 0x10000 + ((u & 0x3FF) << 10) | (str.charCodeAt(++i) & 0x3FF);
if (u <= 0x7F) ++len;
else if (u <= 0x7FF) len += 2;
else if (u <= 0xFFFF) len += 3;
else len += 4;
}
return len;
}
function stringToUTF8Array(str, heap, outIdx, maxBytesToWrite) {
if (!(maxBytesToWrite > 0)) // Parameter maxBytesToWrite is not optional. Negative values, 0, null, undefined and false each don't write out any bytes.
return 0;
var startIdx = outIdx;
var endIdx = outIdx + maxBytesToWrite - 1; // -1 for string null terminator.
for (var i = 0; i < str.length; ++i) {
// Gotcha: charCodeAt returns a 16-bit word that is a UTF-16 encoded code unit, not a Unicode code point of the character! So decode UTF16->UTF32->UTF8.
// See http://unicode.org/faq/utf_bom.html#utf16-3
// For UTF8 byte structure, see http://en.wikipedia.org/wiki/UTF-8#Description and https://www.ietf.org/rfc/rfc2279.txt and https://tools.ietf.org/html/rfc3629
var u = str.charCodeAt(i); // possibly a lead surrogate
if (u >= 0xD800 && u <= 0xDFFF) {
var u1 = str.charCodeAt(++i);
u = 0x10000 + ((u & 0x3FF) << 10) | (u1 & 0x3FF);
}
if (u <= 0x7F) {
if (outIdx >= endIdx) break;
heap[outIdx++] = u;
} else if (u <= 0x7FF) {
if (outIdx + 1 >= endIdx) break;
heap[outIdx++] = 0xC0 | (u >> 6);
heap[outIdx++] = 0x80 | (u & 63);
} else if (u <= 0xFFFF) {
if (outIdx + 2 >= endIdx) break;
heap[outIdx++] = 0xE0 | (u >> 12);
heap[outIdx++] = 0x80 | ((u >> 6) & 63);
heap[outIdx++] = 0x80 | (u & 63);
} else {
if (outIdx + 3 >= endIdx) break;
if (u > 0x10FFFF) warnOnce('Invalid Unicode code point 0x' + u.toString(16) + ' encountered when serializing a JS string to a UTF-8 string in wasm memory! (Valid unicode code points should be in range 0-0x10FFFF).');
heap[outIdx++] = 0xF0 | (u >> 18);
heap[outIdx++] = 0x80 | ((u >> 12) & 63);
heap[outIdx++] = 0x80 | ((u >> 6) & 63);
heap[outIdx++] = 0x80 | (u & 63);
}
}
// Null-terminate the pointer to the buffer.
heap[outIdx] = 0;
return outIdx - startIdx;
}
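The three-byte branch above can be spot-checked against `TextEncoder` for a single code point (U+20AC, the euro sign) — a small standalone verification, not part of the runtime:

```javascript
// Recompute the 3-byte UTF-8 encoding of U+20AC with the same bit
// operations as the u <= 0xFFFF branch of stringToUTF8Array above.
const u = '€'.codePointAt(0); // 0x20AC
const utf8Bytes = [
  0xE0 | (u >> 12),
  0x80 | ((u >> 6) & 63),
  0x80 | (u & 63),
];
console.log(utf8Bytes.map(b => b.toString(16))); // [ 'e2', '82', 'ac' ]
```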
function allocateUTF8(pool,str) {
var size = lengthBytesUTF8(str) + 1;
var ptr = asm.gu_malloc(pool,size);
if (ptr) stringToUTF8Array(str, HEAP8, ptr, size);
return ptr;
}
const UTF8Decoder = typeof TextDecoder != 'undefined' ? new TextDecoder('utf8') : undefined;
/**
* @param {number} idx
* @param {number=} maxBytesToRead
* @return {string}
*/
function UTF8ArrayToString(heap, idx, maxBytesToRead) {
var endIdx = idx + maxBytesToRead;
var endPtr = idx;
// TextDecoder needs to know the byte length in advance, it doesn't stop on null terminator by itself.
// Also, use the length info to avoid running tiny strings through TextDecoder, since .subarray() allocates garbage.
// (As a tiny code-size trick, compare endPtr against endIdx using a negation, so that undefined means Infinity)
while (heap[endPtr] && !(endPtr >= endIdx)) ++endPtr;
if (endPtr - idx > 16 && heap.subarray && UTF8Decoder) {
return UTF8Decoder.decode(heap.subarray(idx, endPtr));
} else {
var str = '';
// If building with TextDecoder, we have already computed the string length above, so test loop end condition against that
while (idx < endPtr) {
// For UTF8 byte structure, see:
// http://en.wikipedia.org/wiki/UTF-8#Description
// https://www.ietf.org/rfc/rfc2279.txt
// https://tools.ietf.org/html/rfc3629
var u0 = heap[idx++];
if (!(u0 & 0x80)) { str += String.fromCharCode(u0); continue; }
var u1 = heap[idx++] & 63;
if ((u0 & 0xE0) == 0xC0) { str += String.fromCharCode(((u0 & 31) << 6) | u1); continue; }
var u2 = heap[idx++] & 63;
if ((u0 & 0xF0) == 0xE0) {
u0 = ((u0 & 15) << 12) | (u1 << 6) | u2;
} else {
if ((u0 & 0xF8) != 0xF0) warnOnce('Invalid UTF-8 leading byte 0x' + u0.toString(16) + ' encountered when deserializing a UTF-8 string in wasm memory to a JS string!');
u0 = ((u0 & 7) << 18) | (u1 << 12) | (u2 << 6) | (heap[idx++] & 63);
}
if (u0 < 0x10000) {
str += String.fromCharCode(u0);
} else {
var ch = u0 - 0x10000;
str += String.fromCharCode(0xD800 | (ch >> 10), 0xDC00 | (ch & 0x3FF));
}
}
}
return str;
}
function UTF8ToString(ptr, maxBytesToRead) {
return ptr ? UTF8ArrayToString(HEAPU8, ptr, maxBytesToRead) : '';
}
const GuErrnoStrPtr = asm.malloc(8);
stringToUTF8Array("GuErrno", HEAP8, GuErrnoStrPtr, 8);
const PgfExnStrPtr = asm.malloc(8);
stringToUTF8Array("PgfExn", HEAP8, PgfExnStrPtr, 8);
function pgfError(err) {
if (asm.gu_exn_caught_(err, GuErrnoStrPtr)) {
const errDataPtr = asm.gu_exn_caught_data(err);
return new Error("errno="+HEAP32[errDataPtr >> 2]);
} else if (asm.gu_exn_caught_(err, PgfExnStrPtr)) {
const msgPtr = asm.gu_exn_caught_data(err);
return new Error(UTF8ToString(msgPtr));
}
return new Error();
}
const registry = new FinalizationRegistry((pool) => {
asm.gu_pool_free(pool);
});
function PGF(pgfPtr,name,pool) {
this.pgfPtr = pgfPtr;
this.abstractName = name;
this.pool = pool;
this.languages = {};
registry.register(this,pool);
}
function Concr(pgf,concrPtr,name) {
this.pgf = pgf;
this.name = name;
this.concrPtr = concrPtr;
}
Concr.prototype.linearize = function(expr) {
const tmp_pool = asm.gu_new_pool();
const err = asm.gu_new_exn(tmp_pool);
const sb = asm.gu_new_string_buf(tmp_pool);
const out = asm.gu_string_buf_out(sb);
asm.pgf_linearize(this.concrPtr, expr.exprPtr, out, err);
if (asm.gu_exn_is_raised(err)) {
const e = pgfError(err);
asm.gu_pool_free(tmp_pool);
throw e;
}
const strPtr = asm.gu_string_buf_data(sb);
const len = asm.gu_string_buf_length(sb);
const str = UTF8ToString(strPtr,len);
asm.gu_pool_free(tmp_pool);
return str;
}
async function readPGF(pgfURL) {
const response = await fetch(pgfURL);
urlData[pgfURL] = new Int8Array(await response.arrayBuffer());
const pool = asm.gu_new_pool();
const tmp_pool = asm.gu_new_pool();
const err = asm.gu_new_exn(tmp_pool);
const strPtr = allocateUTF8(tmp_pool,pgfURL);
const pgfPtr = asm.pgf_read(strPtr,pool,err);
if (asm.gu_exn_is_raised(err)) {
const e = pgfError(err);
asm.gu_pool_free(tmp_pool);
throw e;
}
const namePtr = asm.pgf_abstract_name(pgfPtr);
const abstractName = UTF8ToString(namePtr);
const pgf = new PGF(pgfPtr,abstractName,pool);
const itor = asm.gu_malloc(tmp_pool,sizeof_GuMapItor);
const fn =
addFunction(
(itor,namePtr,concrPtrPtr,err) => {
const name = UTF8ToString(namePtr);
const concrPtr = HEAP32[concrPtrPtr >> 2];
pgf.languages[name] = new Concr(pgf,concrPtr,name);
},
"viiii"
);
HEAP32[(itor+offsetof_GuMapItor_fn) >> 2] = fn;
asm.pgf_iter_languages(pgfPtr,itor,err);
removeFunction(fn);
asm.gu_pool_free(tmp_pool);
return pgf;
}
function Expr(exprPtr,pool) {
this.exprPtr = exprPtr;
this.pool = pool;
registry.register(this,pool);
}
Expr.prototype.toString = function() {
const tmp_pool = asm.gu_new_pool();
const sb = asm.gu_new_string_buf(tmp_pool);
const out = asm.gu_string_buf_out(sb);
const err = asm.gu_new_exn(tmp_pool);
asm.pgf_print_expr(this.exprPtr, 0, 0, out, err);
if (asm.gu_exn_is_raised(err)) {
const e = pgfError(err);
asm.gu_pool_free(tmp_pool);
throw e;
}
const strPtr = asm.gu_string_buf_data(sb);
const len = asm.gu_string_buf_length(sb);
const str = UTF8ToString(strPtr,len);
asm.gu_pool_free(tmp_pool);
return str;
};
Expr.prototype.arity = function() {
return asm.pgf_expr_arity(this.exprPtr);
};
function readExpr(exprStr) {
const tmp_pool = asm.gu_new_pool();
const strPtr = allocateUTF8(tmp_pool,exprStr);
const in_ = asm.gu_data_in(strPtr, exprStr.length, tmp_pool);
const err = asm.gu_new_exn(tmp_pool);
const pool = asm.gu_new_pool();
const expr = asm.pgf_read_expr(in_, pool, tmp_pool, err);
asm.gu_pool_free(tmp_pool);
if (asm.gu_exn_is_raised(err)) {
throw pgfError(err);
}
if (expr == 0) {
throw new Error('Expression cannot be parsed');
}
return new Expr(expr,pool);
}
return { readPGF, readExpr };
}
// This allows the module to be used both from Node and in the browser
if (typeof module != 'undefined') {
module.exports = mkAPI;
}


@@ -0,0 +1,252 @@
body {
font-family:arial,helvetica,sans-serif;
font-size:12px;
background-color: white;
}
#wrapper {
width:740px;
height:520px;
margin:auto 50px;
border:1px solid gray;
padding:10px;
}
#absFrame {
width:250px;
height:250px;
padding:10px;
border:1px solid gray;
float:left;
white-space: nowrap;
}
#conFrame {
width:436px;
height:250px;
margin-left:10px;
padding:10px;
border:1px solid gray;
float:left;
white-space: normal;
overflow:auto;
}
#actFrame {
width:250px;
height:170px;
margin-top:10px;
padding:10px;
border:1px solid gray;
float:left;
overflow:auto;
}
#refFrame {
width:436px;
height:170px;
margin-left:10px;
margin-top:10px;
padding:10px;
border:1px solid gray;
float:left;
overflow:auto;
}
#messageFrame {
width:506px;
height:15px;
margin-top:10px;
margin-right:10px;
padding:10px;
border:1px solid gray;
float:left;
overflow:hidden;
}
#clipboardFrame {
width:180px;
height:15px;
margin-top:10px;
padding:10px;
border:1px solid gray;
float:left;
overflow:auto;
}
#tree {
left: -10px;
top: -10px;
margin: 0px;
padding: 10px;
overflow: auto;
}
ul {
position: relative;
list-style: none;
margin-left: 20px;
padding: 0px;
}
li {
position: relative;
}
img.tree-menu {
margin-right: 5px;
}
a.tree:link, a.tree:visited, a.tree:active {
color: black;
background-color: white;
text-decoration: none;
margin-right:10px;
}
a.tree:hover {
color: blue;
background-color: white;
text-decoration: underline;
margin-right:10px;
}
a.treeSelected:link, a.treeSelected:visited, a.treeSelected:active {
color: white;
background-color: #3366CC;
text-decoration: none;
margin-right:10px;
}
a.treeSelected:hover {
color: white;
background-color: #3366CC;
text-decoration: underline;
margin-right:10px;
}
a.treeGray:link, a.treeGray:visited, a.treeGray:active {
color: silver;
background-color: white;
text-decoration: none;
margin-right:10px;
}
a.treeGray:hover {
color: silver;
background-color: white;
text-decoration: none;
margin-right:10px;
}
table.action, table.refinement, table.wrapper, table.tree, table.language {
margin: 0px;
padding: 0px;
border-style: none;
border-collapse: collapse;
border-spacing: 0px;
}
tr.selected {
color: white;
background-color: #3366CC;
}
tr.unavailable, tr.closed {
color: silver;
background-color: white;
}
tr.unavailable:hover {
color: silver;
background-color: #3366CC;
}
tr.action, tr.refinement, tr.wrapper, tr.tree {
color: black;
background-color: white;
}
tr.action:hover, tr.refinement:hover, tr.wrapper:hover, tr.tree:hover {
color: white;
background-color: #3366CC;
}
td.action {
width: 220px;
margin: 0px;
padding: 0px;
}
td.refinement, td.wrapper, td.tree {
width: 515px;
margin: 0px;
padding: 0px;
}
td.hotKey {
width: 30px;
margin: 0px;
padding: 0px;
text-align: right;
}
td.language {
color: black;
background-color: white;
margin: 1px;
padding: 1px;
}
td.language:hover {
color: blue;
background-color: white;
text-decoration: underline;
margin: 1px;
padding: 1px;
}
td.selected {
color: white;
background-color: #3366CC;
margin: 1px;
padding: 1px;
}
td.selected:hover {
color: white;
background-color: #3366CC;
text-decoration: underline;
margin: 1px;
padding: 1px;
}
p {
margin-bottom: 40px;
}
span.normal {
color: black;
background-color: white;
text-decoration: none;
padding-left: 2px;
padding-right: 2px;
}
span.edit {
color: black;
background-color: white;
text-decoration: none;
border:2px inset;
padding-left: 2px;
padding-right: 2px;
}
span.selected {
color: white;
background-color: #3366CC;
text-decoration: none;
padding-left: 2px;
padding-right: 2px;
}


@@ -1,33 +0,0 @@
const Module = require('./.libs/pgf.js');
const JSPGF = require('./jspgf.js')(Module);
const fs = require('fs');
const path = require('path');
Module.onRuntimeInitialized = () => {
// Read PGF path from args
if (process.argv.length > 2) {
const pgfPathHost = process.argv[2];
// Copy file into filesystem
const pgfPathFS = '/tmp/' + path.basename(pgfPathHost);
const rawPgf = fs.readFileSync(pgfPathHost);
Module.FS.writeFile(pgfPathFS, rawPgf);
// Read PGF
const pgf = JSPGF.readPGF(pgfPathFS);
// Print its name
console.log(JSPGF.abstractName(pgf));
}
// Parse expression
const expr = JSPGF.readExpr("Pred (Another (x f))");
// Show it
console.log(JSPGF.showExpr(expr));
// Print its arity
console.log('arity', JSPGF.arity(expr));
}


@@ -1,13 +0,0 @@
<!doctype html>
<html lang="en-us">
<head>
<meta charset="utf-8">
<meta http-equiv="Content-Type" content="text/html; charset=utf-8">
</head>
<body>
<script type="text/javascript" src="./jspgf.js"></script>
<script type="text/javascript" src="./test-web.js"></script>
</body>
</html>


@@ -1,21 +0,0 @@
mkAPI().then((pgf) => {
// Parse expression
const expr = pgf.readExpr("Pred (This Fish) Fresh");
// Show it
console.log(expr.toString());
// Print its arity
console.log('arity', expr.arity());
pgf.readPGF("Foods.pgf").then((gr) => {
// Print the grammar name
console.log(gr.abstractName);
// Access a language and print the concrete name
console.log(gr.languages["FoodsEng"].name);
// Linearize an expression
console.log(gr.languages["FoodsEng"].linearize(expr));
});
});


@@ -0,0 +1,54 @@
body {
color: black;
background-color: white;
}
dl {
}
dt {
margin: 0;
padding: 0;
}
dl dd {
margin: 0;
padding: 0;
}
dl.fromLang dt {
display: none;
}
dl.toLang {
border-width: 1px 0 0 0;
border-style: solid;
border-color: #c0c0c0;
}
dl.toLang dt {
color: #c0c0c0;
display: block;
float: left;
width: 5em;
}
dl.toLang dd {
border-width: 0 0 1px 0;
border-style: solid;
border-color: #c0c0c0;
}
ul {
margin: 0;
padding: 0;
}
li {
list-style-type: none;
margin: 0;
padding: 0;
}


@@ -0,0 +1,48 @@
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
<html xmlns="http://www.w3.org/1999/xhtml" xml:lang="en" lang="en">
<head>
<meta http-equiv="Content-Type" content="text/html; charset=UTF-8" />
<link rel="stylesheet" type="text/css" href="translator.css" />
<script type="text/javascript" src="gflib.js"></script>
<script type="text/javascript" src="grammar.js"></script>
<script type="text/javascript" src="translator.js"></script>
<script type="text/javascript">
/* CHANGE ME */
var grammar = Foods;
function updateTranslation () {
var input = document.getElementById('inputText').value;
var fromLang = document.getElementById('fromLang').value;
var toLang = document.getElementById('toLang').value;
var output = document.getElementById('output');
var translation = grammar.translate(input, fromLang, toLang);
removeChildren(output);
output.appendChild(formatTranslation(translation));
}
function populateLangs () {
var f = document.getElementById('fromLang');
var t = document.getElementById('toLang');
for (var c in grammar.concretes) {
addOption(f, c, c);
addOption(t, c, c);
}
}
</script>
<title>Web-based GF Translator</title>
</head>
<body onload="populateLangs(grammar, 'fromLang', 'toLang')">
<form id="translate">
<p>
<input type="text" name="inputText" id="inputText" value="this cheese is warm" size="50" />
</p>
<p>
From: <select name="fromLang" id="fromLang" onchange=""><option value="">Any language</option></select>
To: <select name="toLang" id="toLang"><option value="">All languages</option></select>
<input type="button" value="Translate" onclick="updateTranslation()" />
</p>
</form>
<div id="output"></div>
</body>
</html>


@@ -0,0 +1,51 @@
function formatTranslation (outputs) {
var dl1 = document.createElement("dl");
dl1.className = "fromLang";
for (var fromLang in outputs) {
var ul = document.createElement("ul");
addDefinition(dl1, document.createTextNode(fromLang), ul);
for (var i in outputs[fromLang]) {
var dl2 = document.createElement("dl");
dl2.className = "toLang";
for (var toLang in outputs[fromLang][i]) {
addDefinition(dl2, document.createTextNode(toLang), document.createTextNode(outputs[fromLang][i][toLang]));
}
addItem(ul, dl2);
}
}
return dl1;
}
/* DOM utilities for specific tags */
function addDefinition (dl, t, d) {
var dt = document.createElement("dt");
dt.appendChild(t);
dl.appendChild(dt);
var dd = document.createElement("dd");
dd.appendChild(d);
dl.appendChild(dd);
}
function addItem (ul, i) {
var li = document.createElement("li");
li.appendChild(i);
ul.appendChild(li);
}
function addOption (select, value, content) {
var option = document.createElement("option");
option.value = value;
option.appendChild(document.createTextNode(content));
select.appendChild(option);
}
/* General DOM utilities */
/* Removes all the children of a node */
function removeChildren(node) {
while (node.hasChildNodes()) {
node.removeChild(node.firstChild);
}
}


@@ -1,152 +0,0 @@
* INSTALL
You will need the python-devel package or similar.
You must have installed the PGF C runtime (see ../c/INSTALL)
#+begin_src sh
$ python setup.py build
$ sudo python setup.py install
#+end_src
* Apple Silicon
The following install instructions were written with the following config in mind:
| OS | Hardware | GF |
|-----------------------+----------+--------------------------|
| MacOS Monterey 12.2.1 | Apple M1 | 3.11 from binary package |
We assume that you may have installed GF as a binary package downloaded from GitHub.
From that starting point, try all the solutions below, in sequence, until you achieve success.
** Validation Goal
Our goal is to be able to
- run python 3
- import pgf
- type "pgf."
- hit tab
and get this:
#+begin_example
>>> import pgf
>>> pgf.
pgf.BIND( pgf.Concr( pgf.Iter( pgf.PGFError( pgf.Type( pgf.readExpr( pgf.readType(
pgf.Bracket( pgf.Expr( pgf.PGF( pgf.ParseError( pgf.TypeError( pgf.readPGF(
#+end_example
When that works, we can consider the Python PGF bindings to be installed successfully.
** The GF binary package won't install
We assume you've tried [[https://github.com/GrammaticalFramework/gf-core/releases][downloading a binary package]].
If macOS refuses to open the unsigned binary, go to System Preferences, Security & Privacy, General, and click Open Anyway.
** gu/mem.h file not found
Maybe you tried running something like ~pip install pgf~ or ~pip3 install pgf~.
Did you get this error?
#+begin_example
python3 setup.py build
running build
running build_ext
creating build/temp.macosx-12-arm64-3.9
clang -Wno-unused-result -Wsign-compare -Wunreachable-code -fno-common -dynamic -DNDEBUG -g -fwrapv -O3 -Wall -isysroot /Library/Developer/CommandLineTools/SDKs/MacOSX12.sdk -I/opt/homebrew/opt/python@3.9/Frameworks/Python.framework/Versions/3.9/include/python3.9 -c pypgf.c -o build/temp.macosx-12-arm64-3.9/pypgf.o -std=c99
pypgf.c:5:10: fatal error: 'gu/mem.h' file not found
#include <gu/mem.h>
^~~~~~~~~~
1 error generated.
error: command '/usr/bin/clang' failed with exit code 1
#+end_example
Solution:
#+begin_example
$ EXTRA_INCLUDE_DIRS=/usr/local/include EXTRA_LIB_DIRS=/usr/local/lib pip install pgf
#+end_example
This should tell the build where to find the include and lib files it needs to compile:
#+begin_example
$ ls /usr/local/include/gu
assert.h choice.h enum.h file.h hash.h map.h out.h seq.h sysdeps.h utf8.h
bits.h defs.h exn.h fun.h in.h mem.h prime.h string.h ucs.h variant.h
$ ls /usr/local/lib/libgu*
/usr/local/lib/libgu.0.dylib /usr/local/lib/libgu.a /usr/local/lib/libgu.dylib /usr/local/lib/libgu.la
#+end_example
If those files don't exist, or you get the following error, you will need to rebuild the C runtime.
** symbol not found in flat namespace
Did you get this error?
#+begin_example
Python 3.9.10 (main, Jan 15 2022, 11:40:53)
[Clang 13.0.0 (clang-1300.0.29.3)] on darwin
Type "help", "copyright", "credits" or "license" for more information.
>>> import pgf
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
ImportError: dlopen(/opt/homebrew/lib/python3.9/site-packages/pgf.cpython-39-darwin.so, 0x0002): symbol not found in flat namespace '_gu_alloc_variant'
#+end_example
This may be a sign that you're trying to get binaries and libraries compiled with different compilers to play nicely. We're trying to get three things to align:
- the Python interpreter
- the C runtime libraries
- the Python pgf libraries
Solution:
Maybe your Python isn't the Apple-provided Python. In the above error message we see ~python3~ is provided by Homebrew. Assuming you prefer to keep things this way, we'll try to rebuild things to match your Python.
Rebuilding needs a C compiler. The Apple-provided system ~clang~ is preferred. If you have multiple ~clang~ compilers installed, try disabling the others. For example, if your ~clang~ was provided by nix, run ~nix-env --uninstall clang~. Similarly for brew.
Then try rebuilding the C runtime.
** How to re-build the C runtime
Maybe the C runtime is missing from ~/usr/local/lib~, or maybe the version you have installed is causing the "symbol not found" error.
Build the C runtime by following the instructions in ~gf-core/src/runtime/c/INSTALL~.
After a successful ~make install~, rebuild the Python bindings.
** How to re-build the Python bindings using pip
Sometimes a ~pip install pgf~ will decline to recompile, because a cached wheel exists.
To return to a more pristine state,
#+begin_example
pip uninstall pgf
pip cache remove pgf
#+end_example
You may need to ~sudo~ some of the above commands.
Then you can repeat
#+begin_example
$ EXTRA_INCLUDE_DIRS=/usr/local/include EXTRA_LIB_DIRS=/usr/local/lib pip install pgf
#+end_example
** How to re-build the Python bindings manually
If the ~pip install pgf~ just isn't working, try building it directly in ~gf-core/src/runtime/python~:
#+begin_example
$ python setup.py build
$ sudo python setup.py install
#+end_example
You may need to add the ~EXTRA~ environment prefixes as shown in previous commands.


@@ -1155,80 +1155,6 @@ Iter_fetch_expr(IterObject* self)
return res;
}
typedef struct {
PyObject_HEAD
} BINDObject;
static PyObject *BIND_instance = NULL;
static void
BIND_dealloc(PyTypeObject *self)
{
BIND_instance = NULL;
}
static PyObject *
BIND_repr(BINDObject *self)
{
return PyString_FromString("pgf.BIND");
}
static PyObject *
BIND_str(BINDObject *self)
{
return PyString_FromString("&+");
}
static PyObject *
BIND_alloc(PyTypeObject *self, Py_ssize_t nitems)
{
if (BIND_instance == NULL)
BIND_instance = PyType_GenericAlloc(self, nitems);
return BIND_instance;
}
static PyTypeObject pgf_BINDType = {
PyVarObject_HEAD_INIT(NULL, 0)
//0, /*ob_size*/
"pgf.BINDType", /*tp_name*/
sizeof(BINDObject), /*tp_basicsize*/
0, /*tp_itemsize*/
(destructor) BIND_dealloc, /*tp_dealloc*/
0, /*tp_print*/
0, /*tp_getattr*/
0, /*tp_setattr*/
0, /*tp_compare*/
(reprfunc) BIND_repr, /*tp_repr*/
0, /*tp_as_number*/
0, /*tp_as_sequence*/
0, /*tp_as_mapping*/
0, /*tp_hash */
0, /*tp_call*/
(reprfunc) BIND_str, /*tp_str*/
0, /*tp_getattro*/
0, /*tp_setattro*/
0, /*tp_as_buffer*/
Py_TPFLAGS_DEFAULT | Py_TPFLAGS_BASETYPE, /*tp_flags*/
"a marker for BIND in a bracketed string", /*tp_doc*/
0, /*tp_traverse */
0, /*tp_clear */
0, /*tp_richcompare */
0, /*tp_weaklistoffset */
0, /*tp_iter */
0, /*tp_iternext */
0, /*tp_methods */
0, /*tp_members */
0, /*tp_getset */
0, /*tp_base */
0, /*tp_dict */
0, /*tp_descr_get */
0, /*tp_descr_set */
0, /*tp_dictoffset */
0, /*tp_init */
BIND_alloc, /*tp_alloc */
0, /*tp_new */
};
static PyObject*
Iter_fetch_token(IterObject* self)
{
@@ -1236,9 +1162,7 @@ Iter_fetch_token(IterObject* self)
if (tp == NULL)
return NULL;
PyObject* py_tok =
(tp->tok != NULL) ? PyString_FromString(tp->tok)
: pgf_BINDType.tp_alloc(&pgf_BINDType, 0);
PyObject* py_tok = PyString_FromString(tp->tok);
PyObject* py_cat = PyString_FromString(tp->cat);
PyObject* py_fun = PyString_FromString(tp->fun);
PyObject* res = Py_BuildValue("(f,O,O,O)", tp->prob, py_tok, py_cat, py_fun);
@@ -1675,18 +1599,16 @@ Concr_parse(ConcrObject* self, PyObject *args, PyObject *keywds)
static IterObject*
Concr_complete(ConcrObject* self, PyObject *args, PyObject *keywds)
{
static char *kwlist[] = {"sentence", "cat", "prefix", "n", NULL};
static char *kwlist[] = {"sentence", "cat", "prefix", "n", NULL};
PyObject* sentence0 = NULL;
char* sentence = NULL;
const char *sentence = NULL;
PyObject* start = NULL;
GuString prefix = "";
bool prefix_bind = false;
int max_count = -1;
if (!PyArg_ParseTupleAndKeywords(args, keywds, "O|Osi", kwlist,
&sentence0, &start,
&prefix, &max_count))
return NULL;
GuString prefix = "";
int max_count = -1;
if (!PyArg_ParseTupleAndKeywords(args, keywds, "s|Osi", kwlist,
&sentence, &start,
&prefix, &max_count))
return NULL;
IterObject* pyres = (IterObject*)
pgf_IterType.tp_alloc(&pgf_IterType, 0);
@@ -1708,20 +1630,6 @@ Concr_complete(ConcrObject* self, PyObject *args, PyObject *keywds)
GuExn* parse_err = gu_new_exn(tmp_pool);
if (PyTuple_Check(sentence0) &&
PyTuple_GET_SIZE(sentence0) == 2 &&
PyTuple_GET_ITEM(sentence0,1) == pgf_BINDType.tp_alloc(&pgf_BINDType, 0))
{
sentence0 = PyTuple_GET_ITEM(sentence0,0);
prefix_bind = true;
}
if (PyUnicode_Check(sentence0)) {
sentence = PyUnicode_AsUTF8(sentence0);
} else {
PyErr_SetString(PyExc_TypeError, "The sentence must be either a string or a tuple of string and pgf.BIND");
}
PgfType* type;
if (start == NULL) {
type = pgf_start_cat(self->grammar->pgf, pyres->pool);
@@ -1734,7 +1642,7 @@ Concr_complete(ConcrObject* self, PyObject *args, PyObject *keywds)
}
pyres->res =
pgf_complete(self->concr, type, sentence, prefix, prefix_bind, parse_err, pyres->pool);
pgf_complete(self->concr, type, sentence, prefix, parse_err, pyres->pool);
if (!gu_ok(parse_err)) {
Py_DECREF(pyres);
@@ -2169,6 +2077,58 @@ static PyTypeObject pgf_BracketType = {
0, /*tp_new */
};
typedef struct {
PyObject_HEAD
} BINDObject;
static PyObject *
BIND_repr(BINDObject *self)
{
return PyString_FromString("&+");
}
static PyTypeObject pgf_BINDType = {
PyVarObject_HEAD_INIT(NULL, 0)
//0, /*ob_size*/
"pgf.BIND", /*tp_name*/
sizeof(BINDObject), /*tp_basicsize*/
0, /*tp_itemsize*/
0, /*tp_dealloc*/
0, /*tp_print*/
0, /*tp_getattr*/
0, /*tp_setattr*/
0, /*tp_compare*/
0, /*tp_repr*/
0, /*tp_as_number*/
0, /*tp_as_sequence*/
0, /*tp_as_mapping*/
0, /*tp_hash */
0, /*tp_call*/
(reprfunc) BIND_repr, /*tp_str*/
0, /*tp_getattro*/
0, /*tp_setattro*/
0, /*tp_as_buffer*/
Py_TPFLAGS_DEFAULT | Py_TPFLAGS_BASETYPE, /*tp_flags*/
"a marker for BIND in a bracketed string", /*tp_doc*/
0, /*tp_traverse */
0, /*tp_clear */
0, /*tp_richcompare */
0, /*tp_weaklistoffset */
0, /*tp_iter */
0, /*tp_iternext */
0, /*tp_methods */
0, /*tp_members */
0, /*tp_getset */
0, /*tp_base */
0, /*tp_dict */
0, /*tp_descr_get */
0, /*tp_descr_set */
0, /*tp_dictoffset */
0, /*tp_init */
0, /*tp_alloc */
0, /*tp_new */
};
typedef struct {
PgfLinFuncs* funcs;
GuBuf* stack;
@@ -2766,11 +2726,6 @@ static PyMethodDef Concr_methods[] = {
},
{"complete", (PyCFunction)Concr_complete, METH_VARARGS | METH_KEYWORDS,
"Parses a partial string and returns a list with the top n possible next tokens"
"Named arguments:\n"
"- sentence (string or a (string,pgf.BIND) tuple. The latter indicates that the sentence ends with a BIND token)\n"
"- cat (string); OPTIONAL, default: the startcat of the grammar\n"
"- prefix (string); OPTIONAL, the prefix of predicted tokens"
"- n (int), max. number of predicted tokens"
},
{"parseval", (PyCFunction)Concr_parseval, METH_VARARGS,
"Computes precision, recall and exact match for the parser on a given abstract tree"
@@ -3715,7 +3670,7 @@ MOD_INIT(pgf)
PyModule_AddObject(m, "Bracket", (PyObject *) &pgf_BracketType);
Py_INCREF(&pgf_BracketType);
PyModule_AddObject(m, "BIND", pgf_BINDType.tp_alloc(&pgf_BINDType, 0));
PyModule_AddObject(m, "BIND", (PyObject *) &pgf_BINDType);
Py_INCREF(&pgf_BINDType);
return MOD_SUCCESS_VAL(m);


@@ -0,0 +1,7 @@
# Project moved
The GF TypeScript runtime has been moved to the repository:
<https://github.com/GrammaticalFramework/gf-typescript>
If you are looking for an updated version of the JavaScript runtime,
you should also look there.


@@ -22,7 +22,7 @@
(bilingual document editor)
<!--<li><a href="wc.html">Wide Coverage Translation Demo</a>-->
<li><a href="gfmorpho/">Word inflection with smart paradigms</a>
<li><a href="wordnet/">GF WordNet</a> (an online browser and editor for the WordNet lexicon)</li>
<li><a href="https://cloud.grammaticalframework.org/wordnet">GF WordNet</a> (an online browser and editor for the WordNet lexicon)</li>
</ul>
<h2>Documentation</h2>

stack-ghc8.10.4.yaml

@@ -0,0 +1,14 @@
resolver: lts-18.0 # ghc 8.10.4
extra-deps:
- network-2.6.3.6
- httpd-shed-0.4.0.3
- cgi-3001.5.0.0@sha256:3d1193a328d5f627a021a0ef3927c1ae41dd341e32dba612fed52d0e3a6df056,2990
- json-0.10@sha256:d9fc6b07ce92b8894825a17d2cf14799856767eb30c8bf55962baa579207d799,3210
- multipart-0.2.0@sha256:b8770e3ff6089be4dd089a8250894b31287cca671f3d258190a505f9351fa8a9,1084
# flags:
# gf:
# c-runtime: true
# extra-lib-dirs:
# - /usr/local/lib


@@ -1,12 +0,0 @@
resolver: lts-18.27 # ghc 8.10.7
extra-deps:
- network-2.6.3.6
- httpd-shed-0.4.0.3
# flags:
# gf:
# server: true
# c-runtime: true
# extra-lib-dirs:
# - /usr/local/lib


@@ -1,14 +0,0 @@
resolver: lts-19.6
extra-deps:
# - network-2.6.3.6
# - httpd-shed-0.4.0.3
# - cgi-3001.5.0.0@sha256:3d1193a328d5f627a021a0ef3927c1ae41dd341e32dba612fed52d0e3a6df056,2990
# - json-0.10@sha256:d9fc6b07ce92b8894825a17d2cf14799856767eb30c8bf55962baa579207d799,3210
# - multipart-0.2.0@sha256:b8770e3ff6089be4dd089a8250894b31287cca671f3d258190a505f9351fa8a9,1084
# flags:
# gf:
# c-runtime: true
# extra-lib-dirs:
# - /usr/local/lib


@@ -1,15 +1,18 @@
# This default stack file is a copy of stack-ghc8.10.7.yaml
# This default stack file is a copy of stack-ghc8.10.4.yaml
# But committing a symlink can be problematic on Windows, so it's a real copy.
# See: https://github.com/GrammaticalFramework/gf-core/pull/106
resolver: lts-18.27 # ghc 8.10.7
resolver: lts-18.0 # ghc 8.10.4
extra-deps:
- network-2.6.3.6
- httpd-shed-0.4.0.3
- cgi-3001.5.0.0@sha256:3d1193a328d5f627a021a0ef3927c1ae41dd341e32dba612fed52d0e3a6df056,2990
- json-0.10@sha256:d9fc6b07ce92b8894825a17d2cf14799856767eb30c8bf55962baa579207d799,3210
- multipart-0.2.0@sha256:b8770e3ff6089be4dd089a8250894b31287cca671f3d258190a505f9351fa8a9,1084
# flags:
# gf:
# server: true
# c-runtime: true
# extra-lib-dirs:
# - /usr/local/lib


@@ -1,12 +1,2 @@
i -retain testsuite/compiler/compute/Variants.gf
cc hello
cc <\x -> x++x : Str -> Str> ("a"|"b")
cc <\x -> x : Str -> Str> ("a"|"b")
cc <\x -> "c" : Str -> Str> ("a"|"b")
cc <let x = ("a"|"b") in x++x : Str>
cc <let x = ("a"|"b") in x : Str>
cc <let x = ("a"|"b") in "c" : Str>
cc <\x -> x.p1++x.p1 : Str*Str -> Str> <"a"|"b","c">
cc <\x -> x.p1 : Str*Str -> Str> <"a"|"b","c">
cc <\x -> x.p2++x.p2 : Str*Str -> Str> <"a"|"b","c">
cc <\x -> x.p2 : Str*Str -> Str> <"a"|"b","c">


@@ -1,11 +1 @@
variants {"hello"; "hello" ++ "hello"}
variants {"a" ++ "a"; "b" ++ "b"}
variants {"a"; "b"}
"c"
variants {"a"; "b"} ++ variants {"a"; "b"}
variants {"a"; "b"}
"c"
variants {"a"; "b"} ++ variants {"a"; "b"}
variants {"a"; "b"}
"c" ++ "c"
"c"

testsuite/lpgf/README.md

@@ -0,0 +1,392 @@
# LPGF testsuite & benchmark
## Testsuite
LPGF must be equivalent to PGF in terms of linearisation output.
Possible exceptions:
- No handling of variants (design choice)
- Rendering of missing functions
**N.B.**
Phrasebook doesn't compile with RGL after 1131058b68c204a8d1312d2e2a610748eb8032cb
### Running
Because Stack insists on rebuilding things all the time, I use separate `.stack-work` folders for testing and benchmarking.
Assumes a treebank in the same folder as the grammar, named after its abstract syntax, e.g. `unittests/Params.treebank`
```
stack build --work-dir .stack-work-test --test --no-run-tests
stack test --work-dir .stack-work-test gf:test:lpgf # all LPGF tests
stack test --work-dir .stack-work-test gf:test:lpgf --test-arguments="unittests/Params" # specific grammar
stack test --work-dir .stack-work-test gf:test:lpgf --test-arguments="foods/Foods Fre Ger" # specific grammar and languages
stack test --work-dir .stack-work-test gf:test:lpgf --test-arguments="phrasebook/Phrasebook"
```
Set environment variable `DEBUG=1` to enable dumping of intermediate formats into `DEBUG/` folder.
---
## Benchmark
Compare performance metrics between LPGF and PGF[2]. Note: correctness is not checked here.
### Compilation
Comparing PGF and LPGF along the following criteria:
- Time
- Memory
- Binary file size
### Runtime (linearisation)
Comparing PGF, PGF2 and LPGF along the following criteria:
- Time
- Memory
### Running
Run each command separately so that memory measurements are isolated.
The `+RTS -T -RTS` flags make GHC's runtime collect statistics, so the program can report its own memory usage.
```
stack build --work-dir .stack-work-bench --bench --no-run-benchmarks &&
stack bench --work-dir .stack-work-bench --benchmark-arguments "compile pgf testsuite/lpgf/foods/Foods*.gf +RTS -T -RTS" &&
stack bench --work-dir .stack-work-bench --benchmark-arguments "compile lpgf testsuite/lpgf/foods/Foods*.gf +RTS -T -RTS" &&
stack bench --work-dir .stack-work-bench --benchmark-arguments "run pgf Foods.pgf testsuite/lpgf/foods/Foods-all.trees +RTS -T -RTS" &&
stack bench --work-dir .stack-work-bench --benchmark-arguments "run pgf2 Foods.pgf testsuite/lpgf/foods/Foods-all.trees +RTS -T -RTS" &&
stack bench --work-dir .stack-work-bench --benchmark-arguments "run lpgf Foods.lpgf testsuite/lpgf/foods/Foods-all.trees +RTS -T -RTS"
```
```
stack build --work-dir .stack-work-bench --bench --no-run-benchmarks &&
stack bench --work-dir .stack-work-bench --benchmark-arguments "compile pgf testsuite/lpgf/phrasebook/Phrasebook*.gf +RTS -T -RTS" &&
stack bench --work-dir .stack-work-bench --benchmark-arguments "compile lpgf testsuite/lpgf/phrasebook/Phrasebook*.gf +RTS -T -RTS" &&
stack bench --work-dir .stack-work-bench --benchmark-arguments "run pgf Phrasebook.pgf testsuite/lpgf/phrasebook/Phrasebook-10000.trees +RTS -T -RTS" &&
stack bench --work-dir .stack-work-bench --benchmark-arguments "run pgf2 Phrasebook.pgf testsuite/lpgf/phrasebook/Phrasebook-10000.trees +RTS -T -RTS" &&
stack bench --work-dir .stack-work-bench --benchmark-arguments "run lpgf Phrasebook.lpgf testsuite/lpgf/phrasebook/Phrasebook-10000.trees +RTS -T -RTS"
```
## Profiling
```
stack build --work-dir .stack-work-profile --profile --bench --no-run-benchmarks &&
stack bench --work-dir .stack-work-profile --profile --benchmark-arguments "compile lpgf testsuite/lpgf/phrasebook/PhrasebookFre.gf +RTS -T -p -h -RTS"
```
Produced files:
- `lpgf-bench.prof` - total time and memory allocation (`-p`)
- `lpgf-bench.hp` - heap profile (`-h`)
Open heap profile graph on-the-fly:
```
stack exec -- hp2ps -c lpgf-bench.hp && open lpgf-bench.ps
```
Convert and copy timestamped files into `PROF/`:
```
TS="$(date +%Y-%m-%d_%H%M)" &&
stack exec -- hp2ps -c lpgf-bench.hp &&
mv lpgf-bench.prof PROF/$TS.prof &&
mv lpgf-bench.ps PROF/$TS.ps &&
mv lpgf-bench.hp PROF/$TS.hp
```
**Resources**
- https://downloads.haskell.org/ghc/8.6.5/docs/html/users_guide/profiling.html
- http://book.realworldhaskell.org/read/profiling-and-optimization.html
- https://wiki.haskell.org/Performance
### Homing in
```
stack build --test --bench --no-run-tests --no-run-benchmarks &&
stack bench --benchmark-arguments "compile lpgf testsuite/lpgf/phrasebook/PhrasebookFre.gf +RTS -T -RTS"
```
**Baseline PGF**
- compile: 1.600776s
- size: 2.88 MB Phrasebook.pgf
Max memory: 328.20 MB
**Baseline LPGF = B**
- compile: 12.401099s
- size: 3.01 MB Phrasebook.lpgf
Max memory: 1.33 GB
**Baseline LPGF String instead of Text**
- compile: 12.124689s
- size: 3.01 MB Phrasebook.lpgf
Max memory: 1.34 GB
**Baseline LPGF with impossible pruning**
- compile: 7.406503s
- size: 3.01 MB Phrasebook.lpgf
Max memory: 1.13 GB
**B -extractStrings**
- compile: 13.822735s
- size: 5.78 MB Phrasebook.lpgf
Max memory: 1.39 GB
**B -cleanupRecordFields**
- compile: 13.670776s
- size: 3.01 MB Phrasebook.lpgf
Max memory: 1.48 GB
**No generation at all = E**
- compile: 0.521001s
- size: 3.27 KB Phrasebook.lpgf
Max memory: 230.69 MB
**+ Concat, Literal, Error, Predef, Tuple, Variant, Commented**
- compile: 1.503594s
- size: 3.27 KB Phrasebook.lpgf
Max memory: 395.31 MB
**+ Var, Pre, Selection**
- compile: 1.260184s
- size: 3.28 KB Phrasebook.lpgf
Max memory: 392.17 MB
**+ Record**
- compile: 1.659233s
- size: 7.07 KB Phrasebook.lpgf
Max memory: 397.41 MB
**+ Projection = X**
- compile: 1.446217s
- size: 7.94 KB Phrasebook.lpgf
Max memory: 423.62 MB
**X + Param**
- compile: 2.073838s
- size: 10.82 KB Phrasebook.lpgf
Max memory: 619.71 MB
**X + Table**
- compile: 11.26558s
- size: 2.48 MB Phrasebook.lpgf
Max memory: 1.15 GB
**RawIdents**
- compile: 5.393466s
- size: 3.01 MB Phrasebook.lpgf
Max memory: 1.12 GB
### Repeated terms in compilation
**Param and Table**
| Concr | Total | Unique | Perc |
|:--------------|-------:|-------:|-----:|
| PhrasebookEng | 8673 | 1724 | 20% |
| PhrasebookSwe | 14802 | 2257 | 15% |
| PhrasebookFin | 526225 | 4866 | 1% |
**Param**
| Concr | Total | Unique | Perc |
|:--------------|-------:|-------:|-----:|
| PhrasebookEng | 3211 | 78 | 2% |
| PhrasebookSwe | 7567 | 69 | 1% |
| PhrasebookFin | 316355 | 310 | 0.1% |
**Table**
| Concr | Total | Unique | Perc |
|:--------------|-------:|-------:|-----:|
| PhrasebookEng | 5470 | 1654 | 30% |
| PhrasebookSwe | 7243 | 2196 | 30% |
| PhrasebookFin | 209878 | 4564 | 2% |
### After implementing state monad for table memoisation
**worse!**
- compile: 12.55848s
- size: 3.01 MB Phrasebook.lpgf
Max memory: 2.25 GB
**Params**
| Concr | Total | Misses | Perc |
|:--------------|-------:|-------:|------:|
| PhrasebookEng | 3211 | 72 | 2% |
| PhrasebookSwe | 7526 | 61 | 1% |
| PhrasebookFin | 135268 | 333 | 0.2% |
| PhrasebookFre | 337102 | 76 | 0.02% |
**Tables**
| Concr | Total | Misses | Perc |
|:--------------|------:|-------:|-----:|
| PhrasebookEng | 3719 | 3170 | 85% |
| PhrasebookSwe | 4031 | 3019 | 75% |
| PhrasebookFin | 36875 | 21730 | 59% |
| PhrasebookFre | 41397 | 32967 | 80% |
Conclusions:
- the map itself requires more memory than the actual compilation
- lookup/insert is also as slow as the actual compilation
Tried HashMap (deriving Hashable for LinValue): no improvement.
Using show on LinValue for keys is incredibly slow.
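The memoisation scheme being measured can be sketched as follows (hypothetical names, not the actual compiler code; `k` stands in for the key derived from a `LinValue`, `v` for its compiled form; `Data.Map.Strict` ships with GHC's `containers`). The numbers above show that for large grammars the map operations themselves come to dominate:

```haskell
import qualified Data.Map.Strict as M

-- A minimal sketch of value memoisation with an explicit state map,
-- as in the table/param memoisation experiment above.
memo :: Ord k => (k -> v) -> k -> M.Map k v -> (v, M.Map k v)
memo compile k m =
  case M.lookup k m of
    Just v  -> (v, m)                               -- hit: reuse earlier result
    Nothing -> let v = compile k                    -- miss: compile once...
               in (v, M.insert k v m)               -- ...and remember it
```

Threading the map through every compilation step is exactly what the state monad version does; when hit rates are low (as in the Tables figures above) the bookkeeping outweighs the savings.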
# Notes on compilation
## 1 (see unittests/Params4)
**param defns**
P = P1 | P2
Q = Q1 | Q2
R = RP P | RPQ P Q | R0
X = XPQ P Q
**translation**
NB: tuples may be nested, but will be concatenated at runtime
P1 = <1>
P2 = <2>
Q1 = <1>
Q2 = <2>
RP P1 = <1,1>
RP P2 = <1,2>
RPQ P1 Q1 = <2,1,1>
RPQ P1 Q2 = <2,1,2>
RPQ P2 Q1 = <2,2,1>
RPQ P2 Q2 = <2,2,2>
R0 = <3>
XPQ P1 Q1 = <1,1,1>
XPQ P1 Q2 = <1,1,2>
XPQ P2 Q1 = <1,2,1>
XPQ P2 Q2 = <1,2,2>
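The constructor-to-index translation above can be sketched in Haskell (a hypothetical encoder for illustration only; the datatypes mirror the param defns, and each value flattens to 1-based indices: one for the constructor, then the encodings of its arguments):

```haskell
-- Param types from the notes above
data P = P1 | P2
data Q = Q1 | Q2
data R = RP P | RPQ P Q | R0

encP :: P -> [Int]
encP P1 = [1]
encP P2 = [2]

encQ :: Q -> [Int]
encQ Q1 = [1]
encQ Q2 = [2]

-- Constructor index first, then the flattened argument encodings
encR :: R -> [Int]
encR (RP p)    = 1 : encP p
encR (RPQ p q) = 2 : encP p ++ encQ q
encR R0        = [3]
```

For example, `encR (RPQ P1 Q2)` yields the index list corresponding to `<2,1,2>` in the table above.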
P => Str
<"P1","P2">
{p:P ; q:Q} => Str
<<"P1;Q1","P1;Q2">,<"P2;Q1","P2;Q2">>
{p=P2; q=Q1}
<<2>,<1>>
R => Str
< <"RP P1","RP P2">,
< <"RPQ P1 Q1","RPQ P1 Q2">,
<"RPQ P2 Q1","RPQ P2 Q2"> >,
"R0"
>
X => Str
<<<"XPQ P1 Q1","XPQ P1 Q2">,
<"XPQ P2 Q1","XPQ P2 Q2">>>
{p=P2 ; r=R0}
<<2>,<3>>
{p=P2 ; r1=RP P1 ; r2=RPQ P1 Q2 ; r3=R0 }
< <2> , <1, 1> , <2, 1, 2> , <3>>
## 2 (see unittests/Params5)
**param defns**
P = P1 | PQ Q
Q = Q1 | QR R
R = R1 | R2
**translation**
P1 = <1>
PQ Q1 = <2,1>
PQ QR R1 = <2,2,1>
PQ QR R2 = <2,2,2>
Q1 = <1>
QR R1 = <2,1>
QR R2 = <2,2>
R1 = <1>
R2 = <2>
P => Str
<"P1",<"PQ Q1",<"PQ (QR R1)","PQ (QR R2)">>>
{q:Q ; p:P} => Str
< <"Q1;P1",<"Q1;PQ Q1",<"Q1;PQ (QR R1)","Q1;PQ (QR R2)">>>,
<
<"QR R1;P1",<"QR R1;PQ Q1",<"QR R1;PQ (QR R1)","QR R1;PQ (QR R2)">>>,
<"QR R2;P1",<"QR R2;PQ Q1",<"QR R2;PQ (QR R1)","QR R2;PQ (QR R2)">>>
>
>
{q=Q1 ; p=P1} = <<1>,<1>>
{q=Q1 ; p=PQ Q1} = <<1>,<2,1>>
{q=Q1 ; p=PQ (QR R1)} = <<1>,<2,2,1>>
{q=Q1 ; p=PQ (QR R2)} = <<1>,<2,2,2>>
{q=QR R1 ; p=P1} = <<2,1>,<1>>
{q=QR R1 ; p=PQ Q1} = <<2,1>,<2,1>>
{q=QR R1 ; p=PQ (QR R1)} = <<2,1>,<2,2,1>>
{q=QR R1 ; p=PQ (QR R2)} = <<2,1>,<2,2,2>>
{q=QR R2 ; p=P1} = <<2,2>,<1>>
{q=QR R2 ; p=PQ Q1} = <<2,2>,<2,1>>
{q=QR R2 ; p=PQ (QR R1)} = <<2,2>,<2,2,1>>
{q=QR R2 ; p=PQ (QR R2)} = <<2,2>,<2,2,2>>
**NOTE**: GF will swap q and p in record, as part of record field sorting, resulting in the following:
{p:P ; q:Q} => Str
< <"P1;Q1", <"P1;QR R1","P1;QR R2">>,
< <"PQ Q1;Q1", <"PQ Q1;QR R1","PQ Q1;QR R2">>,
< <"PQ (QR R1);Q1", <"PQ (QR R1);QR R1","PQ (QR R1);QR R2">>,
<"PQ (QR R2);Q1", <"PQ (QR R2);QR R1","PQ (QR R2);QR R2">>
>
>
>
{p=P1 ; q=Q1} = <<1>,<1>>
{p=P1 ; q=QR R1} = <<1>,<2,1>>
{p=P1 ; q=QR R2} = <<1>,<2,2>>
{p=PQ Q1 ; q=Q1} = <<2,1>,<1>>
{p=PQ Q1 ; q=QR R1} = <<2,1>,<2,1>>
{p=PQ Q1 ; q=QR R2} = <<2,1>,<2,2>>
{p=PQ (QR R1) ; q=Q1} = <<2,2,1>,<1>>
{p=PQ (QR R1) ; q=QR R1} = <<2,2,1>,<2,1>>
{p=PQ (QR R1) ; q=QR R2} = <<2,2,1>,<2,2>>
{p=PQ (QR R2) ; q=Q1} = <<2,2,2>,<1>>
{p=PQ (QR R2) ; q=QR R1} = <<2,2,2>,<2,1>>
{p=PQ (QR R2) ; q=QR R2} = <<2,2,2>,<2,2>>
{pp: {p:P} ; q:Q} => Str
{pp={p=P1} ; q=Q1} = <<<1>>,<1>>
{pp={p=P1} ; q=QR R1} = <<<1>>,<2,1>>
{pp={p=P1} ; q=QR R2} = <<<1>>,<2,2>>
{pp={p=PQ Q1} ; q=Q1} = <<<2,1>>, <1>>
{pp={p=PQ Q1} ; q=QR R1} = <<<2,1>>, <2,1>>
{pp={p=PQ Q1} ; q=QR R2} = <<<2,1>>, <2,2>>
{pp={p=PQ (QR R1)} ; q=Q1} = <<<2,2,1>>,<1>>
{pp={p=PQ (QR R1)} ; q=QR R1} = <<<2,2,1>>,<2,1>>
{pp={p=PQ (QR R1)} ; q=QR R2} = <<<2,2,1>>,<2,2>>
{pp={p=PQ (QR R2)} ; q=Q1} = <<<2,2,2>>,<1>>
{pp={p=PQ (QR R2)} ; q=QR R1} = <<<2,2,2>>,<2,1>>
{pp={p=PQ (QR R2)} ; q=QR R2} = <<<2,2,2>>,<2,2>>
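The record-field sorting noted above can be sketched as follows (a hypothetical helper; the point is only that fields are ordered by label before encoding, which is why `{q=...; p=...}` is encoded as if written `{p=...; q=...}`):

```haskell
import Data.List (sortOn)

-- Sort record fields by label, as GF's canonical form does
-- before a record is encoded as a tuple of field encodings.
sortFields :: [(String, v)] -> [(String, v)]
sortFields = sortOn fst
```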

testsuite/lpgf/bench.hs

@@ -0,0 +1,193 @@
{-# LANGUAGE ScopedTypeVariables #-}
module Main where
import qualified LPGF
import qualified PGF
import qualified PGF2
import GF (compileToPGF, compileToLPGF, writePGF, writeLPGF)
import GF.Support (Options, Flags (..), Verbosity (..), noOptions, addOptions, modifyFlags)
import Control.DeepSeq (NFData, force)
import qualified Control.Exception as EX
import Control.Monad (when, forM)
import Data.Either (isLeft)
import qualified Data.List as L
import Data.Maybe (fromJust, isJust, isNothing)
import qualified Data.Map as Map
import Data.Text (Text)
import Data.Time.Clock (getCurrentTime, diffUTCTime)
import System.Console.ANSI
import System.Directory (listDirectory, getFileSize)
import System.Environment (getArgs)
import System.Exit (die)
import System.FilePath ((</>), (<.>), takeFileName, takeDirectory, dropExtension)
import Text.Printf (printf)
import GHC.Stats
options :: Options
options = addOptions (modifyFlags (\f -> f{optVerbosity=Quiet})) noOptions
usage :: String
usage = "Arguments:\n\
\ compile [pgf|lpgf] FoodsEng.gf FoodsGer.gf ...\n\
\ run [pgf|pgf2|lpgf] Foods.pgf test.trees\
\"
main :: IO ()
main = do
-- Parse command line arguments
args <- getArgs
let argc = length args
when (argc < 1) (die usage)
let (mode:_) = args
when (mode `L.notElem` ["compile","run"]) (die usage)
when (mode == "compile" && argc < 2) (die usage)
when (mode == "run" && argc < 3) (die usage)
let target = let a1 = args !! 1 in if a1 `elem` ["pgf", "pgf2", "lpgf"] then Just a1 else Nothing
let mods' = if mode == "compile" then drop (if isJust target then 2 else 1) args else []
mods <- concat <$> forM mods' (\mod ->
-- If * is supplied in module name, collect modules ourselves
if '*' `elem` mod
then do
let
dir = takeDirectory mod
pre = takeWhile (/='*') (takeFileName mod)
post = drop 1 $ dropWhile (/='*') (takeFileName mod)
map (dir </>)
. filter (\p -> let fn = takeFileName p in pre `L.isPrefixOf` fn && post `L.isSuffixOf` fn)
<$> listDirectory dir
else
return [mod]
)
let binaryFile = if mode == "run" then Just $ args !! (if isJust target then 2 else 1) else Nothing
let treesFile = if mode == "run" then Just $ args !! (if isJust target then 3 else 2) else Nothing
let doPGF = isNothing target || target == Just "pgf"
let doPGF2 = isNothing target || target == Just "pgf2"
let doLPGF = isNothing target || target == Just "lpgf"
-- Compilation
when (mode == "compile") $ do
when doPGF $ do
heading "PGF"
(path, pgf) <- time "- compile: " (compilePGF mods)
size <- getFileSize path
printf "- size: %s %s\n" (convertSize size) path
when doLPGF $ do
heading "LPGF"
(path, lpgf) <- time "- compile: " (compileLPGF mods)
size <- getFileSize path
printf "- size: %s %s\n" (convertSize size) path
-- Linearisation
when (mode == "run") $ do
-- Read trees
lns <- lines <$> readFile (fromJust treesFile)
let trees = map (fromJust . PGF.readExpr) lns
let trees2 = map (fromJust . PGF2.readExpr) lns
printf "Read %d trees\n" (length trees)
when doPGF $ do
heading "PGF"
pgf <- PGF.readPGF (dropExtension (fromJust binaryFile) <.> "pgf")
timePure "- linearise: " (linPGF pgf trees)
return ()
when doPGF2 $ do
heading "PGF2"
pgf <- PGF2.readPGF (dropExtension (fromJust binaryFile) <.> "pgf")
timePure "- linearise: " (linPGF2 pgf trees2)
return ()
when doLPGF $ do
heading "LPGF"
lpgf <- LPGF.readLPGF (dropExtension (fromJust binaryFile) <.> "lpgf")
-- timePure "- linearise: " (linLPGF lpgf trees)
ress <- time "- linearise: " (linLPGF' lpgf trees)
when (any (any isLeft) ress) $ do
setSGR [SetColor Foreground Dull Red]
putStrLn "Teminated with errors"
setSGR [Reset]
stats <- getRTSStats
printf "Max memory: %s\n" (convertSize (fromIntegral (max_mem_in_use_bytes stats)))
heading :: String -> IO ()
heading s = do
setSGR [SetColor Foreground Vivid Yellow, SetConsoleIntensity BoldIntensity]
putStrLn s
setSGR [Reset]
-- For accurate timing, the IO action must force evaluation itself (e.g., write to file)
time :: String -> IO a -> IO a
time desc io = do
start <- getCurrentTime
r <- io >>= EX.evaluate -- only WHNF
end <- getCurrentTime
putStrLn $ desc ++ show (diffUTCTime end start)
return r
-- Performs deep evaluation
timePure :: (NFData a) => String -> a -> IO a
timePure desc val = time desc (return $ force val)
compilePGF :: [FilePath] -> IO (FilePath, PGF.PGF)
compilePGF mods = do
pgf <- compileToPGF options mods
files <- writePGF options pgf
return (head files, pgf)
compileLPGF :: [FilePath] -> IO (FilePath, LPGF.LPGF)
compileLPGF mods = do
lpgf <- compileToLPGF options mods
file <- writeLPGF options lpgf
return (file, lpgf)
linPGF :: PGF.PGF -> [PGF.Expr] -> [[String]]
linPGF pgf trees =
[ map (PGF.linearize pgf lang) trees | lang <- PGF.languages pgf ]
linPGF2 :: PGF2.PGF -> [PGF2.Expr] -> [[String]]
linPGF2 pgf trees =
[ map (PGF2.linearize concr) trees | (_, concr) <- Map.toList (PGF2.languages pgf) ]
linLPGF :: LPGF.LPGF -> [PGF.Expr] -> [[Text]]
linLPGF lpgf trees =
[ map (LPGF.linearizeConcreteText concr) trees | (_,concr) <- Map.toList (LPGF.concretes lpgf) ]
linLPGF' :: LPGF.LPGF -> [PGF.Expr] -> IO [[Either String Text]]
linLPGF' lpgf trees =
forM (Map.toList (LPGF.concretes lpgf)) $ \(_,concr) -> mapM (try . LPGF.linearizeConcreteText concr) trees
-- | Produce human readable file size
-- Adapted from https://hackage.haskell.org/package/hrfsize
convertSize :: Integer -> String
convertSize = convertSize'' . fromInteger
convertSize' :: Double -> String
convertSize' size
| size < 1024.0 = printf "%.0v bytes" size
| size < 1024.0 ^ (2 :: Int) = printf "%.2v KiB" $ size / 1024.0
| size < 1024.0 ^ (3 :: Int) = printf "%.2v MiB" $ size / 1024.0 ^ (2 :: Int)
| size < 1024.0 ^ (4 :: Int) = printf "%.2v GiB" $ size / 1024.0 ^ (3 :: Int)
| otherwise = printf "%.2v TiB" $ size / 1024.0 ^ (4 :: Int)
convertSize'' :: Double -> String
convertSize'' size
| size < 1000 = printf "%.0v bytes" size
| size < 1000 ^ (2 :: Int) = printf "%.2v KB" $ size / 1000
| size < 1000 ^ (3 :: Int) = printf "%.2v MB" $ size / 1000 ^ (2 :: Int)
| size < 1000 ^ (4 :: Int) = printf "%.2v GB" $ size / 1000 ^ (3 :: Int)
| otherwise = printf "%.2v TB" $ size / 1000 ^ (4 :: Int)
-- | Run a computation and catch any exception/errors.
try :: a -> IO (Either String a)
try comp = do
let f = Right <$> EX.evaluate comp
EX.catch f (\(e :: EX.SomeException) -> return $ Left (show e))


@@ -0,0 +1,13 @@
--# -coding=latin1
resource CharactersGla = {
--Character classes
oper
vowel : pattern Str = #("a"|"e"|"i"|"o"|"u"|"à"|"è"|"ì"|"ò"|"ù") ;
vowelCap : pattern Str = #("A"|"E"|"I"|"O"|"U"|"À"|"É"|"Ì"|"Ò"|"Ù") ;
consonant : pattern Str = #("b"|"c"|"d"|"f"|"g"|"h"|"j"|"k"|"l"|"m"|"n"|"p"|"q"|"r"|"s"|"t"|"v"|"w"|"x"|"z") ;
consonantCap : pattern Str = #("B"|"C"|"D"|"F"|"G"|"H"|"J"|"K"|"L"|"M"|"N"|"P"|"Q"|"R"|"S"|"T"|"V"|"W"|"X"|"Z") ;
broadVowel : pattern Str = #("a"|"o"|"u"|"à"|"ò"|"ù") ;
slenderVowel : pattern Str = #("e"|"i"|"è"|"ì") ;
}


@@ -0,0 +1,13 @@
--# -coding=latin1
resource CharactersGle = {
--Character classes
oper
vowel : pattern Str = #("a"|"e"|"i"|"o"|"u"|"á"|"é"|"í"|"ó"|"ú") ;
vowelCap : pattern Str = #("A"|"E"|"I"|"O"|"U"|"Á"|"É"|"Í"|"Ó"|"Ú") ;
consonant : pattern Str = #("b"|"c"|"d"|"f"|"g"|"h"|"j"|"k"|"l"|"m"|"n"|"p"|"q"|"r"|"s"|"t"|"v"|"w"|"x"|"z") ;
consonantCap : pattern Str = #("B"|"C"|"D"|"F"|"G"|"H"|"J"|"K"|"L"|"M"|"N"|"P"|"Q"|"R"|"S"|"T"|"V"|"W"|"X"|"Z") ;
broadVowel : pattern Str = #("a"|"o"|"u"|"á"|"ó"|"ú") ;
slenderVowel : pattern Str = #("e"|"i"|"é"|"í") ;
}

File diff suppressed because it is too large


@@ -0,0 +1,15 @@
-- (c) 2009 Aarne Ranta under LGPL
abstract Foods = {
flags startcat = Comment ;
cat
Comment ; Item ; Kind ; Quality ;
fun
Pred : Item -> Quality -> Comment ;
This, That, These, Those : Kind -> Item ;
Mod : Quality -> Kind -> Kind ;
Wine, Cheese, Fish, Pizza : Kind ;
Very : Quality -> Quality ;
Fresh, Warm, Italian,
Expensive, Delicious, Boring : Quality ;
}


@@ -0,0 +1,185 @@
Foods: Pred (That Wine) Delicious
FoodsAfr: daardie wyn is heerlik
FoodsAmh: ያ ወይን ጣፋጭ ነው::
FoodsBul: онова вино е превъзходно
FoodsCat: aquell vi és deliciós
FoodsChi: 那 瓶 酒 是 美 味 的
FoodsCze: tamto víno je vynikající
FoodsDut: die wijn is lekker
FoodsEng: that wine is delicious
FoodsEpo: tiu vino estas bongusta
FoodsFin: tuo viini on herkullinen
FoodsFre: ce vin est délicieux
FoodsGer: jener Wein ist köstlich
FoodsGla: tha an fìon sin blasta
FoodsGle: tá an fíon sin blasta
FoodsHeb: היין ההוא טעים
FoodsHin: वह मदिरा स्वादिष्ट है
FoodsIce: þetta vín er ljúffengt
FoodsIta: quel vino è delizioso
FoodsJpn: その ワインは おいしい
FoodsLat: id vinum est iucundum
FoodsLav: tas vīns ir garšīgs
FoodsMkd: она вино е вкусно
FoodsMlt: dak l- inbid tajjeb
FoodsMon: тэр дарс бол амттай
FoodsNep: त्यो रक्सी स्वादिष्ट छ
FoodsOri: ସେଇ ମଦ ସ୍ଵାଦିସ୍ଟ ଅଟେ
FoodsPes: آن شراب لذىذ است
FoodsPor: esse vinho é delicioso
FoodsRon: acel vin este delicios
FoodsSpa: ese vino es delicioso
FoodsSwe: det där vinet är läckert
FoodsTha: เหล้าองุ่น ขวด นั้น อร่อย
FoodsTsn: bojalwa boo bo monate
FoodsTur: şu şarap lezzetlidir
FoodsUrd: وہ شراب مزیدار ہے
Foods: Pred (This Pizza) (Very Boring)
FoodsAfr: hierdie pizza is baie vervelig
FoodsAmh: ይህ [Pizza] በጣም አስቀያሚ ነው::
FoodsBul: тази пица е много еднообразна
FoodsCat: aquesta pizza és molt aburrida
FoodsChi: 这 张 比 萨 饼 是 非 常 难 吃 的
FoodsCze: tato pizza je velmi nudná
FoodsDut: deze pizza is erg saai
FoodsEng: this pizza is very boring
FoodsEpo: ĉi tiu pico estas tre enuiga
FoodsFin: tämä pizza on erittäin tylsä
FoodsFre: cette pizza est très ennuyeuse
FoodsGer: diese Pizza ist sehr langweilig
FoodsGla: tha an pizza seo glè leamh
FoodsGle: tá an píotsa seo an-leamh
FoodsHeb: הפיצה הזאת מאוד משעממת
FoodsHin: यह पिज़्ज़ा अति अरुचिकर है
FoodsIce: þessi flatbaka er mjög leiðinleg
FoodsIta: questa pizza è molto noiosa
FoodsJpn: この ピザは とても つまらない
FoodsLat: haec placenta neapolitana est valde fluens
FoodsLav: šī pica ir ļoti garlaicīga
FoodsMkd: оваа пица е многу досадна
FoodsMlt: din il- pizza tad-dwejjaq ħafna
FoodsMon: энэ пицца бол маш амтгүй
FoodsNep: यो पिज्जा धेरै नमिठा छ
FoodsOri: ଏଇ ପିଜଜ଼ା ଅତି ଅରୁଚିକର ଅଟେ
FoodsPes: این پیتزا خیلی ملال آور است
FoodsPor: esta pizza é muito chata
FoodsRon: această pizza este foarte plictisitoare
FoodsSpa: esta pizza es muy aburrida
FoodsSwe: den här pizzan är mycket tråkig
FoodsTha: พิซซา ถาด นี้ น่าเบิ่อ มาก
FoodsTsn: pizza e e bosula thata
FoodsTur: bu pizza çok sıkıcıdır
FoodsUrd: یھ پیزہ بہت فضول ہے
Foods: Pred (This Cheese) Fresh
FoodsAfr: hierdie kaas is vars
FoodsAmh: ይህ አይብ አዲስ ነው::
FoodsBul: това сирене е свежо
FoodsCat: aquest formatge és fresc
FoodsChi: 这 块 奶 酪 是 新 鲜 的
FoodsCze: tento sýr je čerstvý
FoodsDut: deze kaas is vers
FoodsEng: this cheese is fresh
FoodsEpo: ĉi tiu fromaĝo estas freŝa
FoodsFin: tämä juusto on tuore
FoodsFre: ce fromage est frais
FoodsGer: dieser Käse ist frisch
FoodsGla: tha an càise seo úr
FoodsGle: tá an cháis seo úr
FoodsHeb: הגבינה הזאת טריה
FoodsHin: यह पनीर ताज़ा है
FoodsIce: þessi ostur er ferskur
FoodsIta: questo formaggio è fresco
FoodsJpn: この チーズは 新鮮 だ
FoodsLat: hoc formaticum est recens
FoodsLav: šis siers ir svaigs
FoodsMkd: ова сирење е свежо
FoodsMlt: dan il- ġobon frisk
FoodsMon: энэ бяслаг бол шинэ
FoodsNep: यो चिज ताजा छ
FoodsOri: ଏଇ ଛେନା ତାଜା ଅଟେ
FoodsPes: این پنیر تازه است
FoodsPor: este queijo é fresco
FoodsRon: această brânză este proaspătă
FoodsSpa: este queso es fresco
FoodsSwe: den här osten är färsk
FoodsTha: เนยแข็ง ก้อน นี้ สด
FoodsTsn: kase e e ntsha
FoodsTur: bu peynir tazedir
FoodsUrd: یھ پنیر تازہ ہے
Foods: Pred (Those Fish) Warm
FoodsAfr: daardie visse is warm
FoodsAmh: [Those] ትኩስ ነው::
FoodsBul: онези риби са горещи
FoodsCat: aquells peixos són calents
FoodsChi: 那 几 条 鱼 是 温 热 的
FoodsCze: tamty ryby jsou teplé
FoodsDut: die vissen zijn warm
FoodsEng: those fish are warm
FoodsEpo: tiuj fiŝoj estas varmaj
FoodsFin: nuo kalat ovat lämpimiä
FoodsFre: ces poissons sont chauds
FoodsGer: jene Fische sind warm
FoodsGla: tha na h-èisg sin blàth
FoodsGle: tá na héisc sin te
FoodsHeb: הדגים ההם חמים
FoodsHin: वे मछलीयँा गरम हैं
FoodsIce: þessir fiskar eru heitir
FoodsIta: quei pesci sono caldi
FoodsJpn: その 魚は あたたかい
FoodsLat: ei pisces sunt calidi
FoodsLav: tās zivis ir siltas
FoodsMkd: оние риби се топли
FoodsMlt: dawk il- ħut sħan
FoodsMon: тэдгээр загаснууд бол халуун
FoodsNep: ती माछाहरु तातो छन्
FoodsOri: ସେଇ ମାଛ ଗୁଡିକ ଗରମ ଅଟେ
FoodsPes: آن ماهىها گرم هستند
FoodsPor: esses peixes são quentes
FoodsRon: acei peşti sunt calzi
FoodsSpa: esos pescados son calientes
FoodsSwe: de där fiskarna är varma
FoodsTha: ปลา ตัว นั้น อุ่น
FoodsTsn: dithlapi tseo di bothitho
FoodsTur: şu balıklar ılıktır
FoodsUrd: وہ مچھلیاں گرم ہیں
Foods: Pred (That (Mod Boring (Mod Italian Pizza))) Expensive
FoodsAfr: daardie vervelige Italiaanse pizza is duur
FoodsAmh: ያ አስቀያሚ የጥልያን [Pizza] ውድ ነው::
FoodsBul: онази еднообразна италианска пица е скъпа
FoodsCat: aquella pizza italiana aburrida és cara
FoodsChi: 那 张 又 难 吃 又 意 大 利 式 的 比 萨 饼 是 昂 贵 的
FoodsCze: tamta nudná italská pizza je drahá
FoodsDut: die saaie Italiaanse pizza is duur
FoodsEng: that boring Italian pizza is expensive
FoodsEpo: tiu enuiga itala pico estas altekosta
FoodsFin: tuo tylsä italialainen pizza on kallis
FoodsFre: cette pizza italienne ennuyeuse est chère
FoodsGer: jene langweilige italienische Pizza ist teuer
FoodsGla: tha an pizza Eadailteach leamh sin daor
FoodsGle: tá an píotsa Iodálach leamh sin daor
FoodsHeb: הפיצה האיטלקית המשעממת ההיא יקרה
FoodsHin: वह अरुचिकर इटली पिज़्ज़ा बहुमूल्य है
FoodsIce: þessi leiðinlega ítalska flatbaka er dýr
FoodsIta: quella pizza italiana noiosa è cara
FoodsJpn: その つまらない イタリアの ピザは たかい
FoodsLat: ea placenta itala fluens neapolitana est pretiosa
FoodsLav: tā garlaicīgā itāļu pica ir dārga
FoodsMkd: онаа досадна италијанска пица е скапа
FoodsMlt: dik il- pizza Taljana tad-dwejjaq għalja
FoodsMon: тэр амтгүй итали пицца бол үнэтэй
FoodsNep: त्यो नमिठा इटालियन पिज्जा महँगो छ
FoodsOri: ସେଇ ଅରୁଚିକର ଇଟାଲି ପିଜଜ଼ା ମୁଲ୍ୟବାନ୍ ଅଟେ
FoodsPes: آن پیتزا ایتالیایی ى ملال آور گران است
FoodsPor: essa pizza Italiana chata é cara
FoodsRon: acea pizza italiană plictisitoare este scumpă
FoodsSpa: esa pizza italiana aburrida es cara
FoodsSwe: den där tråkiga italienska pizzan är dyr
FoodsTha: พิซซา อิตาลี น่าเบิ่อ ถาด นั้น แพง
FoodsTsn: pizza eo ya ga Itali le e e bosula e a tura
FoodsTur: şu sıkıcı İtalyan pizzası pahalıdır
FoodsUrd: وہ فضول اٹا لوی پیزہ مہنگا ہے


@@ -0,0 +1,5 @@
Pred (That Wine) Delicious
Pred (This Pizza) (Very Boring)
Pred (This Cheese) Fresh
Pred (Those Fish) Warm
Pred (That (Mod Boring (Mod Italian Pizza))) Expensive


@@ -0,0 +1,77 @@
-- (c) 2009 Laurette Pretorius Sr & Jr and Ansu Berg under LGPL
--# -coding=latin1
concrete FoodsAfr of Foods = open Prelude, Predef in{
lincat
Comment = {s: Str} ;
Kind = {s: Number => Str} ;
Item = {s: Str ; n: Number} ;
Quality = {s: AdjAP => Str} ;
lin
Pred item quality = {s = item.s ++ "is" ++ (quality.s ! Predic)};
This kind = {s = "hierdie" ++ (kind.s ! Sg); n = Sg};
That kind = {s = "daardie" ++ (kind.s ! Sg); n = Sg};
These kind = {s = "hierdie" ++ (kind.s ! Pl); n = Pl};
Those kind = {s = "daardie" ++ (kind.s ! Pl); n = Pl};
Mod quality kind = {s = table{n => (quality.s ! Attr) ++ (kind.s!n)}};
Wine = declNoun_e "wyn";
Cheese = declNoun_aa "kaas";
Fish = declNoun_ss "vis";
Pizza = declNoun_s "pizza";
Very quality = veryAdj quality;
Fresh = regAdj "vars";
Warm = regAdj "warm";
Italian = smartAdj_e "Italiaans";
Expensive = regAdj "duur";
Delicious = smartAdj_e "heerlik";
Boring = smartAdj_e "vervelig";
param
AdjAP = Attr | Predic ;
Number = Sg | Pl ;
oper
--Noun operations (wyn, kaas, vis, pizza)
declNoun_aa: Str -> {s: Number => Str} = \x ->
let v = tk 2 x
in
{s = table{Sg => x ; Pl => v + (last x) +"e"}};
declNoun_e: Str -> {s: Number => Str} = \x -> {s = table{Sg => x ; Pl => x + "e"}} ;
declNoun_s: Str -> {s: Number => Str} = \x -> {s = table{Sg => x ; Pl => x + "s"}} ;
declNoun_ss: Str -> {s: Number => Str} = \x -> {s = table{Sg => x ; Pl => x + (last x) + "e"}} ;
--Adjective operations
mkAdj : Str -> Str -> {s: AdjAP => Str} = \x,y -> {s = table{Attr => x; Predic => y}};
declAdj_e : Str -> {s : AdjAP=> Str} = \x -> mkAdj (x + "e") x;
declAdj_g : Str -> {s : AdjAP=> Str} = \w ->
let v = init w
in mkAdj (v + "ë") w ;
declAdj_oog : Str -> {s : AdjAP=> Str} = \w ->
let v = init w
in
let i = init v
in mkAdj (i + "ë") w ;
regAdj : Str -> {s : AdjAP=> Str} = \x -> mkAdj x x;
veryAdj : {s: AdjAP => Str} -> {s : AdjAP=> Str} = \x -> {s = table{a => "baie" ++ (x.s!a)}};
smartAdj_e : Str -> {s : AdjAP=> Str} = \a -> case a of
{
_ + "oog" => declAdj_oog a ;
_ + ("e" | "ie" | "o" | "oe") + "g" => declAdj_g a ;
_ => declAdj_e a
};
}


@@ -0,0 +1,21 @@
concrete FoodsAmh of Foods ={
flags coding = utf8;
lincat
Comment,Item,Kind,Quality = Str;
lin
Pred item quality = item ++ quality++ "ነው::" ;
This kind = "ይህ" ++ kind;
That kind = "ያ" ++ kind;
Mod quality kind = quality ++ kind;
Wine = "ወይን";
Cheese = "አይብ";
Fish = "ዓሳ";
Very quality = "በጣም" ++ quality;
Fresh = "አዲስ";
Warm = "ትኩስ";
Italian = "የጥልያን";
Expensive = "ውድ";
Delicious = "ጣፋጭ";
Boring = "አስቀያሚ";
}

View File

@@ -0,0 +1,43 @@
-- (c) 2009 Krasimir Angelov under LGPL
concrete FoodsBul of Foods = {
flags
coding = utf8;
param
Gender = Masc | Fem | Neutr;
Number = Sg | Pl;
Agr = ASg Gender | APl ;
lincat
Comment = Str ;
Quality = {s : Agr => Str} ;
Item = {s : Str; a : Agr} ;
Kind = {s : Number => Str; g : Gender} ;
lin
Pred item qual = item.s ++ case item.a of {ASg _ => "е"; APl => "са"} ++ qual.s ! item.a ;
This kind = {s=case kind.g of {Masc=>"този"; Fem=>"тази"; Neutr=>"това" } ++ kind.s ! Sg; a=ASg kind.g} ;
That kind = {s=case kind.g of {Masc=>"онзи"; Fem=>"онази"; Neutr=>"онова"} ++ kind.s ! Sg; a=ASg kind.g} ;
These kind = {s="тези" ++ kind.s ! Pl; a=APl} ;
Those kind = {s="онези" ++ kind.s ! Pl; a=APl} ;
Mod qual kind = {s=\\n => qual.s ! (case n of {Sg => ASg kind.g; Pl => APl}) ++ kind.s ! n; g=kind.g} ;
Wine = {s = table {Sg => "вино"; Pl => "вина"}; g = Neutr};
Cheese = {s = table {Sg => "сирене"; Pl => "сирена"}; g = Neutr};
Fish = {s = table {Sg => "риба"; Pl => "риби"}; g = Fem};
Pizza = {s = table {Sg => "пица"; Pl => "пици"}; g = Fem};
Very qual = {s = \\g => "много" ++ qual.s ! g};
Fresh = {s = table {ASg Masc => "свеж"; ASg Fem => "свежа"; ASg Neutr => "свежо"; APl => "свежи"}};
Warm = {s = table {ASg Masc => "горещ"; ASg Fem => "гореща"; ASg Neutr => "горещо"; APl => "горещи"}};
Italian = {s = table {ASg Masc => "италиански"; ASg Fem => "италианска"; ASg Neutr => "италианско"; APl => "италиански"}};
Expensive = {s = table {ASg Masc => "скъп"; ASg Fem => "скъпа"; ASg Neutr => "скъпо"; APl => "скъпи"}};
Delicious = {s = table {ASg Masc => "превъзходен"; ASg Fem => "превъзходна"; ASg Neutr => "превъзходно"; APl => "превъзходни"}};
Boring = {s = table {ASg Masc => "еднообразен"; ASg Fem => "еднообразна"; ASg Neutr => "еднообразно"; APl => "еднообразни"}};
}


@@ -0,0 +1,6 @@
-- (c) 2009 Jordi Saludes under LGPL
concrete FoodsCat of Foods = FoodsI with
(Syntax = SyntaxCat),
(LexFoods = LexFoodsCat) ;


@@ -0,0 +1,56 @@
concrete FoodsChi of Foods = open Prelude in {
flags coding = utf8 ;
lincat
Comment, Item = Str;
Kind = knd ;
Quality = qual ;
lin
Pred = (\itm, ql ->
case ql.hasVery of {
True => itm ++ "是 非 常" ++ ql.s ++ ql.p ;
False => itm ++ "是" ++ ql.s ++ ql.p } ) ;
This kind = "这" ++ kind.c ++ kind.m ++ kind.s ;
That kind = "那" ++ kind.c ++ kind.m ++ kind.s ;
These kind = "这" ++ "几" ++ kind.c ++ kind.m ++ kind.s ;
Those kind = "那" ++ "几" ++ kind.c ++ kind.m ++ kind.s ;
Mod = modifier ;
Wine = geKind "酒" "瓶" ;
Pizza = geKind "比 萨 饼" "张" ;
Cheese = geKind "奶 酪" "块";
Fish = geKind "鱼" "条";
Very = (\q -> {s = q.s ; p = q.p ; hasVery = True}) ;
Fresh = longQuality "新 鲜" ;
Warm = longQuality "温 热" ;
Italian = longQuality "意 大 利 式" ;
Expensive = longQuality "昂 贵" ;
Delicious = longQuality "美 味" ;
-- strictly speaking, this translates to "unpalatable" rather than "boring"
Boring = longQuality "难 吃" ;
oper
-- lincat aliases
qual : Type = {s,p : Str ; hasVery : Bool} ;
knd : Type = {s,c,m : Str; hasMod : Bool} ;
-- Constructor functions
mkKind : Str -> Str -> knd = \s,c ->
{s = s ; c = c; m = ""; hasMod = False} ;
geKind : Str -> Str -> knd = \s,cl ->
mkKind s (classifier cl) ;
longQuality : Str -> qual = \s ->
{s = s ; p = "的" ; hasVery = False} ;
modifier : qual -> knd -> knd = \q,k ->
{ s = k.s ; c = k.c ; m = modJoin k.hasMod q k.m ;
hasMod = True } ;
-- Helper functions
classifier : Str -> Str = \s ->
case s of {"" => "个" ; _ => s };
modJoin : Bool -> qual -> Str -> Str = \bool, q,m ->
case bool of {
True => "又" ++ q.s ++ "又" ++ m ;
False => q.s ++ q.p } ;
}


@@ -0,0 +1,35 @@
-- (c) 2011 Katerina Bohmova under LGPL
concrete FoodsCze of Foods = open ResCze in {
flags
coding = utf8 ;
lincat
Comment = {s : Str} ;
Quality = Adjective ;
Kind = Noun ;
Item = NounPhrase ;
lin
Pred item quality =
{s = item.s ++ copula ! item.n ++
quality.s ! item.g ! item.n} ;
This = det Sg "tento" "tato" "toto" ;
That = det Sg "tamten" "tamta" "tamto" ;
These = det Pl "tyto" "tyto" "tato" ;
Those = det Pl "tamty" "tamty" "tamta" ;
Mod quality kind = {
s = \\n => quality.s ! kind.g ! n ++ kind.s ! n ;
g = kind.g
} ;
Wine = noun "víno" "vína" Neutr ;
Cheese = noun "sýr" "sýry" Masc ;
Fish = noun "ryba" "ryby" Fem ;
Pizza = noun "pizza" "pizzy" Fem ;
Very qual = {s = \\g,n => "velmi" ++ qual.s ! g ! n} ;
Fresh = regAdj "čerstv" ;
Warm = regAdj "tepl" ;
Italian = regAdj "italsk" ;
Expensive = regAdj "drah" ;
Delicious = regnfAdj "vynikající" ;
Boring = regAdj "nudn" ;
}


@@ -0,0 +1,58 @@
-- (c) 2009 Femke Johansson under LGPL
concrete FoodsDut of Foods = {
lincat
Comment = {s : Str};
Quality = {s : AForm => Str};
Kind = { s : Number => Str};
Item = {s : Str ; n : Number};
lin
Pred item quality =
{s = item.s ++ copula ! item.n ++ quality.s ! APred};
This = det Sg "deze";
These = det Pl "deze";
That = det Sg "die";
Those = det Pl "die";
Mod quality kind =
{s = \\n => quality.s ! AAttr ++ kind.s ! n};
Wine = regNoun "wijn";
Cheese = noun "kaas" "kazen";
Fish = noun "vis" "vissen";
Pizza = noun "pizza" "pizza's";
Very a = {s = \\f => "erg" ++ a.s ! f};
Fresh = regadj "vers";
Warm = regadj "warm";
Italian = regadj "Italiaans";
Expensive = adj "duur" "dure";
Delicious = regadj "lekker";
Boring = regadj "saai";
param
Number = Sg | Pl;
AForm = APred | AAttr;
oper
det : Number -> Str ->
{s : Number => Str} -> {s : Str ; n: Number} =
\n,det,noun -> {s = det ++ noun.s ! n ; n=n};
noun : Str -> Str -> {s : Number => Str} =
\man,men -> {s = table {Sg => man; Pl => men}};
regNoun : Str -> {s : Number => Str} =
\wijn -> noun wijn (wijn + "en");
regadj : Str -> {s : AForm => Str} =
\koud -> adj koud (koud+"e");
adj : Str -> Str -> {s : AForm => Str} =
\duur, dure -> {s = table {APred => duur; AAttr => dure}};
copula : Number => Str =
table {Sg => "is" ; Pl => "zijn"};
}


@@ -0,0 +1,43 @@
-- (c) 2009 Aarne Ranta under LGPL
concrete FoodsEng of Foods = {
flags language = en_US;
lincat
Comment, Quality = {s : Str} ;
Kind = {s : Number => Str} ;
Item = {s : Str ; n : Number} ;
lin
Pred item quality =
{s = item.s ++ copula ! item.n ++ quality.s} ;
This = det Sg "this" ;
That = det Sg "that" ;
These = det Pl "these" ;
Those = det Pl "those" ;
Mod quality kind =
{s = \\n => quality.s ++ kind.s ! n} ;
Wine = regNoun "wine" ;
Cheese = regNoun "cheese" ;
Fish = noun "fish" "fish" ;
Pizza = regNoun "pizza" ;
Very a = {s = "very" ++ a.s} ;
Fresh = adj "fresh" ;
Warm = adj "warm" ;
Italian = adj "Italian" ;
Expensive = adj "expensive" ;
Delicious = adj "delicious" ;
Boring = adj "boring" ;
param
Number = Sg | Pl ;
oper
det : Number -> Str ->
{s : Number => Str} -> {s : Str ; n : Number} =
\n,det,noun -> {s = det ++ noun.s ! n ; n = n} ;
noun : Str -> Str -> {s : Number => Str} =
\man,men -> {s = table {Sg => man ; Pl => men}} ;
regNoun : Str -> {s : Number => Str} =
\car -> noun car (car + "s") ;
adj : Str -> {s : Str} =
\cold -> {s = cold} ;
copula : Number => Str =
table {Sg => "is" ; Pl => "are"} ;
}
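The linearization pattern in FoodsEng above — nouns as number-indexed tables, determiners fixing the number, and a copula agreeing with it — can be sketched outside GF. A minimal Python analogue (all names and helpers here are illustrative, not part of the grammar):

```python
# Sketch of FoodsEng-style linearization: tables become dicts,
# and the determiner fixes the number feature of the item.
SG, PL = "Sg", "Pl"

def noun(sg, pl):
    # noun : Str -> Str -> {s : Number => Str}
    return {SG: sg, PL: pl}

def reg_noun(s):
    # regNoun: regular plural in -s, e.g. "pizza" -> "pizzas"
    return noun(s, s + "s")

def det(n, d, cn):
    # det fixes the number and selects that form of the noun table
    return {"s": d + " " + cn[n], "n": n}

def pred(item, quality):
    # the copula agrees with the item's inherent number
    copula = {SG: "is", PL: "are"}[item["n"]]
    return item["s"] + " " + copula + " " + quality

print(pred(det(PL, "these", reg_noun("pizza")), "delicious"))
# -> these pizzas are delicious
```

The key design point, as in the GF source, is that number is a *selecting* feature on Kind (a table) but an *inherent* feature on Item (a record field), so Pred can read it back for agreement.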


@@ -0,0 +1,48 @@
-- (c) 2009 Julia Hammar under LGPL
concrete FoodsEpo of Foods = open Prelude in {
flags coding = utf8 ;
lincat
Comment = SS ;
Kind, Quality = {s : Number => Str} ;
Item = {s : Str ; n : Number} ;
lin
Pred item quality = ss (item.s ++ copula ! item.n ++ quality.s ! item.n) ;
This = det Sg "ĉi tiu" ;
That = det Sg "tiu" ;
These = det Pl "ĉi tiuj" ;
Those = det Pl "tiuj" ;
Mod quality kind = {s = \\n => quality.s ! n ++ kind.s ! n} ;
Wine = regNoun "vino" ;
Cheese = regNoun "fromaĝo" ;
Fish = regNoun "fiŝo" ;
Pizza = regNoun "pico" ;
Very quality = {s = \\n => "tre" ++ quality.s ! n} ;
Fresh = regAdj "freŝa" ;
Warm = regAdj "varma" ;
Italian = regAdj "itala" ;
Expensive = regAdj "altekosta" ;
Delicious = regAdj "bongusta" ;
Boring = regAdj "enuiga" ;
param
Number = Sg | Pl ;
oper
det : Number -> Str -> {s : Number => Str} -> {s : Str ; n : Number} =
\n,d,cn -> {
s = d ++ cn.s ! n ;
n = n
} ;
regNoun : Str -> {s : Number => Str} =
\vino -> {s = table {Sg => vino ; Pl => vino + "j"}
} ;
regAdj : Str -> {s : Number => Str} =
\nova -> {s = table {Sg => nova ; Pl => nova + "j"}
} ;
copula : Number => Str = \\_ => "estas" ;
}


@@ -0,0 +1,6 @@
-- (c) 2009 Aarne Ranta under LGPL
concrete FoodsFin of Foods = FoodsI with
(Syntax = SyntaxFin),
(LexFoods = LexFoodsFin) ;


@@ -0,0 +1,31 @@
concrete FoodsFre of Foods = open SyntaxFre, ParadigmsFre in {
flags coding = utf8 ;
lincat
Comment = Utt ;
Item = NP ;
Kind = CN ;
Quality = AP ;
lin
Pred item quality = mkUtt (mkCl item quality) ;
This kind = mkNP this_QuantSg kind ;
That kind = mkNP that_QuantSg kind ;
These kind = mkNP these_QuantPl kind ;
Those kind = mkNP those_QuantPl kind ;
Mod quality kind = mkCN quality kind ;
Very quality = mkAP very_AdA quality ;
Wine = mkCN (mkN "vin" masculine) ;
Pizza = mkCN (mkN "pizza" feminine) ;
Cheese = mkCN (mkN "fromage" masculine) ;
Fish = mkCN (mkN "poisson" masculine) ;
Fresh = mkAP (mkA "frais" "fraîche" "frais" "fraîchement") ;
Warm = mkAP (mkA "chaud") ;
Italian = mkAP (mkA "italien") ;
Expensive = mkAP (mkA "cher") ;
Delicious = mkAP (mkA "délicieux") ;
Boring = mkAP (mkA "ennuyeux") ;
}


@@ -0,0 +1,6 @@
-- (c) 2009 Aarne Ranta under LGPL
concrete FoodsGer of Foods = FoodsI with
(Syntax = SyntaxGer),
(LexFoods = LexFoodsGer) ;


@@ -0,0 +1,67 @@
--# -coding=latin1
concrete FoodsGla of Foods = open MutationsGla, CharactersGla, Prelude in {
param Gender = Masc|Fem ;
param Number = Sg|Pl ;
param Breadth = Broad|Slender|NoBreadth ;
param Beginning = Bcgmp|Other ;
lincat Comment = Str;
lin Pred item quality = "tha" ++ item ++ quality.s!Sg!Unmutated ;
lincat Item = Str;
lin
This kind = (addArticleSg kind) ++ "seo" ;
That kind = (addArticleSg kind) ++ "sin";
These kind = (addArticlePl kind) ++ "seo" ;
Those kind = (addArticlePl kind) ++ "sin" ;
oper addArticleSg : {s : Number => Mutation => Str; g : Gender} -> Str =
\kind -> case kind.g of { Masc => "an" ++ kind.s!Sg!PrefixT; Fem => "a'" ++ kind.s!Sg!Lenition1DNTLS } ;
oper addArticlePl : {s : Number => Mutation => Str; g : Gender} -> Str =
\kind -> "na" ++ kind.s!Pl!PrefixH ;
oper Noun : Type = {s : Number => Mutation => Str; g : Gender; pe : Breadth; beginning: Beginning; };
lincat Kind = Noun;
lin
Mod quality kind = {
s = table{
Sg => table{mutation => kind.s!Sg!mutation ++ case kind.g of {Masc => quality.s!Sg!Unmutated; Fem => quality.s!Sg!Lenition1} };
Pl => table{mutation => kind.s!Pl!mutation ++ case kind.pe of {Slender => quality.s!Pl!Lenition1; _ => quality.s!Pl!Unmutated} }
};
g = kind.g;
pe = kind.pe;
beginning = kind.beginning
} ;
Wine = makeNoun "fìon" "fìontan" Masc ;
Cheese = makeNoun "càise" "càisean" Masc ;
Fish = makeNoun "iasg" "èisg" Masc ;
Pizza = makeNoun "pizza" "pizzathan" Masc ;
oper makeNoun : Str -> Str -> Gender -> Noun = \sg,pl,g -> {
s = table{Sg => (mutate sg); Pl => (mutate pl)};
g = g;
pe = pe;
beginning = Bcgmp
}
where {
pe : Breadth = case pl of {
_ + v@(#broadVowel) + c@(#consonant*) + #consonant => Broad;
_ + v@(#slenderVowel) + c@(#consonant*) + #consonant => Slender;
_ => NoBreadth
}
};
oper Adjective : Type = {s : Number => Mutation => Str; sVery : Number => Str};
lincat Quality = Adjective;
lin
Very quality = {s=table{number => table{_ => quality.sVery!number}}; sVery=quality.sVery } ;
Fresh = makeAdjective "ùr" "ùra" ;
Warm = makeAdjective "blàth" "blàtha" ;
Italian = makeAdjective "Eadailteach" "Eadailteach" ;
Expensive = makeAdjective "daor" "daora" ;
Delicious = makeAdjective "blasta" "blasta" ;
Boring = makeAdjective "leamh" "leamha" ;
oper makeAdjective : Str -> Str -> Adjective =
\sg,pl -> {
s=table{Sg => (mutate sg); Pl => (mutate pl)};
sVery=table{Sg => "glè"++(lenition1dntls sg); Pl => "glè"++(lenition1dntls pl)}
} ;
}


@@ -0,0 +1,60 @@
--# -coding=latin1
concrete FoodsGle of Foods = open MutationsGle, CharactersGle in {
param Gender = Masc|Fem ;
param Number = Sg|Pl ;
param Breadth = Broad|Slender|NoBreadth ;
lincat Comment = Str;
lin Pred item quality = "tá" ++ item ++ quality.s!Sg!Unmutated ;
lincat Item = Str;
lin
This kind = (addArticleSg kind) ++ "seo" ;
That kind = (addArticleSg kind) ++ "sin";
These kind = (addArticlePl kind) ++ "seo" ;
Those kind = (addArticlePl kind) ++ "sin" ;
oper addArticleSg : {s : Number => Mutation => Str; g : Gender} -> Str =
\kind -> "an" ++ case kind.g of { Masc => kind.s!Sg!PrefixT; Fem => kind.s!Sg!Lenition1DNTLS } ;
oper addArticlePl : {s : Number => Mutation => Str; g : Gender} -> Str =
\kind -> "na" ++ kind.s!Pl!PrefixH ;
lincat Kind = {s : Number => Mutation => Str; g : Gender; pe : Breadth} ;
lin
Mod quality kind = {
s = table{
Sg => table{mutation => kind.s!Sg!mutation ++ case kind.g of {Masc => quality.s!Sg!Unmutated; Fem => quality.s!Sg!Lenition1} };
Pl => table{mutation => kind.s!Pl!mutation ++ case kind.pe of {Slender => quality.s!Pl!Lenition1; _ => quality.s!Pl!Unmutated} }
};
g = kind.g;
pe = kind.pe
} ;
Wine = makeNoun "fíon" "fíonta" Masc ;
Cheese = makeNoun "cáis" "cáiseanna" Fem ;
Fish = makeNoun "iasc" "éisc" Masc ;
Pizza = makeNoun "píotsa" "píotsaí" Masc ;
oper makeNoun : Str -> Str -> Gender -> {s : Number => Mutation => Str; g : Gender; pe : Breadth} =
\sg,pl,g -> {
s = table{Sg => (mutate sg); Pl => (mutate pl)};
g = g;
pe = case pl of {
_ + v@(#broadVowel) + c@(#consonant*) + #consonant => Broad;
_ + v@(#slenderVowel) + c@(#consonant*) + #consonant => Slender;
_ => NoBreadth
}
} ;
lincat Quality = {s : Number => Mutation => Str; sVery : Number => Str} ;
lin
Very quality = {s=table{number => table{_ => quality.sVery!number}}; sVery=quality.sVery } ;
Fresh = makeAdjective "úr" "úra" ;
Warm = makeAdjective "te" "te" ;
Italian = makeAdjective "Iodálach" "Iodálacha" ;
Expensive = makeAdjective "daor" "daora" ;
Delicious = makeAdjective "blasta" "blasta" ;
Boring = makeAdjective "leamh" "leamha" ;
oper makeAdjective : Str -> Str -> {s : Number => Mutation => Str; sVery : Number => Str} =
\sg,pl -> {
s=table{Sg => (mutate sg); Pl => (mutate pl)};
sVery=table{Sg => "an-"+(lenition1dntls sg); Pl => "an-"+(lenition1dntls pl)}
} ;
}


@@ -0,0 +1,107 @@
--(c) 2009 Dana Dannells
-- Licensed under LGPL
concrete FoodsHeb of Foods = open Prelude in {
flags coding=utf8 ;
lincat
Comment = SS ;
Quality = {s: Number => Species => Gender => Str} ;
Kind = {s : Number => Species => Str ; g : Gender ; mod : Modified} ;
Item = {s : Str ; g : Gender ; n : Number ; sp : Species ; mod : Modified} ;
lin
Pred item quality = ss (item.s ++ quality.s ! item.n ! Indef ! item.g ) ;
This = det Sg Def "הזה" "הזאת";
That = det Sg Def "ההוא" "ההיא" ;
These = det Pl Def "האלה" "האלה" ;
Those = det Pl Def "ההם" "ההן" ;
Mod quality kind = {
s = \\n,sp => kind.s ! n ! sp ++ quality.s ! n ! sp ! kind.g;
g = kind.g ;
mod = T
} ;
Wine = regNoun "יין" "יינות" Masc ;
Cheese = regNoun "גבינה" "גבינות" Fem ;
Fish = regNoun "דג" "דגים" Masc ;
Pizza = regNoun "פיצה" "פיצות" Fem ;
Very qual = {s = \\n,sp,g => "מאוד" ++ qual.s ! n ! sp ! g} ;
Fresh = regAdj "טרי" ;
Warm = regAdj "חם" ;
Italian = regAdj2 "איטלקי" ;
Expensive = regAdj "יקר" ;
Delicious = regAdj "טעים" ;
Boring = regAdj2 "משעמם";
param
Number = Sg | Pl ;
Gender = Masc | Fem ;
Species = Def | Indef ;
Modified = T | F ;
oper
Noun : Type = {s : Number => Species => Str ; g : Gender ; mod : Modified } ;
Adj : Type = {s : Number => Species => Gender => Str} ;
det : Number -> Species -> Str -> Str -> Noun ->
{s : Str ; g :Gender ; n : Number ; sp : Species ; mod : Modified} =
\n,sp,m,f,cn -> {
s = cn.s ! n ! sp ++ case cn.g of {Masc => m ; Fem => f} ;
g = cn.g ;
n = n ;
sp = sp ;
mod = cn.mod
} ;
noun : (gvina,hagvina,gvinot,hagvinot : Str) -> Gender -> Noun =
\gvina,hagvina,gvinot,hagvinot,g -> {
s = table {
Sg => table {
Indef => gvina ;
Def => hagvina
} ;
Pl => table {
Indef => gvinot ;
Def => hagvinot
}
} ;
g = g ;
mod = F
} ;
regNoun : Str -> Str -> Gender -> Noun =
\gvina,gvinot, g ->
noun gvina (defH gvina) gvinot (defH gvinot) g ;
defH : Str -> Str = \cn -> "ה" + cn ;
replaceLastLetter : Str -> Str = \c ->
case c of {"ף" => "פ" ; "ם" => "מ" ; "ן" => "נ" ; "ץ" => "צ" ; "ך" => "כ"; _ => c} ;
adjective : (_,_,_,_ : Str) -> Adj =
\tov,tova,tovim,tovot -> {
s = table {
Sg => table {
Indef => table { Masc => tov ; Fem => tova } ;
Def => table { Masc => defH tov ; Fem => defH tova }
} ;
Pl => table {
Indef => table {Masc => tovim ; Fem => tovot } ;
Def => table { Masc => defH tovim ; Fem => defH tovot }
}
}
} ;
regAdj : Str -> Adj = \tov ->
case tov of { to + c@? =>
adjective tov (to + replaceLastLetter c + "ה") (to + replaceLastLetter c + "ים") (to + replaceLastLetter c + "ות") } ;
regAdj2 : Str -> Adj = \italki ->
case italki of { italk + c@? =>
adjective italki (italk + replaceLastLetter c + "ת") (italk + replaceLastLetter c + "ים") (italk + replaceLastLetter c + "ות") } ;
} -- FoodsHeb


@@ -0,0 +1,75 @@
-- (c) 2010 Vikash Rauniyar under LGPL
concrete FoodsHin of Foods = {
flags coding=utf8 ;
param
Gender = Masc | Fem ;
Number = Sg | Pl ;
lincat
Comment = {s : Str} ;
Item = {s : Str ; g : Gender ; n : Number} ;
Kind = {s : Number => Str ; g : Gender} ;
Quality = {s : Gender => Number => Str} ;
lin
Pred item quality = {
s = item.s ++ quality.s ! item.g ! item.n ++ copula item.n
} ;
This kind = {s = "यह" ++ kind.s ! Sg ; g = kind.g ; n = Sg} ;
That kind = {s = "वह" ++ kind.s ! Sg ; g = kind.g ; n = Sg} ;
These kind = {s = "ये" ++ kind.s ! Pl ; g = kind.g ; n = Pl} ;
Those kind = {s = "वे" ++ kind.s ! Pl ; g = kind.g ; n = Pl} ;
Mod quality kind = {
s = \\n => quality.s ! kind.g ! n ++ kind.s ! n ;
g = kind.g
} ;
Wine = regN "मदिरा" ;
Cheese = regN "पनीर" ;
Fish = regN "मछली" ;
Pizza = regN "पिज़्ज़ा" ;
Very quality = {s = \\g,n => "अति" ++ quality.s ! g ! n} ;
Fresh = regAdj "ताज़ा" ;
Warm = regAdj "गरम" ;
Italian = regAdj "इटली" ;
Expensive = regAdj "बहुमूल्य" ;
Delicious = regAdj "स्वादिष्ट" ;
Boring = regAdj "अरुचिकर" ;
oper
mkN : Str -> Str -> Gender -> {s : Number => Str ; g : Gender} =
\s,p,g -> {
s = table {
Sg => s ;
Pl => p
} ;
g = g
} ;
regN : Str -> {s : Number => Str ; g : Gender} = \s -> case s of {
lark + "ा" => mkN s (lark + "े") Masc ;
lark + "ी" => mkN s (lark + "ियाँ") Fem ;
_ => mkN s s Masc
} ;
mkAdj : Str -> Str -> Str -> {s : Gender => Number => Str} = \ms,mp,f -> {
s = table {
Masc => table {
Sg => ms ;
Pl => mp
} ;
Fem => \\_ => f
}
} ;
regAdj : Str -> {s : Gender => Number => Str} = \a -> case a of {
acch + "ा" => mkAdj a (acch + "े") (acch + "ी") ;
_ => mkAdj a a a
} ;
copula : Number -> Str = \n -> case n of {
Sg => "है" ;
Pl => "हैं"
} ;
}
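The smart paradigms `regN` and `regAdj` above guess full inflection tables from a single dictionary form by matching its final vowel sign. A rough Python sketch of the same idea (an illustration under the assumption that the final matra determines the class, not code generated from the grammar):

```python
# Sketch of a FoodsHin-style smart paradigm: inspect the final
# Devanagari vowel sign to choose plural form and gender.
def reg_n(s):
    if s.endswith("ा"):                       # -aa masculine: plural in -e
        return {"Sg": s, "Pl": s[:-1] + "े", "g": "Masc"}
    if s.endswith("ी"):                       # -ii feminine: plural in -iyaan
        return {"Sg": s, "Pl": s[:-1] + "ियाँ", "g": "Fem"}
    return {"Sg": s, "Pl": s, "g": "Masc"}    # invariant fallback

print(reg_n("मछली")["Pl"])   # fish -> मछलियाँ
```

This mirrors GF's `case s of { lark + "ी" => ... }` pattern matching: the suffix is stripped and a class-specific ending is attached.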


@@ -0,0 +1,29 @@
-- (c) 2009 Aarne Ranta under LGPL
incomplete concrete FoodsI of Foods =
open Syntax, LexFoods in {
lincat
Comment = Utt ;
Item = NP ;
Kind = CN ;
Quality = AP ;
lin
Pred item quality = mkUtt (mkCl item quality) ;
This kind = mkNP this_Det kind ;
That kind = mkNP that_Det kind ;
These kind = mkNP these_Det kind ;
Those kind = mkNP those_Det kind ;
Mod quality kind = mkCN quality kind ;
Very quality = mkAP very_AdA quality ;
Wine = mkCN wine_N ;
Pizza = mkCN pizza_N ;
Cheese = mkCN cheese_N ;
Fish = mkCN fish_N ;
Fresh = mkAP fresh_A ;
Warm = mkAP warm_A ;
Italian = mkAP italian_A ;
Expensive = mkAP expensive_A ;
Delicious = mkAP delicious_A ;
Boring = mkAP boring_A ;
}


@@ -0,0 +1,83 @@
-- (c) 2009 Martha Dis Brandt under LGPL
concrete FoodsIce of Foods = open Prelude in {
flags coding=utf8;
lincat
Comment = SS ;
Quality = {s : Gender => Number => Defin => Str} ;
Kind = {s : Number => Str ; g : Gender} ;
Item = {s : Str ; g : Gender ; n : Number} ;
lin
Pred item quality = ss (item.s ++ copula item.n ++ quality.s ! item.g ! item.n ! Ind) ;
This, That = det Sg "þessi" "þessi" "þetta" ;
These, Those = det Pl "þessir" "þessar" "þessi" ;
Mod quality kind = { s = \\n => quality.s ! kind.g ! n ! Def ++ kind.s ! n ; g = kind.g } ;
Wine = noun "vín" "vín" Neutr ;
Cheese = noun "ostur" "ostar" Masc ;
Fish = noun "fiskur" "fiskar" Masc ;
-- the word "pizza" is more commonly used in Iceland, but "flatbaka" is the Icelandic word for it
Pizza = noun "flatbaka" "flatbökur" Fem ;
Very qual = {s = \\g,n,defOrInd => "mjög" ++ qual.s ! g ! n ! defOrInd } ;
Fresh = regAdj "ferskur" ;
Warm = regAdj "heitur" ;
Boring = regAdj "leiðinlegur" ;
-- the order of the given adj forms is: mSg fSg nSg mPl fPl nPl mSgDef f/nSgDef _PlDef
Italian = adjective "ítalskur" "ítölsk" "ítalskt" "ítalskir" "ítalskar" "ítölsk" "ítalski" "ítalska" "ítalsku" ;
Expensive = adjective "dýr" "dýr" "dýrt" "dýrir" "dýrar" "dýr" "dýri" "dýra" "dýru" ;
Delicious = adjective "ljúffengur" "ljúffeng" "ljúffengt" "ljúffengir" "ljúffengar" "ljúffeng" "ljúffengi" "ljúffenga" "ljúffengu" ;
param
Number = Sg | Pl ;
Gender = Masc | Fem | Neutr ;
Defin = Ind | Def ;
oper
det : Number -> Str -> Str -> Str -> {s : Number => Str ; g : Gender} ->
{s : Str ; g : Gender ; n : Number} =
\n,masc,fem,neutr,cn -> {
s = case cn.g of {Masc => masc ; Fem => fem; Neutr => neutr } ++ cn.s ! n ;
g = cn.g ;
n = n
} ;
noun : Str -> Str -> Gender -> {s : Number => Str ; g : Gender} =
\man,men,g -> {
s = table {
Sg => man ;
Pl => men
} ;
g = g
} ;
adjective : (x1,_,_,_,_,_,_,_,x9 : Str) -> {s : Gender => Number => Defin => Str} =
\ferskur,fersk,ferskt,ferskir,ferskar,fersk_pl,ferski,ferska,fersku -> {
s = \\g,n,t => case <g,n,t> of {
< Masc, Sg, Ind > => ferskur ;
< Masc, Pl, Ind > => ferskir ;
< Fem, Sg, Ind > => fersk ;
< Fem, Pl, Ind > => ferskar ;
< Neutr, Sg, Ind > => ferskt ;
< Neutr, Pl, Ind > => fersk_pl;
< Masc, Sg, Def > => ferski ;
< Fem, Sg, Def > | < Neutr, Sg, Def > => ferska ;
< _ , Pl, Def > => fersku
}
} ;
regAdj : Str -> {s : Gender => Number => Defin => Str} = \ferskur ->
let fersk = Predef.tk 2 ferskur
in adjective
ferskur fersk (fersk + "t")
(fersk + "ir") (fersk + "ar") fersk
(fersk + "i") (fersk + "a") (fersk + "u") ;
copula : Number -> Str =
\n -> case n of {
Sg => "er" ;
Pl => "eru"
} ;
}


@@ -0,0 +1,7 @@
-- (c) 2009 Aarne Ranta under LGPL
concrete FoodsIta of Foods = FoodsI with
(Syntax = SyntaxIta),
(LexFoods = LexFoodsIta) ;


@@ -0,0 +1,71 @@
-- (c) 2009 Zofia Stankiewicz under LGPL
concrete FoodsJpn of Foods = open Prelude in {
flags coding=utf8 ;
lincat
Comment = {s: Style => Str};
Quality = {s: AdjUse => Str ; t: AdjType} ;
Kind = {s : Number => Str} ;
Item = {s : Str ; n : Number} ;
lin
Pred item quality = {s = case quality.t of {
IAdj => table {Plain => item.s ++ quality.s ! APred ; Polite => item.s ++ quality.s ! APred ++ copula ! Polite ! item.n } ;
NaAdj => \\p => item.s ++ quality.s ! APred ++ copula ! p ! item.n }
} ;
This = det Sg "この" ;
That = det Sg "その" ;
These = det Pl "この" ;
Those = det Pl "その" ;
Mod quality kind = {s = \\n => quality.s ! Attr ++ kind.s ! n} ;
Wine = regNoun "ワインは" ;
Cheese = regNoun "チーズは" ;
Fish = regNoun "魚は" ;
Pizza = regNoun "ピザは" ;
Very quality = {s = \\a => "とても" ++ quality.s ! a ; t = quality.t } ;
Fresh = adj "新鮮な" "新鮮";
Warm = regAdj "あたたかい" ;
Italian = adj "イタリアの" "イタリアのもの";
Expensive = regAdj "たかい" ;
Delicious = regAdj "おいしい" ;
Boring = regAdj "つまらない" ;
param
Number = Sg | Pl ;
AdjUse = Attr | APred ; -- na-adjectives have different forms as noun attributes and predicates
Style = Plain | Polite ; -- for phrase types
AdjType = IAdj | NaAdj ; -- IAdj can form predicates without the copula, NaAdj cannot
oper
det : Number -> Str -> {s : Number => Str} -> {s : Str ; n : Number} =
\n,d,cn -> {
s = d ++ cn.s ! n ;
n = n
} ;
noun : Str -> Str -> {s : Number => Str} =
\sakana,sakanatachi -> {s = table {Sg => sakana ; Pl => sakanatachi}} ;
regNoun : Str -> {s : Number => Str} =
\sakana -> noun sakana sakana ;
adj : Str -> Str -> {s : AdjUse => Str ; t : AdjType} =
\chosenna, chosen -> {
s = table {
Attr => chosenna ;
APred => chosen
} ;
t = NaAdj
} ;
regAdj : Str -> {s: AdjUse => Str ; t : AdjType} =\akai -> {
s = \\_ => akai ; t = IAdj} ;
copula : Style => Number => Str =
table {
Plain => \\_ => "だ" ;
Polite => \\_ => "です" } ;
}


@@ -0,0 +1,76 @@
--# -path=.:present
-- (c) 2009 Aarne Ranta under LGPL
concrete FoodsLat of Foods = LexFoodsLat **
{
lincat
Comment = { s : Str } ;
Item = { number : Number ; gender : Gender; noun : Str; adj : Str; det : Str };
lin
Mod quality kind =
variants {
{
gender = kind.gender ;
noun = table { number => kind.noun ! number ++ quality.s ! number ! kind.gender } ;
adj = kind.adj
} ;
{
gender = kind.gender ;
noun = kind.noun ;
adj = table { number => kind.adj ! number ++ quality.s ! number ! kind.gender }
} ;
{
gender = kind.gender ;
noun = table { number => quality.s ! number ! kind.gender ++ kind.noun ! number } ;
adj = kind.adj
} ;
{
gender = kind.gender ;
noun = kind.noun ;
adj = table { number => quality.s ! number ! kind.gender ++ kind.adj ! number }
}
};
Pred item quality =
let aux : Number => Str =
table { Sg => "est" ; Pl => "sunt" } ;
in
{
s = variants {
item.det ++ item.noun ++ item.adj ++ aux ! item.number ++ quality.s ! item.number ! item.gender ;
item.det ++ item.adj ++ item.noun ++ aux ! item.number ++ quality.s ! item.number ! item.gender ;
item.det ++ item.noun ++ item.adj ++ quality.s ! item.number ! item.gender ++ aux ! item.number ;
item.det ++ item.adj ++ item.noun ++ quality.s ! item.number ! item.gender ++ aux ! item.number
};
};
This kind = {
number = Sg ;
gender = kind.gender ;
noun = kind.noun ! Sg ;
adj = kind.adj ! Sg ;
det = table { Male => "hic" ; Female => "haec" ; Neuter => "hoc" } ! kind.gender
} ;
These kind = {
number = Pl ;
gender = kind.gender ;
noun = kind.noun ! Pl ;
adj = kind.adj ! Pl ;
det = table { Male => "hi" ; Female => "hae" ; Neuter => "haec" } ! kind.gender
} ;
That kind = {
number = Sg ;
gender = kind.gender ;
noun = kind.noun ! Sg ;
adj = kind.adj ! Sg ;
det = table { Male => "is" ; Female => "ea" ; Neuter => "id" } ! kind.gender
} ;
Those kind = {
number = Pl ;
gender = kind.gender ;
noun = kind.noun ! Pl ;
adj = kind.adj ! Pl ;
det = table { Male => variants { "ei" ; "ii" } ; Female => "eae" ; Neuter => "ea" } ! kind.gender
} ;
Very quality = { s = \\n,g => "valde" ++ quality.s ! n ! g };
}


@@ -0,0 +1,90 @@
-- (c) 2009 Inese Bernsone under LGPL
concrete FoodsLav of Foods = open Prelude in {
flags
coding=utf8 ;
lincat
Comment = SS ;
Quality = {s : Q => Gender => Number => Defin => Str } ;
Kind = {s : Number => Str ; g : Gender} ;
Item = {s : Str ; g : Gender ; n : Number } ;
lin
Pred item quality = ss (item.s ++ {- copula item.n -} "ir" ++ quality.s ! Q1 ! item.g ! item.n ! Ind ) ;
This = det Sg "šis" "šī" ;
That = det Sg "tas" "tā" ;
These = det Pl "šie" "šīs" ;
Those = det Pl "tie" "tās" ;
Mod quality kind = {s = \\n => quality.s ! Q1 ! kind.g ! n ! Def ++ kind.s ! n ; g = kind.g } ;
Wine = noun "vīns" "vīni" Masc ;
Cheese = noun "siers" "sieri" Masc ;
Fish = noun "zivs" "zivis" Fem ;
Pizza = noun "pica" "picas" Fem ;
Very qual = {s = \\q,g,n,spec => "ļoti" ++ qual.s ! Q2 ! g ! n ! spec };
Fresh = adjective "svaigs" "svaiga" "svaigi" "svaigas" "svaigais" "svaigā" "svaigie" "svaigās" ;
Warm = regAdj "silts" ;
Italian = specAdj "itāļu" (regAdj "itālisks") ;
Expensive = regAdj "dārgs" ;
Delicious = regAdj "garšīgs" ;
Boring = regAdj "garlaicīgs" ;
param
Number = Sg | Pl ;
Gender = Masc | Fem ;
Defin = Ind | Def ;
Q = Q1 | Q2 ;
oper
det : Number -> Str -> Str -> {s : Number => Str ; g : Gender} ->
{s : Str ; g : Gender ; n : Number} =
\n,m,f,cn -> {
s = case cn.g of {Masc => m ; Fem => f} ++ cn.s ! n ;
g = cn.g ;
n = n
} ;
noun : Str -> Str -> Gender -> {s : Number => Str ; g : Gender} =
\man,men,g -> {
s = table {
Sg => man ;
Pl => men
} ;
g = g
} ;
adjective : (_,_,_,_,_,_,_,_ : Str) -> {s : Q => Gender => Number => Defin => Str} =
\skaists,skaista,skaisti,skaistas,skaistais,skaistaa,skaistie,skaistaas -> {
s = table {
_ => table {
Masc => table {
Sg => table {Ind => skaists ; Def => skaistais} ;
Pl => table {Ind => skaisti ; Def => skaistie}
} ;
Fem => table {
Sg => table {Ind => skaista ; Def => skaistaa} ;
Pl => table {Ind => skaistas ; Def => skaistaas}
}
}
}
} ;
{- irregAdj : Str -> {s : Gender => Number => Defin => Str} = \itaalju ->
let itaalju = itaalju
in adjective itaalju (itaalju) (itaalju) (itaalju) (itaalju) (itaalju) (itaalju) (itaalju) ; -}
regAdj : Str -> {s : Q => Gender => Number => Defin => Str} = \skaists ->
let skaist = init skaists
in adjective skaists (skaist + "a") (skaist + "i") (skaist + "as") (skaist + "ais") (skaist + "ā") (skaist + "ie") (skaist + "ās");
Adjective : Type = {s : Q => Gender => Number => Defin => Str} ;
specAdj : Str -> Adjective -> Adjective = \s,a -> {
s = table {
Q2 => a.s ! Q1 ;
Q1 => \\_,_,_ => s
}
} ;
}


@@ -0,0 +1,120 @@
-- (c) 2009 Krasimir Angelov under LGPL
concrete FoodsMkd of Foods = {
flags coding = utf8 ;
lincat
Comment = Str;
Quality = {s : Agr => Str};
Item = {s : Str; a : Agr};
Kind = {s : Number => Str; g : Gender};
lin
Pred item qual =
item.s ++
case item.a of {
ASg _ => "е";
APl => "се"
} ++
qual.s ! item.a;
This kind = {
s = case kind.g of {
Masc => "овоj";
Fem => "оваа";
Neutr => "ова"
} ++
kind.s ! Sg;
a = ASg kind.g};
That kind = {
s = case kind.g of {
Masc => "оноj";
Fem => "онаа";
Neutr => "она"
} ++
kind.s ! Sg;
a = ASg kind.g};
These kind = {s = "овие" ++ kind.s ! Pl; a = APl};
Those kind = {s = "оние" ++ kind.s ! Pl; a = APl};
Mod qual kind = {
s = \\n => qual.s ! case n of {
Sg => ASg kind.g;
Pl => APl
} ++
kind.s ! n;
g = kind.g};
Wine = {
s = table {
Sg => "вино";
Pl => "вина"
};
g = Neutr};
Cheese = {
s = table {
Sg => "сирење";
Pl => "сирењa"
};
g = Neutr};
Fish = {
s = table {
Sg => "риба";
Pl => "риби"
};
g = Fem};
Pizza = {
s = table {
Sg => "пица";
Pl => "пици"
};
g = Fem
};
Very qual = {s = \\g => "многу" ++ qual.s ! g};
Fresh = {
s = table {
ASg Masc => "свеж";
ASg Fem => "свежа";
ASg Neutr => "свежо";
APl => "свежи"}
};
Warm = {
s = table {
ASg Masc => "топол";
ASg Fem => "топла";
ASg Neutr => "топло";
APl => "топли"}
};
Italian = {
s = table {
ASg Masc => "италијански";
ASg Fem => "италијанска";
ASg Neutr => "италијанско";
APl => "италијански"}
};
Expensive = {
s = table {
ASg Masc => "скап";
ASg Fem => "скапа";
ASg Neutr => "скапо";
APl => "скапи"}
};
Delicious = {
s = table {
ASg Masc => "вкусен";
ASg Fem => "вкусна";
ASg Neutr => "вкусно";
APl => "вкусни"}
};
Boring = {
s = table {
ASg Masc => "досаден";
ASg Fem => "досадна";
ASg Neutr => "досадно";
APl => "досадни"}
};
param
Gender = Masc | Fem | Neutr;
Number = Sg | Pl;
Agr = ASg Gender | APl;
}


@@ -0,0 +1,105 @@
-- (c) 2013 John J. Camilleri under LGPL
concrete FoodsMlt of Foods = open Prelude in {
flags coding=utf8 ;
lincat
Comment = SS ;
Quality = {s : Gender => Number => Str} ;
Kind = {s : Number => Str ; g : Gender} ;
Item = {s : Str ; g : Gender ; n : Number} ;
lin
-- Pred item quality = ss (item.s ++ copula item.n item.g ++ quality.s ! item.g ! item.n) ;
Pred item quality = ss (item.s ++ quality.s ! item.g ! item.n) ;
This kind = det Sg "dan" "din" kind ;
That kind = det Sg "dak" "dik" kind ;
These kind = det Pl "dawn" "" kind ;
Those kind = det Pl "dawk" "" kind ;
Mod quality kind = {
s = \\n => kind.s ! n ++ quality.s ! kind.g ! n ;
g = kind.g
} ;
Wine = noun "inbid" "inbejjed" Masc ;
Cheese = noun "ġobon" "ġobniet" Masc ;
Fish = noun "ħuta" "ħut" Fem ;
Pizza = noun "pizza" "pizzez" Fem ;
Very qual = {s = \\g,n => qual.s ! g ! n ++ "ħafna"} ;
Warm = adjective "sħun" "sħuna" "sħan" ;
Expensive = adjective "għali" "għalja" "għaljin" ;
Delicious = adjective "tajjeb" "tajba" "tajbin" ;
Boring = uniAdj "tad-dwejjaq" ;
Fresh = regAdj "frisk" ;
Italian = regAdj "Taljan" ;
param
Number = Sg | Pl ;
Gender = Masc | Fem ;
oper
--Create an adjective (full function)
--Params: Sing Masc, Sing Fem, Plural
adjective : (_,_,_ : Str) -> {s : Gender => Number => Str} = \iswed,sewda,suwed -> {
s = table {
Masc => table {
Sg => iswed ;
Pl => suwed
} ;
Fem => table {
Sg => sewda ;
Pl => suwed
}
}
} ;
--Create a regular adjective
--Param: Sing Masc
regAdj : Str -> {s : Gender => Number => Str} = \frisk ->
adjective frisk (frisk + "a") (frisk + "i") ;
--Create a "uni-adjective", e.g. tal-buzz
--Param: Sing Masc
uniAdj : Str -> {s : Gender => Number => Str} = \uni ->
adjective uni uni uni ;
--Create a noun
--Params: Singular, Plural, Gender (inherent)
noun : Str -> Str -> Gender -> {s : Number => Str ; g : Gender} = \ktieb,kotba,g -> {
s = table {
Sg => ktieb ;
Pl => kotba
} ;
g = g
} ;
--Copula is a linking verb
--Params: Number, Gender
-- copula : Number -> Gender -> Str = \n,g -> case n of {
-- Sg => case g of { Masc => "huwa" ; Fem => "hija" } ;
-- Pl => "huma"
-- } ;
--Create an article, taking into account first letter of next word
article = pre {
"a"|"e"|"i"|"o"|"u" => "l-" ;
--cons@("ċ"|"d"|"n"|"r"|"s"|"t"|"x"|"ż") => "i" + cons + "-" ;
_ => "il-"
} ;
--Create a determiner
--Params: Sg/Pl, Masc, Fem
det : Number -> Str -> Str -> {s : Number => Str ; g : Gender} -> {s : Str ; g : Gender ; n : Number} = \n,m,f,cn -> {
s = case n of {
Sg => case cn.g of {Masc => m ; Fem => f}; --string
Pl => m --default to masc
} ++ article ++ cn.s ! n ;
g = cn.g ; --gender
n = n --number
} ;
}

Some files were not shown because too many files have changed in this diff