forked from GitHub/gf-core

Compare commits


92 Commits

Author SHA1 Message Date
John J. Camilleri
3e2673de3b Use modify instead of insert: results mildly better but not significantly 2021-03-16 16:45:57 +01:00
John J. Camilleri
6c6a201d96 Introduce state with Map for caching compilation, but results are worse 2021-03-12 13:39:56 +01:00
John J. Camilleri
8f5033e4ce Add notes on profiling 2021-03-09 08:36:35 +01:00
John J. Camilleri
126b61ea03 Merge branch 'master' into lpgf 2021-03-08 13:52:34 +01:00
John J. Camilleri
99abb9b2a5 Add Phrasebook benchmark snippet to LPGF README 2021-03-08 13:37:02 +01:00
John J. Camilleri
3e9d12854a Switch to 10000-tree Phrasebook treebank. All errors to do with missing functions, plus variants in German. 2021-03-08 11:19:06 +01:00
John J. Camilleri
fd07946a50 Remove commented line 2021-03-08 10:42:16 +01:00
John J. Camilleri
c76efcf916 Use C runtime in mkTreebank script 2021-03-08 10:17:03 +01:00
John J. Camilleri
785d6069e2 Fix lin2string and pass all unittests and Phrasebook 2021-03-08 09:53:10 +01:00
John J. Camilleri
0f4b349b0b Remove old commented code 2021-03-05 16:51:59 +01:00
John J. Camilleri
dbf369aae5 Make removal of record fields recursive. Latest results with Phrasebook:
Bul ✓
Cat ✗
Chi ✓
Dan ✓
Dut ✓
Eng ✓
Est ✓
Fin ✓
Fre ✗
Ger ✓
Hin ✓
Ita ✗
Jpn ✓
Lav ✓
Nor ✓
Pol ✓
Ron ✓
Snd ✗
Spa ✓
Swe ✓
Tha ✓
Urd ✓

Passed 18 | Failed 4 | Total 22
2021-03-05 16:48:05 +01:00
John J. Camilleri
0d4659fe8c Add workaround for missing param defs. Add links to gf-core issues in workaround comments. 2021-03-05 13:23:00 +01:00
John J. Camilleri
575a746a3e Add LPGF function for catching errors. Manual fixes to Phrasebook treebank. 2021-03-05 12:05:25 +01:00
John J. Camilleri
70581c2d8c Improve base case in table handling, cleanup. Add run-phrasebook script, current output:
Bul ✗
Cat ✗
Chi ✓
Dan ✓
Dut ✓
Eng ✓
Est ✓
Fin ✗
Fre ✗
Ger ✓
Hin ✓
Ita ✗
Jpn ✓
Lav ✓
Nor ✓
Pol ✓
Ron ✓
Snd ✗
Spa ✗
Swe ✓
Tha ✓
Urd ✓

Passed 15 | Failed 7 | Total 22
2021-03-04 17:09:35 +01:00
John J. Camilleri
bca1e2286d New handling of tables, works for all tests but Phrasebook still fails 2021-03-04 16:42:56 +01:00
John J. Camilleri
94f76b9e36 Add more tests to Params5 which cause it to fail again
Originally found in PhrasebookFre
2021-03-04 13:38:55 +01:00
John J. Camilleri
f5886bf447 Add more complex param/table unit tests and pass them. Still fails on Phrasebook though. 2021-03-04 12:37:12 +01:00
John J. Camilleri
0ba0438dc7 Add a little colour to benchmark output 2021-03-04 10:20:57 +01:00
John J. Camilleri
30b016032d Also store Pre prefixes in token map. Introduce IntMapBuilder data structure.
Storing of prefixes uses show/read, which isn't a great solution but avoids having yet another token map.
2021-03-04 09:58:17 +01:00
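The token-interning scheme described in the two commits above can be sketched roughly as follows. The names and representation here are illustrative guesses based on the commit messages, not the actual LPGF definitions:

```haskell
import qualified Data.Map.Strict as Map
import qualified Data.IntMap.Strict as IntMap

-- Replace each distinct token string with an Int index. Linearisation
-- functions then store compact indices, and the grammar carries a single
-- shared index-to-string table, reducing the size of the LPGF file.
data IntMapBuilder = IntMapBuilder
  { nextIx :: !Int
  , str2ix :: !(Map.Map String Int)       -- for lookup during compilation
  , ix2str :: !(IntMap.IntMap String)     -- shipped with the grammar
  }

emptyBuilder :: IntMapBuilder
emptyBuilder = IntMapBuilder 0 Map.empty IntMap.empty

-- Return the index for a token, adding it to the table if unseen.
intern :: String -> IntMapBuilder -> (Int, IntMapBuilder)
intern s b@(IntMapBuilder n s2i i2s) =
  case Map.lookup s s2i of
    Just i  -> (i, b)   -- token already interned: reuse its index
    Nothing -> ( n
               , IntMapBuilder (n + 1)
                               (Map.insert s n s2i)
                               (IntMap.insert n s i2s) )
```

As the commit notes, serialising `Pre` prefixes via `show`/`read` into the same map is a workaround to avoid maintaining a second token map.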
John J. Camilleri
4082c006c3 Extract token strings and put them in map which linfuns refer to by index, to reduce LPGF sizes. 2021-03-04 00:16:12 +01:00
John J. Camilleri
adc162b374 Pass all unit tests and Foods again, with new strategy. Cleanup. 2021-03-03 15:21:32 +01:00
John J. Camilleri
3beed2c49e Replace list comprehension lookups with maps. Halfway through transitioning to new strategy for tables/params, see testsuite/lpgf/README.md. 2021-03-03 13:26:03 +01:00
John J. Camilleri
a8e3dc8855 Improve mkTreebank script. Add 100-tree Phrasebook treebank. Improve output in testsuite. 2021-03-03 11:01:31 +01:00
John J. Camilleri
997d7c1694 Use ErrorMonad instead of IOE
It probably ends up being the same thing, but the code is a little cleaner for it.
2021-03-03 09:36:48 +01:00
John J. Camilleri
4c09e4a340 Remove LF prefix from constructors. Pass all unit tests and Foods again, but improvements/cleanup still necessary. 2021-03-03 09:19:52 +01:00
John J. Camilleri
33e0e98aec Add 1-tree treebank for Phrasebook in a few languages 2021-02-28 00:34:46 +01:00
John J. Camilleri
83bc3c9c6e More work on params: pass all tests except params1 (!) 2021-02-27 23:13:02 +01:00
John J. Camilleri
f42b5ec9ef More work on params, but Foods fails now 2021-02-26 20:25:05 +01:00
John J. Camilleri
4771d9c356 WIP params 2021-02-26 17:18:21 +01:00
John J. Camilleri
9785f8351d Reduce Params2 further 2021-02-26 11:52:12 +01:00
John J. Camilleri
6a5d735904 Reduce Params2 unittest (still fails) 2021-02-26 10:26:11 +01:00
John J. Camilleri
8324ad8801 Add pretty-printing of LPGF grammars, to help debugging 2021-02-26 10:13:33 +01:00
John J. Camilleri
20290be616 Add Params2 unit test, from problem uncovered in PhrasebookGer 2021-02-22 10:52:37 +01:00
John J. Camilleri
b4a393ac09 Pass missing unit test 2021-02-21 14:22:46 +01:00
John J. Camilleri
9942908df9 Add unit test for missing lins 2021-02-21 14:05:31 +01:00
John J. Camilleri
dca2ebaf72 Add Phrasebook to testsuite. Move grammars into subfolders. Add run-bench script. 2021-02-20 13:22:29 +01:00
John J. Camilleri
5ad5789b31 Filter out record fields which don't exist in lintype
This is to work around an inconsistency in the canonical representation
2021-02-19 15:19:40 +01:00
John J. Camilleri
9f3f4139b1 Grammar and languages to run in testsuite can be specified by command line options, see README 2021-02-19 11:14:55 +01:00
John J. Camilleri
505c12c528 Rename run.hs to test.hs 2021-02-19 09:33:35 +01:00
John J. Camilleri
023b50557e Write LPGF dump to file when DEBUG is set, rather than console 2021-02-19 09:31:26 +01:00
John J. Camilleri
2b0493eece Tweak memory reporting and strictness in benchmark 2021-02-19 09:18:01 +01:00
John J. Camilleri
51e543878b Add support for wildcards when specifying modules names in benchmark compilation 2021-02-18 21:34:23 +01:00
John J. Camilleri
625386a14f Force evaluation in benchmark linearisation
BangPatterns only does WHNF which is not sufficient, previous benchmark results are thus wrong
2021-02-18 21:01:30 +01:00
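The pitfall this commit fixes is a common Haskell benchmarking mistake: a bang pattern (or `seq`) evaluates only to weak head normal form, so the timed section never does the real work. A minimal sketch of the difference, assuming the `deepseq` package (this is not the actual benchmark code):

```haskell
{-# LANGUAGE BangPatterns #-}
import Control.DeepSeq (force)
import Control.Exception (evaluate)

main :: IO ()
main = do
  let lins = map show [1 .. 100000 :: Int]  -- stand-in for linearisation results
  -- WHNF: the bang forces only the outermost (:) constructor of the list;
  -- the spine and the strings themselves stay unevaluated, so a timer
  -- around this line measures almost nothing.
  let !whnfOnly = lins
  -- force (Control.DeepSeq) evaluates the entire structure, so timing this
  -- really includes all of the linearisation work.
  fullyEvaluated <- evaluate (force lins)
  print (length fullyEvaluated)
```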
John J. Camilleri
5240749fad Make grammar and trees files command line arguments into benchmark script 2021-02-18 15:27:25 +01:00
John J. Camilleri
e6079523f1 Remove ParamAliasDefs by inlining their definitions 2021-02-18 14:45:10 +01:00
John J. Camilleri
866a2101e1 When projecting a non-existent field, return Prelude.False
This seems to be GF's own behaviour, as exhibited by the canonical version of PhrasebookTha:

    NNumeral Numeral_0 = {s = Numeral_0.s; hasC = <>.hasC};
2021-02-18 14:42:39 +01:00
John J. Camilleri
d8557e8433 Enable debug output to files with envvar DEBUG=1 2021-02-18 14:40:03 +01:00
John J. Camilleri
7a5bc2dab3 Separate compile/run in benchmark 2021-02-17 16:57:06 +01:00
John J. Camilleri
9a263450f5 Add PFG2 linearisation to benchmark 2021-02-17 15:30:11 +01:00
John J. Camilleri
8e1fa4981f Add memory stats to benchmark 2021-02-17 15:02:39 +01:00
John J. Camilleri
b4fce5db59 Use envvars in benchmark for controlling PGF/LPGF. Add readme. 2021-02-17 11:44:00 +01:00
John J. Camilleri
6a7ead0f84 Add benchmark for comparing PGF and LPGF 2021-02-17 10:04:36 +01:00
John J. Camilleri
d3988f93d5 writePGF et al. functions return path[s] of written files 2021-02-17 10:03:52 +01:00
John J. Camilleri
236dbdbba3 Minor tidying 2021-02-17 00:15:44 +01:00
John J. Camilleri
768c3d9b2d Include return types for params, records, pre 2021-02-17 00:04:37 +01:00
John J. Camilleri
29114ce606 Improve binary format, reducing Foods.lpgf from 300 to 73KB (4x smaller!) 2021-02-16 23:30:21 +01:00
John J. Camilleri
5be21dba1c Add and pass FoodsJpn 2021-02-16 22:49:37 +01:00
John J. Camilleri
d5cf00f711 Add and pass all Foods languages, except Jpn 2021-02-16 22:41:28 +01:00
John J. Camilleri
312cfeb69d Add Afr, Amh, Cat, Cze, Dut, Ger foods grammars to testsuite 2021-02-16 22:33:26 +01:00
John J. Camilleri
2d03b9ee0c Finish type passing in val2lin, generalise projection case and pass FoodsFre testsuite. 2021-02-16 21:07:24 +01:00
John J. Camilleri
4c06c3f825 Add case for when pre is not followed by anything 2021-02-16 21:01:01 +01:00
John J. Camilleri
7227ede24b WIP return type from val2lin for use in projection case 2021-02-16 17:18:01 +01:00
John J. Camilleri
398b294734 Use Data.Text instead of String. Rename Abstr to Abstract, Concr to Concrete. 2021-02-16 16:04:40 +01:00
John J. Camilleri
d394cacddf Add support for CAPIT and ALL_CAPIT 2021-02-16 15:17:54 +01:00
John J. Camilleri
21f14c2aa1 Add support for SOFT_SPACE 2021-02-16 14:57:33 +01:00
John J. Camilleri
23e49cddb7 Add support for SOFT_BIND (which PGF runtime doesn't support) 2021-02-16 14:51:29 +01:00
John J. Camilleri
4d1217b06d Add support for pre 2021-02-15 21:57:05 +01:00
John J. Camilleri
4f0abe5540 Add FoodsFre, fails because pre is not implemented
Also an unhandled Projection case
2021-02-15 01:14:34 +01:00
John J. Camilleri
109822675b Pass test with FoodsFin, by forcibly resorting record fields to make s first 2021-02-15 00:43:53 +01:00
John J. Camilleri
d563abb928 Minors 2021-02-13 00:59:15 +01:00
John J. Camilleri
a58a6c8a59 Add FoodsFin to testsuite (fails) 2021-02-13 00:16:03 +01:00
John J. Camilleri
98f6136ebd Add support for BIND 2021-02-13 00:14:35 +01:00
John J. Camilleri
8cfaa69b6e Handle record tables, pass FoodSwe in testsuite 2021-02-12 23:51:16 +01:00
John J. Camilleri
a12f58e7b0 Add test case for selection using records (fails) 2021-02-10 13:55:38 +01:00
John J. Camilleri
d5f68970b9 Add FoodsSwe (fails) 2021-02-09 10:54:51 +01:00
John J. Camilleri
9c2d8eb0b2 Add FoodsChi, FoodsHeb to LPGF testsuite 2021-02-09 10:14:40 +01:00
John J. Camilleri
34f0fc0ba7 Fix bug in dynamic parameter handling, compile FoodsBul successfully 2021-02-03 15:41:27 +01:00
John J. Camilleri
42b9e7036e Support dynamic param values 2021-02-03 13:16:10 +01:00
John J. Camilleri
132f693713 Minor cleanup 2021-02-03 09:44:15 +01:00
John J. Camilleri
153bffdad7 Support nested parameters, but fails with non-static values (see FoodsBull, ASg kind.g). 2021-02-03 00:11:22 +01:00
John J. Camilleri
d09838e97e Separate .trees and .treebank, and add a script for making the latter from the former 2021-02-02 21:46:38 +01:00
John J. Camilleri
c94bffe435 Generalise testsuite script to use treebank files, add FoodEng 2021-02-02 21:22:36 +01:00
John J. Camilleri
2a5850023b Correctly handle projection, but only in limited cases 2021-02-01 13:08:39 +01:00
John J. Camilleri
fe15aa0c00 Use canonical GF in LPGF compiler
Still contains some hardcoded values, missing cases.

I notice now that LPGF and Canonical GF are almost identical, so maybe we don't need a new LPGF format,
just a linearization-only runtime which works on canonical grammars.
The argument for keeping LGPF is that it would be optimized for size and speed.
2021-02-01 12:28:06 +01:00
John J. Camilleri
cead0cc4c1 Add selection and projection cases but not working 2021-01-26 09:55:07 +01:00
John J. Camilleri
6f622b496b Rename Zero grammar to Walking 2021-01-26 09:35:21 +01:00
John J. Camilleri
270e7f021f Add binary instances 2021-01-25 14:42:00 +01:00
John J. Camilleri
32b0860925 Make LPGF testsuite work (but still fails)
stack test :lpgf
2021-01-25 13:41:33 +01:00
John J. Camilleri
f24c50339b Strip down format. More early work on compiler. Add testsuite (doesn't work yet). 2021-01-25 12:10:30 +01:00
John J. Camilleri
cd5881d83a Early work on LPGF compiler 2021-01-22 15:17:36 +01:00
John J. Camilleri
93b81b9f13 Add first version of LPGF datatype, with linearization function and some hardcoded examples 2021-01-22 14:07:41 +01:00
John J. Camilleri
8ad9cf1e09 Add flag and stubs for compiling to LPGF format 2021-01-19 17:21:13 +01:00
404 changed files with 316167 additions and 5643 deletions


@@ -12,34 +12,28 @@ jobs:
     name: ${{ matrix.os }} / ghc ${{ matrix.ghc }}
     runs-on: ${{ matrix.os }}
     strategy:
-      fail-fast: false
       matrix:
         os: [ubuntu-latest, macos-latest, windows-latest]
-        cabal: ["latest"]
+        cabal: ["3.2"]
         ghc:
           - "8.6.5"
           - "8.8.3"
-          - "8.10.7"
-          - "9.6.7"
+          - "8.10.1"
         exclude:
           - os: macos-latest
             ghc: 8.8.3
           - os: macos-latest
             ghc: 8.6.5
-          - os: macos-latest
-            ghc: 8.10.7
           - os: windows-latest
             ghc: 8.8.3
           - os: windows-latest
             ghc: 8.6.5
-          - os: windows-latest
-            ghc: 8.10.7
     steps:
       - uses: actions/checkout@v2
         if: github.event.action == 'opened' || github.event.action == 'synchronize' || github.event.ref == 'refs/heads/master'
-      - uses: haskell-actions/setup@v2
+      - uses: actions/setup-haskell@v1.1.4
         id: setup-haskell-cabal
         name: Setup Haskell
         with:
@@ -50,7 +44,7 @@ jobs:
         run: |
           cabal freeze
-      - uses: actions/cache@v4
+      - uses: actions/cache@v1
         name: Cache ~/.cabal/store
         with:
           path: ${{ steps.setup-haskell-cabal.outputs.cabal-store }}
@@ -68,41 +62,34 @@ jobs:
   stack:
     name: stack / ghc ${{ matrix.ghc }}
-    runs-on: ${{ matrix.ghc == '7.10.3' && 'ubuntu-20.04' || 'ubuntu-latest' }}
+    runs-on: ubuntu-latest
     strategy:
-      fail-fast: false
       matrix:
-        stack: ["latest"]
+        stack: ["2.3.3"]
-        ghc: ["8.4.4", "8.6.5", "8.8.4", "8.10.7", "9.0.2", "9.6.7"]
+        ghc: ["7.10.3","8.0.2", "8.2.2", "8.4.4", "8.6.5", "8.8.4"]
+        # ghc: ["8.8.3"]
     steps:
       - uses: actions/checkout@v2
         if: github.event.action == 'opened' || github.event.action == 'synchronize' || github.event.ref == 'refs/heads/master'
-      - uses: haskell-actions/setup@v2
+      - uses: actions/setup-haskell@v1.1.4
         name: Setup Haskell Stack
         with:
-          ghc-version: ${{ matrix.ghc }}
+          # ghc-version: ${{ matrix.ghc }}
-          stack-version: 'latest'
+          stack-version: ${{ matrix.stack }}
-          enable-stack: true
+      - uses: actions/cache@v1
-      # Fix linker errrors on ghc-7.10.3 for ubuntu (see https://github.com/commercialhaskell/stack/blob/255cd830627870cdef34b5e54d670ef07882523e/doc/faq.md#i-get-strange-ld-errors-about-recompiling-with--fpic)
-      - run: sed -i.bak 's/"C compiler link flags", "/&-no-pie /' /home/runner/.ghcup/ghc/7.10.3/lib/ghc-7.10.3/settings
-        if: matrix.ghc == '7.10.3'
-      - uses: actions/cache@v4
         name: Cache ~/.stack
         with:
          path: ~/.stack
-          key: ${{ runner.os }}-${{ matrix.ghc }}-stack--${{ hashFiles(format('stack-ghc{0}', matrix.ghc)) }}
+          key: ${{ runner.os }}-${{ matrix.ghc }}-stack
-          restore-keys: |
-            ${{ runner.os }}-${{ matrix.ghc }}-stack
       - name: Build
         run: |
-          stack build --test --no-run-tests --system-ghc --stack-yaml stack-ghc${{ matrix.ghc }}.yaml
+          stack build --system-ghc --stack-yaml stack-ghc${{ matrix.ghc }}.yaml
-          # stack build --system-ghc --test --bench --no-run-tests --no-run-benchmarks
-      - name: Test
+      # - name: Test
-        run: |
+      #   run: |
-          stack test --system-ghc --stack-yaml stack-ghc${{ matrix.ghc }}.yaml
+      #     stack test --system-ghc


@@ -3,7 +3,6 @@ name: Build Binary Packages
 on:
   workflow_dispatch:
   release:
-    types: ["created"]
 jobs:
@@ -11,13 +10,11 @@ jobs:
   ubuntu:
     name: Build Ubuntu package
-    strategy:
-      matrix:
-        ghc: ["9.6"]
-        cabal: ["3.10"]
-        os: ["ubuntu-24.04"]
-    runs-on: ${{ matrix.os }}
+    runs-on: ubuntu-18.04
+    # strategy:
+    #   matrix:
+    #     ghc: ["8.6.5"]
+    #     cabal: ["2.4"]
     steps:
       - uses: actions/checkout@v2
@@ -25,13 +22,12 @@ jobs:
       # Note: `haskell-platform` is listed as requirement in debian/control,
       # which is why it's installed using apt instead of the Setup Haskell action.
-      - name: Setup Haskell
-        uses: haskell-actions/setup@v2
-        id: setup-haskell-cabal
-        with:
-          ghc-version: ${{ matrix.ghc }}
-          cabal-version: ${{ matrix.cabal }}
-        if: matrix.os == 'ubuntu-24.04'
+      # - name: Setup Haskell
+      #   uses: actions/setup-haskell@v1
+      #   id: setup-haskell-cabal
+      #   with:
+      #     ghc-version: ${{ matrix.ghc }}
+      #     cabal-version: ${{ matrix.cabal }}
       - name: Install build tools
         run: |
@@ -40,15 +36,14 @@ jobs:
           make \
           dpkg-dev \
           debhelper \
+          haskell-platform \
           libghc-json-dev \
+          python-dev \
           default-jdk \
-          python-dev-is-python3 \
           libtool-bin
-        cabal install alex happy
       - name: Build package
         run: |
-          export PYTHONPATH="/home/runner/work/gf-core/gf-core/debian/gf/usr/local/lib/python3.12/dist-packages/"
           make deb
       - name: Copy package
@@ -56,41 +51,27 @@ jobs:
          cp ../gf_*.deb dist/
       - name: Upload artifact
-        uses: actions/upload-artifact@v4
+        uses: actions/upload-artifact@v2
         with:
-          name: gf-${{ github.event.release.tag_name }}-${{ matrix.os }}.deb
+          name: gf-${{ github.sha }}-ubuntu
           path: dist/gf_*.deb
           if-no-files-found: error
-      - name: Rename package for specific ubuntu version
-        run: |
-          mv dist/gf_*.deb dist/gf-${{ github.event.release.tag_name }}-${{ matrix.os }}.deb
-      #- uses: actions/upload-release-asset@v1.0.2
-      #  env:
-      #    GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
-      #  with:
-      #    upload_url: ${{ github.event.release.upload_url }}
-      #    asset_path: dist/gf-${{ github.event.release.tag_name }}-${{ matrix.os }}.deb
-      #    asset_name: gf-${{ github.event.release.tag_name }}-${{ matrix.os }}.deb
-      #    asset_content_type: application/octet-stream
   # ---
   macos:
     name: Build macOS package
+    runs-on: macos-10.15
     strategy:
       matrix:
-        ghc: ["9.6"]
+        ghc: ["8.6.5"]
-        cabal: ["3.10"]
+        cabal: ["2.4"]
-        os: ["macos-latest", "macos-13"]
-    runs-on: ${{ matrix.os }}
     steps:
       - uses: actions/checkout@v2
       - name: Setup Haskell
-        uses: haskell-actions/setup@v2
+        uses: actions/setup-haskell@v1
         id: setup-haskell-cabal
         with:
          ghc-version: ${{ matrix.ghc }}
@@ -99,10 +80,8 @@ jobs:
       - name: Install build tools
         run: |
           brew install \
-            automake \
-            libtool
+            automake
           cabal v1-install alex happy
-          pip install setuptools
       - name: Build package
         run: |
@@ -111,35 +90,21 @@ jobs:
           make pkg
       - name: Upload artifact
-        uses: actions/upload-artifact@v4
+        uses: actions/upload-artifact@v2
         with:
-          name: gf-${{ github.event.release.tag_name }}-${{ matrix.os }}
+          name: gf-${{ github.sha }}-macos
           path: dist/gf-*.pkg
           if-no-files-found: error
-      - name: Rename package
-        run: |
-          mv dist/gf-*.pkg dist/gf-${{ github.event.release.tag_name }}-macos.pkg
-      #- uses: actions/upload-release-asset@v1.0.2
-      #  env:
-      #    GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
-      #  with:
-      #    upload_url: ${{ github.event.release.upload_url }}
-      #    asset_path: dist/gf-${{ github.event.release.tag_name }}-macos.pkg
-      #    asset_name: gf-${{ github.event.release.tag_name }}-macos.pkg
-      #    asset_content_type: application/octet-stream
   # ---
   windows:
     name: Build Windows package
+    runs-on: windows-2019
     strategy:
       matrix:
-        ghc: ["9.6.7"]
+        ghc: ["8.6.5"]
-        cabal: ["3.10"]
+        cabal: ["2.4"]
-        os: ["windows-2022"]
-    runs-on: ${{ matrix.os }}
     steps:
       - uses: actions/checkout@v2
@@ -151,7 +116,6 @@ jobs:
           base-devel
           gcc
           python-devel
-          autotools
       - name: Prepare dist folder
         shell: msys2 {0}
@@ -172,23 +136,17 @@ jobs:
          cp /mingw64/bin/libpgf-0.dll /c/tmp-dist/c
          cp /mingw64/bin/libgu-0.dll /c/tmp-dist/c
-      # JAVA_HOME_8_X64 = C:\hostedtoolcache\windows\Java_Adopt_jdk\8.0.292-10\x64
       - name: Build Java bindings
         shell: msys2 {0}
         run: |
-          echo $JAVA_HOME_8_X64
+          export PATH="${PATH}:/c/Program Files/Java/jdk8u275-b01/bin"
-          export JDKPATH="$(cygpath -u "${JAVA_HOME_8_X64}")"
-          export PATH="${PATH}:${JDKPATH}/bin"
          cd src/runtime/java
          make \
-            JNI_INCLUDES="-I \"${JDKPATH}/include\" -I \"${JDKPATH}/include/win32\" -I \"/mingw64/include\" -D__int64=int64_t" \
+            JNI_INCLUDES="-I \"/c/Program Files/Java/jdk8u275-b01/include\" -I \"/c/Program Files/Java/jdk8u275-b01/include/win32\" -I \"/mingw64/include\" -D__int64=int64_t" \
            WINDOWS_LDFLAGS="-L\"/mingw64/lib\" -no-undefined"
          make install
-          cp .libs/msys-jpgf-0.dll /c/tmp-dist/java/jpgf.dll
+          cp .libs//msys-jpgf-0.dll /c/tmp-dist/java/jpgf.dll
          cp jpgf.jar /c/tmp-dist/java
-        if: false
-      # - uses: actions/setup-python@v5
       - name: Build Python bindings
         shell: msys2 {0}
@@ -197,13 +155,12 @@ jobs:
          EXTRA_LIB_DIRS: /mingw64/lib
        run: |
          cd src/runtime/python
-          pacman --noconfirm -S python-setuptools
          python setup.py build
          python setup.py install
-          cp -r /usr/lib/python3.12/site-packages/pgf* /c/tmp-dist/python
+          cp /usr/lib/python3.8/site-packages/pgf* /c/tmp-dist/python
       - name: Setup Haskell
-        uses: haskell-actions/setup@v2
+        uses: actions/setup-haskell@v1
        id: setup-haskell-cabal
        with:
          ghc-version: ${{ matrix.ghc }}
@@ -215,26 +172,14 @@ jobs:
       - name: Build GF
         run: |
-          cabal install -fserver --only-dependencies
+          cabal install --only-dependencies -fserver
          cabal configure -fserver
          cabal build
-          copy dist-newstyle/build/x86_64-windows/ghc-${{matrix.ghc}}/*/x/gf/build/gf/gf.exe C:/tmp-dist
+          copy dist\build\gf\gf.exe C:\tmp-dist
       - name: Upload artifact
-        uses: actions/upload-artifact@v4
+        uses: actions/upload-artifact@v2
        with:
-          name: gf-${{ github.event.release.tag_name }}-windows
+          name: gf-${{ github.sha }}-windows
          path: C:\tmp-dist\*
          if-no-files-found: error
-      - name: Create archive
-        run: |
-          Compress-Archive C:\tmp-dist C:\gf-${{ github.event.release.tag_name }}-windows.zip
-      #- uses: actions/upload-release-asset@v1.0.2
-      #  env:
-      #    GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
-      #  with:
-      #    upload_url: ${{ github.event.release.upload_url }}
-      #    asset_path: C:\gf-${{ github.event.release.tag_name }}-windows.zip
-      #    asset_name: gf-${{ github.event.release.tag_name }}-windows.zip
-      #    asset_content_type: application/zip


@@ -13,25 +13,24 @@ jobs:
     strategy:
       fail-fast: true
       matrix:
-        os: [ubuntu-latest, macos-latest, macos-13]
+        os: [ubuntu-18.04, macos-10.15]
     steps:
-      - uses: actions/checkout@v4
+      - uses: actions/checkout@v1
-      - uses: actions/setup-python@v5
+      - uses: actions/setup-python@v1
        name: Install Python
        with:
-          python-version: '3.x'
+          python-version: '3.7'
       - name: Install cibuildwheel
         run: |
-          python -m pip install cibuildwheel
+          python -m pip install git+https://github.com/joerick/cibuildwheel.git@master
       - name: Install build tools for OSX
         if: startsWith(matrix.os, 'macos')
         run: |
           brew install automake
-          brew install libtool
       - name: Build wheels on Linux
         if: startsWith(matrix.os, 'macos') != true
@@ -43,32 +42,30 @@ jobs:
       - name: Build wheels on OSX
         if: startsWith(matrix.os, 'macos')
         env:
-          CIBW_BEFORE_BUILD: cd src/runtime/c && glibtoolize && autoreconf -i && ./configure && make && sudo make install
+          CIBW_BEFORE_BUILD: cd src/runtime/c && glibtoolize && autoreconf -i && ./configure && make && make install
         run: |
           python -m cibuildwheel src/runtime/python --output-dir wheelhouse
-      - uses: actions/upload-artifact@v4
+      - uses: actions/upload-artifact@v2
         with:
-          name: wheel-${{ matrix.os }}
           path: ./wheelhouse
   build_sdist:
     name: Build source distribution
     runs-on: ubuntu-latest
     steps:
-      - uses: actions/checkout@v4
+      - uses: actions/checkout@v2
-      - uses: actions/setup-python@v5
+      - uses: actions/setup-python@v2
        name: Install Python
        with:
-          python-version: '3.10'
+          python-version: '3.7'
       - name: Build sdist
         run: cd src/runtime/python && python setup.py sdist
-      - uses: actions/upload-artifact@v4
+      - uses: actions/upload-artifact@v2
        with:
-          name: wheel-source
          path: ./src/runtime/python/dist/*.tar.gz
   upload_pypi:
@@ -78,25 +75,24 @@ jobs:
     if: github.ref == 'refs/heads/master' && github.event_name == 'push'
     steps:
-      - uses: actions/checkout@v4
+      - uses: actions/checkout@v2
       - name: Set up Python
-        uses: actions/setup-python@v5
+        uses: actions/setup-python@v2
        with:
          python-version: '3.x'
       - name: Install twine
         run: pip install twine
-      - uses: actions/download-artifact@v4.1.7
+      - uses: actions/download-artifact@v2
        with:
-          pattern: wheel-*
-          merge-multiple: true
+          name: artifact
          path: ./dist
       - name: Publish
         env:
          TWINE_USERNAME: __token__
-          TWINE_PASSWORD: ${{ secrets.PYPI_PASSWORD }}
+          TWINE_PASSWORD: ${{ secrets.pypi_password }}
        run: |
-          twine upload --verbose --non-interactive --skip-existing dist/*
+          (cd ./src/runtime/python && curl -I --fail https://pypi.org/project/$(python setup.py --name)/$(python setup.py --version)/) || twine upload dist/*

.gitignore

@@ -5,6 +5,7 @@
 *.jar
 *.gfo
 *.pgf
+*.lpgf
 debian/.debhelper
 debian/debhelper-build-stamp
 debian/gf
@@ -53,10 +54,6 @@ DATA_DIR
 stack*.yaml.lock
-# Output files for test suite
-*.out
-gf-tests.html
 # Generated documentation (not exhaustive)
 demos/index-numbers.html
 demos/resourcegrammars.html
@@ -73,9 +70,3 @@ doc/icfp-2012.html
 download/*.html
 gf-book/index.html
 src/www/gf-web-api.html
-.devenv
-.direnv
-result
-.vscode
-.envrc
-.pre-commit-config.yaml

.travis.yml (new file)

@@ -0,0 +1,14 @@
+sudo: required
+language: c
+services:
+  - docker
+before_install:
+  - docker pull odanoburu/gf-src:3.9
+script:
+  - |
+    docker run --mount src="$(pwd)",target=/home/gfer,type=bind odanoburu/gf-src:3.9 /bin/bash -c "cd /home/gfer/src/runtime/c &&
+    autoreconf -i && ./configure && make && make install ; cd /home/gfer ; cabal install -fserver -fc-runtime --extra-lib-dirs='/usr/local/lib'"


@@ -1,12 +0,0 @@
-### New since 3.12 (WIP)
-### 3.12
-See <https://www.grammaticalframework.org/download/release-3.12.html>
-### 3.11
-See <https://www.grammaticalframework.org/download/release-3.11.html>
-### 3.10
-See <https://www.grammaticalframework.org/download/release-3.10.html>


@@ -1,48 +1,31 @@
-.PHONY: all build install doc clean html deb pkg bintar sdist
+.PHONY: all build install doc clean gf html deb pkg bintar sdist
 # This gets the numeric part of the version from the cabal file
 VERSION=$(shell sed -ne "s/^version: *\([0-9.]*\).*/\1/p" gf.cabal)
-# Check if stack is installed
-STACK=$(shell if hash stack 2>/dev/null; then echo "1"; else echo "0"; fi)
-# Check if cabal >= 2.4 is installed (with v1- and v2- commands)
-CABAL_NEW=$(shell if cabal v1-repl --help >/dev/null 2>&1 ; then echo "1"; else echo "0"; fi)
-ifeq ($(STACK),1)
-CMD=stack
-else
-CMD=cabal
-ifeq ($(CABAL_NEW),1)
-CMD_PFX=v1-
-endif
-endif
 all: build
 dist/setup-config: gf.cabal Setup.hs WebSetup.hs
-ifneq ($(STACK),1)
-	cabal ${CMD_PFX}configure
-endif
+	cabal configure
 build: dist/setup-config
-	${CMD} ${CMD_PFX}build
+	cabal build
 install:
-ifeq ($(STACK),1)
-	stack install
-else
-	cabal ${CMD_PFX}copy
-	cabal ${CMD_PFX}register
-endif
+	cabal copy
+	cabal register
 doc:
-	${CMD} ${CMD_PFX}haddock
+	cabal haddock
 clean:
-	${CMD} ${CMD_PFX}clean
+	cabal clean
 	bash bin/clean_html
+gf:
+	cabal build rgl-none
+	strip dist/build/gf/gf
 html::
 	bash bin/update_html
@@ -50,9 +33,9 @@ html::
 # number to the top of debian/changelog.
 # (Tested on Ubuntu 15.04. You need to install dpkg-dev & debhelper.)
 deb:
-	dpkg-buildpackage -b -uc -d
+	dpkg-buildpackage -b -uc
-# Make a macOS installer package
+# Make an OS X Installer package
 pkg:
 	FMT=pkg bash bin/build-binary-dist.sh
@@ -65,6 +48,6 @@ bintar:
 # Make a source tar.gz distribution using git to make sure that everything is included.
 # We put the distribution in dist/ so it is removed on `make clean`
-# sdist:
+sdist:
-#	test -d dist || mkdir dist
+	test -d dist || mkdir dist
-#	git archive --format=tar.gz --output=dist/gf-${VERSION}.tar.gz HEAD
+	git archive --format=tar.gz --output=dist/gf-${VERSION}.tar.gz HEAD


@@ -1,4 +1,4 @@
-![GF Logo](https://www.grammaticalframework.org/doc/Logos/gf1.svg)
+![GF Logo](doc/Logos/gf1.svg)
 # Grammatical Framework (GF)
@@ -30,31 +30,13 @@ GF particularly addresses four aspects of grammars:
 ## Compilation and installation
-The simplest way of installing GF from source is with the command:
+The simplest way of installing GF is with the command:
 ```
 cabal install
 ```
-or:
-```
-stack install
-```
-Note that if you are unlucky to have Cabal 3.0 or later, then it uses
-the so-called Nix style commands. Using those for GF development is
-a pain. Every time when you change something in the source code, Cabal
-will generate a new folder for GF to look for the GF libraries and
-the GF cloud. Either reinstall everything with every change in the
-compiler, or be sane and stop using cabal-install. Instead you can do:
-```
-runghc Setup.hs configure
-runghc Setup.hs build
-sudo runghc Setup.hs install
-```
-The script will install the GF dependencies globally. The only solution
-to the Nix madness that I found is radical:
-"No person, no problem" (Нет человека нет проблемы).
+For more details, see the [download page](http://www.grammaticalframework.org/download/index.html)
+and [developers manual](http://www.grammaticalframework.org/doc/gf-developers.html).
-For more information, including links to precompiled binaries, see the [download page](https://www.grammaticalframework.org/download/index.html).
 ## About this repository


@@ -45,16 +45,11 @@ but the generated _artifacts_ must be manually attached to the release as _asset
 ### 4. Upload to Hackage
-In order to do this you will need to be added to the [GF maintainers](https://hackage.haskell.org/package/gf/maintainers/) on Hackage.
-1. Run `stack sdist --test-tarball` and address any issues.
+1. Run `make sdist`
 2. Upload the package, either:
-   1. **Manually**: visit <https://hackage.haskell.org/upload> and upload the file generated by the previous command.
-   2. **via Stack**: `stack upload . --candidate`
-3. After testing the candidate, publish it:
-   1. **Manually**: visit <https://hackage.haskell.org/package/gf-X.Y.Z/candidate/publish>
-   2. **via Stack**: `stack upload .`
-4. If the documentation-building fails on the Hackage server, do:
+   1. **Manually**: visit <https://hackage.haskell.org/upload> and upload the file `dist/gf-X.Y.tar.gz`
+   2. **via Cabal (≥2.4)**: `cabal upload dist/gf-X.Y.tar.gz`
+3. If the documentation-building fails on the Hackage server, do:
 ```
 cabal v2-haddock --builddir=dist/docs --haddock-for-hackage --enable-doc
 cabal upload --documentation dist/docs/*-docs.tar.gz
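Both upload variants point at the ``dist/gf-X.Y.tar.gz`` artifact that ``make sdist`` produces (the ``git archive`` line at the top of this diff shows how it is built). A minimal sketch of composing that path in shell; the version number here is an assumed example, the real one comes from the Makefile:

```shell
# Compose the sdist artifact path expected by the upload steps.
# VERSION is an assumed example value; `make sdist` derives the real one.
VERSION=3.10
tarball="dist/gf-${VERSION}.tar.gz"
echo "$tarball"
```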


@@ -4,68 +4,42 @@ import Distribution.Simple.LocalBuildInfo(LocalBuildInfo(..),absoluteInstallDirs
 import Distribution.Simple.Setup(BuildFlags(..),Flag(..),InstallFlags(..),CopyDest(..),CopyFlags(..),SDistFlags(..))
 import Distribution.PackageDescription(PackageDescription(..),emptyHookedBuildInfo)
 import Distribution.Simple.BuildPaths(exeExtension)
-import System.Directory
 import System.FilePath((</>),(<.>))
-import System.Process
-import Control.Monad(forM_,unless)
-import Control.Exception(bracket_)
-import Data.Char(isSpace)
 import WebSetup
 
--- | Notice about RGL not built anymore
-noRGLmsg :: IO ()
-noRGLmsg = putStrLn "Notice: the RGL is not built as part of GF anymore. See https://github.com/GrammaticalFramework/gf-rgl"
 
 main :: IO ()
 main = defaultMainWithHooks simpleUserHooks
-  { preConf   = gfPreConf
-  , preBuild  = gfPreBuild
+  { preBuild  = gfPreBuild
   , postBuild = gfPostBuild
   , preInst   = gfPreInst
   , postInst  = gfPostInst
   , postCopy  = gfPostCopy
   }
   where
-    gfPreConf args flags = do
-      pkgs <- fmap (map (dropWhile isSpace) . tail . lines)
-                   (readProcess "ghc-pkg" ["list"] "")
-      forM_ dependencies $ \pkg -> do
-        let name = takeWhile (/='/') (drop 36 pkg)
-        unless (name `elem` pkgs) $ do
-          let fname = name <.> ".tar.gz"
-          callProcess "wget" [pkg,"-O",fname]
-          callProcess "tar" ["-xzf",fname]
-          removeFile fname
-          bracket_ (setCurrentDirectory name) (setCurrentDirectory ".." >> removeDirectoryRecursive name) $ do
-            exists <- doesFileExist "Setup.hs"
-            unless exists $ do
-              writeFile "Setup.hs" (unlines [
-                "import Distribution.Simple",
-                "main = defaultMain"
-                ])
-            let to_descr = reverse .
-                           (++) (reverse ".cabal") .
-                           drop 1 .
-                           dropWhile (/='-') .
-                           reverse
-            callProcess "wget" [to_descr pkg, "-O", to_descr name]
-            callProcess "runghc" ["Setup.hs","configure"]
-            callProcess "runghc" ["Setup.hs","build"]
-            callProcess "sudo" ["runghc","Setup.hs","install"]
-      preConf simpleUserHooks args flags
     gfPreBuild args = gfPre args . buildDistPref
     gfPreInst args = gfPre args . installDistPref
     gfPre args distFlag = do
       return emptyHookedBuildInfo
     gfPostBuild args flags pkg lbi = do
-      -- noRGLmsg
       let gf = default_gf lbi
       buildWeb gf flags (pkg,lbi)
     gfPostInst args flags pkg lbi = do
-      -- noRGLmsg
+      saveInstallPath args flags (pkg,lbi)
       installWeb (pkg,lbi)
     gfPostCopy args flags pkg lbi = do
-      -- noRGLmsg
+      saveCopyPath args flags (pkg,lbi)
       copyWeb flags (pkg,lbi)
 
 -- `cabal sdist` will not make a proper dist archive, for that see `make sdist`
@@ -73,16 +47,27 @@ main = defaultMainWithHooks simpleUserHooks
     gfSDist pkg lbi hooks flags = do
       return ()
 
-dependencies = [
-  "https://hackage.haskell.org/package/utf8-string-1.0.2/utf8-string-1.0.2.tar.gz",
-  "https://hackage.haskell.org/package/json-0.10/json-0.10.tar.gz",
-  "https://hackage.haskell.org/package/network-bsd-2.8.1.0/network-bsd-2.8.1.0.tar.gz",
-  "https://hackage.haskell.org/package/httpd-shed-0.4.1.1/httpd-shed-0.4.1.1.tar.gz",
-  "https://hackage.haskell.org/package/exceptions-0.10.5/exceptions-0.10.5.tar.gz",
-  "https://hackage.haskell.org/package/stringsearch-0.3.6.6/stringsearch-0.3.6.6.tar.gz",
-  "https://hackage.haskell.org/package/multipart-0.2.1/multipart-0.2.1.tar.gz",
-  "https://hackage.haskell.org/package/cgi-3001.5.0.0/cgi-3001.5.0.0.tar.gz"
-  ]
+saveInstallPath :: [String] -> InstallFlags -> (PackageDescription, LocalBuildInfo) -> IO ()
+saveInstallPath args flags bi = do
+  let
+    dest = NoCopyDest
+    dir  = datadir (uncurry absoluteInstallDirs bi dest)
+  writeFile dataDirFile dir
+
+saveCopyPath :: [String] -> CopyFlags -> (PackageDescription, LocalBuildInfo) -> IO ()
+saveCopyPath args flags bi = do
+  let
+    dest = case copyDest flags of
+             NoFlag -> NoCopyDest
+             Flag d -> d
+    dir  = datadir (uncurry absoluteInstallDirs bi dest)
+  writeFile dataDirFile dir
+
+-- | Name of the file where the installation's data directory is recorded.
+-- This is a last-resort way in which the separate RGL build script
+-- can determine where to put the compiled RGL files.
+dataDirFile :: String
+dataDirFile = "DATA_DIR"
 
 -- | Get path to locally-built gf
 default_gf :: LocalBuildInfo -> FilePath
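The removed ``gfPreConf`` hook derives each dependency's ``.cabal`` URL from its tarball URL with ``to_descr`` (reverse, drop through the last ``-``, append ``.cabal``). In shell the same transformation is plain suffix stripping; a sketch using one of the package URLs from the removed ``dependencies`` list:

```shell
# Equivalent of Setup.hs's to_descr helper: strip everything from the
# last '-' onward (the version plus ".tar.gz") and append ".cabal".
pkg="https://hackage.haskell.org/package/json-0.10/json-0.10.tar.gz"
descr="${pkg%-*}.cabal"
echo "$descr"
```

`${pkg%-*}` removes the shortest suffix matching `-*`, i.e. everything after the last hyphen, which is exactly what the reverse/dropWhile pipeline does in Haskell.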


@@ -26,14 +26,6 @@ import Distribution.PackageDescription(PackageDescription(..))
 so users won't see this message unless they check the log.)
 -}
 
--- | Notice about contrib grammars
-noContribMsg :: IO ()
-noContribMsg = putStr $ unlines
-  [ "Example grammars are no longer included in the main GF repository, but have moved to gf-contrib."
-  , "If you want them to be built, clone the following repository in the same directory as gf-core:"
-  , "https://github.com/GrammaticalFramework/gf-contrib.git"
-  ]
-
 example_grammars :: [(String, String, [String])] -- [(pgf, subdir, source modules)]
 example_grammars =
   [("Letter.pgf","letter",letterSrc)
@@ -58,8 +50,11 @@ buildWeb gf flags (pkg,lbi) = do
       contrib_exists <- doesDirectoryExist contrib_dir
       if contrib_exists
         then mapM_ build_pgf example_grammars
-        -- else noContribMsg
-        else return ()
+        else putStr $ unlines
+          [ "Example grammars are no longer included in the main GF repository, but have moved to gf-contrib."
+          , "If you want these example grammars to be built, clone this repository in the same top-level directory as GF:"
+          , "https://github.com/GrammaticalFramework/gf-contrib.git"
+          ]
   where
     gfo_dir = buildDir lbi </> "examples"


@@ -32,7 +32,7 @@ set -x # print commands before executing them
 pushd src/runtime/c
 bash setup.sh configure --prefix="$prefix"
 bash setup.sh build
-# bash setup.sh install prefix="$prefix" # hack required for GF build on macOS
+bash setup.sh install prefix="$prefix" # hack required for GF build on macOS
 bash setup.sh install prefix="$destdir$prefix"
 popd
@@ -46,7 +46,7 @@ if which >/dev/null python; then
     pyver=$(ls "$destdir$prefix/lib" | sed -n 's/^python//p')
     pydest="$destdir/Library/Python/$pyver/site-packages"
     mkdir -p "$pydest"
-    ln "$destdir$prefix/lib/python$pyver/site-packages"/pgf*.so "$pydest"
+    ln "$destdir$prefix/lib/python$pyver/site-packages"/pgf* "$pydest"
   fi
   popd
 else
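The ``pyver`` line in the hunk above scrapes the Python version out of a ``lib/`` directory listing with ``sed``; the extraction can be exercised in isolation (the ``python3.9`` directory name is an assumed example):

```shell
# Same extraction as the installer script: keep only lines starting
# with "python" and strip that prefix, leaving the version number.
pyver=$(printf 'python3.9\n' | sed -n 's/^python//p')
echo "$pyver"
```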

debian/changelog

@@ -1,14 +1,3 @@
-gf (3.12) noble; urgency=low
-
-  * GF 3.12
-
- -- Inari Listenmaa <inari@digitalgrammars.com>  Fri, 8 Aug 2025 18:29:29 +0100
-
-gf (3.11) bionic focal; urgency=low
-
-  * GF 3.11
-
- -- Inari Listenmaa <inari@digitalgrammars.com>  Sun, 25 Jul 2021 10:27:40 +0800
-
 gf (3.10.4-1) xenial bionic cosmic; urgency=low
   * GF 3.10.4

debian/control

@@ -3,7 +3,7 @@ Section: devel
 Priority: optional
 Maintainer: Thomas Hallgren <hallgren@chalmers.se>
 Standards-Version: 3.9.2
-Build-Depends: debhelper (>= 5), libghc-haskeline-dev, libghc-mtl-dev, libghc-json-dev, autoconf, automake, libtool-bin, python-dev-is-python3, java-sdk
+Build-Depends: debhelper (>= 5), haskell-platform (>= 2011.2.0.1), libghc-haskeline-dev, libghc-mtl-dev, libghc-json-dev, autoconf, automake, libtool-bin, python-dev, java-sdk
 Homepage: http://www.grammaticalframework.org/
 
 Package: gf

debian/rules

@@ -17,30 +17,28 @@ override_dh_auto_configure:
 	cd src/runtime/c && bash setup.sh configure --prefix=/usr
 	cd src/runtime/c && bash setup.sh build
 	cabal update
-	cabal v1-install --only-dependencies
-	cabal v1-configure --prefix=/usr -fserver -fc-runtime --extra-lib-dirs=$(CURDIR)/src/runtime/c/.libs --extra-include-dirs=$(CURDIR)/src/runtime/c
+	cabal install --only-dependencies
+	cabal configure --prefix=/usr -fserver -fc-runtime --extra-lib-dirs=$(CURDIR)/src/runtime/c/.libs --extra-include-dirs=$(CURDIR)/src/runtime/c
 
 SET_LDL=LD_LIBRARY_PATH=$$LD_LIBRARY_PATH:$(CURDIR)/src/runtime/c/.libs
 
 override_dh_auto_build:
 	cd src/runtime/python && EXTRA_INCLUDE_DIRS=$(CURDIR)/src/runtime/c EXTRA_LIB_DIRS=$(CURDIR)/src/runtime/c/.libs python setup.py build
-#	cd src/runtime/java && make CFLAGS="-I$(CURDIR)/src/runtime/c -L$(CURDIR)/src/runtime/c/.libs" INSTALL_PATH=/usr
+	cd src/runtime/java && make CFLAGS="-I$(CURDIR)/src/runtime/c -L$(CURDIR)/src/runtime/c/.libs" INSTALL_PATH=/usr
 	echo $(SET_LDL)
-	-$(SET_LDL) cabal v1-build
+	-$(SET_LDL) cabal build
 
 override_dh_auto_install:
-	$(SET_LDL) cabal v1-copy --destdir=$(CURDIR)/debian/gf
+	$(SET_LDL) cabal copy --destdir=$(CURDIR)/debian/gf
 	cd src/runtime/c && bash setup.sh copy prefix=$(CURDIR)/debian/gf/usr
 	cd src/runtime/python && python setup.py install --prefix=$(CURDIR)/debian/gf/usr
-#	cd src/runtime/java && make INSTALL_PATH=$(CURDIR)/debian/gf/usr install
-#	D="`find debian/gf -name dist-packages`" && [ -n "$$D" ] && cd $$D && cd .. && mv dist-packages dist-packages
-
-override_dh_usrlocal:
+	cd src/runtime/java && make INSTALL_PATH=$(CURDIR)/debian/gf/usr install
+	D="`find debian/gf -name site-packages`" && [ -n "$$D" ] && cd $$D && cd .. && mv site-packages dist-packages
 
 override_dh_auto_clean:
 	rm -fr dist/build
 	-cd src/runtime/python && rm -fr build
-#	-cd src/runtime/java && make clean
+	-cd src/runtime/java && make clean
 	-cd src/runtime/c && make clean
 
 override_dh_auto_test:
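The ``override_dh_auto_install`` recipe renames Python's ``site-packages`` directory to the ``dist-packages`` name Debian expects. A self-contained sketch of that find-and-rename step (the temporary directory tree is mine, for illustration only):

```shell
set -e
tmp=$(mktemp -d)
mkdir -p "$tmp/debian/gf/usr/lib/python3.9/site-packages"
# Mirror the rules file: locate site-packages and rename it in place.
D="$(find "$tmp/debian/gf" -name site-packages)"
[ -n "$D" ] && cd "$D" && cd .. && mv site-packages dist-packages
ls "$tmp/debian/gf/usr/lib/python3.9"
```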


@@ -1,201 +0,0 @@
GF Developer's Guide: Old installation instructions with Cabal
This page contains the old installation instructions from the [Developer's Guide ../doc/gf-developers.html].
We recommend Stack as a primary installation method, because it's easier for a Haskell beginner, and we want to keep the main instructions short.
But if you are an experienced Haskeller and want to keep using Cabal, here are the old instructions using ``cabal install``.
Note that some of these instructions may be outdated. Other parts may still be useful.
== Compilation from source with Cabal ==
The build system of GF is based on //Cabal//, which is part of the
Haskell Platform, so no extra steps are needed to install it. In the simplest
case, all you need to do to compile and install GF, after downloading the
source code as described above, is
```
$ cabal install
```
This will automatically download any additional Haskell libraries needed to
build GF. If this is the first time you use Cabal, you might need to run
``cabal update`` first, to update the list of available libraries.
If you want more control, the process can also be split up into the usual
//configure//, //build// and //install// steps.
=== Configure ===
During the configuration phase Cabal will check that you have all
necessary tools and libraries needed for GF. The configuration is
started by the command:
```
$ cabal configure
```
If you don't see any error message from the above command then you
have everything that is needed for GF. You can also add the option
``-v`` to see more details about the configuration.
You can use ``cabal configure --help`` to get a list of configuration options.
=== Build ===
The build phase does two things. First it builds the GF compiler from
the Haskell source code and after that it builds the GF Resource Grammar
Library using the already build compiler. The simplest command is:
```
$ cabal build
```
Again you can add the option ``-v`` if you want to see more details.
==== Parallel builds ====
If you have Cabal>=1.20 you can enable parallel compilation by using
```
$ cabal build -j
```
or by putting a line
```
jobs: $ncpus
```
in your ``.cabal/config`` file. Cabal
will pass this option to GHC when building the GF compiler, if you
have GHC>=7.8.
Cabal also passes ``-j`` to GF to enable parallel compilation of the
Resource Grammar Library. This is done unconditionally to avoid
causing problems for developers with Cabal<1.20. You can disable this
by editing the last few lines in ``WebSetup.hs``.
=== Install ===
After you have compiled GF you need to install the executable and libraries
to make the system usable.
```
$ cabal copy
$ cabal register
```
This command installs the GF compiler for a single user, in the standard
place used by Cabal.
On Linux and Mac this could be ``$HOME/.cabal/bin``.
On Mac it could also be ``$HOME/Library/Haskell/bin``.
On Windows this is ``C:\Program Files\Haskell\bin``.
The compiled GF Resource Grammar Library will be installed
under the same prefix, e.g. in
``$HOME/.cabal/share/gf-3.3.3/lib`` on Linux and
in ``C:\Program Files\Haskell\gf-3.3.3\lib`` on Windows.
If you want to install in some other place then use the ``--prefix``
option during the configuration phase.
=== Clean ===
Sometimes you want to clean up the compilation and start again from clean
sources. Use the clean command for this purpose:
```
$ cabal clean
```
%=== SDist ===
%
%You can use the command:
%
%% This does *NOT* include everything that is needed // TH 2012-08-06
%```
%$ cabal sdist
%```
%
%to prepare archive with all source codes needed to compile GF.
=== Known problems with Cabal ===
Some versions of Cabal (at least version 1.16) seem to have a bug that can
cause the following error:
```
Configuring gf-3.x...
setup: Distribution/Simple/PackageIndex.hs:124:8-13: Assertion failed
```
The exact cause of this problem is unclear, but it seems to happen
during the configure phase if the same version of GF is already installed,
so a workaround is to remove the existing installation with
```
ghc-pkg unregister gf
```
You can check with ``ghc-pkg list gf`` that it is gone.
== Compilation with make ==
If you feel more comfortable with Makefiles then there is a thin Makefile
wrapper around Cabal for you. If you just type:
```
$ make
```
the configuration phase will be run automatically if needed and after that
the sources will be compiled.
%% cabal build rgl-none does not work with recent versions of Cabal
%If you don't want to compile the resource library
%every time then you can use:
%```
%$ make gf
%```
For installation use:
```
$ make install
```
For cleaning:
```
$ make clean
```
%and to build source distribution archive run:
%```
%$ make sdist
%```
== Partial builds of RGL ==
**NOTE**: The following doesn't work with recent versions of ``cabal``. //(This comment was left in 2015, so draw your own conclusions.)//
%% // TH 2015-06-22
%Sometimes you just want to work on the GF compiler and don't want to
%recompile the resource library after each change. In this case use
%this extended command:
%```
%$ cabal build rgl-none
%```
The resource grammar library can be compiled in two modes: with present
tense only and with all tenses. By default it is compiled with all
tenses. If you want to use the library with only present tense you can
compile it in this special mode with the command:
```
$ cabal build present
```
You could also control which languages you want to be recompiled by
adding the option ``langs=list``. For example the following command
will compile only the English and the Swedish language:
```
$ cabal build langs=Eng,Swe
```
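The ``langs=list`` option above takes a comma-separated list of language codes; in a build script such a value splits with ``tr`` (the option string here is an assumed example):

```shell
# Split a langs=Eng,Swe style option value into space-separated codes.
opt="langs=Eng,Swe"
langs=$(printf '%s' "${opt#langs=}" | tr ',' ' ')
echo "$langs"
```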


@@ -1,6 +1,6 @@
 GF Developers Guide
-2021-07-15
+2018-07-26
 %!options(html): --toc
@@ -15,304 +15,388 @@ you are a GF user who just wants to download and install GF
 == Setting up your system for building GF ==
 
 To build GF from source you need to install some tools on your
-system: the Haskell build tool //Stack//, the version control software //Git// and the //Haskeline// library.
+system: the //Haskell Platform//, //Git// and the //Haskeline library//.
 
-%**On Linux** the best option is to install the tools via the standard
-%software distribution channels, i.e. by using the //Software Center//
-%in Ubuntu or the corresponding tool in other popular Linux distributions.
-
-%**On Mac OS and Windows**, the tools can be downloaded from their respective
-%web sites, as described below.
-
-=== Stack ===
-
-The primary installation method is via //Stack//.
-(You can also use Cabal, but we recommend Stack to those who are new to Haskell.)
-To install Stack:
-
-- **On Linux and Mac OS**, do either
-  ``$ curl -sSL https://get.haskellstack.org/ | sh``
-  or
-  ``$ wget -qO- https://get.haskellstack.org/ | sh``
-- **On other operating systems**, see the [installation guide https://docs.haskellstack.org/en/stable/install_and_upgrade].
-
-%If you already have Stack installed, upgrade it to the latest version by running: ``stack upgrade``
+**On Linux** the best option is to install the tools via the standard
+software distribution channels, i.e. by using the //Software Center//
+in Ubuntu or the corresponding tool in other popular Linux distributions.
+Or, from a Terminal window, the following command should be enough:
+
+- On Ubuntu: ``sudo apt-get install haskell-platform git libghc6-haskeline-dev``
+- On Fedora: ``sudo dnf install haskell-platform git ghc-haskeline-devel``
+
+**On Mac OS and Windows**, the tools can be downloaded from their respective
+web sites, as described below.
+
+=== The Haskell Platform ===
+
+GF is written in Haskell, so first of all you need
+the //Haskell Platform//, e.g. version 8.0.2 or 7.10.3. Downloads
+and installation instructions are available from here:
+http://hackage.haskell.org/platform/
+
+Once you have installed the Haskell Platform, open a terminal
+(Command Prompt on Windows) and try to execute the following command:
+```
+$ ghc --version
+```
+This command should show you which version of GHC you have. If the installation
+of the Haskell Platform was successful you should see a message like:
+```
+The Glorious Glasgow Haskell Compilation System, version 8.0.2
+```
+
+Other required tools included in the Haskell Platform are
+[Cabal http://www.haskell.org/cabal/],
+[Alex http://www.haskell.org/alex/]
+and
+[Happy http://www.haskell.org/happy/].
 
 === Git ===
 
-To get the GF source code, you also need //Git//, a distributed version control system.
-
-- **On Linux**, the best option is to install the tools via the standard
-  software distribution channels:
-  - On Ubuntu: ``sudo apt-get install git-all``
-  - On Fedora: ``sudo dnf install git-all``
-- **On other operating systems**, see
-  https://git-scm.com/book/en/v2/Getting-Started-Installing-Git for installation.
-
-=== Haskeline ===
+To get the GF source code, you also need //Git//.
+//Git// is a distributed version control system, see
+https://git-scm.com/downloads for more information.
+
+=== The haskeline library ===
 
 GF uses //haskeline// to enable command line editing in the GF shell.
+This should work automatically on Mac OS and Windows, but on Linux one
+extra step is needed to make sure the C libraries (terminfo)
+required by //haskeline// are installed. Here is one way to do this:
 
-- **On Mac OS and Windows**, this should work automatically.
-- **On Linux**, an extra step is needed to make sure the C libraries (terminfo)
-  required by //haskeline// are installed:
-  - On Ubuntu: ``sudo apt-get install libghc-haskeline-dev``
-  - On Fedora: ``sudo dnf install ghc-haskeline-devel``
+- On Ubuntu: ``sudo apt-get install libghc-haskeline-dev``
+- On Fedora: ``sudo dnf install ghc-haskeline-devel``
-== Getting the source ==[getting-source]
+== Getting the source ==
 
-Once you have all tools in place you can get the GF source code from
-[GitHub https://github.com/GrammaticalFramework/]:
-
-- https://github.com/GrammaticalFramework/gf-core for the GF compiler
-- https://github.com/GrammaticalFramework/gf-rgl for the Resource Grammar Library
-
-=== Read-only access: clone the main repository ===
-
-If you only want to compile and use GF, you can just clone the repositories as follows:
+Once you have all tools in place you can get the GF source code. If you
+just want to compile and use GF then it is enough to have read-only
+access. It is also possible to make changes in the source code but if you
+want these changes to be applied back to the main source repository you will
+have to send the changes to us. If you plan to work continuously on
+GF then you should consider getting read-write access.
+
+=== Read-only access ===
+
+==== Getting a fresh copy for read-only access ====
+
+Anyone can get the latest development version of GF by running:
 ```
 $ git clone https://github.com/GrammaticalFramework/gf-core.git
 $ git clone https://github.com/GrammaticalFramework/gf-rgl.git
 ```
-
-To get new updates, run the following anywhere in your local copy of the repository:
-```
-$ git pull
-```
-
-=== Contribute your changes: fork the main repository ===
-
-If you want the possibility to contribute your changes,
-you should create your own fork, do your changes there,
-and then send a pull request to the main repository.
-
-+ **Creating and cloning a fork —**
-  See GitHub documentation for instructions how to [create your own fork https://docs.github.com/en/get-started/quickstart/fork-a-repo]
-  of the repository. Once you've done it, clone the fork to your local computer.
-```
-$ git clone https://github.com/<YOUR_USERNAME>/gf-core.git
-```
-+ **Updating your copy —**
-  Once you have cloned your fork, you need to set up the main repository as a remote:
-```
-$ git remote add upstream https://github.com/GrammaticalFramework/gf-core.git
-```
-  Then you can get the latest updates by running the following:
-```
-$ git pull upstream master
-```
-+ **Recording local changes —**
-  See Git tutorial on how to [record and push your changes https://git-scm.com/book/en/v2/Git-Basics-Recording-Changes-to-the-Repository] to your fork.
-+ **Pull request —**
-  When you want to contribute your changes to the main gf-core repository,
-  [create a pull request https://docs.github.com/en/github/collaborating-with-pull-requests/proposing-changes-to-your-work-with-pull-requests/creating-a-pull-request]
-  from your fork.
+This will create directories ``gf-core`` and ``gf-rgl`` in the current directory.
+
+==== Updating your copy ====
+
+To get all new patches from each repo:
+```
+$ git pull
+```
+This can be done anywhere in your local repository.
+
+==== Recording local changes ====[record]
+
+Since every copy is a repository, you can have local version control
+of your changes.
+If you have added files, you first need to tell your local repository to
+keep them under revision control:
+```
+$ git add file1 file2 ...
+```
+To record changes, use:
+```
+$ git commit file1 file2 ...
+```
+This creates a patch against the previous version and stores it in your
+local repository. You can record any number of changes before
+pushing them to the main repo. In fact, you don't have to push them at
+all if you want to keep the changes only in your local repo.
+
+Instead of enumerating all modified files on the command line,
+you can use the flag ``-a`` to automatically record //all// modified
+files. You still need to use ``git add`` to add new files.
+
+=== Read-write access ===
+
+If you are a member of the GF project on GitHub, you can push your
+changes directly to the GF git repository on GitHub.
+```
+$ git push
+```
+It is also possible for anyone else to contribute by
+- creating a fork of the GF repository on GitHub,
+- working with local clone of the fork (obtained with ``git clone``),
+- pushing changes to the fork,
+- and finally sending a pull request.
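Both sides of this hunk describe the same Git workflow: clone, add a second remote, then pull or push. A self-contained sketch using a local bare repository as a stand-in for the GitHub upstream (all paths and names here are mine, for illustration):

```shell
set -e
tmp=$(mktemp -d)
# Stand-in for the main repository on GitHub: a bare repo with one commit.
git init -q --bare "$tmp/upstream.git"
git clone -q "$tmp/upstream.git" "$tmp/seed"
git -C "$tmp/seed" -c user.email=you@example.com -c user.name=you \
    commit -q --allow-empty -m "initial commit"
git -C "$tmp/seed" push -q origin HEAD
# The workflow from the text: clone your copy, then track upstream too.
git clone -q "$tmp/upstream.git" "$tmp/work"
git -C "$tmp/work" remote add upstream "$tmp/upstream.git"
git -C "$tmp/work" fetch -q upstream    # analogous to: git pull upstream master
git -C "$tmp/work" remote
```

With a real fork, the two remotes would be your fork's URL (``origin``) and the main repository's URL (``upstream``).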
-If you want to contribute to the RGL as well, do the same process for the RGL repository.
-
-== Compilation from source ==
-
-By now you should have installed Stack and Haskeline, and cloned the Git repository on your own computer, in a directory called ``gf-core``.
-
-=== Primary recommendation: use Stack ===
-
-Open a terminal, go to the top directory (``gf-core``), and type the following command.
-```
-$ stack install
-```
-
-=== Alternative: use Cabal ===
-
-If you prefer Cabal, then you just need to manually choose a suitable GHC to build GF. We recommend GHC 9.6.7, see other supported options in [gf.cabal https://github.com/GrammaticalFramework/gf-core/blob/master/gf.cabal#L14].
-The actual installation process is similar to Stack: open a terminal, go to the top directory (``gf-core``), and type the following command.
+== Compilation from source with Cabal ==
+
+The build system of GF is based on //Cabal//, which is part of the
+Haskell Platform, so no extra steps are needed to install it. In the simplest
+case, all you need to do to compile and install GF, after downloading the
+source code as described above, is
 ```
 $ cabal install
 ```
-
-=== Nix ===
-
-As of 3.12, GF can also be installed via Nix. You can install GF from github with the following command:
-```
-nix profile install github:GrammaticalFramework/gf-core#gf
-```
-
-== Compiling GF with C runtime system support ==
-
-The C runtime system is a separate implementation of the PGF runtime services.
+This will automatically download any additional Haskell libraries needed to
+build GF. If this is the first time you use Cabal, you might need to run
+``cabal update`` first, to update the list of available libraries.
+
+If you want more control, the process can also be split up into the usual
+//configure//, //build// and //install// steps.
+
+=== Configure ===
+
+During the configuration phase Cabal will check that you have all
+necessary tools and libraries needed for GF. The configuration is
+started by the command:
+```
+$ cabal configure
+```
+If you don't see any error message from the above command then you
+have everything that is needed for GF. You can also add the option
+``-v`` to see more details about the configuration.
+
+You can use ``cabal configure --help`` to get a list of configuration options.
=== Build ===
The build phase does two things. First it builds the GF compiler from
the Haskell source code and after that it builds the GF Resource Grammar
Library using the already built compiler. The simplest command is:
```
$ cabal build
```
Again you can add the option ``-v`` if you want to see more details.
==== Parallel builds ====
If you have Cabal>=1.20 you can enable parallel compilation by using
```
$ cabal build -j
```
or by putting a line
```
jobs: $ncpus
```
in your ``.cabal/config`` file. Cabal
will pass this option to GHC when building the GF compiler, if you
have GHC>=7.8.
Cabal also passes ``-j`` to GF to enable parallel compilation of the
Resource Grammar Library. This is done unconditionally to avoid
causing problems for developers with Cabal<1.20. You can disable this
by editing the last few lines in ``WebSetup.hs``.
==== Partial builds ====
**NOTE**: The following doesn't work with recent versions of ``cabal``.
%% // TH 2015-06-22
Sometimes you just want to work on the GF compiler and don't want to
recompile the resource library after each change. In this case use
this extended command:
```
$ cabal build rgl-none
```
The resource library could also be compiled in two modes: with present
tense only and with all tenses. By default it is compiled with all
tenses. If you want to use the library with only present tense you can
compile it in this special mode with the command:
```
$ cabal build present
```
You could also control which languages you want to be recompiled by
adding the option ``langs=list``. For example the following command
will compile only the English and the Swedish language:
```
$ cabal build langs=Eng,Swe
```
=== Install ===
After you have compiled GF you need to install the executable and libraries
to make the system usable.
```
$ cabal copy
$ cabal register
```
This command installs the GF compiler for a single user, in the standard
place used by Cabal.
On Linux and Mac this could be ``$HOME/.cabal/bin``.
On Mac it could also be ``$HOME/Library/Haskell/bin``.
On Windows this is ``C:\Program Files\Haskell\bin``.
The compiled GF Resource Grammar Library will be installed
under the same prefix, e.g. in
``$HOME/.cabal/share/gf-3.3.3/lib`` on Linux and
in ``C:\Program Files\Haskell\gf-3.3.3\lib`` on Windows.
If you want to install in some other place then use the ``--prefix``
option during the configuration phase.
=== Clean ===
Sometimes you want to clean up the compilation and start again from clean
sources. Use the clean command for this purpose:
```
$ cabal clean
```
%=== SDist ===
%
%You can use the command:
%
%% This does *NOT* include everything that is needed // TH 2012-08-06
%```
%$ cabal sdist
%```
%
%to prepare archive with all source codes needed to compile GF.
=== Known problems with Cabal ===
Some versions of Cabal (at least version 1.16) seem to have a bug that can
cause the following error:
```
Configuring gf-3.x...
setup: Distribution/Simple/PackageIndex.hs:124:8-13: Assertion failed
```
The exact cause of this problem is unclear, but it seems to happen
during the configure phase if the same version of GF is already installed,
so a workaround is to remove the existing installation with
```
ghc-pkg unregister gf
```
You can check with ``ghc-pkg list gf`` that it is gone.
== Compilation with make ==
If you feel more comfortable with Makefiles then there is a thin Makefile
wrapper around Cabal for you. If you just type:
```
$ make
```
the configuration phase will be run automatically if needed and after that
the sources will be compiled.
%% cabal build rgl-none does not work with recent versions of Cabal
%If you don't want to compile the resource library
%every time then you can use:
%```
%$ make gf
%```
For installation use:
```
$ make install
```
For cleaning:
```
$ make clean
```
%and to build source distribution archive run:
%```
%$ make sdist
%```
== Compiling GF with C run-time system support ==
The C run-time system is a separate implementation of the PGF run-time services.
It makes it possible to work with very large, ambiguous grammars, using
probabilistic models to obtain probable parses. The C run-time system might
also be easier to use than the Haskell run-time system on certain platforms,
e.g. Android and iOS.

To install the C run-time system, go to the ``src/runtime/c`` directory.
Some example C programs are included in the ``utils`` subdirectory, e.g.
``pgf-translate.c``.
- **On Linux and Mac OS —**
You should have autoconf, automake, libtool and make.
If you are missing some of them, follow the
instructions in the [INSTALL https://github.com/GrammaticalFramework/gf-core/blob/master/src/runtime/c/INSTALL] file.
Once you have the required libraries, the easiest way to install the C run-time system is to use the ``install.sh`` script. Just type
``$ bash install.sh``
This will install the C header files and libraries needed to write C programs
that use PGF grammars.
% If this doesn't work for you, follow the manual instructions in the [INSTALL https://github.com/GrammaticalFramework/gf-core/blob/master/src/runtime/c/INSTALL] file under your operating system.
- **On other operating systems —** Follow the instructions in the
[INSTALL https://github.com/GrammaticalFramework/gf-core/blob/master/src/runtime/c/INSTALL] file under your operating system.
Depending on what you want to do with the C runtime, you can follow one or more of the following steps.
=== Use the C runtime from another programming language ===[bindings]
% **If you just want to use the C runtime from Python or Haskell, you don't need to change your GF installation.**
- **What —**
This is the most common use case for the C runtime: compile
your GF grammars into PGF with the standard GF executable,
and manipulate the PGFs from another programming language,
using the bindings to the C runtime.
- **How —**
The Python, Java and Haskell bindings are found in the
``src/runtime/{python,java,haskell-bind}`` directories,
respectively. Compile them by following the instructions
in the ``INSTALL`` or ``README`` files in those directories.
The Python library can also be installed from PyPI using ``pip install pgf``.
//If you are on Mac and get an error about ``clang`` version, you can try some of [these solutions https://stackoverflow.com/questions/63972113/big-sur-clang-invalid-version-error-due-to-macosx-deployment-target]—but be careful before removing any existing installations.//
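As a sketch of what the bindings look like in use, here is a small Python example. The grammar file and concrete syntax names (``Foods.pgf``, ``FoodsEng``, ``FoodsIta``) are invented for illustration, and the snippet degrades gracefully if the ``pgf`` module or the grammar is missing:

```
# Illustrative sketch: "Foods.pgf" and its concrete syntax names are
# hypothetical; a real grammar compiled with the GF compiler would be used.
try:
    import pgf
    gr = pgf.readPGF("Foods.pgf")
    eng = gr.languages["FoodsEng"]
    ita = gr.languages["FoodsIta"]
    # parse() returns an iterator of (probability, expression) pairs
    prob, expr = next(eng.parse("this pizza is delicious"))
    print(ita.linearize(expr))
    status = "translated"
except ImportError:
    status = "pgf module not installed"
except Exception:
    status = "grammar not available"
```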
=== Use GF shell with C runtime support ===
- **What —**
If you want to use the GF shell with C runtime functionalities, then you need to (re)compile GF with special flags.
The GF shell can be started with ``gf -cshell`` or ``gf -crun`` to use
the C run-time system instead of the Haskell run-time system.
Only limited functionality is available when running the shell in these
modes (use the ``help`` command in the shell for details).
(Re)compiling your GF with these flags will also give you
Haskell bindings to the C runtime, as a library called ``PGF2``,
but if you want Python bindings, you need to do [the previous step #bindings].
% ``PGF2``: a module to import in Haskell programs, providing a binding to the C run-time system.
- **How —**
Add (or uncomment) the following lines in the ``stack.yaml`` file:
```
flags:
  gf:
    c-runtime: true
extra-lib-dirs:
- /usr/local/lib
```
and then run ``stack install`` from the top directory (``gf-core``).
Run the newly built executable with the flag ``-cshell``, and you should see the following welcome message:
```
$ gf -cshell

          [ASCII-art GF logo]

This is GF version 3.12.0.
Built on ...
Git info: ...
Flags: interrupt server c-runtime
License: see help -license.

This shell uses the C run-time system. See help for available commands.
>
```
//If you get an "``error while loading shared libraries``" when trying to run GF with C runtime, remember to declare your ``LD_LIBRARY_PATH``.//
//Add ``export LD_LIBRARY_PATH="/usr/local/lib"`` to either your ``.bashrc`` or ``.profile``. You should now be able to start GF with C runtime.//
=== Use GF server mode with C runtime ===
- **What —**
With this feature, ``gf -server`` mode is extended with new requests to call the C run-time
system, e.g. ``c-parse``, ``c-linearize`` and ``c-translate``.
- **How —**
Add the following lines in the ``stack.yaml`` file:
```
flags:
  gf:
    c-runtime: true
    server: true
extra-lib-dirs:
- /usr/local/lib
```
and then run ``stack install``, also from the top directory.
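Once the server is running, the new requests can be exercised over HTTP. The sketch below assumes the default port 41296 and an invented grammar name (``Foods.pgf``), and simply reports when no server is listening:

```
# Assumption: a "gf -server" instance on the default port 41296, serving a
# hypothetical Foods.pgf; the URL layout follows the server's grammars API.
import urllib.error
import urllib.parse
import urllib.request

query = urllib.parse.urlencode({"command": "c-parse",
                                "input": "this pizza is delicious"})
url = "http://localhost:41296/grammars/Foods.pgf?" + query
try:
    with urllib.request.urlopen(url, timeout=2) as response:
        print(response.read().decode("utf-8", "replace"))
    status = "got a response"
except OSError:
    status = "no server running"
```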
== Compilation of RGL ==
As of 2018-07-26, the RGL is distributed separately from the GF compiler and runtimes.
To get the source, follow the previous instructions on [how to clone a repository with Git #getting-source].
After cloning the RGL, you should have a directory named ``gf-rgl`` on your computer.
=== Simple ===
To install the RGL, you can use the following commands from within the ``gf-rgl`` repository:
```
@@ -334,68 +418,103 @@ If you do not have Haskell installed, you can use the simple build script ``Setu
== Creating binary distribution packages ==

The binaries are generated with GitHub Actions. More details can be viewed here:
https://github.com/GrammaticalFramework/gf-core/actions/workflows/build-binary-packages.yml


== Running the test suite ==

The GF test suite is run with one of the following commands from the top directory:
```
$ cabal test
```
or
```
$ stack test
```
The testsuite architecture for GF is very simple but still very flexible.
GF by itself is an interpreter and could execute commands in batch mode.
This is everything that we need to organize a testsuite. The root of the
testsuite is the ``testsuite/`` directory. It contains subdirectories
which themselves contain GF batch files (with extension ``.gfs``).
The above command searches the subdirectories of the ``testsuite/`` directory
for files with extension ``.gfs`` and when it finds one, it is executed with
the GF interpreter. The output of the script is stored in a file with extension ``.out``
and is compared with the content of the corresponding file with extension ``.gold``, if there is one.

Every time you make some changes to GF that have to be tested,
instead of writing the commands by hand in the GF shell, add them to a ``.gfs``
file in the testsuite subdirectory where its ``.gf`` file resides and run the test.
In this way you can use the same test later and we will be sure that we will not
accidentally break your code later.
**Test Outcome - Passed:** If the contents of the files with the ``.out`` extension
are identical to their correspondingly-named files with the extension ``.gold``,
the command will report that the tests passed successfully, e.g.
```
Running 1 test suites...
Test suite gf-tests: RUNNING...
Test suite gf-tests: PASS
1 of 1 test suites (1 of 1 test cases) passed.
```
**Test Outcome - Failed:** If there is a contents mismatch between the files
with the ``.out`` extension and their corresponding files with the extension ``.gold``,
the test diagnostics will show a fail and the areas that failed, e.g.
```
testsuite/compiler/compute/Records.gfs: OK
testsuite/compiler/compute/Variants.gfs: FAIL
testsuite/compiler/params/params.gfs: OK
Test suite gf-tests: FAIL
0 of 1 test suites (0 of 1 test cases) passed.
```
The fail results overview is available in gf-tests.html which shows 4 columns:
+ __Results__ - only areas that fail will appear. (Note: There are 3 failures in the gf-tests.html which are labelled as (expected). These failures should be ignored.)
+ __Input__ - which is the test written in the .gfs file
+ __Gold__ - the expected output from running the test set out in the .gfs file. This column refers to the contents from the .gold extension files.
+ __Output__ - This column refers to the contents from the .out extension files which are generated as test output.
After fixing the areas which fail, rerun the test command. Repeat the entire process of fix-and-test until the test suite passes before submitting a pull request to include your changes.
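The gold-file convention described above is easy to emulate; the following is a minimal Python sketch of the comparison step (file names are invented, and this is not the actual Cabal test driver):

```
import pathlib
import tempfile

def check(out_text, gold_path):
    # Testsuite convention: compare .out content against .gold, if present.
    if not gold_path.exists():
        return "no gold file"
    return "OK" if gold_path.read_text() == out_text else "FAIL"

with tempfile.TemporaryDirectory() as d:
    gold = pathlib.Path(d) / "params.gold"
    gold.write_text("Ok\n")
    result_pass = check("Ok\n", gold)                        # matching output
    result_fail = check("Error\n", gold)                     # mismatch
    result_none = check("Ok\n", pathlib.Path(d) / "x.gold")  # no gold file
```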
@@ -1,75 +0,0 @@
# Editor modes & IDE integration for GF
We collect GF modes for various editors on this page. Contributions are welcome!
## Emacs
[gf.el](https://github.com/GrammaticalFramework/gf-emacs-mode) by Johan
Bockgård provides syntax highlighting and automatic indentation and
lets you run the GF Shell in an emacs buffer. See installation
instructions inside.
## Atom
[language-gf](https://atom.io/packages/language-gf), by John J. Camilleri
## Visual Studio Code
* [Grammatical Framework Language Server](https://marketplace.visualstudio.com/items?itemName=anka-213.gf-vscode) by Andreas Källberg.
This provides syntax highlighting and a client for the Grammatical Framework language server. Follow the installation instructions in the link.
* [Grammatical Framework](https://marketplace.visualstudio.com/items?itemName=GrammaticalFramework.gf-vscode) is a simpler extension
without any external dependencies which provides only syntax highlighting.
## Eclipse
[GF Eclipse Plugin](https://github.com/GrammaticalFramework/gf-eclipse-plugin/), by John J. Camilleri
## Gedit
By John J. Camilleri
Copy the file below to
`~/.local/share/gtksourceview-3.0/language-specs/gf.lang` (under Ubuntu).
* [gf.lang](../src/tools/gf.lang)
Some helpful notes/links:
* The code is based heavily on the `haskell.lang` file which I found in
`/usr/share/gtksourceview-2.0/language-specs/haskell.lang`.
* Ruslan Osmanov recommends
[registering your file extension as its own MIME type](http://osmanov-dev-notes.blogspot.com/2011/04/how-to-add-new-highlight-mode-in-gedit.html)
(see also [here](https://help.ubuntu.com/community/AddingMimeTypes)),
however on my system the `.gf` extension was already registered
as a generic font (`application/x-tex-gf`) and I didn't want to risk
messing any of that up.
* This is a quick 5-minute job and might require some tweaking.
[The GtkSourceView language definition tutorial](http://developer.gnome.org/gtksourceview/stable/lang-tutorial.html)
is the place to start looking.
* Contributions are welcome!
## Geany
By John J. Camilleri
[Custom filetype](http://www.geany.org/manual/dev/index.html#custom-filetypes)
config files for syntax highlighting in [Geany](http://www.geany.org/).
For version 1.36 and above, copy one of the files below to
`/usr/share/geany/filedefs/filetypes.GF.conf` (under Ubuntu).
If you're using a version older than 1.36, copy the file to `/usr/share/geany/filetypes.GF.conf`.
You will need to manually create the file.
* [light-filetypes.GF.conf](../src/tools/light-filetypes.GF.conf)
* [dark-filetypes.GF.conf](../src/tools/dark-filetypes.GF.conf)
You will also need to edit the `filetype_extensions.conf` file and add the
following line somewhere:
```
GF=*.gf
```
## Vim
[vim-gf](https://github.com/gdetrez/vim-gf)
@@ -303,7 +303,7 @@ but the resulting .gf file must be imported separately.
#TINY
Generates a list of random trees, by default one tree up to depth 5.
If a tree argument is given, the command completes the Tree with values to
all metavariables in the tree. The generation can be biased by probabilities,
given in a file in the -probs flag.
@@ -315,14 +315,13 @@ given in a file in the -probs flag.
| ``-cat`` | generation category
| ``-lang`` | uses only functions that have linearizations in all these languages
| ``-number`` | number of trees generated
| ``-depth`` | the maximum generation depth (default: 5)
| ``-probs`` | file with biased probabilities (format 'f 0.4' one by line)
- Examples:
| ``gr`` | one tree in the startcat of the current grammar
| ``gr -cat=NP -number=16`` | 16 trees in the category NP
| ``gr -cat=NP -depth=2`` | one tree in the category NP, up to depth 2
| ``gr -lang=LangHin,LangTha -cat=Cl`` | Cl, both in LangHin and LangTha
| ``gr -probs=FILE`` | generate with bias
| ``gr (AdjCN ? (UseN ?))`` | generate trees of form (AdjCN ? (UseN ?))
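For reference, a ``-probs`` file is plain text with one function name and probability per line; the function names below are purely illustrative:

```
AdjCN 0.4
UseN 0.6
PredVP 0.2
```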
@@ -340,7 +339,7 @@ given in a file in the -probs flag.
#TINY
Generates all trees of a given category. By default,
the depth is limited to 5, but this can be changed by a flag.
If a Tree argument is given, the command completes the Tree with values
to all metavariables in the tree.
@@ -354,7 +353,7 @@ to all metavariables in the tree.
- Examples:
| ``gt`` | all trees in the startcat, to depth 5
| ``gt -cat=NP -number=16`` | 16 trees in the category NP
| ``gt -cat=NP -depth=2`` | trees in the category NP to depth 2
| ``gt (AdjCN ? (UseN ?))`` | trees of form (AdjCN ? (UseN ?))
@@ -7,6 +7,7 @@ title: "Grammatical Framework: Authors and Acknowledgements"
The current maintainers of GF are
[Krasimir Angelov](http://www.chalmers.se/cse/EN/organization/divisions/computing-science/people/angelov-krasimir),
[Aarne Ranta](http://www.cse.chalmers.se/~aarne/),
[John J. Camilleri](http://johnjcamilleri.com), and
[Inari Listenmaa](https://inariksit.github.io/).
@@ -21,7 +22,6 @@ and
The following people have contributed code to some of the versions:

- [Thomas Hallgren](http://www.cse.chalmers.se/~hallgren/) (University of Gothenburg)
- Grégoire Détrez (University of Gothenburg)
- Ramona Enache (University of Gothenburg)
- [Björn Bringert](http://www.cse.chalmers.se/alumni/bringert) (University of Gothenburg)
@@ -1188,7 +1188,7 @@ use ``generate_trees = gt``.
this wine is fresh
this wine is warm
```
The default **depth** is 5; the depth can be
set by using the ``depth`` flag:
```
> generate_trees -depth=2 | l
@@ -1265,16 +1265,10 @@ Human eye may prefer to see a visualization: ``visualize_tree = vt``:
> parse "this delicious cheese is very Italian" | visualize_tree
```
The tree is generated in a postscript (``.ps``) file. The ``-view`` option is used for
telling what command to use to view the file.

This works on Mac OS X:
```
> parse "this delicious cheese is very Italian" | visualize_tree -view=open
```
On Linux, one can use one of the following commands.
```
> parse "this delicious cheese is very Italian" | visualize_tree -view=eog
> parse "this delicious cheese is very Italian" | visualize_tree -view=xdg-open
```
@@ -1739,13 +1733,6 @@ A new module can **extend** an old one:
Pizza : Kind ;
}
```
Note that the extended grammar doesn't inherit the start
category from the grammar it extends, so if you want to
generate sentences with this grammar, you'll have to either
add a startcat (e.g. ``flags startcat = Question ;``),
or in the GF shell, specify the category to ``generate_random`` or ``generate_trees``
(e.g. ``gr -cat=Comment`` or ``gt -cat=Question``).
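For example, a hypothetical extension that fixes its own start category could look as follows (the module and category names are only illustrative):

```
abstract Morefood = Food ** {
  flags startcat = Comment ;
  fun Pizza : Kind ;
}
```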
Parallel to the abstract syntax, extensions can
be built for concrete syntaxes:
```
@@ -4591,7 +4578,7 @@ in any multilingual grammar between any languages in the grammar.
module Main where
import PGF
import System.Environment (getArgs)
main :: IO ()
main = do
@@ -1,9 +1,8 @@
---
title: Grammatical Framework Download and Installation
date: 25 July 2021
---

**GF 3.11** was released on 25 July 2021.

What's new? See the [release notes](release-3.11.html).
@@ -25,25 +24,22 @@ Binary packages are available for Debian/Ubuntu, macOS, and Windows and include:
Unlike in previous versions, the binaries **do not** include the RGL.

[Binary packages on GitHub](https://github.com/GrammaticalFramework/gf-core/releases/tag/3.11)

#### Debian/Ubuntu

There are two versions: `gf-3.11-ubuntu-18.04.deb` for Ubuntu 18.04 (Bionic), and `gf-3.11-ubuntu-20.04.deb` for Ubuntu 20.04 (Focal).

To install the package use:
```
sudo apt-get install ./gf-3.11-ubuntu-*.deb
```
<!-- The Ubuntu `.deb` packages should work on Ubuntu 16.04, 18.04 and similar Linux distributions. -->

#### macOS

To install the package, just double-click it and follow the installer instructions.
The packages should work on at least Catalina and Big Sur.

#### Windows
@@ -53,39 +49,24 @@ You will probably need to update the `PATH` environment variable to include your
For more information, see [Using GF on Windows](https://www.grammaticalframework.org/~inari/gf-windows.html) (latest updated for Windows 10).
## Installing from Hackage

_Instructions applicable for macOS, Linux, and WSL2 on Windows._

[GF is on Hackage](http://hackage.haskell.org/package/gf), so under
normal circumstances the procedure is fairly simple:
```
cabal update
cabal install gf-3.11
```
You can also download the source code release from [GitHub](https://github.com/GrammaticalFramework/gf-core/releases),
and follow the instructions below under **Installing from source code**.
### Notes
**GHC version**
The GF source code is known to be compilable with GHC versions 7.10 through to 8.10.
**Obtaining Haskell**
There are various ways of obtaining Haskell, including:
- ghcup
    1. Install from https://www.haskell.org/ghcup/
    2. `ghcup install ghc 8.10.4`
    3. `ghcup set ghc 8.10.4`
- Haskell Platform https://www.haskell.org/platform/
- Stack https://haskellstack.org/
**Installation location**
The above steps install GF for a single user.
The executables are put in `$HOME/.cabal/bin` (or on macOS in `$HOME/Library/Haskell/bin`),
so you might want to add this directory to your path (in `.bash_profile` or similar):
@@ -93,38 +74,47 @@ so you might want to add this directory to your path (in `.bash_profile` or simi
PATH=$HOME/.cabal/bin:$PATH
```
**Build tools**
In order to compile GF you need the build tools **Alex** and **Happy**.
These can be installed via Cabal, e.g.:
```
cabal install alex happy
```
or obtained by other means, depending on your OS.
**Haskeline**
GF uses [`haskeline`](http://hackage.haskell.org/package/haskeline), which
on Linux depends on some non-Haskell libraries that won't be installed
automatically by Cabal, and therefore need to be installed manually.
Here is one way to do this:

- On Ubuntu: `sudo apt-get install libghc-haskeline-dev`
- On Fedora: `sudo dnf install ghc-haskeline-devel`
## Installing from source code

**Obtaining**

To obtain the source code for the **release**,
download it from [GitHub](https://github.com/GrammaticalFramework/gf-core/releases).

Alternatively, to obtain the **latest version** of the source code:

1. If you haven't already, clone the repository with:
```
git clone https://github.com/GrammaticalFramework/gf-core.git
```
2. If you've already cloned the repository previously, update with:
```
git pull
```

**Installing**

You can then install with:
```
cabal install
```
@@ -135,12 +125,10 @@ or, if you're a Stack user:
stack install
```
<!--The above notes for installing from source apply also in these cases.-->
For more info on working with the GF source code, see the
[GF Developers Guide](../doc/gf-developers.html).
For macOS Sequoia, you need to downgrade the LLVM package, see instructions [here](https://github.com/GrammaticalFramework/gf-core/issues/172#issuecomment-2599365457).
## Installing the Python bindings from PyPI

The Python library is available on PyPI as `pgf`, so it can be installed using:
@@ -1,191 +0,0 @@
---
title: Grammatical Framework Download and Installation
date: 8 August 2025
---
**GF 3.12** was released on 8 August 2025.
What's new? See the [release notes](release-3.12.html).
#### Note: GF core and the RGL
The following instructions explain how to install **GF core**, i.e. the compiler, shell and run-time systems.
Obtaining the **Resource Grammar Library (RGL)** is done separately; see the section [at the bottom of this page](#installing-the-rgl-from-a-binary-release).
---
## Installing from a binary package
Binary packages are available for Debian/Ubuntu, macOS, and Windows and include:
- GF shell and grammar compiler
- `gf -server` mode
- C run-time system
- Python bindings to the C run-time system
[Binary packages on GitHub](https://github.com/GrammaticalFramework/gf-core/releases/tag/3.12)
#### Debian/Ubuntu
The package targets Ubuntu 24.04 (Noble).
To install it, use:
```
sudo apt install ./gf-3.12-ubuntu-24.04.deb
```
#### macOS
If you are on an Intel Mac (2019 or older), use `gf-3.12-macos-intel.pkg`.<br>
For newer ARM-based Macs (Apple Silicon M1, M2, M3), use `gf-3.12-macos-arm.pkg`.
After downloading, right-click on the file and choose Open.[^1]
You will see a dialog saying that "macOS cannot verify the developer of 'gf-3.12-macos-intel.pkg'. Are you sure you want to open it?"
Press Open.
[^1]: If you just double-click on the file, you will get an error message saying that "gf-3.12-macos-intel.pkg" cannot be opened because it is from an unidentified developer.
#### Windows
To install the package:
1. Unpack it anywhere and take note of the full path to the folder containing the `.exe` file.
2. Add that folder to the `PATH` environment variable.
For more information, see [Using GF on Windows](https://www.grammaticalframework.org/~inari/gf-windows.html) (last updated for Windows 10).
## Installing from Hackage
_Instructions applicable for macOS, Linux, and WSL2 on Windows._
[GF is on Hackage](http://hackage.haskell.org/package/gf), so under
normal circumstances the procedure is fairly simple:
```
cabal update
cabal install gf-3.12
```
### Notes
#### GHC version
The GF source code is known to be compilable with GHC versions 7.10 through to 9.6.7.
#### Obtaining Haskell
There are various ways of obtaining Haskell, including:
- ghcup
1. Install from https://www.haskell.org/ghcup/
2. `ghcup install ghc 9.6.7`
3. `ghcup set ghc 9.6.7`
- Stack: https://haskellstack.org/
#### Installation location
The above steps install GF for a single user.
The executables are put in `$HOME/.cabal/bin` (or on macOS in `$HOME/Library/Haskell/bin`),
so you might want to add this directory to your path (in `.bash_profile` or similar):
```
PATH=$HOME/.cabal/bin:$PATH
```
#### Haskeline
GF uses [`haskeline`](http://hackage.haskell.org/package/haskeline), which
on Linux depends on some non-Haskell libraries that won't be installed
automatically by Cabal, and therefore need to be installed manually.
Here is one way to do this:
- On Ubuntu: `sudo apt-get install libghc-haskeline-dev`
- On Fedora: `sudo dnf install ghc-haskeline-devel`
## Installing from source code
### Obtaining
To obtain the source code for the **release**,
download it from [GitHub](https://github.com/GrammaticalFramework/gf-core/releases).
Alternatively, to obtain the **latest version** of the source code:
1. If you haven't already, clone the repository with:
```
git clone https://github.com/GrammaticalFramework/gf-core.git
```
2. If you've already cloned the repository previously, update with:
```
git pull
```
### Installing
You can then install with:
```
cabal install
```
or, if you're a Stack user:
```
stack install
```
<!--The above notes for installing from source apply also in these cases.-->
For more info on working with the GF source code, see the
[GF Developers Guide](../doc/gf-developers.html).
## Installing the Python bindings from PyPI
The Python library is available on PyPI as `pgf`, so it can be installed using:
```
pip install pgf
```
If this doesn't work, you will need to install the C runtime manually; see the instructions [here](https://www.grammaticalframework.org/doc/gf-developers.html#toc12).
---
## Installing the RGL from a binary release
Binary releases of the RGL are made available on [GitHub](https://github.com/GrammaticalFramework/gf-rgl/releases).
In general the steps to follow are:
1. Download a binary release and extract it somewhere on your system.
2. Set the environment variable `GF_LIB_PATH` to point to wherever you extracted the RGL.
For more information, see [Using GF on Windows](https://www.grammaticalframework.org/~inari/gf-windows.html) (last updated for Windows 10).
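For step 2, on Linux or macOS the variable can be set in the shell. This is a minimal sketch, in which `/opt/gf-rgl` is only a hypothetical location standing in for wherever you actually extracted the release:

```shell
# Point GF_LIB_PATH at the extracted RGL so that gf can find the library.
# /opt/gf-rgl is a hypothetical path; substitute your own extraction directory.
export GF_LIB_PATH=/opt/gf-rgl

# Confirm that the value is visible in the current environment.
echo "$GF_LIB_PATH"
```

To make the setting persistent, add the `export` line to `.bash_profile` (or your shell's equivalent).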
## Installing the RGL from source
To compile the RGL, you will need to have GF already installed and in your path.
1. Obtain the RGL source code, either by:
- cloning with `git clone https://github.com/GrammaticalFramework/gf-rgl.git`
- downloading a source archive [here](https://github.com/GrammaticalFramework/gf-rgl/archive/master.zip)
2. Run `make` in the source code folder.
For more options, see the [RGL README](https://github.com/GrammaticalFramework/gf-rgl/blob/master/README.md).
---
## Older releases
- [GF 3.11](index-3.11.html) (July 2021)
- [GF 3.10](index-3.10.html) (December 2018)
- [GF 3.9](index-3.9.html) (August 2017)
- [GF 3.8](index-3.8.html) (June 2016)
- [GF 3.7.1](index-3.7.1.html) (October 2015)
- [GF 3.7](index-3.7.html) (June 2015)
- [GF 3.6](index-3.6.html) (June 2014)
- [GF 3.5](index-3.5.html) (August 2013)
- [GF 3.4](index-3.4.html) (January 2013)
- [GF 3.3.3](index-3.3.3.html) (March 2012)
- [GF 3.3](index-3.3.html) (October 2011)
- [GF 3.2.9](index-3.2.9.html) source-only snapshot (September 2011)
- [GF 3.2](index-3.2.html) (December 2010)
- [GF 3.1.6](index-3.1.6.html) (April 2010)


@@ -1,8 +1,8 @@
<html>
<head>
-<meta http-equiv="refresh" content="0; URL=/download/index-3.11.html" />
+<meta http-equiv="refresh" content="0; URL=/download/index-3.10.html" />
</head>
<body>
-You are being redirected to <a href="index-3.12.html">the current version</a> of this page.
+You are being redirected to <a href="index-3.10.html">the current version</a> of this page.
</body>
</html>


@@ -1,7 +1,7 @@
---
title: GF 3.11 Release Notes
-date: 25 July 2021
----
+date: ... December 2020
+...
## Installation
@@ -12,27 +12,24 @@ See the [download page](index-3.11.html).
From this release, the binary GF core packages do not contain the RGL.
The RGL's release cycle is now completely separate from GF's. See [RGL releases](https://github.com/GrammaticalFramework/gf-rgl/releases).
-Over 500 changes have been pushed to GF core
+Over 400 changes have been pushed to GF core
since the release of GF 3.10 in December 2018.
## General
- Make the test suite work again.
- Compatibility with new versions of GHC, including multiple Stack files for the different versions.
-- Support for newer version of Ubuntu 20.04 in the precompiled binaries.
-- Updates to build scripts and CI workflows.
-- Bug fixes and code cleanup.
+- Updates to build scripts and CI.
+- Bug fixes.
## GF compiler and run-time library
+- Huge improvements in time & space requirements for grammar compilation (pending [#87](https://github.com/GrammaticalFramework/gf-core/pull/87)).
- Add CoNLL output to `visualize_tree` shell command.
- Add canonical GF as output format in the compiler.
- Add PGF JSON as output format in the compiler.
- Deprecate JavaScript runtime in favour of updated [TypeScript runtime](https://github.com/GrammaticalFramework/gf-typescript).
-- Improvements in time & space requirements when compiling certain grammars.
- Improvements to Haskell export.
-- Improvements to the GF shell.
-- Improvements to canonical GF compilation.
- Improvements to the C runtime.
- Improvements to `gf -server` mode.
- Clearer compiler error messages.


@@ -1,37 +0,0 @@
---
title: GF 3.12 Release Notes
date: 08 August 2025
---
## Installation
See the [download page](index-3.12.html).
## What's new
This release adds support for Apple Silicon M1 Mac computers and newer versions of GHC, along with various improvements and bug fixes.
Over 70 commits have been merged to gf-core since the release of GF 3.11 in July 2021.
## General
- Support for ARM, allowing GF to run on Mac computers with Apple Silicon M1
- Support for newer versions of GHC (8.10.7, 9.0.2, 9.2.4, 9.4, 9.6.7)
- Support compiling with Nix
- Better error messages
- Improvements to several GF shell commands
- Several bug fixes and performance improvements
- Temporarily dropped support for Java bindings
## GF compiler and run-time library
- Syntactic sugar for table update: `table {cases ; vvv => t \! vvv}.t` can now be written as `t ** { cases }`
- Adjust the `-view` command depending on the OS
- Improve output of the `visualize_dependencies` (`vd`) command for large dependency trees
- Reintroduce syntactic transfer with `pt -transfer` and fix a bug in `pt -compute`
- Bug fix: apply `gt` to all arguments when piped
- Fix many "Invalid character" messages by always encoding GF files in UTF-8
- Improve performance with long extend-lists
- Improve syntax error messages
- Add support for BIND tokens in the Python bindings
- Allow compilation with emscripten
## Other
- Add support for Visual Studio Code

flake.lock (generated)

@@ -1,43 +0,0 @@
{
"nodes": {
"nixpkgs": {
"locked": {
"lastModified": 1704290814,
"narHash": "sha256-LWvKHp7kGxk/GEtlrGYV68qIvPHkU9iToomNFGagixU=",
"owner": "NixOS",
"repo": "nixpkgs",
"rev": "70bdadeb94ffc8806c0570eb5c2695ad29f0e421",
"type": "github"
},
"original": {
"owner": "NixOS",
"ref": "nixos-23.05",
"repo": "nixpkgs",
"type": "github"
}
},
"root": {
"inputs": {
"nixpkgs": "nixpkgs",
"systems": "systems"
}
},
"systems": {
"locked": {
"lastModified": 1681028828,
"narHash": "sha256-Vy1rq5AaRuLzOxct8nz4T6wlgyUR7zLU309k9mBC768=",
"owner": "nix-systems",
"repo": "default",
"rev": "da67096a3b9bf56a91d16901293e51ba5b49a27e",
"type": "github"
},
"original": {
"owner": "nix-systems",
"repo": "default",
"type": "github"
}
}
},
"root": "root",
"version": 7
}


@@ -1,50 +0,0 @@
{
inputs = {
nixpkgs.url = "github:NixOS/nixpkgs/nixos-23.05";
systems.url = "github:nix-systems/default";
};
nixConfig = {
# extra-trusted-public-keys =
# "devenv.cachix.org-1:w1cLUi8dv3hnoSPGAuibQv+f9TZLr6cv/Hm9XgU50cw=";
# extra-substituters = "https://devenv.cachix.org";
};
outputs = { self, nixpkgs, systems, ... }@inputs:
let forEachSystem = nixpkgs.lib.genAttrs (import systems);
in {
packages = forEachSystem (system:
let
pkgs = nixpkgs.legacyPackages.${system};
haskellPackages = pkgs.haskell.packages.ghc925.override {
overrides = self: _super: {
cgi = pkgs.haskell.lib.unmarkBroken (pkgs.haskell.lib.dontCheck
(self.callHackage "cgi" "3001.5.0.1" { }));
};
};
in {
gf = pkgs.haskell.lib.overrideCabal
(haskellPackages.callCabal2nixWithOptions "gf" self "--flag=-server"
{ }) (_old: {
# Fix utf8 encoding problems
patches = [
# Already applied in master
# (
# pkgs.fetchpatch {
# url = "https://github.com/anka-213/gf-core/commit/6f1ca05fddbcbc860898ddf10a557b513dfafc18.patch";
# sha256 = "17vn3hncxm1dwbgpfmrl6gk6wljz3r28j191lpv5zx741pmzgbnm";
# }
# )
./nix/expose-all.patch
./nix/revert-new-cabal-madness.patch
];
jailbreak = true;
# executableSystemDepends = [
# (pkgs.ncurses.override { enableStatic = true; })
# ];
# executableHaskellDepends = [ ];
});
});
};
}


@@ -2,7 +2,7 @@ concrete FoodIta of Food = {
lincat
Comment, Item, Kind, Quality = Str ;
lin
Pred item quality = item ++ "è" ++ quality ;
This kind = "questo" ++ kind ;
That kind = "quel" ++ kind ;
Mod quality kind = kind ++ quality ;


@@ -32,5 +32,5 @@ resource ResIta = open Prelude in {
in
adjective nero (ner+"a") (ner+"i") (ner+"e") ;
copula : Number => Str =
table {Sg => "è" ; Pl => "sono"} ;
}


@@ -9,12 +9,12 @@ instance LexFoodsFin of LexFoods =
fish_N = mkN "kala" ;
fresh_A = mkA "tuore" ;
warm_A = mkA
(mkN "lämmin" "lämpimän" "lämmintä" "lämpimänä" "lämpimään"
"lämpiminä" "lämpimiä" "lämpimien" "lämpimissä" "lämpimiin"
)
"lämpimämpi" "lämpimin" ;
italian_A = mkA "italialainen" ;
expensive_A = mkA "kallis" ;
delicious_A = mkA "herkullinen" ;
boring_A = mkA "tylsä" ;
}


@@ -5,12 +5,12 @@ instance LexFoodsGer of LexFoods =
oper
wine_N = mkN "Wein" ;
pizza_N = mkN "Pizza" "Pizzen" feminine ;
cheese_N = mkN "Käse" "Käse" masculine ;
fish_N = mkN "Fisch" ;
fresh_A = mkA "frisch" ;
warm_A = mkA "warm" "wärmer" "wärmste" ;
italian_A = mkA "italienisch" ;
expensive_A = mkA "teuer" ;
delicious_A = mkA "köstlich" ;
boring_A = mkA "langweilig" ;
}


@@ -7,10 +7,10 @@ instance LexFoodsSwe of LexFoods =
pizza_N = mkN "pizza" ;
cheese_N = mkN "ost" ;
fish_N = mkN "fisk" ;
fresh_A = mkA "färsk" ;
warm_A = mkA "varm" ;
italian_A = mkA "italiensk" ;
expensive_A = mkA "dyr" ;
delicious_A = mkA "läcker" ;
boring_A = mkA "tråkig" ;
}


@@ -6,7 +6,7 @@ concrete QueryFin of Query = {
Odd = pred "pariton" ;
Prime = pred "alkuluku" ;
Number i = i.s ;
Yes = "kyllä" ;
No = "ei" ;
oper
pred : Str -> Str -> Str = \f,x -> "onko" ++ x ++ f ;


@@ -46,7 +46,7 @@ oper
Avere =>
mkVerb "avere" "ho" "hai" "ha" "abbiamo" "avete" "hanno" "avuto" Avere ;
Essere =>
mkVerb "essere" "sono" "sei" "è" "siamo" "siete" "sono" "stato" Essere
} ;
agrPart : Verb -> Agr -> ClitAgr -> Str = \v,a,c -> case v.aux of {

gf.cabal

@@ -1,24 +1,19 @@
name: gf
-version: 3.12.0
-cabal-version: 1.22
-build-type: Simple
+version: 3.10.4-git
+cabal-version: >= 1.22
+build-type: Custom
license: OtherLicense
license-file: LICENSE
category: Natural Language Processing, Compiler
synopsis: Grammatical Framework
description: GF, Grammatical Framework, is a programming language for multilingual grammar applications
-maintainer: John J. Camilleri <john@digitalgrammars.com>
-homepage: https://www.grammaticalframework.org/
+homepage: http://www.grammaticalframework.org/
bug-reports: https://github.com/GrammaticalFramework/gf-core/issues
-tested-with: GHC==7.10.3, GHC==8.0.2, GHC==8.10.4, GHC==9.0.2, GHC==9.2.4, GHC==9.6.7
+maintainer: Thomas Hallgren
+tested-with: GHC==7.10.3, GHC==8.0.2, GHC==8.2.2, GHC==8.4.3
data-dir: src
-extra-source-files:
-README.md
-CHANGELOG.md
-WebSetup.hs
-doc/Logos/gf0.png
data-files:
www/*.html
www/*.css
@@ -44,17 +39,25 @@ data-files:
www/translator/*.css
www/translator/*.js
custom-setup
setup-depends:
base,
Cabal >=1.22.0.0,
directory,
filepath,
process >=1.0.1.1
source-repository head
type: git
location: https://github.com/GrammaticalFramework/gf-core.git
flag interrupt
Description: Enable Ctrl+Break in the shell
Default: True
flag server
Description: Include --server mode
Default: True
flag network-uri
description: Get Network.URI from the network-uri package
@@ -66,29 +69,27 @@ flag network-uri
flag c-runtime
Description: Include functionality from the C run-time library (which must be installed already)
Default: False
library
default-language: Haskell2010
build-depends:
-- GHC 8.0.2 to GHC 8.10.4
array >= 0.5.1 && < 0.6,
base >= 4.9.1 && < 4.22,
bytestring >= 0.10.8 && < 0.12,
containers >= 0.5.7 && < 0.7,
exceptions >= 0.8.3 && < 0.11,
ghc-prim >= 0.5.0 && <= 0.10.0,
mtl >= 2.2.1 && <= 2.3.1,
pretty >= 1.1.3 && < 1.2,
random >= 1.1 && < 1.3,
utf8-string >= 1.0.1.1 && < 1.1
if impl(ghc<8.0)
build-depends:
-- We need this in order for ghc-7.10 to build
transformers-compat >= 0.6.3 && < 0.7,
fail >= 4.9.0 && < 4.10
Library
default-language: Haskell2010
build-depends: base >= 4.6 && <5,
array,
containers,
bytestring,
utf8-string,
random,
pretty,
mtl,
exceptions,
fail,
-- For compatability with ghc < 8
-- We need transformers-compat >= 0.6.3, but that is only in newer snapshots where it is redundant.
transformers-compat,
ghc-prim,
text,
hashable,
unordered-containers
hs-source-dirs: src/runtime/haskell
other-modules:
@@ -103,12 +104,13 @@ library
--ghc-options: -fwarn-unused-imports
--if impl(ghc>=7.8)
-- ghc-options: +RTS -A20M -RTS
--- ghc-prof-options: -fprof-auto
+ghc-prof-options: -fprof-auto
exposed-modules:
PGF
PGF.Internal
PGF.Haskell
+LPGF
other-modules:
PGF.Data
@@ -137,29 +139,18 @@ library
if flag(c-runtime)
exposed-modules: PGF2
-other-modules:
-PGF2.FFI
-PGF2.Expr
-PGF2.Type
-GF.Interactive2
-GF.Command.Commands2
-hs-source-dirs: src/runtime/haskell-bind
-build-tools: hsc2hs
+other-modules: PGF2.FFI PGF2.Expr PGF2.Type
+GF.Interactive2 GF.Command.Commands2
+hs-source-dirs: src/runtime/haskell-bind
+build-tools: hsc2hs
extra-libraries: pgf gu
c-sources: src/runtime/haskell-bind/utils.c
cc-options: -std=c99
---- GF compiler as a library:
-build-depends:
-directory >= 1.3.0 && < 1.4,
-filepath >= 1.4.1 && < 1.5,
-haskeline >= 0.7.3 && < 0.9,
-json >= 0.9.1 && <= 0.11,
-parallel >= 3.2.1.1 && < 3.3,
-process >= 1.4.3 && < 1.7,
-time >= 1.6.0 && <= 1.12.2,
-template-haskell >= 2.13.0.0 && < 2.21
+build-depends: filepath, directory>=1.2, time,
+process, haskeline, parallel>=3, json
hs-source-dirs: src/compiler
exposed-modules:
@@ -170,19 +161,12 @@ library
GF.Grammar.Canonical
other-modules:
-GF.Main
-GF.Compiler
-GF.Interactive
+GF.Main GF.Compiler GF.Interactive
-GF.Compile
-GF.CompileInParallel
-GF.CompileOne
-GF.Compile.GetGrammar
+GF.Compile GF.CompileInParallel GF.CompileOne GF.Compile.GetGrammar
GF.Grammar
-GF.Data.Operations
-GF.Infra.Option
-GF.Infra.UseIO
+GF.Data.Operations GF.Infra.Option GF.Infra.UseIO
GF.Command.Abstract
GF.Command.CommandInfo
@@ -197,13 +181,14 @@ library
GF.Command.TreeOperations
GF.Compile.CFGtoPGF
GF.Compile.CheckGrammar
-GF.Compile.Compute.Concrete
+GF.Compile.Compute.ConcreteNew
GF.Compile.Compute.Predef
GF.Compile.Compute.Value
GF.Compile.ExampleBased
GF.Compile.Export
GF.Compile.GenerateBC
GF.Compile.GeneratePMCFG
+GF.Compile.GrammarToLPGF
GF.Compile.GrammarToPGF
GF.Compile.Multi
GF.Compile.Optimize
@@ -226,12 +211,14 @@ library
GF.Compile.TypeCheck.Concrete
GF.Compile.TypeCheck.ConcreteNew
GF.Compile.TypeCheck.Primitives
GF.Compile.TypeCheck.RConcrete
GF.Compile.TypeCheck.TC
GF.Compile.Update
GF.Data.BacktrackM
GF.Data.ErrM
GF.Data.Graph
GF.Data.Graphviz
GF.Data.IntMapBuilder
GF.Data.Relation
GF.Data.Str
GF.Data.Utilities
@@ -292,17 +279,12 @@ library
cpp-options: -DC_RUNTIME
if flag(server)
-build-depends:
-cgi >= 3001.3.0.2 && < 3001.6,
-httpd-shed >= 0.4.0 && < 0.5,
-network>=2.3 && <3.2
+build-depends: httpd-shed>=0.4.0.3, network>=2.3 && <2.7,
+cgi>=3001.2.2.0
if flag(network-uri)
-build-depends:
-network-uri >= 2.6.1.0 && < 2.7,
-network>=2.6 && <3.2
+build-depends: network-uri>=2.6, network>=2.6
else
-build-depends:
-network >= 2.5 && <3.2
+build-depends: network<2.6
cpp-options: -DSERVER_MODE
other-modules:
@@ -319,10 +301,7 @@ library
Fold
ExampleDemo
ExampleService
-hs-source-dirs:
-src/server
-src/server/transfer
-src/example-based
+hs-source-dirs: src/server src/server/transfer src/example-based
if flag(interrupt)
cpp-options: -DUSE_INTERRUPT
@@ -331,41 +310,26 @@ library
other-modules: GF.System.NoSignal
if impl(ghc>=7.8)
-build-tools:
-happy>=1.19,
-alex>=3.1
+build-tools: happy>=1.19, alex>=3.1
-- ghc-options: +RTS -A20M -RTS
else
-build-tools:
-happy,
-alex>=3
+build-tools: happy, alex>=3
ghc-options: -fno-warn-tabs
if os(windows)
-build-depends:
-Win32 >= 2.3.1.1 && < 2.7
+build-depends: Win32
else
-build-depends:
-terminfo >=0.4.0 && < 0.5
-if impl(ghc >= 9.6.6)
-build-depends: unix >= 2.8 && < 2.9
-else
-build-depends: unix >= 2.7.2 && < 2.8
+build-depends: unix, terminfo>=0.4
if impl(ghc>=8.2)
ghc-options: -fhide-source-paths
-executable gf
+Executable gf
hs-source-dirs: src/programs
main-is: gf-main.hs
default-language: Haskell2010
-build-depends:
-gf,
-base >= 4.9.1 && < 4.22
+build-depends: gf, base
ghc-options: -threaded
--ghc-options: -fwarn-unused-imports
@@ -374,35 +338,423 @@ executable gf
if impl(ghc<7.8)
ghc-options: -with-rtsopts=-K64M
--- ghc-prof-options: -auto-all
+ghc-prof-options: -auto-all
if impl(ghc>=8.2)
ghc-options: -fhide-source-paths
--- executable pgf-shell
--- --if !flag(c-runtime)
--- buildable: False
--- main-is: pgf-shell.hs
--- hs-source-dirs: src/runtime/haskell-bind/examples
--- build-depends:
--- gf,
--- base,
--- containers,
--- mtl,
--- lifted-base
--- default-language: Haskell2010
--- if impl(ghc>=7.0)
--- ghc-options: -rtsopts
+executable pgf-shell
+--if !flag(c-runtime)
+buildable: False
+main-is: pgf-shell.hs
+hs-source-dirs: src/runtime/haskell-bind/examples
+build-depends: gf, base, containers, mtl, lifted-base
+default-language: Haskell2010
+if impl(ghc>=7.0)
+ghc-options: -rtsopts
test-suite gf-tests
type: exitcode-stdio-1.0
main-is: run.hs
hs-source-dirs: testsuite
build-depends: base>=4.3 && <5, Cabal>=1.8, directory, filepath, process
default-language: Haskell2010
test-suite lpgf
type: exitcode-stdio-1.0
main-is: test.hs
hs-source-dirs:
src/compiler
src/runtime/haskell
testsuite/lpgf
other-modules:
Data.Binary
Data.Binary.Builder
Data.Binary.Get
Data.Binary.IEEE754
Data.Binary.Put
GF
GF.Command.Abstract
GF.Command.CommandInfo
GF.Command.Commands
GF.Command.CommonCommands
GF.Command.Help
GF.Command.Importing
GF.Command.Interpreter
GF.Command.Messages
GF.Command.Parse
GF.Command.SourceCommands
GF.Command.TreeOperations
GF.Compile
GF.Compile.CFGtoPGF
GF.Compile.CheckGrammar
GF.Compile.Compute.ConcreteNew
GF.Compile.Compute.Predef
GF.Compile.Compute.Value
GF.Compile.ConcreteToHaskell
GF.Compile.ExampleBased
GF.Compile.Export
GF.Compile.GenerateBC
GF.Compile.GeneratePMCFG
GF.Compile.GetGrammar
GF.Compile.GrammarToCanonical
GF.Compile.GrammarToLPGF
GF.Compile.GrammarToPGF
GF.Compile.Multi
GF.Compile.Optimize
GF.Compile.PGFtoHaskell
GF.Compile.PGFtoJava
GF.Compile.PGFtoJS
GF.Compile.PGFtoJSON
GF.Compile.PGFtoProlog
GF.Compile.PGFtoPython
GF.Compile.ReadFiles
GF.Compile.Rename
GF.Compile.SubExOpt
GF.Compile.Tags
GF.Compile.ToAPI
GF.Compile.TypeCheck.Abstract
GF.Compile.TypeCheck.ConcreteNew
GF.Compile.TypeCheck.Primitives
GF.Compile.TypeCheck.RConcrete
GF.Compile.TypeCheck.TC
GF.Compile.Update
GF.CompileInParallel
GF.CompileOne
GF.Compiler
GF.Data.BacktrackM
GF.Data.ErrM
GF.Data.Graph
GF.Data.Graphviz
GF.Data.IntMapBuilder
GF.Data.Operations
GF.Data.Relation
GF.Data.Str
GF.Data.Utilities
GF.Data.XML
GF.Grammar
GF.Grammar.Analyse
GF.Grammar.Binary
GF.Grammar.BNFC
GF.Grammar.Canonical
GF.Grammar.CanonicalJSON
GF.Grammar.CFG
GF.Grammar.EBNF
GF.Grammar.Grammar
GF.Grammar.Lexer
GF.Grammar.Lockfield
GF.Grammar.Lookup
GF.Grammar.Macros
GF.Grammar.Parser
GF.Grammar.PatternMatch
GF.Grammar.Predef
GF.Grammar.Printer
GF.Grammar.ShowTerm
GF.Grammar.Unify
GF.Grammar.Values
GF.Haskell
GF.Infra.BuildInfo
GF.Infra.CheckM
GF.Infra.Concurrency
GF.Infra.Dependencies
GF.Infra.GetOpt
GF.Infra.Ident
GF.Infra.Location
GF.Infra.Option
GF.Infra.SIO
GF.Infra.UseIO
GF.Interactive
GF.JavaScript.AbsJS
GF.JavaScript.PrintJS
GF.Main
GF.Quiz
GF.Speech.CFGToFA
GF.Speech.FiniteState
GF.Speech.GSL
GF.Speech.JSGF
GF.Speech.PGFToCFG
GF.Speech.PrRegExp
GF.Speech.RegExp
GF.Speech.SISR
GF.Speech.SLF
GF.Speech.SRG
GF.Speech.SRGS_ABNF
GF.Speech.SRGS_XML
GF.Speech.VoiceXML
GF.Support
GF.System.Catch
GF.System.Concurrency
GF.System.Console
GF.System.Directory
GF.System.Process
GF.System.Signal
GF.Text.Clitics
GF.Text.Coding
GF.Text.Lexing
GF.Text.Pretty
GF.Text.Transliterations
LPGF
PGF
PGF.Binary
PGF.ByteCode
PGF.CId
PGF.Data
PGF.Expr
PGF.Forest
PGF.Generate
PGF.Internal
PGF.Linearize
PGF.Macros
PGF.Morphology
PGF.OldBinary
PGF.Optimize
PGF.Paraphrase
PGF.Parse
PGF.Printer
PGF.Probabilistic
PGF.Tree
PGF.TrieMap
PGF.Type
PGF.TypeCheck
PGF.Utilities
PGF.VisualizeTree
Paths_gf
if flag(interrupt)
cpp-options: -DUSE_INTERRUPT
other-modules: GF.System.UseSignal
else
other-modules: GF.System.NoSignal
-build-depends:
-base >= 4.9.1 && < 4.22,
-Cabal >= 1.8,
-directory >= 1.3.0 && < 1.4,
-filepath >= 1.4.1 && < 1.5,
-process >= 1.4.3 && < 1.7
-build-tool-depends: gf:gf
+build-depends:
+ansi-terminal,
+array,
+base>=4.6 && <5,
+bytestring,
+containers,
+directory,
filepath,
ghc-prim,
hashable,
haskeline,
json,
mtl,
parallel>=3,
pretty,
process,
random,
terminfo,
text,
time,
transformers-compat,
unix,
unordered-containers,
utf8-string
default-language: Haskell2010
benchmark lpgf-bench
type: exitcode-stdio-1.0
main-is: bench.hs
hs-source-dirs:
src/compiler
src/runtime/haskell
testsuite/lpgf
other-modules:
Data.Binary
Data.Binary.Builder
Data.Binary.Get
Data.Binary.IEEE754
Data.Binary.Put
GF
GF.Command.Abstract
GF.Command.CommandInfo
GF.Command.Commands
GF.Command.CommonCommands
GF.Command.Help
GF.Command.Importing
GF.Command.Interpreter
GF.Command.Messages
GF.Command.Parse
GF.Command.SourceCommands
GF.Command.TreeOperations
GF.Compile
GF.Compile.CFGtoPGF
GF.Compile.CheckGrammar
GF.Compile.Compute.ConcreteNew
GF.Compile.Compute.Predef
GF.Compile.Compute.Value
GF.Compile.ConcreteToHaskell
GF.Compile.ExampleBased
GF.Compile.Export
GF.Compile.GenerateBC
GF.Compile.GeneratePMCFG
GF.Compile.GetGrammar
GF.Compile.GrammarToCanonical
GF.Compile.GrammarToLPGF
GF.Compile.GrammarToPGF
GF.Compile.Multi
GF.Compile.Optimize
GF.Compile.PGFtoHaskell
GF.Compile.PGFtoJS
GF.Compile.PGFtoJSON
GF.Compile.PGFtoJava
GF.Compile.PGFtoProlog
GF.Compile.PGFtoPython
GF.Compile.ReadFiles
GF.Compile.Rename
GF.Compile.SubExOpt
GF.Compile.Tags
GF.Compile.ToAPI
GF.Compile.TypeCheck.Abstract
GF.Compile.TypeCheck.ConcreteNew
GF.Compile.TypeCheck.Primitives
GF.Compile.TypeCheck.RConcrete
GF.Compile.TypeCheck.TC
GF.Compile.Update
GF.CompileInParallel
GF.CompileOne
GF.Compiler
GF.Data.BacktrackM
GF.Data.ErrM
GF.Data.Graph
GF.Data.Graphviz
GF.Data.IntMapBuilder
GF.Data.Operations
GF.Data.Relation
GF.Data.Str
GF.Data.Utilities
GF.Data.XML
GF.Grammar
GF.Grammar.Analyse
GF.Grammar.BNFC
GF.Grammar.Binary
GF.Grammar.CFG
GF.Grammar.Canonical
GF.Grammar.CanonicalJSON
GF.Grammar.EBNF
GF.Grammar.Grammar
GF.Grammar.Lexer
GF.Grammar.Lockfield
GF.Grammar.Lookup
GF.Grammar.Macros
GF.Grammar.Parser
GF.Grammar.PatternMatch
GF.Grammar.Predef
GF.Grammar.Printer
GF.Grammar.ShowTerm
GF.Grammar.Unify
GF.Grammar.Values
GF.Haskell
GF.Infra.BuildInfo
GF.Infra.CheckM
GF.Infra.Concurrency
GF.Infra.Dependencies
GF.Infra.GetOpt
GF.Infra.Ident
GF.Infra.Location
GF.Infra.Option
GF.Infra.SIO
GF.Infra.UseIO
GF.Interactive
GF.JavaScript.AbsJS
GF.JavaScript.PrintJS
GF.Main
GF.Quiz
GF.Speech.CFGToFA
GF.Speech.FiniteState
GF.Speech.GSL
GF.Speech.JSGF
GF.Speech.PGFToCFG
GF.Speech.PrRegExp
GF.Speech.RegExp
GF.Speech.SISR
GF.Speech.SLF
GF.Speech.SRG
GF.Speech.SRGS_ABNF
GF.Speech.SRGS_XML
GF.Speech.VoiceXML
GF.Support
GF.System.Catch
GF.System.Concurrency
GF.System.Console
GF.System.Directory
GF.System.Process
GF.System.Signal
GF.Text.Clitics
GF.Text.Coding
GF.Text.Lexing
GF.Text.Pretty
GF.Text.Transliterations
LPGF
PGF
PGF.Binary
PGF.ByteCode
PGF.CId
PGF.Data
PGF.Expr
PGF.Forest
PGF.Generate
PGF.Internal
PGF.Linearize
PGF.Macros
PGF.Morphology
PGF.OldBinary
PGF.Optimize
PGF.Paraphrase
PGF.Parse
PGF.Printer
PGF.Probabilistic
PGF.Tree
PGF.TrieMap
PGF.Type
PGF.TypeCheck
PGF.Utilities
PGF.VisualizeTree
PGF2
PGF2.Expr
PGF2.Type
PGF2.FFI
Paths_gf
if flag(interrupt)
cpp-options: -DUSE_INTERRUPT
other-modules: GF.System.UseSignal
else
other-modules: GF.System.NoSignal
hs-source-dirs:
src/runtime/haskell-bind
other-modules:
PGF2
PGF2.FFI
PGF2.Expr
PGF2.Type
build-tools: hsc2hs
extra-libraries: pgf gu
c-sources: src/runtime/haskell-bind/utils.c
cc-options: -std=c99
build-depends:
ansi-terminal,
array,
base>=4.6 && <5,
bytestring,
containers,
deepseq,
directory,
filepath,
ghc-prim,
hashable,
haskeline,
json,
mtl,
parallel>=3,
pretty,
process,
random,
terminfo,
text,
time,
transformers-compat,
unix,
unordered-containers,
utf8-string
default-language: Haskell2010


@@ -8,7 +8,7 @@
 <meta name="viewport" content="width=device-width, initial-scale=1, shrink-to-fit=no">
 <link rel="stylesheet" href="https://stackpath.bootstrapcdn.com/bootstrap/4.1.3/css/bootstrap.min.css" integrity="sha384-MCw98/SFnGE8fJT3GXwEOngsV7Zt27NXFoaoApmYm81iuXoPkFOJwJ8ERdknLPMO" crossorigin="anonymous">
-<link rel="stylesheet" href="https://use.fontawesome.com/releases/v5.15.4/css/all.css" crossorigin="anonymous">
+<link rel="stylesheet" href="https://use.fontawesome.com/releases/v5.4.2/css/all.css" integrity="sha384-/rXc/GQVaYpyDdyxK+ecHPVYJSN9bmVFBvjA/9eOB+pb3F2w2N6fc5qB9Ew5yIns" crossorigin="anonymous">
 <link rel="alternate" href="https://github.com/GrammaticalFramework/gf-core/" title="GF GitHub repository">
 </head>
@@ -57,7 +57,6 @@
 <li><a href="doc/gf-shell-reference.html">Shell Reference</a></li>
 <li><a href="http://www.molto-project.eu/sites/default/files/MOLTO_D2.3.pdf">Best Practices</a> <small>[PDF]</small></li>
 <li><a href="https://www.mitpressjournals.org/doi/pdf/10.1162/COLI_a_00378">Scaling Up (Computational Linguistics 2020)</a></li>
-<li><a href="https://inariksit.github.io/blog/">GF blog</a></li>
 </ul>
 <a href="lib/doc/synopsis/index.html" class="btn btn-primary ml-3">
@@ -86,22 +85,10 @@
 <div class="col-sm-6 col-md-3 mb-4">
 <h3>Contribute</h3>
 <ul class="mb-2">
-<li>
-<a href="https://discord.gg/EvfUsjzmaz">
-<i class="fab fa-discord"></i>
-Discord
-</a>
-</li>
-<li>
-<a href="https://stackoverflow.com/questions/tagged/gf">
-<i class="fab fa-stack-overflow"></i>
-Stack Overflow
-</a>
-</li>
-<li><a href="https://groups.google.com/group/gf-dev">Mailing List</a></li>
+<li><a href="http://groups.google.com/group/gf-dev">Mailing List</a></li>
 <li><a href="https://github.com/GrammaticalFramework/gf-core/issues">Issue Tracker</a></li>
-<li><a href="//school.grammaticalframework.org/">Summer School</a></li>
 <li><a href="doc/gf-people.html">Authors</a></li>
+<li><a href="//school.grammaticalframework.org/2020/">Summer School</a></li>
 </ul>
 <a href="https://github.com/GrammaticalFramework/" class="btn btn-primary ml-3">
 <i class="fab fa-github mr-1"></i>
@@ -167,7 +154,7 @@ least one, it may help you to get a first idea of what GF is.
 <div class="row">
 <div class="col-md-6">
-<h2>Applications & availability</h2>
+<h2>Applications & Availability</h2>
 <p>
 GF can be used for building
 <a href="//cloud.grammaticalframework.org/translator/">translation systems</a>,
@@ -227,50 +214,56 @@ least one, it may help you to get a first idea of what GF is.
 </p>
 <p>
-We run the <a href="https://discord.gg/EvfUsjzmaz">GF server on Discord</a>, where you are welcome to look for help with small questions or just start a general discussion.
+We run the IRC channel <strong><code>#gf</code></strong> on the Freenode network, where you are welcome to look for help with small questions or just start a general discussion.
+You can <a href="https://webchat.freenode.net/?channels=gf">open a web chat</a>
+or <a href="/irc/">browse the channel logs</a>.
 </p>
 <p>
-For bug reports and feature requests, please create an issue in the
-<a href="https://github.com/GrammaticalFramework/gf-core/issues">GF Core</a> or
-<a href="https://github.com/GrammaticalFramework/gf-rgl/issues">RGL</a> repository.
-For programming questions, consider asking them on <a href="https://stackoverflow.com/questions/tagged/gf">Stack Overflow with the <code>gf</code> tag</a>.
-If you have a more general question to the community, we recommend you ask it on the <a href="http://groups.google.com/group/gf-dev">mailing list</a>.
+If you have a larger question which the community may benefit from, we recommend you ask it on the <a href="http://groups.google.com/group/gf-dev">mailing list</a>.
 </p>
 </div>
 <div class="col-md-6">
 <h2>News</h2>
 <dl class="row">
-<dt class="col-sm-3 text-center text-nowrap">2025-08-08</dt>
+<dt class="col-sm-3 text-center text-nowrap">2021-03-01</dt>
 <dd class="col-sm-9">
-<strong>GF 3.12 released.</strong>
-<a href="download/release-3.12.html">Release notes</a>
-</dd>
-<dt class="col-sm-3 text-center text-nowrap">2025-01-18</dt>
-<dd class="col-sm-9">
-<a href="//school.grammaticalframework.org/2025/">9th GF Summer School</a>, in Gothenburg, Sweden, 18 &ndash; 29 August 2025.
-</dd>
-<dt class="col-sm-3 text-center text-nowrap">2023-01-24</dt>
-<dd class="col-sm-9">
-<a href="//school.grammaticalframework.org/2023/">8th GF Summer School</a>, in Tampere, Finland, 14 &ndash; 25 August 2023.
-</dd>
-<dt class="col-sm-3 text-center text-nowrap">2021-07-25</dt>
-<dd class="col-sm-9">
-<strong>GF 3.11 released.</strong>
-<a href="download/release-3.11.html">Release notes</a>
-</dd>
-<dt class="col-sm-3 text-center text-nowrap">2021-05-05</dt>
-<dd class="col-sm-9">
-<a href="https://cloud.grammaticalframework.org/wordnet/">GF WordNet</a> now supports languages for which there are no other WordNets. New additions: Afrikaans, German, Korean, Maltese, Polish, Somali, Swahili.
+<a href="//school.grammaticalframework.org/2020/">Seventh GF Summer School</a>, in Singapore and online, 26 July &ndash; 8 August 2021.
 </dd>
 <dt class="col-sm-3 text-center text-nowrap">2020-09-29</dt>
 <dd class="col-sm-9">
 <a href="https://www.mitpressjournals.org/doi/pdf/10.1162/COLI_a_00378">Abstract Syntax as Interlingua</a>: Scaling Up the Grammatical Framework from Controlled Languages to Robust Pipelines. A paper in Computational Linguistics (2020) summarizing much of the development in GF in the past ten years.
 </dd>
+<dt class="col-sm-3 text-center text-nowrap">2018-12-03</dt>
+<dd class="col-sm-9">
+<a href="//school.grammaticalframework.org/2018/">Sixth GF Summer School</a> in Stellenbosch (South Africa), 3&ndash;14 December 2018
+</dd>
+<dt class="col-sm-3 text-center text-nowrap">2018-12-02</dt>
+<dd class="col-sm-9">
+<strong>GF 3.10 released.</strong>
+<a href="download/release-3.10.html">Release notes</a>
+</dd>
+<dt class="col-sm-3 text-center text-nowrap">2018-07-25</dt>
+<dd class="col-sm-9">
+The GF repository has been split in two:
+<a href="https://github.com/GrammaticalFramework/gf-core">gf-core</a> and
+<a href="https://github.com/GrammaticalFramework/gf-rgl">gf-rgl</a>.
+The original <a href="https://github.com/GrammaticalFramework/GF">GF</a> repository is now archived.
+</dd>
+<dt class="col-sm-3 text-center text-nowrap">2017-08-11</dt>
+<dd class="col-sm-9">
+<strong>GF 3.9 released.</strong>
+<a href="download/release-3.9.html">Release notes</a>
+</dd>
+<dt class="col-sm-3 text-center text-nowrap">2017-06-29</dt>
+<dd class="col-sm-9">
+GF is moving to <a href="https://github.com/GrammaticalFramework/GF/">GitHub</a>.</dd>
+<dt class="col-sm-3 text-center text-nowrap">2017-03-13</dt>
+<dd class="col-sm-9">
+<a href="//school.grammaticalframework.org/2017/">GF Summer School</a> in Riga (Latvia), 14-25 August 2017
+</dd>
 </dl>
 <h2>Projects</h2>
@@ -340,7 +333,7 @@ least one, it may help you to get a first idea of what GF is.
 Libraries are at the heart of modern software engineering. In natural language
 applications, libraries are a way to cope with thousands of details involved in
 syntax, lexicon, and inflection. The
-<a href="lib/doc/synopsis/index.html">GF resource grammar library</a> (RGL) has
+<a href="lib/doc/synopsis/index.html">GF resource grammar library</a> has
 support for an increasing number of languages, currently including
 Afrikaans,
 Amharic (partial),


@@ -1,12 +0,0 @@
-diff --git a/gf.cabal b/gf.cabal
-index 0076e7638..8d3fe4b49 100644
---- a/gf.cabal
-+++ b/gf.cabal
-@@ -168,7 +168,6 @@ Library
-      GF.Text.Lexing
-      GF.Grammar.Canonical
-- other-modules:
-      GF.Main
-      GF.Compiler
-      GF.Interactive


@@ -1,193 +0,0 @@
commit 45e5473fcd5707af93646d9a116867a4d4e3e9c9
Author: Andreas Källberg <anka.213@gmail.com>
Date: Mon Oct 10 14:57:12 2022 +0200
Revert "workaround for the Nix madness"
This reverts commit 1294269cd60f3db7b056135104615625baeb528c.
There are easier workarounds, like using
cabal v1-build
etc. instead of just `cabal build`
These changes also broke a whole bunch of other stuff
diff --git a/README.md b/README.md
index ba35795a4..79e6ab68f 100644
--- a/README.md
+++ b/README.md
@@ -38,21 +38,6 @@ or:
```
stack install
```
-Note that if you are unlucky to have Cabal 3.0 or later, then it uses
-the so-called Nix style commands. Using those for GF development is
-a pain. Every time when you change something in the source code, Cabal
-will generate a new folder for GF to look for the GF libraries and
-the GF cloud. Either reinstall everything with every change in the
-compiler, or be sane and stop using cabal-install. Instead you can do:
-```
-runghc Setup.hs configure
-runghc Setup.hs build
-sudo runghc Setup.hs install
-```
-The script will install the GF dependencies globally. The only solution
-to the Nix madness that I found is radical:
-
- "No person, no problem" (Нет человека нет проблемы).
For more information, including links to precompiled binaries, see the [download page](https://www.grammaticalframework.org/download/index.html).
diff --git a/Setup.hs b/Setup.hs
index 58dc3e0c6..f8309cc00 100644
--- a/Setup.hs
+++ b/Setup.hs
@@ -4,68 +4,42 @@ import Distribution.Simple.LocalBuildInfo(LocalBuildInfo(..),absoluteInstallDirs
import Distribution.Simple.Setup(BuildFlags(..),Flag(..),InstallFlags(..),CopyDest(..),CopyFlags(..),SDistFlags(..))
import Distribution.PackageDescription(PackageDescription(..),emptyHookedBuildInfo)
import Distribution.Simple.BuildPaths(exeExtension)
-import System.Directory
import System.FilePath((</>),(<.>))
-import System.Process
-import Control.Monad(forM_,unless)
-import Control.Exception(bracket_)
-import Data.Char(isSpace)
import WebSetup
+-- | Notice about RGL not built anymore
+noRGLmsg :: IO ()
+noRGLmsg = putStrLn "Notice: the RGL is not built as part of GF anymore. See https://github.com/GrammaticalFramework/gf-rgl"
+
main :: IO ()
main = defaultMainWithHooks simpleUserHooks
- { preConf = gfPreConf
- , preBuild = gfPreBuild
+ { preBuild = gfPreBuild
, postBuild = gfPostBuild
, preInst = gfPreInst
, postInst = gfPostInst
, postCopy = gfPostCopy
}
where
- gfPreConf args flags = do
- pkgs <- fmap (map (dropWhile isSpace) . tail . lines)
- (readProcess "ghc-pkg" ["list"] "")
- forM_ dependencies $ \pkg -> do
- let name = takeWhile (/='/') (drop 36 pkg)
- unless (name `elem` pkgs) $ do
- let fname = name <.> ".tar.gz"
- callProcess "wget" [pkg,"-O",fname]
- callProcess "tar" ["-xzf",fname]
- removeFile fname
- bracket_ (setCurrentDirectory name) (setCurrentDirectory ".." >> removeDirectoryRecursive name) $ do
- exists <- doesFileExist "Setup.hs"
- unless exists $ do
- writeFile "Setup.hs" (unlines [
- "import Distribution.Simple",
- "main = defaultMain"
- ])
- let to_descr = reverse .
- (++) (reverse ".cabal") .
- drop 1 .
- dropWhile (/='-') .
- reverse
- callProcess "wget" [to_descr pkg, "-O", to_descr name]
- callProcess "runghc" ["Setup.hs","configure"]
- callProcess "runghc" ["Setup.hs","build"]
- callProcess "sudo" ["runghc","Setup.hs","install"]
-
- preConf simpleUserHooks args flags
-
- gfPreBuild args = gfPre args . buildDistPref
- gfPreInst args = gfPre args . installDistPref
+ gfPreBuild args = gfPre args . buildDistPref
+ gfPreInst args = gfPre args . installDistPref
gfPre args distFlag = do
return emptyHookedBuildInfo
gfPostBuild args flags pkg lbi = do
+ -- noRGLmsg
let gf = default_gf lbi
buildWeb gf flags (pkg,lbi)
gfPostInst args flags pkg lbi = do
+ -- noRGLmsg
+ saveInstallPath args flags (pkg,lbi)
installWeb (pkg,lbi)
gfPostCopy args flags pkg lbi = do
+ -- noRGLmsg
+ saveCopyPath args flags (pkg,lbi)
copyWeb flags (pkg,lbi)
-- `cabal sdist` will not make a proper dist archive, for that see `make sdist`
@@ -73,16 +47,27 @@ main = defaultMainWithHooks simpleUserHooks
gfSDist pkg lbi hooks flags = do
return ()
-dependencies = [
- "https://hackage.haskell.org/package/utf8-string-1.0.2/utf8-string-1.0.2.tar.gz",
- "https://hackage.haskell.org/package/json-0.10/json-0.10.tar.gz",
- "https://hackage.haskell.org/package/network-bsd-2.8.1.0/network-bsd-2.8.1.0.tar.gz",
- "https://hackage.haskell.org/package/httpd-shed-0.4.1.1/httpd-shed-0.4.1.1.tar.gz",
- "https://hackage.haskell.org/package/exceptions-0.10.5/exceptions-0.10.5.tar.gz",
- "https://hackage.haskell.org/package/stringsearch-0.3.6.6/stringsearch-0.3.6.6.tar.gz",
- "https://hackage.haskell.org/package/multipart-0.2.1/multipart-0.2.1.tar.gz",
- "https://hackage.haskell.org/package/cgi-3001.5.0.0/cgi-3001.5.0.0.tar.gz"
- ]
+saveInstallPath :: [String] -> InstallFlags -> (PackageDescription, LocalBuildInfo) -> IO ()
+saveInstallPath args flags bi = do
+ let
+ dest = NoCopyDest
+ dir = datadir (uncurry absoluteInstallDirs bi dest)
+ writeFile dataDirFile dir
+
+saveCopyPath :: [String] -> CopyFlags -> (PackageDescription, LocalBuildInfo) -> IO ()
+saveCopyPath args flags bi = do
+ let
+ dest = case copyDest flags of
+ NoFlag -> NoCopyDest
+ Flag d -> d
+ dir = datadir (uncurry absoluteInstallDirs bi dest)
+ writeFile dataDirFile dir
+
+-- | Name of file where installation's data directory is recording
+-- This is a last-resort way in which the seprate RGL build script
+-- can determine where to put the compiled RGL files
+dataDirFile :: String
+dataDirFile = "DATA_DIR"
-- | Get path to locally-built gf
default_gf :: LocalBuildInfo -> FilePath
diff --git a/gf.cabal b/gf.cabal
index a055b86be..d00a5b935 100644
--- a/gf.cabal
+++ b/gf.cabal
@@ -2,7 +2,7 @@ name: gf
version: 3.11.0-git
cabal-version: 1.22
-build-type: Simple
+build-type: Custom
license: OtherLicense
license-file: LICENSE
category: Natural Language Processing, Compiler
@@ -44,6 +44,14 @@ data-files:
www/translator/*.css
www/translator/*.js
+custom-setup
+ setup-depends:
+ base >= 4.9.1 && < 4.16,
+ Cabal >= 1.22.0.0,
+ directory >= 1.3.0 && < 1.4,
+ filepath >= 1.4.1 && < 1.5,
+ process >= 1.0.1.1 && < 1.7
+
source-repository head
type: git
location: https://github.com/GrammaticalFramework/gf-core.git


@@ -1,10 +1,9 @@
-{-# LANGUAGE FlexibleInstances, UndecidableInstances, CPP #-}
+{-# LANGUAGE FlexibleInstances, UndecidableInstances #-}
 module GF.Command.Commands (
 PGFEnv,HasPGFEnv(..),pgf,mos,pgfEnv,pgfCommands,
 options,flags,
 ) where
 import Prelude hiding (putStrLn,(<>)) -- GHC 8.4.1 clash with Text.PrettyPrint
-import System.Info(os)
 import PGF
@@ -22,7 +21,6 @@ import GF.Infra.SIO
 import GF.Command.Abstract
 import GF.Command.CommandInfo
 import GF.Command.CommonCommands
-import qualified GF.Command.CommonCommands as Common
 import GF.Text.Clitics
 import GF.Quiz
@@ -167,15 +165,14 @@ pgfCommands = Map.fromList [
 synopsis = "generate random trees in the current abstract syntax",
 syntax = "gr [-cat=CAT] [-number=INT]",
 examples = [
-mkEx $ "gr -- one tree in the startcat of the current grammar, up to depth " ++ Common.default_depth_str,
+mkEx "gr -- one tree in the startcat of the current grammar",
 mkEx "gr -cat=NP -number=16 -- 16 trees in the category NP",
-mkEx "gr -cat=NP -depth=2 -- one tree in the category NP, up to depth 2",
 mkEx "gr -lang=LangHin,LangTha -cat=Cl -- Cl, both in LangHin and LangTha",
 mkEx "gr -probs=FILE -- generate with bias",
 mkEx "gr (AdjCN ? (UseN ?)) -- generate trees of form (AdjCN ? (UseN ?))"
 ],
 explanation = unlines [
-"Generates a list of random trees, by default one tree up to depth " ++ Common.default_depth_str ++ ".",
+"Generates a list of random trees, by default one tree.",
 "If a tree argument is given, the command completes the Tree with values to",
 "all metavariables in the tree. The generation can be biased by probabilities,",
 "given in a file in the -probs flag."
@@ -184,13 +181,13 @@ pgfCommands = Map.fromList [
 ("cat","generation category"),
 ("lang","uses only functions that have linearizations in all these languages"),
 ("number","number of trees generated"),
-("depth","the maximum generation depth (default: " ++ Common.default_depth_str ++ ")"),
+("depth","the maximum generation depth"),
 ("probs", "file with biased probabilities (format 'f 0.4' one by line)")
 ],
 exec = getEnv $ \ opts arg (Env pgf mos) -> do
 pgf <- optProbs opts (optRestricted opts pgf)
 gen <- newStdGen
-let dp = valIntOpts "depth" Common.default_depth opts
+let dp = valIntOpts "depth" 4 opts
 let ts = case mexp (toExprs arg) of
 Just ex -> generateRandomFromDepth gen pgf ex (Just dp)
 Nothing -> generateRandomDepth gen pgf (optType pgf opts) (Just dp)
@@ -201,28 +198,28 @@ pgfCommands = Map.fromList [
 synopsis = "generates a list of trees, by default exhaustive",
 explanation = unlines [
 "Generates all trees of a given category. By default, ",
-"the depth is limited to " ++ Common.default_depth_str ++ ", but this can be changed by a flag.",
+"the depth is limited to 4, but this can be changed by a flag.",
 "If a Tree argument is given, the command completes the Tree with values",
 "to all metavariables in the tree."
 ],
 flags = [
 ("cat","the generation category"),
-("depth","the maximum generation depth (default: " ++ Common.default_depth_str ++ ")"),
+("depth","the maximum generation depth"),
 ("lang","excludes functions that have no linearization in this language"),
 ("number","the number of trees generated")
 ],
 examples = [
-mkEx $ "gt -- all trees in the startcat, to depth " ++ Common.default_depth_str,
+mkEx "gt -- all trees in the startcat, to depth 4",
 mkEx "gt -cat=NP -number=16 -- 16 trees in the category NP",
 mkEx "gt -cat=NP -depth=2 -- trees in the category NP to depth 2",
 mkEx "gt (AdjCN ? (UseN ?)) -- trees of form (AdjCN ? (UseN ?))"
 ],
 exec = getEnv $ \ opts arg (Env pgf mos) -> do
 let pgfr = optRestricted opts pgf
-let dp = valIntOpts "depth" Common.default_depth opts
+let dp = valIntOpts "depth" 4 opts
-let ts = case toExprs arg of
-[] -> generateAllDepth pgfr (optType pgf opts) (Just dp)
-es -> concat [generateFromDepth pgfr e (Just dp) | e <- es]
+let ts = case mexp (toExprs arg) of
+Just ex -> generateFromDepth pgfr ex (Just dp)
+Nothing -> generateAllDepth pgfr (optType pgf opts) (Just dp)
 returnFromExprs $ take (optNumInf opts) ts
 }),
 ("i", emptyCommandInfo {
@@ -430,8 +427,7 @@ pgfCommands = Map.fromList [
 "are type checking and semantic computation."
 ],
 examples = [
-mkEx "pt -compute (plus one two) -- compute value",
-mkEx ("p \"the 4 dogs\" | pt -transfer=digits2numeral | l -- \"the four dogs\" ")
+mkEx "pt -compute (plus one two) -- compute value"
 ],
 exec = getEnv $ \ opts arg (Env pgf mos) ->
 returnFromExprs . takeOptNum opts . treeOps pgf opts $ toExprs arg,
@@ -745,7 +741,7 @@ pgfCommands = Map.fromList [
 Nothing -> do putStrLn ("unknown category of function identifier "++show id)
 return void
 [e] -> case inferExpr pgf e of
-Left tcErr -> errorWithoutStackTrace $ render (ppTcError tcErr)
+Left tcErr -> error $ render (ppTcError tcErr)
 Right (e,ty) -> do putStrLn ("Expression: "++showExpr [] e)
 putStrLn ("Type: "++showType [] ty)
 putStrLn ("Probability: "++show (probTree pgf e))
@@ -762,7 +758,7 @@ pgfCommands = Map.fromList [
 [] -> [parse_ pgf lang (optType pgf opts) (Just dp) s | lang <- optLangs pgf opts]
 open_typs -> [parseWithRecovery pgf lang (optType pgf opts) open_typs (Just dp) s | lang <- optLangs pgf opts]
 where
-dp = valIntOpts "depth" Common.default_depth opts
+dp = valIntOpts "depth" 4 opts
 fromParse opts = foldr (joinPiped . fromParse1 opts) void
@@ -886,15 +882,11 @@ pgfCommands = Map.fromList [
 Right ty -> ty
 Nothing -> error ("Can't parse '"++str++"' as a type")
 optViewFormat opts = valStrOpts "format" "png" opts
-optViewGraph opts = valStrOpts "view" open_cmd opts
+optViewGraph opts = valStrOpts "view" "open" opts
 optNum opts = valIntOpts "number" 1 opts
 optNumInf opts = valIntOpts "number" 1000000000 opts ---- 10^9
 takeOptNum opts = take (optNumInf opts)
-open_cmd | os == "linux" = "xdg-open"
-         | os == "mingw32" = "start"
-         | otherwise = "open"
 returnFromExprs es = return $ case es of
 [] -> pipeMessage "no trees found"
 _ -> fromExprs es
@@ -1027,7 +1019,3 @@ stanzas = map unlines . chop . lines where
 chop ls = case break (=="") ls of
 (ls1,[]) -> [ls1]
 (ls1,_:ls2) -> ls1 : chop ls2
-#if !(MIN_VERSION_base(4,9,0))
-errorWithoutStackTrace = error
-#endif


@@ -15,16 +15,9 @@ import GF.Command.Abstract --(isOpt,valStrOpts,prOpt)
 import GF.Text.Pretty
 import GF.Text.Transliterations
 import GF.Text.Lexing(stringOp,opInEnv)
-import Data.Char (isSpace)
 import qualified PGF as H(showCId,showExpr,toATree,toTrie,Trie(..))
--- store default generation depth in a variable and use everywhere
-default_depth :: Int
-default_depth = 5
-default_depth_str = show default_depth
 extend old new = Map.union (Map.fromList new) old -- Map.union is left-biased
 commonCommands :: (Monad m,MonadSIO m) => Map.Map String (CommandInfo m)
@@ -177,8 +170,7 @@ commonCommands = fmap (mapCommandExec liftSIO) $ Map.fromList [
 restrictedSystem $ syst ++ " <" ++ tmpi ++ " >" ++ tmpo
 fmap fromString $ restricted $ readFile tmpo,
 -}
-fmap (fromStrings . lines) . restricted . readShellProcess syst . unlines . map (dropWhile (=='\n')) $ toStrings $ arg,
+fmap fromString . restricted . readShellProcess syst $ toString arg,
 flags = [
 ("command","the system command applied to the argument")
 ],


@@ -18,8 +18,8 @@ import GF.Grammar.Parser (runP, pExp)
 import GF.Grammar.ShowTerm
 import GF.Grammar.Lookup (allOpers,allOpersTo)
 import GF.Compile.Rename(renameSourceTerm)
-import GF.Compile.Compute.Concrete(normalForm,resourceValues)
-import GF.Compile.TypeCheck.Concrete as TC(inferLType,ppType)
+import qualified GF.Compile.Compute.ConcreteNew as CN(normalForm,resourceValues)
+import GF.Compile.TypeCheck.RConcrete as TC(inferLType,ppType)
 import GF.Infra.Dependencies(depGraph)
 import GF.Infra.CheckM(runCheck)
@@ -259,7 +259,7 @@ checkComputeTerm os sgr t =
 ((t,_),_) <- runCheck $ do t <- renameSourceTerm sgr mo t
 inferLType sgr [] t
 let opts = modifyFlags (\fs->fs{optTrace=isOpt "trace" os})
-t1 = normalForm (resourceValues opts sgr) (L NoLoc identW) t
+t1 = CN.normalForm (CN.resourceValues opts sgr) (L NoLoc identW) t
 t2 = evalStr t1
 checkPredefError t2
 where


@@ -5,8 +5,6 @@ module GF.Command.TreeOperations (
 ) where
 import PGF(Expr,PGF,CId,compute,mkApp,unApp,unapply,unMeta,exprSize,exprFunctions)
-import PGF.Data(Expr(EApp,EFun))
-import PGF.TypeCheck(inferExpr)
 import Data.List
 type TreeOp = [Expr] -> [Expr]
@@ -18,17 +16,15 @@ allTreeOps :: PGF -> [(String,(String,Either TreeOp (CId -> TreeOp)))]
 allTreeOps pgf = [
 ("compute",("compute by using semantic definitions (def)",
 Left $ map (compute pgf))),
-("transfer",("apply this transfer function to all maximal subtrees of suitable type",
-Right $ \f -> map (transfer pgf f))), -- HL 12/24, modified from gf-3.3
 ("largest",("sort trees from largest to smallest, in number of nodes",
 Left $ largest)),
-("nub\t",("remove duplicate trees",
+("nub",("remove duplicate trees",
 Left $ nub)),
 ("smallest",("sort trees from smallest to largest, in number of nodes",
 Left $ smallest)),
 ("subtrees",("return all fully applied subtrees (stopping at abstractions), by default sorted from the largest",
 Left $ concatMap subtrees)),
-("funs\t",("return all fun functions appearing in the tree, with duplications",
+("funs",("return all fun functions appearing in the tree, with duplications",
 Left $ \es -> [mkApp f [] | e <- es, f <- exprFunctions e]))
 ]
@@ -52,18 +48,3 @@ subtrees :: Expr -> [Expr]
 subtrees t = t : case unApp t of
 Just (f,ts) -> concatMap subtrees ts
 _ -> [] -- don't go under abstractions
--- Apply transfer function f:C -> D to all maximal subtrees s:C of tree e and replace
--- these s by the values of f(s). This modifies the 'simple-minded transfer' of gf-3.3.
--- If applied to strict subtrees s of e, better use with f:C -> C only. HL 12/2024
-transfer :: PGF -> CId -> Expr -> Expr
-transfer pgf f e = case inferExpr pgf (appf e) of
-Left _err -> case e of
-EApp g a -> EApp (transfer pgf f g) (transfer pgf f a)
-_ -> e
-Right _ty -> case (compute pgf (appf e)) of
-v | v /= (appf e) -> v
-_ -> e -- default case of f, or f has no computation rule
-where
-appf = EApp (EFun f)


@@ -1,6 +1,7 @@
module GF.Compile (compileToPGF, link, batchCompile, srcAbsName) where module GF.Compile (compileToPGF, compileToLPGF, link, linkl, batchCompile, srcAbsName) where
import GF.Compile.GrammarToPGF(mkCanon2pgf) import GF.Compile.GrammarToPGF(mkCanon2pgf)
import GF.Compile.GrammarToLPGF(mkCanon2lpgf)
import GF.Compile.ReadFiles(ModEnv,getOptionsFromFile,getAllFiles, import GF.Compile.ReadFiles(ModEnv,getOptionsFromFile,getAllFiles,
importsOfModule) importsOfModule)
import GF.CompileOne(compileOne) import GF.CompileOne(compileOne)
@@ -14,7 +15,7 @@ import GF.Infra.UseIO(IOE,FullPath,liftIO,getLibraryDirectory,putIfVerb,
                       justModuleName,extendPathEnv,putStrE,putPointE)
import GF.Data.Operations(raise,(+++),err)
-import Control.Monad(foldM,when,(<=<),filterM,liftM)
+import Control.Monad(foldM,when,(<=<),filterM)
import GF.System.Directory(doesFileExist,getModificationTime)
import System.FilePath((</>),isRelative,dropFileName)
import qualified Data.Map as Map(empty,insert,elems) --lookup
@@ -24,12 +25,16 @@ import GF.Text.Pretty(render,($$),(<+>),nest)
import PGF.Internal(optimizePGF)
import PGF(PGF,defaultProbabilities,setProbabilities,readProbabilitiesFromFile)
import LPGF(LPGF)

-- | Compiles a number of source files and builds a 'PGF' structure for them.
-- This is a composition of 'link' and 'batchCompile'.
compileToPGF :: Options -> [FilePath] -> IOE PGF
compileToPGF opts fs = link opts . snd =<< batchCompile opts fs
compileToLPGF :: Options -> [FilePath] -> IOE LPGF
compileToLPGF opts fs = linkl opts . snd =<< batchCompile opts fs
-- | Link a grammar into a 'PGF' that can be used to 'PGF.linearize' and
-- 'PGF.parse' with the "PGF" run-time system.
link :: Options -> (ModuleName,Grammar) -> IOE PGF
@@ -42,6 +47,14 @@ link opts (cnc,gr) =
    return $ setProbabilities probs
           $ if flag optOptimizePGF opts then optimizePGF pgf else pgf
-- | Link a grammar into a 'LPGF' that can be used for linearization only.
linkl :: Options -> (ModuleName,Grammar) -> IOE LPGF
linkl opts (cnc,gr) =
putPointE Normal opts "linking ... " $ do
let abs = srcAbsName gr cnc
lpgf <- mkCanon2lpgf opts gr abs
return lpgf
-- | Returns the name of the abstract syntax corresponding to the named concrete syntax
srcAbsName gr cnc = err (const cnc) id $ abstractOfConcrete gr cnc
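The new `compileToLPGF` entry point mirrors `compileToPGF`: both run the same batch compilation and differ only in the linker applied to its result. A minimal sketch of that sharing pattern, with stub types standing in for the real GF `Options`, `Grammar`, `PGF`, and `LPGF` (all names and bodies here are illustrative, not the actual API):

```haskell
-- Hypothetical stand-ins for the GF types; only the composition pattern
-- `linker . snd =<< batchCompile` is the point of this sketch.
type Grammar = [String]
newtype PGF  = PGF  String deriving (Eq, Show)
newtype LPGF = LPGF Int    deriving (Eq, Show)

batchCompile :: [FilePath] -> IO (FilePath, Grammar)  -- (root module, grammar)
batchCompile fs = pure (head fs, fs)

link :: Grammar -> IO PGF        -- full linker: parsing and linearization
link = pure . PGF . unwords

linkl :: Grammar -> IO LPGF      -- LPGF linker: linearization only
linkl = pure . LPGF . length

compileToPGF :: [FilePath] -> IO PGF
compileToPGF fs = link . snd =<< batchCompile fs

compileToLPGF :: [FilePath] -> IO LPGF
compileToLPGF fs = linkl . snd =<< batchCompile fs
```

Keeping `batchCompile` shared means the LPGF backend reuses the whole front end and only the linking step is new.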

View File

@@ -130,5 +130,5 @@ cf2concr cfg = Concr Map.empty Map.empty
mkRuleName rule =
  case ruleName rule of
    CFObj n _ -> n
    _         -> wildCId

View File

@@ -27,9 +27,9 @@ import GF.Infra.Ident
import GF.Infra.Option
import GF.Compile.TypeCheck.Abstract
-import GF.Compile.TypeCheck.Concrete(computeLType,checkLType,inferLType,ppType)
-import qualified GF.Compile.TypeCheck.ConcreteNew as CN(checkLType,inferLType)
-import qualified GF.Compile.Compute.Concrete as CN(normalForm,resourceValues)
+import GF.Compile.TypeCheck.RConcrete
+import qualified GF.Compile.TypeCheck.ConcreteNew as CN
+import qualified GF.Compile.Compute.ConcreteNew as CN
import GF.Grammar
import GF.Grammar.Lexer
@@ -175,7 +175,7 @@ checkInfo opts cwd sgr (m,mo) c info = checkInModule cwd mo NoLoc empty $ do
    checkTyp gr typ
    case md of
      Just eqs -> mapM_ (\(L loc eq) -> mkCheck loc "the definition of function" $
                                        checkDef gr (m,c) typ eq) eqs
      Nothing  -> return ()
    return (AbsFun (Just (L loc typ)) ma md moper)
@@ -316,7 +316,7 @@ linTypeOfType cnc m typ = do
mkLinArg (i,(n,mc@(m,cat))) = do
  val <- lookLin mc
  let vars = mkRecType varLabel $ replicate n typeStr
      symb = argIdent n cat i
  rec <- if n==0 then return val else
           errIn (render ("extending" $$
                          nest 2 vars $$

View File

@@ -1,590 +1,3 @@
--- | Functions for computing the values of terms in the concrete syntax, in
--- | preparation for PMCFG generation.
-module GF.Compile.Compute.Concrete
+module GF.Compile.Compute.Concrete{-(module M)-} where
+--import GF.Compile.Compute.ConcreteLazy as M -- New
+--import GF.Compile.Compute.ConcreteStrict as M -- Old, inefficient
(GlobalEnv, GLocation, resourceValues, geLoc, geGrammar,
normalForm,
Value(..), Bind(..), Env, value2term, eval, vapply
) where
import Prelude hiding ((<>)) -- GHC 8.4.1 clash with Text.PrettyPrint
import GF.Grammar hiding (Env, VGen, VApp, VRecType)
import GF.Grammar.Lookup(lookupResDefLoc,allParamValues)
import GF.Grammar.Predef(cPredef,cErrorType,cTok,cStr,cTrace,cPBool)
import GF.Grammar.PatternMatch(matchPattern,measurePatt)
import GF.Grammar.Lockfield(isLockLabel,lockRecType) --unlockRecord,lockLabel
import GF.Compile.Compute.Value hiding (Error)
import GF.Compile.Compute.Predef(predef,predefName,delta)
import GF.Data.Str(Str,glueStr,str2strings,str,sstr,plusStr,strTok)
import GF.Data.Operations(Err,err,errIn,maybeErr,mapPairsM)
import GF.Data.Utilities(mapFst,mapSnd)
import GF.Infra.Option
import Control.Monad(ap,liftM,liftM2) -- ,unless,mplus
import Data.List (findIndex,intersect,nub,elemIndex,(\\)) --,isInfixOf
--import Data.Char (isUpper,toUpper,toLower)
import GF.Text.Pretty
import qualified Data.Map as Map
import Debug.Trace(trace)
-- * Main entry points
normalForm :: GlobalEnv -> L Ident -> Term -> Term
normalForm (GE gr rv opts _) loc = err (bugloc loc) id . nfx (GE gr rv opts loc)
nfx :: GlobalEnv -> Term -> Err Term
nfx env@(GE _ _ _ loc) t = do
v <- eval env [] t
return (value2term loc [] v)
-- Old value2term error message:
-- Left i -> fail ("variable #"++show i++" is out of scope")
eval :: GlobalEnv -> Env -> Term -> Err Value
eval (GE gr rvs opts loc) env t = ($ (map snd env)) # value cenv t
where
cenv = CE gr rvs opts loc (map fst env)
--apply env = apply' env
--------------------------------------------------------------------------------
-- * Environments
type ResourceValues = Map.Map ModuleName (Map.Map Ident (Err Value))
data GlobalEnv = GE Grammar ResourceValues Options GLocation
data CompleteEnv = CE {srcgr::Grammar,rvs::ResourceValues,
opts::Options,
gloc::GLocation,local::LocalScope}
type GLocation = L Ident
type LocalScope = [Ident]
type Stack = [Value]
type OpenValue = Stack->Value
geLoc (GE _ _ _ loc) = loc
geGrammar (GE gr _ _ _) = gr
ext b env = env{local=b:local env}
extend bs env = env{local=bs++local env}
global env = GE (srcgr env) (rvs env) (opts env) (gloc env)
var :: CompleteEnv -> Ident -> Err OpenValue
var env x = maybe unbound pick' (elemIndex x (local env))
where
unbound = fail ("Unknown variable: "++showIdent x)
pick' i = return $ \ vs -> maybe (err i vs) ok (pick i vs)
err i vs = bug $ "Stack problem: "++showIdent x++": "
++unwords (map showIdent (local env))
++" => "++show (i,length vs)
ok v = --trace ("var "++show x++" = "++show v) $
v
pick :: Int -> Stack -> Maybe Value
pick 0 (v:_) = Just v
pick i (_:vs) = pick (i-1) vs
pick i vs = Nothing -- bug $ "pick "++show (i,vs)
resource env (m,c) =
-- err bug id $
if isPredefCat c
then value0 env =<< lockRecType c defLinType -- hmm
else maybe e id $ Map.lookup c =<< Map.lookup m (rvs env)
where e = fail $ "Not found: "++render m++"."++showIdent c
-- | Convert operators once, not every time they are looked up
resourceValues :: Options -> SourceGrammar -> GlobalEnv
resourceValues opts gr = env
where
env = GE gr rvs opts (L NoLoc identW)
rvs = Map.mapWithKey moduleResources (moduleMap gr)
moduleResources m = Map.mapWithKey (moduleResource m) . jments
moduleResource m c _info = do L l t <- lookupResDefLoc gr (m,c)
let loc = L l c
qloc = L l (Q (m,c))
eval (GE gr rvs opts loc) [] (traceRes qloc t)
traceRes = if flag optTrace opts
then traceResource
else const id
-- * Tracing
-- | Insert a call to the trace function under the top-level lambdas
traceResource (L l q) t =
case termFormCnc t of
(abs,body) -> mkAbs abs (mkApp traceQ [args,body])
where
args = R $ tuple2record (K lstr:[Vr x|(bt,x)<-abs,bt==Explicit])
lstr = render (l<>":"<>ppTerm Qualified 0 q)
traceQ = Q (cPredef,cTrace)
-- * Computing values
-- | Computing the value of a top-level term
value0 :: CompleteEnv -> Term -> Err Value
value0 env = eval (global env) []
-- | Computing the value of a term
value :: CompleteEnv -> Term -> Err OpenValue
value env t0 =
-- Each term is traversed only once by this function, using only statically
-- available information. Notably, the values of lambda bound variables
-- will be unknown during the term traversal phase.
-- The result is an OpenValue, which is a function that may be applied many
-- times to different dynamic values, but without the term traversal overhead
-- and without recomputing other statically known information.
-- For this to work, there should be no recursive calls under lambdas here.
-- Whenever we need to construct the OpenValue function with an explicit
-- lambda, we have to lift the recursive calls outside the lambda.
-- (See e.g. the rules for Let, Prod and Abs)
{-
trace (render $ text "value"<+>sep [ppL (gloc env)<>text ":",
brackets (fsep (map ppIdent (local env))),
ppTerm Unqualified 10 t0]) $
--}
errIn (render t0) $
case t0 of
Vr x -> var env x
Q x@(m,f)
| m == cPredef -> if f==cErrorType -- to be removed
then let p = identS "P"
in const # value0 env (mkProd [(Implicit,p,typeType)] (Vr p) [])
else if f==cPBool
then const # resource env x
else const . flip VApp [] # predef f
| otherwise -> const # resource env x --valueResDef (fst env) x
QC x -> return $ const (VCApp x [])
App e1 e2 -> apply' env e1 . (:[]) =<< value env e2
Let (x,(oty,t)) body -> do vb <- value (ext x env) body
vt <- value env t
return $ \ vs -> vb (vt vs:vs)
Meta i -> return $ \ vs -> VMeta i (zip (local env) vs) []
Prod bt x t1 t2 ->
do vt1 <- value env t1
vt2 <- value (ext x env) t2
return $ \ vs -> VProd bt (vt1 vs) x $ Bind $ \ vx -> vt2 (vx:vs)
Abs bt x t -> do vt <- value (ext x env) t
return $ VAbs bt x . Bind . \ vs vx -> vt (vx:vs)
EInt n -> return $ const (VInt n)
EFloat f -> return $ const (VFloat f)
K s -> return $ const (VString s)
Empty -> return $ const (VString "")
Sort s | s == cTok -> return $ const (VSort cStr) -- to be removed
| otherwise -> return $ const (VSort s)
ImplArg t -> (VImplArg.) # value env t
Table p res -> liftM2 VTblType # value env p <# value env res
RecType rs -> do lovs <- mapPairsM (value env) rs
return $ \vs->VRecType $ mapSnd ($ vs) lovs
t@(ExtR t1 t2) -> ((extR t.)# both id) # both (value env) (t1,t2)
FV ts -> ((vfv .) # sequence) # mapM (value env) ts
R as -> do lovs <- mapPairsM (value env.snd) as
return $ \ vs->VRec $ mapSnd ($ vs) lovs
T i cs -> valueTable env i cs
V ty ts -> do pvs <- paramValues env ty
((VV ty pvs .) . sequence) # mapM (value env) ts
C t1 t2 -> ((ok2p vconcat.) # both id) # both (value env) (t1,t2)
S t1 t2 -> ((select env.) # both id) # both (value env) (t1,t2)
P t l -> --maybe (bug $ "project "++show l++" from "++show v) id $
do ov <- value env t
return $ \ vs -> let v = ov vs
in maybe (VP v l) id (proj l v)
Alts t tts -> (\v vts -> VAlts # v <# mapM (both id) vts) # value env t <# mapM (both (value env)) tts
Strs ts -> ((VStrs.) # sequence) # mapM (value env) ts
Glue t1 t2 -> ((ok2p (glue env).) # both id) # both (value env) (t1,t2)
ELin c r -> (unlockVRec (gloc env) c.) # value env r
EPatt p -> return $ const (VPatt p) -- hmm
EPattType ty -> do vt <- value env ty
return (VPattType . vt)
Typed t ty -> value env t
t -> fail.render $ "value"<+>ppTerm Unqualified 10 t $$ show t
vconcat vv@(v1,v2) =
case vv of
(VString "",_) -> v2
(_,VString "") -> v1
(VApp NonExist _,_) -> v1
(_,VApp NonExist _) -> v2
_ -> VC v1 v2
proj l v | isLockLabel l = return (VRec [])
---- a workaround 18/2/2005: take this away and find the reason
---- why earlier compilation destroys the lock field
proj l v =
case v of
VFV vs -> liftM vfv (mapM (proj l) vs)
VRec rs -> lookup l rs
-- VExtR v1 v2 -> proj l v2 `mplus` proj l v1 -- hmm
VS (VV pty pvs rs) v2 -> flip VS v2 . VV pty pvs # mapM (proj l) rs
_ -> return (ok1 VP v l)
ok1 f v1@(VError {}) _ = v1
ok1 f v1 v2 = f v1 v2
ok2 f v1@(VError {}) _ = v1
ok2 f _ v2@(VError {}) = v2
ok2 f v1 v2 = f v1 v2
ok2p f (v1@VError {},_) = v1
ok2p f (_,v2@VError {}) = v2
ok2p f vv = f vv
unlockVRec loc c0 v0 = v0
{-
unlockVRec loc c0 v0 = unlockVRec' c0 v0
where
unlockVRec' ::Ident -> Value -> Value
unlockVRec' c v =
case v of
-- VClosure env t -> err bug (VClosure env) (unlockRecord c t)
VAbs bt x (Bind f) -> VAbs bt x (Bind $ \ v -> unlockVRec' c (f v))
VRec rs -> plusVRec rs lock
-- _ -> VExtR v (VRec lock) -- hmm
_ -> {-trace (render $ ppL loc $ "unlock non-record "++show v0)-} v -- hmm
-- _ -> bugloc loc $ "unlock non-record "++show v0
where
lock = [(lockLabel c,VRec [])]
-}
-- suspicious, but backwards compatible
plusVRec rs1 rs2 = VRec ([(l,v)|(l,v)<-rs1,l `notElem` ls2] ++ rs2)
where ls2 = map fst rs2
extR t vv =
case vv of
(VFV vs,v2) -> vfv [extR t (v1,v2)|v1<-vs]
(v1,VFV vs) -> vfv [extR t (v1,v2)|v2<-vs]
(VRecType rs1, VRecType rs2) ->
case intersect (map fst rs1) (map fst rs2) of
[] -> VRecType (rs1 ++ rs2)
ls -> error $ "clash"<+>show ls
(VRec rs1, VRec rs2) -> plusVRec rs1 rs2
(v1 , VRec [(l,_)]) | isLockLabel l -> v1 -- hmm
(VS (VV t pvs vs) s,v2) -> VS (VV t pvs [extR t (v1,v2)|v1<-vs]) s
-- (v1,v2) -> ok2 VExtR v1 v2 -- hmm
(v1,v2) -> error $ "not records" $$ show v1 $$ show v2
where
error explain = ppbug $ "The term" <+> t
<+> "is not reducible" $$ explain
glue env (v1,v2) = glu v1 v2
where
glu v1 v2 =
case (v1,v2) of
(VFV vs,v2) -> vfv [glu v1 v2|v1<-vs]
(v1,VFV vs) -> vfv [glu v1 v2|v2<-vs]
(VString s1,VString s2) -> VString (s1++s2)
(v1,VAlts d vs) -> VAlts (glx d) [(glx v,c) | (v,c) <- vs]
where glx v2 = glu v1 v2
(v1@(VAlts {}),v2) ->
--err (const (ok2 VGlue v1 v2)) id $
err bug id $
do y' <- strsFromValue v2
x' <- strsFromValue v1
return $ vfv [foldr1 VC (map VString (str2strings (glueStr v u))) | v <- x', u <- y']
(VC va vb,v2) -> VC va (glu vb v2)
(v1,VC va vb) -> VC (glu v1 va) vb
(VS (VV ty pvs vs) vb,v2) -> VS (VV ty pvs [glu v v2|v<-vs]) vb
(v1,VS (VV ty pvs vs) vb) -> VS (VV ty pvs [glu v1 v|v<-vs]) vb
(v1@(VApp NonExist _),_) -> v1
(_,v2@(VApp NonExist _)) -> v2
-- (v1,v2) -> ok2 VGlue v1 v2
(v1,v2) -> if flag optPlusAsBind (opts env)
then VC v1 (VC (VApp BIND []) v2)
else let loc = gloc env
vt v = value2term loc (local env) v
-- Old value2term error message:
-- Left i -> Error ('#':show i)
originalMsg = render $ ppL loc (hang "unsupported token gluing" 4
(Glue (vt v1) (vt v2)))
term = render $ pp $ Glue (vt v1) (vt v2)
in error $ unlines
[originalMsg
,""
,"There was a problem in the expression `"++term++"`, either:"
,"1) You are trying to use + on runtime arguments, possibly via an oper."
,"2) One of the arguments in `"++term++"` is a bound variable from pattern matching a string, but the cases are non-exhaustive."
,"For more help see https://github.com/GrammaticalFramework/gf-core/tree/master/doc/errors/gluing.md"
]
-- | to get a string from a value that represents a sequence of terminals
strsFromValue :: Value -> Err [Str]
strsFromValue t = case t of
VString s -> return [str s]
VC s t -> do
s' <- strsFromValue s
t' <- strsFromValue t
return [plusStr x y | x <- s', y <- t']
{-
VGlue s t -> do
s' <- strsFromValue s
t' <- strsFromValue t
return [glueStr x y | x <- s', y <- t']
-}
VAlts d vs -> do
d0 <- strsFromValue d
v0 <- mapM (strsFromValue . fst) vs
c0 <- mapM (strsFromValue . snd) vs
--let vs' = zip v0 c0
return [strTok (str2strings def) vars |
def <- d0,
vars <- [[(str2strings v, map sstr c) | (v,c) <- zip vv c0] |
vv <- sequence v0]
]
VFV ts -> concat # mapM strsFromValue ts
VStrs ts -> concat # mapM strsFromValue ts
_ -> fail ("cannot get Str from value " ++ show t)
vfv vs = case nub vs of
[v] -> v
vs -> VFV vs
select env vv =
case vv of
(v1,VFV vs) -> vfv [select env (v1,v2)|v2<-vs]
(VFV vs,v2) -> vfv [select env (v1,v2)|v1<-vs]
(v1@(VV pty vs rs),v2) ->
err (const (VS v1 v2)) id $
do --ats <- allParamValues (srcgr env) pty
--let vs = map (value0 env) ats
i <- maybeErr "no match" $ findIndex (==v2) vs
return (ix (gloc env) "select" rs i)
(VT _ _ [(PW,Bind b)],_) -> {-trace "eliminate wild card table" $-} b []
(v1@(VT _ _ cs),v2) ->
err (\_->ok2 VS v1 v2) (err bug id . valueMatch env) $
match (gloc env) cs v2
(VS (VV pty pvs rs) v12,v2) -> VS (VV pty pvs [select env (v11,v2)|v11<-rs]) v12
(v1,v2) -> ok2 VS v1 v2
match loc cs v =
err bad return (matchPattern cs (value2term loc [] v))
-- Old value2term error message:
-- Left i -> bad ("variable #"++show i++" is out of scope")
where
bad = fail . ("In pattern matching: "++)
valueMatch :: CompleteEnv -> (Bind Env,Substitution) -> Err Value
valueMatch env (Bind f,env') = f # mapPairsM (value0 env) env'
valueTable :: CompleteEnv -> TInfo -> [Case] -> Err OpenValue
valueTable env i cs =
case i of
TComp ty -> do pvs <- paramValues env ty
((VV ty pvs .) # sequence) # mapM (value env.snd) cs
_ -> do ty <- getTableType i
cs' <- mapM valueCase cs
err (dynamic cs' ty) return (convert cs' ty)
where
dynamic cs' ty _ = cases cs' # value env ty
cases cs' vty vs = err keep ($ vs) (convertv cs' (vty vs))
where
keep msg = --trace (msg++"\n"++render (ppTerm Unqualified 0 (T i cs))) $
VT wild (vty vs) (mapSnd ($ vs) cs')
wild = case i of TWild _ -> True; _ -> False
convertv cs' vty =
convert' cs' =<< paramValues'' env (value2term (gloc env) [] vty)
-- Old value2term error message: Left i -> fail ("variable #"++show i++" is out of scope")
convert cs' ty = convert' cs' =<< paramValues' env ty
convert' cs' ((pty,vs),pvs) =
do sts <- mapM (matchPattern cs') vs
return $ \ vs -> VV pty pvs $ map (err bug id . valueMatch env)
(mapFst ($ vs) sts)
valueCase (p,t) = do p' <- measurePatt # inlinePattMacro p
pvs <- linPattVars p'
vt <- value (extend pvs env) t
return (p',\vs-> Bind $ \bs-> vt (push' p' bs pvs vs))
inlinePattMacro p =
case p of
PM qc -> do r <- resource env qc
case r of
VPatt p' -> inlinePattMacro p'
_ -> ppbug $ hang "Expected pattern macro:" 4
(show r)
_ -> composPattOp inlinePattMacro p
paramValues env ty = snd # paramValues' env ty
paramValues' env ty = paramValues'' env =<< nfx (global env) ty
paramValues'' env pty = do ats <- allParamValues (srcgr env) pty
pvs <- mapM (eval (global env) []) ats
return ((pty,ats),pvs)
push' p bs xs = if length bs/=length xs
then bug $ "push "++show (p,bs,xs)
else push bs xs
push :: Env -> LocalScope -> Stack -> Stack
push bs [] vs = vs
push bs (x:xs) vs = maybe err id (lookup x bs):push bs xs vs
where err = bug $ "Unbound pattern variable "++showIdent x
apply' :: CompleteEnv -> Term -> [OpenValue] -> Err OpenValue
apply' env t [] = value env t
apply' env t vs =
case t of
QC x -> return $ \ svs -> VCApp x (map ($ svs) vs)
{-
Q x@(m,f) | m==cPredef -> return $
let constr = --trace ("predef "++show x) .
VApp x
in \ svs -> maybe constr id (Map.lookup f predefs)
$ map ($ svs) vs
| otherwise -> do r <- resource env x
return $ \ svs -> vapply (gloc env) r (map ($ svs) vs)
-}
App t1 t2 -> apply' env t1 . (:vs) =<< value env t2
_ -> do fv <- value env t
return $ \ svs -> vapply (gloc env) (fv svs) (map ($ svs) vs)
vapply :: GLocation -> Value -> [Value] -> Value
vapply loc v [] = v
vapply loc v vs =
case v of
VError {} -> v
-- VClosure env (Abs b x t) -> beta gr env b x t vs
VAbs bt _ (Bind f) -> vbeta loc bt f vs
VApp pre vs1 -> delta' pre (vs1++vs)
where
delta' Trace (v1:v2:vs) = let vr = vapply loc v2 vs
in vtrace loc v1 vr
delta' pre vs = err msg vfv $ mapM (delta pre) (varyList vs)
--msg = const (VApp pre (vs1++vs))
msg = bug . (("Applying Predef."++showIdent (predefName pre)++": ")++)
VS (VV t pvs fs) s -> VS (VV t pvs [vapply loc f vs|f<-fs]) s
VFV fs -> vfv [vapply loc f vs|f<-fs]
VCApp f vs0 -> VCApp f (vs0++vs)
VMeta i env vs0 -> VMeta i env (vs0++vs)
VGen i vs0 -> VGen i (vs0++vs)
v -> bug $ "vapply "++show v++" "++show vs
vbeta loc bt f (v:vs) =
case (bt,v) of
(Implicit,VImplArg v) -> ap v
(Explicit, v) -> ap v
where
ap (VFV avs) = vfv [vapply loc (f v) vs|v<-avs]
ap v = vapply loc (f v) vs
vary (VFV vs) = vs
vary v = [v]
varyList = mapM vary
{-
beta env b x t (v:vs) =
case (b,v) of
(Implicit,VImplArg v) -> apply' (ext (x,v) env) t vs
(Explicit, v) -> apply' (ext (x,v) env) t vs
-}
vtrace loc arg res = trace (render (hang (pv arg) 4 ("->"<+>pv res))) res
where
pv v = case v of
VRec (f:as) -> hang (pf f) 4 (fsep (map pa as))
_ -> ppV v
pf (_,VString n) = pp n
pf (_,v) = ppV v
pa (_,v) = ppV v
ppV v = ppTerm Unqualified 10 (value2term' True loc [] v)
-- Old value2term error message:
-- Left i -> "variable #" <> pp i <+> "is out of scope"
-- | Convert a value back to a term
value2term :: GLocation -> [Ident] -> Value -> Term
value2term = value2term' False
value2term' :: Bool -> p -> [Ident] -> Value -> Term
value2term' stop loc xs v0 =
case v0 of
VApp pre vs -> applyMany (Q (cPredef,predefName pre)) vs
VCApp f vs -> applyMany (QC f) vs
VGen j vs -> applyMany (var j) vs
VMeta j env vs -> applyMany (Meta j) vs
VProd bt v x f -> Prod bt x (v2t v) (v2t' x f)
VAbs bt x f -> Abs bt x (v2t' x f)
VInt n -> EInt n
VFloat f -> EFloat f
VString s -> if null s then Empty else K s
VSort s -> Sort s
VImplArg v -> ImplArg (v2t v)
VTblType p res -> Table (v2t p) (v2t res)
VRecType rs -> RecType [(l, v2t v) | (l,v) <- rs]
VRec as -> R [(l, (Nothing, v2t v)) | (l,v) <- as]
VV t _ vs -> V t (map v2t vs)
VT wild v cs -> T ((if wild then TWild else TTyped) (v2t v)) (map nfcase cs)
VFV vs -> FV (map v2t vs)
VC v1 v2 -> C (v2t v1) (v2t v2)
VS v1 v2 -> S (v2t v1) (v2t v2)
VP v l -> P (v2t v) l
VPatt p -> EPatt p
VPattType v -> EPattType $ v2t v
VAlts v vvs -> Alts (v2t v) [(v2t x, v2t y) | (x,y) <- vvs]
VStrs vs -> Strs (map v2t vs)
-- VGlue v1 v2 -> Glue (v2t v1) (v2t v2)
-- VExtR v1 v2 -> ExtR (v2t v1) (v2t v2)
VError err -> Error err
where
applyMany f vs = foldl App f (map v2t vs)
v2t = v2txs xs
v2txs = value2term' stop loc
v2t' x f = v2txs (x:xs) (bind f (gen xs))
var j
| j<length xs = Vr (reverse xs !! j)
| otherwise = error ("variable #"++show j++" is out of scope")
pushs xs e = foldr push e xs
push x (env,xs) = ((x,gen xs):env,x:xs)
gen xs = VGen (length xs) []
nfcase (p,f) = (,) p (v2txs xs' (bind f env'))
where (env',xs') = pushs (pattVars p) ([],xs)
bind (Bind f) x = if stop
then VSort (identS "...") -- hmm
else f x
linPattVars p =
if null dups
then return pvs
else fail.render $ hang "Pattern is not linear. All variable names on the left-hand side must be distinct." 4 (ppPatt Unqualified 0 p)
where
allpvs = allPattVars p
pvs = nub allpvs
dups = allpvs \\ pvs
pattVars = nub . allPattVars
allPattVars p =
case p of
PV i -> [i]
PAs i p -> i:allPattVars p
_ -> collectPattOp allPattVars p
---
ix loc fn xs i =
if i<n
then xs !! i
else bugloc loc $ "(!!): index too large in "++fn++", "++show i++"<"++show n
where n = length xs
infixl 1 #,<# --,@@
f # x = fmap f x
mf <# mx = ap mf mx
--m1 @@ m2 = (m1 =<<) . m2
both f (x,y) = (,) # f x <# f y
bugloc loc s = ppbug $ ppL loc s
bug msg = ppbug msg
ppbug doc = error $ render $ hang "Internal error in Compute.Concrete:" 4 doc
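The comment in `value` above describes a staging discipline: a term is traversed once into an `OpenValue` (a function from a stack of runtime values), which can then be applied many times without re-traversing the term. A self-contained sketch of that idea on a toy integer term language (not the GF `Term`/`Value` types):

```haskell
-- Toy illustration of the OpenValue staging described in `value` above.
data Tm = Var Int          -- de Bruijn index into the runtime stack
        | Lit Int
        | Add Tm Tm

type Stack     = [Int]
type OpenValue = Stack -> Int

compileTm :: Tm -> OpenValue
compileTm t = case t of
  Var i   -> \vs -> vs !! i
  Lit n   -> const n
  -- The recursive calls are made outside the returned lambda, so the term
  -- traversal happens once, not on every application (cf. Let/Prod/Abs above).
  Add a b -> let va = compileTm a
                 vb = compileTm b
             in \vs -> va vs + vb vs
```

For example, `compileTm (Add (Var 0) (Lit 1))` builds the adder once; applying it to the stacks `[1]` and `[41]` then costs only the additions, with no repeated traversal.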

View File

@@ -0,0 +1,588 @@
-- | Functions for computing the values of terms in the concrete syntax, in
-- | preparation for PMCFG generation.
module GF.Compile.Compute.ConcreteNew
(GlobalEnv, GLocation, resourceValues, geLoc, geGrammar,
normalForm,
Value(..), Bind(..), Env, value2term, eval, vapply
) where
import Prelude hiding ((<>)) -- GHC 8.4.1 clash with Text.PrettyPrint
import GF.Grammar hiding (Env, VGen, VApp, VRecType)
import GF.Grammar.Lookup(lookupResDefLoc,allParamValues)
import GF.Grammar.Predef(cPredef,cErrorType,cTok,cStr,cTrace,cPBool)
import GF.Grammar.PatternMatch(matchPattern,measurePatt)
import GF.Grammar.Lockfield(isLockLabel,lockRecType) --unlockRecord,lockLabel
import GF.Compile.Compute.Value hiding (Error)
import GF.Compile.Compute.Predef(predef,predefName,delta)
import GF.Data.Str(Str,glueStr,str2strings,str,sstr,plusStr,strTok)
import GF.Data.Operations(Err,err,errIn,maybeErr,mapPairsM)
import GF.Data.Utilities(mapFst,mapSnd)
import GF.Infra.Option
import Control.Monad(ap,liftM,liftM2) -- ,unless,mplus
import Data.List (findIndex,intersect,nub,elemIndex,(\\)) --,isInfixOf
--import Data.Char (isUpper,toUpper,toLower)
import GF.Text.Pretty
import qualified Data.Map as Map
import Debug.Trace(trace)
-- * Main entry points
normalForm :: GlobalEnv -> L Ident -> Term -> Term
normalForm (GE gr rv opts _) loc = err (bugloc loc) id . nfx (GE gr rv opts loc)
nfx env@(GE _ _ _ loc) t = do
v <- eval env [] t
case value2term loc [] v of
Left i -> fail ("variable #"++show i++" is out of scope")
Right t -> return t
eval :: GlobalEnv -> Env -> Term -> Err Value
eval (GE gr rvs opts loc) env t = ($ (map snd env)) # value cenv t
where
cenv = CE gr rvs opts loc (map fst env)
--apply env = apply' env
--------------------------------------------------------------------------------
-- * Environments
type ResourceValues = Map.Map ModuleName (Map.Map Ident (Err Value))
data GlobalEnv = GE Grammar ResourceValues Options GLocation
data CompleteEnv = CE {srcgr::Grammar,rvs::ResourceValues,
opts::Options,
gloc::GLocation,local::LocalScope}
type GLocation = L Ident
type LocalScope = [Ident]
type Stack = [Value]
type OpenValue = Stack->Value
geLoc (GE _ _ _ loc) = loc
geGrammar (GE gr _ _ _) = gr
ext b env = env{local=b:local env}
extend bs env = env{local=bs++local env}
global env = GE (srcgr env) (rvs env) (opts env) (gloc env)
var :: CompleteEnv -> Ident -> Err OpenValue
var env x = maybe unbound pick' (elemIndex x (local env))
where
unbound = fail ("Unknown variable: "++showIdent x)
pick' i = return $ \ vs -> maybe (err i vs) ok (pick i vs)
err i vs = bug $ "Stack problem: "++showIdent x++": "
++unwords (map showIdent (local env))
++" => "++show (i,length vs)
ok v = --trace ("var "++show x++" = "++show v) $
v
pick :: Int -> Stack -> Maybe Value
pick 0 (v:_) = Just v
pick i (_:vs) = pick (i-1) vs
pick i vs = Nothing -- bug $ "pick "++show (i,vs)
resource env (m,c) =
-- err bug id $
if isPredefCat c
then value0 env =<< lockRecType c defLinType -- hmm
else maybe e id $ Map.lookup c =<< Map.lookup m (rvs env)
where e = fail $ "Not found: "++render m++"."++showIdent c
-- | Convert operators once, not every time they are looked up
resourceValues :: Options -> SourceGrammar -> GlobalEnv
resourceValues opts gr = env
where
env = GE gr rvs opts (L NoLoc identW)
rvs = Map.mapWithKey moduleResources (moduleMap gr)
moduleResources m = Map.mapWithKey (moduleResource m) . jments
moduleResource m c _info = do L l t <- lookupResDefLoc gr (m,c)
let loc = L l c
qloc = L l (Q (m,c))
eval (GE gr rvs opts loc) [] (traceRes qloc t)
traceRes = if flag optTrace opts
then traceResource
else const id
-- * Tracing
-- | Insert a call to the trace function under the top-level lambdas
traceResource (L l q) t =
case termFormCnc t of
(abs,body) -> mkAbs abs (mkApp traceQ [args,body])
where
args = R $ tuple2record (K lstr:[Vr x|(bt,x)<-abs,bt==Explicit])
lstr = render (l<>":"<>ppTerm Qualified 0 q)
traceQ = Q (cPredef,cTrace)
-- * Computing values
-- | Computing the value of a top-level term
value0 :: CompleteEnv -> Term -> Err Value
value0 env = eval (global env) []
-- | Computing the value of a term
value :: CompleteEnv -> Term -> Err OpenValue
value env t0 =
-- Each term is traversed only once by this function, using only statically
-- available information. Notably, the values of lambda bound variables
-- will be unknown during the term traversal phase.
-- The result is an OpenValue, which is a function that may be applied many
-- times to different dynamic values, but without the term traversal overhead
-- and without recomputing other statically known information.
-- For this to work, there should be no recursive calls under lambdas here.
-- Whenever we need to construct the OpenValue function with an explicit
-- lambda, we have to lift the recursive calls outside the lambda.
-- (See e.g. the rules for Let, Prod and Abs)
{-
trace (render $ text "value"<+>sep [ppL (gloc env)<>text ":",
brackets (fsep (map ppIdent (local env))),
ppTerm Unqualified 10 t0]) $
--}
errIn (render t0) $
case t0 of
Vr x -> var env x
Q x@(m,f)
| m == cPredef -> if f==cErrorType -- to be removed
then let p = identS "P"
in const # value0 env (mkProd [(Implicit,p,typeType)] (Vr p) [])
else if f==cPBool
then const # resource env x
else const . flip VApp [] # predef f
| otherwise -> const # resource env x --valueResDef (fst env) x
QC x -> return $ const (VCApp x [])
App e1 e2 -> apply' env e1 . (:[]) =<< value env e2
Let (x,(oty,t)) body -> do vb <- value (ext x env) body
vt <- value env t
return $ \ vs -> vb (vt vs:vs)
Meta i -> return $ \ vs -> VMeta i (zip (local env) vs) []
Prod bt x t1 t2 ->
do vt1 <- value env t1
vt2 <- value (ext x env) t2
return $ \ vs -> VProd bt (vt1 vs) x $ Bind $ \ vx -> vt2 (vx:vs)
Abs bt x t -> do vt <- value (ext x env) t
return $ VAbs bt x . Bind . \ vs vx -> vt (vx:vs)
EInt n -> return $ const (VInt n)
EFloat f -> return $ const (VFloat f)
K s -> return $ const (VString s)
Empty -> return $ const (VString "")
Sort s | s == cTok -> return $ const (VSort cStr) -- to be removed
| otherwise -> return $ const (VSort s)
ImplArg t -> (VImplArg.) # value env t
Table p res -> liftM2 VTblType # value env p <# value env res
RecType rs -> do lovs <- mapPairsM (value env) rs
return $ \vs->VRecType $ mapSnd ($vs) lovs
t@(ExtR t1 t2) -> ((extR t.)# both id) # both (value env) (t1,t2)
FV ts -> ((vfv .) # sequence) # mapM (value env) ts
R as -> do lovs <- mapPairsM (value env.snd) as
return $ \ vs->VRec $ mapSnd ($vs) lovs
T i cs -> valueTable env i cs
V ty ts -> do pvs <- paramValues env ty
((VV ty pvs .) . sequence) # mapM (value env) ts
C t1 t2 -> ((ok2p vconcat.) # both id) # both (value env) (t1,t2)
S t1 t2 -> ((select env.) # both id) # both (value env) (t1,t2)
P t l -> --maybe (bug $ "project "++show l++" from "++show v) id $
do ov <- value env t
return $ \ vs -> let v = ov vs
in maybe (VP v l) id (proj l v)
Alts t tts -> (\v vts -> VAlts # v <# mapM (both id) vts) # value env t <# mapM (both (value env)) tts
Strs ts -> ((VStrs.) # sequence) # mapM (value env) ts
Glue t1 t2 -> ((ok2p (glue env).) # both id) # both (value env) (t1,t2)
ELin c r -> (unlockVRec (gloc env) c.) # value env r
EPatt p -> return $ const (VPatt p) -- hmm
EPattType ty -> do vt <- value env ty
return (VPattType . vt)
Typed t ty -> value env t
t -> fail.render $ "value"<+>ppTerm Unqualified 10 t $$ show t
vconcat vv@(v1,v2) =
case vv of
(VString "",_) -> v2
(_,VString "") -> v1
(VApp NonExist _,_) -> v1
(_,VApp NonExist _) -> v2
_ -> VC v1 v2
proj l v | isLockLabel l = return (VRec [])
---- a workaround 18/2/2005: take this away and find the reason
---- why earlier compilation destroys the lock field
proj l v =
case v of
VFV vs -> liftM vfv (mapM (proj l) vs)
VRec rs -> lookup l rs
-- VExtR v1 v2 -> proj l v2 `mplus` proj l v1 -- hmm
VS (VV pty pvs rs) v2 -> flip VS v2 . VV pty pvs # mapM (proj l) rs
_ -> return (ok1 VP v l)
ok1 f v1@(VError {}) _ = v1
ok1 f v1 v2 = f v1 v2
ok2 f v1@(VError {}) _ = v1
ok2 f _ v2@(VError {}) = v2
ok2 f v1 v2 = f v1 v2
ok2p f (v1@VError {},_) = v1
ok2p f (_,v2@VError {}) = v2
ok2p f vv = f vv
unlockVRec loc c0 v0 = v0
{-
unlockVRec loc c0 v0 = unlockVRec' c0 v0
where
unlockVRec' ::Ident -> Value -> Value
unlockVRec' c v =
case v of
-- VClosure env t -> err bug (VClosure env) (unlockRecord c t)
VAbs bt x (Bind f) -> VAbs bt x (Bind $ \ v -> unlockVRec' c (f v))
VRec rs -> plusVRec rs lock
-- _ -> VExtR v (VRec lock) -- hmm
_ -> {-trace (render $ ppL loc $ "unlock non-record "++show v0)-} v -- hmm
-- _ -> bugloc loc $ "unlock non-record "++show v0
where
lock = [(lockLabel c,VRec [])]
-}
-- suspicious, but backwards compatible
plusVRec rs1 rs2 = VRec ([(l,v)|(l,v)<-rs1,l `notElem` ls2] ++ rs2)
where ls2 = map fst rs2
extR t vv =
case vv of
(VFV vs,v2) -> vfv [extR t (v1,v2)|v1<-vs]
(v1,VFV vs) -> vfv [extR t (v1,v2)|v2<-vs]
(VRecType rs1, VRecType rs2) ->
case intersect (map fst rs1) (map fst rs2) of
[] -> VRecType (rs1 ++ rs2)
ls -> error $ "clash"<+>show ls
(VRec rs1, VRec rs2) -> plusVRec rs1 rs2
(v1 , VRec [(l,_)]) | isLockLabel l -> v1 -- hmm
(VS (VV t pvs vs) s,v2) -> VS (VV t pvs [extR t (v1,v2)|v1<-vs]) s
-- (v1,v2) -> ok2 VExtR v1 v2 -- hmm
(v1,v2) -> error $ "not records" $$ show v1 $$ show v2
where
error explain = ppbug $ "The term" <+> t
<+> "is not reducible" $$ explain
glue env (v1,v2) = glu v1 v2
where
glu v1 v2 =
case (v1,v2) of
(VFV vs,v2) -> vfv [glu v1 v2|v1<-vs]
(v1,VFV vs) -> vfv [glu v1 v2|v2<-vs]
(VString s1,VString s2) -> VString (s1++s2)
(v1,VAlts d vs) -> VAlts (glx d) [(glx v,c) | (v,c) <- vs]
where glx v2 = glu v1 v2
(v1@(VAlts {}),v2) ->
--err (const (ok2 VGlue v1 v2)) id $
err bug id $
do y' <- strsFromValue v2
x' <- strsFromValue v1
return $ vfv [foldr1 VC (map VString (str2strings (glueStr v u))) | v <- x', u <- y']
(VC va vb,v2) -> VC va (glu vb v2)
(v1,VC va vb) -> VC (glu v1 va) vb
(VS (VV ty pvs vs) vb,v2) -> VS (VV ty pvs [glu v v2|v<-vs]) vb
(v1,VS (VV ty pvs vs) vb) -> VS (VV ty pvs [glu v1 v|v<-vs]) vb
(v1@(VApp NonExist _),_) -> v1
(_,v2@(VApp NonExist _)) -> v2
-- (v1,v2) -> ok2 VGlue v1 v2
(v1,v2) -> if flag optPlusAsBind (opts env)
then VC v1 (VC (VApp BIND []) v2)
else let loc = gloc env
vt v = case value2term loc (local env) v of
Left i -> Error ('#':show i)
Right t -> t
originalMsg = render $ ppL loc (hang "unsupported token gluing" 4
(Glue (vt v1) (vt v2)))
term = render $ pp $ Glue (vt v1) (vt v2)
in error $ unlines
[originalMsg
,""
,"There was a problem in the expression `"++term++"`, either:"
,"1) You are trying to use + on runtime arguments, possibly via an oper."
,"2) One of the arguments in `"++term++"` is a bound variable from pattern matching a string, but the cases are non-exhaustive."
,"For more help see https://github.com/GrammaticalFramework/gf-core/tree/master/doc/errors/gluing.md"
]
-- | to get a string from a value that represents a sequence of terminals
strsFromValue :: Value -> Err [Str]
strsFromValue t = case t of
VString s -> return [str s]
VC s t -> do
s' <- strsFromValue s
t' <- strsFromValue t
return [plusStr x y | x <- s', y <- t']
{-
VGlue s t -> do
s' <- strsFromValue s
t' <- strsFromValue t
return [glueStr x y | x <- s', y <- t']
-}
VAlts d vs -> do
d0 <- strsFromValue d
v0 <- mapM (strsFromValue . fst) vs
c0 <- mapM (strsFromValue . snd) vs
--let vs' = zip v0 c0
return [strTok (str2strings def) vars |
def <- d0,
vars <- [[(str2strings v, map sstr c) | (v,c) <- zip vv c0] |
vv <- sequence v0]
]
VFV ts -> concat # mapM strsFromValue ts
VStrs ts -> concat # mapM strsFromValue ts
_ -> fail ("cannot get Str from value " ++ show t)
vfv vs = case nub vs of
[v] -> v
vs -> VFV vs
select env vv =
case vv of
(v1,VFV vs) -> vfv [select env (v1,v2)|v2<-vs]
(VFV vs,v2) -> vfv [select env (v1,v2)|v1<-vs]
(v1@(VV pty vs rs),v2) ->
err (const (VS v1 v2)) id $
do --ats <- allParamValues (srcgr env) pty
--let vs = map (value0 env) ats
i <- maybeErr "no match" $ findIndex (==v2) vs
return (ix (gloc env) "select" rs i)
(VT _ _ [(PW,Bind b)],_) -> {-trace "eliminate wild card table" $-} b []
(v1@(VT _ _ cs),v2) ->
err (\_->ok2 VS v1 v2) (err bug id . valueMatch env) $
match (gloc env) cs v2
(VS (VV pty pvs rs) v12,v2) -> VS (VV pty pvs [select env (v11,v2)|v11<-rs]) v12
(v1,v2) -> ok2 VS v1 v2
match loc cs v =
case value2term loc [] v of
Left i -> bad ("variable #"++show i++" is out of scope")
Right t -> err bad return (matchPattern cs t)
where
bad = fail . ("In pattern matching: "++)
valueMatch :: CompleteEnv -> (Bind Env,Substitution) -> Err Value
valueMatch env (Bind f,env') = f # mapPairsM (value0 env) env'
valueTable :: CompleteEnv -> TInfo -> [Case] -> Err OpenValue
valueTable env i cs =
case i of
TComp ty -> do pvs <- paramValues env ty
((VV ty pvs .) # sequence) # mapM (value env.snd) cs
_ -> do ty <- getTableType i
cs' <- mapM valueCase cs
err (dynamic cs' ty) return (convert cs' ty)
where
dynamic cs' ty _ = cases cs' # value env ty
cases cs' vty vs = err keep ($vs) (convertv cs' (vty vs))
where
keep msg = --trace (msg++"\n"++render (ppTerm Unqualified 0 (T i cs))) $
VT wild (vty vs) (mapSnd ($vs) cs')
wild = case i of TWild _ -> True; _ -> False
convertv cs' vty =
case value2term (gloc env) [] vty of
Left i -> fail ("variable #"++show i++" is out of scope")
Right pty -> convert' cs' =<< paramValues'' env pty
convert cs' ty = convert' cs' =<< paramValues' env ty
convert' cs' ((pty,vs),pvs) =
do sts <- mapM (matchPattern cs') vs
return $ \ vs -> VV pty pvs $ map (err bug id . valueMatch env)
(mapFst ($vs) sts)
valueCase (p,t) = do p' <- measurePatt # inlinePattMacro p
pvs <- linPattVars p'
vt <- value (extend pvs env) t
return (p',\vs-> Bind $ \bs-> vt (push' p' bs pvs vs))
inlinePattMacro p =
case p of
PM qc -> do r <- resource env qc
case r of
VPatt p' -> inlinePattMacro p'
_ -> ppbug $ hang "Expected pattern macro:" 4
(show r)
_ -> composPattOp inlinePattMacro p
paramValues env ty = snd # paramValues' env ty
paramValues' env ty = paramValues'' env =<< nfx (global env) ty
paramValues'' env pty = do ats <- allParamValues (srcgr env) pty
pvs <- mapM (eval (global env) []) ats
return ((pty,ats),pvs)
push' p bs xs = if length bs/=length xs
then bug $ "push "++show (p,bs,xs)
else push bs xs
push :: Env -> LocalScope -> Stack -> Stack
push bs [] vs = vs
push bs (x:xs) vs = maybe err id (lookup x bs):push bs xs vs
where err = bug $ "Unbound pattern variable "++showIdent x
apply' :: CompleteEnv -> Term -> [OpenValue] -> Err OpenValue
apply' env t [] = value env t
apply' env t vs =
case t of
QC x -> return $ \ svs -> VCApp x (map ($svs) vs)
{-
Q x@(m,f) | m==cPredef -> return $
let constr = --trace ("predef "++show x) .
VApp x
in \ svs -> maybe constr id (Map.lookup f predefs)
$ map ($svs) vs
| otherwise -> do r <- resource env x
return $ \ svs -> vapply (gloc env) r (map ($svs) vs)
-}
App t1 t2 -> apply' env t1 . (:vs) =<< value env t2
_ -> do fv <- value env t
return $ \ svs -> vapply (gloc env) (fv svs) (map ($svs) vs)
vapply :: GLocation -> Value -> [Value] -> Value
vapply loc v [] = v
vapply loc v vs =
case v of
VError {} -> v
-- VClosure env (Abs b x t) -> beta gr env b x t vs
VAbs bt _ (Bind f) -> vbeta loc bt f vs
VApp pre vs1 -> delta' pre (vs1++vs)
where
delta' Trace (v1:v2:vs) = let vr = vapply loc v2 vs
in vtrace loc v1 vr
delta' pre vs = err msg vfv $ mapM (delta pre) (varyList vs)
--msg = const (VApp pre (vs1++vs))
msg = bug . (("Applying Predef."++showIdent (predefName pre)++": ")++)
VS (VV t pvs fs) s -> VS (VV t pvs [vapply loc f vs|f<-fs]) s
VFV fs -> vfv [vapply loc f vs|f<-fs]
VCApp f vs0 -> VCApp f (vs0++vs)
VMeta i env vs0 -> VMeta i env (vs0++vs)
VGen i vs0 -> VGen i (vs0++vs)
v -> bug $ "vapply "++show v++" "++show vs
vbeta loc bt f (v:vs) =
case (bt,v) of
(Implicit,VImplArg v) -> ap v
(Explicit, v) -> ap v
where
ap (VFV avs) = vfv [vapply loc (f v) vs|v<-avs]
ap v = vapply loc (f v) vs
vary (VFV vs) = vs
vary v = [v]
varyList = mapM vary
{-
beta env b x t (v:vs) =
case (b,v) of
(Implicit,VImplArg v) -> apply' (ext (x,v) env) t vs
(Explicit, v) -> apply' (ext (x,v) env) t vs
-}
vtrace loc arg res = trace (render (hang (pv arg) 4 ("->"<+>pv res))) res
where
pv v = case v of
VRec (f:as) -> hang (pf f) 4 (fsep (map pa as))
_ -> ppV v
pf (_,VString n) = pp n
pf (_,v) = ppV v
pa (_,v) = ppV v
ppV v = case value2term' True loc [] v of
Left i -> "variable #" <> pp i <+> "is out of scope"
Right t -> ppTerm Unqualified 10 t
-- | Convert a value back to a term
value2term :: GLocation -> [Ident] -> Value -> Either Int Term
value2term = value2term' False
value2term' stop loc xs v0 =
case v0 of
VApp pre vs -> liftM (foldl App (Q (cPredef,predefName pre))) (mapM v2t vs)
VCApp f vs -> liftM (foldl App (QC f)) (mapM v2t vs)
VGen j vs -> liftM2 (foldl App) (var j) (mapM v2t vs)
VMeta j env vs -> liftM (foldl App (Meta j)) (mapM v2t vs)
VProd bt v x f -> liftM2 (Prod bt x) (v2t v) (v2t' x f)
VAbs bt x f -> liftM (Abs bt x) (v2t' x f)
VInt n -> return (EInt n)
VFloat f -> return (EFloat f)
VString s -> return (if null s then Empty else K s)
VSort s -> return (Sort s)
VImplArg v -> liftM ImplArg (v2t v)
VTblType p res -> liftM2 Table (v2t p) (v2t res)
VRecType rs -> liftM RecType (mapM (\(l,v) -> fmap ((,) l) (v2t v)) rs)
VRec as -> liftM R (mapM (\(l,v) -> v2t v >>= \t -> return (l,(Nothing,t))) as)
VV t _ vs -> liftM (V t) (mapM v2t vs)
VT wild v cs -> v2t v >>= \t -> liftM (T ((if wild then TWild else TTyped) t)) (mapM nfcase cs)
VFV vs -> liftM FV (mapM v2t vs)
VC v1 v2 -> liftM2 C (v2t v1) (v2t v2)
VS v1 v2 -> liftM2 S (v2t v1) (v2t v2)
VP v l -> v2t v >>= \t -> return (P t l)
VPatt p -> return (EPatt p)
VPattType v -> v2t v >>= return . EPattType
VAlts v vvs -> liftM2 Alts (v2t v) (mapM (\(x,y) -> liftM2 (,) (v2t x) (v2t y)) vvs)
VStrs vs -> liftM Strs (mapM v2t vs)
-- VGlue v1 v2 -> Glue (v2t v1) (v2t v2)
-- VExtR v1 v2 -> ExtR (v2t v1) (v2t v2)
VError err -> return (Error err)
_ -> bug ("value2term "++show loc++" : "++show v0)
where
v2t = v2txs xs
v2txs = value2term' stop loc
v2t' x f = v2txs (x:xs) (bind f (gen xs))
var j
| j<length xs = Right (Vr (reverse xs !! j))
| otherwise = Left j
pushs xs e = foldr push e xs
push x (env,xs) = ((x,gen xs):env,x:xs)
gen xs = VGen (length xs) []
nfcase (p,f) = liftM ((,) p) (v2txs xs' (bind f env'))
where (env',xs') = pushs (pattVars p) ([],xs)
bind (Bind f) x = if stop
then VSort (identS "...") -- hmm
else f x
linPattVars p =
if null dups
then return pvs
else fail.render $ hang "Pattern is not linear. All variable names on the left-hand side must be distinct." 4 (ppPatt Unqualified 0 p)
where
allpvs = allPattVars p
pvs = nub allpvs
dups = allpvs \\ pvs
pattVars = nub . allPattVars
allPattVars p =
case p of
PV i -> [i]
PAs i p -> i:allPattVars p
_ -> collectPattOp allPattVars p
---
ix loc fn xs i =
if i<n
then xs !! i
else bugloc loc $ "(!!): index too large in "++fn++", "++show i++"<"++show n
where n = length xs
infixl 1 #,<# --,@@
f # x = fmap f x
mf <# mx = ap mf mx
--m1 @@ m2 = (m1 =<<) . m2
both f (x,y) = (,) # f x <# f y
bugloc loc s = ppbug $ ppL loc s
bug msg = ppbug msg
ppbug doc = error $ render $ hang "Internal error in Compute.ConcreteNew:" 4 doc
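The `#`, `<#` and `both` helpers defined at the end of the module are just `fmap` and `ap` in operator form, used throughout the evaluator to thread `Err`/monadic results through pure constructors. A minimal standalone sketch (the operators are re-declared here for illustration; only `Control.Monad.ap` from base is assumed):

```haskell
import Control.Monad (ap)

infixl 1 #, <#

-- fmap in operator form, as in the module above
(#) :: Functor f => (a -> b) -> f a -> f b
f # x = fmap f x

-- applicative application for monadic results, as in the module above
(<#) :: Monad m => m (a -> b) -> m a -> m b
mf <# mx = ap mf mx

-- apply a monadic function to both components of a pair
both :: Monad m => (a -> m b) -> (a, a) -> m (b, b)
both f (x, y) = (,) # f x <# f y

main :: IO ()
main = print =<< both (return . (+ 1)) ((1, 2) :: (Int, Int))
-- prints (2,3)
```

This is the same idiom as `liftM2`/`<*>`, written so that chains like `(,) # f x <# f y` read left to right.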


@@ -27,10 +27,6 @@ instance Predef Int where
 instance Predef Bool where
   toValue = boolV
-  fromValue v = case v of
-                  VCApp (mn,i) [] | mn == cPredef && i == cPTrue -> return True
-                  VCApp (mn,i) [] | mn == cPredef && i == cPFalse -> return False
-                  _ -> verror "Bool" v
 instance Predef String where
   toValue = string


@@ -12,8 +12,8 @@ data Value
   | VGen Int [Value] -- for lambda bound variables, possibly applied
   | VMeta MetaId Env [Value]
 -- -- | VClosure Env Term -- used in Typecheck.ConcreteNew
-  | VAbs BindType Ident Binding -- used in Compute.Concrete
-  | VProd BindType Value Ident Binding -- used in Compute.Concrete
+  | VAbs BindType Ident Binding -- used in Compute.ConcreteNew
+  | VProd BindType Value Ident Binding -- used in Compute.ConcreteNew
   | VInt Int
   | VFloat Double
   | VString String


@@ -7,7 +7,7 @@ import GF.Text.Pretty
 --import GF.Grammar.Predef(cPredef,cInts)
 --import GF.Compile.Compute.Predef(predef)
 --import GF.Compile.Compute.Value(Predefined(..))
-import GF.Infra.Ident(Ident,identC,identS,identW,prefixIdent,showRawIdent,rawIdentS)
+import GF.Infra.Ident(Ident,identS,identW,prefixIdent)
 import GF.Infra.Option
 import GF.Haskell as H
 import GF.Grammar.Canonical as C
@@ -21,7 +21,7 @@ concretes2haskell opts absname gr =
   | let Grammar abstr cncs = grammar2canonical opts absname gr,
     cncmod<-cncs,
     let ModId name = concName cncmod
-        filename = showRawIdent name ++ ".hs" :: FilePath
+        filename = name ++ ".hs" :: FilePath
   ]

 -- | Generate Haskell code for the given concrete module.
@@ -53,7 +53,7 @@ concrete2haskell opts
     labels = S.difference (S.unions (map S.fromList recs)) common_labels
     common_records = S.fromList [[label_s]]
     common_labels = S.fromList [label_s]
-    label_s = LabelId (rawIdentS "s")
+    label_s = LabelId "s"
     signature (CatDef c _) = TypeSig lf (Fun abs (pure lin))
       where
@@ -321,7 +321,7 @@ coerce env ty t =
       TableValue ti [TableRow p (coerce env tv t)|TableRow p t<-cs]
     (RecordType rt,RecordValue r) ->
       RecordValue [RecordRow l (coerce env ft f) |
-                   RecordRow l f<-r,ft<-[ft | RecordRow l' ft <- rt, l'==l]]
+                   RecordRow l f<-r,ft<-[ft|RecordRow l' ft<-rt,l'==l]]
     (RecordType rt,VarValue x)->
       case lookup x env of
         Just ty' | ty'/=ty -> -- better to compare to normal form of ty'
@@ -334,17 +334,18 @@ coerce env ty t =
     _ -> t
   where
     app f ts = ParamConstant (Param f ts) -- !! a hack
-    to_rcon = ParamId . Unqual . rawIdentS . to_rcon' . labels
+    to_rcon = ParamId . Unqual . to_rcon' . labels
     patVars p = []
-    labels r = [l | RecordRow l _ <- r]
+    labels r = [l|RecordRow l _<-r]
     proj = Var . identS . proj'
-    proj' (LabelId l) = "proj_" ++ showRawIdent l
+    proj' (LabelId l) = "proj_"++l
     rcon = Var . rcon'
     rcon' = identS . rcon_name
-    rcon_name ls = "R"++concat (sort ['_':showRawIdent l | LabelId l <- ls])
+    rcon_name ls = "R"++concat (sort ['_':l|LabelId l<-ls])
     to_rcon' = ("to_"++) . rcon_name
@@ -399,17 +400,17 @@ linfunName c = prefixIdent "lin" (toIdent c)
 class ToIdent i where toIdent :: i -> Ident

-instance ToIdent ParamId where toIdent (ParamId q) = qIdentC q
-instance ToIdent PredefId where toIdent (PredefId s) = identC s
-instance ToIdent CatId where toIdent (CatId s) = identC s
-instance ToIdent C.FunId where toIdent (FunId s) = identC s
-instance ToIdent VarValueId where toIdent (VarValueId q) = qIdentC q
+instance ToIdent ParamId where toIdent (ParamId q) = qIdentS q
+instance ToIdent PredefId where toIdent (PredefId s) = identS s
+instance ToIdent CatId where toIdent (CatId s) = identS s
+instance ToIdent C.FunId where toIdent (FunId s) = identS s
+instance ToIdent VarValueId where toIdent (VarValueId q) = qIdentS q

-qIdentC = identS . unqual
-unqual (Qual (ModId m) n) = showRawIdent m++"_"++ showRawIdent n
-unqual (Unqual n) = showRawIdent n
+qIdentS = identS . unqual
+unqual (Qual (ModId m) n) = m++"_"++n
+unqual (Unqual n) = n

 instance ToIdent VarId where
   toIdent Anonymous = identW
-  toIdent (VarId s) = identC s
+  toIdent (VarId s) = identS s


@@ -25,7 +25,7 @@ import GF.Data.BacktrackM
 import GF.Data.Operations
 import GF.Infra.UseIO (ePutStr,ePutStrLn) -- IOE,
 import GF.Data.Utilities (updateNthM) --updateNth
-import GF.Compile.Compute.Concrete(normalForm,resourceValues)
+import GF.Compile.Compute.ConcreteNew(normalForm,resourceValues)
 import qualified Data.Map as Map
 import qualified Data.Set as Set
 import qualified Data.List as List
@@ -201,11 +201,11 @@ instance Fail.MonadFail CnvMonad where
   fail = bug
 instance Applicative CnvMonad where
-  pure a = CM (\gr c s -> c a s)
+  pure = return
   (<*>) = ap
 instance Monad CnvMonad where
-  return = pure
+  return a = CM (\gr c s -> c a s)
   CM m >>= k = CM (\gr c s -> m gr (\a s -> unCM (k a) gr c s) s)
 instance MonadState ([ProtoFCat],[Symbol]) CnvMonad where


@@ -42,12 +42,11 @@ getSourceModule opts file0 =
        raw <- liftIO $ keepTemp tmp
        --ePutStrLn $ "1 "++file0
        (optCoding,parsed) <- parseSource opts pModDef raw
-       let indentLines = unlines . map (" "++) . lines
        case parsed of
         Left (Pn l c,msg) -> do file <- liftIO $ writeTemp tmp
                                 cwd <- getCurrentDirectory
                                 let location = makeRelative cwd file++":"++show l++":"++show c
-                                raise (location++":\n" ++ indentLines msg)
+                                raise (location++":\n "++msg)
         Right (i,mi0) ->
           do liftIO $ removeTemp tmp
              let mi =mi0 {mflags=mflags mi0 `addOptions` opts, msrc=file0}


@@ -6,35 +6,31 @@ module GF.Compile.GrammarToCanonical(
) where ) where
import Data.List(nub,partition) import Data.List(nub,partition)
import qualified Data.Map as M import qualified Data.Map as M
import Data.Maybe(fromMaybe)
import qualified Data.Set as S import qualified Data.Set as S
import GF.Data.ErrM import GF.Data.ErrM
import GF.Text.Pretty import GF.Text.Pretty
import GF.Grammar.Grammar as G import GF.Grammar.Grammar
import GF.Grammar.Lookup(lookupOrigInfo,allOrigInfos,allParamValues) import GF.Grammar.Lookup(lookupOrigInfo,allOrigInfos,allParamValues)
import GF.Grammar.Macros(typeForm,collectOp,collectPattOp,composSafeOp,mkAbs,mkApp,term2patt,sortRec) import GF.Grammar.Macros(typeForm,collectOp,collectPattOp,mkAbs,mkApp,term2patt)
import GF.Grammar.Lockfield(isLockLabel) import GF.Grammar.Lockfield(isLockLabel)
import GF.Grammar.Predef(cPredef,cInts) import GF.Grammar.Predef(cPredef,cInts)
import GF.Compile.Compute.Predef(predef) import GF.Compile.Compute.Predef(predef)
import GF.Compile.Compute.Value(Predefined(..)) import GF.Compile.Compute.Value(Predefined(..))
import GF.Infra.Ident(ModuleName(..),Ident,ident2raw,rawIdentS,showIdent,isWildIdent) import GF.Infra.Ident(ModuleName(..),Ident,prefixIdent,showIdent,isWildIdent)
import GF.Infra.Option(Options,optionsPGF) import GF.Infra.Option(Options, optionsPGF)
import PGF.Internal(Literal(..)) import PGF.Internal(Literal(..))
import GF.Compile.Compute.Concrete(GlobalEnv,normalForm,resourceValues) import GF.Compile.Compute.ConcreteNew(normalForm,resourceValues)
import GF.Grammar.Canonical as C import GF.Grammar.Canonical as C
import System.FilePath ((</>), (<.>)) import Debug.Trace
import qualified Debug.Trace as T
-- | Generate Canonical code for the named abstract syntax and all associated -- | Generate Canonical code for the named abstract syntax and all associated
-- concrete syntaxes -- concrete syntaxes
grammar2canonical :: Options -> ModuleName -> G.Grammar -> C.Grammar grammar2canonical :: Options -> ModuleName -> SourceGrammar -> C.Grammar
grammar2canonical opts absname gr = grammar2canonical opts absname gr =
Grammar (abstract2canonical absname gr) Grammar (abstract2canonical absname gr)
(map snd (concretes2canonical opts absname gr)) (map snd (concretes2canonical opts absname gr))
-- | Generate Canonical code for the named abstract syntax -- | Generate Canonical code for the named abstract syntax
abstract2canonical :: ModuleName -> G.Grammar -> Abstract
abstract2canonical absname gr = abstract2canonical absname gr =
Abstract (modId absname) (convFlags gr absname) cats funs Abstract (modId absname) (convFlags gr absname) cats funs
where where
@@ -49,7 +45,6 @@ abstract2canonical absname gr =
convHypo (bt,name,t) = convHypo (bt,name,t) =
case typeForm t of case typeForm t of
([],(_,cat),[]) -> gId cat -- !! ([],(_,cat),[]) -> gId cat -- !!
tf -> error $ "abstract2canonical convHypo: " ++ show tf
convType t = convType t =
case typeForm t of case typeForm t of
@@ -60,24 +55,23 @@ abstract2canonical absname gr =
convHypo' (bt,name,t) = TypeBinding (gId name) (convType t) convHypo' (bt,name,t) = TypeBinding (gId name) (convType t)
-- | Generate Canonical code for the all concrete syntaxes associated with -- | Generate Canonical code for the all concrete syntaxes associated with
-- the named abstract syntax in given the grammar. -- the named abstract syntax in given the grammar.
concretes2canonical :: Options -> ModuleName -> G.Grammar -> [(FilePath, Concrete)]
concretes2canonical opts absname gr = concretes2canonical opts absname gr =
[(cncname,concrete2canonical gr cenv absname cnc cncmod) [(cncname,concrete2canonical gr cenv absname cnc cncmod)
| let cenv = resourceValues opts gr, | let cenv = resourceValues opts gr,
cnc<-allConcretes gr absname, cnc<-allConcretes gr absname,
let cncname = "canonical" </> render cnc <.> "gf" let cncname = "canonical/"++render cnc ++ ".gf" :: FilePath
Ok cncmod = lookupModule gr cnc Ok cncmod = lookupModule gr cnc
] ]
-- | Generate Canonical GF for the given concrete module. -- | Generate Canonical GF for the given concrete module.
concrete2canonical :: G.Grammar -> GlobalEnv -> ModuleName -> ModuleName -> ModuleInfo -> Concrete
concrete2canonical gr cenv absname cnc modinfo = concrete2canonical gr cenv absname cnc modinfo =
Concrete (modId cnc) (modId absname) (convFlags gr cnc) Concrete (modId cnc) (modId absname) (convFlags gr cnc)
(neededParamTypes S.empty (params defs)) (neededParamTypes S.empty (params defs))
[lincat | (_,Left lincat) <- defs] [lincat|(_,Left lincat)<-defs]
[lin | (_,Right lin) <- defs] [lin|(_,Right lin)<-defs]
where where
defs = concatMap (toCanonical gr absname cenv) . defs = concatMap (toCanonical gr absname cenv) .
M.toList $ M.toList $
@@ -92,7 +86,6 @@ concrete2canonical gr cenv absname cnc modinfo =
else let ((got,need),def) = paramType gr q else let ((got,need),def) = paramType gr q
in def++neededParamTypes (S.union got have) (S.toList need++qs) in def++neededParamTypes (S.union got have) (S.toList need++qs)
toCanonical :: G.Grammar -> ModuleName -> GlobalEnv -> (Ident, Info) -> [(S.Set QIdent, Either LincatDef LinDef)]
toCanonical gr absname cenv (name,jment) = toCanonical gr absname cenv (name,jment) =
case jment of case jment of
CncCat (Just (L loc typ)) _ _ pprn _ -> CncCat (Just (L loc typ)) _ _ pprn _ ->
@@ -105,8 +98,7 @@ toCanonical gr absname cenv (name,jment) =
where where
tts = tableTypes gr [e'] tts = tableTypes gr [e']
e' = cleanupRecordFields lincat $ e' = unAbs (length params) $
unAbs (length params) $
nf loc (mkAbs params (mkApp def (map Vr args))) nf loc (mkAbs params (mkApp def (map Vr args)))
params = [(b,x)|(b,x,_)<-ctx] params = [(b,x)|(b,x,_)<-ctx]
args = map snd params args = map snd params
@@ -117,12 +109,12 @@ toCanonical gr absname cenv (name,jment) =
_ -> [] _ -> []
where where
nf loc = normalForm cenv (L loc name) nf loc = normalForm cenv (L loc name)
-- aId n = prefixIdent "A." (gId n)
unAbs 0 t = t unAbs 0 t = t
unAbs n (Abs _ _ t) = unAbs (n-1) t unAbs n (Abs _ _ t) = unAbs (n-1) t
unAbs _ t = t unAbs _ t = t
tableTypes :: G.Grammar -> [Term] -> S.Set QIdent
tableTypes gr ts = S.unions (map tabtys ts) tableTypes gr ts = S.unions (map tabtys ts)
where where
tabtys t = tabtys t =
@@ -131,7 +123,6 @@ tableTypes gr ts = S.unions (map tabtys ts)
T (TTyped t) cs -> S.union (paramTypes gr t) (tableTypes gr (map snd cs)) T (TTyped t) cs -> S.union (paramTypes gr t) (tableTypes gr (map snd cs))
_ -> collectOp tabtys t _ -> collectOp tabtys t
paramTypes :: G.Grammar -> G.Type -> S.Set QIdent
paramTypes gr t = paramTypes gr t =
case t of case t of
RecType fs -> S.unions (map (paramTypes gr.snd) fs) RecType fs -> S.unions (map (paramTypes gr.snd) fs)
@@ -150,26 +141,11 @@ paramTypes gr t =
Ok (_,ResParam {}) -> S.singleton q Ok (_,ResParam {}) -> S.singleton q
_ -> ignore _ -> ignore
ignore = T.trace ("Ignore: " ++ show t) S.empty ignore = trace ("Ignore: "++show t) S.empty
-- | Filter out record fields from definitions which don't appear in lincat.
cleanupRecordFields :: G.Type -> Term -> Term
cleanupRecordFields (RecType ls) (R as) =
let defnFields = M.fromList ls
in R
[ (lbl, (mty, t'))
| (lbl, (mty, t)) <- as
, M.member lbl defnFields
, let Just ty = M.lookup lbl defnFields
, let t' = cleanupRecordFields ty t
]
cleanupRecordFields ty t@(FV _) = composSafeOp (cleanupRecordFields ty) t
cleanupRecordFields _ t = t
convert :: G.Grammar -> Term -> LinValue
convert gr = convert' gr [] convert gr = convert' gr []
convert' :: G.Grammar -> [Ident] -> Term -> LinValue
convert' gr vs = ppT convert' gr vs = ppT
where where
ppT0 = convert' gr vs ppT0 = convert' gr vs
@@ -187,20 +163,20 @@ convert' gr vs = ppT
S t p -> selection (ppT t) (ppT p) S t p -> selection (ppT t) (ppT p)
C t1 t2 -> concatValue (ppT t1) (ppT t2) C t1 t2 -> concatValue (ppT t1) (ppT t2)
App f a -> ap (ppT f) (ppT a) App f a -> ap (ppT f) (ppT a)
R r -> RecordValue (fields (sortRec r)) R r -> RecordValue (fields r)
P t l -> projection (ppT t) (lblId l) P t l -> projection (ppT t) (lblId l)
Vr x -> VarValue (gId x) Vr x -> VarValue (gId x)
Cn x -> VarValue (gId x) -- hmm Cn x -> VarValue (gId x) -- hmm
Con c -> ParamConstant (Param (gId c) []) Con c -> ParamConstant (Param (gId c) [])
Sort k -> VarValue (gId k) Sort k -> VarValue (gId k)
EInt n -> LiteralValue (IntConstant n) EInt n -> LiteralValue (IntConstant n)
Q (m,n) -> if m==cPredef then ppPredef n else VarValue (gQId m n) Q (m,n) -> if m==cPredef then ppPredef n else VarValue ((gQId m n))
QC (m,n) -> ParamConstant (Param (gQId m n) []) QC (m,n) -> ParamConstant (Param ((gQId m n)) [])
K s -> LiteralValue (StrConstant s) K s -> LiteralValue (StrConstant s)
Empty -> LiteralValue (StrConstant "") Empty -> LiteralValue (StrConstant "")
FV ts -> VariantValue (map ppT ts) FV ts -> VariantValue (map ppT ts)
Alts t' vs -> alts vs (ppT t') Alts t' vs -> alts vs (ppT t')
_ -> error $ "convert' ppT: " ++ show t _ -> error $ "convert' "++show t
ppCase (p,t) = TableRow (ppP p) (ppTv (patVars p++vs) t) ppCase (p,t) = TableRow (ppP p) (ppTv (patVars p++vs) t)
@@ -213,12 +189,12 @@ convert' gr vs = ppT
Ok ALL_CAPIT -> p "ALL_CAPIT" Ok ALL_CAPIT -> p "ALL_CAPIT"
_ -> VarValue (gQId cPredef n) -- hmm _ -> VarValue (gQId cPredef n) -- hmm
where where
p = PredefValue . PredefId . rawIdentS p = PredefValue . PredefId
ppP p = ppP p =
case p of case p of
PC c ps -> ParamPattern (Param (gId c) (map ppP ps)) PC c ps -> ParamPattern (Param (gId c) (map ppP ps))
PP (m,c) ps -> ParamPattern (Param (gQId m c) (map ppP ps)) PP (m,c) ps -> ParamPattern (Param ((gQId m c)) (map ppP ps))
PR r -> RecordPattern (fields r) {- PR r -> RecordPattern (fields r) {-
PW -> WildPattern PW -> WildPattern
PV x -> VarP x PV x -> VarP x
@@ -227,7 +203,6 @@ convert' gr vs = ppT
PFloat x -> Lit (show x) PFloat x -> Lit (show x)
PT _ p -> ppP p PT _ p -> ppP p
PAs x p -> AsP x (ppP p) -} PAs x p -> AsP x (ppP p) -}
_ -> error $ "convert' ppP: " ++ show p
where where
fields = map field . filter (not.isLockLabel.fst) fields = map field . filter (not.isLockLabel.fst)
field (l,p) = RecordRow (lblId l) (ppP p) field (l,p) = RecordRow (lblId l) (ppP p)
@@ -244,12 +219,12 @@ convert' gr vs = ppT
pre Empty = [""] -- Empty == K "" pre Empty = [""] -- Empty == K ""
pre (Strs ts) = concatMap pre ts pre (Strs ts) = concatMap pre ts
pre (EPatt p) = pat p pre (EPatt p) = pat p
pre t = error $ "convert' alts pre: " ++ show t pre t = error $ "pre "++show t
pat (PString s) = [s] pat (PString s) = [s]
pat (PAlt p1 p2) = pat p1++pat p2 pat (PAlt p1 p2) = pat p1++pat p2
pat (PSeq p1 p2) = [s1++s2 | s1<-pat p1, s2<-pat p2] pat (PSeq p1 p2) = [s1++s2 | s1<-pat p1, s2<-pat p2]
pat p = error $ "convert' alts pat: "++show p pat p = error $ "pat "++show p
fields = map field . filter (not.isLockLabel.fst) fields = map field . filter (not.isLockLabel.fst)
field (l,(_,t)) = RecordRow (lblId l) (ppT t) field (l,(_,t)) = RecordRow (lblId l) (ppT t)
@@ -262,7 +237,6 @@ convert' gr vs = ppT
ParamConstant (Param p (ps++[a])) ParamConstant (Param p (ps++[a]))
_ -> error $ "convert' ap: "++render (ppA f <+> ppA a) _ -> error $ "convert' ap: "++render (ppA f <+> ppA a)
concatValue :: LinValue -> LinValue -> LinValue
concatValue v1 v2 = concatValue v1 v2 =
case (v1,v2) of case (v1,v2) of
(LiteralValue (StrConstant ""),_) -> v2 (LiteralValue (StrConstant ""),_) -> v2
@@ -270,24 +244,21 @@ concatValue v1 v2 =
_ -> ConcatValue v1 v2 _ -> ConcatValue v1 v2
-- | Smart constructor for projections -- | Smart constructor for projections
projection :: LinValue -> LabelId -> LinValue projection r l = maybe (Projection r l) id (proj r l)
projection r l = fromMaybe (Projection r l) (proj r l)
proj :: LinValue -> LabelId -> Maybe LinValue
proj r l = proj r l =
case r of case r of
RecordValue r -> case [v | RecordRow l' v <- r, l'==l] of RecordValue r -> case [v|RecordRow l' v<-r,l'==l] of
[v] -> Just v [v] -> Just v
_ -> Nothing _ -> Nothing
_ -> Nothing _ -> Nothing
-- | Smart constructor for selections -- | Smart constructor for selections
selection :: LinValue -> LinValue -> LinValue
selection t v = selection t v =
-- Note: impossible cases can become possible after grammar transformation -- Note: impossible cases can become possible after grammar transformation
case t of case t of
TableValue tt r -> TableValue tt r ->
case nub [rv | TableRow _ rv <- keep] of case nub [rv|TableRow _ rv<-keep] of
[rv] -> rv [rv] -> rv
_ -> Selection (TableValue tt r') v _ -> Selection (TableValue tt r') v
where where
@@ -306,16 +277,13 @@ selection t v =
(keep,discard) = partition (mightMatchRow v) r (keep,discard) = partition (mightMatchRow v) r
_ -> Selection t v _ -> Selection t v
impossible :: LinValue -> LinValue
impossible = CommentedValue "impossible" impossible = CommentedValue "impossible"
mightMatchRow :: LinValue -> TableRow rhs -> Bool
mightMatchRow v (TableRow p _) = mightMatchRow v (TableRow p _) =
case p of case p of
WildPattern -> True WildPattern -> True
_ -> mightMatch v p _ -> mightMatch v p
mightMatch :: LinValue -> LinPattern -> Bool
mightMatch v p = mightMatch v p =
case v of case v of
ConcatValue _ _ -> False ConcatValue _ _ -> False
@@ -327,18 +295,16 @@ mightMatch v p =
RecordValue rv -> RecordValue rv ->
case p of case p of
RecordPattern rp -> RecordPattern rp ->
and [maybe False (`mightMatch` p) (proj v l) | RecordRow l p<-rp] and [maybe False (flip mightMatch p) (proj v l) | RecordRow l p<-rp]
_ -> False _ -> False
_ -> True _ -> True
patVars :: Patt -> [Ident]
patVars p = patVars p =
case p of case p of
PV x -> [x] PV x -> [x]
PAs x p -> x:patVars p PAs x p -> x:patVars p
_ -> collectPattOp patVars p _ -> collectPattOp patVars p
convType :: Term -> LinType
convType = ppT convType = ppT
where where
ppT t = ppT t =
@@ -350,9 +316,9 @@ convType = ppT
        Sort k -> convSort k
        -- EInt n -> tcon0 (identS ("({-"++show n++"-})")) -- type level numeric literal
        FV (t:ts) -> ppT t -- !!
        QC (m,n) -> ParamType (ParamTypeId (gQId m n))
        Q (m,n) -> ParamType (ParamTypeId (gQId m n))
        _ -> error $ "convType ppT: " ++ show t
    convFields = map convField . filter (not.isLockLabel.fst)
    convField (l,r) = RecordRow (lblId l) (ppT r)
@@ -361,20 +327,15 @@ convType = ppT
        "Float" -> FloatType
        "Int" -> IntType
        "Str" -> StrType
        _ -> error $ "convType convSort: " ++ show k
toParamType :: Term -> ParamType
toParamType t = case convType t of
  ParamType pt -> pt
  _ -> error $ "toParamType: " ++ show t
toParamId :: Term -> ParamId
toParamId t = case toParamType t of
  ParamTypeId p -> p
paramType :: G.Grammar
-> (ModuleName, Ident)
-> ((S.Set (ModuleName, Ident), S.Set QIdent), [ParamDef])
paramType gr q@(_,n) =
  case lookupOrigInfo gr q of
    Ok (m,ResParam (Just (L _ ps)) _)
@@ -382,7 +343,7 @@ paramType gr q@(_,n) =
      ((S.singleton (m,n),argTypes ps),
       [ParamDef name (map (param m) ps)]
      )
      where name = gQId m n
    Ok (m,ResOper _ (Just (L _ t)))
      | m==cPredef && n==cInts ->
        ((S.empty,S.empty),[]) {-
@@ -390,46 +351,36 @@ paramType gr q@(_,n) =
[Type (ConAp ((gQId m n)) [identS "n"]) (TId (identS "Int"))])-} [Type (ConAp ((gQId m n)) [identS "n"]) (TId (identS "Int"))])-}
      | otherwise ->
        ((S.singleton (m,n),paramTypes gr t),
         [ParamAliasDef (gQId m n) (convType t)])
    _ -> ((S.empty,S.empty),[])
  where
    param m (n,ctx) = Param (gQId m n) [toParamId t|(_,_,t)<-ctx]
    argTypes = S.unions . map argTypes1
    argTypes1 (n,ctx) = S.unions [paramTypes gr t|(_,_,t)<-ctx]
lblId :: Label -> C.LabelId
lblId (LIdent ri) = LabelId ri
lblId (LVar i) = LabelId (rawIdentS (show i)) -- hmm

modId :: ModuleName -> C.ModId
modId (MN m) = ModId (ident2raw m)

class FromIdent i where
  gId :: Ident -> i

instance FromIdent VarId where
  gId i = if isWildIdent i then Anonymous else VarId (ident2raw i)

instance FromIdent C.FunId where gId = C.FunId . ident2raw
instance FromIdent CatId where gId = CatId . ident2raw
instance FromIdent ParamId where gId = ParamId . unqual
instance FromIdent VarValueId where gId = VarValueId . unqual

class FromIdent i => QualIdent i where
  gQId :: ModuleName -> Ident -> i

instance QualIdent ParamId where gQId m n = ParamId (qual m n)
instance QualIdent VarValueId where gQId m n = VarValueId (qual m n)

qual :: ModuleName -> Ident -> QualId
qual m n = Qual (modId m) (ident2raw n)

unqual :: Ident -> QualId
unqual n = Unqual (ident2raw n)

convFlags :: G.Grammar -> ModuleName -> Flags
convFlags gr mn =
  Flags [(rawIdentS n,convLit v) |
         (n,v)<-err (const []) (optionsPGF.mflags) (lookupModule gr mn)]
  where
    convLit l =


@@ -0,0 +1,468 @@
module GF.Compile.GrammarToLPGF (mkCanon2lpgf) where
import LPGF (LPGF (..))
import qualified LPGF as L
import PGF.CId
import GF.Grammar.Grammar
import qualified GF.Grammar.Canonical as C
import GF.Compile.GrammarToCanonical (grammar2canonical)
import GF.Data.Operations (ErrorMonad (..))
import qualified GF.Data.IntMapBuilder as IntMapBuilder
import GF.Infra.Option (Options)
import GF.Infra.UseIO (IOE)
import GF.Text.Pretty (pp, render)
import Control.Applicative ((<|>))
import Control.Monad (when, forM, forM_)
import qualified Control.Monad.State.Strict as CMS
import Data.List (elemIndex)
import qualified Data.List as L
import qualified Data.Map.Strict as Map
import Data.Maybe (fromJust, isJust)
import Data.Text (Text)
import qualified Data.Text as T
import System.Environment (lookupEnv)
import System.FilePath ((</>), (<.>))
import Text.Printf (printf)
import qualified Debug.Trace
trace x = Debug.Trace.trace ("> " ++ show x) (return ())
mkCanon2lpgf :: Options -> SourceGrammar -> ModuleName -> IOE LPGF
mkCanon2lpgf opts gr am = do
debug <- isJust <$> lookupEnv "DEBUG"
when debug $ do
ppCanonical debugDir canon
dumpCanonical debugDir canon
(an,abs) <- mkAbstract ab
cncs <- mapM (mkConcrete debug ab) cncs
let lpgf = LPGF {
L.absname = an,
L.abstract = abs,
L.concretes = Map.fromList cncs
}
when debug $ ppLPGF debugDir lpgf
return lpgf
where
canon@(C.Grammar ab cncs) = grammar2canonical opts am gr
mkAbstract :: (ErrorMonad err) => C.Abstract -> err (CId, L.Abstract)
mkAbstract (C.Abstract modId flags cats funs) = return (mdi2i modId, L.Abstract {})
mkConcrete :: (ErrorMonad err) => Bool -> C.Abstract -> C.Concrete -> err (CId, L.Concrete)
mkConcrete debug (C.Abstract _ _ _ funs) (C.Concrete modId absModId flags params0 lincats lindefs0) = do
let
-- Some transformations on canonical grammar
params :: [C.ParamDef]
params = inlineParamAliases params0
lindefs :: [C.LinDef]
lindefs =
[ C.LinDef funId varIds linValue'
| (C.LinDef funId varIds linValue) <- lindefs0
, let Right linType = lookupLinType funId
, let linValue' = cleanupRecordFields linValue linType
]
-- Filter out record fields from definitions which don't appear in lincat.
-- Workaround for https://github.com/GrammaticalFramework/gf-core/issues/101
cleanupRecordFields :: C.LinValue -> C.LinType -> C.LinValue
cleanupRecordFields (C.RecordValue rrvs) (C.RecordType rrs) =
let defnFields = Map.fromList [ (lid, lt) | (C.RecordRow lid lt) <- rrs ]
in C.RecordValue
[ C.RecordRow lid lv'
| C.RecordRow lid lv <- rrvs
, Map.member lid defnFields
, let Just lt = Map.lookup lid defnFields
, let lv' = cleanupRecordFields lv lt
]
cleanupRecordFields lv _ = lv
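A flat, self-contained sketch of the filtering idea above (hypothetical `cleanupFlat` over plain label/value pairs; the real `cleanupRecordFields` additionally recurses into nested record values):

```haskell
import qualified Data.Map.Strict as Map

-- Hypothetical sketch: keep only the (label, value) rows whose label
-- also occurs among the fields of the lincat type.
cleanupFlat :: [(String, Int)] -> [String] -> [(String, Int)]
cleanupFlat rows typeFields =
  [ row | row@(lbl, _) <- rows, Map.member lbl typeSet ]
  where typeSet = Map.fromList [ (l, ()) | l <- typeFields ]

main :: IO ()
main = print (cleanupFlat [("s", 1), ("lock_", 2)] ["s"])  -- [("s",1)]
```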
-- Builds maps for lookups
paramValueMap :: Map.Map C.ParamId C.ParamDef -- constructor -> definition
paramValueMap = Map.fromList [ (v,d) | d@(C.ParamDef _ vs) <- params, (C.Param v _) <- vs ]
lincatMap :: Map.Map C.CatId C.LincatDef
lincatMap = Map.fromList [ (cid,d) | d@(C.LincatDef cid _) <- lincats ]
funMap :: Map.Map C.FunId C.FunDef
funMap = Map.fromList [ (fid,d) | d@(C.FunDef fid _) <- funs ]
-- | Lookup paramdef, providing dummy fallback when not found
-- Workaround for https://github.com/GrammaticalFramework/gf-core/issues/100
lookupParamDef :: C.ParamId -> Either String C.ParamDef
lookupParamDef pid = case Map.lookup pid paramValueMap of
Just d -> Right d
Nothing ->
-- Left $ printf "Cannot find param definition: %s" (show pid)
Right $ C.ParamDef (C.ParamId (C.Unqual "DUMMY")) [C.Param pid []]
-- | Lookup lintype for a function
lookupLinType :: C.FunId -> Either String C.LinType
lookupLinType funId = do
fun <- m2e (printf "Cannot find type for: %s" (show funId)) (Map.lookup funId funMap)
let (C.FunDef _ (C.Type _ (C.TypeApp catId _))) = fun
lincat <- m2e (printf "Cannot find lincat for: %s" (show catId)) (Map.lookup catId lincatMap)
let (C.LincatDef _ lt) = lincat
return lt
-- | Lookup lintype for a function's argument
lookupLinTypeArg :: C.FunId -> Int -> Either String C.LinType
lookupLinTypeArg funId argIx = do
fun <- m2e (printf "Cannot find type for: %s" (show funId)) (Map.lookup funId funMap)
let (C.FunDef _ (C.Type args _)) = fun
let (C.TypeBinding _ (C.Type _ (C.TypeApp catId _))) = args !! argIx
lincat <- m2e (printf "Cannot find lincat for: %s" (show catId)) (Map.lookup catId lincatMap)
let (C.LincatDef _ lt) = lincat
return lt
-- Code generation
-- | Main code generation function
mkLin :: C.LinDef -> CodeGen (CId, L.LinFun)
mkLin (C.LinDef funId varIds linValue) = do
-- when debug $ trace funId
(lf, _) <- val2lin' linValue --skip memoisation at top level
return (fi2i funId, lf)
where
val2lin :: C.LinValue -> CodeGen (L.LinFun, Maybe C.LinType)
val2lin lv@(C.TableValue _ _) = do
-- val2lin lv@(C.ParamConstant _) = do
m <- CMS.get
case Map.lookup lv m of
Just r -> return r
Nothing -> do
r <- val2lin' lv
CMS.modify (Map.insert lv r)
return r
val2lin lv = val2lin' lv
val2lin' :: C.LinValue -> CodeGen (L.LinFun, Maybe C.LinType)
val2lin' lv = case lv of
C.ConcatValue v1 v2 -> do
(v1',t1) <- val2lin v1
(v2',t2) <- val2lin v2
return (L.Concat v1' v2', t1 <|> t2) -- t1 else t2
C.LiteralValue ll -> case ll of
C.FloatConstant f -> return (L.Token $ T.pack $ show f, Just C.FloatType)
C.IntConstant i -> return (L.Token $ T.pack $ show i, Just C.IntType)
C.StrConstant s -> return (L.Token $ T.pack s, Just C.StrType)
C.ErrorValue err -> return (L.Error err, Nothing)
C.ParamConstant (C.Param pid lvs) -> do
let
collectProjections :: C.LinValue -> CodeGen [L.LinFun]
collectProjections (C.ParamConstant (C.Param pid lvs)) = do
def <- CMS.lift $ lookupParamDef pid
let (C.ParamDef tpid defpids) = def
pidIx <- CMS.lift $ eitherElemIndex pid [ p | C.Param p _ <- defpids ]
rest <- mapM collectProjections lvs
return $ L.Ix (pidIx+1) : concat rest
collectProjections lv = do
(lf,_) <- val2lin lv
return [lf]
lfs <- collectProjections lv
let term = L.Tuple lfs
def <- CMS.lift $ lookupParamDef pid
let (C.ParamDef tpid _) = def
return (term, Just $ C.ParamType (C.ParamTypeId tpid))
C.PredefValue (C.PredefId pid) -> case pid of
"BIND" -> return (L.Bind, Nothing)
"SOFT_BIND" -> return (L.Bind, Nothing)
"SOFT_SPACE" -> return (L.Space, Nothing)
"CAPIT" -> return (L.Capit, Nothing)
"ALL_CAPIT" -> return (L.AllCapit, Nothing)
_ -> CMS.lift $ Left $ printf "Unknown predef function: %s" pid
C.RecordValue rrvs -> do
let rrvs' = sortRecordRows rrvs
ts <- sequence [ val2lin lv | C.RecordRow lid lv <- rrvs' ]
return (L.Tuple (map fst ts), Just $ C.RecordType [ C.RecordRow lid lt | (C.RecordRow lid _, (_, Just lt)) <- zip rrvs' ts])
C.TableValue lt trvs -> do
-- group the rows by "left-most" value
let
groupRow :: C.TableRowValue -> C.TableRowValue -> Bool
groupRow (C.TableRow p1 _) (C.TableRow p2 _) = groupPattern p1 p2
groupPattern :: C.LinPattern -> C.LinPattern -> Bool
groupPattern p1 p2 = case (p1,p2) of
(C.ParamPattern (C.Param pid1 _), C.ParamPattern (C.Param pid2 _)) -> pid1 == pid2 -- compare only constructors
(C.RecordPattern (C.RecordRow lid1 patt1:_), C.RecordPattern (C.RecordRow lid2 patt2:_)) -> groupPattern patt1 patt2 -- lid1 == lid2 necessarily
_ -> error $ printf "Mismatched patterns in grouping:\n%s\n%s" (show p1) (show p2)
grps :: [[C.TableRowValue]]
grps = L.groupBy groupRow trvs
-- remove one level of depth and recurse
let
handleGroup :: [C.TableRowValue] -> CodeGen (L.LinFun, Maybe C.LinType)
handleGroup [C.TableRow patt lv] =
case reducePattern patt of
Just patt' -> do
(lf,lt) <- handleGroup [C.TableRow patt' lv]
return (L.Tuple [lf],lt)
Nothing -> val2lin lv
handleGroup rows = do
let rows' = map reduceRow rows
val2lin (C.TableValue lt rows') -- lt is wrong here, but is unused
reducePattern :: C.LinPattern -> Maybe C.LinPattern
reducePattern patt =
case patt of
C.ParamPattern (C.Param _ []) -> Nothing
C.ParamPattern (C.Param _ patts) -> Just $ C.ParamPattern (C.Param pid' patts')
where
C.ParamPattern (C.Param pid1 patts1) = head patts
pid' = pid1
patts' = patts1 ++ tail patts
C.RecordPattern [] -> Nothing
C.RecordPattern (C.RecordRow lid patt:rrs) ->
case reducePattern patt of
Just patt' -> Just $ C.RecordPattern (C.RecordRow lid patt':rrs)
Nothing -> if null rrs then Nothing else Just $ C.RecordPattern rrs
_ -> error $ printf "Unhandled pattern in reducing: %s" (show patt)
reduceRow :: C.TableRowValue -> C.TableRowValue
reduceRow (C.TableRow patt lv) =
let Just patt' = reducePattern patt
in C.TableRow patt' lv
-- ts :: [(L.LinFun, Maybe C.LinType)]
ts <- mapM handleGroup grps
-- return
let typ = case ts of
(_, Just tst):_ -> Just $ C.TableType lt tst
_ -> Nothing
return (L.Tuple (map fst ts), typ)
-- TODO TuplePattern, WildPattern?
C.TupleValue lvs -> do
ts <- mapM val2lin lvs
return (L.Tuple (map fst ts), Just $ C.TupleType (map (fromJust.snd) ts))
C.VariantValue [] -> return (L.Empty, Nothing) -- TODO Just C.StrType ?
C.VariantValue (vr:_) -> val2lin vr -- NOTE variants not supported, just pick first
C.VarValue (C.VarValueId (C.Unqual v)) -> do
ix <- CMS.lift $ eitherElemIndex (C.VarId v) varIds
lt <- CMS.lift $ lookupLinTypeArg funId ix
return (L.Argument (ix+1), Just lt)
C.PreValue pts df -> do
pts' <- forM pts $ \(pfxs, lv) -> do
(lv', _) <- val2lin lv
return (map T.pack pfxs, lv')
(df', lt) <- val2lin df
return (L.Pre pts' df', lt)
C.Projection v1 lblId -> do
(v1', mtyp) <- val2lin v1
-- find label index in argument type
let Just (C.RecordType rrs) = mtyp
let rrs' = [ lid | C.RecordRow lid _ <- rrs ]
-- lblIx <- eitherElemIndex lblId rrs'
let
lblIx = case eitherElemIndex lblId rrs' of
Right x -> x
Left _ -> 0 -- corresponds to Prelude.False
-- lookup lintype for record row
let C.RecordRow _ lt = rrs !! lblIx
return (L.Projection v1' (L.Ix (lblIx+1)), Just lt)
C.Selection v1 v2 -> do
(v1', t1) <- val2lin v1
(v2', t2) <- val2lin v2
let Just (C.TableType t11 t12) = t1 -- t11 == t2
return (L.Projection v1' v2', Just t12)
-- C.CommentedValue cmnt lv -> val2lin lv
C.CommentedValue cmnt lv -> case cmnt of
"impossible" -> val2lin lv >>= \(_, typ) -> return (L.Empty, typ)
_ -> val2lin lv
v -> CMS.lift $ Left $ printf "val2lin not implemented for: %s" (show v)
-- Invoke code generation
let e = flip CMS.evalStateT Map.empty $ mapM mkLin lindefs
case e of
Left err -> raise err
Right lins -> do
let maybeOptimise = if debug then id else extractStrings
let concr = maybeOptimise $ L.Concrete {
L.toks = IntMapBuilder.emptyIntMap,
L.lins = Map.fromList lins
}
return (mdi2i modId, concr)
type CodeGen a = CMS.StateT (Map.Map C.LinValue (L.LinFun, Maybe C.LinType)) (Either String) a
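The `CodeGen` state threads a `Map` from already-seen `LinValue`s to their compiled results, over an `Either String` error monad; `val2lin` consults it before falling back to `val2lin'` (per the commit history, the gains are only mild). A generic, self-contained sketch of that caching pattern (names here are illustrative, not from the codebase):

```haskell
import qualified Data.Map.Strict as Map
import qualified Control.Monad.State.Strict as CMS

-- Generic memoisation in a State monad: look the input up in the cached
-- Map, otherwise compute the result and store it before returning.
memo :: Ord k => (k -> CMS.State (Map.Map k v) v) -> k -> CMS.State (Map.Map k v) v
memo f k = do
  m <- CMS.get
  case Map.lookup k m of
    Just v  -> return v
    Nothing -> do
      v <- f k
      CMS.modify (Map.insert k v)
      return v

main :: IO ()
main = print (CMS.evalState (mapM (memo (return . (* 2))) [3, 3, 4]) Map.empty)
-- prints [6,6,8]
```

Using `CMS.modify` rather than a get/put pair matches the "Use modify instead of insert" change noted in the commit log.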
-- | Remove ParamAliasDefs by inlining their definitions
inlineParamAliases :: [C.ParamDef] -> [C.ParamDef]
inlineParamAliases defs = if null aliases then defs else map rp' pdefs
where
(aliases,pdefs) = L.partition isParamAliasDef defs
rp' :: C.ParamDef -> C.ParamDef
rp' (C.ParamDef pid pids) = C.ParamDef pid (map rp'' pids)
rp' (C.ParamAliasDef _ _) = error "inlineParamAliases called on ParamAliasDef" -- impossible
rp'' :: C.ParamValueDef -> C.ParamValueDef
rp'' (C.Param pid pids) = C.Param pid (map rp''' pids)
rp''' :: C.ParamId -> C.ParamId
rp''' pid = case L.find (\(C.ParamAliasDef p _) -> p == pid) aliases of
Just (C.ParamAliasDef _ (C.ParamType (C.ParamTypeId p))) -> p
_ -> pid
-- | Always put 's' record field first, then sort alphabetically.
-- Workaround for https://github.com/GrammaticalFramework/gf-core/issues/102
-- Based on GF.Grammar.Macros.sortRec
sortRecordRows :: [C.RecordRowValue] -> [C.RecordRowValue]
sortRecordRows = L.sortBy ordLabel
where
ordLabel (C.RecordRow (C.LabelId l1) _) (C.RecordRow (C.LabelId l2) _) =
case (l1,l2) of
("s",_) -> LT
(_,"s") -> GT
(s1,s2) -> compare s1 s2
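The ordering can be exercised in isolation; this standalone sketch restates the comparator over plain strings:

```haskell
import Data.List (sortBy)

-- Standalone restatement of the label ordering in sortRecordRows:
-- the "s" field always sorts first, all other labels alphabetically.
ordLabel :: String -> String -> Ordering
ordLabel "s" "s" = EQ
ordLabel "s" _   = LT
ordLabel _   "s" = GT
ordLabel l1  l2  = compare l1 l2

main :: IO ()
main = print (sortBy ordLabel ["g", "s", "a"])  -- ["s","a","g"]
```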
-- sortRecord :: C.LinValue -> C.LinValue
-- sortRecord (C.RecordValue rrvs) = C.RecordValue (sortRecordRows rrvs)
-- sortRecord lv = lv
isParamAliasDef :: C.ParamDef -> Bool
isParamAliasDef (C.ParamAliasDef _ _) = True
isParamAliasDef _ = False
isParamType :: C.LinType -> Bool
isParamType (C.ParamType _) = True
isParamType _ = False
isRecordType :: C.LinType -> Bool
isRecordType (C.RecordType _) = True
isRecordType _ = False
-- | Find all token strings, put them in a map and replace with token indexes
extractStrings :: L.Concrete -> L.Concrete
extractStrings concr = L.Concrete { L.toks = toks', L.lins = lins' }
where
imb = IntMapBuilder.fromIntMap (L.toks concr)
(lins',imb') = CMS.runState (go0 (L.lins concr)) imb
toks' = IntMapBuilder.toIntMap imb'
go0 :: Map.Map CId L.LinFun -> CMS.State (IntMapBuilder.IMB Text) (Map.Map CId L.LinFun)
go0 mp = do
xs <- mapM (\(cid,lin) -> go lin >>= \lin' -> return (cid,lin')) (Map.toList mp)
return $ Map.fromList xs
go :: L.LinFun -> CMS.State (IntMapBuilder.IMB Text) L.LinFun
go lf = case lf of
L.Token str -> do
imb <- CMS.get
let (ix,imb') = IntMapBuilder.insert' str imb
CMS.put imb'
return $ L.TokenIx ix
L.Pre pts df -> do
-- pts' <- mapM (\(pfxs,lv) -> go lv >>= \lv' -> return (pfxs,lv')) pts
pts' <- forM pts $ \(pfxs,lv) -> do
imb <- CMS.get
let str = T.pack $ show pfxs
let (ix,imb') = IntMapBuilder.insert' str imb
CMS.put imb'
lv' <- go lv
return (ix,lv')
df' <- go df
return $ L.PreIx pts' df'
L.Concat s t -> do
s' <- go s
t' <- go t
return $ L.Concat s' t'
L.Tuple ts -> do
ts' <- mapM go ts
return $ L.Tuple ts'
L.Projection t u -> do
t' <- go t
u' <- go u
return $ L.Projection t' u'
t -> return t
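`extractStrings` interns every `Token` into a token table and rewrites it as a `TokenIx`. The interning step can be sketched with a plain `Map String Int` standing in for the project's `IntMapBuilder` (hypothetical `intern`, not the real `insert'`):

```haskell
import qualified Data.Map.Strict as Map
import qualified Control.Monad.State.Strict as CMS

-- Sketch of string interning: return the existing index for a token,
-- or allocate the next fresh index and record it in the state Map.
intern :: String -> CMS.State (Map.Map String Int) Int
intern s = do
  m <- CMS.get
  case Map.lookup s m of
    Just ix -> return ix            -- token already has an index
    Nothing -> do
      let ix = Map.size m           -- next fresh index
      CMS.put (Map.insert s ix m)
      return ix

main :: IO ()
main = print (CMS.evalState (mapM intern ["the", "cat", "the"]) Map.empty)
-- prints [0,1,0]
```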
-- | Convert Maybe to Either value with error
m2e :: String -> Maybe a -> Either String a
m2e err = maybe (Left err) Right
-- | Wrap elemIndex into Either value
eitherElemIndex :: (Eq a, Show a) => a -> [a] -> Either String Int
eitherElemIndex x xs = m2e (printf "Cannot find: %s in %s" (show x) (show xs)) (elemIndex x xs)
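Restated standalone, the two helpers above behave as follows:

```haskell
import Data.List (elemIndex)
import Text.Printf (printf)

-- Same definitions as in the module, shown with example calls.
m2e :: String -> Maybe a -> Either String a
m2e err = maybe (Left err) Right

eitherElemIndex :: (Eq a, Show a) => a -> [a] -> Either String Int
eitherElemIndex x xs =
  m2e (printf "Cannot find: %s in %s" (show x) (show xs)) (elemIndex x xs)

main :: IO ()
main = do
  print (eitherElemIndex 'b' "abc")  -- Right 1
  print (eitherElemIndex 'z' "abc")  -- Left with the "Cannot find" message
```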
mdi2s :: C.ModId -> String
mdi2s (C.ModId i) = i
mdi2i :: C.ModId -> CId
mdi2i (C.ModId i) = mkCId i
fi2i :: C.FunId -> CId
fi2i (C.FunId i) = mkCId i
-- Debugging
debugDir :: FilePath
debugDir = "DEBUG"
-- | Pretty-print canonical grammars to file
ppCanonical :: FilePath -> C.Grammar -> IO ()
ppCanonical path (C.Grammar ab cncs) = do
let (C.Abstract modId flags cats funs) = ab
writeFile (path </> mdi2s modId <.> "canonical.gf") (render $ pp ab)
forM_ cncs $ \cnc@(C.Concrete modId absModId flags params lincats lindefs) ->
writeFile' (path </> mdi2s modId <.> "canonical.gf") (render $ pp cnc)
-- | Dump canonical grammars to file
dumpCanonical :: FilePath -> C.Grammar -> IO ()
dumpCanonical path (C.Grammar ab cncs) = do
let (C.Abstract modId flags cats funs) = ab
let body = unlines $ map show cats ++ [""] ++ map show funs
writeFile' (path </> mdi2s modId <.> "canonical.dump") body
forM_ cncs $ \(C.Concrete modId absModId flags params lincats lindefs) -> do
let body = unlines $ concat [
map show params,
[""],
map show lincats,
[""],
map show lindefs
]
writeFile' (path </> mdi2s modId <.> "canonical.dump") body
-- | Pretty-print LPGF to file
ppLPGF :: FilePath -> LPGF -> IO ()
ppLPGF path lpgf =
forM_ (Map.toList $ L.concretes lpgf) $ \(cid,concr) ->
writeFile' (path </> showCId cid <.> "lpgf.txt") (L.render $ L.pp concr)
-- | Dump LPGF to file
dumpLPGF :: FilePath -> LPGF -> IO ()
dumpLPGF path lpgf =
forM_ (Map.toList $ L.concretes lpgf) $ \(cid,concr) -> do
let body = unlines $ map show (Map.toList $ L.lins concr)
writeFile' (path </> showCId cid <.> "lpgf.dump") body
-- | Write a file and report it to console
writeFile' :: FilePath -> String -> IO ()
writeFile' p b = do
writeFile p b
putStrLn $ "Wrote " ++ p


@@ -21,7 +21,7 @@ import GF.Grammar.Printer
import GF.Grammar.Macros
import GF.Grammar.Lookup
import GF.Grammar.Predef
import GF.Compile.Compute.Concrete(GlobalEnv,normalForm,resourceValues)
import GF.Data.Operations
import GF.Infra.Option


@@ -22,7 +22,7 @@ import PGF.Internal
import GF.Data.Operations
import GF.Infra.Option
import Data.List(isPrefixOf,find,intercalate,intersperse,groupBy,sortBy)
import qualified Data.Map as Map

type Prefix = String -> String
@@ -34,12 +34,11 @@ grammar2haskell :: Options
  -> PGF
  -> String
grammar2haskell opts name gr = foldr (++++) [] $
  pragmas ++ haskPreamble gadt name derivingClause (extraImports ++ pgfImports) ++
  [types, gfinstances gId lexical gr'] ++ compos
  where gr' = hSkeleton gr
        gadt = haskellOption opts HaskellGADT
        dataExt = haskellOption opts HaskellData
        pgf2 = haskellOption opts HaskellPGF2
        lexical cat = haskellOption opts HaskellLexical && isLexicalCat opts cat
        gId | haskellOption opts HaskellNoPrefix = rmForbiddenChars
            | otherwise = ("G"++) . rmForbiddenChars
@@ -51,23 +50,21 @@ grammar2haskell opts name gr = foldr (++++) [] $
        derivingClause
          | dataExt = "deriving (Show,Data)"
          | otherwise = "deriving Show"
        extraImports | gadt = ["import Control.Monad.Identity", "import Control.Monad", "import Data.Monoid"]
                     | dataExt = ["import Data.Data"]
                     | otherwise = []
        pgfImports | pgf2 = ["import PGF2 hiding (Tree)", "", "showCId :: CId -> String", "showCId = id"]
                   | otherwise = ["import PGF hiding (Tree)"]
        types | gadt = datatypesGADT gId lexical gr'
              | otherwise = datatypes gId derivingClause lexical gr'
        compos | gadt = prCompos gId lexical gr' ++ composClass
               | otherwise = []
haskPreamble :: Bool -> String -> String -> [String] -> [String]
haskPreamble gadt name derivingClause imports =
  [
    "module " ++ name ++ " where",
    ""
  ] ++ imports ++ [
    "",
    "----------------------------------------------------",
    "-- automatic translation from GF to Haskell",
    "----------------------------------------------------",
@@ -88,11 +85,10 @@ haskPreamble gadt name derivingClause imports =
    ""
  ]
predefInst :: Bool -> String -> String -> String -> String -> String -> String
predefInst gadt derivingClause gtyp typ destr consr =
  (if gadt
     then []
     else "newtype" +++ gtyp +++ "=" +++ gtyp +++ typ +++ derivingClause ++ "\n\n"
  )
  ++
  "instance Gf" +++ gtyp +++ "where" ++++
@@ -107,10 +103,10 @@ type OIdent = String
type HSkeleton = [(OIdent, [(OIdent, [OIdent])])]

datatypes :: Prefix -> DerivingClause -> (OIdent -> Bool) -> (String,HSkeleton) -> String
datatypes gId derivingClause lexical = foldr (+++++) "" . filter (/="") . map (hDatatype gId derivingClause lexical) . snd

gfinstances :: Prefix -> (OIdent -> Bool) -> (String,HSkeleton) -> String
gfinstances gId lexical (m,g) = foldr (+++++) "" $ filter (/="") $ map (gfInstance gId lexical m) g

hDatatype :: Prefix -> DerivingClause -> (OIdent -> Bool) -> (OIdent, [(OIdent, [OIdent])]) -> String
@@ -135,7 +131,6 @@ nonLexicalRules True rules = [r | r@(f,t) <- rules, not (null t)]
lexicalConstructor :: OIdent -> String
lexicalConstructor cat = "Lex" ++ cat

predefTypeSkel :: HSkeleton
predefTypeSkel = [(c,[]) | c <- ["String", "Int", "Float"]]
-- GADT version of data types
@@ -208,12 +203,11 @@ prCompos gId lexical (_,catrules) =
    prRec f (v,c)
      | isList f = "`a` foldr (a . a (r (:)) . f) (r [])" +++ v
      | otherwise = "`a`" +++ "f" +++ v
    isList f = gId "List" `isPrefixOf` f

gfInstance :: Prefix -> (OIdent -> Bool) -> String -> (OIdent, [(OIdent, [OIdent])]) -> String
gfInstance gId lexical m crs = hInstance gId lexical m crs ++++ fInstance gId lexical m crs
hInstance :: (String -> String) -> (String -> Bool) -> String -> (String, [(OIdent, [OIdent])]) -> String
----hInstance m ("Cn",_) = "" --- seems to belong to an old applic. AR 18/5/2004
hInstance gId _ m (cat,[]) = unlines [
  "instance Show" +++ gId cat,
@@ -225,7 +219,7 @@ hInstance gId _ m (cat,[]) = unlines [
hInstance gId lexical m (cat,rules)
  | isListCat (cat,rules) =
    "instance Gf" +++ gId cat +++ "where" ++++
    " gf (" ++ gId cat +++ "[" ++ intercalate "," baseVars ++ "])"
    +++ "=" +++ mkRHS ("Base"++ec) baseVars ++++
    " gf (" ++ gId cat +++ "(x:xs)) = "
    ++ mkRHS ("Cons"++ec) ["x",prParenth (gId cat+++"xs")]
@@ -239,15 +233,12 @@ hInstance gId lexical m (cat,rules)
    ec = elemCat cat
    baseVars = mkVars (baseSize (cat,rules))
    mkInst f xx = let xx' = mkVars (length xx) in " gf " ++
      (if null xx then gId f else prParenth (gId f +++ foldr1 (+++) xx')) +++
      "=" +++ mkRHS f xx'
    mkRHS f vars = "mkApp (mkCId \"" ++ f ++ "\")" +++
      "[" ++ prTList ", " ["gf" +++ x | x <- vars] ++ "]"

mkVars :: Int -> [String]
mkVars = mkSVars "x"

mkSVars :: String -> Int -> [String]
mkSVars s n = [s ++ show i | i <- [1..n]]
----fInstance m ("Cn",_) = "" ---
@@ -266,16 +257,15 @@ fInstance gId lexical m (cat,rules) =
      " Just (i," ++
      "[" ++ prTList "," xx' ++ "])" +++
      "| i == mkCId \"" ++ f ++ "\" ->" +++ mkRHS f xx'
  where
    xx' = ["x" ++ show i | (_,i) <- zip xx [1..]]
    mkRHS f vars
      | isList =
          if "Base" `isPrefixOf` f
          then "[" ++ prTList ", " [ "fg" +++ x | x <- vars ] ++ "]"
          else "fg" +++ (vars !! 0) +++ ":" +++ "fgs" +++ (vars !! 1)
      | otherwise =
          gId f +++
          prTList " " [prParenth ("fg" +++ x) | x <- vars]
--type HSkeleton = [(OIdent, [(OIdent, [OIdent])])]
hSkeleton :: PGF -> (String,HSkeleton)
@@ -284,7 +274,7 @@ hSkeleton gr =
  let fs =
        [(showCId c, [(showCId f, map showCId cs) | (f, (cs,_)) <- fs]) |
          fs@((_, (_,c)):_) <- fns]
  in fs ++ [(sc, []) | c <- cts, let sc = showCId c, sc `notElem` (["Int", "Float", "String"] ++ map fst fs)]
  )
  where
    cts = Map.keys (cats (abstract gr))
@@ -301,10 +291,9 @@ updateSkeleton cat skel rule =
-}

isListCat :: (OIdent, [(OIdent, [OIdent])]) -> Bool
isListCat (cat,rules) = "List" `isPrefixOf` cat && length rules == 2
                        && ("Base"++c) `elem` fs && ("Cons"++c) `elem` fs
  where
    c = elemCat cat
    fs = map fst rules
-- | Gets the element category of a list category.
elemCat :: OIdent -> OIdent
@@ -348,3 +337,4 @@ composClass =
  "",
  "newtype C b a = C { unC :: b }"
]


@@ -23,9 +23,9 @@
-----------------------------------------------------------------------------
module GF.Compile.Rename (
  renameSourceTerm,
  renameModule
) where

import GF.Infra.Ident
import GF.Infra.CheckM
@@ -39,7 +39,6 @@ import GF.Data.Operations
import Control.Monad
import Data.List (nub,(\\))
import qualified Data.List as L
import qualified Data.Map as Map
import Data.Maybe(mapMaybe)
import GF.Text.Pretty
@@ -106,26 +105,7 @@ renameIdentTerm' env@(act,imps) t0 =
    ts@(t:_) -> do checkWarn ("atomic term" <+> ppTerm Qualified 0 t0 $$
                              "conflict" <+> hsep (punctuate ',' (map (ppTerm Qualified 0) ts)) $$
                              "given" <+> fsep (punctuate ',' (map fst qualifs)))
                   return (bestTerm ts) -- Heuristic for resource grammar. Returns t for all others.
      where
        -- Hotfix for https://github.com/GrammaticalFramework/gf-core/issues/56
        -- Real bug is probably somewhere deeper in recognising excluded functions. /IL 2020-06-06
        notFromCommonModule :: Term -> Bool
        notFromCommonModule term =
          let t = render $ ppTerm Qualified 0 term :: String
          in not $ any (\moduleName -> moduleName `L.isPrefixOf` t)
                   ["CommonX", "ConstructX", "ExtendFunctor"
                   ,"MarkHTMLX", "ParamX", "TenseX", "TextX"]

        -- If one of the terms comes from the common modules,
        -- we choose the other one, because that's defined in the grammar.
        bestTerm :: [Term] -> Term
        bestTerm [] = error "constant not found" -- not reached: bestTerm is only called for case ts@(t:_)
        bestTerm ts@(t:_) =
          let notCommon = [t | t <- ts, notFromCommonModule t]
          in case notCommon of
               [] -> t   -- All terms are from common modules, return first of original list
               (u:_) -> u -- ≥1 terms are not from common modules, return first of those
info2status :: Maybe ModuleName -> Ident -> Info -> StatusInfo info2status :: Maybe ModuleName -> Ident -> Info -> StatusInfo
info2status mq c i = case i of info2status mq c i = case i of


@@ -13,11 +13,11 @@
 -----------------------------------------------------------------------------
 module GF.Compile.TypeCheck.Abstract (-- * top-level type checking functions; TC should not be called directly.
     checkContext,
     checkTyp,
     checkDef,
     checkConstrs,
   ) where

 import GF.Data.Operations


@@ -1,7 +1,6 @@
 {-# LANGUAGE PatternGuards #-}
-module GF.Compile.TypeCheck.Concrete( checkLType, inferLType, computeLType, ppType ) where
-import Prelude hiding ((<>)) -- GHC 8.4.1 clash with Text.PrettyPrint
+module GF.Compile.TypeCheck.Concrete( {-checkLType, inferLType, computeLType, ppType-} ) where
+{-
 import GF.Infra.CheckM
 import GF.Data.Operations
@@ -23,16 +22,10 @@ computeLType gr g0 t = comp (reverse [(b,x, Vr x) | (b,x,_) <- g0] ++ g0) t
     _ | Just _ <- isTypeInts ty -> return ty ---- shouldn't be needed
       | isPredefConstant ty     -> return ty ---- shouldn't be needed

-    Q (m,ident) -> checkIn ("module" <+> m) $ do
+    Q (m,ident) -> checkIn (text "module" <+> ppIdent m) $ do
       ty' <- lookupResDef gr (m,ident)
       if ty' == ty then return ty else comp g ty' --- is this necessary to test?

-    AdHocOverload ts -> do
-      over <- getOverload gr g (Just typeType) t
-      case over of
-        Just (tr,_) -> return tr
-        _ -> checkError ("unresolved overloading of constants" <+> ppTerm Qualified 0 t)
-
     Vr ident -> checkLookup ident g -- never needed to compute!

     App f a -> do
@@ -69,6 +62,7 @@ computeLType gr g0 t = comp (reverse [(b,x, Vr x) | (b,x,_) <- g0] ++ g0) t
       lockRecType c t' ---- locking to be removed AR 20/6/2009

     _ | ty == typeTok -> return typeStr
+    _ | isPredefConstant ty -> return ty
     _ -> composOp (comp g) ty
@@ -79,26 +73,26 @@ inferLType gr g trm = case trm of
   Q (m,ident) | isPredef m -> termWith trm $ case typPredefined ident of
     Just ty -> return ty
-    Nothing -> checkError ("unknown in Predef:" <+> ident)
+    Nothing -> checkError (text "unknown in Predef:" <+> ppIdent ident)

   Q ident -> checks [
     termWith trm $ lookupResType gr ident >>= computeLType gr g
     ,
     lookupResDef gr ident >>= inferLType gr g
     ,
-    checkError ("cannot infer type of constant" <+> ppTerm Unqualified 0 trm)
+    checkError (text "cannot infer type of constant" <+> ppTerm Unqualified 0 trm)
     ]

   QC (m,ident) | isPredef m -> termWith trm $ case typPredefined ident of
     Just ty -> return ty
-    Nothing -> checkError ("unknown in Predef:" <+> ident)
+    Nothing -> checkError (text "unknown in Predef:" <+> ppIdent ident)

   QC ident -> checks [
     termWith trm $ lookupResType gr ident >>= computeLType gr g
     ,
     lookupResDef gr ident >>= inferLType gr g
     ,
-    checkError ("cannot infer type of canonical constant" <+> ppTerm Unqualified 0 trm)
+    checkError (text "cannot infer type of canonical constant" <+> ppTerm Unqualified 0 trm)
     ]

   Vr ident -> termWith trm $ checkLookup ident g
@@ -106,12 +100,7 @@ inferLType gr g trm = case trm of
   Typed e t -> do
     t' <- computeLType gr g t
     checkLType gr g e t'
-    return (e,t')
-
-  AdHocOverload ts -> do
-    over <- getOverload gr g Nothing trm
-    case over of
-      Just trty -> return trty
-      _ -> checkError ("unresolved overloading of constants" <+> ppTerm Qualified 0 trm)

   App f a -> do
     over <- getOverload gr g Nothing trm
@@ -127,11 +116,7 @@ inferLType gr g trm = case trm of
             then return val
             else substituteLType [(bt,z,a')] val
           return (App f' a',ty)
-        _ ->
-          let term = ppTerm Unqualified 0 f
-              funName = pp . head . words .render $ term
-          in checkError ("A function type is expected for" <+> term <+> "instead of type" <+> ppType fty $$
-                         "\n ** Maybe you gave too many arguments to" <+> funName <+> "\n")
+        _ -> checkError (text "A function type is expected for" <+> ppTerm Unqualified 0 f <+> text "instead of type" <+> ppType fty)

   S f x -> do
     (f', fty) <- inferLType gr g f
@@ -139,7 +124,7 @@ inferLType gr g trm = case trm of
       Table arg val -> do
         x'<- justCheck g x arg
         return (S f' x', val)
-      _ -> checkError ("table lintype expected for the table in" $$ nest 2 (ppTerm Unqualified 0 trm))
+      _ -> checkError (text "table lintype expected for the table in" $$ nest 2 (ppTerm Unqualified 0 trm))

   P t i -> do
     (t',ty) <- inferLType gr g t --- ??
@@ -147,16 +132,16 @@ inferLType gr g trm = case trm of
     let tr2 = P t' i
     termWith tr2 $ case ty' of
       RecType ts -> case lookup i ts of
-        Nothing -> checkError ("unknown label" <+> i <+> "in" $$ nest 2 (ppTerm Unqualified 0 ty'))
+        Nothing -> checkError (text "unknown label" <+> ppLabel i <+> text "in" $$ nest 2 (ppTerm Unqualified 0 ty'))
         Just x  -> return x
-      _ -> checkError ("record type expected for:" <+> ppTerm Unqualified 0 t $$
-                       " instead of the inferred:" <+> ppTerm Unqualified 0 ty')
+      _ -> checkError (text "record type expected for:" <+> ppTerm Unqualified 0 t $$
+                       text " instead of the inferred:" <+> ppTerm Unqualified 0 ty')

   R r -> do
     let (ls,fs) = unzip r
     fsts <- mapM inferM fs
     let ts = [ty | (Just ty,_) <- fsts]
-    checkCond ("cannot infer type of record" $$ nest 2 (ppTerm Unqualified 0 trm)) (length ts == length fsts)
+    checkCond (text "cannot infer type of record" $$ nest 2 (ppTerm Unqualified 0 trm)) (length ts == length fsts)
     return $ (R (zip ls fsts), RecType (zip ls ts))

   T (TTyped arg) pts -> do
@@ -168,7 +153,7 @@ inferLType gr g trm = case trm of
   T ti pts -> do -- tries to guess: good in oper type inference
     let pts' = [pt | pt@(p,_) <- pts, isConstPatt p]
     case pts' of
-      [] -> checkError ("cannot infer table type of" <+> ppTerm Unqualified 0 trm)
+      [] -> checkError (text "cannot infer table type of" <+> ppTerm Unqualified 0 trm)
 ---- PInt k : _ -> return $ Ints $ max [i | PInt i <- pts']
       _ -> do
         (arg,val) <- checks $ map (inferCase Nothing) pts'
@@ -202,7 +187,7 @@ inferLType gr g trm = case trm of
   ---- hack from Rename.identRenameTerm, to live with files with naming conflicts 18/6/2007
   Strs (Cn c : ts) | c == cConflict -> do
-    checkWarn ("unresolved constant, could be any of" <+> hcat (map (ppTerm Unqualified 0) ts))
+    checkWarn (text "unresolved constant, could be any of" <+> hcat (map (ppTerm Unqualified 0) ts))
     inferLType gr g (head ts)

   Strs ts -> do
@@ -223,25 +208,19 @@ inferLType gr g trm = case trm of
     return (RecType (zip ls ts'), typeType)

   ExtR r s -> do
-    --- over <- getOverload gr g Nothing r
-    --- let r1 = maybe r fst over
-    let r1 = r ---
-    (r',rT) <- inferLType gr g r1
+    (r',rT) <- inferLType gr g r
     rT' <- computeLType gr g rT
     (s',sT) <- inferLType gr g s
     sT' <- computeLType gr g sT
     let trm' = ExtR r' s'
-    ---- trm' <- plusRecord r' s'
     case (rT', sT') of
       (RecType rs, RecType ss) -> do
-        let rt = RecType ([field | field@(l,_) <- rs, notElem l (map fst ss)] ++ ss) -- select types of later fields
+        rt <- plusRecType rT' sT'
         checkLType gr g trm' rt ---- return (trm', rt)
-      _ | rT' == typeType && sT' == typeType -> do
-        return (trm', typeType)
-      _ -> checkError ("records or record types expected in" <+> ppTerm Unqualified 0 trm)
+      _ | rT' == typeType && sT' == typeType -> return (trm', typeType)
+      _ -> checkError (text "records or record types expected in" <+> ppTerm Unqualified 0 trm)

   Sort _ ->
     termWith trm $ return typeType
@@ -273,7 +252,7 @@ inferLType gr g trm = case trm of
       ty' <- lockRecType c ty ---- lookup c; remove lock AR 20/6/2009
       return $ (ELin c trm', ty')

-  _ -> checkError ("cannot infer lintype of" <+> ppTerm Unqualified 0 trm)
+  _ -> checkError (text "cannot infer lintype of" <+> ppTerm Unqualified 0 trm)

  where
   isPredef m = elem m [cPredef,cPredefAbs]
@@ -320,6 +299,7 @@ inferLType gr g trm = case trm of
     PChars _ -> return $ typeStr
     _ -> inferLType gr g (patt2term p) >>= return . snd

+-- type inference: Nothing, type checking: Just t
 -- the latter permits matching with value type
 getOverload :: SourceGrammar -> Context -> Maybe Type -> Term -> Check (Maybe (Term,Type))
@@ -330,21 +310,8 @@ getOverload gr g mt ot = case appForm ot of
       v <- matchOverload f typs ttys
       return $ Just v
     _ -> return Nothing
-  (AdHocOverload cs@(f:_), ts) -> do --- the function name f is only used in error messages
-    let typs = concatMap collectOverloads cs
-    ttys <- mapM (inferLType gr g) ts
-    v <- matchOverload f typs ttys
-    return $ Just v
   _ -> return Nothing
 where
-  collectOverloads tr@(Q c) = case lookupOverload gr c of
-    Ok typs -> typs
-    _ -> case lookupResType gr c of
-      Ok ty -> let (args,val) = typeFormCnc ty in [(map (\(b,x,t) -> t) args,(val,tr))]
-      _ -> []
-  collectOverloads _ = [] --- constructors QC
-
   matchOverload f typs ttys = do
     let (tts,tys) = unzip ttys
     let vfs = lookupOverloadInstance tys typs
@@ -362,26 +329,25 @@ getOverload gr g mt ot = case appForm ot of
       case ([vf | (vf,True) <- matches],[vf | (vf,False) <- matches]) of
         ([(_,val,fun)],_) -> return (mkApp fun tts, val)
         ([],[(pre,val,fun)]) -> do
-          checkWarn $ "ignoring lock fields in resolving" <+> ppTerm Unqualified 0 ot $$
-            "for" $$
+          checkWarn $ text "ignoring lock fields in resolving" <+> ppTerm Unqualified 0 ot $$
+            text "for" $$
             nest 2 (showTypes tys) $$
-            "using" $$
+            text "using" $$
             nest 2 (showTypes pre)
           return (mkApp fun tts, val)
         ([],[]) -> do
-          checkError $ "no overload instance of" <+> ppTerm Qualified 0 f $$
-            maybe empty (\x -> "with value type" <+> ppType x) mt $$
-            "for argument list" $$
-            nest 2 stysError $$
-            "among alternatives" $$
-            nest 2 (vcat stypsError)
+          checkError $ text "no overload instance of" <+> ppTerm Unqualified 0 f $$
+            text "for" $$
+            nest 2 stysError $$
+            text "among" $$
+            nest 2 (vcat stypsError) $$
+            maybe empty (\x -> text "with value type" <+> ppType x) mt
         (vfs1,vfs2) -> case (noProds vfs1,noProds vfs2) of
           ([(val,fun)],_) -> do
             return (mkApp fun tts, val)
           ([],[(val,fun)]) -> do
-            checkWarn ("ignoring lock fields in resolving" <+> ppTerm Unqualified 0 ot)
+            checkWarn (text "ignoring lock fields in resolving" <+> ppTerm Unqualified 0 ot)
             return (mkApp fun tts, val)
 ----- unsafely exclude irritating warning AR 24/5/2008
@@ -389,22 +355,16 @@ getOverload gr g mt ot = case appForm ot of
 ----- "resolved by excluding partial applications:" ++++
 ----- unlines [prtType env ty | (ty,_) <- vfs', not (noProd ty)]
-          --- now forgiving ambiguity with a warning AR 1/2/2014
-          -- This gives ad hoc overloading the same behaviour as the choice of the first match in renaming did before.
-          -- But it also gives a chance to ambiguous overloadings that were banned before.
-          (nps1,nps2) -> do
-            checkWarn $ "ambiguous overloading of" <+> ppTerm Unqualified 0 f <+>
-              ---- "with argument types" <+> hsep (map (ppTerm Qualified 0) tys) $$
-              "resolved by selecting the first of the alternatives" $$
-              nest 2 (vcat [ppTerm Qualified 0 fun | (_,ty,fun) <- vfs1 ++ if null vfs1 then vfs2 else []])
-            case [(mkApp fun tts,val) | (val,fun) <- nps1 ++ nps2] of
-              [] -> checkError $ "no alternatives left when resolving" <+> ppTerm Unqualified 0 f
-              h:_ -> return h
+          _ -> checkError $ text "ambiguous overloading of" <+> ppTerm Unqualified 0 f <+>
+               text "for" <+> hsep (map ppType tys) $$
+               text "with alternatives" $$
+               nest 2 (vcat [ppType ty | (_,ty,_) <- if null vfs1 then vfs2 else vfs2])

   matchVal mt v = elem mt [Nothing,Just v,Just (unlocked v)]

   unlocked v = case v of
-    RecType fs -> RecType $ filter (not . isLockLabel . fst) (sortRec fs)
+    RecType fs -> RecType $ filter (not . isLockLabel . fst) fs
     _ -> v
 ---- TODO: accept subtypes
 ---- TODO: use a trie
@@ -425,6 +385,7 @@ getOverload gr g mt ot = case appForm ot of
 checkLType :: SourceGrammar -> Context -> Term -> Type -> Check (Term, Type)
 checkLType gr g trm typ0 = do
   typ <- computeLType gr g typ0

   case trm of
@@ -434,12 +395,10 @@ checkLType gr g trm typ0 = do
       Prod bt' z a b -> do
         (c',b') <- if isWildIdent z
                      then checkLType gr ((bt,x,a):g) c b
-                     else do b' <- checkIn (pp "abs") $ substituteLType [(bt',z,Vr x)] b
+                     else do b' <- checkIn (text "abs") $ substituteLType [(bt',z,Vr x)] b
                              checkLType gr ((bt,x,a):g) c b'
-        return $ (Abs bt x c', Prod bt' z a b')
-      _ -> checkError $ "function type expected instead of" <+> ppType typ $$
-             "\n ** Double-check that the type signature of the operation" $$
-             "matches the number of arguments given to it.\n"
+        return $ (Abs bt x c', Prod bt' x a b')
+      _ -> checkError $ text "function type expected instead of" <+> ppType typ

   App f a -> do
     over <- getOverload gr g (Just typ) trm
@@ -449,12 +408,6 @@ checkLType gr g trm typ0 = do
         (trm',ty') <- inferLType gr g trm
         termWith trm' $ checkEqLType gr g typ ty' trm'

-  AdHocOverload ts -> do
-    over <- getOverload gr g Nothing trm
-    case over of
-      Just trty -> return trty
-      _ -> checkError ("unresolved overloading of constants" <+> ppTerm Qualified 0 trm)
-
   Q _ -> do
     over <- getOverload gr g (Just typ) trm
     case over of
@@ -464,7 +417,7 @@ checkLType gr g trm typ0 = do
         termWith trm' $ checkEqLType gr g typ ty' trm'

   T _ [] ->
-    checkError ("found empty table in type" <+> ppTerm Unqualified 0 typ)
+    checkError (text "found empty table in type" <+> ppTerm Unqualified 0 typ)
   T _ cs -> case typ of
     Table arg val -> do
       case allParamValues gr arg of
@@ -473,12 +426,12 @@ checkLType gr g trm typ0 = do
           ps <- testOvershadow ps0 vs
           if null ps
             then return ()
-            else checkWarn ("patterns never reached:" $$
+            else checkWarn (text "patterns never reached:" $$
                             nest 2 (vcat (map (ppPatt Unqualified 0) ps)))
         _ -> return () -- happens with variable types
       cs' <- mapM (checkCase arg val) cs
       return (T (TTyped arg) cs', typ)
-    _ -> checkError $ "table type expected for table instead of" $$ nest 2 (ppType typ)
+    _ -> checkError $ text "table type expected for table instead of" $$ nest 2 (ppType typ)

   V arg0 vs ->
     case typ of
       Table arg1 val ->
@@ -486,54 +439,51 @@ checkLType gr g trm typ0 = do
         vs1 <- allParamValues gr arg1
         if length vs1 == length vs
           then return ()
-          else checkError $ "wrong number of values in table" <+> ppTerm Unqualified 0 trm
+          else checkError $ text "wrong number of values in table" <+> ppTerm Unqualified 0 trm
         vs' <- map fst `fmap` sequence [checkLType gr g v val|v<-vs]
         return (V arg' vs',typ)

   R r -> case typ of --- why needed? because inference may be too difficult
     RecType rr -> do
-      --let (ls,_) = unzip rr -- labels of expected type
+      let (ls,_) = unzip rr -- labels of expected type
       fsts <- mapM (checkM r) rr -- check that they are found in the record
       return $ (R fsts, typ)     -- normalize record
-    _ -> checkError ("record type expected in type checking instead of" $$ nest 2 (ppTerm Unqualified 0 typ))
+    _ -> checkError (text "record type expected in type checking instead of" $$ nest 2 (ppTerm Unqualified 0 typ))

   ExtR r s -> case typ of
     _ | typ == typeType -> do
       trm' <- computeLType gr g trm
       case trm' of
-        RecType _ -> termWith trm' $ return typeType
-        ExtR (Vr _) (RecType _) -> termWith trm' $ return typeType
+        RecType _ -> termWith trm $ return typeType
+        ExtR (Vr _) (RecType _) -> termWith trm $ return typeType
                                    -- ext t = t ** ...
-        _ -> checkError ("invalid record type extension" <+> nest 2 (ppTerm Unqualified 0 trm))
+        _ -> checkError (text "invalid record type extension" <+> nest 2 (ppTerm Unqualified 0 trm))

     RecType rr -> do
-      ll2 <- case s of
-        R ss -> return $ map fst ss
-        _ -> do
-          (s',typ2) <- inferLType gr g s
-          case typ2 of
-            RecType ss -> return $ map fst ss
-            _ -> checkError ("cannot get labels from" $$ nest 2 (ppTerm Unqualified 0 typ2))
-      let ll1 = [l | (l,_) <- rr, notElem l ll2]
-      --- over <- getOverload gr g Nothing r --- this would solve #66 but fail ParadigmsAra. AR 6/7/2020
-      --- let r1 = maybe r fst over
-      let r1 = r ---
-      (r',_) <- checkLType gr g r1 (RecType [field | field@(l,_) <- rr, elem l ll1])
-      (s',_) <- checkLType gr g s (RecType [field | field@(l,_) <- rr, elem l ll2])
-      let rec = R ([(l,(Nothing,P r' l)) | l <- ll1] ++ [(l,(Nothing,P s' l)) | l <- ll2])
-      return (rec, typ)
+      (r',ty,s') <- checks [
+         do (r',ty) <- inferLType gr g r
+            return (r',ty,s)
+         ,
+         do (s',ty) <- inferLType gr g s
+            return (s',ty,r)
+         ]
+      case ty of
+        RecType rr1 -> do
+          let (rr0,rr2) = recParts rr rr1
+          r2 <- justCheck g r' rr0
+          s2 <- justCheck g s' rr2
+          return $ (ExtR r2 s2, typ)
+        _ -> checkError (text "record type expected in extension of" <+> ppTerm Unqualified 0 r $$
+                         text "but found" <+> ppTerm Unqualified 0 ty)

     ExtR ty ex -> do
       r' <- justCheck g r ty
       s' <- justCheck g s ex
       return $ (ExtR r' s', typ) --- is this all? it assumes the same division in trm and typ

-    _ -> checkError ("record extension not meaningful for" <+> ppTerm Unqualified 0 typ)
+    _ -> checkError (text "record extension not meaningful for" <+> ppTerm Unqualified 0 typ)

   FV vs -> do
     ttys <- mapM (flip (checkLType gr g) typ) vs
@@ -548,7 +498,7 @@ checkLType gr g trm typ0 = do
           (arg',val) <- checkLType gr g arg p
           checkEqLType gr g typ t trm
           return (S tab' arg', t)
-        _ -> checkError ("table type expected for applied table instead of" <+> ppType ty')
+        _ -> checkError (text "table type expected for applied table instead of" <+> ppType ty')
     , do
       (arg',ty) <- inferLType gr g arg
       ty' <- computeLType gr g ty
@@ -557,8 +507,7 @@ checkLType gr g trm typ0 = do
     ]

   Let (x,(mty,def)) body -> case mty of
     Just ty -> do
-      (ty0,_) <- checkLType gr g ty typeType
-      (def',ty') <- checkLType gr g def ty0
+      (def',ty') <- checkLType gr g def ty
       body' <- justCheck ((Explicit,x,ty'):g) body typ
       return (Let (x,(Just ty',def')) body', typ)
     _ -> do
@@ -574,10 +523,10 @@ checkLType gr g trm typ0 = do
     termWith trm' $ checkEqLType gr g typ ty' trm'
 where
   justCheck g ty te = checkLType gr g ty te >>= return . fst

-{-
   recParts rr t = (RecType rr1,RecType rr2) where
     (rr1,rr2) = partition (flip elem (map fst t) . fst) rr
--}

   checkM rms (l,ty) = case lookup l rms of
     Just (Just ty0,t) -> do
       checkEqLType gr g ty ty0 t
@@ -589,9 +538,9 @@ checkLType gr g trm typ0 = do
     _ -> checkError $
            if isLockLabel l
              then let cat = drop 5 (showIdent (label2ident l))
-                  in ppTerm Unqualified 0 (R rms) <+> "is not in the lincat of" <+> cat <>
-                     "; try wrapping it with lin" <+> cat
-             else "cannot find value for label" <+> l <+> "in" <+> ppTerm Unqualified 0 (R rms)
+                  in ppTerm Unqualified 0 (R rms) <+> text "is not in the lincat of" <+> text cat <>
+                     text "; try wrapping it with lin" <+> text cat
+             else text "cannot find value for label" <+> ppLabel l <+> text "in" <+> ppTerm Unqualified 0 (R rms)

   checkCase arg val (p,t) = do
     cont <- pattContext gr g arg p
@@ -604,7 +553,7 @@ pattContext env g typ p = case p of
   PP (q,c) ps | q /= cPredef -> do ---- why this /=? AR 6/1/2006
     t <- lookupResType env (q,c)
     let (cont,v) = typeFormCnc t
-    checkCond ("wrong number of arguments for constructor in" <+> ppPatt Unqualified 0 p)
+    checkCond (text "wrong number of arguments for constructor in" <+> ppPatt Unqualified 0 p)
               (length cont == length ps)
     checkEqLType env g typ v (patt2term p)
     mapM (\((_,_,ty),p) -> pattContext env g ty p) (zip cont ps) >>= return . concat
@@ -615,7 +564,7 @@ pattContext env g typ p = case p of
       let pts = [(ty,tr) | (l,tr) <- r, Just ty <- [lookup l t]]
 ----- checkWarn $ prt p ++++ show pts ----- debug
       mapM (uncurry (pattContext env g)) pts >>= return . concat
-    _ -> checkError ("record type expected for pattern instead of" <+> ppTerm Unqualified 0 typ')
+    _ -> checkError (text "record type expected for pattern instead of" <+> ppTerm Unqualified 0 typ')

   PT t p' -> do
     checkEqLType env g typ t (patt2term p')
     pattContext env g typ p'
@@ -629,9 +578,9 @@ pattContext env g typ p = case p of
     g2 <- pattContext env g typ q
     let pts = nub ([x | pt@(_,x,_) <- g1, notElem pt g2] ++ [x | pt@(_,x,_) <- g2, notElem pt g1])
     checkCond
-      ("incompatible bindings of" <+>
-       fsep pts <+>
-       "in pattern alterantives" <+> ppPatt Unqualified 0 p) (null pts)
+      (text "incompatible bindings of" <+>
+       fsep (map ppIdent pts) <+>
+       text "in pattern alterantives" <+> ppPatt Unqualified 0 p) (null pts)
     return g1 -- must be g1 == g2

   PSeq p q -> do
     g1 <- pattContext env g typ p
@@ -645,7 +594,7 @@ pattContext env g typ p = case p of
   noBind typ p' = do
     co <- pattContext env g typ p'
     if not (null co)
-      then checkWarn ("no variable bound inside pattern" <+> ppPatt Unqualified 0 p)
+      then checkWarn (text "no variable bound inside pattern" <+> ppPatt Unqualified 0 p)
            >> return []
       else return []
@@ -654,31 +603,9 @@ checkEqLType gr g t u trm = do
   (b,t',u',s) <- checkIfEqLType gr g t u trm
   case b of
     True  -> return t'
-    False ->
-      let inferredType = ppTerm Qualified 0 u
-          expectedType = ppTerm Qualified 0 t
-          term = ppTerm Unqualified 0 trm
-          funName = pp . head . words .render $ term
-          helpfulMsg =
-            case (arrows inferredType, arrows expectedType) of
-              (0,0) -> pp "" -- None of the types is a function
-              _ -> "\n **" <+>
-                   if expectedType `isLessApplied` inferredType
-                     then "Maybe you gave too few arguments to" <+> funName
-                     else pp "Double-check that type signature and number of arguments match."
-      in checkError $ s <+> "type of" <+> term $$
-           "expected:" <+> expectedType $$ -- ppqType t u $$
-           "inferred:" <+> inferredType $$ -- ppqType u t
-           helpfulMsg
-  where
-    -- count the number of arrows in the prettyprinted term
-    arrows :: Doc -> Int
-    arrows = length . filter (=="->") . words . render
-    -- If prettyprinted type t has fewer arrows then prettyprinted type u,
-    -- then t is "less applied", and we can print out more helpful error msg.
-    isLessApplied :: Doc -> Doc -> Bool
-    isLessApplied t u = arrows t < arrows u
+    False -> checkError $ text s <+> text "type of" <+> ppTerm Unqualified 0 trm $$
+               text "expected:" <+> ppType t $$
+               text "inferred:" <+> ppType u

 checkIfEqLType :: SourceGrammar -> Context -> Type -> Type -> Term -> Check (Bool,Type,Type,String)
 checkIfEqLType gr g t u trm = do
@@ -690,13 +617,13 @@ checkIfEqLType gr g t u trm = do
     --- better: use a flag to forgive? (AR 31/1/2006)
     _ -> case missingLock [] t' u' of
       Ok lo -> do
-        checkWarn $ "missing lock field" <+> fsep lo
+        checkWarn $ text "missing lock field" <+> fsep (map ppLabel lo)
         return (True,t',u',[])
       Bad s -> return (False,t',u',s)
 where
-  -- check that u is a subtype of t
+  -- t is a subtype of u
   --- quick hack version of TC.eqVal
   alpha g t u = case (t,u) of
@@ -708,13 +635,12 @@ checkIfEqLType gr g t u trm = do
     -- record subtyping
     (RecType rs, RecType ts) -> all (\ (l,a) ->
-      any (\ (k,b) -> l == k && alpha g a b) ts) rs
+      any (\ (k,b) -> alpha g a b && l == k) ts) rs
     (ExtR r s, ExtR r' s') -> alpha g r r' && alpha g s s'
     (ExtR r s, t) -> alpha g r t || alpha g s t

     -- the following say that Ints n is a subset of Int and of Ints m >= n
-    -- But why does it also allow Int as a subtype of Ints m? /TH 2014-04-04
-    (t,u) | Just m <- isTypeInts t, Just n <- isTypeInts u -> m >= n
+    (t,u) | Just m <- isTypeInts t, Just n <- isTypeInts t -> m >= n
           | Just _ <- isTypeInts t, u == typeInt -> True ---- check size!
           | t == typeInt, Just _ <- isTypeInts u -> True ---- why this ???? AR 11/12/2005
@@ -729,8 +655,7 @@ checkIfEqLType gr g t u trm = do
     (Q (m,a), QC (n,b)) | a == b -> elem m (allExtendsPlus gr n)
                                  || elem n (allExtendsPlus gr m)

-    -- contravariance
-    (Table a b, Table c d) -> alpha g c a && alpha g b d
+    (Table a b, Table c d) -> alpha g a c && alpha g b d
     (Vr x, Vr y) -> x == y || elem (x,y) g || elem (y,x) g
     _ -> t == u
 --- the following should be one-way coercions only. AR 4/1/2001
@@ -745,7 +670,7 @@ checkIfEqLType gr g t u trm = do
                   not (any (\ (k,b) -> alpha g a b && l == k) ts)]
         (locks,others) = partition isLockLabel ls
     in case others of
-      _:_ -> Bad $ render ("missing record fields:" <+> fsep (punctuate ',' (others)))
+      _:_ -> Bad $ render (text "missing record fields:" <+> fsep (punctuate comma (map ppLabel others)))
       _ -> return locks

   -- contravariance
   (Prod _ x a b, Prod _ y c d) -> do
@@ -783,18 +708,14 @@ ppType :: Type -> Doc
ppType ty =
case ty of
RecType fs -> case filter isLockLabel $ map fst fs of
-[lock] -> text (drop 5 (showIdent (label2ident lock)))
+[lock] -> pp (drop 5 (showIdent (label2ident lock)))
_ -> ppTerm Unqualified 0 ty
-Prod _ x a b -> ppType a <+> text "->" <+> ppType b
+Prod _ x a b -> ppType a <+> "->" <+> ppType b
_ -> ppTerm Unqualified 0 ty
{-
ppqType :: Type -> Type -> Doc
ppqType t u = case (ppType t, ppType u) of
(pt,pu) | render pt == render pu -> ppTerm Qualified 0 t
(pt,_) -> pt
-}
checkLookup :: Ident -> Context -> Check Type
checkLookup x g =
case [ty | (b,y,ty) <- g, x == y] of
-[] -> checkError (text "unknown variable" <+> ppIdent x)
+[] -> checkError ("unknown variable" <+> x)
(ty:_) -> return ty
-}
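The contravariant cases in the `alpha` check above (note the swapped argument comparison `alpha g c a` for `Table` and `Prod`) can be illustrated in isolation. The toy `Ty` type and `subOf` function below are a hypothetical sketch, not GF's actual `Term` type:

```haskell
-- Toy model of the subtyping direction used in alpha: subOf t u answers
-- "may a value of type u be used where type t is expected?"
data Ty = TStr | TInts Int | Fun Ty Ty deriving (Eq, Show)

subOf :: Ty -> Ty -> Bool
subOf t u = case (t, u) of
  (TInts m, TInts n) -> m >= n                  -- Ints n is a subset of Ints m when m >= n
  (Fun a b, Fun c d) -> subOf c a && subOf b d  -- argument comparison flipped: contravariance
  _                  -> t == u

main :: IO ()
main = do
  -- a function accepting Ints 5 works where one accepting Ints 3 is expected
  print (subOf (Fun (TInts 3) TStr) (Fun (TInts 5) TStr))
  -- but not the other way around
  print (subOf (Fun (TInts 5) TStr) (Fun (TInts 3) TStr))
```

The flip in the argument position is exactly what the `Table`/`Prod` fix in the diff restores.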

View File

@@ -10,7 +10,7 @@ import GF.Grammar hiding (Env, VGen, VApp, VRecType)
import GF.Grammar.Lookup
import GF.Grammar.Predef
import GF.Grammar.Lockfield
-import GF.Compile.Compute.ConcreteNew
+import GF.Compile.Compute.Concrete
import GF.Compile.Compute.Predef(predef,predefName)
import GF.Infra.CheckM
import GF.Data.Operations
@@ -396,7 +396,7 @@ tcRecTypeFields ge scope ((l,ty):rs) mb_ty = do
return ((l,ty):rs,mb_ty)
-- | Invariant: if the third argument is (Just rho),
-- then rho is in weak-prenex form
instSigma :: GlobalEnv -> Scope -> Term -> Sigma -> Maybe Rho -> TcM (Term, Rho)
instSigma ge scope t ty1 Nothing = return (t,ty1) -- INST1
instSigma ge scope t ty1 (Just ty2) = do -- INST2
@@ -568,9 +568,9 @@ unifyVar ge scope i env vs ty2 = do -- Check whether i is bound
Bound ty1 -> do v <- liftErr (eval ge env ty1)
unify ge scope (vapply (geLoc ge) v vs) ty2
Unbound scope' _ -> case value2term (geLoc ge) (scopeVars scope') ty2 of
-Left i -> let (v,_) = reverse scope !! i
-          in tcError ("Variable" <+> pp v <+> "has escaped")
-Right ty2' -> do ms2 <- getMetaVars (geLoc ge) [(scope,ty2)]
+-- Left i -> let (v,_) = reverse scope !! i
+--           in tcError ("Variable" <+> pp v <+> "has escaped")
+ty2' -> do ms2 <- getMetaVars (geLoc ge) [(scope,ty2)]
if i `elem` ms2
then tcError ("Occurs check for" <+> ppMeta i <+> "in:" $$
nest 2 (ppTerm Unqualified 0 ty2'))
@@ -631,8 +631,8 @@ allBinders = [ identS [x] | x <- ['a'..'z'] ] ++
type Scope = [(Ident,Value)]
type Sigma = Value
type Rho = Value -- No top-level ForAll
type Tau = Value -- No ForAlls anywhere
data MetaValue
= Unbound Scope Sigma
@@ -644,7 +644,7 @@ data TcResult a
newtype TcM a = TcM {unTcM :: MetaStore -> [Message] -> TcResult a}
instance Monad TcM where
-return x = TcM (\ms msgs -> TcOk x ms msgs)
+return = pure
f >>= g = TcM (\ms msgs -> case unTcM f ms msgs of
TcOk x ms msgs -> unTcM (g x) ms msgs
TcFail msgs -> TcFail msgs)
@@ -659,7 +659,7 @@ instance Fail.MonadFail TcM where
instance Applicative TcM where
-pure = return
+pure x = TcM (\ms msgs -> TcOk x ms msgs)
(<*>) = ap
instance Functor TcM where
@@ -724,8 +724,8 @@ getMetaVars loc sc_tys = do
go (Vr tv) acc = acc
go (App x y) acc = go x (go y acc)
go (Meta i) acc
| i `elem` acc = acc
| otherwise = i : acc
go (Q _) acc = acc
go (QC _) acc = acc
go (Sort _) acc = acc
@@ -742,9 +742,9 @@ getFreeVars loc sc_tys = do
return (foldr (go []) [] tys)
where
go bound (Vr tv) acc
| tv `elem` bound = acc
| tv `elem` acc = acc
| otherwise = tv : acc
go bound (App x y) acc = go bound x (go bound y acc)
go bound (Meta _) acc = acc
go bound (Q _) acc = acc
@@ -765,9 +765,9 @@ zonkTerm (Meta i) = do
zonkTerm t = composOp zonkTerm t
tc_value2term loc xs v =
-case value2term loc xs v of
-  Left i -> tcError ("Variable #" <+> pp i <+> "has escaped")
-  Right t -> return t
+return $ value2term loc xs v
+-- Old value2term error message:
+-- Left i -> tcError ("Variable #" <+> pp i <+> "has escaped")
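The `return`/`pure` swap in the `TcM` diff above is the usual post-AMP cleanup: the defining lambda moves into `pure`, and `return` becomes an alias. A minimal state-threading monad in the same shape, with a hypothetical `Tc` type rather than GF's actual `TcM`:

```haskell
import Control.Monad (ap)

-- A counter-threading monad shaped like TcM: state in, result plus state out.
newtype Tc a = Tc { runTc :: Int -> (a, Int) }

instance Functor Tc where
  fmap f (Tc g) = Tc (\s -> let (x, s') = g s in (f x, s'))

instance Applicative Tc where
  pure x = Tc (\s -> (x, s))   -- pure carries the definition ...
  (<*>)  = ap

instance Monad Tc where
  return = pure                -- ... and return is just an alias (AMP style)
  Tc g >>= f = Tc (\s -> let (x, s') = g s in runTc (f x) s')

-- draw a fresh number from the threaded counter
fresh :: Tc Int
fresh = Tc (\s -> (s, s + 1))

main :: IO ()
main = print (fst (runTc (do a <- fresh; b <- fresh; return (a, b)) 0))
```

Defining `pure` as primary avoids the circular `pure = return` / `return = pure` pair that older code could fall into once `Applicative` became a superclass of `Monad`.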

View File

@@ -0,0 +1,801 @@
{-# LANGUAGE PatternGuards #-}
module GF.Compile.TypeCheck.RConcrete( checkLType, inferLType, computeLType, ppType ) where
import Prelude hiding ((<>)) -- GHC 8.4.1 clash with Text.PrettyPrint
import GF.Infra.CheckM
import GF.Data.Operations
import GF.Grammar
import GF.Grammar.Lookup
import GF.Grammar.Predef
import GF.Grammar.PatternMatch
import GF.Grammar.Lockfield (isLockLabel, lockRecType, unlockRecord)
import GF.Compile.TypeCheck.Primitives
import Data.List
import Control.Monad
import GF.Text.Pretty
computeLType :: SourceGrammar -> Context -> Type -> Check Type
computeLType gr g0 t = comp (reverse [(b,x, Vr x) | (b,x,_) <- g0] ++ g0) t
where
comp g ty = case ty of
_ | Just _ <- isTypeInts ty -> return ty ---- shouldn't be needed
| isPredefConstant ty -> return ty ---- shouldn't be needed
Q (m,ident) -> checkIn ("module" <+> m) $ do
ty' <- lookupResDef gr (m,ident)
if ty' == ty then return ty else comp g ty' --- is this necessary to test?
AdHocOverload ts -> do
over <- getOverload gr g (Just typeType) t
case over of
Just (tr,_) -> return tr
_ -> checkError ("unresolved overloading of constants" <+> ppTerm Qualified 0 t)
Vr ident -> checkLookup ident g -- never needed to compute!
App f a -> do
f' <- comp g f
a' <- comp g a
case f' of
Abs b x t -> comp ((b,x,a'):g) t
_ -> return $ App f' a'
Prod bt x a b -> do
a' <- comp g a
b' <- comp ((bt,x,Vr x) : g) b
return $ Prod bt x a' b'
Abs bt x b -> do
b' <- comp ((bt,x,Vr x):g) b
return $ Abs bt x b'
Let (x,(_,a)) b -> comp ((Explicit,x,a):g) b
ExtR r s -> do
r' <- comp g r
s' <- comp g s
case (r',s') of
(RecType rs, RecType ss) -> plusRecType r' s' >>= comp g
_ -> return $ ExtR r' s'
RecType fs -> do
let fs' = sortRec fs
liftM RecType $ mapPairsM (comp g) fs'
ELincat c t -> do
t' <- comp g t
lockRecType c t' ---- locking to be removed AR 20/6/2009
_ | ty == typeTok -> return typeStr
_ | isPredefConstant ty -> return ty
_ -> composOp (comp g) ty
-- the underlying algorithms
inferLType :: SourceGrammar -> Context -> Term -> Check (Term, Type)
inferLType gr g trm = case trm of
Q (m,ident) | isPredef m -> termWith trm $ case typPredefined ident of
Just ty -> return ty
Nothing -> checkError ("unknown in Predef:" <+> ident)
Q ident -> checks [
termWith trm $ lookupResType gr ident >>= computeLType gr g
,
lookupResDef gr ident >>= inferLType gr g
,
checkError ("cannot infer type of constant" <+> ppTerm Unqualified 0 trm)
]
QC (m,ident) | isPredef m -> termWith trm $ case typPredefined ident of
Just ty -> return ty
Nothing -> checkError ("unknown in Predef:" <+> ident)
QC ident -> checks [
termWith trm $ lookupResType gr ident >>= computeLType gr g
,
lookupResDef gr ident >>= inferLType gr g
,
checkError ("cannot infer type of canonical constant" <+> ppTerm Unqualified 0 trm)
]
Vr ident -> termWith trm $ checkLookup ident g
Typed e t -> do
t' <- computeLType gr g t
checkLType gr g e t'
AdHocOverload ts -> do
over <- getOverload gr g Nothing trm
case over of
Just trty -> return trty
_ -> checkError ("unresolved overloading of constants" <+> ppTerm Qualified 0 trm)
App f a -> do
over <- getOverload gr g Nothing trm
case over of
Just trty -> return trty
_ -> do
(f',fty) <- inferLType gr g f
fty' <- computeLType gr g fty
case fty' of
Prod bt z arg val -> do
a' <- justCheck g a arg
ty <- if isWildIdent z
then return val
else substituteLType [(bt,z,a')] val
return (App f' a',ty)
_ ->
let term = ppTerm Unqualified 0 f
funName = pp . head . words .render $ term
in checkError ("A function type is expected for" <+> term <+> "instead of type" <+> ppType fty $$
"\n ** Maybe you gave too many arguments to" <+> funName <+> "\n")
S f x -> do
(f', fty) <- inferLType gr g f
case fty of
Table arg val -> do
x'<- justCheck g x arg
return (S f' x', val)
_ -> checkError ("table lintype expected for the table in" $$ nest 2 (ppTerm Unqualified 0 trm))
P t i -> do
(t',ty) <- inferLType gr g t --- ??
ty' <- computeLType gr g ty
let tr2 = P t' i
termWith tr2 $ case ty' of
RecType ts -> case lookup i ts of
Nothing -> checkError ("unknown label" <+> i <+> "in" $$ nest 2 (ppTerm Unqualified 0 ty'))
Just x -> return x
_ -> checkError ("record type expected for:" <+> ppTerm Unqualified 0 t $$
" instead of the inferred:" <+> ppTerm Unqualified 0 ty')
R r -> do
let (ls,fs) = unzip r
fsts <- mapM inferM fs
let ts = [ty | (Just ty,_) <- fsts]
checkCond ("cannot infer type of record" $$ nest 2 (ppTerm Unqualified 0 trm)) (length ts == length fsts)
return $ (R (zip ls fsts), RecType (zip ls ts))
T (TTyped arg) pts -> do
(_,val) <- checks $ map (inferCase (Just arg)) pts
checkLType gr g trm (Table arg val)
T (TComp arg) pts -> do
(_,val) <- checks $ map (inferCase (Just arg)) pts
checkLType gr g trm (Table arg val)
T ti pts -> do -- tries to guess: good in oper type inference
let pts' = [pt | pt@(p,_) <- pts, isConstPatt p]
case pts' of
[] -> checkError ("cannot infer table type of" <+> ppTerm Unqualified 0 trm)
---- PInt k : _ -> return $ Ints $ max [i | PInt i <- pts']
_ -> do
(arg,val) <- checks $ map (inferCase Nothing) pts'
checkLType gr g trm (Table arg val)
V arg pts -> do
(_,val) <- checks $ map (inferLType gr g) pts
-- return (trm, Table arg val) -- old, caused issue 68
checkLType gr g trm (Table arg val)
K s -> do
if elem ' ' s
then do
let ss = foldr C Empty (map K (words s))
----- removed irritating warning AR 24/5/2008
----- checkWarn ("token \"" ++ s ++
----- "\" converted to token list" ++ prt ss)
return (ss, typeStr)
else return (trm, typeStr)
EInt i -> return (trm, typeInt)
EFloat i -> return (trm, typeFloat)
Empty -> return (trm, typeStr)
C s1 s2 ->
check2 (flip (justCheck g) typeStr) C s1 s2 typeStr
Glue s1 s2 ->
check2 (flip (justCheck g) typeStr) Glue s1 s2 typeStr ---- typeTok
---- hack from Rename.identRenameTerm, to live with files with naming conflicts 18/6/2007
Strs (Cn c : ts) | c == cConflict -> do
checkWarn ("unresolved constant, could be any of" <+> hcat (map (ppTerm Unqualified 0) ts))
inferLType gr g (head ts)
Strs ts -> do
ts' <- mapM (\t -> justCheck g t typeStr) ts
return (Strs ts', typeStrs)
Alts t aa -> do
t' <- justCheck g t typeStr
aa' <- flip mapM aa (\ (c,v) -> do
c' <- justCheck g c typeStr
v' <- checks $ map (justCheck g v) [typeStrs, EPattType typeStr]
return (c',v'))
return (Alts t' aa', typeStr)
RecType r -> do
let (ls,ts) = unzip r
ts' <- mapM (flip (justCheck g) typeType) ts
return (RecType (zip ls ts'), typeType)
ExtR r s -> do
--- over <- getOverload gr g Nothing r
--- let r1 = maybe r fst over
let r1 = r ---
(r',rT) <- inferLType gr g r1
rT' <- computeLType gr g rT
(s',sT) <- inferLType gr g s
sT' <- computeLType gr g sT
let trm' = ExtR r' s'
case (rT', sT') of
(RecType rs, RecType ss) -> do
let rt = RecType ([field | field@(l,_) <- rs, notElem l (map fst ss)] ++ ss) -- select types of later fields
checkLType gr g trm' rt ---- return (trm', rt)
_ | rT' == typeType && sT' == typeType -> do
return (trm', typeType)
_ -> checkError ("records or record types expected in" <+> ppTerm Unqualified 0 trm)
Sort _ ->
termWith trm $ return typeType
Prod bt x a b -> do
a' <- justCheck g a typeType
b' <- justCheck ((bt,x,a'):g) b typeType
return (Prod bt x a' b', typeType)
Table p t -> do
p' <- justCheck g p typeType --- check p partype!
t' <- justCheck g t typeType
return $ (Table p' t', typeType)
FV vs -> do
(_,ty) <- checks $ map (inferLType gr g) vs
--- checkIfComplexVariantType trm ty
checkLType gr g trm ty
EPattType ty -> do
ty' <- justCheck g ty typeType
return (EPattType ty',typeType)
EPatt p -> do
ty <- inferPatt p
return (trm, EPattType ty)
ELin c trm -> do
(trm',ty) <- inferLType gr g trm
ty' <- lockRecType c ty ---- lookup c; remove lock AR 20/6/2009
return $ (ELin c trm', ty')
_ -> checkError ("cannot infer lintype of" <+> ppTerm Unqualified 0 trm)
where
isPredef m = elem m [cPredef,cPredefAbs]
justCheck g ty te = checkLType gr g ty te >>= return . fst
-- for record fields, which may be typed
inferM (mty, t) = do
(t', ty') <- case mty of
Just ty -> checkLType gr g t ty
_ -> inferLType gr g t
return (Just ty',t')
inferCase mty (patt,term) = do
arg <- maybe (inferPatt patt) return mty
cont <- pattContext gr g arg patt
(_,val) <- inferLType gr (reverse cont ++ g) term
return (arg,val)
isConstPatt p = case p of
PC _ ps -> True --- all isConstPatt ps
PP _ ps -> True --- all isConstPatt ps
PR ps -> all (isConstPatt . snd) ps
PT _ p -> isConstPatt p
PString _ -> True
PInt _ -> True
PFloat _ -> True
PChar -> True
PChars _ -> True
PSeq p q -> isConstPatt p && isConstPatt q
PAlt p q -> isConstPatt p && isConstPatt q
PRep p -> isConstPatt p
PNeg p -> isConstPatt p
PAs _ p -> isConstPatt p
_ -> False
inferPatt p = case p of
PP (q,c) ps | q /= cPredef -> liftM valTypeCnc (lookupResType gr (q,c))
PAs _ p -> inferPatt p
PNeg p -> inferPatt p
PAlt p q -> checks [inferPatt p, inferPatt q]
PSeq _ _ -> return $ typeStr
PRep _ -> return $ typeStr
PChar -> return $ typeStr
PChars _ -> return $ typeStr
_ -> inferLType gr g (patt2term p) >>= return . snd
-- type inference: Nothing, type checking: Just t
-- the latter permits matching with value type
getOverload :: SourceGrammar -> Context -> Maybe Type -> Term -> Check (Maybe (Term,Type))
getOverload gr g mt ot = case appForm ot of
(f@(Q c), ts) -> case lookupOverload gr c of
Ok typs -> do
ttys <- mapM (inferLType gr g) ts
v <- matchOverload f typs ttys
return $ Just v
_ -> return Nothing
(AdHocOverload cs@(f:_), ts) -> do --- the function name f is only used in error messages
let typs = concatMap collectOverloads cs
ttys <- mapM (inferLType gr g) ts
v <- matchOverload f typs ttys
return $ Just v
_ -> return Nothing
where
collectOverloads tr@(Q c) = case lookupOverload gr c of
Ok typs -> typs
_ -> case lookupResType gr c of
Ok ty -> let (args,val) = typeFormCnc ty in [(map (\(b,x,t) -> t) args,(val,tr))]
_ -> []
collectOverloads _ = [] --- constructors QC
matchOverload f typs ttys = do
let (tts,tys) = unzip ttys
let vfs = lookupOverloadInstance tys typs
let matches = [vf | vf@((_,v,_),_) <- vfs, matchVal mt v]
let showTypes ty = hsep (map ppType ty)
let (stys,styps) = (showTypes tys, [showTypes ty | (ty,_) <- typs])
-- to avoid strange error msg e.g. in case of unmatched record extension, show whole types if needed AR 28/1/2013
let (stysError,stypsError) = if elem (render stys) (map render styps)
then (hsep (map (ppTerm Unqualified 0) tys), [hsep (map (ppTerm Unqualified 0) ty) | (ty,_) <- typs])
else (stys,styps)
case ([vf | (vf,True) <- matches],[vf | (vf,False) <- matches]) of
([(_,val,fun)],_) -> return (mkApp fun tts, val)
([],[(pre,val,fun)]) -> do
checkWarn $ "ignoring lock fields in resolving" <+> ppTerm Unqualified 0 ot $$
"for" $$
nest 2 (showTypes tys) $$
"using" $$
nest 2 (showTypes pre)
return (mkApp fun tts, val)
([],[]) -> do
checkError $ "no overload instance of" <+> ppTerm Qualified 0 f $$
maybe empty (\x -> "with value type" <+> ppType x) mt $$
"for argument list" $$
nest 2 stysError $$
"among alternatives" $$
nest 2 (vcat stypsError)
(vfs1,vfs2) -> case (noProds vfs1,noProds vfs2) of
([(val,fun)],_) -> do
return (mkApp fun tts, val)
([],[(val,fun)]) -> do
checkWarn ("ignoring lock fields in resolving" <+> ppTerm Unqualified 0 ot)
return (mkApp fun tts, val)
----- unsafely exclude irritating warning AR 24/5/2008
----- checkWarn $ "overloading of" +++ prt f +++
----- "resolved by excluding partial applications:" ++++
----- unlines [prtType env ty | (ty,_) <- vfs', not (noProd ty)]
--- now forgiving ambiguity with a warning AR 1/2/2014
-- This gives ad hoc overloading the same behaviour as the choice of the first match in renaming did before.
-- But it also gives a chance to ambiguous overloadings that were banned before.
(nps1,nps2) -> do
checkWarn $ "ambiguous overloading of" <+> ppTerm Unqualified 0 f <+>
---- "with argument types" <+> hsep (map (ppTerm Qualified 0) tys) $$
"resolved by selecting the first of the alternatives" $$
nest 2 (vcat [ppTerm Qualified 0 fun | (_,ty,fun) <- vfs1 ++ if null vfs1 then vfs2 else []])
case [(mkApp fun tts,val) | (val,fun) <- nps1 ++ nps2] of
[] -> checkError $ "no alternatives left when resolving" <+> ppTerm Unqualified 0 f
h:_ -> return h
matchVal mt v = elem mt [Nothing,Just v,Just (unlocked v)]
unlocked v = case v of
RecType fs -> RecType $ filter (not . isLockLabel . fst) (sortRec fs)
_ -> v
---- TODO: accept subtypes
---- TODO: use a trie
lookupOverloadInstance tys typs =
[((pre,mkFunType rest val, t),isExact) |
let lt = length tys,
(ty,(val,t)) <- typs, length ty >= lt,
let (pre,rest) = splitAt lt ty,
let isExact = pre == tys,
isExact || map unlocked pre == map unlocked tys
]
noProds vfs = [(v,f) | (_,v,f) <- vfs, noProd v]
noProd ty = case ty of
Prod _ _ _ _ -> False
_ -> True
checkLType :: SourceGrammar -> Context -> Term -> Type -> Check (Term, Type)
checkLType gr g trm typ0 = do
typ <- computeLType gr g typ0
case trm of
Abs bt x c -> do
case typ of
Prod bt' z a b -> do
(c',b') <- if isWildIdent z
then checkLType gr ((bt,x,a):g) c b
else do b' <- checkIn (pp "abs") $ substituteLType [(bt',z,Vr x)] b
checkLType gr ((bt,x,a):g) c b'
return $ (Abs bt x c', Prod bt' z a b')
_ -> checkError $ "function type expected instead of" <+> ppType typ $$
"\n ** Double-check that the type signature of the operation" $$
"matches the number of arguments given to it.\n"
App f a -> do
over <- getOverload gr g (Just typ) trm
case over of
Just trty -> return trty
_ -> do
(trm',ty') <- inferLType gr g trm
termWith trm' $ checkEqLType gr g typ ty' trm'
AdHocOverload ts -> do
over <- getOverload gr g Nothing trm
case over of
Just trty -> return trty
_ -> checkError ("unresolved overloading of constants" <+> ppTerm Qualified 0 trm)
Q _ -> do
over <- getOverload gr g (Just typ) trm
case over of
Just trty -> return trty
_ -> do
(trm',ty') <- inferLType gr g trm
termWith trm' $ checkEqLType gr g typ ty' trm'
T _ [] ->
checkError ("found empty table in type" <+> ppTerm Unqualified 0 typ)
T _ cs -> case typ of
Table arg val -> do
case allParamValues gr arg of
Ok vs -> do
let ps0 = map fst cs
ps <- testOvershadow ps0 vs
if null ps
then return ()
else checkWarn ("patterns never reached:" $$
nest 2 (vcat (map (ppPatt Unqualified 0) ps)))
_ -> return () -- happens with variable types
cs' <- mapM (checkCase arg val) cs
return (T (TTyped arg) cs', typ)
_ -> checkError $ "table type expected for table instead of" $$ nest 2 (ppType typ)
V arg0 vs ->
case typ of
Table arg1 val ->
do arg' <- checkEqLType gr g arg0 arg1 trm
vs1 <- allParamValues gr arg1
if length vs1 == length vs
then return ()
else checkError $ "wrong number of values in table" <+> ppTerm Unqualified 0 trm
vs' <- map fst `fmap` sequence [checkLType gr g v val|v<-vs]
return (V arg' vs',typ)
R r -> case typ of --- why needed? because inference may be too difficult
RecType rr -> do
--let (ls,_) = unzip rr -- labels of expected type
fsts <- mapM (checkM r) rr -- check that they are found in the record
return $ (R fsts, typ) -- normalize record
_ -> checkError ("record type expected in type checking instead of" $$ nest 2 (ppTerm Unqualified 0 typ))
ExtR r s -> case typ of
_ | typ == typeType -> do
trm' <- computeLType gr g trm
case trm' of
RecType _ -> termWith trm' $ return typeType
ExtR (Vr _) (RecType _) -> termWith trm' $ return typeType
-- ext t = t ** ...
_ -> checkError ("invalid record type extension" <+> nest 2 (ppTerm Unqualified 0 trm))
RecType rr -> do
ll2 <- case s of
R ss -> return $ map fst ss
_ -> do
(s',typ2) <- inferLType gr g s
case typ2 of
RecType ss -> return $ map fst ss
_ -> checkError ("cannot get labels from" $$ nest 2 (ppTerm Unqualified 0 typ2))
let ll1 = [l | (l,_) <- rr, notElem l ll2]
--- over <- getOverload gr g Nothing r --- this would solve #66 but fail ParadigmsAra. AR 6/7/2020
--- let r1 = maybe r fst over
let r1 = r ---
(r',_) <- checkLType gr g r1 (RecType [field | field@(l,_) <- rr, elem l ll1])
(s',_) <- checkLType gr g s (RecType [field | field@(l,_) <- rr, elem l ll2])
let rec = R ([(l,(Nothing,P r' l)) | l <- ll1] ++ [(l,(Nothing,P s' l)) | l <- ll2])
return (rec, typ)
ExtR ty ex -> do
r' <- justCheck g r ty
s' <- justCheck g s ex
return $ (ExtR r' s', typ) --- is this all? it assumes the same division in trm and typ
_ -> checkError ("record extension not meaningful for" <+> ppTerm Unqualified 0 typ)
FV vs -> do
ttys <- mapM (flip (checkLType gr g) typ) vs
--- checkIfComplexVariantType trm typ
return (FV (map fst ttys), typ) --- typ' ?
S tab arg -> checks [ do
(tab',ty) <- inferLType gr g tab
ty' <- computeLType gr g ty
case ty' of
Table p t -> do
(arg',val) <- checkLType gr g arg p
checkEqLType gr g typ t trm
return (S tab' arg', t)
_ -> checkError ("table type expected for applied table instead of" <+> ppType ty')
, do
(arg',ty) <- inferLType gr g arg
ty' <- computeLType gr g ty
(tab',_) <- checkLType gr g tab (Table ty' typ)
return (S tab' arg', typ)
]
Let (x,(mty,def)) body -> case mty of
Just ty -> do
(ty0,_) <- checkLType gr g ty typeType
(def',ty') <- checkLType gr g def ty0
body' <- justCheck ((Explicit,x,ty'):g) body typ
return (Let (x,(Just ty',def')) body', typ)
_ -> do
(def',ty) <- inferLType gr g def -- tries to infer type of local constant
checkLType gr g (Let (x,(Just ty,def')) body) typ
ELin c tr -> do
tr1 <- unlockRecord c tr
checkLType gr g tr1 typ
_ -> do
(trm',ty') <- inferLType gr g trm
termWith trm' $ checkEqLType gr g typ ty' trm'
where
justCheck g ty te = checkLType gr g ty te >>= return . fst
{-
recParts rr t = (RecType rr1,RecType rr2) where
(rr1,rr2) = partition (flip elem (map fst t) . fst) rr
-}
checkM rms (l,ty) = case lookup l rms of
Just (Just ty0,t) -> do
checkEqLType gr g ty ty0 t
(t',ty') <- checkLType gr g t ty
return (l,(Just ty',t'))
Just (_,t) -> do
(t',ty') <- checkLType gr g t ty
return (l,(Just ty',t'))
_ -> checkError $
if isLockLabel l
then let cat = drop 5 (showIdent (label2ident l))
in ppTerm Unqualified 0 (R rms) <+> "is not in the lincat of" <+> cat <>
"; try wrapping it with lin" <+> cat
else "cannot find value for label" <+> l <+> "in" <+> ppTerm Unqualified 0 (R rms)
checkCase arg val (p,t) = do
cont <- pattContext gr g arg p
t' <- justCheck (reverse cont ++ g) t val
return (p,t')
pattContext :: SourceGrammar -> Context -> Type -> Patt -> Check Context
pattContext env g typ p = case p of
PV x -> return [(Explicit,x,typ)]
PP (q,c) ps | q /= cPredef -> do ---- why this /=? AR 6/1/2006
t <- lookupResType env (q,c)
let (cont,v) = typeFormCnc t
checkCond ("wrong number of arguments for constructor in" <+> ppPatt Unqualified 0 p)
(length cont == length ps)
checkEqLType env g typ v (patt2term p)
mapM (\((_,_,ty),p) -> pattContext env g ty p) (zip cont ps) >>= return . concat
PR r -> do
typ' <- computeLType env g typ
case typ' of
RecType t -> do
let pts = [(ty,tr) | (l,tr) <- r, Just ty <- [lookup l t]]
----- checkWarn $ prt p ++++ show pts ----- debug
mapM (uncurry (pattContext env g)) pts >>= return . concat
_ -> checkError ("record type expected for pattern instead of" <+> ppTerm Unqualified 0 typ')
PT t p' -> do
checkEqLType env g typ t (patt2term p')
pattContext env g typ p'
PAs x p -> do
g' <- pattContext env g typ p
return ((Explicit,x,typ):g')
PAlt p' q -> do
g1 <- pattContext env g typ p'
g2 <- pattContext env g typ q
let pts = nub ([x | pt@(_,x,_) <- g1, notElem pt g2] ++ [x | pt@(_,x,_) <- g2, notElem pt g1])
checkCond
("incompatible bindings of" <+>
fsep pts <+>
"in pattern alterantives" <+> ppPatt Unqualified 0 p) (null pts)
return g1 -- must be g1 == g2
PSeq p q -> do
g1 <- pattContext env g typ p
g2 <- pattContext env g typ q
return $ g1 ++ g2
PRep p' -> noBind typeStr p'
PNeg p' -> noBind typ p'
_ -> return [] ---- check types!
where
noBind typ p' = do
co <- pattContext env g typ p'
if not (null co)
then checkWarn ("no variable bound inside pattern" <+> ppPatt Unqualified 0 p)
>> return []
else return []
checkEqLType :: SourceGrammar -> Context -> Type -> Type -> Term -> Check Type
checkEqLType gr g t u trm = do
(b,t',u',s) <- checkIfEqLType gr g t u trm
case b of
True -> return t'
False ->
let inferredType = ppTerm Qualified 0 u
expectedType = ppTerm Qualified 0 t
term = ppTerm Unqualified 0 trm
funName = pp . head . words .render $ term
helpfulMsg =
case (arrows inferredType, arrows expectedType) of
(0,0) -> pp "" -- None of the types is a function
_ -> "\n **" <+>
if expectedType `isLessApplied` inferredType
then "Maybe you gave too few arguments to" <+> funName
else pp "Double-check that type signature and number of arguments match."
in checkError $ s <+> "type of" <+> term $$
"expected:" <+> expectedType $$ -- ppqType t u $$
"inferred:" <+> inferredType $$ -- ppqType u t
helpfulMsg
where
-- count the number of arrows in the prettyprinted term
arrows :: Doc -> Int
arrows = length . filter (=="->") . words . render
-- If prettyprinted type t has fewer arrows than prettyprinted type u,
-- then t is "less applied", and we can print out more helpful error msg.
isLessApplied :: Doc -> Doc -> Bool
isLessApplied t u = arrows t < arrows u
checkIfEqLType :: SourceGrammar -> Context -> Type -> Type -> Term -> Check (Bool,Type,Type,String)
checkIfEqLType gr g t u trm = do
t' <- computeLType gr g t
u' <- computeLType gr g u
case t' == u' || alpha [] t' u' of
True -> return (True,t',u',[])
-- forgive missing lock fields by only generating a warning.
--- better: use a flag to forgive? (AR 31/1/2006)
_ -> case missingLock [] t' u' of
Ok lo -> do
checkWarn $ "missing lock field" <+> fsep lo
return (True,t',u',[])
Bad s -> return (False,t',u',s)
where
-- check that u is a subtype of t
--- quick hack version of TC.eqVal
alpha g t u = case (t,u) of
-- error (the empty type!) is subtype of any other type
(_,u) | u == typeError -> True
-- contravariance
(Prod _ x a b, Prod _ y c d) -> alpha g c a && alpha ((x,y):g) b d
-- record subtyping
(RecType rs, RecType ts) -> all (\ (l,a) ->
any (\ (k,b) -> l == k && alpha g a b) ts) rs
(ExtR r s, ExtR r' s') -> alpha g r r' && alpha g s s'
(ExtR r s, t) -> alpha g r t || alpha g s t
-- the following say that Ints n is a subset of Int and of Ints m >= n
-- But why does it also allow Int as a subtype of Ints m? /TH 2014-04-04
(t,u) | Just m <- isTypeInts t, Just n <- isTypeInts u -> m >= n
| Just _ <- isTypeInts t, u == typeInt -> True ---- check size!
| t == typeInt, Just _ <- isTypeInts u -> True ---- why this ???? AR 11/12/2005
---- this should be made in Rename
(Q (m,a), Q (n,b)) | a == b -> elem m (allExtendsPlus gr n)
|| elem n (allExtendsPlus gr m)
|| m == n --- for Predef
(QC (m,a), QC (n,b)) | a == b -> elem m (allExtendsPlus gr n)
|| elem n (allExtendsPlus gr m)
(QC (m,a), Q (n,b)) | a == b -> elem m (allExtendsPlus gr n)
|| elem n (allExtendsPlus gr m)
(Q (m,a), QC (n,b)) | a == b -> elem m (allExtendsPlus gr n)
|| elem n (allExtendsPlus gr m)
-- contravariance
(Table a b, Table c d) -> alpha g c a && alpha g b d
(Vr x, Vr y) -> x == y || elem (x,y) g || elem (y,x) g
_ -> t == u
--- the following should be one-way coercions only. AR 4/1/2001
|| elem t sTypes && elem u sTypes
|| (t == typeType && u == typePType)
|| (u == typeType && t == typePType)
missingLock g t u = case (t,u) of
(RecType rs, RecType ts) ->
let
ls = [l | (l,a) <- rs,
not (any (\ (k,b) -> alpha g a b && l == k) ts)]
(locks,others) = partition isLockLabel ls
in case others of
_:_ -> Bad $ render ("missing record fields:" <+> fsep (punctuate ',' (others)))
_ -> return locks
-- contravariance
(Prod _ x a b, Prod _ y c d) -> do
ls1 <- missingLock g c a
ls2 <- missingLock g b d
return $ ls1 ++ ls2
_ -> Bad ""
sTypes = [typeStr, typeTok, typeString]
-- auxiliaries
-- | light-weight substitution for dep. types
substituteLType :: Context -> Type -> Check Type
substituteLType g t = case t of
Vr x -> return $ maybe t id $ lookup x [(x,t) | (_,x,t) <- g]
_ -> composOp (substituteLType g) t
termWith :: Term -> Check Type -> Check (Term, Type)
termWith t ct = do
ty <- ct
return (t,ty)
-- | compositional check\/infer of binary operations
check2 :: (Term -> Check Term) -> (Term -> Term -> Term) ->
Term -> Term -> Type -> Check (Term,Type)
check2 chk con a b t = do
a' <- chk a
b' <- chk b
return (con a' b', t)
-- printing a type with a lock field lock_C as C
ppType :: Type -> Doc
ppType ty =
case ty of
RecType fs -> case filter isLockLabel $ map fst fs of
[lock] -> pp (drop 5 (showIdent (label2ident lock)))
_ -> ppTerm Unqualified 0 ty
Prod _ x a b -> ppType a <+> "->" <+> ppType b
_ -> ppTerm Unqualified 0 ty
{-
ppqType :: Type -> Type -> Doc
ppqType t u = case (ppType t, ppType u) of
(pt,pu) | render pt == render pu -> ppTerm Qualified 0 t
(pt,_) -> pt
-}
checkLookup :: Ident -> Context -> Check Type
checkLookup x g =
case [ty | (b,y,ty) <- g, x == y] of
[] -> checkError ("unknown variable" <+> x)
(ty:_) -> return ty
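The `RecType` case of `alpha` above implements record width subtyping: every field demanded by the expected record type must occur, with a compatible type, in the given one, so records carrying extra fields still pass. A self-contained sketch over hypothetical label/type-name pairs (field types compared by simple equality here, where the real check recurses with `alpha`):

```haskell
type Label = String
type Rec   = [(Label, String)]  -- field label paired with a field-type name

-- widthSub expected given: every expected field is present in given,
-- extra fields in given are ignored (width subtyping)
widthSub :: Rec -> Rec -> Bool
widthSub expected given =
  all (\(l, a) -> any (\(k, b) -> l == k && a == b) given) expected

main :: IO ()
main = do
  -- a record with an extra field is accepted ...
  print (widthSub [("s", "Str")] [("s", "Str"), ("n", "Number")])
  -- ... but a record missing an expected field is not
  print (widthSub [("s", "Str"), ("g", "Gender")] [("s", "Str")])
```

This is also why the `missingLock` fallback only forgives missing `lock_C` fields: any other missing label is a genuine subtyping failure.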

View File

@@ -12,15 +12,14 @@
-- Thierry Coquand's type checking algorithm that creates a trace
-----------------------------------------------------------------------------
-module GF.Compile.TypeCheck.TC (AExp(..),
+module GF.Compile.TypeCheck.TC (
+AExp(..),
Theory,
checkExp,
inferExp,
checkBranch,
eqVal,
whnf
) where
import GF.Data.Operations
import GF.Grammar
@@ -128,10 +127,10 @@ checkExp th tenv@(k,rho,gamma) e ty = do
Abs _ x t -> case typ of
VClos env (Prod _ y a b) -> do
a' <- whnf $ VClos env a ---
(t',cs) <- checkExp th
(k+1,(x,v x):rho, (x,a'):gamma) t (VClos ((y,v x):env) b)
return (AAbs x a' t', cs)
_ -> Bad (render ("function type expected for" <+> ppTerm Unqualified 0 e <+> "instead of" <+> ppValue Unqualified 0 typ))
Let (x, (mb_typ, e1)) e2 -> do
@@ -206,8 +205,8 @@ inferExp th tenv@(k,rho,gamma) e = case e of
case typ of
VClos env (Prod _ x a b) -> do
(a',csa) <- checkExp th tenv t (VClos env a)
b' <- whnf $ VClos ((x,VClos rho t):env) b
return $ (AApp f' a' b', b', csf ++ csa)
_ -> Bad (render ("Prod expected for function" <+> ppTerm Unqualified 0 f <+> "instead of" <+> ppValue Unqualified 0 typ))
_ -> Bad (render ("cannot infer type of expression" <+> ppTerm Unqualified 0 e))
@@ -246,11 +245,11 @@ checkBranch th tenv b@(ps,t) ty = errIn ("branch" +++ show b) $
typ <- whnf ty typ <- whnf ty
case typ of case typ of
VClos env (Prod _ y a b) -> do VClos env (Prod _ y a b) -> do
a' <- whnf $ VClos env a a' <- whnf $ VClos env a
(p', sigma, binds, cs1) <- checkP tenv p y a' (p', sigma, binds, cs1) <- checkP tenv p y a'
let tenv' = (length binds, sigma ++ rho, binds ++ gamma) let tenv' = (length binds, sigma ++ rho, binds ++ gamma)
((ps',exp),cs2) <- chB tenv' ps2 (VClos ((y,p'):env) b) ((ps',exp),cs2) <- chB tenv' ps2 (VClos ((y,p'):env) b)
return ((p:ps',exp), cs1 ++ cs2) -- don't change the patt return ((p:ps',exp), cs1 ++ cs2) -- don't change the patt
_ -> Bad (render ("Product expected for definiens" <+> ppTerm Unqualified 0 t <+> "instead of" <+> ppValue Unqualified 0 typ)) _ -> Bad (render ("Product expected for definiens" <+> ppTerm Unqualified 0 t <+> "instead of" <+> ppValue Unqualified 0 typ))
[] -> do [] -> do
(e,cs) <- checkExp th tenv t ty (e,cs) <- checkExp th tenv t ty
@@ -308,8 +307,8 @@ checkPatt th tenv exp val = do
case typ of case typ of
VClos env (Prod _ x a b) -> do VClos env (Prod _ x a b) -> do
(a',_,csa) <- checkExpP tenv t (VClos env a) (a',_,csa) <- checkExpP tenv t (VClos env a)
b' <- whnf $ VClos ((x,VClos rho t):env) b b' <- whnf $ VClos ((x,VClos rho t):env) b
return $ (AApp f' a' b', b', csf ++ csa) return $ (AApp f' a' b', b', csf ++ csa)
_ -> Bad (render ("Prod expected for function" <+> ppTerm Unqualified 0 f <+> "instead of" <+> ppValue Unqualified 0 typ)) _ -> Bad (render ("Prod expected for function" <+> ppTerm Unqualified 0 f <+> "instead of" <+> ppValue Unqualified 0 typ))
_ -> Bad (render ("cannot typecheck pattern" <+> ppTerm Unqualified 0 exp)) _ -> Bad (render ("cannot typecheck pattern" <+> ppTerm Unqualified 0 exp))
@@ -322,3 +321,4 @@ mkAnnot :: (Val -> AExp) -> Err (Val,[(Val,Val)]) -> Err (AExp,Val,[(Val,Val)])
mkAnnot a ti = do mkAnnot a ti = do
(v,cs) <- ti (v,cs) <- ti
return (a v, v, cs) return (a v, v, cs)


@@ -34,14 +34,14 @@ buildAnyTree :: Fail.MonadFail m => ModuleName -> [(Ident,Info)] -> m (Map.Map I
 buildAnyTree m = go Map.empty
   where
     go map []         = return map
-    go map ((c,j):is) =
+    go map ((c,j):is) = do
       case Map.lookup c map of
         Just i  -> case unifyAnyInfo m i j of
                      Ok k  -> go (Map.insert c k map) is
                      Bad _ -> fail $ render ("conflicting information in module"<+>m $$
                                              nest 4 (ppJudgement Qualified (c,i)) $$
                                              "and" $+$
                                              nest 4 (ppJudgement Qualified (c,j)))
         Nothing -> go (Map.insert c j map) is

 extendModule :: FilePath -> SourceGrammar -> SourceModule -> Check SourceModule
@@ -147,18 +147,18 @@ extendMod gr isCompl ((name,mi),cond) base new = foldM try new $ Map.toList (jme
   | not (cond c) = return new
   | otherwise    = case Map.lookup c new of
       Just j  -> case unifyAnyInfo name i j of
                    Ok k  -> return $ Map.insert c k new
                    Bad _ -> do (base,j) <- case j of
                                              AnyInd _ m -> lookupOrigInfo gr (m,c)
                                              _          -> return (base,j)
                                (name,i) <- case i of
                                              AnyInd _ m -> lookupOrigInfo gr (m,c)
                                              _          -> return (name,i)
                                checkError ("cannot unify the information" $$
                                            nest 4 (ppJudgement Qualified (c,i)) $$
                                            "in module" <+> name <+> "with" $$
                                            nest 4 (ppJudgement Qualified (c,j)) $$
                                            "in module" <+> base)
       Nothing -> if isCompl
                    then return $ Map.insert c (indirInfo name i) new
                    else return $ Map.insert c i new


@@ -61,10 +61,10 @@ parallelBatchCompile jobs opts rootfiles0 =
     usesPresent (_,paths) = take 1 libs==["present"]
       where
-        libs = [p | path<-paths,
+        libs = [p|path<-paths,
                 let (d,p0) = splitAt n path
                     p = dropSlash p0,
-                d==lib_dir, p `elem` all_modes]
+                d==lib_dir,p `elem` all_modes]
         n = length lib_dir
         all_modes = ["alltenses","present"]
@@ -175,7 +175,7 @@ batchCompile1 lib_dir (opts,filepaths) =
              " from being compiled."
         else return (maximum ts,(cnc,gr))

-    splitEither es = ([x | Left x<-es], [y | Right y<-es])
+    splitEither es = ([x|Left x<-es],[y|Right y<-es])

     canonical path = liftIO $ D.canonicalizePath path `catch` const (return path)
@@ -239,11 +239,11 @@ instance Functor m => Functor (CollectOutput m) where
   fmap f (CO m) = CO (fmap (fmap f) m)

 instance (Functor m,Monad m) => Applicative (CollectOutput m) where
-  pure x = CO (return (return (),x))
+  pure = return
   (<*>) = ap

 instance Monad m => Monad (CollectOutput m) where
-  return = pure
+  return x = CO (return (return (),x))
   CO m >>= f = CO $ do (o1,x) <- m
                        let CO m2 = f x
                        (o2,y) <- m2
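Several files in this compare flip the `pure`/`return` definitions between the two sides of the Applicative–Monad Proposal (AMP) migration. The form recommended since GHC 7.10 gives the real definition to `pure` and makes `return` an alias, as in the left-hand column of the `CollectOutput` hunk above. A minimal standalone sketch of the pattern, using an illustrative `Box` type that is not from the repo:

```haskell
import Control.Monad (ap)

-- Illustrative wrapper monad; post-AMP style puts the real
-- definition in 'pure' and makes 'return' an alias of it.
newtype Box a = Box { unBox :: a }

instance Functor Box where
  fmap f (Box a) = Box (f a)

instance Applicative Box where
  pure  = Box   -- the real definition lives here
  (<*>) = ap    -- derived from (>>=) below

instance Monad Box where
  return = pure -- alias, never a second definition
  Box a >>= k = k a

main :: IO ()
main = print (unBox (pure 1 >>= \x -> return (x + 41)))  -- prints 42
```

Defining `return = pure` (rather than the other way round) avoids the two ever disagreeing, which is what the AMP laws require.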


@@ -1,9 +1,11 @@
-module GF.Compiler (mainGFC, linkGrammars, writePGF, writeOutputs) where
+module GF.Compiler (mainGFC, linkGrammars, writePGF, writeLPGF, writeOutputs) where

 import PGF
 import PGF.Internal(concretes,optimizePGF,unionPGF)
 import PGF.Internal(putSplitAbs,encodeFile,runPut)
-import GF.Compile as S(batchCompile,link,srcAbsName)
+import LPGF(LPGF)
+import qualified LPGF
+import GF.Compile as S(batchCompile,link,linkl,srcAbsName)
 import GF.CompileInParallel as P(parallelBatchCompile)
 import GF.Compile.Export
 import GF.Compile.ConcreteToHaskell(concretes2haskell)
@@ -11,7 +13,8 @@ import GF.Compile.GrammarToCanonical--(concretes2canonical)
 import GF.Compile.CFGtoPGF
 import GF.Compile.GetGrammar
 import GF.Grammar.BNFC
-import GF.Grammar.CFG
+import GF.Grammar.CFG hiding (Grammar)
+import GF.Grammar.Grammar (Grammar, ModuleName)

 --import GF.Infra.Ident(showIdent)
 import GF.Infra.UseIO
@@ -23,10 +26,11 @@ import GF.Text.Pretty(render,render80)
 import Data.Maybe
 import qualified Data.Map as Map
 import qualified Data.Set as Set
+import Data.Time(UTCTime)
 import qualified Data.ByteString.Lazy as BSL
 import GF.Grammar.CanonicalJSON (encodeJSON)
 import System.FilePath
-import Control.Monad(when,unless,forM_)
+import Control.Monad(when,unless,forM,void)

 -- | Compile the given GF grammar files. The result is a number of @.gfo@ files
 -- and, depending on the options, a @.pgf@ file. (@gf -batch@, @gf -make@)
@@ -93,6 +97,10 @@ compileSourceFiles opts fs =
 -- If a @.pgf@ file by the same name already exists and it is newer than the
 -- source grammar files (as indicated by the 'UTCTime' argument), it is not
 -- recreated. Calls 'writePGF' and 'writeOutputs'.
+linkGrammars :: Options -> (UTCTime,[(ModuleName, Grammar)]) -> IOE ()
+linkGrammars opts (_,cnc_grs) | FmtLPGF `elem` flag optOutputFormats opts = do
+  lpgf <- linkl opts (head cnc_grs)
+  void $ writeLPGF opts lpgf
 linkGrammars opts (t_src,~cnc_grs@(~(cnc,gr):_)) =
   do let abs = render (srcAbsName gr cnc)
          pgfFile = outputPath opts (grammarName' opts abs<.>"pgf")
@@ -145,7 +153,7 @@ unionPGFFiles opts fs =
            pgfFile = outputPath opts (grammarName opts pgf <.> "pgf")
        if pgfFile `elem` fs
          then putStrLnE $ "Refusing to overwrite " ++ pgfFile
-         else writePGF opts pgf
+         else void $ writePGF opts pgf
        writeOutputs opts pgf

     readPGFVerbose f =
@@ -162,26 +170,39 @@ writeOutputs opts pgf = do
 -- | Write the result of compiling a grammar (e.g. with 'compileToPGF' or
 -- 'link') to a @.pgf@ file.
 -- A split PGF file is output if the @-split-pgf@ option is used.
-writePGF :: Options -> PGF -> IOE ()
+writePGF :: Options -> PGF -> IOE [FilePath]
 writePGF opts pgf =
     if flag optSplitPGF opts then writeSplitPGF else writeNormalPGF
   where
     writeNormalPGF =
        do let outfile = outputPath opts (grammarName opts pgf <.> "pgf")
           writing opts outfile $ encodeFile outfile pgf
+          return [outfile]
     writeSplitPGF =
        do let outfile = outputPath opts (grammarName opts pgf <.> "pgf")
           writing opts outfile $ BSL.writeFile outfile (runPut (putSplitAbs pgf))
           --encodeFile_ outfile (putSplitAbs pgf)
-          forM_ (Map.toList (concretes pgf)) $ \cnc -> do
+          outfiles <- forM (Map.toList (concretes pgf)) $ \cnc -> do
             let outfile = outputPath opts (showCId (fst cnc) <.> "pgf_c")
             writing opts outfile $ encodeFile outfile cnc
+            return outfile
+          return (outfile:outfiles)

-writeOutput :: Options -> FilePath-> String -> IOE ()
-writeOutput opts file str = writing opts path $ writeUTF8File path str
-  where path = outputPath opts file
+writeLPGF :: Options -> LPGF -> IOE FilePath
+writeLPGF opts lpgf = do
+  let
+    grammarName = fromMaybe (showCId (LPGF.abstractName lpgf)) (flag optName opts)
+    outfile = outputPath opts (grammarName <.> "lpgf")
+  writing opts outfile $ liftIO $ LPGF.encodeFile outfile lpgf
+  return outfile

+writeOutput :: Options -> FilePath-> String -> IOE FilePath
+writeOutput opts file str = do
+  let outfile = outputPath opts file
+  writing opts outfile $ writeUTF8File outfile str
+  return outfile

 -- * Useful helper functions


@@ -16,18 +16,18 @@
 {-# LANGUAGE CPP #-}
 module GF.Data.BacktrackM (
   -- * the backtracking state monad
   BacktrackM,
   -- * monad specific utilities
   member,
   cut,
   -- * running the monad
   foldBM, runBM,
   foldSolutions, solutions,
   foldFinalStates, finalStates,
   -- * reexport the 'MonadState' class
   module Control.Monad.State.Class,
   ) where

 import Data.List
 import Control.Applicative
@@ -64,13 +64,13 @@ finalStates :: BacktrackM s () -> s -> [s]
 finalStates bm = map fst . runBM bm

 instance Applicative (BacktrackM s) where
-  pure a = BM (\c s b -> c a s b)
+  pure = return
   (<*>) = ap

 instance Monad (BacktrackM s) where
-  return = pure
+  return a = BM (\c s b -> c a s b)
   BM m >>= k = BM (\c s b -> m (\a s b -> unBM (k a) c s b) s b)
     where unBM (BM m) = m
 #if !(MIN_VERSION_base(4,13,0))
   fail = Fail.fail


@@ -34,7 +34,7 @@ fromErr :: a -> Err a -> a
 fromErr a = err (const a) id

 instance Monad Err where
-  return = pure
+  return = Ok
   Ok a  >>= f = f a
   Bad s >>= f = Bad s
@@ -54,7 +54,7 @@ instance Functor Err where
   fmap f (Bad s) = Bad s

 instance Applicative Err where
-  pure = Ok
+  pure = return
   (<*>) = ap

 -- | added by KJ


@@ -34,7 +34,7 @@ import Data.Set (Set)
 import qualified Data.Set as Set

 data Graph n a b = Graph [n] ![Node n a] ![Edge n b]
   deriving (Eq,Show)

 type Node n a = (n,a)
 type Edge n b = (n,n,b)
@@ -170,7 +170,7 @@ renameNodes :: (n -> m) -- ^ renaming function
             -> Graph n a b -> Graph m a b
 renameNodes newName c (Graph _ ns es) = Graph c ns' es'
   where ns' = map' (\ (n,x) -> (newName n,x)) ns
         es' = map' (\ (f,t,l) -> (newName f, newName t, l)) es

 -- | A strict 'map'
 map' :: (a -> b) -> [a] -> [b]


@@ -13,14 +13,14 @@
 -----------------------------------------------------------------------------
 module GF.Data.Graphviz (
   Graph(..), GraphType(..),
   Node(..), Edge(..),
   Attr,
   addSubGraphs,
   setName,
   setAttr,
   prGraphviz
   ) where

 import Data.Char
@@ -76,8 +76,8 @@ prSubGraph g@(Graph _ i _ _ _ _) =
 prGraph :: Graph -> String
 prGraph (Graph t id at ns es ss) =
   unlines $ map (++";") (map prAttr at
                          ++ map prNode ns
                          ++ map (prEdge t) es
                          ++ map prSubGraph ss)

 graphtype :: GraphType -> String
@@ -96,7 +96,7 @@ edgeop Undirected = "--"
 prAttrList :: [Attr] -> String
 prAttrList [] = ""
 prAttrList at = "[" ++ join "," (map prAttr at) ++ "]"

 prAttr :: Attr -> String
 prAttr (n,v) = esc n ++ " = " ++ esc v


@@ -0,0 +1,61 @@
+-- | In order to build an IntMap in one pass, we need a map data structure with
+-- fast lookup in both keys and values.
+-- This is achieved by keeping a separate reversed map of values to keys during building.
+module GF.Data.IntMapBuilder where
+
+import Data.IntMap (IntMap)
+import qualified Data.IntMap as IntMap
+import Data.Hashable (Hashable)
+import Data.HashMap.Strict (HashMap)
+import qualified Data.HashMap.Strict as HashMap
+import Data.Tuple (swap)
+import Prelude hiding (lookup)
+
+data IMB a = IMB {
+  intMap :: IntMap a,
+  valMap :: HashMap a Int
+}
+
+-- | An empty IMB
+empty :: (Eq a, Hashable a) => IMB a
+empty = IMB {
+  intMap = IntMap.empty,
+  valMap = HashMap.empty
+}
+
+-- | An empty IntMap
+emptyIntMap :: IntMap a
+emptyIntMap = IntMap.empty
+
+-- | Lookup a value
+lookup :: (Eq a, Hashable a) => a -> IMB a -> Maybe Int
+lookup a IMB { valMap = vm } = HashMap.lookup a vm
+
+-- | Insert without any lookup
+insert :: (Eq a, Hashable a) => a -> IMB a -> (Int, IMB a)
+insert a IMB { intMap = im, valMap = vm } =
+  let
+    ix = IntMap.size im
+    im' = IntMap.insert ix a im
+    vm' = HashMap.insert a ix vm
+    imb' = IMB { intMap = im', valMap = vm' }
+  in
+    (ix, imb')
+
+-- | Insert only when lookup fails
+insert' :: (Eq a, Hashable a) => a -> IMB a -> (Int, IMB a)
+insert' a imb =
+  case lookup a imb of
+    Just ix -> (ix, imb)
+    Nothing -> insert a imb
+
+-- | Build IMB from existing IntMap
+fromIntMap :: (Eq a, Hashable a) => IntMap a -> IMB a
+fromIntMap im = IMB {
+  intMap = im,
+  valMap = HashMap.fromList (map swap (IntMap.toList im))
+}
+
+-- | Get IntMap from IMB
+toIntMap :: (Eq a, Hashable a) => IMB a -> IntMap a
+toIntMap = intMap
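The new `GF.Data.IntMapBuilder` above depends on the `hashable` package, so it cannot be run from base alone. The interning pattern it implements can be sketched standalone with an `Ord`-based `Data.Map` standing in for the `HashMap`; this is a simplified sketch with the same `insert'` logic, not the module itself:

```haskell
import           Data.IntMap (IntMap)
import qualified Data.IntMap as IntMap
import           Data.Map    (Map)
import qualified Data.Map    as Map

-- Simplified stand-in for GF.Data.IntMapBuilder: an IntMap under
-- construction plus a reverse map for fast value-to-index lookup.
-- (The real module uses Data.HashMap.Strict from 'unordered-containers'.)
data IMB a = IMB { intMap :: IntMap a, valMap :: Map a Int }

empty :: IMB a
empty = IMB IntMap.empty Map.empty

-- Insert only when the value is not already present,
-- returning its (possibly pre-existing) index.
insert' :: Ord a => a -> IMB a -> (Int, IMB a)
insert' a imb@(IMB im vm) =
  case Map.lookup a vm of
    Just ix -> (ix, imb)
    Nothing -> let ix = IntMap.size im
               in (ix, IMB (IntMap.insert ix a im) (Map.insert a ix vm))

main :: IO ()
main = do
  let (i1, b1) = insert' "the" empty
      (i2, b2) = insert' "cat" b1
      (i3, b3) = insert' "the" b2    -- duplicate: reuses index 0
  print (i1, i2, i3)                 -- (0,1,0)
  print (IntMap.toList (intMap b3))  -- [(0,"the"),(1,"cat")]
```

Without the reverse map, each deduplicating insert would need a linear scan of the IntMap's values; the reverse map makes the whole build a single pass.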


@@ -15,34 +15,34 @@
 -----------------------------------------------------------------------------
 module GF.Data.Operations (
   -- ** The Error monad
   Err(..), err, maybeErr, testErr, fromErr, errIn,
   lookupErr,
   -- ** Error monad class
   ErrorMonad(..), checks, --doUntil, allChecks, checkAgain,
   liftErr,
   -- ** Checking
   checkUnique, unifyMaybeBy, unifyMaybe,
   -- ** Monadic operations on lists and pairs
   mapPairsM, pairM,
   -- ** Printing
   indent, (+++), (++-), (++++), (+++-), (+++++),
   prUpper, prReplicate, prTList, prQuotedString, prParenth, prCurly,
   prBracket, prArgList, prSemicList, prCurlyList, restoreEscapes,
   numberedParagraphs, prConjList, prIfEmpty, wrapLines,
   -- ** Topological sorting
   topoTest, topoTest2,
   -- ** Misc
   readIntArg,
   iterFix, chunks,
   ) where

 import Data.Char (isSpace, toUpper, isSpace, isDigit)
 import Data.List (nub, partition, (\\))
@@ -204,7 +204,7 @@ topoTest2 g0 = maybe (Right cycles) Left (tsort g)
       ([],[])   -> Just []
       ([],_)    -> Nothing
       (ns,rest) -> (leaves:) `fmap` tsort [(n,es \\ leaves) | (n,es)<-rest]
                   where leaves = map fst ns

 -- | Fix point iterator (for computing e.g. transitive closures or reachability)


@@ -83,7 +83,7 @@ transitiveClosure r = fix (Map.map growSet) r
   where growSet ys = foldl Set.union ys (map (allRelated r) $ Set.toList ys)

 reflexiveClosure_ :: Ord a => [a] -- ^ The set over which the relation is defined.
                   -> Rel a -> Rel a
 reflexiveClosure_ u r = relates [(x,x) | x <- u] r

 -- | Uses 'domain'
@@ -117,8 +117,8 @@ equivalenceClasses :: Ord a => Rel a -> [Set a]
 equivalenceClasses r = equivalenceClasses_ (Map.keys r) r
   where equivalenceClasses_ [] _ = []
         equivalenceClasses_ (x:xs) r = ys:equivalenceClasses_ zs r
           where ys = allRelated r x
                 zs = [x' | x' <- xs, not (x' `Set.member` ys)]

 isTransitive :: Ord a => Rel a -> Bool
 isTransitive r = and [z `Set.member` ys | (x,ys) <- Map.toList r,


@@ -33,7 +33,7 @@ longerThan n = not . notLongerThan n
 lookupList :: Eq a => a -> [(a, b)] -> [b]
 lookupList a [] = []
 lookupList a (p:ps) | a == fst p = snd p : lookupList a ps
                     | otherwise  = lookupList a ps

 split :: [a] -> ([a], [a])
 split (x : y : as) = (x:xs, y:ys)
@@ -48,8 +48,8 @@ splitBy p (a : as) = if p a then (a:xs, ys) else (xs, a:ys)
 foldMerge :: (a -> a -> a) -> a -> [a] -> a
 foldMerge merge zero = fm
   where fm []  = zero
         fm [a] = a
         fm abs = let (as, bs) = split abs in fm as `merge` fm bs

 select :: [a] -> [(a, [a])]
 select [] = []


@@ -11,7 +11,6 @@
 module GF.Grammar.Canonical where
 import Prelude hiding ((<>))
 import GF.Text.Pretty
-import GF.Infra.Ident (RawIdent)

 -- | A Complete grammar
 data Grammar = Grammar Abstract [Concrete] deriving Show
@@ -31,7 +30,7 @@ data TypeApp = TypeApp CatId [Type] deriving Show
 data TypeBinding = TypeBinding VarId Type deriving Show

 --------------------------------------------------------------------------------
--- ** Concreate syntax
+-- ** Concrete syntax

 -- | Concrete Syntax
 data Concrete = Concrete ModId ModId Flags [ParamDef] [LincatDef] [LinDef]
@@ -103,9 +102,9 @@ data TableRow rhs = TableRow LinPattern rhs
 -- *** Identifiers in Concrete Syntax
 newtype PredefId = PredefId Id deriving (Eq,Ord,Show)
 newtype LabelId = LabelId Id deriving (Eq,Ord,Show)
-data VarValueId = VarValueId QualId deriving (Eq,Ord,Show)
+newtype VarValueId = VarValueId QualId deriving (Eq,Ord,Show)

 -- | Name of param type or param value
 newtype ParamId = ParamId QualId deriving (Eq,Ord,Show)
@@ -116,9 +115,9 @@ newtype ParamId = ParamId QualId deriving (Eq,Ord,Show)
 newtype ModId = ModId Id deriving (Eq,Ord,Show)
 newtype CatId = CatId Id deriving (Eq,Ord,Show)
-newtype FunId = FunId Id deriving (Eq,Show)
-data VarId = Anonymous | VarId Id deriving Show
+newtype FunId = FunId Id deriving (Eq,Ord,Show)
+data VarId = Anonymous | VarId Id deriving (Eq,Show)

 newtype Flags = Flags [(FlagName,FlagValue)] deriving Show
 type FlagName = Id
@@ -127,7 +126,7 @@ data FlagValue = Str String | Int Int | Flt Double deriving Show
 -- *** Identifiers
-type Id = RawIdent
+type Id = String
 data QualId = Qual ModId Id | Unqual Id deriving (Eq,Ord,Show)

 --------------------------------------------------------------------------------
@@ -266,6 +265,7 @@ instance PPA LinPattern where
       RecordPattern r -> block r
       TuplePattern ps -> "<"<>punctuate "," ps<>">"
      WildPattern -> pp "_"
+      _ -> parens p

 instance RhsSeparator LinPattern where rhsSep _ = pp "="


@@ -7,7 +7,6 @@ import Control.Applicative ((<|>))
 import Data.Ratio (denominator, numerator)
 import GF.Grammar.Canonical
 import Control.Monad (guard)
-import GF.Infra.Ident (RawIdent,showRawIdent,rawIdentS)

 encodeJSON :: FilePath -> Grammar -> IO ()
@@ -205,12 +204,12 @@ instance JSON a => JSON (RecordRow a) where
   -- record rows and lists of record rows are both encoded as JSON records (i.e., objects)
   showJSON row = showJSONs [row]
   showJSONs rows = makeObj (map toRow rows)
-    where toRow (RecordRow (LabelId lbl) val) = (showRawIdent lbl, showJSON val)
+    where toRow (RecordRow (LabelId lbl) val) = (lbl, showJSON val)
   readJSON obj = head <$> readJSONs obj
   readJSONs obj = mapM fromRow (assocsJSObject obj)
     where fromRow (lbl, jsvalue) = do value <- readJSON jsvalue
-                                      return (RecordRow (LabelId (rawIdentS lbl)) value)
+                                      return (RecordRow (LabelId lbl) value)

 instance JSON rhs => JSON (TableRow rhs) where
   showJSON (TableRow l v) = makeObj [(".pattern", showJSON l), (".value", showJSON v)]
@@ -243,24 +242,20 @@ instance JSON VarId where
             <|> VarId <$> readJSON o

 instance JSON QualId where
-  showJSON (Qual (ModId m) n) = showJSON (showRawIdent m++"."++showRawIdent n)
+  showJSON (Qual (ModId m) n) = showJSON (m++"."++n)
   showJSON (Unqual n) = showJSON n
   readJSON o = do qualid <- readJSON o
                   let (mod, id) = span (/= '.') qualid
-                  return $ if null mod then Unqual (rawIdentS id) else Qual (ModId (rawIdentS mod)) (rawIdentS id)
+                  return $ if null mod then Unqual id else Qual (ModId mod) id

-instance JSON RawIdent where
-  showJSON i = showJSON $ showRawIdent i
-  readJSON o = rawIdentS <$> readJSON o

 instance JSON Flags where
   -- flags are encoded directly as JSON records (i.e., objects):
-  showJSON (Flags fs) = makeObj [(showRawIdent f, showJSON v) | (f, v) <- fs]
+  showJSON (Flags fs) = makeObj [(f, showJSON v) | (f, v) <- fs]
   readJSON obj = Flags <$> mapM fromRow (assocsJSObject obj)
     where fromRow (lbl, jsvalue) = do value <- readJSON jsvalue
-                                      return (rawIdentS lbl, value)
+                                      return (lbl, value)

 instance JSON FlagValue where
   -- flag values are encoded as basic JSON types:


@@ -78,7 +78,6 @@ import PGF.Internal (FId, FunId, SeqId, LIndex, Sequence, BindType(..))
 import Data.Array.IArray(Array)
 import Data.Array.Unboxed(UArray)
 import qualified Data.Map as Map
-import qualified Data.Set as Set

 import GF.Text.Pretty
@@ -126,20 +125,10 @@ extends :: ModuleInfo -> [ModuleName]
 extends = map fst . mextend

 isInherited :: MInclude -> Ident -> Bool
-isInherited c =
-  case c of
-    MIAll       -> const True
-    MIOnly is   -> elemOrd is
-    MIExcept is -> not . elemOrd is
-
--- | Faster version of `elem`, using a `Set`.
--- Make sure you give this the first argument _outside_ of the inner loop
---
--- Example:
--- > myIntersection xs ys = filter (elemOrd xs) ys
-elemOrd :: Ord a => [a] -> a -> Bool
-elemOrd list = (`Set.member` set)
-  where set = Set.fromList list
+isInherited c i = case c of
+  MIAll       -> True
+  MIOnly is   -> elem i is
+  MIExcept is -> notElem i is

 inheritAll :: ModuleName -> (ModuleName,MInclude)
 inheritAll i = (i,MIAll)
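One side of the `isInherited` hunk above carries an `elemOrd` helper whose comment stresses that the `Set` must be built outside the inner loop, so the O(n log n) construction cost is paid once and shared by every membership test via partial application. A standalone sketch of that usage, with `myIntersection` taken from the helper's own doc comment:

```haskell
import qualified Data.Set as Set

-- Membership test where the Set is built once and then shared by
-- every call to the returned predicate (partial application matters:
-- 'elemOrd xs' must be evaluated outside the inner loop).
elemOrd :: Ord a => [a] -> a -> Bool
elemOrd list = (`Set.member` set)
  where set = Set.fromList list

-- The usage pattern from the helper's comment: the lookup structure
-- for xs is built once, then reused for every element of ys.
myIntersection :: Ord a => [a] -> [a] -> [a]
myIntersection xs = filter (elemOrd xs)

main :: IO ()
main = print (myIntersection [1,2,3,4] [3,4,5,6])  -- prints [3,4]
```

Writing `filter (elemOrd xs) ys` instead of `filter (\y -> elemOrd xs y) ys` is not just style: the former forces `Set.fromList xs` to be shared, the latter may rebuild it per element depending on inlining.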


@@ -4,7 +4,7 @@
 module GF.Grammar.Lexer
        ( Token(..), Posn(..)
        , P, runP, runPartial, token, lexer, getPosn, failLoc
-       , isReservedWord, invMap
+       , isReservedWord
        ) where

 import Control.Applicative
@@ -134,7 +134,7 @@ data Token
  | T_Double Double -- double precision float literals
  | T_Ident Ident
  | T_EOF
-  deriving (Eq, Ord, Show) -- debug
+--  deriving Show -- debug

 res = eitherResIdent
 eitherResIdent :: (Ident -> Token) -> Ident -> Token
@@ -224,13 +224,6 @@ resWords = Map.fromList
     ]
     where b s t = (identS s, t)

-invMap :: Map.Map Token String
-invMap = res
-  where
-    lst = Map.toList resWords
-    flp = map (\(k,v) -> (v,showIdent k)) lst
-    res = Map.fromList flp

 unescapeInitTail :: String -> String
 unescapeInitTail = unesc . tail where
   unesc s = case s of
@@ -274,7 +267,7 @@ type AlexInput2 = (AlexInput,AlexInput)
 data ParseResult a
   = POk AlexInput2 a
   | PFailed Posn   -- The position of the error
            String  -- The error message

 newtype P a = P { unP :: AlexInput2 -> ParseResult a }
@@ -283,11 +276,11 @@ instance Functor P where
   fmap = liftA

 instance Applicative P where
-  pure a = a `seq` (P $ \s -> POk s a)
+  pure = return
   (<*>) = ap

 instance Monad P where
-  return = pure
+  return a = a `seq` (P $ \s -> POk s a)
   (P m) >>= k = P $ \ s -> case m s of
                              POk s a -> unP (k a) s
                              PFailed posn err -> PFailed posn err


@@ -30,7 +30,7 @@ module GF.Grammar.Lookup (
            lookupFunType,
            lookupCatContext,
            allOpers, allOpersTo
           ) where

 import GF.Data.Operations
 import GF.Infra.Ident


@@ -590,7 +590,7 @@ noExist = FV []
 defaultLinType :: Type
 defaultLinType = mkRecType linLabel [typeStr]

--- | normalize records and record types; put s first
+-- normalize records and record types; put s first
 sortRec :: [(Label,a)] -> [(Label,a)]
 sortRec = sortBy ordLabel where

@@ -37,9 +37,6 @@ import PGF(mkCId)
%name pBNFCRules ListCFRule
%name pEBNFRules ListEBNFRule
- %errorhandlertype explist
- %error { happyError }
-- no lexer declaration
%monad { P } { >>= } { return }
%lexer { lexer } { T_EOF }
@@ -433,7 +430,6 @@ Exp3
RecType xs -> RecType (xs ++ [(tupleLabel (length xs+1),$3)])
t -> RecType [(tupleLabel 1,$1), (tupleLabel 2,$3)] }
| Exp3 '**' Exp4 { ExtR $1 $3 }
- | Exp3 '**' '{' ListCase '}' { let v = identS "$vvv" in T TRaw ($4 ++ [(PV v, S $1 (Vr v))]) }
| Exp4 { $1 }
Exp4 :: { Term }
@@ -705,18 +701,8 @@ Posn
{
- happyError :: (Token, [String]) -> P a
- happyError (t,strs) = fail $
- "Syntax error:\n Unexpected " ++ showToken t ++ ".\n Expected one of:\n"
- ++ unlines (map ((" - "++).cleanupToken) strs)
- where
- cleanupToken "Ident" = "an identifier"
- cleanupToken x = x
- showToken (T_Ident i) = "identifier '" ++ showIdent i ++ "'"
- showToken t = case Map.lookup t invMap of
- Nothing -> show t
- Just s -> "token '" ++ s ++"'"
+ happyError :: P a
+ happyError = fail "syntax error"
mkListId,mkConsId,mkBaseId :: Ident -> Ident
mkListId = prefixIdent "List"

@@ -12,12 +12,11 @@
-- pattern matching for both concrete and abstract syntax. AR -- 16\/6\/2003
-----------------------------------------------------------------------------
- module GF.Grammar.PatternMatch (
- matchPattern,
- testOvershadow,
- findMatch,
- measurePatt
- ) where
+ module GF.Grammar.PatternMatch (matchPattern,
+ testOvershadow,
+ findMatch,
+ measurePatt
+ ) where
import GF.Data.Operations
import GF.Grammar.Grammar

@@ -175,18 +175,18 @@ ppTerm q d (Abs b v e) = let (xs,e') = getAbs (Abs b v e)
in prec d 0 ('\\' <> commaPunct ppBind xs <+> "->" <+> ppTerm q 0 e')
ppTerm q d (T TRaw xs) = case getCTable (T TRaw xs) of
([],_) -> "table" <+> '{' $$
nest 2 (vcat (punctuate ';' (map (ppCase q) xs))) $$
'}'
(vs,e) -> prec d 0 ("\\\\" <> commaPunct pp vs <+> "=>" <+> ppTerm q 0 e)
ppTerm q d (T (TTyped t) xs) = "table" <+> ppTerm q 0 t <+> '{' $$
nest 2 (vcat (punctuate ';' (map (ppCase q) xs))) $$
'}'
ppTerm q d (T (TComp t) xs) = "table" <+> ppTerm q 0 t <+> '{' $$
nest 2 (vcat (punctuate ';' (map (ppCase q) xs))) $$
'}'
ppTerm q d (T (TWild t) xs) = "table" <+> ppTerm q 0 t <+> '{' $$
nest 2 (vcat (punctuate ';' (map (ppCase q) xs))) $$
'}'
ppTerm q d (Prod bt x a b)= if x == identW && bt == Explicit
then prec d 0 (ppTerm q 4 a <+> "->" <+> ppTerm q 0 b)
else prec d 0 (parens (ppBind (bt,x) <+> ':' <+> ppTerm q 0 a) <+> "->" <+> ppTerm q 0 b)
@@ -198,14 +198,14 @@ ppTerm q d (C e1 e2) =prec d 1 (hang (ppTerm q 2 e1) 2 ("++" <+> ppTerm q 1 e
ppTerm q d (Glue e1 e2) =prec d 2 (ppTerm q 3 e1 <+> '+' <+> ppTerm q 2 e2)
ppTerm q d (S x y) = case x of
T annot xs -> let e = case annot of
TRaw -> y
TTyped t -> Typed y t
TComp t -> Typed y t
TWild t -> Typed y t
in "case" <+> ppTerm q 0 e <+>"of" <+> '{' $$
nest 2 (vcat (punctuate ';' (map (ppCase q) xs))) $$
'}'
_ -> prec d 3 (hang (ppTerm q 3 x) 2 ("!" <+> ppTerm q 4 y))
ppTerm q d (ExtR x y) = prec d 3 (ppTerm q 3 x <+> "**" <+> ppTerm q 4 y)
ppTerm q d (App x y) = prec d 4 (ppTerm q 4 x <+> ppTerm q 5 y)
ppTerm q d (V e es) = hang "table" 2 (sep [ppTerm q 6 e,brackets (fsep (punctuate ';' (map (ppTerm q 0) es)))])
@@ -362,3 +362,4 @@ getLet :: Term -> ([LocalDef], Term)
getLet (Let l e) = let (ls,e') = getLet e
in (l:ls,e')
getLet e = ([],e)

@@ -12,16 +12,15 @@
-- (Description of the module)
-----------------------------------------------------------------------------
- module GF.Grammar.Values (
- -- ** Values used in TC type checking
- Val(..), Env,
- -- ** Annotated tree used in editing
+ module GF.Grammar.Values (-- ** Values used in TC type checking
+ Val(..), Env,
+ -- ** Annotated tree used in editing
Binds, Constraints, MetaSubst,
-- ** For TC
valAbsInt, valAbsFloat, valAbsString, vType,
isPredefCat,
eType,
) where
import GF.Infra.Ident
import GF.Grammar.Grammar

@@ -1,34 +1,13 @@
{-# LANGUAGE CPP #-}
- {-# LANGUAGE TemplateHaskell #-}
module GF.Infra.BuildInfo where
import System.Info
import Data.Version(showVersion)
- import Language.Haskell.TH.Syntax
- import Control.Monad.IO.Class
- import Control.Exception
- import Data.Time hiding (buildTime)
- import System.Process
- -- Use Template Haskell to get compile time
- buildTime :: String
- buildTime = $(do
- timeZone <- liftIO getCurrentTimeZone
- time <- liftIO $ utcToLocalTime timeZone <$> getCurrentTime
- return $ LitE $ StringL $ formatTime defaultTimeLocale "%F %T" time )
- -- Use Template Haskell to get current Git information
- gitInfo :: String
- gitInfo = $(do
- info <- liftIO $ try $ readProcess "git" ["log", "--format=commit %h tag %(describe:tags=true)", "-1"] "" :: Q (Either SomeException String)
- return $ LitE $ StringL $ either (\_ -> "unavailable") id info )
{-# NOINLINE buildInfo #-}
buildInfo =
"Built on "++os++"/"++arch
- ++" with "++compilerName++"-"++showVersion compilerVersion ++ " at " ++ buildTime ++ "\nGit info: " ++ gitInfo
+ ++" with "++compilerName++"-"++showVersion compilerVersion
- ++"\nFlags:"
+ ++", flags:"
#ifdef USE_INTERRUPT
++" interrupt"
#endif

@@ -14,10 +14,10 @@
module GF.Infra.CheckM
(Check, CheckResult, Message, runCheck, runCheck',
checkError, checkCond, checkWarn, checkWarnings, checkAccumError,
checkIn, checkInModule, checkMap, checkMapRecover,
parallelCheck, accumulateError, commitCheck,
) where
import Prelude hiding ((<>)) -- GHC 8.4.1 clash with Text.PrettyPrint
import GF.Data.Operations
@@ -48,7 +48,7 @@ newtype Check a
instance Functor Check where fmap = liftM
instance Monad Check where
- return = pure
+ return x = Check $ \{-ctxt-} ws -> (ws,Success x)
f >>= g = Check $ \{-ctxt-} ws ->
case unCheck f {-ctxt-} ws of
(ws,Success x) -> unCheck (g x) {-ctxt-} ws
@@ -58,7 +58,7 @@ instance Fail.MonadFail Check where
fail = raise
instance Applicative Check where
- pure x = Check $ \{-ctxt-} ws -> (ws,Success x)
+ pure = return
(<*>) = ap
instance ErrorMonad Check where

@@ -13,18 +13,18 @@
-----------------------------------------------------------------------------
module GF.Infra.Ident (-- ** Identifiers
ModuleName(..), moduleNameS,
Ident, ident2utf8, showIdent, prefixIdent,
-- *** Normal identifiers (returned by the parser)
identS, identC, identW,
-- *** Special identifiers for internal use
identV, identA, identAV,
argIdent, isArgIdent, getArgIndex,
varStr, varX, isWildIdent, varIndex,
-- *** Raw identifiers
RawIdent, rawIdentS, rawIdentC, ident2raw, prefixRawIdent,
isPrefixOf, showRawIdent
) where
import qualified Data.ByteString.UTF8 as UTF8
import qualified Data.ByteString.Char8 as BS(append,isPrefixOf)
@@ -77,6 +77,7 @@ instance Binary RawIdent where
put = put . rawId2utf8
get = fmap rawIdentC get
-- | This function should be used with care, since the returned ByteString is
-- UTF-8-encoded.
ident2utf8 :: Ident -> UTF8.ByteString
@@ -87,7 +88,6 @@ ident2utf8 i = case i of
IAV (Id s) b j -> BS.append s (pack ('_':show b ++ '_':show j))
IW -> pack "_"
- ident2raw :: Ident -> RawIdent
ident2raw = Id . ident2utf8
showIdent :: Ident -> String
@@ -95,14 +95,13 @@ showIdent i = unpack $! ident2utf8 i
instance Pretty Ident where pp = pp . showIdent
- instance Pretty RawIdent where pp = pp . showRawIdent
identS :: String -> Ident
identS = identC . rawIdentS
identC :: RawIdent -> Ident
identW :: Ident
prefixIdent :: String -> Ident -> Ident
prefixIdent pref = identC . Id . BS.append (pack pref) . ident2utf8

@@ -87,7 +87,8 @@ data Verbosity = Quiet | Normal | Verbose | Debug
data Phase = Preproc | Convert | Compile | Link
deriving (Show,Eq,Ord)
- data OutputFormat = FmtPGFPretty
+ data OutputFormat = FmtLPGF
+ | FmtPGFPretty
| FmtCanonicalGF
| FmtCanonicalJson
| FmtJavaScript
@@ -131,13 +132,8 @@ data CFGTransform = CFGNoLR
| CFGRemoveCycles
deriving (Show,Eq,Ord)
- data HaskellOption = HaskellNoPrefix
- | HaskellGADT
- | HaskellLexical
- | HaskellConcrete
- | HaskellVariants
- | HaskellData
- | HaskellPGF2
+ data HaskellOption = HaskellNoPrefix | HaskellGADT | HaskellLexical
+ | HaskellConcrete | HaskellVariants | HaskellData
deriving (Show,Eq,Ord)
data Warning = WarnMissingLincat data Warning = WarnMissingLincat
@@ -335,7 +331,7 @@ optDescr =
Option ['f'] ["output-format"] (ReqArg outFmt "FMT")
(unlines ["Output format. FMT can be one of:",
"Canonical GF grammar: canonical_gf, canonical_json, (and haskell with option --haskell=concrete)",
- "Multiple concrete: pgf (default), json, js, pgf_pretty, prolog, python, ...", -- gar,
+ "Multiple concrete: pgf (default), lpgf, json, js, pgf_pretty, prolog, python, ...", -- gar,
"Single concrete only: bnf, ebnf, fa, gsl, jsgf, regexp, slf, srgs_xml, srgs_abnf, vxml, ....", -- cf, lbnf,
"Abstract only: haskell, ..."]), -- prolog_abs,
Option [] ["sisr"] (ReqArg sisrFmt "FMT")
@@ -477,7 +473,8 @@ outputFormats = map fst outputFormatsExpl
outputFormatsExpl :: [((String,OutputFormat),String)]
outputFormatsExpl =
- [(("pgf_pretty", FmtPGFPretty),"human-readable pgf"),
+ [(("lpgf", FmtLPGF),"Linearisation-only PGF"),
+ (("pgf_pretty", FmtPGFPretty),"Human-readable PGF"),
(("canonical_gf", FmtCanonicalGF),"Canonical GF source files"),
(("canonical_json", FmtCanonicalJson),"Canonical JSON source files"),
(("js", FmtJavaScript),"JavaScript (whole grammar)"),
@@ -537,8 +534,7 @@ haskellOptionNames =
("lexical", HaskellLexical),
("concrete", HaskellConcrete),
("variants", HaskellVariants),
- ("data", HaskellData),
- ("pgf2", HaskellPGF2)]
+ ("data", HaskellData)]
-- | This is for bacward compatibility. Since GHC 6.12 we
-- started using the native Unicode support in GHC but it

@@ -52,11 +52,11 @@ newtype SIO a = SIO {unS::PutStr->IO a}
instance Functor SIO where fmap = liftM
instance Applicative SIO where
- pure x = SIO (const (pure x))
+ pure = return
(<*>) = ap
instance Monad SIO where
- return = pure
+ return x = SIO (const (return x))
SIO m1 >>= xm2 = SIO $ \ h -> m1 h >>= \ x -> unS (xm2 x) h
instance Fail.MonadFail SIO where

@@ -32,17 +32,15 @@ import qualified Text.ParserCombinators.ReadP as RP
import System.Directory({-getCurrentDirectory,-}getAppUserDataDirectory)
import Control.Exception(SomeException,fromException,evaluate,try)
import Control.Monad.State hiding (void)
- import Control.Monad (join, when, (<=<))
import qualified GF.System.Signal as IO(runInterruptibly)
#ifdef SERVER_MODE
import GF.Server(server)
#endif
import GF.Command.Messages(welcome)
- #if !(MIN_VERSION_base(4,9,0))
- -- Needed to make it compile on GHC < 8
+ import GF.Infra.UseIO (Output)
+ -- Provides an orphan instance of MonadFail for StateT in ghc versions < 8
import Control.Monad.Trans.Instances ()
- #endif
-- | Run the GF Shell in quiet mode (@gf -run@).
mainRunGFI :: Options -> [FilePath] -> IO ()
@@ -58,7 +56,6 @@ mainGFI opts files = do
shell opts files = flip evalStateT (emptyGFEnv opts) $
do mapStateT runSIO $ importInEnv opts files
- modify $ \ gfenv0 -> gfenv0 {history = [unwords ("i":files)]}
loop
#ifdef SERVER_MODE
@@ -436,7 +433,7 @@ wc_type = cmd_name
x2 = takeWhile (\c -> isIdent c || isSpace c || c == '-' || c == '=' || c == '"') x1
cmd = case [x | (x,cs) <- RP.readP_to_S pCommand x2, all isSpace cs] of
[x] -> Just x
_ -> Nothing
isIdent c = c == '_' || c == '\'' || isAlphaNum c

@@ -12,7 +12,7 @@ import GF.Command.Abstract
import GF.Command.Parse(readCommandLine,pCommand)
import GF.Data.Operations (Err(..))
import GF.Data.Utilities(whenM,repeatM)
- import Control.Monad (join, when, (<=<))
import GF.Infra.UseIO(ioErrorText,putStrLnE)
import GF.Infra.SIO
import GF.Infra.Option
@@ -58,7 +58,6 @@ mainGFI opts files = do
shell opts files = flip evalStateT (emptyGFEnv opts) $
do mapStateT runSIO $ importInEnv opts files
- modify $ \ gfenv0 -> gfenv0 {history = [unwords ("i":files)]}
loop
{-
@@ -437,7 +436,7 @@ wc_type = cmd_name
x2 = takeWhile (\c -> isIdent c || isSpace c || c == '-' || c == '=' || c == '"') x1
cmd = case [x | (x,cs) <- RP.readP_to_S pCommand x2, all isSpace cs] of
[x] -> Just x
_ -> Nothing
isIdent c = c == '_' || c == '\'' || isAlphaNum c

@@ -16,21 +16,18 @@ import Data.Version
import System.Directory
import System.Environment (getArgs)
import System.Exit
- import GHC.IO.Encoding
- -- import GF.System.Console (setConsoleEncoding)
+ import GF.System.Console (setConsoleEncoding)
-- | Run the GF main program, taking arguments from the command line.
-- (It calls 'setConsoleEncoding' and 'getOptions', then 'mainOpts'.)
-- Run @gf --help@ for usage info.
main :: IO ()
main = do
- setLocaleEncoding utf8
- -- setConsoleEncoding
+ --setConsoleEncoding
uncurry mainOpts =<< getOptions
-- | Get and parse GF command line arguments. Fix relative paths.
-- Calls 'getArgs' and 'parseOptions'.
- getOptions :: IO (Options, [FilePath])
getOptions = do
args <- getArgs
case parseOptions args of
@@ -48,10 +45,7 @@ getOptions = do
mainOpts :: Options -> [FilePath] -> IO ()
mainOpts opts files =
case flag optMode opts of
- ModeVersion -> do datadir <- getDataDir
- putStrLn $ "Grammatical Framework (GF) version " ++ showVersion version ++ "\n" ++
- buildInfo ++ "\n" ++
- "Shared folder: " ++ datadir
+ ModeVersion -> putStrLn $ "Grammatical Framework (GF) version " ++ showVersion version ++ "\n" ++ buildInfo
ModeHelp -> putStrLn helpMessage
ModeServer port -> GFI1.mainServerGFI opts port files
ModeCompiler -> mainGFC opts files

@@ -6,7 +6,7 @@ import qualified Data.Map as M
import Control.Applicative -- for GHC<7.10
import Control.Monad(when)
import Control.Monad.State(StateT(..),get,gets,put)
- import Control.Monad.Except(ExceptT(..),runExceptT)
+ import Control.Monad.Error(ErrorT(..),Error(..))
import System.Random(randomRIO)
--import System.IO(stderr,hPutStrLn)
import GF.System.Catch(try)
@@ -108,9 +108,9 @@ handle_fcgi execute1 state0 stateM cache =
-- * Request handler
-- | Handler monad
- type HM s a = StateT (Q,s) (ExceptT Response IO) a
+ type HM s a = StateT (Q,s) (ErrorT Response IO) a
run :: HM s Response -> (Q,s) -> IO (s,Response)
- run m s = either bad ok =<< runExceptT (runStateT m s)
+ run m s = either bad ok =<< runErrorT (runStateT m s)
where
bad resp = return (snd s,resp)
ok (resp,(qs,state)) = return (state,resp)
@@ -123,12 +123,12 @@ put_qs qs = do state <- get_state; put (qs,state)
put_state state = do qs <- get_qs; put (qs,state)
err :: Response -> HM s a
- err e = StateT $ \ s -> ExceptT $ return $ Left e
+ err e = StateT $ \ s -> ErrorT $ return $ Left e
hmbracket_ :: IO () -> IO () -> HM s a -> HM s a
hmbracket_ pre post m =
do s <- get
- e <- liftIO $ bracket_ pre post $ runExceptT $ runStateT m s
+ e <- liftIO $ bracket_ pre post $ runErrorT $ runStateT m s
case e of
Left resp -> err resp
Right (a,s) -> do put s;return a
@@ -407,6 +407,9 @@ resp404 path = Response 404 [plain,xo] $ "Not found: "++path++"\n"
resp500 msg = Response 500 [plain,xo] $ "Internal error: "++msg++"\n"
resp501 msg = Response 501 [plain,xo] $ "Not implemented: "++msg++"\n"
+ instance Error Response where
+ noMsg = resp500 "no message"
+ strMsg = resp500
-- * Content types
plain = ct "text/plain" ""

@@ -12,15 +12,15 @@
-- A simple finite state network module.
-----------------------------------------------------------------------------
module GF.Speech.FiniteState (FA(..), State, NFA, DFA,
startState, finalStates,
states, transitions,
isInternal,
newFA, newFA_,
addFinalState,
newState, newStates,
newTransition, newTransitions,
insertTransitionWith, insertTransitionsWith,
mapStates, mapTransitions,
modifyTransitions,
nonLoopTransitionsTo, nonLoopTransitionsFrom,
loops,
@@ -28,11 +28,11 @@ module GF.Speech.FiniteState (FA(..), State, NFA, DFA,
oneFinalState,
insertNFA,
onGraph,
moveLabelsToNodes, removeTrivialEmptyNodes,
minimize,
dfa2nfa,
unusedNames, renameStates,
prFAGraphviz, faToGraphviz) where
import Data.List
import Data.Maybe
@@ -145,7 +145,7 @@ renameStates :: Ord x => [y] -- ^ Infinite supply of new names
renameStates supply (FA g s fs) = FA (renameNodes newName rest g) s' fs'
where (ns,rest) = splitAt (length (nodes g)) supply
newNodes = Map.fromList (zip (map fst (nodes g)) ns)
newName n = Map.findWithDefault (error "FiniteState.newName") n newNodes
s' = newName s
fs' = map newName fs
@@ -182,9 +182,9 @@ oneFinalState nl el fa =
moveLabelsToNodes :: (Ord n,Eq a) => FA n () (Maybe a) -> FA n (Maybe a) ()
moveLabelsToNodes = onGraph f
where f g@(Graph c _ _) = Graph c' ns (concat ess)
where is = [ ((n,l),inc) | (n, (l,inc,_)) <- Map.toList (nodeInfo g)]
(c',is') = mapAccumL fixIncoming c is
(ns,ess) = unzip (concat is')
-- | Remove empty nodes which are not start or final, and have
@@ -234,17 +234,17 @@ fixIncoming :: (Ord n, Eq a) => [n]
-- incoming edges.
fixIncoming cs c@((n,()),es) = (cs'', ((n,Nothing),es'):newContexts)
where ls = nub $ map edgeLabel es
(cs',cs'') = splitAt (length ls) cs
newNodes = zip cs' ls
es' = [ (x,n,()) | x <- map fst newNodes ]
-- separate cyclic and non-cyclic edges
(cyc,ncyc) = partition (\ (f,_,_) -> f == n) es
-- keep all incoming non-cyclic edges with the right label
to (x,l) = [ (f,x,()) | (f,_,l') <- ncyc, l == l']
-- for each cyclic edge with the right label,
-- add an edge from each of the new nodes (including this one)
++ [ (y,x,()) | (f,_,l') <- cyc, l == l', (y,_) <- newNodes]
newContexts = [ (v, to v) | v <- newNodes ]
--alphabet :: Eq b => Graph n a (Maybe b) -> [b]
--alphabet = nub . catMaybes . map edgeLabel . edges
@@ -254,18 +254,18 @@ determinize (FA g s f) = let (ns,es) = h (Set.singleton start) Set.empty Set.emp
(ns',es') = (Set.toList ns, Set.toList es)
final = filter isDFAFinal ns'
fa = FA (Graph undefined [(n,()) | n <- ns'] es') start final
in renameStates [0..] fa
where info = nodeInfo g
-- reach = nodesReachable out
start = closure info $ Set.singleton s
isDFAFinal n = not (Set.null (Set.fromList f `Set.intersection` n))
h currentStates oldStates es
| Set.null currentStates = (oldStates,es)
| otherwise = ((h $! uniqueNewStates) $! allOldStates) $! es'
where
allOldStates = oldStates `Set.union` currentStates
(newStates,es') = new (Set.toList currentStates) Set.empty es
uniqueNewStates = newStates Set.\\ allOldStates
-- Get the sets of states reachable from the given states
-- by consuming one symbol, and the associated edges.
new [] rs es = (rs,es)
@@ -296,8 +296,8 @@ reachable1 info ns = Map.fromListWith (++) [(c, [y]) | n <- Set.toList ns, (_,y,
reverseNFA :: NFA a -> NFA a
reverseNFA (FA g s fs) = FA g''' s' [s]
where g' = reverseGraph g
(g'',s') = newNode () g'
g''' = newEdges [(s',f,Nothing) | f <- fs] g''
dfa2nfa :: DFA a -> NFA a
dfa2nfa = mapTransitions Just
@@ -316,10 +316,10 @@ faToGraphviz :: (Eq n,Show n) => FA n String String -> Dot.Graph
faToGraphviz (FA (Graph _ ns es) s f)
= Dot.Graph Dot.Directed Nothing [] (map mkNode ns) (map mkEdge es) []
where mkNode (n,l) = Dot.Node (show n) attrs
where attrs = [("label",l)]
++ if n == s then [("shape","box")] else []
++ if n `elem` f then [("style","bold")] else []
mkEdge (x,y,l) = Dot.Edge (show x) (show y) [("label",l)]
--
-- * Utilities

@@ -32,7 +32,7 @@ prGSL :: SRG -> Doc
prGSL srg = header $++$ mainCat $++$ foldr ($++$) empty (map prRule (srgRules srg))
where
header = ";GSL2.0" $$
comment ("Nuance speech recognition grammar for " ++ srgName srg) $$
comment ("Generated by GF")
mainCat = ".MAIN" <+> prCat (srgStartCat srg)
prRule (SRGRule cat rhs) = prCat cat <+> union (map prAlt rhs)

@@ -31,7 +31,7 @@ width :: Int
width = 75
jsgfPrinter :: Options
-> PGF
-> CId -> String
jsgfPrinter opts pgf cnc = renderStyle st $ prJSGF sisr $ makeNonLeftRecursiveSRG opts pgf cnc
where st = style { lineLength = width }
@@ -44,7 +44,7 @@ prJSGF sisr srg
    header = "#JSGF" <+> "V1.0" <+> "UTF-8" <+> lang <> ';' $$
             comment ("JSGF speech recognition grammar for " ++ srgName srg) $$
             comment "Generated by GF" $$
             ("grammar " ++ srgName srg ++ ";")
    lang = maybe empty pp (srgLanguage srg)
    mainCat = rule True "MAIN" [prCat (srgStartCat srg)]
    prRule (SRGRule cat rhs) = rule (isExternalCat srg cat) cat (map prAlt rhs)
@@ -110,3 +110,4 @@ prepunctuate p (x:xs) = x : map (p <>) xs
($++$) :: Doc -> Doc -> Doc
x $++$ y = x $$ emptyLine $$ y
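`$++$` composes two pretty-printer documents vertically with a blank line between them. A minimal standalone sketch using `Text.PrettyPrint` (assuming `emptyLine` is simply `text ""`, which matches how it is used here):

```haskell
import Text.PrettyPrint

-- Assumed helper: a zero-width text renders as a blank line under ($$).
emptyLine :: Doc
emptyLine = text ""

-- Vertical composition with one blank line in between.
($++$) :: Doc -> Doc -> Doc
x $++$ y = x $$ emptyLine $$ y

main :: IO ()
main = putStrLn (render (text "ruleA;" $++$ text "ruleB;"))
```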


@@ -40,20 +40,20 @@ import qualified Data.Set as Set
--import Debug.Trace

data SRG = SRG { srgName :: String            -- ^ grammar name
               , srgStartCat :: Cat           -- ^ start category name
               , srgExternalCats :: Set Cat
               , srgLanguage :: Maybe String  -- ^ The language for which the grammar
                                              --   is intended, e.g. en-UK
               , srgRules :: [SRGRule]
               }
          deriving (Eq,Show)

data SRGRule = SRGRule Cat [SRGAlt]
  deriving (Eq,Show)

-- | maybe a probability, a rule name and an EBNF right-hand side
data SRGAlt = SRGAlt (Maybe Double) CFTerm SRGItem
  deriving (Eq,Show)

type SRGItem = RE SRGSymbol
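As a reading aid, here is a toy value of an `SRG`-like record, with the project-specific types (`Cat`, `SRGRule`, `RE SRGSymbol`) replaced by hypothetical stand-ins; it only illustrates the record's shape, not the real constructors:

```haskell
import qualified Data.Set as Set

-- Hypothetical simplified stand-ins for the module's Cat and SRG types.
type Cat' = String

data SRG' = SRG'
  { srgName'         :: String
  , srgStartCat'     :: Cat'
  , srgExternalCats' :: Set.Set Cat'
  , srgLanguage'     :: Maybe String  -- e.g. en-UK
  } deriving (Eq, Show)

demo :: SRG'
demo = SRG'
  { srgName'         = "PhrasebookEng"
  , srgStartCat'     = "Phrase"
  , srgExternalCats' = Set.fromList ["Greeting", "Phrase"]
  , srgLanguage'     = Just "en-UK"
  }

main :: IO ()
main = print demo
```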
@@ -111,10 +111,10 @@ makeNonRecursiveSRG opts = mkSRG cfgToSRG id
mkSRG :: (CFG -> [SRGRule]) -> (CFG -> CFG) -> PGF -> CId -> SRG
mkSRG mkRules preprocess pgf cnc =
    SRG { srgName = showCId cnc,
          srgStartCat = cfgStartCat cfg,
          srgExternalCats = cfgExternalCats cfg,
          srgLanguage = languageCode pgf cnc,
          srgRules = mkRules cfg }
  where cfg = renameCats (showCId cnc) $ preprocess $ pgfToCFG pgf cnc

-- | Renames all external cats C to C_cat, and all internal cats C_X (where X is any string),


@@ -38,7 +38,7 @@ width :: Int
width = 75

srgsAbnfPrinter :: Options
                -> PGF -> CId -> String
srgsAbnfPrinter opts pgf cnc = showDoc $ prABNF sisr $ makeNonLeftRecursiveSRG opts pgf cnc
  where sisr = flag optSISR opts
@@ -125,3 +125,4 @@ prepunctuate p (x:xs) = x : map (p <>) xs
($++$) :: Doc -> Doc -> Doc
x $++$ y = x $$ emptyLine $$ y


@@ -37,7 +37,7 @@ prSrgsXml sisr srg = showXMLDoc (optimizeSRGS xmlGr)
    [meta "description"
          ("SRGS XML speech recognition grammar for " ++ srgName srg ++ "."),
     meta "generator" "Grammatical Framework"]
    ++ map ruleToXML (srgRules srg)
  ruleToXML (SRGRule cat alts) = Tag "rule" ([("id",cat)]++pub) (prRhs alts)
    where pub = if isExternalCat srg cat then [("scope","public")] else []
          prRhs rhss = [oneOf (map (mkProd sisr) rhss)]
@@ -81,12 +81,12 @@ oneOf = Tag "one-of" []
grammar :: Maybe SISRFormat
        -> String        -- ^ root
        -> Maybe String  -- ^ language
        -> [XML] -> XML
grammar sisr root ml =
    Tag "grammar" $ [("xmlns","http://www.w3.org/2001/06/grammar"),
                     ("version","1.0"),
                     ("mode","voice"),
                     ("root",root)]
                    ++ (if isJust sisr then [("tag-format","semantics/1.0")] else [])
                    ++ maybe [] (\l -> [("xml:lang", l)]) ml
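The attribute list in `grammar` is assembled by appending empty-or-singleton lists for each optional attribute. A self-contained sketch of that pattern (`mkAttrs` is a hypothetical helper, not the module's real API):

```haskell
import Data.Maybe (isJust)

type Attr = (String, String)

-- Hypothetical helper mirroring how `grammar` builds its attribute list:
-- mandatory attributes first, then each optional attribute contributes
-- either a singleton list or nothing at all.
mkAttrs :: Bool -> String -> Maybe String -> [Attr]
mkAttrs sisr root ml =
  [("version", "1.0"), ("mode", "voice"), ("root", root)]
  ++ (if sisr then [("tag-format", "semantics/1.0")] else [])
  ++ maybe [] (\l -> [("xml:lang", l)]) ml

main :: IO ()
main = mapM_ print (mkAttrs True "MAIN" (Just "en-UK"))
```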

Some files were not shown because too many files have changed in this diff.