forked from GitHub/gf-core

Compare commits


348 Commits

Author SHA1 Message Date
John J. Camilleri
a27b07542d Add run-on-grammar canonical test script 2021-07-01 14:05:30 +02:00
John J. Camilleri
78b73fba20 Make cleanupRecordFields also recurse into variants
It's possible that more constructors need to be handled
2021-07-01 13:53:33 +02:00
John J. Camilleri
e5a2aed5b6 Remove record fields not in lincat
Fixes #100, #101
2021-07-01 11:47:14 +02:00
John J. Camilleri
13575b093f Add top-level signatures and general code cleanup 2021-07-01 10:13:42 +02:00
John J. Camilleri
32be75ca7d Reduce Phrasebook grammars in testsuite/canonical to bare minimum 2021-07-01 09:22:57 +02:00
John J. Camilleri
587004f985 Sort record fields in lin definitions
Fixes #102
2021-06-30 14:14:54 +02:00
John J. Camilleri
4436cb101e Move testsuite/compiler/canonical one level up, update test script 2021-06-30 13:47:15 +02:00
John J. Camilleri
0f5be0bbaa Add shell script in testsuite/compiler/canonical for replicating known issues
Ideally this is integrated into a proper test suite, but that's too much overhead for now
2021-06-30 12:41:56 +02:00
John J. Camilleri
0a70eca6e2 Make GF.Grammar.Canonical.Id a type synonym for GF.Infra.Ident.RawIdent
This avoids a lot of conversion back and forth between Strings and ByteStrings

This commit was cherry-picked from d0c27cdaae (lpgf branch)
2021-06-30 10:58:23 +02:00
Inari Listenmaa
6efbd23c5c Merge pull request #84 from ffrixslee/issue-46
Issue 46 (various deprecations during compilation of GF)
2021-06-29 23:48:00 +02:00
John J. Camilleri
3a27fa0d39 Add another = 2021-06-24 09:34:27 +02:00
John J. Camilleri
1ba5449d21 Update pgf.cabal, with minor changes to other cabal files 2021-06-24 09:31:37 +02:00
John J. Camilleri
cf9afa8f74 Update README.md
Add `stack install` as alternative to `cabal install`
2021-06-23 09:20:44 +02:00
John J. Camilleri
91d2ecf23c Update RELEASE.md
Add link to gf maintainers on Hackage.
2021-06-23 09:16:03 +02:00
John J. Camilleri
8206143328 Merge pull request #106 from GrammaticalFramework/stack-yaml-symlink
In the end, just some minor additions to Stack files. See discussion for more.
2021-06-22 13:37:13 +02:00
John J. Camilleri
5564a2f244 Make stack.yaml a regular file again 2021-06-22 13:35:46 +02:00
John J. Camilleri
cf2eff3801 Merge branch 'master' into stack-yaml-symlink 2021-06-22 13:32:17 +02:00
Inari Listenmaa
5a53a38247 Merge pull request #114 from 1Regina/fix-tests
Fix tests
2021-06-18 05:27:38 +02:00
Andreas Källberg
02671cafd0 Disable cabal tests
The test suite isn't currently able to find the gf executable on cabal
2021-06-17 20:20:18 +08:00
Andreas Källberg
0a18688788 Remove gf-lib-path from testsuite
Since it no longer depends on RGL and it caused issues in the testsuite
2021-06-17 19:24:14 +08:00
Andreas Källberg
889be1ab8e Enable tests in github actions 2021-06-17 16:42:04 +08:00
Andreas Källberg
65522a63c3 Testsuite: Add support for expected failures
And mark the currently failing tests as expected failures
2021-06-17 16:38:33 +08:00
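The "expected failures" idea in the commit above — mark known-failing tests so they don't turn the build red, while still flagging when one unexpectedly passes — can be sketched in a few lines. This is a hypothetical Python illustration of the concept, not GF's actual (Haskell) testsuite code:

```python
# Hypothetical sketch of "expected failure" support in a minimal test
# runner; names and structure are illustrative, not from the GF testsuite.

def run_tests(tests, expected_failures):
    """Run each test; a failure is only fatal if it is not expected.

    tests: dict mapping test name -> zero-argument callable
    expected_failures: set of names known to fail
    """
    results = {}
    for name, test in tests.items():
        try:
            test()
            passed = True
        except AssertionError:
            passed = False
        if name in expected_failures:
            # a known failure becomes "xfail"; an unexpected pass is
            # surfaced as "xpass" so the marker can be removed
            results[name] = "xpass" if passed else "xfail"
        else:
            results[name] = "pass" if passed else "fail"
    return results
```

With this scheme, only `fail` results (and, depending on policy, `xpass`) need to break the CI build.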
Andreas Källberg
7065125e19 Fix "canonicalizePath: does not exist" issue on ghc-7.10
This caused failures in the test suite
Only fixes it for stack builds.
We should probably add constraints to the cabal file as well
2021-06-16 15:30:24 +08:00
Andreas Källberg
2c37e7dfad Fix build for ghc-7.10.3 2021-06-16 14:54:36 +08:00
Andreas Källberg
f505d88a8e Fix build of test suite on ghc-8.2.2 2021-06-16 14:27:19 +08:00
Andreas Källberg
b1ed63b089 Don't print stack traces in Command.hs
They don't provide useful info anyway, and they are needlessly verbose.
2021-06-16 14:26:22 +08:00
Inari Listenmaa
f23031ea1d Add command ai f to trigger error msg 2021-06-16 12:23:07 +08:00
Inari Listenmaa
c3153134b7 Remove CStr [] which causes error, update gold 2021-06-16 12:19:35 +08:00
Inari Listenmaa
fd4fb62b9e Add output files for test suite in gitignore 2021-06-11 13:55:20 +08:00
Inari Listenmaa
53c3afbd6f Remove CallStack outputs from gold files
Rather, we should not output these, or output them in a nicer way.
2021-06-11 13:55:04 +08:00
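The commit above removes GHC `CallStack` traces from the gold files because they are volatile. An alternative (sketched here hypothetically in Python; the real testsuite is Haskell) is to normalize both the program output and the gold file before comparing, stripping the `CallStack` block:

```python
# Illustrative sketch only: filter out GHC "CallStack (from HasCallStack):"
# lines and their indented frames before a gold-file comparison.

def normalize(text):
    kept, skipping = [], False
    for line in text.splitlines():
        if line.lstrip().startswith("CallStack"):
            skipping = True      # start of a CallStack block
            continue
        if skipping and line.startswith(" "):
            continue             # indented continuation of the trace
        skipping = False
        kept.append(line)
    return "\n".join(kept)

def matches_gold(output, gold):
    """Compare output against a gold transcript, ignoring CallStack noise."""
    return normalize(output) == normalize(gold)
```

This keeps the gold files stable across GHC versions without losing the rest of the output.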
Tristan Koh
544b39a8a5 changed build wheels repo link from master to main 2021-06-11 13:23:18 +08:00
Jacob Tan En
6179d79e72 Update gf.cabal
`cabal install` needs this
2021-06-11 13:23:18 +08:00
Jacob Tan En
ecb19013c0 Update index-3.11.md
`cabal install` is fragile and can fail if the GHC on the PATH is an incompatible version.

Use ghcup to use a GHC version that is known to work.
2021-06-11 13:23:18 +08:00
1Regina
c416571406 Rectified gold files 2021-06-11 12:14:49 +08:00
1Regina
a1372040b4 Add RGL dependencies - Prelude and Predef 2021-06-11 11:47:03 +08:00
1Regina
67fcf21577 remove testsuite/libraries 2021-06-11 11:43:41 +08:00
Inari Listenmaa
a7ab610f95 Merge pull request #113 from TristanKoh/master
Changed build wheels repo link from master to main
2021-06-10 07:02:55 +02:00
Tristan Koh
e5b8fa095b changed build wheels repo link from master to main 2021-06-10 12:00:57 +08:00
Inari Listenmaa
6beebbac2b Merge pull request #111 from 2jacobtan/patch-2
Update gf.cabal
2021-06-10 05:46:45 +02:00
Inari Listenmaa
95917a7715 Merge pull request #110 from 2jacobtan/patch-1
Update index-3.11.md
2021-06-10 01:17:27 +02:00
Jacob Tan En
de8b23c014 Update gf.cabal
`cabal install` needs this
2021-06-09 19:56:08 +08:00
Jacob Tan En
098541dda2 Update index-3.11.md
`cabal install` is fragile and can fail if the GHC on the PATH is an incompatible version.

Use ghcup to use a GHC version that is known to work.
2021-06-09 18:31:16 +08:00
1Regina
af87664d27 Merge branch 'enable-tests' of https://github.com/kharus/gf-core into fix-tests
to continue working from ruslan tests
2021-06-09 10:39:49 +08:00
krangelov
af1360d37e allow parameter cat in the Web API for parsing 2021-05-27 11:45:31 +02:00
krangelov
eeda03e9b0 added news 2021-05-05 15:04:15 +02:00
John J. Camilleri
7042768054 Merge pull request #107 from GrammaticalFramework/pgf2-complete
Add complete function to PGF2
2021-05-03 22:49:31 +02:00
John J. Camilleri
84fd431afd Managed to get completion working in PGF2 2021-05-03 22:28:48 +02:00
John J. Camilleri
588cd6ddb1 Improvement to test script, distinguishes when input ends with whitespace 2021-05-03 20:51:24 +02:00
John J. Camilleri
437bd8e7f9 Add proper error handling in complete 2021-05-03 20:36:31 +02:00
John J. Camilleri
e56d1b2959 Second attempt. Reading enum is closer to working but all strings are empty. 2021-05-03 14:25:35 +02:00
John J. Camilleri
450368f9bb First attempt at adding support for complete in PGF2 (gives segmentation faults) 2021-05-03 13:19:08 +02:00
John J. Camilleri
07fd41294a Comment out c-runtime flag by default 2021-05-03 10:33:36 +02:00
John J. Camilleri
4729d22c36 Make stack.yaml an actual symlink to stack-ghc8.6.5.yaml. Add some commented flags in stack files. 2021-05-03 10:24:26 +02:00
John J. Camilleri
60bc752a6f Add note about type-checking dynamic expressions in PGF2 Haddock
Closes #72
2021-04-30 14:59:20 +02:00
John J. Camilleri
91278e2b4b Remove notice from build scripts that example grammars are no longer included 2021-04-30 13:39:15 +02:00
John J. Camilleri
9b4f2dd18b Remove notice from build scripts that the RGL is no longer included 2021-03-08 13:48:30 +01:00
Inari Listenmaa
9dda5dfa8a (Homepage) Link to summer school 2021 2021-03-01 04:23:39 +01:00
Inari Listenmaa
2fd94f5f57 Merge pull request #99 from inariksit/refman-updates
(refman) Add section about lists + links to my blog
2021-03-01 04:16:47 +01:00
krangelov
ba3e09cc38 Merge branch 'master' of https://github.com/GrammaticalFramework/gf-core 2021-02-26 14:59:18 +01:00
krangelov
8fbfc0b4a9 bracketedLinearize now reports the binding spots just like the Haskell binding 2021-02-26 14:58:31 +01:00
Inari Listenmaa
f9b8653ab2 (refman) Add section about lists + links to my blog 2021-02-22 23:18:42 +08:00
Inari Listenmaa
173fca7f12 Merge pull request #93 from inariksit/notYet-errormsg
Fix #92 (inappropriate error message)
2021-02-06 14:04:45 +01:00
Inari Listenmaa
c6ff3e0c5e Update also the stack setup 2021-01-25 18:48:46 +08:00
Inari Listenmaa
8a85dbc66f Update Github actions to latest haskell-setup 2021-01-25 18:43:25 +08:00
krangelov
655173932e fix type signature 2021-01-20 20:38:19 +01:00
krangelov
04f6f113f0 Merge branch 'master' of https://github.com/GrammaticalFramework/gf-core 2021-01-20 20:16:26 +01:00
krangelov
bac619f025 fix gu_map_next 2021-01-20 20:15:40 +01:00
Inari Listenmaa
1a466c14c8 Don't print out the error msg for pattern matching unnecessarily 2021-01-20 01:15:28 +08:00
Inari Listenmaa
d77921005a Specify that the Windows guide is for Windows 10 2020-12-16 17:38:13 +01:00
John J. Camilleri
2b6b315bd7 Add note to changelog that #87 is still pending 2020-12-07 10:35:11 +01:00
John J. Camilleri
7f6bfa730b Update changelog for 3.11 by going through all commit messages since GF-3.10 2020-12-07 10:31:19 +01:00
John J. Camilleri
d6be4ec3b0 Add note about why we're not using Setup Haskell action for Ubuntu build. 2020-11-30 23:05:43 +01:00
John J. Camilleri
68ec61f44d Remove build status badge from README
The GitHub interface now provides much richer information about build status.
2020-11-30 22:03:23 +01:00
John J. Camilleri
491084e38e Merge pull request #88 from GrammaticalFramework/build-binary-packages
Build binary packages
2020-11-30 22:00:00 +01:00
John J. Camilleri
a7a6eb5581 Update release instructions 2020-11-30 21:34:50 +01:00
John J. Camilleri
4223935b12 Combine binary workflows into one with multiple jobs 2020-11-30 20:57:20 +01:00
John J. Camilleri
8dc1ed83b6 Update RELEASE.md 2020-11-30 13:01:51 +01:00
John J. Camilleri
8f3a7a3b6a Copy things into subfolders 2020-11-27 00:51:38 +01:00
John J. Camilleri
921a8981fb Install python-devel in msys2 2020-11-27 00:11:54 +01:00
John J. Camilleri
169f2c786d Install pip in msys2 2020-11-27 00:00:44 +01:00
John J. Camilleri
629a574dfa Considerable updates to 3.11 download page 2020-11-26 23:57:09 +01:00
John J. Camilleri
6b7e9c8c7a Inherit path with Python compilation too 2020-11-26 23:40:08 +01:00
John J. Camilleri
78f42774da Inherit path when running pip 2020-11-26 23:30:47 +01:00
John J. Camilleri
54c0949354 Change echo path command in powershell 2020-11-26 23:16:15 +01:00
John J. Camilleri
0632824b99 Need to add pip to msys2 path 2020-11-26 23:02:16 +01:00
John J. Camilleri
24bbeb31df Change extra lib/include directories 2020-11-26 22:38:23 +01:00
John J. Camilleri
70811d83be Merge remote-tracking branch 'origin/master' into build-binary-packages 2020-11-25 20:57:01 +01:00
John J. Camilleri
0ed6b726a2 Add lib/include envvars for Python build 2020-11-25 20:47:47 +01:00
John J. Camilleri
88252cb107 Try building Python bindings without msys2 2020-11-24 23:21:05 +01:00
John J. Camilleri
cf6468a452 Install Python dev tools 2020-11-24 23:03:46 +01:00
John J. Camilleri
3e1c69da21 First attempt at building Python bindings for Windows 2020-11-24 22:46:28 +01:00
John J. Camilleri
4bcde7d6a2 Copy compiled Java files from local 2020-11-24 22:09:37 +01:00
John J. Camilleri
78c1c099df Find jpgf.jar 2020-11-24 21:08:56 +01:00
John J. Camilleri
7501a7916e Copy jpgf.jar to dist 2020-11-24 20:47:09 +01:00
John J. Camilleri
32f451f1d7 Add jdk/bin to path 2020-11-24 20:29:04 +01:00
John J. Camilleri
aad2ba61d4 Move flag 2020-11-24 15:59:01 +01:00
John J. Camilleri
9932b10bf1 Add -D__int64=int64_t flag to Java Windows build 2020-11-24 15:48:22 +01:00
John J. Camilleri
f8da24c5ec Change include path when building Java bindings in Windows build 2020-11-24 15:36:13 +01:00
John J. Camilleri
951e439703 First attempt at building Java bindings 2020-11-24 13:06:13 +01:00
John J. Camilleri
08e6aca83d Windows testing... 2020-11-24 13:00:05 +01:00
John J. Camilleri
301f23ac55 Windows testing... 2020-11-24 12:58:21 +01:00
John J. Camilleri
e36b7cb044 Windows testing... 2020-11-24 12:57:24 +01:00
John J. Camilleri
9131581f03 Add windows-testing workflow 2020-11-24 12:52:21 +01:00
John J. Camilleri
d79fa6d22b Move DLLs into another folder first 2020-11-24 12:48:46 +01:00
John J. Camilleri
c8623e2be7 Try to find Java stuff 2020-11-24 12:40:42 +01:00
John J. Camilleri
59dda75f16 Find Java stuff 2020-11-24 12:30:53 +01:00
John J. Camilleri
cac65418ff Try yet another path 2020-11-24 12:28:32 +01:00
John J. Camilleri
e47ce2a28b Try another upload path 2020-11-24 12:19:25 +01:00
John J. Camilleri
9a697fbde4 Try different upload path 2020-11-24 12:10:43 +01:00
John J. Camilleri
43b06d5f53 Use windows path with upload-artifact 2020-11-24 10:41:26 +01:00
John J. Camilleri
ee6082d100 Typo 2020-11-24 10:25:04 +01:00
John J. Camilleri
4d2218a0d1 See what's in /mingw64/bin 2020-11-24 10:10:04 +01:00
John J. Camilleri
af9c8ee553 Add compiled C runtime to artifact in Windows build 2020-11-19 14:56:51 +01:00
John J. Camilleri
3e20e735a3 Install gcc in MSYS 2020-11-19 01:22:26 +01:00
John J. Camilleri
0a0060373b Flip slashes 2020-11-19 01:09:07 +01:00
John J. Camilleri
12ece26409 sudo mkdir 2020-11-19 00:58:37 +01:00
John J. Camilleri
424e6887b5 Attempt to build C runtime in Windows build using MSYS 2020-11-19 00:58:07 +01:00
John J. Camilleri
4987b70df7 Make directory first 2020-11-19 00:49:31 +01:00
John J. Camilleri
a072b4688b Create symlink to JNI headers in a place where the Java runtime makefile looks 2020-11-19 00:40:51 +01:00
John J. Camilleri
0b3ae5aaa2 Rename artifacts 2020-11-19 00:26:53 +01:00
John J. Camilleri
a48bbb3b13 Enable server, upload artifact (only exe) to Windows build 2020-11-19 00:08:07 +01:00
John J. Camilleri
131d196fad Add 'export CPPFLAGS' line to macOS build 2020-11-19 00:02:50 +01:00
John J. Camilleri
b0341ec42d Add more test commands for JNI in macOS; re-enable mac & Windows builds on push for testing 2020-11-18 23:48:20 +01:00
John J. Camilleri
293d05fde1 Install alex, happy in Windows build 2020-11-18 23:26:31 +01:00
John J. Camilleri
d39e4a22a8 Don't build binaries on push 2020-11-18 23:24:38 +01:00
John J. Camilleri
8e9212d059 First install dependencies in Windows build 2020-11-18 23:19:24 +01:00
John J. Camilleri
012541ff55 Remove locate command, which fails because of missing DB 2020-11-18 23:17:42 +01:00
John J. Camilleri
0d12c7101c Add debugging output to macOS build for locating JNI headers 2020-11-18 23:15:15 +01:00
John J. Camilleri
6ee7c88f34 Add first attempt at Windows build 2020-11-18 23:10:54 +01:00
John J. Camilleri
08af135653 Install openjdk in macOS build 2020-11-18 23:08:49 +01:00
krangelov
37c63a0c22 Merge branch 'master' of https://github.com/GrammaticalFramework/gf-core 2020-11-17 10:27:07 +01:00
krangelov
d4ccd2848c Take a step back and do bottom-up prediction only when needed. This solves the problem that some sentences were impossible to parse. 2020-11-17 10:26:00 +01:00
John J. Camilleri
6862098d8b Add preliminary RELEASE.md 2020-11-17 01:07:05 +01:00
John J. Camilleri
40e5f90d56 Add stubs in download/ for 3.11. Make download/index.html redirect to current version. 2020-11-17 01:05:32 +01:00
John J. Camilleri
3df552eb5d Remove 'download/gfc' 2020-11-17 00:48:31 +01:00
John J. Camilleri
dbb0bcc5dd Minor changes in comments 2020-11-17 00:44:05 +01:00
John J. Camilleri
38facbc064 Copy package from parent dir first because upload-artifact doesn't allow .. 2020-11-17 00:16:01 +01:00
John J. Camilleri
8cc901f334 Remove txt2tags and pandoc from Debian requirements 2020-11-17 00:04:57 +01:00
John J. Camilleri
8550f8deaf Remove RGL and HTML from Debian build 2020-11-17 00:01:05 +01:00
John J. Camilleri
5a6acf1d47 Replace _ with - 2020-11-16 23:50:02 +01:00
John J. Camilleri
a7ff2d0611 Upload artifact directly without copying elsewhere in between 2020-11-16 23:41:01 +01:00
John J. Camilleri
30bcafb76f Re-enable hack for finding C runtime on macOS 2020-11-16 23:18:40 +01:00
John J. Camilleri
ce9caa2726 Install alex and happy 2020-11-16 22:54:19 +01:00
John J. Camilleri
b4ccca8c18 Build on push too 2020-11-16 22:46:04 +01:00
John J. Camilleri
2dc11524fc Remove RGL from build, use 'cabal v1-' commands 2020-11-16 22:42:13 +01:00
Liyana
76bec6d71e Omitted import Except(..) 2020-11-12 09:48:15 +08:00
Ruslan Khafizov
1740181daf Enable tests 2020-11-10 19:15:57 +08:00
Liyana
2dc179239f Replaced Control.Monad.Error with Control.Monad.Except 2020-11-10 17:32:43 +08:00
Liyana
9b02385e3e Removed fromValue for boolV 2020-11-10 17:26:56 +08:00
Liyana
54e5fb6645 Added explicit implementation for 'readJSON' in the instance declaration for 'JSON PGF.Trie' 2020-11-10 17:19:18 +08:00
Liyana
8ca4baf470 Deleted redundant pattern match 2020-11-10 17:15:20 +08:00
Liyana
1f7584bf98 Added explicit implementation for 'fromValue' in instance declaration for 'Predef Bool' 2020-11-10 17:14:31 +08:00
Liyana
4364b1d9fb Replaced Control.Monad.Error with Control.Monad.Except 2020-11-10 17:11:41 +08:00
Liyana
33aad1b8de Deleted redundant pattern match 2020-11-10 17:06:35 +08:00
Liyana
dc6dd988bc Replaced inlinePerformIO with accursedUnutterablePerformIO 2020-11-10 17:01:47 +08:00
Liyana
ac81b418d6 Added readJSON error messages 2020-11-10 16:57:33 +08:00
John J. Camilleri
feed61dd30 Change setup-haskell version to just v1 2020-11-09 22:43:06 +01:00
John J. Camilleri
1c7c52da68 Use GHC 8.4.4 2020-11-09 22:11:29 +01:00
John J. Camilleri
71b10672e8 Fix macOS version at 10.13 2020-11-09 22:09:57 +01:00
John J. Camilleri
687f56178e Use newer version of setup-haskell 2020-11-09 22:05:50 +01:00
John J. Camilleri
359f1509fa Blurt out cabal version in both action and build script 2020-11-09 21:59:05 +01:00
John J. Camilleri
b1b3bc3360 Install C runtime in two places. Use cabal v1-copy. 2020-11-09 21:40:55 +01:00
John J. Camilleri
9018eabb10 Add libtool finish command from warning 2020-11-09 21:35:17 +01:00
John J. Camilleri
ed97a42fde Try it another way round 2020-11-09 21:23:52 +01:00
John J. Camilleri
f6eb94c33b Try removing something that looks like a typo 2020-11-09 21:12:23 +01:00
John J. Camilleri
6e2f34f4d0 Try to set PREFIX env var 2020-11-09 17:52:41 +01:00
John J. Camilleri
13ec9ca888 Explicitly specify env vars when building GF (test) 2020-11-09 17:43:15 +01:00
John J. Camilleri
24619bc3ee Change cabal version to 2.4, to match GHC 8.6.5 2020-11-09 17:15:13 +01:00
John J. Camilleri
399974ebfb Fix whitespace in binary build script 2020-11-09 13:55:53 +01:00
John J. Camilleri
6836360e0c Comment out cabal freeze and caching in case they were causing the build error below
https://github.com/GrammaticalFramework/gf-core/runs/1374091798?check_suite_focus=true
2020-11-09 13:55:26 +01:00
John J. Camilleri
3844277a66 Install Haskell via setup-haskell action, install build tools for C runtime 2020-11-09 13:35:03 +01:00
John J. Camilleri
86729b3efc Try to install GHC/Cabal via Homebrew 2020-11-09 13:18:57 +01:00
John J. Camilleri
beb7599d33 Add first attempt at GitHub action file for building .pkg 2020-11-09 13:15:35 +01:00
Inari Listenmaa
7dc6717b5e Merge pull request #79 from anka-213/fix-infinite-loop
Fix infinite recursion on error
2020-10-27 11:23:49 +01:00
Andreas Källberg
1ff66006b8 Fix infinite recursion on error
The implementation was meant to lift from SIO to IO,
but instead it was just the identity function,
which means that `fail = id . fail` and we have an infinite loop.
2020-10-26 17:21:22 +01:00
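The bug described above reproduces in any language: if the function meant to lift `fail` from one monad (SIO) into another (IO) is accidentally the identity, then `fail` ends up delegating to itself. A hypothetical Python sketch of the same failure mode (the real code is Haskell; all names here are illustrative):

```python
# Illustrative sketch of the "lift is accidentally the identity" bug.

def identity_lift(f):
    # intended: wrap f so it runs in the outer context (SIO -> IO);
    # bug: it just returns f unchanged, i.e. it is the identity function
    return f

def make_fail():
    def fail(msg):
        # meant to delegate to the *lifted* inner fail, but since the
        # lift is the identity, this is fail calling fail -- an
        # infinite loop, exactly `fail = id . fail`
        return lifted(msg)
    lifted = identity_lift(fail)
    return fail
```

Calling `make_fail()("boom")` recurses forever (Python raises `RecursionError`; the Haskell version simply loops).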
John J. Camilleri
db5ee0b66a Merge pull request #78 from anka-213/solve-syntax-error-bug
Fix syntax error problem for older versions of GHC
2020-10-09 11:56:21 +02:00
Andreas Källberg
7b4eeb368c Make CI green
See https://github.com/joerick/cibuildwheel/issues/446
2020-10-08 21:50:12 +02:00
Andreas Källberg
f2e4b89a22 Fix syntax error problem for older versions of GHC 2020-10-08 17:41:44 +02:00
Inari Listenmaa
670a58e7e7 Merge pull request #77 from inariksit/tutorial-fixes
Minor tweaks and updates to the tutorial
2020-10-02 20:32:21 +02:00
krangelov
f3a8658cc1 Merge branch 'master' of https://github.com/GrammaticalFramework/gf-core 2020-10-02 19:55:24 +02:00
krangelov
bfb94d1e48 fix parsing with HOAS 2020-10-02 19:34:52 +02:00
Inari Listenmaa
df77205c43 (Tutorial) Rename TV (transitive verb) to V2, the name used in RGL 2020-10-02 17:58:57 +02:00
Inari Listenmaa
e41436eb14 (Tutorial) Remove reference to morpho_list + overly verbose path 2020-10-02 17:57:35 +02:00
Inari Listenmaa
2826061251 (Tutorial) Update the pre syntax 2020-10-02 17:56:24 +02:00
Inari Listenmaa
f56fbcf86e (Tutorial) Remove mentions of pt -typecheck
The GF shell no longer has the `put_tree -typecheck` option, and typechecking is done automatically when parsing.

The metavariable thing is a bit unclear: you don't get it when parsing "dim the light" or "switch on the fan", but you do get it when you `gt` after adding `switchOn` and `switchOff`.

```
> p "switch on the fan"
CAction fan (switchOff fan) (DKindOne fan)
> gt
CAction light dim (DKindOne light)
CAction ?3 (switchOff ?3) (DKindOne ?3)
CAction ?3 (switchOn ?3) (DKindOne ?3)
```

My hypothesis is that you don't get a metavariable when parsing e.g. "dim the light", because even though `light` is suppressed in `CAction`, it still appears in `DKindOne`, so it gets to contribute to the whole tree with its string.
2020-09-29 09:23:36 +02:00
aarneranta
2c2bd158a6 link to CoLi paper, mention of iOS 2020-09-29 09:05:15 +02:00
John J. Camilleri
d95b3efd6b Add instructions for uploading PGF2 to Hackage 2020-09-18 10:49:21 +02:00
John J. Camilleri
db8b111e72 Bump PGF2 to 1.2.1 2020-09-18 10:34:45 +02:00
John J. Camilleri
ab52572f44 Fix bug where shell commands were ignored, introduced by #71 2020-09-18 09:25:08 +02:00
John J. Camilleri
6c54e5b63c Merge pull request #71 from anka-213/fix-newer-cabal
Fix support for newer stackage snapshots
2020-09-14 22:42:37 +02:00
Andreas Källberg
8bcdeedba0 Bump default stack.yaml to ghc8.6.5 2020-09-14 17:44:23 +02:00
Andreas Källberg
7d6a115cc1 Bump stackage snapshots to latest versions 2020-09-14 15:15:23 +02:00
Andreas Källberg
127a1b2842 Remove MonadFail requirements for aeson code 2020-09-12 11:04:32 +02:00
Andreas Källberg
2fd1040724 Fix incorrect type and update dependencies 2020-09-12 11:04:32 +02:00
Andreas Källberg
340f8d9b93 First attempt at github actions for stack 2020-09-12 10:55:18 +02:00
Andreas Källberg
9d8cd55cd5 Import orphan instances of MonadFail for ghc<8
Also upgrade alex/happy so automatic install works
2020-09-09 11:05:41 +02:00
Andreas Källberg
150b592aa9 Add stack file for ghc8.8.4 2020-09-08 15:10:29 +02:00
Andreas Källberg
56f94da772 Merge remote-tracking branch 'origin/master' into fix-newer-cabal 2020-09-05 21:11:12 +02:00
Andreas Källberg
57ce76dbc1 Add two more missing MonadFail imports 2020-09-05 20:57:30 +02:00
Andreas Källberg
2b23e0f27e Fix wrong indent 2020-09-05 20:45:08 +02:00
Andreas Källberg
57c1014e9f Update package database on ubuntu build
Fixes 404 error:
https://github.com/GrammaticalFramework/gf-core/runs/1076062405
2020-09-05 20:36:04 +02:00
Andreas Källberg
7268253f5a MonadFail: Make backwards-compatible 2020-09-05 20:23:07 +02:00
Andreas Källberg
1234c715fc Fix MonadFail for c-runtime as well 2020-09-05 18:57:40 +02:00
Inari Listenmaa
bca0691cb0 (Tutorial) Minor typofixes + current error message 2020-08-31 15:54:33 +02:00
Inari Listenmaa
3de9c664fd Merge pull request #73 from inariksit/video-tutorial-page
(Homepage) Change link of video tutorials to a page + small fixes
2020-08-31 15:31:31 +02:00
Inari Listenmaa
f6560d309e (Homepage) Change link of video tutorials to a page + small fixes
Also added video tutorial link to the footer.
2020-08-30 20:53:59 +02:00
Inari Listenmaa
254f03ecfe Fix wording + formatting slightly 2020-08-30 20:38:49 +02:00
Inari Listenmaa
0bb02eeb51 Add a page for all GF video tutorials 2020-08-30 20:08:17 +02:00
Inari Listenmaa
bf21b4768c (Tutorial) Fix to make calculator example compile
In abstract: startcat needs to be defined to run the commands that are shown later in the doc.
In concrete: ss and SS are defined in Prelude.
2020-08-21 13:25:16 +02:00
Andreas Källberg
47dbf9ac27 Add stack file for a more recent ghc 2020-08-19 14:13:17 +02:00
krangelov
90fc1d750e remove the deprecated pgf_print_expr_tuple 2020-08-14 21:03:48 +02:00
krangelov
24beed9a95 Merge branch 'master' of https://github.com/GrammaticalFramework/gf-core 2020-08-14 20:56:20 +02:00
krangelov
23edeec5a9 added an API for cloning expressions/types/literals 2020-08-14 20:54:15 +02:00
Inari Listenmaa
542a41fb32 Merge pull request #70 from inariksit/haskell
New features in PGF to Haskell translation
2020-08-11 14:20:28 +02:00
Andreas Källberg
85ab6daaaa Add cabal dist-newtyle to gitignore 2020-08-05 19:09:25 +02:00
Andreas Källberg
e351e7b79a Remove NoMonadFailDesugaring flag
I've fixed it so everything has the `fail` it needs now
2020-08-05 18:48:24 +02:00
Andreas Källberg
05903b271c Fix testsuite compatibility with newer Cabal 2020-08-05 18:48:24 +02:00
Andreas Källberg
3bd1f01959 Fix a few warnings 2020-08-05 18:48:24 +02:00
Andreas Källberg
0581d6827e Fix most build errors 2020-08-05 18:48:24 +02:00
Andreas Källberg
b8812b54b2 fix newer ghc: Don't try to be backwards compatible 2020-08-05 18:48:24 +02:00
Andreas Källberg
251845f83e First attempt at fixing incompatibilities with newer cabal 2020-08-05 18:48:24 +02:00
Inari Listenmaa
7c478016d0 Replace deprecated pragma with up-to-date one. (#17) 2020-08-03 19:45:20 +02:00
John J. Camilleri
deddde953f Add script for uploading PGF2 documentation to Hackage
Hackage's attempt to build the package will fail because of missing C libraries
2020-08-03 14:22:32 +02:00
John J. Camilleri
e10bb790cb Merge pull request #69 from GrammaticalFramework/pgf2-hackage
Update pgf2 metadata for uploading to Hackage

Even though this isn't able to package the C runtime itself into the Haskell package, I think these changes are still worth merging into master.
2020-08-03 10:53:51 +02:00
Inari Listenmaa
868566a319 Remove accidentally added space character in deriving clause. 2020-07-31 15:16:45 +02:00
Inari Listenmaa
aeabc955c8 Remove characters that aren't allowed in Haskell data types.
GF allows more characters in its types, as long as they are inside
single quotes. E.g. 'VP/Object' is a valid name for a GF category,
but not for a Haskell data type.
2020-07-31 15:05:46 +02:00
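The sanitization step described above — a GF category like `'VP/Object'` is legal in single quotes but not as a Haskell data constructor — can be sketched as follows. This is a hypothetical Python illustration of the idea (the actual logic lives in GF's Haskell-generation code, and all names here are made up):

```python
import string

# characters legal in a Haskell constructor name (letters, digits, _, ')
ALLOWED = set(string.ascii_letters + string.digits + "_'")

def haskell_constructor_name(gf_cat):
    """Map a quoted GF category name like 'VP/Object' to a legal
    Haskell constructor by dropping disallowed characters.
    Illustrative sketch only."""
    cleaned = "".join(c for c in gf_cat if c in ALLOWED)
    # a Haskell constructor must start with an upper-case letter
    if not cleaned or not cleaned[0].isupper():
        raise ValueError(f"cannot form a constructor from {gf_cat!r}")
    return cleaned
```

So `'VP/Object'` would come out as `VPObject`, matching the commit's "remove characters" approach rather than replacing them.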
Inari Listenmaa
030c3bfee9 Add option "data" to Haskell options.
Imports Data.Data, makes all GF types derive Data, and uses DeriveDataTypeable.
2020-07-31 12:46:19 +02:00
John J. Camilleri
c53353f087 Updates to PGF2 readme 2020-07-28 22:54:34 +02:00
John J. Camilleri
f00f0cb0ef Bump pgf2 to 1.2.0 2020-07-28 22:36:49 +02:00
John J. Camilleri
22d5f31d74 Merge remote-tracking branch 'origin/master' into pgf2-hackage 2020-07-28 22:19:15 +02:00
krangelov
830dbe760d expose parseToChart via the Web API 2020-07-26 15:56:54 +02:00
krangelov
d7965d81b4 parseToChart also returns the category 2020-07-26 15:56:21 +02:00
krangelov
a2d7f1369c filter out empty brackets 2020-07-26 15:55:18 +02:00
krangelov
0cee82f715 the if is actually unnecessary 2020-07-22 16:55:21 +02:00
krangelov
7229033e42 bugfix in bracketedLinearize 2020-07-22 16:26:05 +02:00
krangelov
8bc4cc7187 added a function for posting query strings; scales better than GET. 2020-07-21 22:29:00 +02:00
krangelov
2b09e70b4a allow specifying content-type in ajax_http 2020-07-21 13:19:19 +02:00
John J. Camilleri
38f468eed3 (pgf2) Readme, license, changelog 2020-07-11 21:06:08 +02:00
John J. Camilleri
88a73c1d9e Bump pgf2 to 1.1.0, update README 2020-07-11 20:00:25 +02:00
krangelov
77a2630ed9 revert to using unconditional probabilities in the different lookup functions 2020-07-11 09:52:43 +02:00
John J. Camilleri
f54e54123c Merge remote-tracking branch 'origin/master' into pgf2-hackage
# Conflicts:
#	src/runtime/haskell-bind/pgf2.cabal
2020-07-08 22:07:08 +02:00
John J. Camilleri
2ac796dbbc Remove PGF from PGF2, clean up PGF2 cabal file. 2020-07-08 21:55:42 +02:00
krangelov
33818076ff drop the SG library completely. 2020-07-08 21:12:01 +02:00
John J. Camilleri
47d1da0845 Merge pull request #65 from GrammaticalFramework/build-debian-package
Build Debian package via GitHub action
2020-07-07 12:02:35 +02:00
aarneranta
8a052edca2 an attempt to solve the record extension overloading bug, commented out for the moment 2020-07-06 18:01:59 +02:00
aarneranta
1360723137 fixed issue #67 on order of record fields in overloading 2020-07-06 14:27:49 +02:00
John J. Camilleri
4594c36cfb Copy packages so that upload-artifact can find them 2020-06-30 11:41:12 +02:00
John J. Camilleri
d8e88fd42a Try alternate way for uploading artifact 2020-06-30 11:25:24 +02:00
John J. Camilleri
daa2145378 Try alternate way of cloning RGL one level up 2020-06-30 11:12:10 +02:00
John J. Camilleri
398c64734c Add txt2tags and pandoc to build env 2020-06-30 10:17:34 +02:00
John J. Camilleri
eb185e5358 Add sudo to apt command
https://help.github.com/en/actions/reference/virtual-environments-for-github-hosted-runners#administrative-privileges-of-github-hosted-runners
2020-06-30 10:14:35 +02:00
John J. Camilleri
bb4ad9ec7f First attempt at GitHub action for building Debian package 2020-06-30 10:10:36 +02:00
John J. Camilleri
5777b85701 Merge pull request #62 from GrammaticalFramework/python-bindings-description
Update descriptions in Python bindings setup.py
2020-06-23 10:42:00 +02:00
John J. Camilleri
ab3c6ec4eb Update descriptions in Python bindings setup.py 2020-06-17 11:36:31 +02:00
John J. Camilleri
63a3a57620 Remove duplicate --gfo flag
Fixes #51
2020-06-16 14:59:14 +02:00
Inari Listenmaa
aa9b4d06ba Merge pull request #55 from inariksit/error-messages
Improvements in common error messages
2020-06-16 09:50:05 +02:00
John J. Camilleri
fff19f31af Add friendly name to upload job 2020-06-15 12:53:19 +02:00
John J. Camilleri
c47f2232c5 Merge pull request #61 from miracle2k/wheels2
Fix twine upload.
2020-06-15 12:38:56 +02:00
Michael Elsdörfer
c802ec6022 Only upload to PyPI on pushes to master. 2020-06-15 11:21:35 +01:00
Michael Elsdörfer
b2e6d52509 Fix twine upload. 2020-06-15 10:57:36 +01:00
John J. Camilleri
383ff5e227 Merge pull request #60 from miracle2k/wheels
WIP: Add Github action workflow to build Python wheels.
2020-06-15 08:59:40 +02:00
Michael Elsdörfer
71a98cdf00 Merge branch 'master' into wheels 2020-06-14 17:47:39 +01:00
Michael Elsdörfer
74f3f7a384 Update documentation. 2020-06-14 17:46:06 +01:00
Inari Listenmaa
3fe8c3109f (Homepage) Add new languages in list of RGL langs 2020-06-14 17:38:50 +01:00
bc²
7abad1f4bf in Fedora install instructions, use dnf
dnf is the new yum, see https://fedoramagazine.org/5tftw-2014-06-10/
2020-06-14 17:38:50 +01:00
Michael Elsdörfer
8d4eb9288a Remove references to live PyPI. 2020-06-14 17:38:01 +01:00
Michael Elsdörfer
866e91c917 Make sure sdist is included. 2020-06-14 17:38:01 +01:00
Michael Elsdörfer
6f5e25d01d Bring back fail-fast. 2020-06-14 17:38:00 +01:00
Michael Elsdörfer
9ad7d25fb4 Add upload to PyPI step. 2020-06-14 17:38:00 +01:00
Michael Elsdörfer
958da5e5e9 Add Github action workflow to build Python wheels. 2020-06-14 17:38:00 +01:00
Inari Listenmaa
f31bccca1c (Homepage) Add new languages in list of RGL langs 2020-06-06 18:15:50 +02:00
Inari Listenmaa
de8cc02ba5 Condense the unsupported token gluing as per John's suggestion 2020-06-05 19:39:31 +02:00
Inari Listenmaa
dbc7297d80 Don't output "\n **" if helpfulMsg is empty. 2020-06-04 20:19:06 +02:00
Inari Listenmaa
414c2a1a5f Add clarification to Internal error in GeneratePMCFG 2020-06-04 19:57:55 +02:00
Inari Listenmaa
dca1fcd7fe Add clarification to Unsupported token gluing (both good and bad case) 2020-06-04 19:57:38 +02:00
Inari Listenmaa
c0714b7d33 Add clarification to "expected foo, inferred bar" type of error msgs 2020-06-04 19:57:10 +02:00
Inari Listenmaa
a4e3bce6bb Add clarification to "Pattern is not linear" error msg. 2020-06-04 19:56:31 +02:00
Inari Listenmaa
9a903c166f Add suggestions to error messages that are caused by too few/many args 2020-06-04 17:56:13 +02:00
Inari Listenmaa
4414c3a9c8 Merge pull request #54 from odanoburu/patch-1
in Fedora install instructions, use dnf
2020-06-03 07:00:26 +02:00
bc²
11201d8645 in Fedora install instructions, use dnf
dnf is the new yum, see https://fedoramagazine.org/5tftw-2014-06-10/
2020-06-02 20:54:15 -03:00
John J. Camilleri
5846622c4d Homepage: fix some spacing, add RGL browser, add news about GFSS 2020 2020-06-01 11:18:06 +02:00
John J. Camilleri
d8e543a4e6 Add link to Inari's video tutorial from homepage 2020-06-01 09:44:38 +02:00
krangelov
0a915199e8 allow literals in the C shell 2020-05-20 20:18:47 +02:00
krangelov
165c5a6d9d bugfix in parsing literals 2020-05-20 19:57:33 +02:00
krangelov
0ad1c352fe expose lookupCohorts in Python 2020-05-20 16:12:50 +02:00
krangelov
48d3973daa fix the ugly typo that broke the literals. 2020-05-20 10:15:53 +02:00
krangelov
9a1f982b14 split unknown words by spaces 2020-05-16 08:32:00 +02:00
krangelov
e8653135d4 Merge branch 'master' of https://github.com/GrammaticalFramework/gf-core 2020-05-14 15:05:13 +02:00
krangelov
62bc78380e lookupCohorts now detects and reports unknown words. Also:
- added two filtering functions: filterLongest and filterBest
- updated the PGF service to work with the new API
2020-05-14 15:03:30 +02:00
Inari Listenmaa
dda348776e Add Lauri Alanko to the list of contributors 2020-05-11 13:32:26 +02:00
aarneranta
65c810f085 accepting gf-ud style abslabels in gf-core ; cnclabels TODO 2020-05-05 15:46:48 +02:00
Krasimir Angelov
b962bcd178 Merge pull request #50 from jdahlin/patch-1
Expose PGF/Concr/Iter/Bracket
2020-04-02 09:14:53 +02:00
Johan Dahlin
589c358389 Expose PGF/Concr/Iter/Bracket
Expose the remaining types in the module.

These are helpful for example in type annotations.
2020-04-01 21:37:13 -03:00
krangelov
57a1ea5b56 forgot the conversion of the annotation string to jstring 2020-04-01 16:26:03 +02:00
krangelov
762d83c1f0 switch off the debugger 2020-03-25 20:03:06 +01:00
krangelov
733fdac755 restore the sequence ordering after -optimize-pgf 2020-03-15 19:57:47 +01:00
krangelov
00e25d0ccb an API to access the names of all fields within a category 2020-03-06 12:29:08 +01:00
krangelov
9806232532 fix the build after the change in the morphological API 2020-03-05 12:24:01 +01:00
krangelov
88f76ef671 lookup_morpho, lookup_cohorts and fullform_lexicon now report the usual conditional probability. This turns out to be more useful for part of speech tagging 2020-03-05 12:04:42 +01:00
krangelov
f22bd70585 The APIs for callbacks and the bracketed strings now use a string for the analysis instead of an integer. This is now consistent with lookupMorpho and friends 2020-03-05 11:58:21 +01:00
krangelov
3133900125 another bug related to pattern matching on multiword expressions 2020-02-18 15:07:47 +01:00
krangelov
e15392e579 fix: pattern matching on strings should reconstruct the tokens after matching 2020-02-17 19:29:36 +01:00
krangelov
9604a6309c fix the compilation of case insensitive grammars 2020-02-17 12:40:14 +01:00
krangelov
98a18843da support command c-lookupCohorts 2020-02-13 14:51:03 +01:00
krangelov
61641e7a59 support post requests to the server 2020-02-13 14:50:23 +01:00
krangelov
c50df37144 rename the WordNet module when creating a new language 2020-01-28 21:12:28 +01:00
krangelov
34fd18ea96 added link to WordNet 2020-01-28 15:33:23 +01:00
krangelov
65024a0a55 added plugin for search in WordNet 2020-01-24 18:22:00 +01:00
krangelov
4b67949d36 Merge branch 'master' of https://github.com/GrammaticalFramework/gf-core 2020-01-17 12:42:40 +01:00
krangelov
2ab9fee8e4 Python 3 literal callbacks will receive offsets in number of characters instead of bytes 2020-01-17 12:41:54 +01:00
Inari Listenmaa
f4d9b534dc Remove outdated advice on producing Haskell file 2019-12-12 16:13:47 +01:00
krangelov
14f394c9e9 a version of the parser which returns a chart rather than a list of expressions 2019-12-07 22:00:39 +01:00
krangelov
dbb09cc689 remove bogus comment 2019-12-07 21:59:41 +01:00
krangelov
bb298fadbe silence warnings in the Python bindings 2019-12-07 10:27:06 +01:00
Thomas Hallgren
f1f47f7281 GF cloud: GFMorpho: responsive layout improvement for small screens 2019-12-03 17:07:20 +01:00
Thomas Hallgren
fb1199c49c GF Cloud: add Word inflection with smart paradigms
This is a reimplementation of Aarne's GFMorpho service from 2012, using
the GF shell web API. Not all features are implemented (yet).
2019-12-03 16:40:21 +01:00
Thomas Hallgren
12e55c93c0 Fix another build problem in bin/build-binary-dist.sh
Make sure the dynamic C runtime libraries are found when running GF to
compile the RGL
2019-11-27 14:31:45 +01:00
Thomas Hallgren
33aeb53f7a PGFService: userLanguage now defaults to English, if present in the grammar
The userLanguage is the name of the concrete syntax that has a languageCode
that matches the user's preferred language, as reported by the web browser.
If no matching language code is found, the PGF service now sets userLanguage
to the concrete syntax for English (e.g. FoodsEng) if present, and defaults
to the first concrete syntax (e.g. FoodsAfr) only if English is not present
in the grammar.
2019-11-26 15:27:02 +01:00
Thomas Hallgren
e6b33ac8b8 Minibar: make it possible to configure a list of preferred grammars
A preferred grammar is selected when a user visits the Minibar for the
first time. (Like before, Minibar remembers the selected grammar for future
visits.)

A preferred list of grammars can be specified in config.js in the .../minibar
directory on the server, e.g. like this:

  preferred_grammars=["/grammars/Foods.pgf","/grammars/ResourceDemo.pgf"]

The first available grammar from the list is used.
2019-11-21 14:25:07 +01:00
Thomas Hallgren
14e5528544 Fix build problems, bump version to gf-3.10.4-git
debian/rules and bin/build-binary-dist.sh needed to be adapted to changes
in how INSTALLPATH is used in src/runtime/java/Makefile.
2019-11-18 15:54:44 +01:00
Thomas Hallgren
28f53e801a PGFService: revert unlexing change in PGFService to restore &+ behaviour 2019-11-18 13:20:41 +01:00
aarneranta
6f2b1a83b7 fixed a vd bug that sometimes erased the root label 2019-11-13 11:40:37 +01:00
aarneranta
d3b501d35f fixed the problem with generating several roots in ud2gf. Now only the leftmost word becomes ROOT; the others become dep, which can be eliminated by cnclabels. This works fine for e.g. English prepositional and particle verbs, but it does not work if the 'main' word is not the leftmost one 2019-11-12 17:46:55 +01:00
krangelov
95b3fb306f forgot that debugging is on 2019-10-09 14:34:04 +02:00
krangelov
5b790b82c5 fix chunk extraction when there are literals 2019-10-09 14:32:20 +02:00
krangelov
26361b3692 fix the parsing for literals after the latest changes in the parser 2019-10-09 14:18:05 +02:00
krangelov
30eef61f0a more dead code 2019-09-20 16:15:28 +02:00
krangelov
29662350dc removed more dead code 2019-09-20 10:49:29 +02:00
krangelov
4d79aa8b19 remove obsolete code 2019-09-20 10:37:50 +02:00
Thomas Hallgren
9d3badd8b2 GrammarToCanonical: bug fix: add missing case for Empty 2019-09-10 12:41:16 +02:00
krangelov
e2ddea6c7d first version of a parser which returns chunks in case of failure 2019-08-30 13:31:57 +02:00
krangelov
59a6e3cfdd fix gu_map_next 2019-08-30 13:31:19 +02:00
krangelov
1e8d684f9a Merge branch 'master' of https://github.com/GrammaticalFramework/gf-core 2019-08-30 08:12:52 +02:00
krangelov
72cfc1f48a a more reasonable API to iterate over a map 2019-08-30 08:12:15 +02:00
John J. Camilleri
724bf67295 Update Stack files after testing with Stack v2
- Also bump up some minor GHC versions (8.4.3 -> 8.4.4, 8.6.2 -> 8.6.5)
- Should still work with Stack < v2 (tested in docker/haskell:8.2.2)
2019-08-28 10:57:21 +02:00
Thomas Hallgren
a7a592d93e Merge branch 'master' of https://github.com/GrammaticalFramework/gf-core 2019-08-21 14:33:30 +02:00
Thomas Hallgren
d1bb1de87f Minibar: support for links to open a given grammar in the minibar
When you press the "i" or "More info" button for a grammar, the info now
includes a link that can be used by anyone to open this specific grammar in
the minibar.
2019-08-21 14:27:56 +02:00
krangelov
394d033d19 added gu_map_delete 2019-08-20 18:55:36 +02:00
krangelov
cb678dfdc8 fix packages 2019-08-18 09:37:55 +02:00
krangelov
4161bbf0ec fix reference to FastCGIUtils 2019-08-18 09:12:30 +02:00
krangelov
148590927c remove obsolete code 2019-08-18 09:09:40 +02:00
krangelov
85a81ef741 Merge branch 'master' of https://github.com/GrammaticalFramework/gf-core 2019-08-09 15:18:27 +02:00
krangelov
3e662475ee bugfix in the parser's scanner 2019-08-09 15:15:30 +02:00
Thomas Hallgren
b77626b802 debian/rules: fix two build problems 2019-08-07 20:15:28 +02:00
Thomas Hallgren
12f2520b3c Download page: add binary package for Raspbian 10 2019-08-07 19:02:27 +02:00
Thomas Hallgren
941b4ddf1f GF home page: fix some links for smoother operation over https 2019-08-07 14:07:47 +02:00
John J. Camilleri
85f12a5544 Remove wrong Haddock comment in PGF2
Clearly just a copy-paste error
2019-08-07 12:52:17 +02:00
Thomas Hallgren
81362ed7b7 Minibar can now display grammar documentation.
The documentation is taken from a file called Grammar.pgf_info, located
next to the Grammar.pgf file on the server.

The first line of the documentation is displayed below the menu bar in
the minibar. The rest of the documentation is displayed when you press
the "More info" button (or the "i" button).

The documentation can contain HTML markup. Blank lines are treated as
paragraph breaks.
2019-08-05 15:25:29 +02:00
192 changed files with 5301 additions and 56864 deletions


@@ -0,0 +1,95 @@
# Based on the template here: https://kodimensional.dev/github-actions
name: Build with stack and cabal

# Trigger the workflow on push or pull request, but only for the master branch
on:
  pull_request:
  push:
    branches: [master]

jobs:
  cabal:
    name: ${{ matrix.os }} / ghc ${{ matrix.ghc }}
    runs-on: ${{ matrix.os }}
    strategy:
      matrix:
        os: [ubuntu-latest, macos-latest, windows-latest]
        cabal: ["3.2"]
        ghc:
          - "8.6.5"
          - "8.8.3"
          - "8.10.1"
        exclude:
          - os: macos-latest
            ghc: 8.8.3
          - os: macos-latest
            ghc: 8.6.5
          - os: windows-latest
            ghc: 8.8.3
          - os: windows-latest
            ghc: 8.6.5
    steps:
      - uses: actions/checkout@v2
        if: github.event.action == 'opened' || github.event.action == 'synchronize' || github.event.ref == 'refs/heads/master'
      - uses: actions/setup-haskell@v1.1.4
        id: setup-haskell-cabal
        name: Setup Haskell
        with:
          ghc-version: ${{ matrix.ghc }}
          cabal-version: ${{ matrix.cabal }}
      - name: Freeze
        run: |
          cabal freeze
      - uses: actions/cache@v1
        name: Cache ~/.cabal/store
        with:
          path: ${{ steps.setup-haskell-cabal.outputs.cabal-store }}
          key: ${{ runner.os }}-${{ matrix.ghc }}
          # key: ${{ runner.os }}-${{ matrix.ghc }}-${{ hashFiles('cabal.project.freeze') }}
      - name: Build
        run: |
          cabal configure --enable-tests --enable-benchmarks --test-show-details=direct
          cabal build all
      # - name: Test
      #   run: |
      #     cabal test all
  stack:
    name: stack / ghc ${{ matrix.ghc }}
    runs-on: ubuntu-latest
    strategy:
      matrix:
        stack: ["2.3.3"]
        ghc: ["7.10.3", "8.0.2", "8.2.2", "8.4.4", "8.6.5", "8.8.4"]
        # ghc: ["8.8.3"]
    steps:
      - uses: actions/checkout@v2
        if: github.event.action == 'opened' || github.event.action == 'synchronize' || github.event.ref == 'refs/heads/master'
      - uses: actions/setup-haskell@v1.1.4
        name: Setup Haskell Stack
        with:
          # ghc-version: ${{ matrix.ghc }}
          stack-version: ${{ matrix.stack }}
      - uses: actions/cache@v1
        name: Cache ~/.stack
        with:
          path: ~/.stack
          key: ${{ runner.os }}-${{ matrix.ghc }}-stack
      - name: Build
        run: |
          stack build --system-ghc --stack-yaml stack-ghc${{ matrix.ghc }}.yaml
          # stack build --system-ghc --test --bench --no-run-tests --no-run-benchmarks
      - name: Test
        run: |
          stack test --system-ghc --stack-yaml stack-ghc${{ matrix.ghc }}.yaml


@@ -0,0 +1,185 @@
name: Build Binary Packages

on:
  workflow_dispatch:
  release:

jobs:
  # ---
  ubuntu:
    name: Build Ubuntu package
    runs-on: ubuntu-18.04
    # strategy:
    #   matrix:
    #     ghc: ["8.6.5"]
    #     cabal: ["2.4"]
    steps:
      - uses: actions/checkout@v2
      # Note: `haskell-platform` is listed as requirement in debian/control,
      # which is why it's installed using apt instead of the Setup Haskell action.
      # - name: Setup Haskell
      #   uses: actions/setup-haskell@v1
      #   id: setup-haskell-cabal
      #   with:
      #     ghc-version: ${{ matrix.ghc }}
      #     cabal-version: ${{ matrix.cabal }}
      - name: Install build tools
        run: |
          sudo apt-get update
          sudo apt-get install -y \
            make \
            dpkg-dev \
            debhelper \
            haskell-platform \
            libghc-json-dev \
            python-dev \
            default-jdk \
            libtool-bin
      - name: Build package
        run: |
          make deb
      - name: Copy package
        run: |
          cp ../gf_*.deb dist/
      - name: Upload artifact
        uses: actions/upload-artifact@v2
        with:
          name: gf-${{ github.sha }}-ubuntu
          path: dist/gf_*.deb
          if-no-files-found: error
  # ---
  macos:
    name: Build macOS package
    runs-on: macos-10.15
    strategy:
      matrix:
        ghc: ["8.6.5"]
        cabal: ["2.4"]
    steps:
      - uses: actions/checkout@v2
      - name: Setup Haskell
        uses: actions/setup-haskell@v1
        id: setup-haskell-cabal
        with:
          ghc-version: ${{ matrix.ghc }}
          cabal-version: ${{ matrix.cabal }}
      - name: Install build tools
        run: |
          brew install \
            automake
          cabal v1-install alex happy
      - name: Build package
        run: |
          sudo mkdir -p /Library/Java/Home
          sudo ln -s /usr/local/opt/openjdk/include /Library/Java/Home/include
          make pkg
      - name: Upload artifact
        uses: actions/upload-artifact@v2
        with:
          name: gf-${{ github.sha }}-macos
          path: dist/gf-*.pkg
          if-no-files-found: error
  # ---
  windows:
    name: Build Windows package
    runs-on: windows-2019
    strategy:
      matrix:
        ghc: ["8.6.5"]
        cabal: ["2.4"]
    steps:
      - uses: actions/checkout@v2
      - name: Setup MSYS2
        uses: msys2/setup-msys2@v2
        with:
          install: >-
            base-devel
            gcc
            python-devel
      - name: Prepare dist folder
        shell: msys2 {0}
        run: |
          mkdir /c/tmp-dist
          mkdir /c/tmp-dist/c
          mkdir /c/tmp-dist/java
          mkdir /c/tmp-dist/python
      - name: Build C runtime
        shell: msys2 {0}
        run: |
          cd src/runtime/c
          autoreconf -i
          ./configure
          make
          make install
          cp /mingw64/bin/libpgf-0.dll /c/tmp-dist/c
          cp /mingw64/bin/libgu-0.dll /c/tmp-dist/c
      - name: Build Java bindings
        shell: msys2 {0}
        run: |
          export PATH="${PATH}:/c/Program Files/Java/jdk8u275-b01/bin"
          cd src/runtime/java
          make \
            JNI_INCLUDES="-I \"/c/Program Files/Java/jdk8u275-b01/include\" -I \"/c/Program Files/Java/jdk8u275-b01/include/win32\" -I \"/mingw64/include\" -D__int64=int64_t" \
            WINDOWS_LDFLAGS="-L\"/mingw64/lib\" -no-undefined"
          make install
          cp .libs/msys-jpgf-0.dll /c/tmp-dist/java/jpgf.dll
          cp jpgf.jar /c/tmp-dist/java
      - name: Build Python bindings
        shell: msys2 {0}
        env:
          EXTRA_INCLUDE_DIRS: /mingw64/include
          EXTRA_LIB_DIRS: /mingw64/lib
        run: |
          cd src/runtime/python
          python setup.py build
          python setup.py install
          cp /usr/lib/python3.8/site-packages/pgf* /c/tmp-dist/python
      - name: Setup Haskell
        uses: actions/setup-haskell@v1
        id: setup-haskell-cabal
        with:
          ghc-version: ${{ matrix.ghc }}
          cabal-version: ${{ matrix.cabal }}
      - name: Install Haskell build tools
        run: |
          cabal install alex happy
      - name: Build GF
        run: |
          cabal install --only-dependencies -fserver
          cabal configure -fserver
          cabal build
          copy dist\build\gf\gf.exe C:\tmp-dist
      - name: Upload artifact
        uses: actions/upload-artifact@v2
        with:
          name: gf-${{ github.sha }}-windows
          path: C:\tmp-dist\*
          if-no-files-found: error


@@ -0,0 +1,98 @@
name: Build & Publish Python Package

# Trigger the workflow on push or pull request, but only for the master branch
on:
  pull_request:
  push:
    branches: [master]

jobs:
  build_wheels:
    name: Build wheel on ${{ matrix.os }}
    runs-on: ${{ matrix.os }}
    strategy:
      fail-fast: true
      matrix:
        os: [ubuntu-18.04, macos-10.15]
    steps:
      - uses: actions/checkout@v1
      - uses: actions/setup-python@v1
        name: Install Python
        with:
          python-version: '3.7'
      - name: Install cibuildwheel
        run: |
          python -m pip install git+https://github.com/joerick/cibuildwheel.git@main
      - name: Install build tools for OSX
        if: startsWith(matrix.os, 'macos')
        run: |
          brew install automake
      - name: Build wheels on Linux
        if: startsWith(matrix.os, 'macos') != true
        env:
          CIBW_BEFORE_BUILD: cd src/runtime/c && autoreconf -i && ./configure && make && make install
        run: |
          python -m cibuildwheel src/runtime/python --output-dir wheelhouse
      - name: Build wheels on OSX
        if: startsWith(matrix.os, 'macos')
        env:
          CIBW_BEFORE_BUILD: cd src/runtime/c && glibtoolize && autoreconf -i && ./configure && make && make install
        run: |
          python -m cibuildwheel src/runtime/python --output-dir wheelhouse
      - uses: actions/upload-artifact@v2
        with:
          path: ./wheelhouse
  build_sdist:
    name: Build source distribution
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - uses: actions/setup-python@v2
        name: Install Python
        with:
          python-version: '3.7'
      - name: Build sdist
        run: cd src/runtime/python && python setup.py sdist
      - uses: actions/upload-artifact@v2
        with:
          path: ./src/runtime/python/dist/*.tar.gz
  upload_pypi:
    name: Upload to PyPI
    needs: [build_wheels, build_sdist]
    runs-on: ubuntu-latest
    if: github.ref == 'refs/heads/master' && github.event_name == 'push'
    steps:
      - uses: actions/checkout@v2
      - name: Set up Python
        uses: actions/setup-python@v2
        with:
          python-version: '3.x'
      - name: Install twine
        run: pip install twine
      - uses: actions/download-artifact@v2
        with:
          name: artifact
          path: ./dist
      - name: Publish
        env:
          TWINE_USERNAME: __token__
          TWINE_PASSWORD: ${{ secrets.pypi_password }}
        run: |
          (cd ./src/runtime/python && curl -I --fail https://pypi.org/project/$(python setup.py --name)/$(python setup.py --version)/) || twine upload dist/*

.gitignore

@@ -5,7 +5,14 @@
*.jar
*.gfo
*.pgf
debian/.debhelper
debian/debhelper-build-stamp
debian/gf
debian/gf.debhelper.log
debian/gf.substvars
debian/files
dist/
dist-newstyle/
src/runtime/c/.libs/
src/runtime/c/Makefile
src/runtime/c/Makefile.in
@@ -44,6 +51,12 @@ cabal.sandbox.config
.stack-work
DATA_DIR
stack*.yaml.lock
# Output files for test suite
*.out
gf-tests.html
# Generated documentation (not exhaustive)
demos/index-numbers.html
demos/resourcegrammars.html


@@ -2,8 +2,6 @@
# Grammatical Framework (GF)
[![Build Status](https://travis-ci.org/GrammaticalFramework/gf-core.svg?branch=master)](https://travis-ci.org/GrammaticalFramework/gf-core)
The Grammatical Framework is a grammar formalism based on type theory.
It consists of:
@@ -32,13 +30,16 @@ GF particularly addresses four aspects of grammars:
## Compilation and installation
The simplest way of installing GF is with the command:
The simplest way of installing GF from source is with the command:
```
cabal install
```
or:
```
stack install
```
For more details, see the [download page](http://www.grammaticalframework.org/download/index.html)
and [developers manual](http://www.grammaticalframework.org/doc/gf-developers.html).
For more information, including links to precompiled binaries, see the [download page](http://www.grammaticalframework.org/download/index.html).
## About this repository

RELEASE.md

@@ -0,0 +1,66 @@
# GF Core releases
**Note:**
The RGL is now released completely separately from GF Core.
See the [RGL's RELEASE.md](https://github.com/GrammaticalFramework/gf-rgl/blob/master/RELEASE.md).
## Creating a new release
### 1. Prepare the repository
**Web pages**
1. Create `download/index-X.Y.md` with installation instructions.
2. Create `download/release-X.Y.md` with changelog information.
3. Update `download/index.html` to redirect to the new version.
4. Add announcement in news section in `index.html`.
**Version numbers**
1. Update version number in `gf.cabal` (omitting `-git` suffix).
2. Add a new line in `debian/changelog`.
### 2. Create GitHub release
1. When the above changes are committed to the `master` branch in the repository
and pushed, check that all CI workflows are successful (fixing as necessary):
- <https://github.com/GrammaticalFramework/gf-core/actions>
- <https://travis-ci.org/github/GrammaticalFramework/gf-core>
2. Create a GitHub release [here](https://github.com/GrammaticalFramework/gf-core/releases/new):
- Tag version format `RELEASE-X.Y`
- Title: "GF X.Y"
- Description: mention major changes since last release
3. Publish the release to trigger the building of the binary packages (below).
### 3. Binary packages
The binaries will be built automatically by GitHub Actions when the release is created,
but the generated _artifacts_ must be manually attached to the release as _assets_.
1. Go to the [actions page](https://github.com/GrammaticalFramework/gf-core/actions) and click "Build Binary Packages" under _Workflows_.
2. Choose the workflow run corresponding to the newly created release.
3. Download the artifacts locally. Extract the Ubuntu and macOS ones to get the `.deb` and `.pkg` files.
4. Go back to the [releases page](https://github.com/GrammaticalFramework/gf-core/releases) and click to edit the release information.
5. Add the downloaded artifacts as release assets, giving them names with format `gf-X.Y-PLATFORM.EXT` (e.g. `gf-3.11-macos.pkg`).
### 4. Upload to Hackage
In order to do this you will need to be added to the [GF maintainers](https://hackage.haskell.org/package/gf/maintainers/) on Hackage.
1. Run `make sdist`
2. Upload the package, either:
1. **Manually**: visit <https://hackage.haskell.org/upload> and upload the file `dist/gf-X.Y.tar.gz`
2. **via Cabal (≥2.4)**: `cabal upload dist/gf-X.Y.tar.gz`
3. If the documentation-building fails on the Hackage server, do:
```
cabal v2-haddock --builddir=dist/docs --haddock-for-hackage --enable-doc
cabal upload --documentation dist/docs/*-docs.tar.gz
```
## Miscellaneous
### What is the tag `GF-3.10`?
For GF 3.10, the Core and RGL repositories had already been separated, however
the binary packages still included the RGL. `GF-3.10` is a tag that was created
in both repositories ([gf-core](https://github.com/GrammaticalFramework/gf-core/releases/tag/GF-3.10) and [gf-rgl](https://github.com/GrammaticalFramework/gf-rgl/releases/tag/GF-3.10)) to indicate which versions of each went into the binaries.


@@ -19,7 +19,6 @@ main = defaultMainWithHooks simpleUserHooks
, preInst = gfPreInst
, postInst = gfPostInst
, postCopy = gfPostCopy
, sDistHook = gfSDist
}
where
gfPreBuild args = gfPre args . buildDistPref
@@ -29,17 +28,17 @@ main = defaultMainWithHooks simpleUserHooks
return emptyHookedBuildInfo
gfPostBuild args flags pkg lbi = do
noRGLmsg
-- noRGLmsg
let gf = default_gf lbi
buildWeb gf flags (pkg,lbi)
gfPostInst args flags pkg lbi = do
noRGLmsg
-- noRGLmsg
saveInstallPath args flags (pkg,lbi)
installWeb (pkg,lbi)
gfPostCopy args flags pkg lbi = do
noRGLmsg
-- noRGLmsg
saveCopyPath args flags (pkg,lbi)
copyWeb flags (pkg,lbi)


@@ -26,6 +26,14 @@ import Distribution.PackageDescription(PackageDescription(..))
so users won't see this message unless they check the log.)
-}
-- | Notice about contrib grammars
noContribMsg :: IO ()
noContribMsg = putStr $ unlines
[ "Example grammars are no longer included in the main GF repository, but have moved to gf-contrib."
, "If you want them to be built, clone the following repository in the same directory as gf-core:"
, "https://github.com/GrammaticalFramework/gf-contrib.git"
]
example_grammars :: [(String, String, [String])] -- [(pgf, subdir, source modules)]
example_grammars =
[("Letter.pgf","letter",letterSrc)
@@ -50,11 +58,8 @@ buildWeb gf flags (pkg,lbi) = do
contrib_exists <- doesDirectoryExist contrib_dir
if contrib_exists
then mapM_ build_pgf example_grammars
else putStr $ unlines
[ "Example grammars are no longer included in the main GF repository, but have moved to gf-contrib."
, "If you want these example grammars to be built, clone this repository in the same top-level directory as GF:"
, "https://github.com/GrammaticalFramework/gf-contrib.git"
]
-- else noContribMsg
else return ()
where
gfo_dir = buildDir lbi </> "examples"


@@ -1,15 +1,18 @@
#! /bin/bash
### This script builds a binary distribution of GF from the source
### package that this script is a part of. It assumes that you have installed
### a recent version of the Haskell Platform.
### Two binary package formats are supported: plain tar files (.tar.gz) and
### OS X Installer packages (.pkg).
### This script builds a binary distribution of GF from source.
### It assumes that you have Haskell and Cabal installed.
### Two binary package formats are supported (specified with the FMT env var):
### - plain tar files (.tar.gz)
### - macOS installer packages (.pkg)
os=$(uname) # Operating system name (e.g. Darwin or Linux)
hw=$(uname -m) # Hardware name (e.g. i686 or x86_64)
# GF version number:
cabal="cabal v1-" # Cabal >= 2.4
# cabal="cabal " # Cabal <= 2.2
## Get GF version number from Cabal file
ver=$(grep -i ^version: gf.cabal | sed -e 's/version://' -e 's/ //g')
name="gf-$ver"
@@ -29,6 +32,7 @@ set -x # print commands before executing them
pushd src/runtime/c
bash setup.sh configure --prefix="$prefix"
bash setup.sh build
bash setup.sh install prefix="$prefix" # hack required for GF build on macOS
bash setup.sh install prefix="$destdir$prefix"
popd
@@ -38,11 +42,11 @@ if which >/dev/null python; then
EXTRA_INCLUDE_DIRS="$extrainclude" EXTRA_LIB_DIRS="$extralib" python setup.py build
python setup.py install --prefix="$destdir$prefix"
if [ "$fmt" == pkg ] ; then
# A hack for Python on OS X to find the PGF modules
pyver=$(ls "$destdir$prefix/lib" | sed -n 's/^python//p')
pydest="$destdir/Library/Python/$pyver/site-packages"
mkdir -p "$pydest"
ln "$destdir$prefix/lib/python$pyver/site-packages"/pgf* "$pydest"
# A hack for Python on macOS to find the PGF modules
pyver=$(ls "$destdir$prefix/lib" | sed -n 's/^python//p')
pydest="$destdir/Library/Python/$pyver/site-packages"
mkdir -p "$pydest"
ln "$destdir$prefix/lib/python$pyver/site-packages"/pgf* "$pydest"
fi
popd
else
@@ -53,52 +57,42 @@ fi
if which >/dev/null javac && which >/dev/null jar ; then
pushd src/runtime/java
rm -f libjpgf.la # In case it contains the wrong INSTALL_PATH
if make CFLAGS="-I$extrainclude -L$extralib" INSTALL_PATH="$prefix/lib"
if make CFLAGS="-I$extrainclude -L$extralib" INSTALL_PATH="$prefix"
then
make INSTALL_PATH="$destdir$prefix/lib" install
make INSTALL_PATH="$destdir$prefix" install
else
echo "*** Skipping the Java binding because of errors"
echo "Skipping the Java binding because of errors"
fi
popd
else
echo "Java SDK is not installed, so the Java binding will not be included"
fi
## To find dynamic C run-time libraries when building GF below
export DYLD_LIBRARY_PATH="$extralib" LD_LIBRARY_PATH="$extralib"
## Build GF, with C run-time support enabled
cabal install -w "$ghc" --only-dependencies -fserver -fc-runtime $extra
cabal configure -w "$ghc" --prefix="$prefix" -fserver -fc-runtime $extra
DYLD_LIBRARY_PATH="$extralib" LD_LIBRARY_PATH="$extralib" cabal build
# Building the example grammars will fail, because the RGL is missing
cabal copy --destdir="$destdir" # create www directory
## Build the RGL and copy it to $destdir
PATH=$PWD/dist/build/gf:$PATH
export GF_LIB_PATH="$(dirname $(find "$destdir" -name www))/lib" # hmm
mkdir -p "$GF_LIB_PATH"
pushd ../gf-rgl
make build
make copy
popd
# Build GF again, including example grammars that need the RGL
DYLD_LIBRARY_PATH="$extralib" LD_LIBRARY_PATH="$extralib" cabal build
${cabal}install -w "$ghc" --only-dependencies -fserver -fc-runtime $extra
${cabal}configure -w "$ghc" --prefix="$prefix" -fserver -fc-runtime $extra
${cabal}build
## Copy GF to $destdir
cabal copy --destdir="$destdir"
${cabal}copy --destdir="$destdir"
libdir=$(dirname $(find "$destdir" -name PGF.hi))
cabal register --gen-pkg-config=$libdir/gf-$ver.conf
${cabal}register --gen-pkg-config="$libdir/gf-$ver.conf"
## Create the binary distribution package
case $fmt in
tar.gz)
targz="$name-bin-$hw-$os.tar.gz" # the final tar file
tar -C "$destdir/$prefix" -zcf "dist/$targz" .
echo "Created $targz, consider renaming it to something more user friendly"
;;
targz="$name-bin-$hw-$os.tar.gz" # the final tar file
tar --directory "$destdir/$prefix" --gzip --create --file "dist/$targz" .
echo "Created $targz"
;;
pkg)
pkg=$name.pkg
pkgbuild --identifier org.grammaticalframework.gf.pkg --version "$ver" --root "$destdir" --install-location / dist/$pkg
echo "Created $pkg"
pkg=$name.pkg
pkgbuild --identifier org.grammaticalframework.gf.pkg --version "$ver" --root "$destdir" --install-location / dist/$pkg
echo "Created $pkg"
esac
## Cleanup
rm -r "$destdir"


@@ -82,9 +82,10 @@ $body$
<li><a href="http://cloud.grammaticalframework.org/">GF Cloud</a></li>
<li>
<a href="$rel-root$/doc/tutorial/gf-tutorial.html">Tutorial</a>
/
·
<a href="$rel-root$/lib/doc/rgl-tutorial/index.html">RGL Tutorial</a>
</li>
<li><a href="$rel-root$/doc/gf-video-tutorials.html">Video Tutorials</a></li>
<li><a href="$rel-root$/download"><strong>Download GF</strong></a></li>
</ul>
</div>


@@ -147,7 +147,7 @@ else
fi
done
find . -name '*.md' | while read file ; do
if [[ "$file" == *"README.md" ]] ; then continue ; fi
if [[ "$file" == *"README.md" ]] || [[ "$file" == *"RELEASE.md" ]] ; then continue ; fi
html="${file%.md}.html"
if [ "$file" -nt "$html" ] || [ "$template" -nt "$html" ] ; then
render_md_html "$file" "$html"

debian/changelog

@@ -1,3 +1,9 @@
gf (3.10.4-1) xenial bionic cosmic; urgency=low
* GF 3.10.4
-- Thomas Hallgren <hallgren@chalmers.se> Fri, 18 Nov 2019 15:00:00 +0100
gf (3.10.3-1) xenial bionic cosmic; urgency=low
* GF 3.10.3

debian/control

@@ -3,14 +3,14 @@ Section: devel
Priority: optional
Maintainer: Thomas Hallgren <hallgren@chalmers.se>
Standards-Version: 3.9.2
Build-Depends: debhelper (>= 5), haskell-platform (>= 2011.2.0.1), libghc-haskeline-dev, libghc-mtl-dev, libghc-json-dev, autoconf, automake, libtool-bin, python-dev, java-sdk, txt2tags, pandoc
Build-Depends: debhelper (>= 5), haskell-platform (>= 2011.2.0.1), libghc-haskeline-dev, libghc-mtl-dev, libghc-json-dev, autoconf, automake, libtool-bin, python-dev, java-sdk
Homepage: http://www.grammaticalframework.org/
Package: gf
Architecture: any
Depends: ${shlibs:Depends}
Description: Tools for GF, a grammar formalism based on type theory
Grammatical Framework (GF) is a grammar formalism based on type theory.
It consists of a special-purpose programming language,
a compiler of the language, and a generic grammar processor.
.

debian/rules

@@ -1,6 +1,6 @@
#!/usr/bin/make -f
%:
	dh $@
#dh_shlibdeps has a problem finding which package some of the Haskell
@@ -24,19 +24,15 @@ SET_LDL=LD_LIBRARY_PATH=$$LD_LIBRARY_PATH:$(CURDIR)/src/runtime/c/.libs
override_dh_auto_build:
cd src/runtime/python && EXTRA_INCLUDE_DIRS=$(CURDIR)/src/runtime/c EXTRA_LIB_DIRS=$(CURDIR)/src/runtime/c/.libs python setup.py build
cd src/runtime/java && make CFLAGS="-I$(CURDIR)/src/runtime/c -L$(CURDIR)/src/runtime/c/.libs" INSTALL_PATH=/usr/lib
cd src/runtime/java && make CFLAGS="-I$(CURDIR)/src/runtime/c -L$(CURDIR)/src/runtime/c/.libs" INSTALL_PATH=/usr
echo $(SET_LDL)
$(SET_LDL) cabal build # builds gf, fails to build example grammars
PATH=$(CURDIR)/dist/build/gf:$$PATH && make -C ../gf-rgl build
GF_LIB_PATH=$(CURDIR)/../gf-rgl/dist $(SET_LDL) cabal build # have RGL now, ok to build example grammars
make html
-$(SET_LDL) cabal build
override_dh_auto_install:
$(SET_LDL) cabal copy --destdir=$(CURDIR)/debian/gf # creates www directory
export GF_LIB_PATH="$$(dirname $$(find "$(CURDIR)/debian/gf" -name www))/lib" && echo "GF_LIB_PATH=$$GF_LIB_PATH" && mkdir -p "$$GF_LIB_PATH" && make -C ../gf-rgl copy
$(SET_LDL) cabal copy --destdir=$(CURDIR)/debian/gf
cd src/runtime/c && bash setup.sh copy prefix=$(CURDIR)/debian/gf/usr
cd src/runtime/python && python setup.py install --prefix=$(CURDIR)/debian/gf/usr
cd src/runtime/java && make INSTALL_PATH=$(CURDIR)/debian/gf/usr/lib install
cd src/runtime/java && make INSTALL_PATH=$(CURDIR)/debian/gf/usr install
D="`find debian/gf -name site-packages`" && [ -n "$$D" ] && cd $$D && cd .. && mv site-packages dist-packages
override_dh_auto_clean:

doc/errors/gluing.md

@@ -0,0 +1,27 @@
## unsupported token gluing `foo + bar`
There was a problem in an expression using +, e.g. `foo + bar`.
This can be due to two causes, check which one applies in your case.
1. You are trying to use + on runtime arguments. Even if you are using
`foo + bar` in an oper, make sure that the oper isn't called in a
linearization that takes arguments. Both of the following are illegal:
lin Test foo bar = foo.s + bar.s -- explicit + in a lin
lin Test foo bar = opWithPlus foo bar -- the oper uses +
2. One of the arguments in `foo + bar` is a bound variable
from pattern matching a string, but the cases are non-exhaustive.
Example:
case "test" of {
x + "a" => x + "b" -- no applicable case for "test", so x = ???
} ;
You can fix this by adding a catch-all case at the end:
{ x + "a" => x + "b" ;
_ => "default case" } ;
3. If neither applies to your problem, submit a bug report and we
will update the error message and this documentation.
https://github.com/GrammaticalFramework/gf-core/issues
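For cause 1, the usual fix is to replace gluing (`+`) with token concatenation (`++`), which is allowed on runtime arguments. A minimal sketch, assuming each argument's lincat has a field `s : Str` (names are illustrative):

```
-- Legal: ++ concatenates tokens and works on runtime arguments
lin Test foo bar = foo.s ++ bar.s ;

-- Also legal: gluing only compile-time constants inside an oper
oper glued : Str = "anti" + "body" ;  -- both operands known at compile time
```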

View File

@@ -391,6 +391,8 @@ bindings are found in the ``src/runtime/python`` and ``src/runtime/java``
directories, respectively. Compile them by following the instructions in
the ``INSTALL`` files in those directories.
The Python library can also be installed from PyPI using `pip install pgf`.
== Compilation of RGL ==
As of 2018-07-26, the RGL is distributed separately from the GF compiler and runtimes.

View File

@@ -32,6 +32,7 @@ The following people have contributed code to some of the versions:
- [Janna Khegai](http://www.cs.chalmers.se/~janna) (Chalmers)
- [Peter Ljunglöf](http://www.cse.chalmers.se/~peb) (University of Gothenburg)
- Petri Mäenpää (Nokia)
- Lauri Alanko (University of Helsinki)
At least the following colleagues are thanked for suggestions, bug
reports, and other indirect contributions to the code.

View File

@@ -1809,6 +1809,23 @@ As the last rule, subtyping is transitive:
- if *A* is a subtype of *B* and *B* is a subtype of *C*, then *A* is
a subtype of *C*.
### List categories
[]{#lists}
Since categories of lists of elements of another category are a common idiom, the following syntactic sugar is available:
cat [C] {n}
abbreviates a set of three judgements:
cat ListC ;
fun BaseC : C -> ... -> C -> ListC ; --n Cs
fun ConsC : C -> ListC -> ListC
The functions `BaseC` and `ConsC` are automatically generated in the abstract syntax, but their linearizations, as well as the linearization type of `ListC`, must be defined manually. The type expression `[C]` is in all contexts interchangeable with `ListC`.
More information on lists in GF can be found [here](https://inariksit.github.io/gf/2021/02/22/lists.html).
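As a hedged illustration (the category and linearizations are made up, assuming `Adv` has the lincat `{s : Str}`), `cat [Adv] {2}` could be completed in a concrete syntax as follows:

```
-- in the abstract syntax:
cat [Adv] {2} ;   -- generates ListAdv, BaseAdv, ConsAdv

-- in the concrete syntax, all three must be defined manually:
lincat ListAdv = {s : Str} ;
lin BaseAdv x y  = {s = x.s ++ "and" ++ y.s} ;   -- "here and now"
lin ConsAdv x xs = {s = x.s ++ "," ++ xs.s} ;    -- "quickly , here and now"
```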
### Tables and table types
@@ -2113,7 +2130,7 @@ of *x*, and the application thereby disappears.
[]{#reuse}
*This section is valid for GF 3.0, which abandons the \"lock field\"*
*This section is valid for GF 3.0, which abandons the \"[lock field](https://inariksit.github.io/gf/2018/05/25/subtyping-gf.html#lock-fields)\"*
*discipline of GF 2.8.*
As explained [here](#openabstract), abstract syntax modules can be

doc/gf-video-tutorials.md Normal file
View File

@@ -0,0 +1,35 @@
---
title: "Video tutorials"
---
The GF [YouTube channel](https://www.youtube.com/channel/UCZ96DechSUVcXAhtOId9VVA) keeps a playlist of [all GF videos](https://www.youtube.com/playlist?list=PLrgqBB5thLeT15fUtJ8_Dtk8ppdtH90MK), and more specific playlists for narrower topics.
If you make a video about GF, let us know and we'll add it to the appropriate playlist(s)!
- [General introduction to GF](#general-introduction-to-gf)
- [Beginner resources](#beginner-resources)
- [Resource grammar tutorials](#resource-grammar-tutorials)
## General introduction to GF
These videos introduce GF at a high level, and present some use cases.
__Grammatical Framework: Formalizing the Grammars of the World__
<iframe width="560" height="315" src="https://www.youtube-nocookie.com/embed/x1LFbDQhbso" frameborder="0" allow="accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe>
__Aarne Ranta: Automatic Translation for Consumers and Producers__
<iframe width="560" height="315" src="https://www.youtube-nocookie.com/embed/An-AmFScw1o" frameborder="0" allow="accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe>
## Beginner resources
These videos show how to install GF on your computer (Mac or Windows), and how to play with simple grammars in a [Jupyter notebook](https://github.com/GrammaticalFramework/gf-binder) (any platform, hosted at [mybinder.org](https://mybinder.org)).
<iframe width="560" height="315" src="https://www.youtube-nocookie.com/embed/videoseries?list=PLrgqBB5thLeRa8eViJJnjT8jBhxqCPMF2" frameborder="0" allow="accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe>
## Resource grammar tutorials
These videos show incremental improvements to a [miniature version of the resource grammar](https://github.com/inariksit/comp-syntax-2020/tree/master/lab2/grammar/dummy#readme).
They assume some prior knowledge of GF, roughly lessons 1-3 from the [GF tutorial](http://www.grammaticalframework.org/doc/tutorial/gf-tutorial.html).
<iframe width="560" height="315" src="https://www.youtube-nocookie.com/embed/videoseries?list=PLrgqBB5thLeTPkp88lnOmRtprCa8g0wX2" frameborder="0" allow="accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture" allowfullscreen></iframe>

View File

@@ -898,7 +898,7 @@ Parentheses are only needed for grouping.
Parsing something that is not in the grammar will fail:
```
> parse "hello dad"
Unknown words: dad
The parser failed at token 2: "dad"
> parse "world hello"
no tree found
@@ -2475,7 +2475,7 @@ can be used to read a text and return for each word its analyses
```
The command ``morpho_quiz = mq`` generates inflection exercises.
```
% gf -path=alltenses:prelude $GF_LIB_PATH/alltenses/IrregFre.gfo
% gf alltenses/IrregFre.gfo
> morpho_quiz -cat=V
@@ -2488,11 +2488,6 @@ The command ``morpho_quiz = mq`` generates inflection exercises.
réapparaîtriez
Score 0/1
```
To create a list for later use, use the command ``morpho_list = ml``
```
> morpho_list -number=25 -cat=V | write_file exx.txt
```
@@ -2651,12 +2646,12 @@ The verb //switch off// is called a
We can define transitive verbs and their combinations as follows:
```
lincat TV = {s : Number => Str ; part : Str} ;
lincat V2 = {s : Number => Str ; part : Str} ;
fun AppTV : Item -> TV -> Item -> Phrase ;
fun AppV2 : Item -> V2 -> Item -> Phrase ;
lin AppTV subj tv obj =
{s = subj.s ++ tv.s ! subj.n ++ obj.s ++ tv.part} ;
lin AppV2 subj v2 obj =
{s = subj.s ++ v2.s ! subj.n ++ obj.s ++ v2.part} ;
```
**Exercise**. Define the language ``a^n b^n c^n`` in GF, i.e.
@@ -2722,11 +2717,11 @@ This topic will be covered in #Rseclexing.
The symbol ``**`` is used for both record types and record objects.
```
lincat TV = Verb ** {c : Case} ;
lincat V2 = Verb ** {c : Case} ;
lin Follow = regVerb "folgen" ** {c = Dative} ;
```
``TV`` becomes a **subtype** of ``Verb``.
``V2`` (transitive verb) becomes a **subtype** of ``Verb``.
If //T// is a subtype of //R//, an object of //T// can be used whenever
an object of //R// is required.
@@ -2757,7 +2752,11 @@ Thus the labels ``p1, p2,...`` are hard-coded.
English indefinite article:
```
oper artIndef : Str =
pre {"a" ; "an" / strs {"a" ; "e" ; "i" ; "o"}} ;
pre {
("a" | "e" | "i" | "o") => "an" ;
_ => "a"
} ;
```
Thus
```
@@ -2948,7 +2947,7 @@ We need the following combinations:
```
We also need **lexical insertion**, to form phrases from single words:
```
mkCN : N -> NP ;
mkCN : N -> CN ;
mkAP : A -> AP ;
```
Naming convention: to construct a //C//, use a function ``mk``//C//.
@@ -2969,7 +2968,7 @@ can be built as follows:
```
mkCl
(mkNP these_Det
(mkCN (mkAP very_AdA (mkAP warm_A)) (mkCN pizza_CN)))
(mkCN (mkAP very_AdA (mkAP warm_A)) (mkCN pizza_N)))
(mkAP italian_AP)
```
The task now: to define the concrete syntax of ``Foods`` so that
@@ -3718,49 +3717,25 @@ Concrete syntax does not know if a category is a dependent type.
```
Notice that the ``Kind`` argument is suppressed in linearization.
Parsing with dependent types is performed in two phases:
Parsing with dependent types consists of two phases:
+ context-free parsing
+ filtering through type checker
Parsing a type-correct command works as expected:
By just doing the first phase, the ``kind`` argument is not found:
```
> parse "dim the light"
CAction ? dim (DKindOne light)
```
Moreover, type-incorrect commands are not rejected:
```
> parse "dim the fan"
CAction ? dim (DKindOne fan)
```
The term ``?`` is a **metavariable**, returned by the parser
for any subtree that is suppressed by a linearization rule.
These are the same kind of metavariables as were used #Rsecediting
to mark incomplete parts of trees in the syntax editor.
#NEW
===Solving metavariables===
Use the command ``put_tree = pt`` with the option ``-typecheck``:
```
> parse "dim the light" | put_tree -typecheck
CAction light dim (DKindOne light)
```
The ``typecheck`` process may fail, in which case an error message
is shown and no tree is returned:
However, type-incorrect commands are rejected by the typecheck:
```
> parse "dim the fan" | put_tree -typecheck
Error in tree UCommand (CAction ? 0 dim (DKindOne fan)) :
(? 0 <> fan) (? 0 <> light)
> parse "dim the fan"
The parsing is successful but the type checking failed with error(s):
Couldn't match expected type Device light
against the inferred type Device fan
In the expression: DKindOne fan
```
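For reference, an abstract syntax consistent with the trees shown above might look as follows. This is a reconstruction from the parse and type-check output, not the exact tutorial module:

```
abstract Smart = {
  cat
    Command ; Kind ;
    Device Kind ;   -- Device is a dependent type over Kind
    Action Kind ;
  fun
    light, fan : Kind ;
    DKindOne : (k : Kind) -> Device k ;
    dim : Action light ;   -- dimming only applies to lights
    CAction : (k : Kind) -> Action k -> Device k -> Command ;
}
```

Parsing "dim the fan" then yields `CAction ? dim (DKindOne fan)`, and type checking fails because the metavariable `?` would have to equal both `light` (from `dim`) and `fan` (from the device).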
#NEW
==Polymorphism==
@@ -3786,23 +3761,19 @@ to express Haskell-type library functions:
\_,_,_,f,x,y -> f y x ;
```
#NEW
===Dependent types: exercises===
1. Write an abstract syntax module with the above contents
and an appropriate English concrete syntax. Try to parse the commands
//dim the light// and //dim the fan//, with and without ``solve`` filtering.
//dim the light// and //dim the fan//.
2. Perform random and exhaustive generation, with and without
``solve`` filtering.
2. Perform random and exhaustive generation.
3. Add some device kinds and actions to the grammar.
#NEW
==Proof objects==
@@ -3912,7 +3883,6 @@ fun
Classes for new actions can be added incrementally.
#NEW
==Variable bindings==
@@ -4200,7 +4170,8 @@ We construct a calculator with addition, subtraction, multiplication, and
division of integers.
```
abstract Calculator = {
flags startcat = Exp ;
cat Exp ;
fun
@@ -4226,7 +4197,7 @@ We begin with a
concrete syntax that always uses parentheses around binary
operator applications:
```
concrete CalculatorP of Calculator = {
concrete CalculatorP of Calculator = open Prelude in {
lincat
Exp = SS ;
@@ -4737,10 +4708,6 @@ abstract Query = {
To make it easy to define a transfer function, we export the
abstract syntax to a system of Haskell datatypes:
```
% gf --output-format=haskell Query.pgf
```
It is also possible to produce the Haskell file together with PGF, by
```
% gf -make --output-format=haskell QueryEng.gf
```

View File

@@ -1,25 +0,0 @@
#!/bin/sh
prefix="/usr/local"
case "i386-apple-darwin9.3.0" in
*-cygwin)
prefix=`cygpath -w "$prefix"`;;
esac
exec_prefix="${prefix}"
GF_BIN_DIR="${exec_prefix}/bin"
GF_DATA_DIR="${prefix}/share/GF-3.0-beta"
GFBIN="$GF_BIN_DIR/gf"
if [ ! -x "${GFBIN}" ]; then
GFBIN=`which gf`
fi
if [ ! -x "${GFBIN}" ]; then
echo "gf not found."
exit 1
fi
exec $GFBIN --batch "$@"

View File

@@ -13,13 +13,13 @@ These binary packages include both the GF core (compiler and runtime) as well as
| Platform | Download | Features | How to install |
|:----------------|:---------------------------------------------------|:---------------|:-----------------------------------|
| macOS | [gf-3.10.pkg](gf-3.10.pkg) | GF, S, C, J, P | Double-click on the package icon |
| Raspbian 10 (buster) | [gf\_3.10-2\_armhf.deb](gf_3.10-2_armhf.deb) | GF,S,C,J,P | `sudo dpkg -i gf_3.10-2_armhf.deb` |
| Ubuntu (32-bit) | [gf\_3.10-2\_i386.deb](gf_3.10-2_i386.deb) | GF, S, C, J, P | `sudo dpkg -i gf_3.10-2_i386.deb` |
| Ubuntu (64-bit) | [gf\_3.10-2\_amd64.deb](gf_3.10-2_amd64.deb) | GF, S, C, J, P | `sudo dpkg -i gf_3.10-2_amd64.deb` |
| Windows | [gf-3.10-bin-windows.zip](gf-3.10-bin-windows.zip) | GF, S | `unzip gf-3.10-bin-windows.zip` |
<!--
| macOS | [gf-3.10-bin-intel-mac.tar.gz](gf-3.10-bin-intel-mac.tar.gz) | GF,S,C,J,P | `sudo tar -C /usr/local -zxf gf-3.10-bin-intel-mac.tar.gz` |
| Raspbian 9.1 | [gf\_3.10-1\_armhf.deb](gf_3.10-1_armhf.deb) | GF,S,C,J,P | `sudo dpkg -i gf_3.10-1_armhf.deb` |
-->
**Features**
@@ -114,7 +114,7 @@ automatically by cabal, and therefore need to be installed manually.
Here is one way to do this:
- On Ubuntu: `sudo apt-get install libghc-haskeline-dev`
- On Fedora: `sudo yum install ghc-haskeline-devel`
- On Fedora: `sudo dnf install ghc-haskeline-devel`
**GHC version**
@@ -171,6 +171,20 @@ in the RGL folder.
This assumes that you already have GF installed.
For more details about building the RGL, see the [RGL README](https://github.com/GrammaticalFramework/gf-rgl/blob/master/README.md).
## Installing the Python bindings from PyPI
The Python library is available on PyPI as `pgf`, so it can be installed using:
```
pip install pgf
```
We provide binary wheels for Linux and macOS (Windows is still missing), which
include the C runtime and are ready to go. If there is no binary distribution for
your platform, this will install the source tarball, which will attempt to build
the binding during installation, and requires the GF C runtime to be installed on
your system.
## Older releases
- [GF 3.9](index-3.9.html) (August 2017)

download/index-3.11.md Normal file
View File

@@ -0,0 +1,173 @@
---
title: Grammatical Framework Download and Installation
...
**GF 3.11** was released on ... December 2020.
What's new? See the [release notes](release-3.11.html).
#### Note: GF core and the RGL
The following instructions explain how to install **GF core**, i.e. the compiler, shell and run-time systems.
Obtaining the **Resource Grammar Library (RGL)** is done separately; see the section at the bottom of this page.
---
## Installing from a binary package
Binary packages are available for Debian/Ubuntu, macOS, and Windows and include:
- GF shell and grammar compiler
- `gf -server` mode
- C run-time system
- Java & Python bindings to the C run-time system
Unlike in previous versions, the binaries **do not** include the RGL.
[Binary packages on GitHub](https://github.com/GrammaticalFramework/gf-core/releases/tag/RELEASE-3.11)
#### Debian/Ubuntu
To install the package use:
```
sudo dpkg -i gf_3.11.deb
```
The Ubuntu `.deb` packages should work on Ubuntu 16.04, 18.04 and similar Linux distributions.
#### macOS
To install the package, just double-click it and follow the installer instructions.
The packages should work on at least 10.13 (High Sierra) and 10.14 (Mojave).
#### Windows
To install the package, unpack it anywhere.
You will probably need to update the `PATH` environment variable to include your chosen install location.
For more information, see [Using GF on Windows](https://www.grammaticalframework.org/~inari/gf-windows.html) (latest updated for Windows 10).
## Installing the latest Hackage release (macOS, Linux, and WSL2 on Windows)
[GF is on Hackage](http://hackage.haskell.org/package/gf), so under
normal circumstances the procedure is fairly simple:
1. Install [ghcup](https://www.haskell.org/ghcup/)
2. `ghcup install ghc 8.10.4`
3. `ghcup set ghc 8.10.4`
4. `cabal update`
5. On Linux: install some C libraries from your Linux distribution (see note below)
6. `cabal install gf-3.11`
You can also download the source code release from [GitHub](https://github.com/GrammaticalFramework/gf-core/releases),
and follow the instructions below under **Installing from the latest developer source code**.
### Notes
**Installation location**
The above steps install GF for a single user.
The executables are put in `$HOME/.cabal/bin` (or on macOS in `$HOME/Library/Haskell/bin`),
so you might want to add this directory to your path (in `.bash_profile` or similar):
```
PATH=$HOME/.cabal/bin:$PATH
```
**Haskeline**
GF uses [`haskeline`](http://hackage.haskell.org/package/haskeline), which
on Linux depends on some non-Haskell libraries that won't be installed
automatically by cabal, and therefore need to be installed manually.
Here is one way to do this:
- On Ubuntu: `sudo apt-get install libghc-haskeline-dev`
- On Fedora: `sudo dnf install ghc-haskeline-devel`
**GHC version**
The GF source code has been updated to compile with GHC versions 7.10 through to 8.8.
## Installing from the latest developer source code
If you haven't already, clone the repository with:
```
git clone https://github.com/GrammaticalFramework/gf-core.git
```
If you've already cloned the repository previously, update with:
```
git pull
```
Then install with:
```
cabal install
```
or, if you're a Stack user:
```
stack install
```
The notes above on installing from source also apply in these cases.
For more info on working with the GF source code, see the
[GF Developers Guide](../doc/gf-developers.html).
## Installing the Python bindings from PyPI
The Python library is available on PyPI as `pgf`, so it can be installed using:
```
pip install pgf
```
We provide binary wheels for Linux and macOS, which include the C runtime and are ready-to-go.
If there is no binary distribution for your platform, this will install the source tarball,
which will attempt to build the binding during installation,
and requires the GF C runtime to be installed on your system.
---
## Installing the RGL from a binary release
Binary releases of the RGL are made available on [GitHub](https://github.com/GrammaticalFramework/gf-rgl/releases).
In general the steps to follow are:
1. Download a binary release and extract it somewhere on your system.
2. Set the environment variable `GF_LIB_PATH` to point to wherever you extracted the RGL.
## Installing the RGL from source
To compile the RGL, you will need to have GF already installed and in your path.
1. Obtain the RGL source code, either by:
- cloning with `git clone https://github.com/GrammaticalFramework/gf-rgl.git`
- downloading a source archive [here](https://github.com/GrammaticalFramework/gf-rgl/archive/master.zip)
2. Run `make` in the source code folder.
For more options, see the [RGL README](https://github.com/GrammaticalFramework/gf-rgl/blob/master/README.md).
---
## Older releases
- [GF 3.10](index-3.10.html) (December 2018)
- [GF 3.9](index-3.9.html) (August 2017)
- [GF 3.8](index-3.8.html) (June 2016)
- [GF 3.7.1](index-3.7.1.html) (October 2015)
- [GF 3.7](index-3.7.html) (June 2015)
- [GF 3.6](index-3.6.html) (June 2014)
- [GF 3.5](index-3.5.html) (August 2013)
- [GF 3.4](index-3.4.html) (January 2013)
- [GF 3.3.3](index-3.3.3.html) (March 2012)
- [GF 3.3](index-3.3.html) (October 2011)
- [GF 3.2.9](index-3.2.9.html) source-only snapshot (September 2011)
- [GF 3.2](index-3.2.html) (December 2010)
- [GF 3.1.6](index-3.1.6.html) (April 2010)

download/index.html Normal file
View File

@@ -0,0 +1,8 @@
<html>
<head>
<meta http-equiv="refresh" content="0; URL=/download/index-3.10.html" />
</head>
<body>
You are being redirected to <a href="index-3.10.html">the current version</a> of this page.
</body>
</html>

download/release-3.11.md Normal file
View File

@@ -0,0 +1,40 @@
---
title: GF 3.11 Release Notes
date: ... December 2020
...
## Installation
See the [download page](index-3.11.html).
## What's new
From this release, the binary GF core packages do not contain the RGL.
The RGL's release cycle is now completely separate from GF's. See [RGL releases](https://github.com/GrammaticalFramework/gf-rgl/releases).
Over 400 changes have been pushed to GF core
since the release of GF 3.10 in December 2018.
## General
- Make the test suite work again.
- Compatibility with new versions of GHC, including multiple Stack files for the different versions.
- Updates to build scripts and CI.
- Bug fixes.
## GF compiler and run-time library
- Huge improvements in time & space requirements for grammar compilation (pending [#87](https://github.com/GrammaticalFramework/gf-core/pull/87)).
- Add CoNLL output to `visualize_tree` shell command.
- Add canonical GF as output format in the compiler.
- Add PGF JSON as output format in the compiler.
- Deprecate JavaScript runtime in favour of updated [TypeScript runtime](https://github.com/GrammaticalFramework/gf-typescript).
- Improvements to Haskell export.
- Improvements to the C runtime.
- Improvements to `gf -server` mode.
- Clearer compiler error messages.
## Other
- Web page and documentation improvements.
- Add WordNet module to GFSE.

View File

@@ -1,5 +1,5 @@
name: gf
version: 3.10.3-git
version: 3.10.4-git
cabal-version: >= 1.22
build-type: Custom
@@ -14,6 +14,7 @@ maintainer: Thomas Hallgren
tested-with: GHC==7.10.3, GHC==8.0.2, GHC==8.2.2, GHC==8.4.3
data-dir: src
extra-source-files: WebSetup.hs
data-files:
www/*.html
www/*.css
@@ -71,7 +72,7 @@ flag c-runtime
Description: Include functionality from the C run-time library (which must be installed already)
Default: False
Library
library
default-language: Haskell2010
build-depends: base >= 4.6 && <5,
array,
@@ -82,6 +83,10 @@ Library
pretty,
mtl,
exceptions,
fail,
-- For compatibility with ghc < 8
-- We need transformers-compat >= 0.6.3, but that is only in newer snapshots where it is redundant.
transformers-compat,
ghc-prim
hs-source-dirs: src/runtime/haskell
@@ -98,8 +103,6 @@ Library
--if impl(ghc>=7.8)
-- ghc-options: +RTS -A20M -RTS
ghc-prof-options: -fprof-auto
if impl(ghc>=8.6)
Default-extensions: NoMonadFailDesugaring
exposed-modules:
PGF
@@ -175,9 +178,7 @@ Library
GF.Command.TreeOperations
GF.Compile.CFGtoPGF
GF.Compile.CheckGrammar
GF.Compile.Compute.AppPredefined
GF.Compile.Compute.ConcreteNew
-- GF.Compile.Compute.ConcreteNew1
GF.Compile.Compute.Predef
GF.Compile.Compute.Value
GF.Compile.ExampleBased
@@ -319,7 +320,7 @@ Library
if impl(ghc>=8.2)
ghc-options: -fhide-source-paths
Executable gf
executable gf
hs-source-dirs: src/programs
main-is: gf-main.hs
default-language: Haskell2010
@@ -352,4 +353,5 @@ test-suite gf-tests
main-is: run.hs
hs-source-dirs: testsuite
build-depends: base>=4.3 && <5, Cabal>=1.8, directory, filepath, process
build-tool-depends: gf:gf
default-language: Haskell2010

View File

@@ -22,16 +22,16 @@
<h4 class="text-black-50">A programming language for multilingual grammar applications</h4>
</div>
<div class="row my-4">
<div class="row mt-4">
<div class="col-sm-6 col-md-3">
<div class="col-sm-6 col-md-3 mb-4">
<h3>Get started</h3>
<ul class="mb-2">
<li><a href="https://www.youtube.com/watch?v=x1LFbDQhbso">Google Tech Talk</a></li>
<li>
<a href="http://cloud.grammaticalframework.org/">
<a href="//cloud.grammaticalframework.org/">
GF Cloud
<img src="http://www.grammaticalframework.org/src/www/P/gf-cloud.png" style="height:30px" class="ml-2" alt="Cloud logo">
<img src="src/www/P/gf-cloud.png" style="height:30px" class="ml-2" alt="Cloud logo">
</a>
</li>
<li>
@@ -39,6 +39,7 @@
/
<a href="lib/doc/rgl-tutorial/index.html">RGL Tutorial</a>
</li>
<li><a href="doc/gf-video-tutorials.html">Video Tutorials</a></li>
</ul>
<a href="download/index.html" class="btn btn-primary ml-3">
@@ -47,7 +48,7 @@
</a>
</div>
<div class="col-sm-6 col-md-3">
<div class="col-sm-6 col-md-3 mb-4">
<h3>Learn more</h3>
<ul class="mb-2">
@@ -55,6 +56,7 @@
<li><a href="doc/gf-refman.html">Reference Manual</a></li>
<li><a href="doc/gf-shell-reference.html">Shell Reference</a></li>
<li><a href="http://www.molto-project.eu/sites/default/files/MOLTO_D2.3.pdf">Best Practices</a> <small>[PDF]</small></li>
<li><a href="https://www.mitpressjournals.org/doi/pdf/10.1162/COLI_a_00378">Scaling Up (Computational Linguistics 2020)</a></li>
</ul>
<a href="lib/doc/synopsis/index.html" class="btn btn-primary ml-3">
@@ -63,27 +65,30 @@
</a>
</div>
<div class="col-sm-6 col-md-3">
<div class="col-sm-6 col-md-3 mb-4">
<h3>Develop</h3>
<ul class="mb-2">
<li><a href="doc/gf-developers.html">Developers Guide</a></li>
<!-- <li><a href="/~hallgren/gf-experiment/browse/">Browse Source Code</a></li> -->
<li><a href="http://hackage.haskell.org/package/gf/docs/PGF.html">PGF library API (Haskell runtime)</a></li>
<li><a href="doc/runtime-api.html">PGF library API (C runtime)</a></li>
<li>PGF library API:<br>
<a href="http://hackage.haskell.org/package/gf/docs/PGF.html">Haskell</a> /
<a href="doc/runtime-api.html">C&nbsp;runtime</a>
</li>
<li><a href="http://hackage.haskell.org/package/gf/docs/GF.html">GF compiler API</a></li>
<!-- <li><a href="src/ui/android/README">GF on Android (new)</a></li>
<li><a href="/android/">GF on Android (old) </a></li> -->
<li><a href="doc/gf-editor-modes.html">Text Editor Support</a></li>
<li><a href="http://www.grammaticalframework.org/~john/rgl-browser/">RGL source browser</a></li>
</ul>
</div>
<div class="col-sm-6 col-md-3">
<div class="col-sm-6 col-md-3 mb-4">
<h3>Contribute</h3>
<ul class="mb-2">
<li><a href="http://groups.google.com/group/gf-dev">Mailing List</a></li>
<li><a href="https://github.com/GrammaticalFramework/gf-core/issues">Issue Tracker</a></li>
<li><a href="doc/gf-people.html">Authors</a></li>
<li><a href="http://school.grammaticalframework.org/2018/">Summer School</a></li>
<li><a href="//school.grammaticalframework.org/2020/">Summer School</a></li>
</ul>
<a href="https://github.com/GrammaticalFramework/" class="btn btn-primary ml-3">
<i class="fab fa-github mr-1"></i>
@@ -152,9 +157,9 @@ least one, it may help you to get a first idea of what GF is.
<h2>Applications & Availability</h2>
<p>
GF can be used for building
<a href="http://cloud.grammaticalframework.org/translator/">translation systems</a>,
<a href="http://cloud.grammaticalframework.org/minibar/minibar.html">multilingual web gadgets</a>,
<a href="http://www.cs.chalmers.se/~hallgren/Alfa/Tutorial/GFplugin.html">natural-language interfaces</a>,
<a href="//cloud.grammaticalframework.org/translator/">translation systems</a>,
<a href="//cloud.grammaticalframework.org/minibar/minibar.html">multilingual web gadgets</a>,
<a href="http://www.cse.chalmers.se/~hallgren/Alfa/Tutorial/GFplugin.html">natural-language interfaces</a>,
<a href="http://www.youtube.com/watch?v=1bfaYHWS6zU">dialogue systems</a>, and
<a href="lib/doc/synopsis/index.html">natural language resources</a>.
</p>
@@ -169,6 +174,7 @@ least one, it may help you to get a first idea of what GF is.
<li>macOS</li>
<li>Windows</li>
<li>Android mobile platform (via Java; runtime)</li>
<li>iOS mobile platform (iPhone, iPad)</li>
<li>via compilation to JavaScript, almost any platform that has a web browser (runtime)</li>
</ul>
@@ -210,7 +216,7 @@ least one, it may help you to get a first idea of what GF is.
<p>
We run the IRC channel <strong><code>#gf</code></strong> on the Freenode network, where you are welcome to look for help with small questions or just start a general discussion.
You can <a href="https://webchat.freenode.net/?channels=gf">open a web chat</a>
or <a href="http://www.grammaticalframework.org/irc/">browse the channel logs</a>.
or <a href="/irc/">browse the channel logs</a>.
</p>
<p>
If you have a larger question which the community may benefit from, we recommend you ask it on the <a href="http://groups.google.com/group/gf-dev">mailing list</a>.
@@ -222,9 +228,21 @@ least one, it may help you to get a first idea of what GF is.
<h2>News</h2>
<dl class="row">
<dt class="col-sm-3 text-center text-nowrap">2021-05-05</dt>
<dd class="col-sm-9">
<a href="https://cloud.grammaticalframework.org/wordnet/">GF WordNet</a> now supports languages for which there are no other WordNets. New additions: Afrikaans, German, Korean, Maltese, Polish, Somali, Swahili.
</dd>
<dt class="col-sm-3 text-center text-nowrap">2021-03-01</dt>
<dd class="col-sm-9">
<a href="//school.grammaticalframework.org/2020/">Seventh GF Summer School</a>, in Singapore and online, 26 July &ndash; 8 August 2021.
</dd>
<dt class="col-sm-3 text-center text-nowrap">2020-09-29</dt>
<dd class="col-sm-9">
<a href="https://www.mitpressjournals.org/doi/pdf/10.1162/COLI_a_00378">Abstract Syntax as Interlingua</a>: Scaling Up the Grammatical Framework from Controlled Languages to Robust Pipelines. A paper in Computational Linguistics (2020) summarizing much of the development in GF in the past ten years.
</dd>
<dt class="col-sm-3 text-center text-nowrap">2018-12-03</dt>
<dd class="col-sm-9">
<a href="http://school.grammaticalframework.org/2018/">Sixth GF Summer School</a> in Stellenbosch (South Africa), 3&ndash;14 December 2018
<a href="//school.grammaticalframework.org/2018/">Sixth GF Summer School</a> in Stellenbosch (South Africa), 3&ndash;14 December 2018
</dd>
<dt class="col-sm-3 text-center text-nowrap">2018-12-02</dt>
<dd class="col-sm-9">
@@ -248,7 +266,7 @@ least one, it may help you to get a first idea of what GF is.
GF is moving to <a href="https://github.com/GrammaticalFramework/GF/">GitHub</a>.</dd>
<dt class="col-sm-3 text-center text-nowrap">2017-03-13</dt>
<dd class="col-sm-9">
<a href="http://school.grammaticalframework.org/2017/">GF Summer School</a> in Riga (Latvia), 14-25 August 2017
<a href="//school.grammaticalframework.org/2017/">GF Summer School</a> in Riga (Latvia), 14-25 August 2017
</dd>
</dl>
@@ -268,7 +286,7 @@ least one, it may help you to get a first idea of what GF is.
</p>
<ul>
<li>
<a href="http://www.cs.chalmers.se/~hallgren/Alfa/Tutorial/GFplugin.html">GF-Alfa</a>:
<a href="http://www.cse.chalmers.se/~hallgren/Alfa/Tutorial/GFplugin.html">GF-Alfa</a>:
natural language interface to formal proofs
</li>
<li>
@@ -293,11 +311,11 @@ least one, it may help you to get a first idea of what GF is.
<a href="http://www.cse.chalmers.se/alumni/markus/FM/">Functional Morphology</a>
</li>
<li>
<a href="http://www.molto-project.eu">MOLTO</a>:
<a href="//www.molto-project.eu">MOLTO</a>:
multilingual online translation
</li>
<li>
<a href="http://remu.grammaticalframework.org">REMU</a>:
<a href="//remu.grammaticalframework.org">REMU</a>:
reliable multilingual digital communication
</li>
</ul>
@@ -324,9 +342,11 @@ least one, it may help you to get a first idea of what GF is.
Afrikaans,
Amharic (partial),
Arabic (partial),
Basque (partial),
Bulgarian,
Catalan,
Chinese,
Czech (partial),
Danish,
Dutch,
English,
@@ -338,10 +358,12 @@ least one, it may help you to get a first idea of what GF is.
Greek modern,
Hebrew (fragments),
Hindi,
Hungarian (partial),
Interlingua,
Japanese,
Italian,
Latin (fragments),
Japanese,
Korean (partial),
Latin (partial),
Latvian,
Maltese,
Mongolian,
@@ -354,7 +376,9 @@ least one, it may help you to get a first idea of what GF is.
Romanian,
Russian,
Sindhi,
Slovak (partial),
Slovene (partial),
Somali (partial),
Spanish,
Swahili (fragments),
Swedish,

View File

@@ -1,4 +1,4 @@
{-# LANGUAGE FlexibleInstances, UndecidableInstances #-}
{-# LANGUAGE FlexibleInstances, UndecidableInstances, CPP #-}
module GF.Command.Commands (
PGFEnv,HasPGFEnv(..),pgf,mos,pgfEnv,pgfCommands,
options,flags,
@@ -34,6 +34,7 @@ import Data.Maybe
import qualified Data.Map as Map
import GF.Text.Pretty
import Data.List (sort)
import qualified Control.Monad.Fail as Fail
--import Debug.Trace
@@ -44,7 +45,7 @@ pgfEnv pgf = Env pgf mos
class (Functor m,Monad m,MonadSIO m) => HasPGFEnv m where getPGFEnv :: m PGFEnv
instance (Monad m,HasPGFEnv m) => TypeCheckArg m where
instance (Monad m,HasPGFEnv m,Fail.MonadFail m) => TypeCheckArg m where
typeCheckArg e = (either (fail . render . ppTcError) (return . fst)
. flip inferExpr e . pgf) =<< getPGFEnv
@@ -740,7 +741,7 @@ pgfCommands = Map.fromList [
Nothing -> do putStrLn ("unknown category of function identifier "++show id)
return void
[e] -> case inferExpr pgf e of
Left tcErr -> error $ render (ppTcError tcErr)
Left tcErr -> errorWithoutStackTrace $ render (ppTcError tcErr)
Right (e,ty) -> do putStrLn ("Expression: "++showExpr [] e)
putStrLn ("Type: "++showType [] ty)
putStrLn ("Probability: "++show (probTree pgf e))
@@ -1018,3 +1019,7 @@ stanzas = map unlines . chop . lines where
chop ls = case break (=="") ls of
(ls1,[]) -> [ls1]
(ls1,_:ls2) -> ls1 : chop ls2
#if !(MIN_VERSION_base(4,9,0))
errorWithoutStackTrace = error
#endif

View File

@@ -18,6 +18,7 @@ import Data.Maybe
import qualified Data.Map as Map
import GF.Text.Pretty
import Control.Monad(mplus)
import qualified Control.Monad.Fail as Fail
data PGFEnv = Env {pgf::Maybe PGF,concs::Map.Map ConcName Concr}
@@ -25,7 +26,7 @@ data PGFEnv = Env {pgf::Maybe PGF,concs::Map.Map ConcName Concr}
pgfEnv pgf = Env (Just pgf) (languages pgf)
emptyPGFEnv = Env Nothing Map.empty
class (Monad m,MonadSIO m) => HasPGFEnv m where getPGFEnv :: m PGFEnv
class (Fail.MonadFail m,MonadSIO m) => HasPGFEnv m where getPGFEnv :: m PGFEnv
instance (Monad m,HasPGFEnv m) => TypeCheckArg m where
typeCheckArg e = do env <- getPGFEnv
@@ -806,14 +807,22 @@ hsExpr c =
Just (f,cs) -> H.mkApp (H.mkCId f) (map hsExpr cs)
_ -> case unStr c of
Just str -> H.mkStr str
_ -> error $ "GF.Command.Commands2.hsExpr "++show c
_ -> case unInt c of
Just n -> H.mkInt n
_ -> case unFloat c of
Just d -> H.mkFloat d
_ -> error $ "GF.Command.Commands2.hsExpr "++show c
cExpr e =
case H.unApp e of
Just (f,es) -> mkApp (H.showCId f) (map cExpr es)
_ -> case H.unStr e of
Just str -> mkStr str
_ -> error $ "GF.Command.Commands2.cExpr "++show e
_ -> case H.unInt e of
Just n -> mkInt n
_ -> case H.unFloat e of
Just d -> mkFloat d
_ -> error $ "GF.Command.Commands2.cExpr "++show e
needPGF exec opts ts =
do Env mb_pgf cncs <- getPGFEnv


@@ -11,6 +11,8 @@ import GF.Infra.UseIO(putStrLnE)
import Control.Monad(when)
import qualified Data.Map as Map
import GF.Infra.UseIO (Output)
import qualified Control.Monad.Fail as Fail
data CommandEnv m = CommandEnv {
commands :: Map.Map String (CommandInfo m),
@@ -22,6 +24,7 @@ data CommandEnv m = CommandEnv {
mkCommandEnv cmds = CommandEnv cmds Map.empty Map.empty
--interpretCommandLine :: CommandEnv -> String -> SIO ()
interpretCommandLine :: (Fail.MonadFail m, Output m, TypeCheckArg m) => CommandEnv m -> String -> m ()
interpretCommandLine env line =
case readCommandLine line of
Just [] -> return ()


@@ -34,14 +34,13 @@ import qualified GF.Compile.Compute.ConcreteNew as CN
import GF.Grammar
import GF.Grammar.Lexer
import GF.Grammar.Lookup
--import GF.Grammar.Predef
--import GF.Grammar.PatternMatch
import GF.Data.Operations
import GF.Infra.CheckM
import Data.List
import qualified Data.Set as Set
import qualified Data.Map as Map
import Control.Monad
import GF.Text.Pretty
@@ -59,7 +58,7 @@ checkModule opts cwd sgr mo@(m,mi) = do
where
updateCheckInfos mo = fmap (foldl update mo) . parallelCheck . map check
where check (i,info) = fmap ((,) i) (checkInfo opts cwd sgr mo i info)
update mo@(m,mi) (i,info) = (m,mi{jments=updateTree (i,info) (jments mi)})
update mo@(m,mi) (i,info) = (m,mi{jments=Map.insert i info (jments mi)})
-- check if restricted inheritance modules are still coherent
-- i.e. that the defs of remaining names don't depend on omitted names
@@ -72,7 +71,7 @@ checkRestrictedInheritance cwd sgr (name,mo) = checkInModule cwd mo NoLoc empty
where
mos = modules sgr
checkRem ((i,m),mi) = do
let (incl,excl) = partition (isInherited mi) (map fst (tree2list (jments m)))
let (incl,excl) = partition (isInherited mi) (Map.keys (jments m))
let incld c = Set.member c (Set.fromList incl)
let illegal c = Set.member c (Set.fromList excl)
let illegals = [(f,is) |
@@ -89,10 +88,10 @@ checkCompleteGrammar opts cwd gr (am,abs) (cm,cnc) = checkInModule cwd cnc NoLoc
let jsc = jments cnc
-- check that all concrete constants are in abstract; build types for all lin
jsc <- foldM checkCnc emptyBinTree (tree2list jsc)
jsc <- foldM checkCnc Map.empty (Map.toList jsc)
-- check that all abstract constants are in concrete; build default lin and lincats
jsc <- foldM checkAbs jsc (tree2list jsa)
jsc <- foldM checkAbs jsc (Map.toList jsa)
return (cm,cnc{jments=jsc})
where
@@ -113,17 +112,17 @@ checkCompleteGrammar opts cwd gr (am,abs) (cm,cnc) = checkInModule cwd cnc NoLoc
case lookupIdent c js of
Ok (AnyInd _ _) -> return js
Ok (CncFun ty (Just def) mn mf) ->
return $ updateTree (c,CncFun ty (Just def) mn mf) js
return $ Map.insert c (CncFun ty (Just def) mn mf) js
Ok (CncFun ty Nothing mn mf) ->
case mb_def of
Ok def -> return $ updateTree (c,CncFun ty (Just (L NoLoc def)) mn mf) js
Ok def -> return $ Map.insert c (CncFun ty (Just (L NoLoc def)) mn mf) js
Bad _ -> do noLinOf c
return js
_ -> do
case mb_def of
Ok def -> do (cont,val) <- linTypeOfType gr cm ty
let linty = (snd (valCat ty),cont,val)
return $ updateTree (c,CncFun (Just linty) (Just (L NoLoc def)) Nothing Nothing) js
return $ Map.insert c (CncFun (Just linty) (Just (L NoLoc def)) Nothing Nothing) js
Bad _ -> do noLinOf c
return js
where noLinOf c = checkWarn ("no linearization of" <+> c)
@@ -132,24 +131,24 @@ checkCompleteGrammar opts cwd gr (am,abs) (cm,cnc) = checkInModule cwd cnc NoLoc
Ok (CncCat (Just _) _ _ _ _) -> return js
Ok (CncCat Nothing md mr mp mpmcfg) -> do
checkWarn ("no linearization type for" <+> c <> ", inserting default {s : Str}")
return $ updateTree (c,CncCat (Just (L NoLoc defLinType)) md mr mp mpmcfg) js
return $ Map.insert c (CncCat (Just (L NoLoc defLinType)) md mr mp mpmcfg) js
_ -> do
checkWarn ("no linearization type for" <+> c <> ", inserting default {s : Str}")
return $ updateTree (c,CncCat (Just (L NoLoc defLinType)) Nothing Nothing Nothing Nothing) js
return $ Map.insert c (CncCat (Just (L NoLoc defLinType)) Nothing Nothing Nothing Nothing) js
_ -> return js
checkCnc js i@(c,info) =
checkCnc js (c,info) =
case info of
CncFun _ d mn mf -> case lookupOrigInfo gr (am,c) of
Ok (_,AbsFun (Just (L _ ty)) _ _ _) ->
do (cont,val) <- linTypeOfType gr cm ty
let linty = (snd (valCat ty),cont,val)
return $ updateTree (c,CncFun (Just linty) d mn mf) js
return $ Map.insert c (CncFun (Just linty) d mn mf) js
_ -> do checkWarn ("function" <+> c <+> "is not in abstract")
return js
CncCat {} ->
case lookupOrigInfo gr (am,c) of
Ok (_,AbsCat _) -> return $ updateTree i js
Ok (_,AbsCat _) -> return $ Map.insert c info js
{- -- This might be too pedantic:
Ok (_,AbsFun {}) ->
checkError ("lincat:"<+>c<+>"is a fun, not a cat")
@@ -157,7 +156,7 @@ checkCompleteGrammar opts cwd gr (am,abs) (cm,cnc) = checkInModule cwd cnc NoLoc
_ -> do checkWarn ("category" <+> c <+> "is not in abstract")
return js
_ -> return $ updateTree i js
_ -> return $ Map.insert c info js
-- | General Principle: only Just-values are checked.
@@ -271,7 +270,7 @@ checkInfo opts cwd sgr (m,mo) c info = checkInModule cwd mo NoLoc empty $ do
chIn loc cat = checkInModule cwd mo loc ("Happened in" <+> cat <+> c)
mkPar (f,co) = do
vs <- liftM combinations $ mapM (\(_,_,ty) -> allParamValues gr ty) co
vs <- liftM sequence $ mapM (\(_,_,ty) -> allParamValues gr ty) co
return $ map (mkApp (QC (m,f))) vs
checkUniq xss = case xss of
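The `combinations` → `sequence` replacements above (here and in `ConcreteNew`) rely on the list monad: on a list of lists, `sequence` computes the Cartesian product, which is presumably what `GF.Data.Operations.combinations` provided. A minimal sketch:

```haskell
-- Cartesian product of a list of lists via the list monad.
-- sequence [[1,2],[3,4]] yields [[1,3],[1,4],[2,3],[2,4]].
cartesian :: [[a]] -> [[a]]
cartesian = sequence
```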


@@ -1,64 +0,0 @@
module GF.Compile.Coding where
{-
import GF.Grammar.Grammar
import GF.Grammar.Macros
import GF.Text.Coding
--import GF.Infra.Option
import GF.Data.Operations
--import Data.Char
import System.IO
import qualified Data.ByteString.Char8 as BS
encodeStringsInModule :: TextEncoding -> SourceModule -> SourceModule
encodeStringsInModule enc = codeSourceModule (BS.unpack . encodeUnicode enc)
decodeStringsInModule :: TextEncoding -> SourceModule -> SourceModule
decodeStringsInModule enc mo = codeSourceModule (decodeUnicode enc . BS.pack) mo
codeSourceModule :: (String -> String) -> SourceModule -> SourceModule
codeSourceModule co (id,mo) = (id,mo{jments = mapTree codj (jments mo)})
where
codj (c,info) = case info of
ResOper pty pt -> ResOper (codeLTerms co pty) (codeLTerms co pt)
ResOverload es tyts -> ResOverload es [(codeLTerm co ty,codeLTerm co t) | (ty,t) <- tyts]
CncCat mcat mdef mref mpr mpmcfg -> CncCat mcat (codeLTerms co mdef) (codeLTerms co mref) (codeLTerms co mpr) mpmcfg
CncFun mty mt mpr mpmcfg -> CncFun mty (codeLTerms co mt) (codeLTerms co mpr) mpmcfg
_ -> info
codeLTerms co = fmap (codeLTerm co)
codeLTerm :: (String -> String) -> L Term -> L Term
codeLTerm = fmap . codeTerm
codeTerm :: (String -> String) -> Term -> Term
codeTerm co = codt
where
codt t = case t of
K s -> K (co s)
T ty cs -> T ty [(codp p,codt v) | (p,v) <- cs]
EPatt p -> EPatt (codp p)
_ -> composSafeOp codt t
codp p = case p of --- really: composOpPatt
PR rs -> PR [(l,codp p) | (l,p) <- rs]
PString s -> PString (co s)
PChars s -> PChars (co s)
PT x p -> PT x (codp p)
PAs x p -> PAs x (codp p)
PNeg p -> PNeg (codp p)
PRep p -> PRep (codp p)
PSeq p q -> PSeq (codp p) (codp q)
PAlt p q -> PAlt (codp p) (codp q)
_ -> p
-- | Run an encoding function on all string literals within the given string.
codeStringLiterals :: (String -> String) -> String -> String
codeStringLiterals _ [] = []
codeStringLiterals co ('"':cs) = '"' : inStringLiteral cs
where inStringLiteral [] = error "codeStringLiterals: unterminated string literal"
inStringLiteral ('"':ds) = '"' : codeStringLiterals co ds
inStringLiteral ('\\':d:ds) = '\\' : co [d] ++ inStringLiteral ds
inStringLiteral (d:ds) = co [d] ++ inStringLiteral ds
codeStringLiterals co (c:cs) = c : codeStringLiterals co cs
-}


@@ -1,143 +0,0 @@
----------------------------------------------------------------------
-- |
-- Module : AppPredefined
-- Maintainer : AR
-- Stability : (stable)
-- Portability : (portable)
--
-- > CVS $Date: 2005/10/06 14:21:34 $
-- > CVS $Author: aarne $
-- > CVS $Revision: 1.13 $
--
-- Predefined function type signatures and definitions.
-----------------------------------------------------------------------------
module GF.Compile.Compute.AppPredefined ({-
isInPredefined, typPredefined, arrityPredefined, predefModInfo, appPredefined-}
) where
{-
import GF.Compile.TypeCheck.Primitives
import GF.Infra.Option
import GF.Data.Operations
import GF.Grammar
import GF.Grammar.Predef
import qualified Data.Map as Map
import GF.Text.Pretty
import Data.Char (isUpper,toUpper,toLower)
-- predefined function type signatures and definitions. AR 12/3/2003.
isInPredefined :: Ident -> Bool
isInPredefined f = Map.member f primitives
arrityPredefined :: Ident -> Maybe Int
arrityPredefined f = do ty <- typPredefined f
let (ctxt,_) = typeFormCnc ty
return (length ctxt)
predefModInfo :: SourceModInfo
predefModInfo = ModInfo MTResource MSComplete noOptions [] Nothing [] [] "Predef.gf" Nothing primitives
appPredefined :: Term -> Err (Term,Bool)
appPredefined t = case t of
App f x0 -> do
(x,_) <- appPredefined x0
case f of
-- one-place functions
Q (mod,f) | mod == cPredef ->
case x of
(K s) | f == cLength -> retb $ EInt $ length s
(K s) | f == cIsUpper -> retb $ if (all isUpper s) then predefTrue else predefFalse
(K s) | f == cToUpper -> retb $ K $ map toUpper s
(K s) | f == cToLower -> retb $ K $ map toLower s
(K s) | f == cError -> retb $ Error s
_ -> retb t
-- two-place functions
App (Q (mod,f)) z0 | mod == cPredef -> do
(z,_) <- appPredefined z0
case (norm z, norm x) of
(EInt i, K s) | f == cDrop -> retb $ K (drop i s)
(EInt i, K s) | f == cTake -> retb $ K (take i s)
(EInt i, K s) | f == cTk -> retb $ K (take (max 0 (length s - i)) s)
(EInt i, K s) | f == cDp -> retb $ K (drop (max 0 (length s - i)) s)
(K s, K t) | f == cEqStr -> retb $ if s == t then predefTrue else predefFalse
(K s, K t) | f == cOccur -> retb $ if substring s t then predefTrue else predefFalse
(K s, K t) | f == cOccurs -> retb $ if any (flip elem t) s then predefTrue else predefFalse
(EInt i, EInt j) | f == cEqInt -> retb $ if i==j then predefTrue else predefFalse
(EInt i, EInt j) | f == cLessInt -> retb $ if i<j then predefTrue else predefFalse
(EInt i, EInt j) | f == cPlus -> retb $ EInt $ i+j
(_, t) | f == cShow && notVar t -> retb $ foldrC $ map K $ words $ render (ppTerm Unqualified 0 t)
(_, K s) | f == cRead -> retb $ Cn (identS s) --- because of K, only works for atomic tags
(_, t) | f == cToStr -> trm2str t >>= retb
_ -> retb t ---- prtBad "cannot compute predefined" t
-- three-place functions
App (App (Q (mod,f)) z0) y0 | mod == cPredef -> do
(y,_) <- appPredefined y0
(z,_) <- appPredefined z0
case (z, y, x) of
(ty,op,t) | f == cMapStr -> retf $ mapStr ty op t
_ | f == cEqVal && notVar y && notVar x -> retb $ if y==x then predefTrue else predefFalse
_ -> retb t ---- prtBad "cannot compute predefined" t
_ -> retb t ---- prtBad "cannot compute predefined" t
_ -> retb t
---- should really check the absence of arg variables
where
retb t = return (retc t,True) -- no further computing needed
retf t = return (retc t,False) -- must be computed further
retc t = case t of
K [] -> t
K s -> foldr1 C (map K (words s))
_ -> t
norm t = case t of
Empty -> K []
C u v -> case (norm u,norm v) of
(K x,K y) -> K (x +++ y)
_ -> t
_ -> t
notVar t = case t of
Vr _ -> False
App f a -> notVar f && notVar a
_ -> True ---- would need to check that t is a value
foldrC ts = if null ts then Empty else foldr1 C ts
-- read makes variables into constants
predefTrue = QC (cPredef,cPTrue)
predefFalse = QC (cPredef,cPFalse)
substring :: String -> String -> Bool
substring s t = case (s,t) of
(c:cs, d:ds) -> (c == d && substring cs ds) || substring s ds
([],_) -> True
_ -> False
trm2str :: Term -> Err Term
trm2str t = case t of
R ((_,(_,s)):_) -> trm2str s
T _ ((_,s):_) -> trm2str s
V _ (s:_) -> trm2str s
C _ _ -> return $ t
K _ -> return $ t
S c _ -> trm2str c
Empty -> return $ t
_ -> Bad (render (text "cannot get Str from term" <+> ppTerm Unqualified 0 t))
-- simultaneous recursion on type and term: type arg is essential!
-- But simplify the task by assuming records are type-annotated
-- (this has been done in type checking)
mapStr :: Type -> Term -> Term -> Term
mapStr ty f t = case (ty,t) of
_ | elem ty [typeStr,typeTok] -> App f t
(_, R ts) -> R [(l,mapField v) | (l,v) <- ts]
(Table a b,T ti cs) -> T ti [(p,mapStr b f v) | (p,v) <- cs]
_ -> t
where
mapField (mty,te) = case mty of
Just ty -> (mty,mapStr ty f te)
_ -> (mty,te)
-}


@@ -15,7 +15,7 @@ import GF.Grammar.Lockfield(isLockLabel,lockRecType) --unlockRecord,lockLabel
import GF.Compile.Compute.Value hiding (Error)
import GF.Compile.Compute.Predef(predef,predefName,delta)
import GF.Data.Str(Str,glueStr,str2strings,str,sstr,plusStr,strTok)
import GF.Data.Operations(Err,err,errIn,maybeErr,combinations,mapPairsM)
import GF.Data.Operations(Err,err,errIn,maybeErr,mapPairsM)
import GF.Data.Utilities(mapFst,mapSnd)
import GF.Infra.Option
import Control.Monad(ap,liftM,liftM2) -- ,unless,mplus
@@ -291,9 +291,17 @@ glue env (v1,v2) = glu v1 v2
vt v = case value2term loc (local env) v of
Left i -> Error ('#':show i)
Right t -> t
in error . render $
ppL loc (hang "unsupported token gluing:" 4
(Glue (vt v1) (vt v2)))
originalMsg = render $ ppL loc (hang "unsupported token gluing" 4
(Glue (vt v1) (vt v2)))
term = render $ pp $ Glue (vt v1) (vt v2)
in error $ unlines
[originalMsg
,""
,"There was a problem in the expression `"++term++"`, either:"
,"1) You are trying to use + on runtime arguments, possibly via an oper."
,"2) One of the arguments in `"++term++"` is a bound variable from pattern matching a string, but the cases are non-exhaustive."
,"For more help see https://github.com/GrammaticalFramework/gf-core/tree/master/doc/errors/gluing.md"
]
-- | to get a string from a value that represents a sequence of terminals
@@ -318,7 +326,7 @@ strsFromValue t = case t of
return [strTok (str2strings def) vars |
def <- d0,
vars <- [[(str2strings v, map sstr c) | (v,c) <- zip vv c0] |
vv <- combinations v0]
vv <- sequence v0]
]
VFV ts -> concat # mapM strsFromValue ts
VStrs ts -> concat # mapM strsFromValue ts
@@ -520,7 +528,7 @@ value2term' stop loc xs v0 =
-- VGlue v1 v2 -> Glue (v2t v1) (v2t v2)
-- VExtR v1 v2 -> ExtR (v2t v1) (v2t v2)
VError err -> return (Error err)
_ -> bug ("value2term "++show loc++" : "++show v0)
where
v2t = v2txs xs
v2txs = value2term' stop loc
@@ -546,7 +554,7 @@ value2term' stop loc xs v0 =
linPattVars p =
if null dups
then return pvs
else fail.render $ hang "Pattern is not linear:" 4 (ppPatt Unqualified 0 p)
else fail.render $ hang "Pattern is not linear. All variable names on the left-hand side must be distinct." 4 (ppPatt Unqualified 0 p)
where
allpvs = allPattVars p
pvs = nub allpvs


@@ -7,7 +7,7 @@ import GF.Text.Pretty
--import GF.Grammar.Predef(cPredef,cInts)
--import GF.Compile.Compute.Predef(predef)
--import GF.Compile.Compute.Value(Predefined(..))
import GF.Infra.Ident(Ident,identS,identW,prefixIdent)
import GF.Infra.Ident(Ident,identC,identS,identW,prefixIdent,showRawIdent,rawIdentS)
import GF.Infra.Option
import GF.Haskell as H
import GF.Grammar.Canonical as C
@@ -21,7 +21,7 @@ concretes2haskell opts absname gr =
| let Grammar abstr cncs = grammar2canonical opts absname gr,
cncmod<-cncs,
let ModId name = concName cncmod
filename = name ++ ".hs" :: FilePath
filename = showRawIdent name ++ ".hs" :: FilePath
]
-- | Generate Haskell code for the given concrete module.
@@ -53,7 +53,7 @@ concrete2haskell opts
labels = S.difference (S.unions (map S.fromList recs)) common_labels
common_records = S.fromList [[label_s]]
common_labels = S.fromList [label_s]
label_s = LabelId "s"
label_s = LabelId (rawIdentS "s")
signature (CatDef c _) = TypeSig lf (Fun abs (pure lin))
where
@@ -69,7 +69,7 @@ concrete2haskell opts
where
--funcats = S.fromList [c | FunDef f (C.Type _ (TypeApp c _))<-funs]
allcats = S.fromList [c | CatDef c _<-cats]
gId :: ToIdent i => i -> Ident
gId = (if haskellOption opts HaskellNoPrefix then id else prefixIdent "G")
. toIdent
@@ -116,7 +116,7 @@ concrete2haskell opts
where (ls,ts) = unzip $ sortOn fst [(l,t)|RecordRow l t<-rs]
StrType -> tcon0 (identS "Str")
TableType pt lt -> Fun (ppT pt) (ppT lt)
-- TupleType lts ->
-- TupleType lts ->
lincatDef (LincatDef c t) = tsyn0 (lincatName c) (convLinType t)
@@ -126,7 +126,7 @@ concrete2haskell opts
linDefs = map eqn . sortOn fst . map linDef
where eqn (cat,(f,(ps,rhs))) = (cat,Eqn (f,ps) rhs)
linDef (LinDef f xs rhs0) =
linDef (LinDef f xs rhs0) =
(cat,(linfunName cat,(lhs,rhs)))
where
lhs = [ConP (aId f) (map VarP abs_args)]
@@ -144,7 +144,7 @@ concrete2haskell opts
where
vs = [(VarValueId (Unqual x),a)|(VarId x,a)<-zip xs args]
env= [(VarValueId (Unqual x),lc)|(VarId x,lc)<-zip xs (map arglincat absctx)]
letlin a (TypeBinding _ (C.Type _ (TypeApp acat _))) =
(a,Ap (Var (linfunName acat)) (Var (abs_arg a)))
@@ -187,7 +187,7 @@ concrete2haskell opts
pId p@(ParamId s) =
if "to_R_" `isPrefixOf` unqual s then toIdent p else gId p -- !! a hack
table cs =
if all (null.patVars) ps
then lets ds (LambdaCase [(ppP p,t')|(p,t')<-zip ps ts'])
@@ -315,13 +315,13 @@ instance Records rhs => Records (TableRow rhs) where
-- | Record subtyping is converted into explicit coercions in Haskell
coerce env ty t =
case (ty,t) of
case (ty,t) of
(_,VariantValue ts) -> VariantValue (map (coerce env ty) ts)
(TableType ti tv,TableValue _ cs) ->
TableValue ti [TableRow p (coerce env tv t)|TableRow p t<-cs]
(RecordType rt,RecordValue r) ->
RecordValue [RecordRow l (coerce env ft f) |
RecordRow l f<-r,ft<-[ft|RecordRow l' ft<-rt,l'==l]]
RecordRow l f<-r,ft<-[ft | RecordRow l' ft <- rt, l'==l]]
(RecordType rt,VarValue x)->
case lookup x env of
Just ty' | ty'/=ty -> -- better to compare to normal form of ty'
@@ -334,18 +334,17 @@ coerce env ty t =
_ -> t
where
app f ts = ParamConstant (Param f ts) -- !! a hack
to_rcon = ParamId . Unqual . to_rcon' . labels
to_rcon = ParamId . Unqual . rawIdentS . to_rcon' . labels
patVars p = []
labels r = [l|RecordRow l _<-r]
labels r = [l | RecordRow l _ <- r]
proj = Var . identS . proj'
proj' (LabelId l) = "proj_"++l
proj' (LabelId l) = "proj_" ++ showRawIdent l
rcon = Var . rcon'
rcon' = identS . rcon_name
rcon_name ls = "R"++concat (sort ['_':l|LabelId l<-ls])
rcon_name ls = "R"++concat (sort ['_':showRawIdent l | LabelId l <- ls])
to_rcon' = ("to_"++) . rcon_name
recordType ls =
@@ -400,17 +399,17 @@ linfunName c = prefixIdent "lin" (toIdent c)
class ToIdent i where toIdent :: i -> Ident
instance ToIdent ParamId where toIdent (ParamId q) = qIdentS q
instance ToIdent PredefId where toIdent (PredefId s) = identS s
instance ToIdent CatId where toIdent (CatId s) = identS s
instance ToIdent C.FunId where toIdent (FunId s) = identS s
instance ToIdent VarValueId where toIdent (VarValueId q) = qIdentS q
instance ToIdent ParamId where toIdent (ParamId q) = qIdentC q
instance ToIdent PredefId where toIdent (PredefId s) = identC s
instance ToIdent CatId where toIdent (CatId s) = identC s
instance ToIdent C.FunId where toIdent (FunId s) = identC s
instance ToIdent VarValueId where toIdent (VarValueId q) = qIdentC q
qIdentS = identS . unqual
qIdentC = identS . unqual
unqual (Qual (ModId m) n) = m++"_"++n
unqual (Unqual n) = n
unqual (Qual (ModId m) n) = showRawIdent m++"_"++ showRawIdent n
unqual (Unqual n) = showRawIdent n
instance ToIdent VarId where
toIdent Anonymous = identW
toIdent (VarId s) = identS s
toIdent (VarId s) = identC s


@@ -41,6 +41,7 @@ import Control.Monad
import Control.Monad.Identity
--import Control.Exception
--import Debug.Trace(trace)
import qualified Control.Monad.Fail as Fail
----------------------------------------------------------------------
-- main conversion function
@@ -196,6 +197,9 @@ newtype CnvMonad a = CM {unCM :: SourceGrammar
-> ([ProtoFCat],[Symbol])
-> Branch b}
instance Fail.MonadFail CnvMonad where
fail = bug
instance Applicative CnvMonad where
pure = return
(<*>) = ap
@@ -614,6 +618,23 @@ mkArray lst = listArray (0,length lst-1) lst
mkSetArray map = array (0,Map.size map-1) [(v,k) | (k,v) <- Map.toList map]
bug msg = ppbug msg
ppbug msg = error . render $ hang "Internal error in GeneratePMCFG:" 4 msg
ppbug msg = error completeMsg
where
originalMsg = render $ hang "Internal error in GeneratePMCFG:" 4 msg
completeMsg =
case render msg of -- the error message for pattern matching a runtime string
"descend (CStr 0,CNil,CProj (LIdent (Id {rawId2utf8 = \"s\"})) CNil)"
-> unlines [originalMsg -- add more helpful output
,""
,"1) Check that you are not trying to pattern match a /runtime string/."
," These are illegal:"
," lin Test foo = case foo.s of {"
," \"str\" => … } ; <- explicit matching argument of a lin"
," lin Test foo = opThatMatches foo <- calling an oper that pattern matches"
,""
,"2) Not about pattern matching? Submit a bug report and we update the error message."
," https://github.com/GrammaticalFramework/gf-core/issues"
]
_ -> originalMsg -- any other message: just print it as is
ppU = ppTerm Unqualified


@@ -6,30 +6,35 @@ module GF.Compile.GrammarToCanonical(
) where
import Data.List(nub,partition)
import qualified Data.Map as M
import Data.Maybe(fromMaybe)
import qualified Data.Set as S
import GF.Data.ErrM
import GF.Text.Pretty
import GF.Grammar.Grammar
import GF.Grammar.Grammar as G
import GF.Grammar.Lookup(lookupOrigInfo,allOrigInfos,allParamValues)
import GF.Grammar.Macros(typeForm,collectOp,collectPattOp,mkAbs,mkApp,term2patt)
import GF.Grammar.Macros(typeForm,collectOp,collectPattOp,composSafeOp,mkAbs,mkApp,term2patt,sortRec)
import GF.Grammar.Lockfield(isLockLabel)
import GF.Grammar.Predef(cPredef,cInts)
import GF.Compile.Compute.Predef(predef)
import GF.Compile.Compute.Value(Predefined(..))
import GF.Infra.Ident(ModuleName(..),Ident,prefixIdent,showIdent,isWildIdent)
import GF.Infra.Option(optionsPGF)
import GF.Infra.Ident(ModuleName(..),Ident,ident2raw,rawIdentS,showIdent,isWildIdent)
import GF.Infra.Option(Options,optionsPGF)
import PGF.Internal(Literal(..))
import GF.Compile.Compute.ConcreteNew(normalForm,resourceValues)
import GF.Compile.Compute.ConcreteNew(GlobalEnv,normalForm,resourceValues)
import GF.Grammar.Canonical as C
import Debug.Trace
import System.FilePath ((</>), (<.>))
import qualified Debug.Trace as T
-- | Generate Canonical code for the named abstract syntax and all associated
-- concrete syntaxes
grammar2canonical :: Options -> ModuleName -> G.Grammar -> C.Grammar
grammar2canonical opts absname gr =
Grammar (abstract2canonical absname gr)
(map snd (concretes2canonical opts absname gr))
-- | Generate Canonical code for the named abstract syntax
abstract2canonical :: ModuleName -> G.Grammar -> Abstract
abstract2canonical absname gr =
Abstract (modId absname) (convFlags gr absname) cats funs
where
@@ -44,6 +49,7 @@ abstract2canonical absname gr =
convHypo (bt,name,t) =
case typeForm t of
([],(_,cat),[]) -> gId cat -- !!
tf -> error $ "abstract2canonical convHypo: " ++ show tf
convType t =
case typeForm t of
@@ -54,25 +60,26 @@ abstract2canonical absname gr =
convHypo' (bt,name,t) = TypeBinding (gId name) (convType t)
-- | Generate Canonical code for all concrete syntaxes associated with
-- the named abstract syntax in the given grammar.
concretes2canonical :: Options -> ModuleName -> G.Grammar -> [(FilePath, Concrete)]
concretes2canonical opts absname gr =
[(cncname,concrete2canonical gr cenv absname cnc cncmod)
| let cenv = resourceValues opts gr,
cnc<-allConcretes gr absname,
let cncname = "canonical/"++render cnc ++ ".gf" :: FilePath
let cncname = "canonical" </> render cnc <.> "gf"
Ok cncmod = lookupModule gr cnc
]
-- | Generate Canonical GF for the given concrete module.
concrete2canonical :: G.Grammar -> GlobalEnv -> ModuleName -> ModuleName -> ModuleInfo -> Concrete
concrete2canonical gr cenv absname cnc modinfo =
Concrete (modId cnc) (modId absname) (convFlags gr cnc)
(neededParamTypes S.empty (params defs))
[lincat|(_,Left lincat)<-defs]
[lin|(_,Right lin)<-defs]
[lincat | (_,Left lincat) <- defs]
[lin | (_,Right lin) <- defs]
where
defs = concatMap (toCanonical gr absname cenv) .
defs = concatMap (toCanonical gr absname cenv) .
M.toList $
jments modinfo
@@ -85,6 +92,7 @@ concrete2canonical gr cenv absname cnc modinfo =
else let ((got,need),def) = paramType gr q
in def++neededParamTypes (S.union got have) (S.toList need++qs)
toCanonical :: G.Grammar -> ModuleName -> GlobalEnv -> (Ident, Info) -> [(S.Set QIdent, Either LincatDef LinDef)]
toCanonical gr absname cenv (name,jment) =
case jment of
CncCat (Just (L loc typ)) _ _ pprn _ ->
@@ -97,7 +105,8 @@ toCanonical gr absname cenv (name,jment) =
where
tts = tableTypes gr [e']
e' = unAbs (length params) $
e' = cleanupRecordFields lincat $
unAbs (length params) $
nf loc (mkAbs params (mkApp def (map Vr args)))
params = [(b,x)|(b,x,_)<-ctx]
args = map snd params
@@ -108,12 +117,12 @@ toCanonical gr absname cenv (name,jment) =
_ -> []
where
nf loc = normalForm cenv (L loc name)
-- aId n = prefixIdent "A." (gId n)
unAbs 0 t = t
unAbs n (Abs _ _ t) = unAbs (n-1) t
unAbs _ t = t
tableTypes :: G.Grammar -> [Term] -> S.Set QIdent
tableTypes gr ts = S.unions (map tabtys ts)
where
tabtys t =
@@ -122,6 +131,7 @@ tableTypes gr ts = S.unions (map tabtys ts)
T (TTyped t) cs -> S.union (paramTypes gr t) (tableTypes gr (map snd cs))
_ -> collectOp tabtys t
paramTypes :: G.Grammar -> G.Type -> S.Set QIdent
paramTypes gr t =
case t of
RecType fs -> S.unions (map (paramTypes gr.snd) fs)
@@ -140,11 +150,26 @@ paramTypes gr t =
Ok (_,ResParam {}) -> S.singleton q
_ -> ignore
ignore = trace ("Ignore: "++show t) S.empty
ignore = T.trace ("Ignore: " ++ show t) S.empty
-- | Filter out record fields from definitions which don't appear in lincat.
cleanupRecordFields :: G.Type -> Term -> Term
cleanupRecordFields (RecType ls) (R as) =
let defnFields = M.fromList ls
in R
[ (lbl, (mty, t'))
| (lbl, (mty, t)) <- as
, M.member lbl defnFields
, let Just ty = M.lookup lbl defnFields
, let t' = cleanupRecordFields ty t
]
cleanupRecordFields ty t@(FV _) = composSafeOp (cleanupRecordFields ty) t
cleanupRecordFields _ t = t
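The core of `cleanupRecordFields` above is an association-list filter against the set of labels present in the lincat's record type. A standalone sketch of that filtering idea, with plain `String` labels standing in for GF labels:

```haskell
import qualified Data.Map as M

-- Keep only the fields whose label appears in the reference map,
-- analogous to dropping record fields not declared in the lincat.
keepKnown :: M.Map String t -> [(String, v)] -> [(String, v)]
keepKnown ref = filter (\(lbl, _) -> M.member lbl ref)
```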
convert :: G.Grammar -> Term -> LinValue
convert gr = convert' gr []
convert' :: G.Grammar -> [Ident] -> Term -> LinValue
convert' gr vs = ppT
where
ppT0 = convert' gr vs
@@ -162,20 +187,20 @@ convert' gr vs = ppT
S t p -> selection (ppT t) (ppT p)
C t1 t2 -> concatValue (ppT t1) (ppT t2)
App f a -> ap (ppT f) (ppT a)
R r -> RecordValue (fields r)
R r -> RecordValue (fields (sortRec r))
P t l -> projection (ppT t) (lblId l)
Vr x -> VarValue (gId x)
Cn x -> VarValue (gId x) -- hmm
Con c -> ParamConstant (Param (gId c) [])
Sort k -> VarValue (gId k)
EInt n -> LiteralValue (IntConstant n)
Q (m,n) -> if m==cPredef then ppPredef n else VarValue ((gQId m n))
QC (m,n) -> ParamConstant (Param ((gQId m n)) [])
Q (m,n) -> if m==cPredef then ppPredef n else VarValue (gQId m n)
QC (m,n) -> ParamConstant (Param (gQId m n) [])
K s -> LiteralValue (StrConstant s)
Empty -> LiteralValue (StrConstant "")
FV ts -> VariantValue (map ppT ts)
Alts t' vs -> alts vs (ppT t')
_ -> error $ "convert' "++show t
_ -> error $ "convert' ppT: " ++ show t
ppCase (p,t) = TableRow (ppP p) (ppTv (patVars p++vs) t)
@@ -188,12 +213,12 @@ convert' gr vs = ppT
Ok ALL_CAPIT -> p "ALL_CAPIT"
_ -> VarValue (gQId cPredef n) -- hmm
where
p = PredefValue . PredefId
p = PredefValue . PredefId . rawIdentS
ppP p =
case p of
PC c ps -> ParamPattern (Param (gId c) (map ppP ps))
PP (m,c) ps -> ParamPattern (Param ((gQId m c)) (map ppP ps))
PP (m,c) ps -> ParamPattern (Param (gQId m c) (map ppP ps))
PR r -> RecordPattern (fields r) {-
PW -> WildPattern
PV x -> VarP x
@@ -202,6 +227,7 @@ convert' gr vs = ppT
PFloat x -> Lit (show x)
PT _ p -> ppP p
PAs x p -> AsP x (ppP p) -}
_ -> error $ "convert' ppP: " ++ show p
where
fields = map field . filter (not.isLockLabel.fst)
field (l,p) = RecordRow (lblId l) (ppP p)
@@ -215,14 +241,15 @@ convert' gr vs = ppT
alt (t,p) = (pre p,ppT0 t)
pre (K s) = [s]
pre Empty = [""] -- Empty == K ""
pre (Strs ts) = concatMap pre ts
pre (EPatt p) = pat p
pre t = error $ "pre "++show t
pre t = error $ "convert' alts pre: " ++ show t
pat (PString s) = [s]
pat (PAlt p1 p2) = pat p1++pat p2
pat (PSeq p1 p2) = [s1++s2 | s1<-pat p1, s2<-pat p2]
pat p = error $ "pat "++show p
pat p = error $ "convert' alts pat: "++show p
fields = map field . filter (not.isLockLabel.fst)
field (l,(_,t)) = RecordRow (lblId l) (ppT t)
@@ -235,6 +262,7 @@ convert' gr vs = ppT
ParamConstant (Param p (ps++[a]))
_ -> error $ "convert' ap: "++render (ppA f <+> ppA a)
concatValue :: LinValue -> LinValue -> LinValue
concatValue v1 v2 =
case (v1,v2) of
(LiteralValue (StrConstant ""),_) -> v2
@@ -242,21 +270,24 @@ concatValue v1 v2 =
_ -> ConcatValue v1 v2
-- | Smart constructor for projections
projection r l = maybe (Projection r l) id (proj r l)
projection :: LinValue -> LabelId -> LinValue
projection r l = fromMaybe (Projection r l) (proj r l)
proj :: LinValue -> LabelId -> Maybe LinValue
proj r l =
case r of
RecordValue r -> case [v|RecordRow l' v<-r,l'==l] of
RecordValue r -> case [v | RecordRow l' v <- r, l'==l] of
[v] -> Just v
_ -> Nothing
_ -> Nothing
-- | Smart constructor for selections
selection :: LinValue -> LinValue -> LinValue
selection t v =
-- Note: impossible cases can become possible after grammar transformation
case t of
TableValue tt r ->
case nub [rv|TableRow _ rv<-keep] of
case nub [rv | TableRow _ rv <- keep] of
[rv] -> rv
_ -> Selection (TableValue tt r') v
where
@@ -275,13 +306,16 @@ selection t v =
(keep,discard) = partition (mightMatchRow v) r
_ -> Selection t v
impossible :: LinValue -> LinValue
impossible = CommentedValue "impossible"
mightMatchRow :: LinValue -> TableRow rhs -> Bool
mightMatchRow v (TableRow p _) =
case p of
WildPattern -> True
_ -> mightMatch v p
mightMatch :: LinValue -> LinPattern -> Bool
mightMatch v p =
case v of
ConcatValue _ _ -> False
@@ -293,16 +327,18 @@ mightMatch v p =
RecordValue rv ->
case p of
RecordPattern rp ->
and [maybe False (flip mightMatch p) (proj v l) | RecordRow l p<-rp]
and [maybe False (`mightMatch` p) (proj v l) | RecordRow l p<-rp]
_ -> False
_ -> True
patVars :: Patt -> [Ident]
patVars p =
case p of
PV x -> [x]
PAs x p -> x:patVars p
_ -> collectPattOp patVars p
convType :: Term -> LinType
convType = ppT
where
ppT t =
@@ -314,9 +350,9 @@ convType = ppT
Sort k -> convSort k
-- EInt n -> tcon0 (identS ("({-"++show n++"-})")) -- type level numeric literal
FV (t:ts) -> ppT t -- !!
QC (m,n) -> ParamType (ParamTypeId ((gQId m n)))
Q (m,n) -> ParamType (ParamTypeId ((gQId m n)))
_ -> error $ "Missing case in convType for: "++show t
QC (m,n) -> ParamType (ParamTypeId (gQId m n))
Q (m,n) -> ParamType (ParamTypeId (gQId m n))
_ -> error $ "convType ppT: " ++ show t
convFields = map convField . filter (not.isLockLabel.fst)
convField (l,r) = RecordRow (lblId l) (ppT r)
@@ -325,15 +361,20 @@ convType = ppT
"Float" -> FloatType
"Int" -> IntType
"Str" -> StrType
_ -> error ("convSort "++show k)
_ -> error $ "convType convSort: " ++ show k
toParamType :: Term -> ParamType
toParamType t = case convType t of
ParamType pt -> pt
_ -> error ("toParamType "++show t)
_ -> error $ "toParamType: " ++ show t
toParamId :: Term -> ParamId
toParamId t = case toParamType t of
ParamTypeId p -> p
paramType :: G.Grammar
-> (ModuleName, Ident)
-> ((S.Set (ModuleName, Ident), S.Set QIdent), [ParamDef])
paramType gr q@(_,n) =
case lookupOrigInfo gr q of
Ok (m,ResParam (Just (L _ ps)) _)
@@ -341,7 +382,7 @@ paramType gr q@(_,n) =
((S.singleton (m,n),argTypes ps),
[ParamDef name (map (param m) ps)]
)
where name = (gQId m n)
where name = gQId m n
Ok (m,ResOper _ (Just (L _ t)))
| m==cPredef && n==cInts ->
((S.empty,S.empty),[]) {-
@@ -349,36 +390,46 @@ paramType gr q@(_,n) =
[Type (ConAp ((gQId m n)) [identS "n"]) (TId (identS "Int"))])-}
| otherwise ->
((S.singleton (m,n),paramTypes gr t),
[ParamAliasDef ((gQId m n)) (convType t)])
[ParamAliasDef (gQId m n) (convType t)])
_ -> ((S.empty,S.empty),[])
where
param m (n,ctx) = Param ((gQId m n)) [toParamId t|(_,_,t)<-ctx]
param m (n,ctx) = Param (gQId m n) [toParamId t|(_,_,t)<-ctx]
argTypes = S.unions . map argTypes1
argTypes1 (n,ctx) = S.unions [paramTypes gr t|(_,_,t)<-ctx]
lblId = LabelId . render -- hmm
modId (MN m) = ModId (showIdent m)
lblId :: Label -> C.LabelId
lblId (LIdent ri) = LabelId ri
lblId (LVar i) = LabelId (rawIdentS (show i)) -- hmm
class FromIdent i where gId :: Ident -> i
modId :: ModuleName -> C.ModId
modId (MN m) = ModId (ident2raw m)
class FromIdent i where
gId :: Ident -> i
instance FromIdent VarId where
gId i = if isWildIdent i then Anonymous else VarId (showIdent i)
gId i = if isWildIdent i then Anonymous else VarId (ident2raw i)
instance FromIdent C.FunId where gId = C.FunId . showIdent
instance FromIdent CatId where gId = CatId . showIdent
instance FromIdent C.FunId where gId = C.FunId . ident2raw
instance FromIdent CatId where gId = CatId . ident2raw
instance FromIdent ParamId where gId = ParamId . unqual
instance FromIdent VarValueId where gId = VarValueId . unqual
class FromIdent i => QualIdent i where gQId :: ModuleName -> Ident -> i
class FromIdent i => QualIdent i where
gQId :: ModuleName -> Ident -> i
instance QualIdent ParamId where gQId m n = ParamId (qual m n)
instance QualIdent ParamId where gQId m n = ParamId (qual m n)
instance QualIdent VarValueId where gQId m n = VarValueId (qual m n)
qual m n = Qual (modId m) (showIdent n)
unqual n = Unqual (showIdent n)
qual :: ModuleName -> Ident -> QualId
qual m n = Qual (modId m) (ident2raw n)
unqual :: Ident -> QualId
unqual n = Unqual (ident2raw n)
convFlags :: G.Grammar -> ModuleName -> Flags
convFlags gr mn =
Flags [(n,convLit v) |
Flags [(rawIdentS n,convLit v) |
(n,v)<-err (const []) (optionsPGF.mflags) (lookupModule gr mn)]
where
convLit l =


@@ -1,4 +1,4 @@
{-# LANGUAGE BangPatterns, FlexibleContexts, MagicHash #-}
{-# LANGUAGE BangPatterns, FlexibleContexts #-}
module GF.Compile.GrammarToPGF (mkCanon2pgf) where
--import GF.Compile.Export
@@ -8,16 +8,13 @@ import GF.Compile.GenerateBC
import PGF(CId,mkCId,utf8CId)
import PGF.Internal(fidInt,fidFloat,fidString,fidVar)
import PGF.Internal(updateProductionIndices)
--import qualified PGF.Macros as CM
import qualified PGF.Internal as C
import qualified PGF.Internal as D
import GF.Grammar.Predef
--import GF.Grammar.Printer
import GF.Grammar.Grammar
import qualified GF.Grammar.Lookup as Look
import qualified GF.Grammar as A
import qualified GF.Grammar.Macros as GM
--import GF.Compile.GeneratePMCFG
import GF.Infra.Ident
import GF.Infra.Option
@@ -30,9 +27,6 @@ import qualified Data.Map as Map
import qualified Data.IntMap as IntMap
import Data.Array.IArray
import Data.Char
import GHC.Prim
import GHC.Base(getTag)
mkCanon2pgf :: Options -> SourceGrammar -> ModuleName -> IOE D.PGF
mkCanon2pgf opts gr am = do
@@ -65,7 +59,7 @@ mkCanon2pgf opts gr am = do
mkConcr cm = do
let cflags = err (const noOptions) mflags (lookupModule gr cm)
ciCmp | flag optCaseSensitive cflags = compare
| otherwise = compareCaseInsensitve
| otherwise = C.compareCaseInsensitve
(ex_seqs,cdefs) <- addMissingPMCFGs
Map.empty
@@ -74,7 +68,7 @@ mkCanon2pgf opts gr am = do
let flags = Map.fromList [(mkCId f,x) | (f,x) <- optionsPGF cflags]
seqs = (mkArray . sortNubBy ciCmp . concat) $
seqs = (mkArray . C.sortNubBy ciCmp . concat) $
(Map.keys ex_seqs : [maybe [] elems (mseqs mi) | (m,mi) <- allExtends gr cm])
ex_seqs_arr = mkMapArray ex_seqs :: Array SeqId Sequence
@@ -312,119 +306,3 @@ genPrintNames cdefs =
mkArray lst = listArray (0,length lst-1) lst
mkMapArray map = array (0,Map.size map-1) [(v,k) | (k,v) <- Map.toList map]
-- The following is a version of Data.List.sortBy which, in addition to
-- sorting, also eliminates duplicate values
sortNubBy cmp = mergeAll . sequences
where
sequences (a:b:xs) =
case cmp a b of
GT -> descending b [a] xs
EQ -> sequences (b:xs)
LT -> ascending b (a:) xs
sequences xs = [xs]
descending a as [] = [a:as]
descending a as (b:bs) =
case cmp a b of
GT -> descending b (a:as) bs
EQ -> descending a as bs
LT -> (a:as) : sequences (b:bs)
ascending a as [] = let !x = as [a]
in [x]
ascending a as (b:bs) =
case cmp a b of
GT -> let !x = as [a]
in x : sequences (b:bs)
EQ -> ascending a as bs
LT -> ascending b (\ys -> as (a:ys)) bs
mergeAll [x] = x
mergeAll xs = mergeAll (mergePairs xs)
mergePairs (a:b:xs) = let !x = merge a b
in x : mergePairs xs
mergePairs xs = xs
merge as@(a:as') bs@(b:bs') =
case cmp a b of
GT -> b:merge as bs'
EQ -> a:merge as' bs'
LT -> a:merge as' bs
merge [] bs = bs
merge as [] = as
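The removed `sortNubBy` above is an optimized bottom-up merge sort that drops duplicates while merging (the call sites now use `C.sortNubBy` from `PGF.Internal`). A minimal, behaviourally equivalent sketch of what it computes, written with plain `Data.List` — a reference version for clarity, not the optimized implementation:

```haskell
import Data.List (groupBy, sortBy)

-- Reference semantics of sortNubBy: sort with cmp, then keep one
-- representative of each run of cmp-equal elements.
sortNubBy :: (a -> a -> Ordering) -> [a] -> [a]
sortNubBy cmp = map head . groupBy (\a b -> cmp a b == EQ) . sortBy cmp

main :: IO ()
main = print (sortNubBy compare [3, 1, 2, 3, 1])  -- [1,2,3]
```

With a case-insensitive comparison such as the sequence ordering used for `ciCmp`, this also collapses case variants into a single representative, which is exactly what the PGF build wants for its sequence table.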
-- The following function does case-insensitive comparison of sequences.
-- This is used to allow case-insensitive parsing, while
-- the linearizer still has access to the original cases.
compareCaseInsensitve s1 s2 =
compareSeq (elems s1) (elems s2)
where
compareSeq [] [] = EQ
compareSeq [] _ = LT
compareSeq _ [] = GT
compareSeq (x:xs) (y:ys) =
case compareSym x y of
EQ -> compareSeq xs ys
x -> x
compareSym s1 s2 =
case s1 of
D.SymCat d1 r1
-> case s2 of
D.SymCat d2 r2
-> case compare d1 d2 of
EQ -> r1 `compare` r2
x -> x
_ -> LT
D.SymLit d1 r1
-> case s2 of
D.SymCat {} -> GT
D.SymLit d2 r2
-> case compare d1 d2 of
EQ -> r1 `compare` r2
x -> x
_ -> LT
D.SymVar d1 r1
-> if tagToEnum# (getTag s2 ># 2#)
then LT
else case s2 of
D.SymVar d2 r2
-> case compare d1 d2 of
EQ -> r1 `compare` r2
x -> x
_ -> GT
D.SymKS t1
-> if tagToEnum# (getTag s2 ># 3#)
then LT
else case s2 of
D.SymKS t2 -> t1 `compareToken` t2
_ -> GT
D.SymKP a1 b1
-> if tagToEnum# (getTag s2 ># 4#)
then LT
else case s2 of
D.SymKP a2 b2
-> case compare a1 a2 of
EQ -> b1 `compare` b2
x -> x
_ -> GT
_ -> let t1 = getTag s1
t2 = getTag s2
in if tagToEnum# (t1 <# t2)
then LT
else if tagToEnum# (t1 ==# t2)
then EQ
else GT
compareToken [] [] = EQ
compareToken [] _ = LT
compareToken _ [] = GT
compareToken (x:xs) (y:ys)
| x == y = compareToken xs ys
| otherwise = case compare (toLower x) (toLower y) of
EQ -> case compareToken xs ys of
EQ -> compare x y
x -> x
x -> x
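The token comparison above orders characters case-insensitively first and only falls back to a case-sensitive comparison to break ties, so case variants sort next to each other without being conflated. A self-contained sketch of that ordering on plain `String`s (the same logic as `compareToken` in the `where` clause above, extracted for illustration):

```haskell
import Data.Char (toLower)

-- Case-insensitive ordering with a case-sensitive tie-break.
compareToken :: String -> String -> Ordering
compareToken [] [] = EQ
compareToken [] _  = LT
compareToken _  [] = GT
compareToken (x:xs) (y:ys)
  | x == y    = compareToken xs ys
  | otherwise = case compare (toLower x) (toLower y) of
      EQ -> case compareToken xs ys of
              EQ -> compare x y  -- tie-break on original case
              o  -> o
      o  -> o

main :: IO ()
main = do
  print (compareToken "apple" "Banana")  -- LT (case-insensitive order)
  print (compare      "apple" "Banana")  -- GT (raw Char codes)
```

The tie-break is what lets case-insensitive parsing coexist with a linearizer that still sees the original cases: distinct spellings never compare `EQ`, they merely sort adjacently.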


@@ -21,23 +21,16 @@ import GF.Grammar.Printer
import GF.Grammar.Macros
import GF.Grammar.Lookup
import GF.Grammar.Predef
--import GF.Compile.Refresh
--import GF.Compile.Compute.Concrete
import GF.Compile.Compute.ConcreteNew(GlobalEnv,normalForm,resourceValues)
--import GF.Compile.CheckGrammar
--import GF.Compile.Update
import GF.Data.Operations
--import GF.Infra.CheckM
import GF.Infra.Option
import Control.Monad
--import Data.List
import qualified Data.Set as Set
import qualified Data.Map as Map
import GF.Text.Pretty
import Debug.Trace
-- | partial evaluation of concrete syntax. AR 6\/2001 -- 16\/5\/2003 -- 5\/2\/2005.
optimizeModule :: Options -> SourceGrammar -> SourceModule -> Err SourceModule
@@ -54,7 +47,7 @@ optimizeModule opts sgr m@(name,mi)
updateEvalInfo mi (i,info) = do
info <- evalInfo oopts resenv sgr (name,mi) i info
return (mi{jments=updateTree (i,info) (jments mi)})
return (mi{jments=Map.insert i info (jments mi)})
evalInfo :: Options -> GlobalEnv -> SourceGrammar -> SourceModule -> Ident -> Info -> Err Info
evalInfo opts resenv sgr m c info = do


@@ -26,50 +26,58 @@ import Data.List --(isPrefixOf, find, intersperse)
import qualified Data.Map as Map
type Prefix = String -> String
type DerivingClause = String
-- | the main function
grammar2haskell :: Options
-> String -- ^ Module name.
-> PGF
-> String
grammar2haskell opts name gr = foldr (++++) [] $
pragmas ++ haskPreamble gadt name ++ [types, gfinstances gId lexical gr'] ++ compos
grammar2haskell opts name gr = foldr (++++) [] $
pragmas ++ haskPreamble gadt name derivingClause extraImports ++
[types, gfinstances gId lexical gr'] ++ compos
where gr' = hSkeleton gr
gadt = haskellOption opts HaskellGADT
dataExt = haskellOption opts HaskellData
lexical cat = haskellOption opts HaskellLexical && isLexicalCat opts cat
gId | haskellOption opts HaskellNoPrefix = id
| otherwise = ("G"++)
pragmas | gadt = ["{-# OPTIONS_GHC -fglasgow-exts #-}","{-# LANGUAGE GADTs #-}"]
gId | haskellOption opts HaskellNoPrefix = rmForbiddenChars
| otherwise = ("G"++) . rmForbiddenChars
-- GF grammars allow weird identifier names inside '', e.g. 'VP/Object'
rmForbiddenChars = filter (`notElem` "'!#$%&*+./<=>?@\\^|-~")
pragmas | gadt = ["{-# LANGUAGE GADTs, FlexibleInstances, KindSignatures, RankNTypes, TypeSynonymInstances #-}"]
| dataExt = ["{-# LANGUAGE DeriveDataTypeable #-}"]
| otherwise = []
derivingClause
| dataExt = "deriving (Show,Data)"
| otherwise = "deriving Show"
extraImports | gadt = ["import Control.Monad.Identity",
"import Data.Monoid"]
| dataExt = ["import Data.Data"]
| otherwise = []
types | gadt = datatypesGADT gId lexical gr'
| otherwise = datatypes gId lexical gr'
| otherwise = datatypes gId derivingClause lexical gr'
compos | gadt = prCompos gId lexical gr' ++ composClass
| otherwise = []
haskPreamble gadt name =
haskPreamble gadt name derivingClause extraImports =
[
"module " ++ name ++ " where",
""
] ++
(if gadt then [
"import Control.Monad.Identity",
"import Data.Monoid"
] else []) ++
[
] ++ extraImports ++ [
"import PGF hiding (Tree)",
"----------------------------------------------------",
"-- automatic translation from GF to Haskell",
"----------------------------------------------------",
"",
"",
"class Gf a where",
" gf :: a -> Expr",
" fg :: Expr -> a",
"",
predefInst gadt "GString" "String" "unStr" "mkStr",
predefInst gadt derivingClause "GString" "String" "unStr" "mkStr",
"",
predefInst gadt "GInt" "Int" "unInt" "mkInt",
predefInst gadt derivingClause "GInt" "Int" "unInt" "mkInt",
"",
predefInst gadt "GFloat" "Double" "unFloat" "mkFloat",
predefInst gadt derivingClause "GFloat" "Double" "unFloat" "mkFloat",
"",
"----------------------------------------------------",
"-- below this line machine-generated",
@@ -77,11 +85,11 @@ haskPreamble gadt name =
""
]
predefInst gadt gtyp typ destr consr =
predefInst gadt derivingClause gtyp typ destr consr =
(if gadt
then []
else ("newtype" +++ gtyp +++ "=" +++ gtyp +++ typ +++ " deriving Show\n\n")
)
then []
else ("newtype" +++ gtyp +++ "=" +++ gtyp +++ typ +++ derivingClause ++ "\n\n")
)
++
"instance Gf" +++ gtyp +++ "where" ++++
" gf (" ++ gtyp +++ "x) =" +++ consr +++ "x" ++++
@@ -94,24 +102,24 @@ type OIdent = String
type HSkeleton = [(OIdent, [(OIdent, [OIdent])])]
datatypes :: Prefix -> (OIdent -> Bool) -> (String,HSkeleton) -> String
datatypes gId lexical = (foldr (+++++) "") . (filter (/="")) . (map (hDatatype gId lexical)) . snd
datatypes :: Prefix -> DerivingClause -> (OIdent -> Bool) -> (String,HSkeleton) -> String
datatypes gId derivingClause lexical = (foldr (+++++) "") . (filter (/="")) . (map (hDatatype gId derivingClause lexical)) . snd
gfinstances :: Prefix -> (OIdent -> Bool) -> (String,HSkeleton) -> String
gfinstances gId lexical (m,g) = (foldr (+++++) "") $ (filter (/="")) $ (map (gfInstance gId lexical m)) g
hDatatype :: Prefix -> (OIdent -> Bool) -> (OIdent, [(OIdent, [OIdent])]) -> String
hDatatype _ _ ("Cn",_) = "" ---
hDatatype gId _ (cat,[]) = "data" +++ gId cat
hDatatype gId _ (cat,rules) | isListCat (cat,rules) =
"newtype" +++ gId cat +++ "=" +++ gId cat +++ "[" ++ gId (elemCat cat) ++ "]"
+++ "deriving Show"
hDatatype gId lexical (cat,rules) =
hDatatype :: Prefix -> DerivingClause -> (OIdent -> Bool) -> (OIdent, [(OIdent, [OIdent])]) -> String
hDatatype _ _ _ ("Cn",_) = "" ---
hDatatype gId _ _ (cat,[]) = "data" +++ gId cat
hDatatype gId derivingClause _ (cat,rules) | isListCat (cat,rules) =
"newtype" +++ gId cat +++ "=" +++ gId cat +++ "[" ++ gId (elemCat cat) ++ "]"
+++ derivingClause
hDatatype gId derivingClause lexical (cat,rules) =
"data" +++ gId cat +++ "=" ++
(if length rules == 1 then "" else "\n ") +++
foldr1 (\x y -> x ++ "\n |" +++ y) constructors ++++
" deriving Show"
" " +++ derivingClause
where
constructors = [gId f +++ foldr (+++) "" (map (gId) xx) | (f,xx) <- nonLexicalRules (lexical cat) rules]
++ if lexical cat then [lexicalConstructor cat +++ "String"] else []


@@ -27,19 +27,20 @@ module GF.Compile.Rename (
renameModule
) where
import GF.Infra.Ident
import GF.Infra.CheckM
import GF.Grammar.Grammar
import GF.Grammar.Values
import GF.Grammar.Predef
import GF.Infra.Ident
import GF.Infra.CheckM
import GF.Grammar.Lookup
import GF.Grammar.Macros
import GF.Grammar.Printer
--import GF.Grammar.Lookup
--import GF.Grammar.Printer
import GF.Data.Operations
import Control.Monad
import Data.List (nub,(\\))
import qualified Data.Map as Map
import Data.Maybe(mapMaybe)
import GF.Text.Pretty
-- | this gives top-level access to renaming term input in the cc command
@@ -55,9 +56,9 @@ renameModule cwd gr mo@(m,mi) = do
js <- checkMapRecover (renameInfo cwd status mo) (jments mi)
return (m, mi{jments = js})
type Status = (StatusTree, [(OpenSpec, StatusTree)])
type Status = (StatusMap, [(OpenSpec, StatusMap)])
type StatusTree = BinTree Ident StatusInfo
type StatusMap = Map.Map Ident StatusInfo
type StatusInfo = Ident -> Term
@@ -73,12 +74,12 @@ renameIdentTerm' env@(act,imps) t0 =
Q (m',c) | m' == cPredef {- && isInPredefined c -} -> return t0
Q (m',c) -> do
m <- lookupErr m' qualifs
f <- lookupTree showIdent c m
f <- lookupIdent c m
return $ f c
QC (m',c) | m' == cPredef {- && isInPredefined c -} -> return t0
QC (m',c) -> do
m <- lookupErr m' qualifs
f <- lookupTree showIdent c m
f <- lookupIdent c m
return $ f c
_ -> return t0
where
@@ -93,30 +94,21 @@ renameIdentTerm' env@(act,imps) t0 =
| otherwise = checkError s
ident alt c =
case lookupTree showIdent c act of
Ok f -> return (f c)
_ -> case lookupTreeManyAll showIdent opens c of
[f] -> return (f c)
[] -> alt c ("constant not found:" <+> c $$
"given" <+> fsep (punctuate ',' (map fst qualifs)))
fs -> case nub [f c | f <- fs] of
[tr] -> return tr
{-
ts -> return $ AdHocOverload ts
-- name conflicts resolved as overloading in TypeCheck.RConcrete AR 31/1/2014
-- the old definition is below and still presupposed in TypeCheck.Concrete
-}
ts@(t:_) -> do checkWarn ("atomic term" <+> ppTerm Qualified 0 t0 $$
"conflict" <+> hsep (punctuate ',' (map (ppTerm Qualified 0) ts)) $$
"given" <+> fsep (punctuate ',' (map fst qualifs)))
return t
case Map.lookup c act of
Just f -> return (f c)
_ -> case mapMaybe (Map.lookup c) opens of
[f] -> return (f c)
[] -> alt c ("constant not found:" <+> c $$
"given" <+> fsep (punctuate ',' (map fst qualifs)))
fs -> case nub [f c | f <- fs] of
[tr] -> return tr
ts@(t:_) -> do checkWarn ("atomic term" <+> ppTerm Qualified 0 t0 $$
"conflict" <+> hsep (punctuate ',' (map (ppTerm Qualified 0) ts)) $$
"given" <+> fsep (punctuate ',' (map fst qualifs)))
return t
-- a warning will be generated in CheckGrammar, and the head returned
-- in next V:
-- Bad $ "conflicting imports:" +++ unwords (map prt ts)
info2status :: Maybe ModuleName -> (Ident,Info) -> StatusInfo
info2status mq (c,i) = case i of
info2status :: Maybe ModuleName -> Ident -> Info -> StatusInfo
info2status mq c i = case i of
AbsFun _ _ Nothing _ -> maybe Con (curry QC) mq
ResValue _ -> maybe Con (curry QC) mq
ResParam _ _ -> maybe Con (curry QC) mq
@@ -124,10 +116,10 @@ info2status mq (c,i) = case i of
AnyInd False m -> maybe Cn (const (curry Q m)) mq
_ -> maybe Cn (curry Q) mq
tree2status :: OpenSpec -> BinTree Ident Info -> BinTree Ident StatusInfo
tree2status :: OpenSpec -> Map.Map Ident Info -> StatusMap
tree2status o = case o of
OSimple i -> mapTree (info2status (Just i))
OQualif i j -> mapTree (info2status (Just j))
OSimple i -> Map.mapWithKey (info2status (Just i))
OQualif i j -> Map.mapWithKey (info2status (Just j))
buildStatus :: FilePath -> Grammar -> Module -> Check Status
buildStatus cwd gr mo@(m,mi) = checkInModule cwd mi NoLoc empty $ do
@@ -136,14 +128,14 @@ buildStatus cwd gr mo@(m,mi) = checkInModule cwd mi NoLoc empty $ do
ops <- mapM (\o -> lookupModule gr1 (openedModule o) >>= \mi -> return (o,mi)) (mopens mi)
let sts = map modInfo2status (exts++ops)
return (if isModCnc mi
then (emptyBinTree, reverse sts) -- the module itself does not define any names
then (Map.empty, reverse sts) -- the module itself does not define any names
else (self2status m mi,reverse sts)) -- so the empty ident is not needed
modInfo2status :: (OpenSpec,ModuleInfo) -> (OpenSpec, StatusTree)
modInfo2status :: (OpenSpec,ModuleInfo) -> (OpenSpec, StatusMap)
modInfo2status (o,mo) = (o,tree2status o (jments mo))
self2status :: ModuleName -> ModuleInfo -> StatusTree
self2status c m = mapTree (info2status (Just c)) (jments m)
self2status :: ModuleName -> ModuleInfo -> StatusMap
self2status c m = Map.mapWithKey (info2status (Just c)) (jments m)
renameInfo :: FilePath -> Status -> Module -> Ident -> Info -> Check Info
@@ -244,7 +236,7 @@ renamePattern :: Status -> Patt -> Check (Patt,[Ident])
renamePattern env patt =
do r@(p',vs) <- renp patt
let dupl = vs \\ nub vs
unless (null dupl) $ checkError (hang ("[C.4.13] Pattern is not linear:") 4
unless (null dupl) $ checkError (hang ("[C.4.13] Pattern is not linear. All variable names on the left-hand side must be distinct.") 4
patt)
return r
where


@@ -1,3 +1,4 @@
{-# LANGUAGE CPP #-}
module GF.Compile.TypeCheck.ConcreteNew( checkLType, inferLType ) where
-- The code here is based on the paper:
@@ -19,6 +20,7 @@ import GF.Text.Pretty
import Data.List (nub, (\\), tails)
import qualified Data.IntMap as IntMap
import Data.Maybe(fromMaybe,isNothing)
import qualified Control.Monad.Fail as Fail
checkLType :: GlobalEnv -> Term -> Type -> Check (Term, Type)
checkLType ge t ty = runTcM $ do
@@ -646,8 +648,16 @@ instance Monad TcM where
f >>= g = TcM (\ms msgs -> case unTcM f ms msgs of
TcOk x ms msgs -> unTcM (g x) ms msgs
TcFail msgs -> TcFail msgs)
#if !(MIN_VERSION_base(4,13,0))
-- Monad(fail) will be removed in GHC 8.8+
fail = Fail.fail
#endif
instance Fail.MonadFail TcM where
fail = tcError . pp
instance Applicative TcM where
pure = return
(<*>) = ap


@@ -127,8 +127,12 @@ inferLType gr g trm = case trm of
ty <- if isWildIdent z
then return val
else substituteLType [(bt,z,a')] val
return (App f' a',ty)
_ -> checkError ("A function type is expected for" <+> ppTerm Unqualified 0 f <+> "instead of type" <+> ppType fty)
return (App f' a',ty)
_ ->
let term = ppTerm Unqualified 0 f
funName = pp . head . words . render $ term
in checkError ("A function type is expected for" <+> term <+> "instead of type" <+> ppType fty $$
"\n ** Maybe you gave too many arguments to" <+> funName <+> "\n")
S f x -> do
(f', fty) <- inferLType gr g f
@@ -220,8 +224,14 @@ inferLType gr g trm = case trm of
return (RecType (zip ls ts'), typeType)
ExtR r s -> do
(r',rT) <- inferLType gr g r
--- over <- getOverload gr g Nothing r
--- let r1 = maybe r fst over
let r1 = r ---
(r',rT) <- inferLType gr g r1
rT' <- computeLType gr g rT
(s',sT) <- inferLType gr g s
sT' <- computeLType gr g sT
@@ -327,7 +337,7 @@ getOverload gr g mt ot = case appForm ot of
v <- matchOverload f typs ttys
return $ Just v
_ -> return Nothing
where
collectOverloads tr@(Q c) = case lookupOverload gr c of
Ok typs -> typs
@@ -395,7 +405,7 @@ getOverload gr g mt ot = case appForm ot of
matchVal mt v = elem mt [Nothing,Just v,Just (unlocked v)]
unlocked v = case v of
RecType fs -> RecType $ filter (not . isLockLabel . fst) fs
RecType fs -> RecType $ filter (not . isLockLabel . fst) (sortRec fs)
_ -> v
---- TODO: accept subtypes
---- TODO: use a trie
@@ -428,7 +438,9 @@ checkLType gr g trm typ0 = do
else do b' <- checkIn (pp "abs") $ substituteLType [(bt',z,Vr x)] b
checkLType gr ((bt,x,a):g) c b'
return $ (Abs bt x c', Prod bt' z a b')
_ -> checkError $ "function type expected instead of" <+> ppType typ
_ -> checkError $ "function type expected instead of" <+> ppType typ $$
"\n ** Double-check that the type signature of the operation" $$
"matches the number of arguments given to it.\n"
App f a -> do
over <- getOverload gr g (Just typ) trm
@@ -506,8 +518,13 @@ checkLType gr g trm typ0 = do
RecType ss -> return $ map fst ss
_ -> checkError ("cannot get labels from" $$ nest 2 (ppTerm Unqualified 0 typ2))
let ll1 = [l | (l,_) <- rr, notElem l ll2]
(r',_) <- checkLType gr g r (RecType [field | field@(l,_) <- rr, elem l ll1])
(s',_) <- checkLType gr g s (RecType [field | field@(l,_) <- rr, elem l ll2])
--- over <- getOverload gr g Nothing r --- this would solve #66 but fail ParadigmsAra. AR 6/7/2020
--- let r1 = maybe r fst over
let r1 = r ---
(r',_) <- checkLType gr g r1 (RecType [field | field@(l,_) <- rr, elem l ll1])
(s',_) <- checkLType gr g s (RecType [field | field@(l,_) <- rr, elem l ll2])
let rec = R ([(l,(Nothing,P r' l)) | l <- ll1] ++ [(l,(Nothing,P s' l)) | l <- ll2])
return (rec, typ)
@@ -638,9 +655,31 @@ checkEqLType gr g t u trm = do
(b,t',u',s) <- checkIfEqLType gr g t u trm
case b of
True -> return t'
False -> checkError $ s <+> "type of" <+> ppTerm Unqualified 0 trm $$
"expected:" <+> ppTerm Qualified 0 t $$ -- ppqType t u $$
"inferred:" <+> ppTerm Qualified 0 u -- ppqType u t
False ->
let inferredType = ppTerm Qualified 0 u
expectedType = ppTerm Qualified 0 t
term = ppTerm Unqualified 0 trm
funName = pp . head . words . render $ term
helpfulMsg =
case (arrows inferredType, arrows expectedType) of
(0,0) -> pp "" -- None of the types is a function
_ -> "\n **" <+>
if expectedType `isLessApplied` inferredType
then "Maybe you gave too few arguments to" <+> funName
else pp "Double-check that type signature and number of arguments match."
in checkError $ s <+> "type of" <+> term $$
"expected:" <+> expectedType $$ -- ppqType t u $$
"inferred:" <+> inferredType $$ -- ppqType u t
helpfulMsg
where
-- count the number of arrows in the prettyprinted term
arrows :: Doc -> Int
arrows = length . filter (=="->") . words . render
-- If prettyprinted type t has fewer arrows than prettyprinted type u,
-- then t is "less applied", and we can print a more helpful error message.
isLessApplied :: Doc -> Doc -> Bool
isLessApplied t u = arrows t < arrows u
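The heuristic above literally counts `->` tokens in the pretty-printed types: whichever side renders with fewer arrows is assumed to be the over-applied one. A standalone sketch of the same idea on `String` instead of `Doc` (a simplification for illustration; the real code applies `render` to the `Doc` first):

```haskell
-- Count "->" tokens in a rendered type; fewer arrows means "less applied".
arrows :: String -> Int
arrows = length . filter (== "->") . words

isLessApplied :: String -> String -> Bool
isLessApplied t u = arrows t < arrows u

main :: IO ()
main = do
  print (arrows "Str -> Str -> Str")          -- 2
  print ("Str" `isLessApplied` "Str -> Str")  -- True
```

It is only a heuristic — it relies on `->` being surrounded by spaces in the rendered output and ignores arrows nested inside record or table types — but that is enough to decide which of the two hint messages to print.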
checkIfEqLType :: SourceGrammar -> Context -> Type -> Type -> Term -> Check (Bool,Type,Type,String)
checkIfEqLType gr g t u trm = do


@@ -27,9 +27,10 @@ import Data.List
import qualified Data.Map as Map
import Control.Monad
import GF.Text.Pretty
import qualified Control.Monad.Fail as Fail
-- | combine a list of definitions into a balanced binary search tree
buildAnyTree :: Monad m => ModuleName -> [(Ident,Info)] -> m (BinTree Ident Info)
buildAnyTree :: Fail.MonadFail m => ModuleName -> [(Ident,Info)] -> m (Map.Map Ident Info)
buildAnyTree m = go Map.empty
where
go map [] = return map
@@ -101,16 +102,17 @@ rebuildModule cwd gr mo@(i,mi@(ModInfo mt stat fs_ me mw ops_ med_ msrc_ env_ js
[] -> return mi{jments=js'}
j0s -> do
m0s <- mapM (lookupModule gr) j0s
let notInM0 c _ = all (not . isInBinTree c . jments) m0s
let js2 = filterBinTree notInM0 js'
let notInM0 c _ = all (not . Map.member c . jments) m0s
let js2 = Map.filterWithKey notInM0 js'
return mi{jments=js2}
_ -> return mi
-- add the instance opens to an incomplete module "with" instances
Just (ext,incl,ops) -> do
let (infs,insts) = unzip ops
let stat' = ifNull MSComplete (const MSIncomplete)
[i | i <- is, notElem i infs]
let stat' = if all (flip elem infs) is
then MSComplete
else MSIncomplete
unless (stat' == MSComplete || stat == MSIncomplete)
(checkError ("module" <+> i <+> "remains incomplete"))
ModInfo mt0 _ fs me' _ ops0 _ fpath _ js <- lookupModule gr ext
@@ -123,8 +125,11 @@ rebuildModule cwd gr mo@(i,mi@(ModInfo mt stat fs_ me mw ops_ med_ msrc_ env_ js
--- check if me is incomplete
let fs1 = fs `addOptions` fs_ -- new flags have priority
let js0 = [(c,globalizeLoc fpath j) | (c,j) <- tree2list js, isInherited incl c]
let js1 = buildTree (tree2list js_ ++ js0)
let js0 = Map.mapMaybeWithKey (\c j -> if isInherited incl c
then Just (globalizeLoc fpath j)
else Nothing)
js
let js1 = Map.union js0 js_
let med1= nub (ext : infs ++ insts ++ med_)
return $ ModInfo mt0 stat' fs1 me Nothing ops1 med1 msrc_ env_ js1
@@ -135,14 +140,14 @@ rebuildModule cwd gr mo@(i,mi@(ModInfo mt stat fs_ me mw ops_ med_ msrc_ env_ js
-- If the extended module is incomplete, its judgements are just copied.
extendMod :: Grammar ->
Bool -> (Module,Ident -> Bool) -> ModuleName ->
BinTree Ident Info -> Check (BinTree Ident Info)
Map.Map Ident Info -> Check (Map.Map Ident Info)
extendMod gr isCompl ((name,mi),cond) base new = foldM try new $ Map.toList (jments mi)
where
try new (c,i0)
| not (cond c) = return new
| otherwise = case Map.lookup c new of
Just j -> case unifyAnyInfo name i j of
Ok k -> return $ updateTree (c,k) new
Ok k -> return $ Map.insert c k new
Bad _ -> do (base,j) <- case j of
AnyInd _ m -> lookupOrigInfo gr (m,c)
_ -> return (base,j)
@@ -155,8 +160,8 @@ extendMod gr isCompl ((name,mi),cond) base new = foldM try new $ Map.toList (jme
nest 4 (ppJudgement Qualified (c,j)) $$
"in module" <+> base)
Nothing-> if isCompl
then return $ updateTree (c,indirInfo name i) new
else return $ updateTree (c,i) new
then return $ Map.insert c (indirInfo name i) new
else return $ Map.insert c i new
where
i = globalizeLoc (msrc mi) i0


@@ -20,6 +20,8 @@ import GF.Infra.Ident(moduleNameS)
import GF.Text.Pretty
import GF.System.Console(TermColors(..),getTermColors)
import qualified Data.ByteString.Lazy as BS
-- Control.Monad.Fail import will become redundant in GHC 8.8+
import qualified Control.Monad.Fail as Fail
-- | Compile the given grammar files and everything they depend on,
-- like 'batchCompile'. This function compiles modules in parallel.
@@ -83,7 +85,7 @@ batchCompile1 lib_dir (opts,filepaths) =
let rel = relativeTo lib_dir cwd
prelude_dir = lib_dir</>"prelude"
gfoDir = flag optGFODir opts
maybe done (D.createDirectoryIfMissing True) gfoDir
maybe (return ()) (D.createDirectoryIfMissing True) gfoDir
{-
liftIO $ writeFile (maybe "" id gfoDir</>"paths")
(unlines . map (unwords . map rel) . nub $ map snd filepaths)
@@ -241,14 +243,14 @@ instance (Functor m,Monad m) => Applicative (CollectOutput m) where
(<*>) = ap
instance Monad m => Monad (CollectOutput m) where
return x = CO (return (done,x))
return x = CO (return (return (),x))
CO m >>= f = CO $ do (o1,x) <- m
let CO m2 = f x
(o2,y) <- m2
return (o1>>o2,y)
instance MonadIO m => MonadIO (CollectOutput m) where
liftIO io = CO $ do x <- liftIO io
return (done,x)
return (return (),x)
instance Output m => Output (CollectOutput m) where
ePutStr s = CO (return (ePutStr s,()))
@@ -256,6 +258,9 @@ instance Output m => Output (CollectOutput m) where
putStrLnE s = CO (return (putStrLnE s,()))
putStrE s = CO (return (putStrE s,()))
instance Fail.MonadFail m => Fail.MonadFail (CollectOutput m) where
fail = CO . fail
instance ErrorMonad m => ErrorMonad (CollectOutput m) where
raise e = CO (raise e)
handle (CO m) h = CO $ handle m (unCO . h)


@@ -21,7 +21,7 @@ import GF.Grammar.Binary(decodeModule,encodeModule)
import GF.Infra.Option
import GF.Infra.UseIO(FullPath,IOE,isGFO,gf2gfo,MonadIO(..),Output(..),putPointE)
import GF.Infra.CheckM(runCheck')
import GF.Data.Operations(ErrorMonad,liftErr,(+++),done)
import GF.Data.Operations(ErrorMonad,liftErr,(+++))
import GF.System.Directory(doesFileExist,getCurrentDirectory,renameFile)
import System.FilePath(makeRelative)
@@ -30,12 +30,13 @@ import qualified Data.Map as Map
import GF.Text.Pretty(render,(<+>),($$)) --Doc,
import GF.System.Console(TermColors(..),getTermColors)
import Control.Monad((<=<))
import qualified Control.Monad.Fail as Fail
type OneOutput = (Maybe FullPath,CompiledModule)
type CompiledModule = Module
compileOne, reuseGFO, useTheSource ::
(Output m,ErrorMonad m,MonadIO m) =>
(Output m,ErrorMonad m,MonadIO m, Fail.MonadFail m) =>
Options -> Grammar -> FullPath -> m OneOutput
-- | Compile a given source file (or just load a .gfo file),
@@ -66,7 +67,7 @@ reuseGFO opts srcgr file =
if flag optTagsOnly opts
then writeTags opts srcgr (gf2gftags opts file) sm1
else done
else return ()
return (Just file,sm)
@@ -137,7 +138,7 @@ compileSourceModule opts cwd mb_gfFile gr =
idump opts pass (dump out)
return (ret out)
maybeM f = maybe done f
maybeM f = maybe (return ()) f
--writeGFO :: Options -> InitPath -> FilePath -> SourceModule -> IOE ()
@@ -158,12 +159,12 @@ writeGFO opts cwd file mo =
--intermOut :: Options -> Dump -> Doc -> IOE ()
intermOut opts d doc
| dump opts d = ePutStrLn (render ("\n\n--#" <+> show d $$ doc))
| otherwise = done
| otherwise = return ()
idump opts pass = intermOut opts (Dump pass) . ppModule Internal
warnOut opts warnings
| null warnings = done
| null warnings = return ()
| otherwise = do t <- getTermColors
ePutStr (blueFg t);ePutStr ws;ePutStrLn (restore t)
where


@@ -16,8 +16,6 @@ import GF.Compile.ReadFiles
import GF.Compile.Update
import GF.Compile.Refresh
import GF.Compile.Coding
import GF.Grammar.Grammar
import GF.Grammar.Lookup
import GF.Grammar.Printer


@@ -13,6 +13,7 @@
-----------------------------------------------------------------------------
{-# LANGUAGE Rank2Types, MultiParamTypeClasses, FlexibleInstances #-}
{-# LANGUAGE CPP #-}
module GF.Data.BacktrackM (
-- * the backtracking state monad
BacktrackM,
@@ -32,6 +33,7 @@ import Data.List
import Control.Applicative
import Control.Monad
import Control.Monad.State.Class
import qualified Control.Monad.Fail as Fail
----------------------------------------------------------------------
-- Combining endomorphisms and continuations
@@ -69,6 +71,12 @@ instance Monad (BacktrackM s) where
return a = BM (\c s b -> c a s b)
BM m >>= k = BM (\c s b -> m (\a s b -> unBM (k a) c s b) s b)
where unBM (BM m) = m
#if !(MIN_VERSION_base(4,13,0))
fail = Fail.fail
#endif
instance Fail.MonadFail (BacktrackM s) where
fail _ = mzero
instance Functor (BacktrackM s) where


@@ -12,10 +12,12 @@
-- hack for BNFC generated files. AR 21/9/2003
-----------------------------------------------------------------------------
{-# LANGUAGE CPP #-}
module GF.Data.ErrM where
import Control.Monad (MonadPlus(..),ap)
import Control.Applicative
import qualified Control.Monad.Fail as Fail
-- | Like 'Maybe' type with error msgs
data Err a = Ok a | Bad String
@@ -33,10 +35,19 @@ fromErr a = err (const a) id
instance Monad Err where
return = Ok
fail = Bad
Ok a >>= f = f a
Bad s >>= f = Bad s
#if !(MIN_VERSION_base(4,13,0))
-- Monad(fail) will be removed in GHC 8.8+
fail = Fail.fail
#endif
instance Fail.MonadFail Err where
fail = Bad
-- | added 2\/10\/2003 by PEB
instance Functor Err where
fmap f (Ok a) = Ok (f a)


@@ -26,16 +26,8 @@ module GF.Data.Operations (
-- ** Checking
checkUnique, unifyMaybeBy, unifyMaybe,
-- ** Monadic operations on lists and pairs
mapPairListM, mapPairsM, pairM,
-- ** Binary search trees; now with FiniteMap
BinTree, emptyBinTree, isInBinTree, --justLookupTree,
lookupTree, --lookupTreeMany,
lookupTreeManyAll, updateTree,
buildTree, filterBinTree,
mapTree, --mapMTree,
tree2list,
-- ** Monadic operations on lists and pairs
mapPairsM, pairM,
-- ** Printing
indent, (+++), (++-), (++++), (+++-), (+++++),
@@ -47,13 +39,8 @@ module GF.Data.Operations (
topoTest, topoTest2,
-- ** Misc
ifNull,
combinations, done, readIntArg, --singleton,
readIntArg,
iterFix, chunks,
{-
-- ** State monad with error; from Agda 6\/11\/2001
STM(..), appSTM, stm, stmr, readSTM, updateSTM, writeSTM,
-}
) where
@@ -66,15 +53,13 @@ import Control.Monad (liftM,liftM2) --,ap
import GF.Data.ErrM
import GF.Data.Relation
import qualified Control.Monad.Fail as Fail
infixr 5 +++
infixr 5 ++-
infixr 5 ++++
infixr 5 +++++
ifNull :: b -> ([a] -> b) -> [a] -> b
ifNull b f xs = if null xs then b else f xs
-- the Error monad
-- | Add msg s to 'Maybe' failures
@@ -82,7 +67,7 @@ maybeErr :: ErrorMonad m => String -> Maybe a -> m a
maybeErr s = maybe (raise s) return
testErr :: ErrorMonad m => Bool -> String -> m ()
testErr cond msg = if cond then done else raise msg
testErr cond msg = if cond then return () else raise msg
errIn :: ErrorMonad m => String -> m a -> m a
errIn msg m = handle m (\s -> raise (s ++++ "OCCURRED IN" ++++ msg))
@@ -90,9 +75,6 @@ errIn msg m = handle m (\s -> raise (s ++++ "OCCURRED IN" ++++ msg))
lookupErr :: (ErrorMonad m,Eq a,Show a) => a -> [(a,b)] -> m b
lookupErr a abs = maybeErr ("Unknown" +++ show a) (lookup a abs)
mapPairListM :: Monad m => ((a,b) -> m c) -> [(a,b)] -> m [(a,c)]
mapPairListM f xys = mapM (\ p@(x,_) -> liftM ((,) x) (f p)) xys
mapPairsM :: Monad m => (b -> m c) -> [(a,b)] -> m [(a,c)]
mapPairsM f xys = mapM (\ (x,y) -> liftM ((,) x) (f y)) xys
@@ -107,54 +89,16 @@ checkUnique ss = ["overloaded" +++ show s | s <- nub overloads] where
overloaded s = length (filter (==s) ss) > 1
-- | this is what happens when matching two values in the same module
unifyMaybe :: (Eq a, Monad m) => Maybe a -> Maybe a -> m (Maybe a)
unifyMaybe :: (Eq a, Fail.MonadFail m) => Maybe a -> Maybe a -> m (Maybe a)
unifyMaybe = unifyMaybeBy id
unifyMaybeBy :: (Eq b, Monad m) => (a->b) -> Maybe a -> Maybe a -> m (Maybe a)
unifyMaybeBy :: (Eq b, Fail.MonadFail m) => (a->b) -> Maybe a -> Maybe a -> m (Maybe a)
unifyMaybeBy f (Just p1) (Just p2)
| f p1==f p2 = return (Just p1)
| otherwise = fail ""
unifyMaybeBy _ Nothing mp2 = return mp2
unifyMaybeBy _ mp1 _ = return mp1
-- binary search trees
type BinTree a b = Map a b
emptyBinTree :: BinTree a b
emptyBinTree = Map.empty
isInBinTree :: (Ord a) => a -> BinTree a b -> Bool
isInBinTree = Map.member
{-
justLookupTree :: (ErrorMonad m,Ord a) => a -> BinTree a b -> m b
justLookupTree = lookupTree (const [])
-}
lookupTree :: (ErrorMonad m,Ord a) => (a -> String) -> a -> BinTree a b -> m b
lookupTree pr x = maybeErr no . Map.lookup x
where no = "no occurrence of element" +++ pr x
lookupTreeManyAll :: Ord a => (a -> String) -> [BinTree a b] -> a -> [b]
lookupTreeManyAll pr (t:ts) x = case lookupTree pr x t of
Ok v -> v : lookupTreeManyAll pr ts x
_ -> lookupTreeManyAll pr ts x
lookupTreeManyAll pr [] x = []
updateTree :: (Ord a) => (a,b) -> BinTree a b -> BinTree a b
updateTree (a,b) = Map.insert a b
buildTree :: (Ord a) => [(a,b)] -> BinTree a b
buildTree = Map.fromList
mapTree :: ((a,b) -> c) -> BinTree a b -> BinTree a c
mapTree f = Map.mapWithKey (\k v -> f (k,v))
filterBinTree :: Ord a => (a -> b -> Bool) -> BinTree a b -> BinTree a b
filterBinTree = Map.filterWithKey
tree2list :: BinTree a b -> [(a,b)] -- inorder
tree2list = Map.toList
-- printing
indent :: Int -> String -> String
@@ -243,21 +187,6 @@ wrapLines n s@(c:cs) =
l = length w
_ -> s -- give up!!
--- optWrapLines = if argFlag "wraplines" True then wrapLines 0 else id
-- | 'combinations' is the same as 'sequence'!!!
-- peb 30\/5-04
combinations :: [[a]] -> [[a]]
combinations t = case t of
[] -> [[]]
aa:uu -> [a:u | a <- aa, u <- combinations uu]
{-
-- | 'singleton' is the same as 'return'!!!
singleton :: a -> [a]
singleton = (:[])
-}
-- | Topological sorting with test of cyclicity
topoTest :: Ord a => [(a,[a])] -> Either [a] [[a]]
topoTest = topologicalSort . mkRel'
@@ -297,46 +226,6 @@ chunks sep ws = case span (/= sep) ws of
readIntArg :: String -> Int
readIntArg n = if (not (null n) && all isDigit n) then read n else 0
{-
-- state monad with error; from Agda 6/11/2001
newtype STM s a = STM (s -> Err (a,s))
appSTM :: STM s a -> s -> Err (a,s)
appSTM (STM f) s = f s
stm :: (s -> Err (a,s)) -> STM s a
stm = STM
stmr :: (s -> (a,s)) -> STM s a
stmr f = stm (\s -> return (f s))
instance Functor (STM s) where fmap = liftM
instance Applicative (STM s) where
pure = return
(<*>) = ap
instance Monad (STM s) where
return a = STM (\s -> return (a,s))
STM c >>= f = STM (\s -> do
(x,s') <- c s
let STM f' = f x
f' s')
readSTM :: STM s s
readSTM = stmr (\s -> (s,s))
updateSTM :: (s -> s) -> STM s ()
updateSTM f = stmr (\s -> ((),f s))
writeSTM :: s -> STM s ()
writeSTM s = stmr (const ((),s))
-}
-- | @return ()@
done :: Monad m => m ()
done = return ()
class (Functor m,Monad m) => ErrorMonad m where
raise :: String -> m a
handle :: m a -> (String -> m a) -> m a
@@ -377,4 +266,4 @@ doUntil cond ms = case ms of
v <- a
if cond v then return v else doUntil cond as
_ -> raise "no result"
-}
-}

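The deletion of `combinations` from `GF.Data.Operations` relies on the observation already noted in its comment: at the list monad, it coincides with the Prelude's `sequence`, which is why every call site in the later diffs is rewritten to `sequence`. The deleted definition, reproduced for comparison:

```haskell
-- The old GF.Data.Operations.combinations: cartesian product of a list
-- of lists, building each result front-to-back. This is exactly what
-- sequence does when specialised to the list monad.
combinations :: [[a]] -> [[a]]
combinations []      = [[]]
combinations (aa:uu) = [ a:u | a <- aa, u <- combinations uu ]
```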

@@ -11,6 +11,7 @@
module GF.Grammar.Canonical where
import Prelude hiding ((<>))
import GF.Text.Pretty
import GF.Infra.Ident (RawIdent)
-- | A Complete grammar
data Grammar = Grammar Abstract [Concrete] deriving Show
@@ -126,7 +127,7 @@ data FlagValue = Str String | Int Int | Flt Double deriving Show
-- *** Identifiers
type Id = String
type Id = RawIdent
data QualId = Qual ModId Id | Unqual Id deriving (Eq,Ord,Show)
--------------------------------------------------------------------------------
@@ -265,7 +266,6 @@ instance PPA LinPattern where
RecordPattern r -> block r
TuplePattern ps -> "<"<>punctuate "," ps<>">"
WildPattern -> pp "_"
_ -> parens p
instance RhsSeparator LinPattern where rhsSep _ = pp "="


@@ -6,6 +6,8 @@ import Text.JSON
import Control.Applicative ((<|>))
import Data.Ratio (denominator, numerator)
import GF.Grammar.Canonical
import Control.Monad (guard)
import GF.Infra.Ident (RawIdent,showRawIdent,rawIdentS)
encodeJSON :: FilePath -> Grammar -> IO ()
@@ -28,7 +30,7 @@ instance JSON Grammar where
-- ** Abstract Syntax
instance JSON Abstract where
showJSON (Abstract absid flags cats funs)
showJSON (Abstract absid flags cats funs)
= makeObj [("abs", showJSON absid),
("flags", showJSON flags),
("cats", showJSON cats),
@@ -80,7 +82,7 @@ instance JSON TypeBinding where
-- ** Concrete syntax
instance JSON Concrete where
showJSON (Concrete cncid absid flags params lincats lins)
showJSON (Concrete cncid absid flags params lincats lins)
= makeObj [("cnc", showJSON cncid),
("abs", showJSON absid),
("flags", showJSON flags),
@@ -126,10 +128,10 @@ instance JSON LinType where
-- records are encoded as records:
showJSON (RecordType rows) = showJSON rows
readJSON o = do "Str" <- readJSON o; return StrType
<|> do "Float" <- readJSON o; return FloatType
<|> do "Int" <- readJSON o; return IntType
<|> do ptype <- readJSON o; return (ParamType ptype)
readJSON o = StrType <$ parseString "Str" o
<|> FloatType <$ parseString "Float" o
<|> IntType <$ parseString "Int" o
<|> ParamType <$> readJSON o
<|> TableType <$> o!".tblarg" <*> o!".tblval"
<|> TupleType <$> o!".tuple"
<|> RecordType <$> readJSON o
@@ -186,7 +188,7 @@ instance JSON LinPattern where
-- and records as records:
showJSON (RecordPattern r) = showJSON r
readJSON o = do "_" <- readJSON o; return WildPattern
readJSON o = do p <- parseString "_" o; return WildPattern
<|> do p <- readJSON o; return (ParamPattern (Param p []))
<|> ParamPattern <$> readJSON o
<|> RecordPattern <$> readJSON o
@@ -203,12 +205,12 @@ instance JSON a => JSON (RecordRow a) where
-- record rows and lists of record rows are both encoded as JSON records (i.e., objects)
showJSON row = showJSONs [row]
showJSONs rows = makeObj (map toRow rows)
where toRow (RecordRow (LabelId lbl) val) = (lbl, showJSON val)
where toRow (RecordRow (LabelId lbl) val) = (showRawIdent lbl, showJSON val)
readJSON obj = head <$> readJSONs obj
readJSONs obj = mapM fromRow (assocsJSObject obj)
where fromRow (lbl, jsvalue) = do value <- readJSON jsvalue
return (RecordRow (LabelId lbl) value)
return (RecordRow (LabelId (rawIdentS lbl)) value)
instance JSON rhs => JSON (TableRow rhs) where
showJSON (TableRow l v) = makeObj [(".pattern", showJSON l), (".value", showJSON v)]
@@ -218,43 +220,47 @@ instance JSON rhs => JSON (TableRow rhs) where
-- *** Identifiers in Concrete Syntax
instance JSON PredefId where showJSON (PredefId s) = showJSON s ; readJSON = fmap PredefId . readJSON
instance JSON LabelId where showJSON (LabelId s) = showJSON s ; readJSON = fmap LabelId . readJSON
instance JSON VarValueId where showJSON (VarValueId s) = showJSON s ; readJSON = fmap VarValueId . readJSON
instance JSON ParamId where showJSON (ParamId s) = showJSON s ; readJSON = fmap ParamId . readJSON
instance JSON ParamType where showJSON (ParamTypeId s) = showJSON s ; readJSON = fmap ParamTypeId . readJSON
instance JSON PredefId where showJSON (PredefId s) = showJSON s ; readJSON = fmap PredefId . readJSON
instance JSON LabelId where showJSON (LabelId s) = showJSON s ; readJSON = fmap LabelId . readJSON
instance JSON VarValueId where showJSON (VarValueId s) = showJSON s ; readJSON = fmap VarValueId . readJSON
instance JSON ParamId where showJSON (ParamId s) = showJSON s ; readJSON = fmap ParamId . readJSON
instance JSON ParamType where showJSON (ParamTypeId s) = showJSON s ; readJSON = fmap ParamTypeId . readJSON
--------------------------------------------------------------------------------
-- ** Used in both Abstract and Concrete Syntax
instance JSON ModId where showJSON (ModId s) = showJSON s ; readJSON = fmap ModId . readJSON
instance JSON CatId where showJSON (CatId s) = showJSON s ; readJSON = fmap CatId . readJSON
instance JSON FunId where showJSON (FunId s) = showJSON s ; readJSON = fmap FunId . readJSON
instance JSON ModId where showJSON (ModId s) = showJSON s ; readJSON = fmap ModId . readJSON
instance JSON CatId where showJSON (CatId s) = showJSON s ; readJSON = fmap CatId . readJSON
instance JSON FunId where showJSON (FunId s) = showJSON s ; readJSON = fmap FunId . readJSON
instance JSON VarId where
-- the anonymous variable is the underscore:
showJSON Anonymous = showJSON "_"
showJSON (VarId x) = showJSON x
readJSON o = do "_" <- readJSON o; return Anonymous
readJSON o = do parseString "_" o; return Anonymous
<|> VarId <$> readJSON o
instance JSON QualId where
showJSON (Qual (ModId m) n) = showJSON (m++"."++n)
showJSON (Qual (ModId m) n) = showJSON (showRawIdent m++"."++showRawIdent n)
showJSON (Unqual n) = showJSON n
readJSON o = do qualid <- readJSON o
let (mod, id) = span (/= '.') qualid
return $ if null mod then Unqual id else Qual (ModId mod) id
return $ if null mod then Unqual (rawIdentS id) else Qual (ModId (rawIdentS mod)) (rawIdentS id)
instance JSON RawIdent where
showJSON i = showJSON $ showRawIdent i
readJSON o = rawIdentS <$> readJSON o
instance JSON Flags where
-- flags are encoded directly as JSON records (i.e., objects):
showJSON (Flags fs) = makeObj [(f, showJSON v) | (f, v) <- fs]
showJSON (Flags fs) = makeObj [(showRawIdent f, showJSON v) | (f, v) <- fs]
readJSON obj = Flags <$> mapM fromRow (assocsJSObject obj)
where fromRow (lbl, jsvalue) = do value <- readJSON jsvalue
return (lbl, value)
return (rawIdentS lbl, value)
instance JSON FlagValue where
-- flag values are encoded as basic JSON types:
@@ -268,6 +274,9 @@ instance JSON FlagValue where
--------------------------------------------------------------------------------
-- ** Convenience functions
parseString :: String -> JSValue -> Result ()
parseString s o = guard . (== s) =<< readJSON o
(!) :: JSON a => JSValue -> String -> Result a
obj ! key = maybe (fail $ "CanonicalJSON.(!): Could not find key: " ++ show key)
readJSON

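The `readJSON` rewrites above replace refutable do-block patterns (which depend on `MonadFail`) with explicit alternatives: each candidate is tried via a `parseString` guard and combined with `<|>`. A sketch of that style, with `Maybe` standing in for `Text.JSON`'s `Result` type (an assumption for self-containedness; the real code uses `Result` and a richer `LinType`):

```haskell
import Control.Applicative ((<|>))
import Control.Monad (guard)

-- Cut-down stand-in for GF.Grammar.Canonical.LinType.
data LinType = StrType | FloatType | IntType deriving (Eq, Show)

-- Succeed only if the input is exactly the expected string,
-- mirroring the parseString helper added in CanonicalJSON.
parseString :: String -> String -> Maybe ()
parseString s o = guard (o == s)

-- Try each alternative in turn; the first match wins.
readLinType :: String -> Maybe LinType
readLinType o = StrType   <$ parseString "Str"   o
            <|> FloatType <$ parseString "Float" o
            <|> IntType   <$ parseString "Int"   o
```

Compared with `do "Str" <- readJSON o; return StrType`, this version makes the failure path explicit and needs no `MonadFail` constraint.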

@@ -1,5 +1,6 @@
-- -*- haskell -*-
{
{-# LANGUAGE CPP #-}
module GF.Grammar.Lexer
( Token(..), Posn(..)
, P, runP, runPartial, token, lexer, getPosn, failLoc
@@ -18,6 +19,7 @@ import qualified Data.Map as Map
import Data.Word(Word8)
import Data.Char(readLitChar)
--import Debug.Trace(trace)
import qualified Control.Monad.Fail as Fail
}
@@ -33,7 +35,7 @@ $u = [.\n] -- universal: any character
:-
"--" [.]* ; -- Toss single line comments
"{-" ([$u # \-] | \- [$u # \}])* ("-")+ "}" ;
"{-" ([$u # \-] | \- [$u # \}])* ("-")+ "}" ;
$white+ ;
@rsyms { tok ident }
@@ -136,7 +138,7 @@ data Token
res = eitherResIdent
eitherResIdent :: (Ident -> Token) -> Ident -> Token
eitherResIdent tv s =
eitherResIdent tv s =
case Map.lookup s resWords of
Just t -> t
Nothing -> tv s
@@ -282,8 +284,16 @@ instance Monad P where
(P m) >>= k = P $ \ s -> case m s of
POk s a -> unP (k a) s
PFailed posn err -> PFailed posn err
#if !(MIN_VERSION_base(4,13,0))
-- Monad(fail) will be removed in GHC 8.8+
fail = Fail.fail
#endif
instance Fail.MonadFail P where
fail msg = P $ \(_,AI posn _ _) -> PFailed posn msg
runP :: P a -> BS.ByteString -> Either (Posn,String) a
runP p bs = snd <$> runP' p (Pn 1 0,bs)


@@ -51,11 +51,11 @@ lock c = lockRecType c -- return
unlock c = unlockRecord c -- return
-- to look up a constant etc in a search tree --- why here? AR 29/5/2008
lookupIdent :: ErrorMonad m => Ident -> BinTree Ident b -> m b
lookupIdent :: ErrorMonad m => Ident -> Map.Map Ident b -> m b
lookupIdent c t =
case lookupTree showIdent c t of
Ok v -> return v
Bad _ -> raise ("unknown identifier" +++ showIdent c)
case Map.lookup c t of
Just v -> return v
Nothing -> raise ("unknown identifier" +++ showIdent c)
lookupIdentInfo :: ErrorMonad m => SourceModInfo -> Ident -> m Info
lookupIdentInfo mo i = lookupIdent i (jments mo)
@@ -148,7 +148,7 @@ lookupOrigInfo gr (m,c) = do
allOrigInfos :: Grammar -> ModuleName -> [(QIdent,Info)]
allOrigInfos gr m = fromErr [] $ do
mo <- lookupModule gr m
return [((m,c),i) | (c,_) <- tree2list (jments mo), Ok (m,i) <- [lookupOrigInfo gr (m,c)]]
return [((m,c),i) | (c,_) <- Map.toList (jments mo), Ok (m,i) <- [lookupOrigInfo gr (m,c)]]
lookupParamValues :: ErrorMonad m => Grammar -> QIdent -> m [Term]
lookupParamValues gr c = do
@@ -166,11 +166,11 @@ allParamValues cnc ptyp =
RecType r -> do
let (ls,tys) = unzip $ sortByFst r
tss <- mapM (allParamValues cnc) tys
return [R (zipAssign ls ts) | ts <- combinations tss]
return [R (zipAssign ls ts) | ts <- sequence tss]
Table pt vt -> do
pvs <- allParamValues cnc pt
vvs <- allParamValues cnc vt
return [V pt ts | ts <- combinations (replicate (length pvs) vvs)]
return [V pt ts | ts <- sequence (replicate (length pvs) vvs)]
_ -> raise (render ("cannot find parameter values for" <+> ptyp))
where
-- to normalize records and record types

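Since `BinTree` is now a synonym for `Data.Map.Map`, the custom `lookupTree` wrapper reduces to `Map.lookup` plus an error message, as in the `lookupIdent` rewrite above. A sketch of the resulting shape, with `Either String` standing in for GF's `ErrorMonad` class and `String` for `Ident` (both assumptions for brevity):

```haskell
import qualified Data.Map as Map

-- Post-rewrite lookupIdent: a plain Map lookup lifted into an
-- error-reporting monad with a readable message on failure.
lookupIdent :: String -> Map.Map String b -> Either String b
lookupIdent c t =
  case Map.lookup c t of
    Just v  -> Right v
    Nothing -> Left ("unknown identifier " ++ c)
```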

@@ -5,7 +5,7 @@
-- Stability : (stable)
-- Portability : (portable)
--
-- > CVS $Date: 2005/11/11 16:38:00 $
-- > CVS $Date: 2005/11/11 16:38:00 $
-- > CVS $Author: bringert $
-- > CVS $Revision: 1.24 $
--
@@ -22,17 +22,17 @@ import GF.Data.Operations
import GF.Data.Str
import GF.Infra.Ident
import GF.Grammar.Grammar
--import GF.Grammar.Values
import GF.Grammar.Predef
import GF.Grammar.Printer
import Control.Monad.Identity(Identity(..))
import qualified Data.Traversable as T(mapM)
import qualified Data.Map as Map
import Control.Monad (liftM, liftM2, liftM3)
--import Data.Char (isDigit)
import Data.List (sortBy,nub)
import Data.Monoid
import GF.Text.Pretty(render,(<+>),hsep,fsep)
import qualified Control.Monad.Fail as Fail
-- ** Functions for constructing and analysing source code terms.
@@ -51,14 +51,14 @@ typeForm t =
_ -> error (render ("no normal form of type" <+> ppTerm Unqualified 0 t))
typeFormCnc :: Type -> (Context, Type)
typeFormCnc t =
typeFormCnc t =
case t of
Prod b x a t -> let (x', v) = typeFormCnc t
in ((b,x,a):x',v)
_ -> ([],t)
valCat :: Type -> Cat
valCat typ =
valCat typ =
let (_,cat,_) = typeForm typ
in cat
@@ -99,7 +99,7 @@ isHigherOrderType t = fromErr True $ do -- pessimistic choice
contextOfType :: Monad m => Type -> m Context
contextOfType typ = case typ of
Prod b x a t -> liftM ((b,x,a):) $ contextOfType t
_ -> return []
_ -> return []
termForm :: Monad m => Term -> m ([(BindType,Ident)], Term, [Term])
termForm t = case t of
@@ -108,8 +108,8 @@ termForm t = case t of
return ((b,x):x', fun, args)
App c a ->
do (_,fun, args) <- termForm c
return ([],fun,args ++ [a])
_ ->
return ([],fun,args ++ [a])
_ ->
return ([],t,[])
termFormCnc :: Term -> ([(BindType,Ident)], Term)
@@ -238,7 +238,7 @@ isPredefConstant t = case t of
Q (mod,_) | mod == cPredef || mod == cPredefAbs -> True
_ -> False
checkPredefError :: Monad m => Term -> m Term
checkPredefError :: Fail.MonadFail m => Term -> m Term
checkPredefError t =
case t of
Error s -> fail ("Error: "++s)
@@ -254,7 +254,7 @@ mkTable :: [Term] -> Term -> Term
mkTable tt t = foldr Table t tt
mkCTable :: [(BindType,Ident)] -> Term -> Term
mkCTable ids v = foldr ccase v ids where
mkCTable ids v = foldr ccase v ids where
ccase (_,x) t = T TRaw [(PV x,t)]
mkHypo :: Term -> Hypo
@@ -287,7 +287,7 @@ plusRecType t1 t2 = case (t1, t2) of
filter (`elem` (map fst r1)) (map fst r2) of
[] -> return (RecType (r1 ++ r2))
ls -> raise $ render ("clashing labels" <+> hsep ls)
_ -> raise $ render ("cannot add record types" <+> ppTerm Unqualified 0 t1 <+> "and" <+> ppTerm Unqualified 0 t2)
_ -> raise $ render ("cannot add record types" <+> ppTerm Unqualified 0 t1 <+> "and" <+> ppTerm Unqualified 0 t2)
--plusRecord :: Term -> Term -> Err Term
plusRecord t1 t2 =
@@ -304,7 +304,7 @@ defLinType = RecType [(theLinLabel, typeStr)]
-- | refreshing variables
mkFreshVar :: [Ident] -> Ident
mkFreshVar olds = varX (maxVarIndex olds + 1)
mkFreshVar olds = varX (maxVarIndex olds + 1)
-- | trying to preserve a given symbol
mkFreshVarX :: [Ident] -> Ident -> Ident
@@ -313,7 +313,7 @@ mkFreshVarX olds x = if (elem x olds) then (varX (maxVarIndex olds + 1)) else x
maxVarIndex :: [Ident] -> Int
maxVarIndex = maximum . ((-1):) . map varIndex
mkFreshVars :: Int -> [Ident] -> [Ident]
mkFreshVars :: Int -> [Ident] -> [Ident]
mkFreshVars n olds = [varX (maxVarIndex olds + i) | i <- [1..n]]
-- | quick hack for refining with var in editor
@@ -413,11 +413,11 @@ patt2term pt = case pt of
PC c pp -> mkApp (Con c) (map patt2term pp)
PP c pp -> mkApp (QC c) (map patt2term pp)
PR r -> R [assign l (patt2term p) | (l,p) <- r]
PR r -> R [assign l (patt2term p) | (l,p) <- r]
PT _ p -> patt2term p
PInt i -> EInt i
PFloat i -> EFloat i
PString s -> K s
PString s -> K s
PAs x p -> appCons cAs [Vr x, patt2term p] --- an encoding
PChar -> appCons cChar [] --- an encoding
@@ -436,7 +436,7 @@ composSafeOp op = runIdentity . composOp (return . op)
-- | to define compositional term functions
composOp :: Monad m => (Term -> m Term) -> Term -> m Term
composOp co trm =
composOp co trm =
case trm of
App c a -> liftM2 App (co c) (co a)
Abs b x t -> liftM (Abs b x) (co t)
@@ -552,19 +552,15 @@ strsFromTerm t = case t of
v0 <- mapM (strsFromTerm . fst) vs
c0 <- mapM (strsFromTerm . snd) vs
--let vs' = zip v0 c0
return [strTok (str2strings def) vars |
return [strTok (str2strings def) vars |
def <- d0,
vars <- [[(str2strings v, map sstr c) | (v,c) <- zip vv c0] |
vv <- combinations v0]
vars <- [[(str2strings v, map sstr c) | (v,c) <- zip vv c0] |
vv <- sequence v0]
]
FV ts -> mapM strsFromTerm ts >>= return . concat
Strs ts -> mapM strsFromTerm ts >>= return . concat
Strs ts -> mapM strsFromTerm ts >>= return . concat
_ -> raise (render ("cannot get Str from term" <+> ppTerm Unqualified 0 t))
-- | to print an Str-denoting term as a string; if the term is of wrong type, the error msg
stringFromTerm :: Term -> String
stringFromTerm = err id (ifNull "" (sstr . head)) . strsFromTerm
getTableType :: TInfo -> Err Type
getTableType i = case i of
TTyped ty -> return ty
@@ -594,11 +590,11 @@ noExist = FV []
defaultLinType :: Type
defaultLinType = mkRecType linLabel [typeStr]
-- normalize records and record types; put s first
-- | normalize records and record types; put s first
sortRec :: [(Label,a)] -> [(Label,a)]
sortRec = sortBy ordLabel where
ordLabel (r1,_) (r2,_) =
ordLabel (r1,_) (r2,_) =
case (showIdent (label2ident r1), showIdent (label2ident r2)) of
("s",_) -> LT
(_,"s") -> GT
@@ -608,9 +604,9 @@ sortRec = sortBy ordLabel where
-- | dependency check, detecting circularities and returning topo-sorted list
allDependencies :: (ModuleName -> Bool) -> BinTree Ident Info -> [(Ident,[Ident])]
allDependencies ism b =
[(f, nub (concatMap opty (pts i))) | (f,i) <- tree2list b]
allDependencies :: (ModuleName -> Bool) -> Map.Map Ident Info -> [(Ident,[Ident])]
allDependencies ism b =
[(f, nub (concatMap opty (pts i))) | (f,i) <- Map.toList b]
where
opersIn t = case t of
Q (n,c) | ism n -> [c]
@@ -634,7 +630,7 @@ topoSortJments (m,mi) = do
return
(\cyc -> raise (render ("circular definitions:" <+> fsep (head cyc))))
(topoTest (allDependencies (==m) (jments mi)))
return (reverse [(i,info) | i <- is, Ok info <- [lookupTree showIdent i (jments mi)]])
return (reverse [(i,info) | i <- is, Just info <- [Map.lookup i (jments mi)]])
topoSortJments2 :: ErrorMonad m => SourceModule -> m [[(Ident,Info)]]
topoSortJments2 (m,mi) = do
@@ -644,4 +640,4 @@ topoSortJments2 (m,mi) = do
<+> fsep (head cyc))))
(topoTest2 (allDependencies (==m) (jments mi)))
return
[[(i,info) | i<-is,Ok info<-[lookupTree showIdent i (jments mi)]] | is<-iss]
[[(i,info) | i<-is,Just info<-[Map.lookup i (jments mi)]] | is<-iss]

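The `sortRec` normalisation documented above ("put s first") orders record fields so the linearisation label `s` always sorts before everything else, with the remaining labels alphabetical. A sketch using plain `String` labels (the real code compares `showIdent (label2ident l)`):

```haskell
import Data.List (sortBy)

-- Normalise record (type) fields: the "s" field first, rest alphabetical.
sortRec :: [(String, a)] -> [(String, a)]
sortRec = sortBy ordLabel
  where
    ordLabel (r1, _) (r2, _) =
      case (r1, r2) of
        ("s", _) -> LT
        (_, "s") -> GT
        _        -> compare r1 r2
```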

@@ -24,6 +24,7 @@ import GF.Grammar.Lexer
import GF.Compile.Update (buildAnyTree)
import Data.List(intersperse)
import Data.Char(isAlphaNum)
import qualified Data.Map as Map
import PGF(mkCId)
}
@@ -139,7 +140,7 @@ ModHeader
: ComplMod ModType '=' ModHeaderBody { let { mstat = $1 ;
(mtype,id) = $2 ;
(extends,with,opens) = $4 }
in (id, ModInfo mtype mstat noOptions extends with opens [] "" Nothing emptyBinTree) }
in (id, ModInfo mtype mstat noOptions extends with opens [] "" Nothing Map.empty) }
ComplMod :: { ModuleStatus }
ComplMod


@@ -73,14 +73,13 @@ tryMatch (p,t) = do
t' <- termForm t
trym p t'
where
isInConstantFormt = True -- tested already in matchPattern
trym p t' =
case (p,t') of
-- (_,(x,Typed e ty,y)) -> trym p (x,e,y) -- Add this? /TH 2013-09-05
(_,(x,Empty,y)) -> trym p (x,K [],y) -- because "" = [""] = []
(PW, _) | isInConstantFormt -> return [] -- optimization with wildcard
(PV x, _) | isInConstantFormt -> return [(x,t)]
(PW, _) -> return [] -- optimization with wildcard
(PV x,([],K s,[])) -> return [(x,words2term (words s))]
(PV x, _) -> return [(x,t)]
(PString s, ([],K i,[])) | s==i -> return []
(PInt s, ([],EInt i,[])) | s==i -> return []
(PFloat s,([],EFloat i,[])) | s==i -> return [] --- rounding?
@@ -108,6 +107,10 @@ tryMatch (p,t) = do
return (concat matches)
(PT _ p',_) -> trym p' t'
(PAs x p',([],K s,[])) -> do
subst <- trym p' t'
return $ (x,words2term (words s)) : subst
(PAs x p',_) -> do
subst <- trym p' t'
return $ (x,t) : subst
@@ -132,6 +135,11 @@ tryMatch (p,t) = do
_ -> raise (render ("no match in case expr for" <+> t))
words2term [] = Empty
words2term [w] = K w
words2term (w:ws) = C (K w) (words2term ws)
matchPMSeq (m1,p1) (m2,p2) s = matchPSeq' m1 p1 m2 p2 s
--matchPSeq p1 p2 s = matchPSeq' (0,maxBound::Int) p1 (0,maxBound::Int) p2 s
matchPSeq p1 p2 s = matchPSeq' (lengthBounds p1) p1 (lengthBounds p2) p2 s
@@ -209,4 +217,4 @@ isMatchingForms ps ts = all match (zip ps ts') where
match _ = True
ts' = map appForm ts
-}
-}

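The pattern-matching fix above introduces `words2term`, which splits a multi-word string token into a right-nested concatenation so that `PV` and `PAs` patterns bind token lists rather than raw strings. The helper, with a tiny `Term` stand-in for `GF.Grammar`'s term type (an assumption so the sketch is self-contained):

```haskell
-- Cut-down stand-in for GF.Grammar's Term: empty string, token, concat.
data Term = Empty | K String | C Term Term deriving (Eq, Show)

-- As added in the matcher: "" stays Empty, one word is a token,
-- several words become a right-nested concatenation.
words2term :: [String] -> Term
words2term []     = Empty
words2term [w]    = K w
words2term (w:ws) = C (K w) (words2term ws)
```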

@@ -32,6 +32,7 @@ import System.FilePath(makeRelative)
import Control.Parallel.Strategies(parList,rseq,using)
import Control.Monad(liftM,ap)
import Control.Applicative(Applicative(..))
import qualified Control.Monad.Fail as Fail
type Message = Doc
type Error = Message
@@ -53,6 +54,9 @@ instance Monad Check where
(ws,Success x) -> unCheck (g x) {-ctxt-} ws
(ws,Fail msg) -> (ws,Fail msg)
instance Fail.MonadFail Check where
fail = raise
instance Applicative Check where
pure = return
(<*>) = ap


@@ -5,7 +5,7 @@
-- Stability : (stable)
-- Portability : (portable)
--
-- > CVS $Date: 2005/11/15 11:43:33 $
-- > CVS $Date: 2005/11/15 11:43:33 $
-- > CVS $Author: aarne $
-- > CVS $Revision: 1.8 $
--
@@ -13,18 +13,18 @@
-----------------------------------------------------------------------------
module GF.Infra.Ident (-- ** Identifiers
ModuleName(..), moduleNameS,
Ident, ident2utf8, showIdent, prefixIdent,
-- *** Normal identifiers (returned by the parser)
identS, identC, identW,
-- *** Special identifiers for internal use
identV, identA, identAV,
argIdent, isArgIdent, getArgIndex,
varStr, varX, isWildIdent, varIndex,
-- *** Raw identifiers
RawIdent, rawIdentS, rawIdentC, ident2raw, prefixRawIdent,
isPrefixOf, showRawIdent
) where
ModuleName(..), moduleNameS,
Ident, ident2utf8, showIdent, prefixIdent,
-- *** Normal identifiers (returned by the parser)
identS, identC, identW,
-- *** Special identifiers for internal use
identV, identA, identAV,
argIdent, isArgIdent, getArgIndex,
varStr, varX, isWildIdent, varIndex,
-- *** Raw identifiers
RawIdent, rawIdentS, rawIdentC, ident2raw, prefixRawIdent,
isPrefixOf, showRawIdent
) where
import qualified Data.ByteString.UTF8 as UTF8
import qualified Data.ByteString.Char8 as BS(append,isPrefixOf)
@@ -46,7 +46,7 @@ instance Pretty ModuleName where pp (MN m) = pp m
-- | the constructors labelled /INTERNAL/ are
-- internal representation never returned by the parser
data Ident =
data Ident =
IC {-# UNPACK #-} !RawIdent -- ^ raw identifier after parsing, resolved in Rename
| IW -- ^ wildcard
--
@@ -54,7 +54,7 @@ data Ident =
| IV {-# UNPACK #-} !RawIdent {-# UNPACK #-} !Int -- ^ /INTERNAL/ variable
| IA {-# UNPACK #-} !RawIdent {-# UNPACK #-} !Int -- ^ /INTERNAL/ argument of cat at position
| IAV {-# UNPACK #-} !RawIdent {-# UNPACK #-} !Int {-# UNPACK #-} !Int -- ^ /INTERNAL/ argument of cat with bindings at position
--
--
deriving (Eq, Ord, Show, Read)
-- | Identifiers are stored as UTF-8-encoded bytestrings.
@@ -70,14 +70,13 @@ rawIdentS = Id . pack
rawIdentC = Id
showRawIdent = unpack . rawId2utf8
prefixRawIdent (Id x) (Id y) = Id (BS.append x y)
prefixRawIdent (Id x) (Id y) = Id (BS.append x y)
isPrefixOf (Id x) (Id y) = BS.isPrefixOf x y
instance Binary RawIdent where
put = put . rawId2utf8
get = fmap rawIdentC get
-- | This function should be used with care, since the returned ByteString is
-- UTF-8-encoded.
ident2utf8 :: Ident -> UTF8.ByteString
@@ -88,6 +87,7 @@ ident2utf8 i = case i of
IAV (Id s) b j -> BS.append s (pack ('_':show b ++ '_':show j))
IW -> pack "_"
ident2raw :: Ident -> RawIdent
ident2raw = Id . ident2utf8
showIdent :: Ident -> String
@@ -95,13 +95,14 @@ showIdent i = unpack $! ident2utf8 i
instance Pretty Ident where pp = pp . showIdent
instance Pretty RawIdent where pp = pp . showRawIdent
identS :: String -> Ident
identS = identC . rawIdentS
identC :: RawIdent -> Ident
identW :: Ident
prefixIdent :: String -> Ident -> Ident
prefixIdent pref = identC . Id . BS.append (pack pref) . ident2utf8
@@ -112,7 +113,7 @@ identV :: RawIdent -> Int -> Ident
identA :: RawIdent -> Int -> Ident
identAV:: RawIdent -> Int -> Int -> Ident
(identC, identV, identA, identAV, identW) =
(identC, identV, identA, identAV, identW) =
(IC, IV, IA, IAV, IW)
-- | to mark argument variables


@@ -44,6 +44,7 @@ import Data.Set (Set)
import qualified Data.Set as Set
import PGF.Internal(Literal(..))
import qualified Control.Monad.Fail as Fail
usageHeader :: String
usageHeader = unlines
@@ -131,7 +132,7 @@ data CFGTransform = CFGNoLR
deriving (Show,Eq,Ord)
data HaskellOption = HaskellNoPrefix | HaskellGADT | HaskellLexical
| HaskellConcrete | HaskellVariants
| HaskellConcrete | HaskellVariants | HaskellData
deriving (Show,Eq,Ord)
data Warning = WarnMissingLincat
@@ -348,7 +349,7 @@ optDescr =
"Overrides the value of GF_LIB_PATH.",
Option [] ["src","force-recomp"] (NoArg (recomp AlwaysRecomp))
"Always recompile from source.",
Option [] ["gfo","recomp-if-newer"] (NoArg (recomp RecompIfNewer))
Option [] ["recomp-if-newer"] (NoArg (recomp RecompIfNewer))
"(default) Recompile from source if the source is newer than the .gfo file.",
Option [] ["gfo","no-recomp"] (NoArg (recomp NeverRecomp))
"Never recompile from source, if there is already .gfo file.",
@@ -530,7 +531,8 @@ haskellOptionNames =
("gadt", HaskellGADT),
("lexical", HaskellLexical),
("concrete", HaskellConcrete),
("variants", HaskellVariants)]
("variants", HaskellVariants),
("data", HaskellData)]
-- | This is for backward compatibility. Since GHC 6.12 we
-- started using the native Unicode support in GHC but it
@@ -547,7 +549,7 @@ lookupShow xs z = fromMaybe "lookupShow" $ lookup z [(y,x) | (x,y) <- xs]
lookupReadsPrec :: [(String,a)] -> Int -> ReadS a
lookupReadsPrec xs _ s = [(z,rest) | (x,rest) <- lex s, (y,z) <- xs, y == x]
onOff :: Monad m => (Bool -> m a) -> Bool -> ArgDescr (m a)
onOff :: Fail.MonadFail m => (Bool -> m a) -> Bool -> ArgDescr (m a)
onOff f def = OptArg g "[on,off]"
where g ma = maybe (return def) readOnOff ma >>= f
readOnOff x = case map toLower x of
@@ -555,7 +557,7 @@ onOff f def = OptArg g "[on,off]"
"off" -> return False
_ -> fail $ "Expected [on,off], got: " ++ show x
readOutputFormat :: Monad m => String -> m OutputFormat
readOutputFormat :: Fail.MonadFail m => String -> m OutputFormat
readOutputFormat s =
maybe (fail $ "Unknown output format: " ++ show s) return $ lookup s outputFormats

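The `onOff`/`readOutputFormat` signature changes above only relax `Monad m` to `Fail.MonadFail m`; the parsing logic is unchanged. The core of the on/off reader can be sketched with `Either String` standing in for the constrained monad (an assumption; the real code is polymorphic in `m`):

```haskell
import Data.Char (toLower)

-- Case-insensitive on/off flag reader, as in GF.Infra.Option's onOff;
-- failure carries the same diagnostic message as the original `fail`.
readOnOff :: String -> Either String Bool
readOnOff x = case map toLower x of
  "on"  -> Right True
  "off" -> Right False
  _     -> Left ("Expected [on,off], got: " ++ show x)
```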

@@ -42,6 +42,7 @@ import qualified GF.Command.Importing as GF(importGrammar, importSource)
#ifdef C_RUNTIME
import qualified PGF2
#endif
import qualified Control.Monad.Fail as Fail
-- * The SIO monad
@@ -58,6 +59,9 @@ instance Monad SIO where
return x = SIO (const (return x))
SIO m1 >>= xm2 = SIO $ \ h -> m1 h >>= \ x -> unS (xm2 x) h
instance Fail.MonadFail SIO where
fail = lift0 . fail
instance Output SIO where
ePutStr = lift0 . ePutStr
ePutStrLn = lift0 . ePutStrLn


@@ -159,6 +159,9 @@ instance ErrorMonad IO where
then h (ioeGetErrorString e)
else ioError e
{-
-- Control.Monad.Fail import will become redundant in GHC 8.8+
import qualified Control.Monad.Fail as Fail
instance Functor IOE where fmap = liftM
instance Applicative IOE where
@@ -170,7 +173,15 @@ instance Monad IOE where
IOE c >>= f = IOE $ do
x <- c -- Err a
appIOE $ err raise f x -- f :: a -> IOE a
#if !(MIN_VERSION_base(4,13,0))
fail = raise
#endif
instance Fail.MonadFail IOE where
fail = raise
-}
-- | Print the error message and return a default value if the IO operation 'fail's


@@ -1,10 +1,10 @@
{-# LANGUAGE CPP, ScopedTypeVariables, FlexibleInstances #-}
-- | GF interactive mode
module GF.Interactive (mainGFI,mainRunGFI,mainServerGFI) where
import Prelude hiding (putStrLn,print)
import qualified Prelude as P(putStrLn)
import GF.Command.Interpreter(CommandEnv(..),mkCommandEnv,interpretCommandLine)
--import GF.Command.Importing(importSource,importGrammar)
import GF.Command.Commands(PGFEnv,HasPGFEnv(..),pgf,pgfEnv,pgfCommands)
import GF.Command.CommonCommands(commonCommands,extend)
import GF.Command.SourceCommands
@@ -12,16 +12,13 @@ import GF.Command.CommandInfo
import GF.Command.Help(helpCommand)
import GF.Command.Abstract
import GF.Command.Parse(readCommandLine,pCommand)
import GF.Data.Operations (Err(..),done)
import GF.Data.Operations (Err(..))
import GF.Data.Utilities(whenM,repeatM)
import GF.Grammar hiding (Ident,isPrefixOf)
import GF.Infra.UseIO(ioErrorText,putStrLnE)
import GF.Infra.SIO
import GF.Infra.Option
import qualified System.Console.Haskeline as Haskeline
--import GF.Text.Coding(decodeUnicode,encodeUnicode)
--import GF.Compile.Coding(codeTerm)
import PGF
import PGF.Internal(abstract,funs,lookStartCat,emptyPGF)
@@ -41,6 +38,9 @@ import GF.Server(server)
#endif
import GF.Command.Messages(welcome)
import GF.Infra.UseIO (Output)
-- Provides an orphan instance of MonadFail for StateT in ghc versions < 8
import Control.Monad.Trans.Instances ()
-- | Run the GF Shell in quiet mode (@gf -run@).
mainRunGFI :: Options -> [FilePath] -> IO ()
@@ -102,7 +102,7 @@ timeIt act =
-- | Optionally show how much CPU time was used to run an IO action
optionallyShowCPUTime :: (Monad m,MonadSIO m) => Options -> m a -> m a
optionallyShowCPUTime opts act
optionallyShowCPUTime opts act
| not (verbAtLeast opts Normal) = act
| otherwise = do (dt,r) <- timeIt act
liftSIO $ putStrLnFlush $ show (dt `div` 1000000000) ++ " msec"
@@ -165,7 +165,7 @@ execute1' s0 =
do execute . lines =<< lift (restricted (readFile w))
continue
where
execute [] = done
execute [] = return ()
execute (line:lines) = whenM (execute1' line) (execute lines)
execute_history _ =
@@ -290,8 +290,8 @@ importInEnv opts files =
pgf1 <- importGrammar pgf0 opts' files
if (verbAtLeast opts Normal)
then putStrLnFlush $
unwords $ "\nLanguages:" : map showCId (languages pgf1)
else done
unwords $ "\nLanguages:" : map showCId (languages pgf1)
else return ()
return pgf1
tryGetLine = do
@@ -366,7 +366,7 @@ wordCompletion gfenv (left,right) = do
pgf = multigrammar gfenv
cmdEnv = commandenv gfenv
optLang opts = valCIdOpts "lang" (head (languages pgf)) opts
optType opts =
optType opts =
let str = valStrOpts "cat" (showCId $ lookStartCat pgf) opts
in case readType str of
Just ty -> ty
@@ -413,7 +413,7 @@ wc_type = cmd_name
option x y (c :cs)
| isIdent c = option x y cs
| otherwise = cmd x cs
optValue x y ('"':cs) = str x y cs
optValue x y cs = cmd x cs
@@ -431,7 +431,7 @@ wc_type = cmd_name
where
x1 = take (length x - length y - d) x
x2 = takeWhile (\c -> isIdent c || isSpace c || c == '-' || c == '=' || c == '"') x1
cmd = case [x | (x,cs) <- RP.readP_to_S pCommand x2, all isSpace cs] of
[x] -> Just x
_ -> Nothing


@@ -10,16 +10,13 @@ import GF.Command.CommandInfo
import GF.Command.Help(helpCommand)
import GF.Command.Abstract
import GF.Command.Parse(readCommandLine,pCommand)
import GF.Data.Operations (Err(..),done)
import GF.Data.Operations (Err(..))
import GF.Data.Utilities(whenM,repeatM)
import GF.Infra.UseIO(ioErrorText,putStrLnE)
import GF.Infra.SIO
import GF.Infra.Option
import qualified System.Console.Haskeline as Haskeline
--import GF.Text.Coding(decodeUnicode,encodeUnicode)
--import GF.Compile.Coding(codeTerm)
import qualified PGF2 as C
import qualified PGF as H
@@ -167,7 +164,7 @@ execute1' s0 =
continue
where
execute :: [String] -> ShellM ()
execute [] = done
execute [] = return ()
execute (line:lines) = whenM (execute1' line) (execute lines)
execute_history _ =
@@ -282,14 +279,14 @@ importInEnv opts files =
_ | flag optRetainResource opts ->
putStrLnE "Flag -retain is not supported in this shell"
[file] | takeExtensions file == ".pgf" -> importPGF file
[] -> done
[] -> return ()
_ -> do putStrLnE "Can only import one .pgf file"
where
importPGF file =
do gfenv <- get
case multigrammar gfenv of
Just _ -> putStrLnE "Discarding previous grammar"
_ -> done
_ -> return ()
pgf1 <- lift $ readPGF2 file
let gfenv' = gfenv { pgfenv = pgfEnv pgf1 }
when (verbAtLeast opts Normal) $


@@ -6,7 +6,7 @@ import qualified Data.Map as M
import Control.Applicative -- for GHC<7.10
import Control.Monad(when)
import Control.Monad.State(StateT(..),get,gets,put)
import Control.Monad.Error(ErrorT(..),Error(..))
import Control.Monad.Except(ExceptT(..),runExceptT)
import System.Random(randomRIO)
--import System.IO(stderr,hPutStrLn)
import GF.System.Catch(try)
@@ -108,9 +108,9 @@ handle_fcgi execute1 state0 stateM cache =
-- * Request handler
-- | Handler monad
type HM s a = StateT (Q,s) (ErrorT Response IO) a
type HM s a = StateT (Q,s) (ExceptT Response IO) a
run :: HM s Response -> (Q,s) -> IO (s,Response)
run m s = either bad ok =<< runErrorT (runStateT m s)
run m s = either bad ok =<< runExceptT (runStateT m s)
where
bad resp = return (snd s,resp)
ok (resp,(qs,state)) = return (state,resp)
@@ -123,12 +123,12 @@ put_qs qs = do state <- get_state; put (qs,state)
put_state state = do qs <- get_qs; put (qs,state)
err :: Response -> HM s a
err e = StateT $ \ s -> ErrorT $ return $ Left e
err e = StateT $ \ s -> ExceptT $ return $ Left e
hmbracket_ :: IO () -> IO () -> HM s a -> HM s a
hmbracket_ pre post m =
do s <- get
e <- liftIO $ bracket_ pre post $ runErrorT $ runStateT m s
e <- liftIO $ bracket_ pre post $ runExceptT $ runStateT m s
case e of
Left resp -> err resp
Right (a,s) -> do put s;return a
@@ -407,9 +407,6 @@ resp404 path = Response 404 [plain,xo] $ "Not found: "++path++"\n"
resp500 msg = Response 500 [plain,xo] $ "Internal error: "++msg++"\n"
resp501 msg = Response 501 [plain,xo] $ "Not implemented: "++msg++"\n"
instance Error Response where
noMsg = resp500 "no message"
strMsg = resp500
-- * Content types
plain = ct "text/plain" ""


@@ -1,5 +1,5 @@
-- | Lexers and unlexers - they work on space-separated word strings
module GF.Text.Lexing (stringOp,opInEnv) where
module GF.Text.Lexing (stringOp,opInEnv,bindTok) where
import GF.Text.Transliterations


@@ -9,14 +9,24 @@ instance JSON Grammar where
showJSON (Grammar name extends abstract concretes) =
makeObj ["basename".=name, "extends".=extends,
"abstract".=abstract, "concretes".=concretes]
readJSON = error "Grammar.readJSON intentionally not defined"
instance JSON Abstract where
showJSON (Abstract startcat cats funs) =
makeObj ["startcat".=startcat, "cats".=cats, "funs".=funs]
readJSON = error "Abstract.readJSON intentionally not defined"
instance JSON Fun where showJSON (Fun name typ) = signature name typ
instance JSON Param where showJSON (Param name rhs) = definition name rhs
instance JSON Oper where showJSON (Oper name rhs) = definition name rhs
instance JSON Fun where
showJSON (Fun name typ) = signature name typ
readJSON = error "Fun.readJSON intentionally not defined"
instance JSON Param where
showJSON (Param name rhs) = definition name rhs
readJSON = error "Param.readJSON intentionally not defined"
instance JSON Oper where
showJSON (Oper name rhs) = definition name rhs
readJSON = error "Oper.readJSON intentionally not defined"
signature name typ = makeObj ["name".=name,"type".=typ]
definition name rhs = makeObj ["name".=name,"rhs".=rhs]
@@ -26,12 +36,15 @@ instance JSON Concrete where
makeObj ["langcode".=langcode, "opens".=opens,
"params".=params, "opers".=opers,
"lincats".=lincats, "lins".=lins]
readJSON = error "Concrete.readJSON intentionally not defined"
instance JSON Lincat where
showJSON (Lincat cat lintype) = makeObj ["cat".=cat, "type".=lintype]
readJSON = error "Lincat.readJSON intentionally not defined"
instance JSON Lin where
showJSON (Lin fun args lin) = makeObj ["fun".=fun, "args".=args, "lin".=lin]
readJSON = error "Lin.readJSON intentionally not defined"
infix 1 .=
name .= v = (name,showJSON v)


@@ -9,7 +9,7 @@ executable exb.fcgi
main-is: exb-fcgi.hs
Hs-source-dirs: . ../server ../compiler ../runtime/haskell
other-modules: ExampleService ExampleDemo
FastCGIUtils Cache GF.Compile.ToAPI
CGIUtils Cache GF.Compile.ToAPI
-- and a lot more...
ghc-options: -threaded
if impl(ghc>=7.0)
@@ -17,7 +17,7 @@ executable exb.fcgi
build-depends: base >=4.2 && <5, json, cgi, fastcgi, random,
containers, old-time, directory, bytestring, utf8-string,
pretty, array, mtl, fst, filepath
pretty, array, mtl, time, filepath
if os(windows)
ghc-options: -optl-mwindows


@@ -1,7 +1,7 @@
lib_LTLIBRARIES = libgu.la libpgf.la libsg.la
lib_LTLIBRARIES = libgu.la libpgf.la
pkgconfigdir = $(libdir)/pkgconfig
pkgconfig_DATA = libgu.pc libpgf.pc libsg.pc
pkgconfig_DATA = libgu.pc libpgf.pc
configincludedir = $(libdir)/libgu/include
@@ -37,10 +37,6 @@ pgfinclude_HEADERS = \
pgf/pgf.h \
pgf/data.h
sgincludedir=$(includedir)/sg
sginclude_HEADERS = \
sg/sg.h
libgu_la_SOURCES = \
gu/assert.c \
gu/bits.c \
@@ -92,12 +88,6 @@ libpgf_la_SOURCES = \
libpgf_la_LDFLAGS = -no-undefined
libpgf_la_LIBADD = libgu.la
libsg_la_SOURCES = \
sg/sqlite3Btree.c \
sg/sg.c
libsg_la_LDFLAGS = -no-undefined
libsg_la_LIBADD = libgu.la libpgf.la
bin_PROGRAMS =
AUTOMAKE_OPTIONS = foreign subdir-objects dist-bzip2
@@ -105,5 +95,4 @@ ACLOCAL_AMFLAGS = -I m4
EXTRA_DIST = \
libgu.pc.in \
libpgf.pc.in \
libsg.pc.in
libpgf.pc.in


@@ -58,7 +58,6 @@ AC_CONFIG_LINKS(pgf/lightning/asm.h:$cpu_dir/asm.h dnl
AC_CONFIG_FILES([Makefile
libgu.pc
libpgf.pc
libsg.pc
])
AC_OUTPUT


@@ -7,6 +7,9 @@
typedef struct GuMapData GuMapData;
#define SKIP_DELETED 1
#define SKIP_NONE 2
struct GuMapData {
uint8_t* keys;
uint8_t* values;
@@ -19,6 +22,7 @@ struct GuMap {
GuHasher* hasher;
size_t key_size;
size_t value_size;
size_t cell_size; // cell_size = GU_MAX(value_size,sizeof(uint8_t))
const void* default_value;
GuMapData data;
@@ -30,9 +34,7 @@ gu_map_finalize(GuFinalizer* fin)
{
GuMap* map = gu_container(fin, GuMap, fin);
gu_mem_buf_free(map->data.keys);
if (map->value_size) {
gu_mem_buf_free(map->data.values);
}
gu_mem_buf_free(map->data.values);
}
static const GuWord gu_map_empty_key = 0;
@@ -68,7 +70,7 @@ gu_map_entry_is_free(GuMap* map, GuMapData* data, size_t idx)
}
static bool
gu_map_lookup(GuMap* map, const void* key, size_t* idx_out)
gu_map_lookup(GuMap* map, const void* key, uint8_t del, size_t* idx_out)
{
size_t n = map->data.n_entries;
if (map->hasher == gu_addr_hasher) {
@@ -78,13 +80,17 @@ gu_map_lookup(GuMap* map, const void* key, size_t* idx_out)
while (true) {
const void* entry_key =
((const void**)map->data.keys)[idx];
if (entry_key == NULL && map->data.zero_idx != idx) {
*idx_out = idx;
return false;
if (map->data.values[idx * map->cell_size] != del) { //skip deleted
*idx_out = idx;
return false;
}
} else if (entry_key == key) {
*idx_out = idx;
return true;
}
idx = (idx + offset) % n;
}
} else if (map->hasher == gu_word_hasher) {
@@ -156,33 +162,18 @@ gu_map_resize(GuMap* map, size_t req_entries)
size_t key_size = map->key_size;
size_t key_alloc = 0;
data->keys = gu_mem_buf_alloc(req_entries * key_size, &key_alloc);
memset(data->keys, 0, key_alloc);
size_t value_size = map->value_size;
size_t value_alloc = 0;
if (value_size) {
data->values = gu_mem_buf_alloc(req_entries * value_size,
&value_alloc);
memset(data->values, 0, value_alloc);
}
data->n_entries = gu_twin_prime_inf(value_size ?
GU_MIN(key_alloc / key_size,
value_alloc / value_size)
: key_alloc / key_size);
if (map->hasher == gu_addr_hasher) {
for (size_t i = 0; i < data->n_entries; i++) {
((const void**)data->keys)[i] = NULL;
}
} else if (map->hasher == gu_string_hasher) {
for (size_t i = 0; i < data->n_entries; i++) {
((GuString*)data->keys)[i] = NULL;
}
} else {
memset(data->keys, 0, key_alloc);
}
size_t cell_size = map->cell_size;
data->values = gu_mem_buf_alloc(req_entries * cell_size, &value_alloc);
memset(data->values, 0, value_alloc);
data->n_entries = gu_twin_prime_inf(
GU_MIN(key_alloc / key_size,
value_alloc / cell_size));
gu_assert(data->n_entries > data->n_occupied);
data->n_occupied = 0;
data->zero_idx = SIZE_MAX;
@@ -196,16 +187,14 @@ gu_map_resize(GuMap* map, size_t req_entries)
} else if (map->hasher == gu_string_hasher) {
old_key = (void*) *(GuString*)old_key;
}
void* old_value = &old_data.values[i * value_size];
void* old_value = &old_data.values[i * cell_size];
memcpy(gu_map_insert(map, old_key),
old_value, map->value_size);
}
gu_mem_buf_free(old_data.keys);
if (value_size) {
gu_mem_buf_free(old_data.values);
}
gu_mem_buf_free(old_data.values);
}
@@ -226,9 +215,9 @@ GU_API void*
gu_map_find(GuMap* map, const void* key)
{
size_t idx;
bool found = gu_map_lookup(map, key, &idx);
bool found = gu_map_lookup(map, key, SKIP_DELETED, &idx);
if (found) {
return &map->data.values[idx * map->value_size];
return &map->data.values[idx * map->cell_size];
}
return NULL;
}
@@ -244,7 +233,7 @@ GU_API const void*
gu_map_find_key(GuMap* map, const void* key)
{
size_t idx;
bool found = gu_map_lookup(map, key, &idx);
bool found = gu_map_lookup(map, key, SKIP_DELETED, &idx);
if (found) {
return &map->data.keys[idx * map->key_size];
}
@@ -255,17 +244,17 @@ GU_API bool
gu_map_has(GuMap* ht, const void* key)
{
size_t idx;
return gu_map_lookup(ht, key, &idx);
return gu_map_lookup(ht, key, SKIP_DELETED, &idx);
}
GU_API void*
gu_map_insert(GuMap* map, const void* key)
{
size_t idx;
bool found = gu_map_lookup(map, key, &idx);
bool found = gu_map_lookup(map, key, SKIP_NONE, &idx);
if (!found) {
if (gu_map_maybe_resize(map)) {
found = gu_map_lookup(map, key, &idx);
found = gu_map_lookup(map, key, SKIP_NONE, &idx);
gu_assert(!found);
}
if (map->hasher == gu_addr_hasher) {
@@ -277,7 +266,7 @@ gu_map_insert(GuMap* map, const void* key)
key, map->key_size);
}
if (map->default_value) {
memcpy(&map->data.values[idx * map->value_size],
memcpy(&map->data.values[idx * map->cell_size],
map->default_value, map->value_size);
}
if (gu_map_entry_is_free(map, &map->data, idx)) {
@@ -286,7 +275,32 @@ gu_map_insert(GuMap* map, const void* key)
}
map->data.n_occupied++;
}
return &map->data.values[idx * map->value_size];
return &map->data.values[idx * map->cell_size];
}
GU_API void
gu_map_delete(GuMap* map, const void* key)
{
size_t idx;
bool found = gu_map_lookup(map, key, SKIP_NONE, &idx);
if (found) {
if (map->hasher == gu_addr_hasher) {
((const void**)map->data.keys)[idx] = NULL;
} else if (map->hasher == gu_string_hasher) {
((GuString*)map->data.keys)[idx] = NULL;
} else {
memset(&map->data.keys[idx * map->key_size],
0, map->key_size);
}
map->data.values[idx * map->cell_size] = SKIP_DELETED;
if (gu_map_buf_is_zero(&map->data.keys[idx * map->key_size],
map->key_size)) {
map->data.zero_idx = SIZE_MAX;
}
map->data.n_occupied--;
}
}
GU_API void
@@ -297,7 +311,7 @@ gu_map_iter(GuMap* map, GuMapItor* itor, GuExn* err)
continue;
}
const void* key = &map->data.keys[i * map->key_size];
void* value = &map->data.values[i * map->value_size];
void* value = &map->data.values[i * map->cell_size];
if (map->hasher == gu_addr_hasher) {
key = *(const void* const*) key;
} else if (map->hasher == gu_string_hasher) {
@@ -307,47 +321,33 @@ gu_map_iter(GuMap* map, GuMapItor* itor, GuExn* err)
}
}
typedef struct {
GuEnum en;
GuMap* ht;
size_t i;
GuMapKeyValue x;
} GuMapEnum;
static void
gu_map_enum_next(GuEnum* self, void* to, GuPool* pool)
GU_API bool
gu_map_next(GuMap* map, size_t* pi, void* pkey, void* pvalue)
{
*((GuMapKeyValue**) to) = NULL;
size_t i;
GuMapEnum* en = (GuMapEnum*) self;
for (i = en->i; i < en->ht->data.n_entries; i++) {
if (gu_map_entry_is_free(en->ht, &en->ht->data, i)) {
while (*pi < map->data.n_entries) {
if (gu_map_entry_is_free(map, &map->data, *pi)) {
(*pi)++;
continue;
}
en->x.key = &en->ht->data.keys[i * en->ht->key_size];
en->x.value = &en->ht->data.values[i * en->ht->value_size];
if (en->ht->hasher == gu_addr_hasher) {
en->x.key = *(const void* const*) en->x.key;
} else if (en->ht->hasher == gu_string_hasher) {
en->x.key = *(GuString*) en->x.key;
}
*((GuMapKeyValue**) to) = &en->x;
break;
if (map->hasher == gu_addr_hasher) {
*((void**) pkey) = *((void**) &map->data.keys[*pi * sizeof(void*)]);
} else if (map->hasher == gu_word_hasher) {
*((GuWord*) pkey) = *((GuWord*) &map->data.keys[*pi * sizeof(GuWord)]);
} else if (map->hasher == gu_string_hasher) {
*((GuString*) pkey) = *((GuString*) &map->data.keys[*pi * sizeof(GuString)]);
} else {
memcpy(pkey, &map->data.keys[*pi * map->key_size], map->key_size);
}
memcpy(pvalue, &map->data.values[*pi * map->cell_size],
map->value_size);
(*pi)++;
return true;
}
en->i = i+1;
}
GU_API GuEnum*
gu_map_enum(GuMap* ht, GuPool* pool)
{
GuMapEnum* en = gu_new(GuMapEnum, pool);
en->en.next = gu_map_enum_next;
en->ht = ht;
en->i = 0;
return &en->en;
return false;
}
GU_API size_t
@@ -363,8 +363,6 @@ gu_map_count(GuMap* map)
return count;
}
static const uint8_t gu_map_no_values[1] = { 0 };
GU_API GuMap*
gu_make_map(size_t key_size, GuHasher* hasher,
size_t value_size, const void* default_value,
@@ -375,7 +373,7 @@ gu_make_map(size_t key_size, GuHasher* hasher,
.n_occupied = 0,
.n_entries = 0,
.keys = NULL,
.values = value_size ? NULL : (uint8_t*) gu_map_no_values,
.values = NULL,
.zero_idx = SIZE_MAX
};
GuMap* map = gu_new(GuMap, pool);
@@ -384,6 +382,7 @@ gu_make_map(size_t key_size, GuHasher* hasher,
map->data = data;
map->key_size = key_size;
map->value_size = value_size;
map->cell_size = GU_MAX(value_size,sizeof(uint8_t));
map->fin.fn = gu_map_finalize;
gu_pool_finally(pool, &map->fin);


@@ -62,6 +62,9 @@ gu_map_has(GuMap* ht, const void* key);
GU_API_DECL void*
gu_map_insert(GuMap* ht, const void* key);
GU_API_DECL void
gu_map_delete(GuMap* ht, const void* key);
#define gu_map_put(MAP, KEYP, V, VAL) \
GU_BEGIN \
V* gu_map_put_p_ = gu_map_insert((MAP), (KEYP)); \
@@ -71,13 +74,8 @@ gu_map_insert(GuMap* ht, const void* key);
GU_API_DECL void
gu_map_iter(GuMap* ht, GuMapItor* itor, GuExn* err);
typedef struct {
const void* key;
void* value;
} GuMapKeyValue;
GU_API_DECL GuEnum*
gu_map_enum(GuMap* ht, GuPool* pool);
GU_API bool
gu_map_next(GuMap* map, size_t* pi, void* pkey, void* pvalue);
typedef GuMap GuIntMap;


@@ -142,14 +142,14 @@ pgf_aligner_lzn_symbol_token(PgfLinFuncs** funcs, PgfToken tok)
}
static void
pgf_aligner_lzn_begin_phrase(PgfLinFuncs** funcs, PgfCId cat, int fid, size_t lindex, PgfCId fun)
pgf_aligner_lzn_begin_phrase(PgfLinFuncs** funcs, PgfCId cat, int fid, GuString ann, PgfCId fun)
{
PgfAlignerLin* alin = gu_container(funcs, PgfAlignerLin, funcs);
gu_buf_push(alin->parent_stack, int, fid);
}
static void
pgf_aligner_lzn_end_phrase(PgfLinFuncs** funcs, PgfCId cat, int fid, size_t lindex, PgfCId fun)
pgf_aligner_lzn_end_phrase(PgfLinFuncs** funcs, PgfCId cat, int fid, GuString ann, PgfCId fun)
{
PgfAlignerLin* alin = gu_container(funcs, PgfAlignerLin, funcs);
gu_buf_pop(alin->parent_stack, int);


@@ -322,7 +322,8 @@ typedef struct PgfProductionCoerce
typedef struct {
PgfExprProb *ep;
GuSeq* lins;
size_t n_lins;
PgfSymbols* lins[];
} PgfProductionExtern;
typedef struct {
@@ -344,8 +345,9 @@ struct PgfCCat {
PgfCncFuns* linrefs;
size_t n_synprods;
PgfProductionSeq* prods;
float viterbi_prob;
prob_t viterbi_prob;
int fid;
int chunk_count;
PgfItemConts* conts;
struct PgfAnswers* answers;
GuFinalizer fin[0];


@@ -918,94 +918,6 @@ pgf_read_expr(GuIn* in, GuPool* pool, GuPool* tmp_pool, GuExn* err)
return expr;
}
PGF_API int
pgf_read_expr_tuple(GuIn* in,
size_t n_exprs, PgfExpr exprs[],
GuPool* pool, GuExn* err)
{
GuPool* tmp_pool = gu_new_pool();
PgfExprParser* parser =
pgf_new_parser(in, pgf_expr_parser_in_getc, pool, tmp_pool, err);
if (parser->token_tag != PGF_TOKEN_LTRIANGLE)
goto fail;
pgf_expr_parser_token(parser, false);
for (size_t i = 0; i < n_exprs; i++) {
if (i > 0) {
if (parser->token_tag != PGF_TOKEN_COMMA)
goto fail;
pgf_expr_parser_token(parser, false);
}
exprs[i] = pgf_expr_parser_expr(parser, false);
if (gu_variant_is_null(exprs[i]))
goto fail;
}
if (parser->token_tag != PGF_TOKEN_RTRIANGLE)
goto fail;
pgf_expr_parser_token(parser, false);
if (parser->token_tag != PGF_TOKEN_EOF)
goto fail;
gu_pool_free(tmp_pool);
return 1;
fail:
gu_pool_free(tmp_pool);
return 0;
}
PGF_API GuSeq*
pgf_read_expr_matrix(GuIn* in,
size_t n_exprs,
GuPool* pool, GuExn* err)
{
GuPool* tmp_pool = gu_new_pool();
PgfExprParser* parser =
pgf_new_parser(in, pgf_expr_parser_in_getc, pool, tmp_pool, err);
if (parser->token_tag != PGF_TOKEN_LTRIANGLE)
goto fail;
pgf_expr_parser_token(parser, false);
GuBuf* buf = gu_new_buf(PgfExpr, pool);
if (parser->token_tag != PGF_TOKEN_RTRIANGLE) {
for (;;) {
PgfExpr* exprs = gu_buf_extend_n(buf, n_exprs);
for (size_t i = 0; i < n_exprs; i++) {
if (i > 0) {
if (parser->token_tag != PGF_TOKEN_COMMA)
goto fail;
pgf_expr_parser_token(parser, false);
}
exprs[i] = pgf_expr_parser_expr(parser, false);
if (gu_variant_is_null(exprs[i]))
goto fail;
}
if (parser->token_tag != PGF_TOKEN_SEMI)
break;
pgf_expr_parser_token(parser, false);
}
if (parser->token_tag != PGF_TOKEN_RTRIANGLE)
goto fail;
}
pgf_expr_parser_token(parser, false);
if (parser->token_tag != PGF_TOKEN_EOF)
goto fail;
gu_pool_free(tmp_pool);
return gu_buf_data_seq(buf);
fail:
gu_pool_free(tmp_pool);
return NULL;
}
PGF_API PgfType*
pgf_read_type(GuIn* in, GuPool* pool, GuPool* tmp_pool, GuExn* err)
{
@@ -1723,19 +1635,6 @@ pgf_print_context(PgfHypos *hypos, PgfPrintContext* ctxt,
}
}
PGF_API void
pgf_print_expr_tuple(size_t n_exprs, PgfExpr exprs[], PgfPrintContext* ctxt,
GuOut* out, GuExn* err)
{
gu_putc('<', out, err);
for (size_t i = 0; i < n_exprs; i++) {
if (i > 0)
gu_putc(',', out, err);
pgf_print_expr(exprs[i], ctxt, 0, out, err);
}
gu_putc('>', out, err);
}
PGF_API bool
pgf_type_eq(PgfType* t1, PgfType* t2)
{
@@ -1771,6 +1670,168 @@ pgf_type_eq(PgfType* t1, PgfType* t2)
return true;
}
PGF_API PgfLiteral
pgf_clone_literal(PgfLiteral lit, GuPool* pool)
{
PgfLiteral new_lit = gu_null_variant;
GuVariantInfo inf = gu_variant_open(lit);
switch (inf.tag) {
case PGF_LITERAL_STR: {
PgfLiteralStr* lit_str = inf.data;
PgfLiteralStr* new_lit_str =
gu_new_flex_variant(PGF_LITERAL_STR,
PgfLiteralStr,
val, strlen(lit_str->val)+1,
&new_lit, pool);
strcpy(new_lit_str->val, lit_str->val);
break;
}
case PGF_LITERAL_INT: {
PgfLiteralInt *lit_int = inf.data;
PgfLiteralInt *new_lit_int =
gu_new_variant(PGF_LITERAL_INT,
PgfLiteralInt,
&new_lit, pool);
new_lit_int->val = lit_int->val;
break;
}
case PGF_LITERAL_FLT: {
PgfLiteralFlt *lit_flt = inf.data;
PgfLiteralFlt *new_lit_flt =
gu_new_variant(PGF_LITERAL_FLT,
PgfLiteralFlt,
&new_lit, pool);
new_lit_flt->val = lit_flt->val;
break;
}
default:
gu_impossible();
}
return new_lit;
}
PGF_API PgfExpr
pgf_clone_expr(PgfExpr expr, GuPool* pool)
{
PgfExpr new_expr = gu_null_variant;
GuVariantInfo inf = gu_variant_open(expr);
switch (inf.tag) {
case PGF_EXPR_ABS: {
PgfExprAbs* abs = inf.data;
PgfExprAbs* new_abs =
gu_new_variant(PGF_EXPR_ABS,
PgfExprAbs,
&new_expr, pool);
new_abs->bind_type = abs->bind_type;
new_abs->id = gu_string_copy(abs->id, pool);
new_abs->body = pgf_clone_expr(abs->body,pool);
break;
}
case PGF_EXPR_APP: {
PgfExprApp* app = inf.data;
PgfExprApp* new_app =
gu_new_variant(PGF_EXPR_APP,
PgfExprApp,
&new_expr, pool);
new_app->fun = pgf_clone_expr(app->fun, pool);
new_app->arg = pgf_clone_expr(app->arg, pool);
break;
}
case PGF_EXPR_LIT: {
PgfExprLit* lit = inf.data;
PgfExprLit* new_lit =
gu_new_variant(PGF_EXPR_LIT,
PgfExprLit,
&new_expr, pool);
new_lit->lit = pgf_clone_literal(lit->lit, pool);
break;
}
case PGF_EXPR_META: {
PgfExprMeta* meta = inf.data;
PgfExprMeta* new_meta =
gu_new_variant(PGF_EXPR_META,
PgfExprMeta,
&new_expr, pool);
new_meta->id = meta->id;
break;
}
case PGF_EXPR_FUN: {
PgfExprFun* fun = inf.data;
PgfExprFun* new_fun =
gu_new_flex_variant(PGF_EXPR_FUN,
PgfExprFun,
fun, strlen(fun->fun)+1,
&new_expr, pool);
strcpy(new_fun->fun, fun->fun);
break;
}
case PGF_EXPR_VAR: {
PgfExprVar* var = inf.data;
PgfExprVar* new_var =
gu_new_variant(PGF_EXPR_VAR,
PgfExprVar,
&new_expr, pool);
new_var->var = var->var;
break;
}
case PGF_EXPR_TYPED: {
PgfExprTyped* typed = inf.data;
PgfExprTyped *new_typed =
gu_new_variant(PGF_EXPR_TYPED,
PgfExprTyped,
&new_expr, pool);
new_typed->expr = pgf_clone_expr(typed->expr, pool);
new_typed->type = pgf_clone_type(typed->type, pool);
break;
}
case PGF_EXPR_IMPL_ARG: {
PgfExprImplArg* impl = inf.data;
PgfExprImplArg *new_impl =
gu_new_variant(PGF_EXPR_IMPL_ARG,
PgfExprImplArg,
&new_expr, pool);
new_impl->expr = pgf_clone_expr(impl->expr, pool);
break;
}
default:
gu_impossible();
}
return new_expr;
}
PGF_API PgfType*
pgf_clone_type(PgfType* type, GuPool* pool)
{
PgfType* new_type =
gu_new_flex(pool, PgfType, exprs, type->n_exprs);
size_t n_hypos = gu_seq_length(type->hypos);
new_type->hypos = gu_new_seq(PgfHypo, n_hypos, pool);
for (size_t i = 0; i < n_hypos; i++) {
PgfHypo* hypo = gu_seq_index(type->hypos, PgfHypo, i);
PgfHypo* new_hypo = gu_seq_index(new_type->hypos, PgfHypo, i);
new_hypo->bind_type = hypo->bind_type;
new_hypo->cid = gu_string_copy(hypo->cid, pool);
new_hypo->type = pgf_clone_type(hypo->type, pool);
}
new_type->cid = gu_string_copy(type->cid, pool);
new_type->n_exprs = type->n_exprs;
for (size_t i = 0; i < new_type->n_exprs; i++) {
new_type->exprs[i] = pgf_clone_expr(type->exprs[i], pool);
}
return new_type;
}
PGF_API prob_t
pgf_compute_tree_probability(PgfPGF *gr, PgfExpr expr)
{


@@ -170,15 +170,6 @@ pgf_expr_unmeta(PgfExpr expr);
PGF_API_DECL PgfExpr
pgf_read_expr(GuIn* in, GuPool* pool, GuPool* tmp_pool, GuExn* err);
PGF_API_DECL int
pgf_read_expr_tuple(GuIn* in,
size_t n_exprs, PgfExpr exprs[],
GuPool* pool, GuExn* err);
PGF_API_DECL GuSeq*
pgf_read_expr_matrix(GuIn* in, size_t n_exprs,
GuPool* pool, GuExn* err);
PGF_API_DECL PgfType*
pgf_read_type(GuIn* in, GuPool* pool, GuPool* tmp_pool, GuExn* err);
@@ -238,9 +229,14 @@ PGF_API_DECL void
pgf_print_context(PgfHypos *hypos, PgfPrintContext* ctxt,
GuOut *out, GuExn *err);
PGF_API_DECL void
pgf_print_expr_tuple(size_t n_exprs, PgfExpr exprs[], PgfPrintContext* ctxt,
GuOut* out, GuExn* err);
PGF_API PgfLiteral
pgf_clone_literal(PgfLiteral lit, GuPool* pool);
PGF_API PgfExpr
pgf_clone_expr(PgfExpr expr, GuPool* pool);
PGF_API PgfType*
pgf_clone_type(PgfType* type, GuPool* pool);
PGF_API_DECL prob_t
pgf_compute_tree_probability(PgfPGF *gr, PgfExpr expr);


@@ -155,7 +155,7 @@ pgf_bracket_lzn_symbol_token(PgfLinFuncs** funcs, PgfToken tok)
}
static void
pgf_bracket_lzn_begin_phrase(PgfLinFuncs** funcs, PgfCId cat, int fid, size_t lindex, PgfCId fun)
pgf_bracket_lzn_begin_phrase(PgfLinFuncs** funcs, PgfCId cat, int fid, GuString ann, PgfCId fun)
{
PgfBracketLznState* state = gu_container(funcs, PgfBracketLznState, funcs);
@@ -192,7 +192,7 @@ pgf_bracket_lzn_begin_phrase(PgfLinFuncs** funcs, PgfCId cat, int fid, size_t li
}
static void
pgf_bracket_lzn_end_phrase(PgfLinFuncs** funcs, PgfCId cat, int fid, size_t lindex, PgfCId fun)
pgf_bracket_lzn_end_phrase(PgfLinFuncs** funcs, PgfCId cat, int fid, GuString ann, PgfCId fun)
{
PgfBracketLznState* state = gu_container(funcs, PgfBracketLznState, funcs);


@@ -606,7 +606,7 @@ typedef struct {
PgfLzrCachedTag tag;
PgfCId cat;
int fid;
int lin_idx;
GuString ann;
PgfCId fun;
} PgfLzrCached;
@@ -644,7 +644,7 @@ pgf_lzr_cache_flush(PgfLzrCache* cache, PgfSymbols* form)
cache->lzr->funcs,
event->cat,
event->fid,
event->lin_idx,
event->ann,
event->fun);
}
break;
@@ -654,7 +654,7 @@ pgf_lzr_cache_flush(PgfLzrCache* cache, PgfSymbols* form)
cache->lzr->funcs,
event->cat,
event->fid,
event->lin_idx,
event->ann,
event->fun);
}
break;
@@ -709,27 +709,27 @@ found:
}
static void
pgf_lzr_cache_begin_phrase(PgfLinFuncs** funcs, PgfCId cat, int fid, size_t lin_idx, PgfCId fun)
pgf_lzr_cache_begin_phrase(PgfLinFuncs** funcs, PgfCId cat, int fid, GuString ann, PgfCId fun)
{
PgfLzrCache* cache = gu_container(funcs, PgfLzrCache, funcs);
PgfLzrCached* event = gu_buf_extend(cache->events);
event->tag = PGF_CACHED_BEGIN;
event->cat = cat;
event->fid = fid;
event->lin_idx = lin_idx;
event->fun = fun;
event->tag = PGF_CACHED_BEGIN;
event->cat = cat;
event->fid = fid;
event->ann = ann;
event->fun = fun;
}
static void
pgf_lzr_cache_end_phrase(PgfLinFuncs** funcs, PgfCId cat, int fid, size_t lin_idx, PgfCId fun)
pgf_lzr_cache_end_phrase(PgfLinFuncs** funcs, PgfCId cat, int fid, GuString ann, PgfCId fun)
{
PgfLzrCache* cache = gu_container(funcs, PgfLzrCache, funcs);
PgfLzrCached* event = gu_buf_extend(cache->events);
event->tag = PGF_CACHED_END;
event->cat = cat;
event->fid = fid;
event->lin_idx = lin_idx;
event->fun = fun;
event->tag = PGF_CACHED_END;
event->cat = cat;
event->fid = fid;
event->ann = ann;
event->fun = fun;
}
static void
@@ -918,7 +918,7 @@ pgf_lzr_linearize_tree(PgfLzr* lzr, PgfCncTree ctree, size_t lin_idx)
if ((*lzr->funcs)->begin_phrase && fapp->ccat != NULL) {
(*lzr->funcs)->begin_phrase(lzr->funcs,
fapp->ccat->cnccat->abscat->name,
fapp->fid, lin_idx,
fapp->fid, fapp->ccat->cnccat->labels[lin_idx],
fapp->abs_id);
}
@@ -928,7 +928,7 @@ pgf_lzr_linearize_tree(PgfLzr* lzr, PgfCncTree ctree, size_t lin_idx)
if ((*lzr->funcs)->end_phrase && fapp->ccat != NULL) {
(*lzr->funcs)->end_phrase(lzr->funcs,
fapp->ccat->cnccat->abscat->name,
fapp->fid, lin_idx,
fapp->fid, fapp->ccat->cnccat->labels[lin_idx],
fapp->abs_id);
}
break;
@@ -957,7 +957,7 @@ pgf_lzr_linearize_tree(PgfLzr* lzr, PgfCncTree ctree, size_t lin_idx)
if ((*lzr->funcs)->begin_phrase && flit->fid >= 0) {
(*lzr->funcs)->begin_phrase(lzr->funcs,
cat, flit->fid, 0,
cat, flit->fid, "s",
"");
}
@@ -989,7 +989,7 @@ pgf_lzr_linearize_tree(PgfLzr* lzr, PgfCncTree ctree, size_t lin_idx)
if ((*lzr->funcs)->end_phrase && flit->fid >= 0) {
(*lzr->funcs)->end_phrase(lzr->funcs,
cat, flit->fid, 0,
cat, flit->fid, "s",
"");
}


@@ -84,10 +84,10 @@ struct PgfLinFuncs
void (*symbol_token)(PgfLinFuncs** self, PgfToken tok);
/// Begin phrase
void (*begin_phrase)(PgfLinFuncs** self, PgfCId cat, int fid, size_t lindex, PgfCId fun);
void (*begin_phrase)(PgfLinFuncs** self, PgfCId cat, int fid, GuString ann, PgfCId fun);
/// End phrase
void (*end_phrase)(PgfLinFuncs** self, PgfCId cat, int fid, size_t lindex, PgfCId fun);
void (*end_phrase)(PgfLinFuncs** self, PgfCId cat, int fid, GuString ann, PgfCId fun);
/// handling nonExist
void (*symbol_ne)(PgfLinFuncs** self);


@@ -6,11 +6,12 @@
static PgfExprProb*
pgf_match_string_lit(PgfLiteralCallback* self, PgfConcr* concr,
size_t lin_idx,
GuString ann,
GuString sentence, size_t* poffset,
GuPool *out_pool)
{
gu_assert(lin_idx == 0);
if (strcmp(ann,"s") != 0)
return NULL;
const uint8_t* buf = (uint8_t*) (sentence + *poffset);
const uint8_t* p = buf;
@@ -51,7 +52,7 @@ pgf_predict_empty_next(GuEnum* self, void* to, GuPool* pool)
static GuEnum*
pgf_predict_empty(PgfLiteralCallback* self, PgfConcr* concr,
size_t lin_idx,
GuString ann,
GuString prefix,
GuPool *out_pool)
{
@@ -67,11 +68,12 @@ static PgfLiteralCallback pgf_string_literal_callback =
static PgfExprProb*
pgf_match_int_lit(PgfLiteralCallback* self, PgfConcr* concr,
size_t lin_idx,
GuString ann,
GuString sentence, size_t* poffset,
GuPool *out_pool)
{
gu_assert(lin_idx == 0);
if (strcmp(ann,"s") != 0)
return NULL;
const uint8_t* buf = (uint8_t*) (sentence + *poffset);
const uint8_t* p = buf;
@@ -121,11 +123,12 @@ static PgfLiteralCallback pgf_int_literal_callback =
static PgfExprProb*
pgf_match_float_lit(PgfLiteralCallback* self, PgfConcr* concr,
size_t lin_idx,
GuString ann,
GuString sentence, size_t* poffset,
GuPool *out_pool)
{
gu_assert(lin_idx == 0);
if (strcmp(ann,"s") != 0)
return NULL;
const uint8_t* buf = (uint8_t*) (sentence + *poffset);
const uint8_t* p = buf;
@@ -226,11 +229,11 @@ pgf_match_name_morpho_callback(PgfMorphoCallback* self_,
static PgfExprProb*
pgf_match_name_lit(PgfLiteralCallback* self, PgfConcr* concr,
size_t lin_idx,
GuString ann,
GuString sentence, size_t* poffset,
GuPool *out_pool)
{
if (lin_idx != 0)
if (strcmp(ann,"s") != 0)
return NULL;
GuPool* tmp_pool = gu_local_pool();
@@ -349,7 +352,7 @@ pgf_match_unknown_morpho_callback(PgfMorphoCallback* self_,
static PgfExprProb*
pgf_match_unknown_lit(PgfLiteralCallback* self, PgfConcr* concr,
size_t lin_idx,
GuString ann,
GuString sentence, size_t* poffset,
GuPool *out_pool)
{


@@ -876,7 +876,7 @@ pgf_lookup_symbol_token(PgfLinFuncs** self, PgfToken token)
}
static void
pgf_lookup_begin_phrase(PgfLinFuncs** self, PgfCId cat, int fid, size_t lindex, PgfCId funname)
pgf_lookup_begin_phrase(PgfLinFuncs** self, PgfCId cat, int fid, GuString ann, PgfCId funname)
{
PgfLookupState* st = gu_container(self, PgfLookupState, funcs);
@@ -890,7 +890,7 @@ pgf_lookup_begin_phrase(PgfLinFuncs** self, PgfCId cat, int fid, size_t lindex,
}
static void
pgf_lookup_end_phrase(PgfLinFuncs** self, PgfCId cat, int fid, size_t lindex, PgfCId fun)
pgf_lookup_end_phrase(PgfLinFuncs** self, PgfCId cat, int fid, GuString ann, PgfCId fun)
{
PgfLookupState* st = gu_container(self, PgfLookupState, funcs);
st->curr_absfun = NULL;

File diff suppressed because it is too large.


@@ -6,7 +6,7 @@
typedef struct {
int start, end;
PgfCId cat;
size_t lin_idx;
GuString ann;
} PgfPhrase;
typedef struct {
@@ -46,14 +46,14 @@ pgf_metrics_lzn_symbol_token(PgfLinFuncs** funcs, PgfToken tok)
}
static void
pgf_metrics_lzn_begin_phrase(PgfLinFuncs** funcs, PgfCId cat, int fid, size_t lin_index, PgfCId fun)
pgf_metrics_lzn_begin_phrase(PgfLinFuncs** funcs, PgfCId cat, int fid, GuString ann, PgfCId fun)
{
PgfMetricsLznState* state = gu_container(funcs, PgfMetricsLznState, funcs);
gu_buf_push(state->marks, int, state->pos);
}
static void
pgf_metrics_lzn_end_phrase1(PgfLinFuncs** funcs, PgfCId cat, int fid, size_t lin_idx, PgfCId fun)
pgf_metrics_lzn_end_phrase1(PgfLinFuncs** funcs, PgfCId cat, int fid, GuString ann, PgfCId fun)
{
PgfMetricsLznState* state = gu_container(funcs, PgfMetricsLznState, funcs);
@@ -65,7 +65,7 @@ pgf_metrics_lzn_end_phrase1(PgfLinFuncs** funcs, PgfCId cat, int fid, size_t lin
phrase->start = start;
phrase->end = end;
phrase->cat = cat;
phrase->lin_idx = lin_idx;
phrase->ann = ann;
gu_buf_push(state->phrases, PgfPhrase*, phrase);
}
}
@@ -85,7 +85,7 @@ pgf_metrics_symbol_bind(PgfLinFuncs** funcs)
}
static void
pgf_metrics_lzn_end_phrase2(PgfLinFuncs** funcs, PgfCId cat, int fid, size_t lin_idx, PgfCId fun)
pgf_metrics_lzn_end_phrase2(PgfLinFuncs** funcs, PgfCId cat, int fid, GuString ann, PgfCId fun)
{
PgfMetricsLznState* state = gu_container(funcs, PgfMetricsLznState, funcs);
@@ -100,7 +100,7 @@ pgf_metrics_lzn_end_phrase2(PgfLinFuncs** funcs, PgfCId cat, int fid, size_t lin
if (phrase->start == start &&
phrase->end == end &&
strcmp(phrase->cat, cat) == 0 &&
phrase->lin_idx == lin_idx) {
strcmp(phrase->ann, ann) == 0) {
state->matches++;
break;
}


@@ -163,6 +163,20 @@ pgf_category_prob(PgfPGF* pgf, PgfCId catname)
return abscat->prob;
}
PGF_API GuString*
pgf_category_fields(PgfConcr* concr, PgfCId catname, size_t *n_lins)
{
PgfCncCat* cnccat =
gu_map_get(concr->cnccats, catname, PgfCncCat*);
if (!cnccat) {
*n_lins = 0;
return NULL;
}
*n_lins = cnccat->n_lins;
return &cnccat->labels;
}
PGF_API GuString
pgf_language_code(PgfConcr* concr)
{
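The new `pgf_category_fields` entry point above returns a pointer to the label array of a concrete category together with the number of linearization fields. A minimal self-contained sketch of the calling pattern, with a stand-in `CncCat` struct in place of the real `PgfConcr`/`PgfCncCat` (which need a loaded grammar):

```c
#include <assert.h>
#include <stddef.h>
#include <string.h>

/* Stand-in for the concrete category record: n_lins field labels. */
typedef struct {
    size_t n_lins;
    const char **labels;
} CncCat;

/* Mirrors the shape of pgf_category_fields: returns the label array
 * and writes the field count through n_lins; NULL (with *n_lins = 0)
 * when the category is unknown. */
static const char **
category_fields(CncCat *cnccat, size_t *n_lins)
{
    if (!cnccat) {
        *n_lins = 0;
        return NULL;
    }
    *n_lins = cnccat->n_lins;
    return cnccat->labels;
}
```

A caller would then iterate `labels[0..n_lins-1]` to map field annotations back to positions.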


@@ -95,6 +95,9 @@ pgf_category_context(PgfPGF *gr, PgfCId catname);
PGF_API_DECL prob_t
pgf_category_prob(PgfPGF* pgf, PgfCId catname);
PGF_API GuString*
pgf_category_fields(PgfConcr* concr, PgfCId catname, size_t *n_lins);
PGF_API_DECL void
pgf_iter_functions(PgfPGF* pgf, GuMapItor* itor, GuExn* err);
@@ -168,8 +171,8 @@ pgf_lookup_morpho(PgfConcr *concr, GuString sentence,
PgfMorphoCallback* callback, GuExn* err);
typedef struct {
size_t pos;
GuString ptr;
size_t pos; // position in Unicode characters
GuString ptr; // pointer into the string
} PgfCohortSpot;
typedef struct {
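The commented `PgfCohortSpot` keeps two views of the same position: `pos` counts Unicode characters while `ptr` indexes bytes in the UTF-8 sentence. A minimal sketch of keeping the two in sync when advancing, using a stand-in `Spot` struct and a byte-length decoder (assuming well-formed UTF-8 input):

```c
#include <assert.h>
#include <stddef.h>

/* Stand-in for PgfCohortSpot: pos counts code points, ptr indexes bytes. */
typedef struct {
    size_t pos;        /* position in Unicode characters */
    const char *ptr;   /* pointer into the UTF-8 string */
} Spot;

/* Advance the spot by one code point: ptr moves by 1-4 bytes depending
 * on the lead byte, pos always moves by exactly one. */
static void
spot_advance(Spot *s)
{
    unsigned char c = (unsigned char) *s->ptr;
    size_t len = c < 0x80 ? 1 : c < 0xE0 ? 2 : c < 0xF0 ? 3 : 4;
    s->ptr += len;
    s->pos += 1;
}
```

This is why the runtime code below pairs every `spot.pos++` with a decode step on `spot.ptr` rather than a plain increment.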
@@ -208,6 +211,12 @@ pgf_parse_with_heuristics(PgfConcr* concr, PgfType* typ,
GuExn* err,
GuPool* pool, GuPool* out_pool);
typedef struct {
size_t start;
size_t end;
GuString field;
} PgfParseRange;
typedef struct PgfOracleCallback PgfOracleCallback;
struct PgfOracleCallback {
@@ -248,11 +257,11 @@ typedef struct PgfLiteralCallback PgfLiteralCallback;
struct PgfLiteralCallback {
PgfExprProb* (*match)(PgfLiteralCallback* self, PgfConcr* concr,
size_t lin_idx,
GuString ann,
GuString sentence, size_t* poffset,
GuPool *out_pool);
GuEnum* (*predict)(PgfLiteralCallback* self, PgfConcr* concr,
size_t lin_idx,
GuString ann,
GuString prefix,
GuPool *out_pool);
};
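With the change above, `PgfLiteralCallback` hooks identify the linearization field by its annotation string (`ann`, e.g. `"s"`) instead of a numeric `lin_idx`. A self-contained sketch of an annotation-keyed match hook, with a stand-in `LitCb` struct and simplified signature in place of the real callback type:

```c
#include <assert.h>
#include <stddef.h>
#include <string.h>

/* Stand-in for PgfLiteralCallback: the match hook is now keyed by the
 * field annotation string rather than a numeric index. */
typedef struct LitCb LitCb;
struct LitCb {
    /* returns nonzero when the callback fires for this field */
    int (*match)(LitCb *self, const char *ann, const char *sentence);
};

static int
match_s_field(LitCb *self, const char *ann, const char *sentence)
{
    (void) self;
    (void) sentence;
    if (strcmp(ann, "s") != 0)
        return 0;  /* only the "s" field is handled, as in the diff above */
    return 1;
}
```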


@@ -844,6 +844,7 @@ pgf_read_fid(PgfReader* rdr, PgfConcr* concr)
ccat->prods = NULL;
ccat->viterbi_prob = 0;
ccat->fid = fid;
ccat->chunk_count = 1;
ccat->conts = NULL;
ccat->answers = NULL;
@@ -1081,6 +1082,7 @@ pgf_read_cnccat(PgfReader* rdr, PgfAbstr* abstr, PgfConcr* concr, PgfCId name)
ccat->prods = NULL;
ccat->viterbi_prob = 0;
ccat->fid = fid;
ccat->chunk_count = 1;
ccat->conts = NULL;
ccat->answers = NULL;


@@ -114,7 +114,7 @@ pgf_morpho_iter(PgfProductionIdx* idx,
PgfCId lemma = entry->papp->fun->absfun->name;
GuString analysis = entry->ccat->cnccat->labels[entry->lin_idx];
prob_t prob = entry->ccat->cnccat->abscat->prob +
entry->papp->fun->absfun->ep.prob;
callback->callback(callback,
@@ -234,12 +234,13 @@ typedef struct {
GuEnum en;
PgfConcr* concr;
GuString sentence;
GuString current;
size_t len;
PgfMorphoCallback* callback;
GuExn* err;
bool case_sensitive;
GuBuf* spots;
GuBuf* skip_spots;
GuBuf* empty_buf;
GuBuf* found;
} PgfCohortsState;
@@ -255,6 +256,23 @@ cmp_cohort_spot(GuOrder* self, const void* a, const void* b)
static GuOrder
pgf_cohort_spot_order[1] = {{ cmp_cohort_spot }};
static void
pgf_lookup_cohorts_report_skip(PgfCohortsState *state,
PgfCohortSpot* spot)
{
size_t n_spots = gu_buf_length(state->skip_spots);
for (size_t i = 0; i < n_spots; i++) {
PgfCohortSpot* skip_spot =
gu_buf_index(state->skip_spots, PgfCohortSpot, i);
PgfCohortRange* range = gu_buf_insert(state->found, 0);
range->start = *skip_spot;
range->end = *spot;
range->buf = state->empty_buf;
}
gu_buf_flush(state->skip_spots);
}
static void
pgf_lookup_cohorts_helper(PgfCohortsState *state, PgfCohortSpot* spot,
int i, int j, ptrdiff_t min, ptrdiff_t max)
@@ -291,18 +309,23 @@ pgf_lookup_cohorts_helper(PgfCohortsState *state, PgfCohortSpot* spot,
pgf_lookup_cohorts_helper(state, spot, i, k-1, min, len);
if (seq->idx != NULL && gu_buf_length(seq->idx) > 0) {
// Report unknown words
pgf_lookup_cohorts_report_skip(state, spot);
// Report the actual hit
PgfCohortRange* range = gu_buf_insert(state->found, 0);
range->start = *spot;
range->end = current;
range->buf = seq->idx;
}
while (*current.ptr != 0) {
if (!skip_space(&current.ptr, &current.pos))
break;
}
// Schedule the next search spot
while (*current.ptr != 0) {
if (!skip_space(&current.ptr, &current.pos))
break;
}
gu_buf_heap_push(state->spots, pgf_cohort_spot_order, &current);
gu_buf_heap_push(state->spots, pgf_cohort_spot_order, &current);
}
if (len <= max)
pgf_lookup_cohorts_helper(state, spot, k+1, j, len, max);
@@ -318,29 +341,67 @@ pgf_lookup_cohorts_enum_next(GuEnum* self, void* to, GuPool* pool)
PgfCohortsState* state = gu_container(self, PgfCohortsState, en);
while (gu_buf_length(state->found) == 0 &&
gu_buf_length(state->spots) > 0) {
gu_buf_length(state->spots) > 0) {
PgfCohortSpot spot;
gu_buf_heap_pop(state->spots, pgf_cohort_spot_order, &spot);
if (spot.ptr == state->current)
continue;
GuString next_ptr = state->sentence+state->len;
while (gu_buf_length(state->spots) > 0) {
GuString ptr =
gu_buf_index(state->spots, PgfCohortSpot, 0)->ptr;
if (ptr > spot.ptr) {
next_ptr = ptr;
break;
}
gu_buf_heap_pop(state->spots, pgf_cohort_spot_order, &spot);
}
if (*spot.ptr == 0)
break;
bool needs_report = true;
while (next_ptr > spot.ptr) {
pgf_lookup_cohorts_helper
(state, &spot,
0, gu_seq_length(state->concr->sequences)-1,
1, (state->sentence+state->len)-spot.ptr);
pgf_lookup_cohorts_helper
(state, &spot,
0, gu_seq_length(state->concr->sequences)-1,
1, (state->sentence+state->len)-spot.ptr);
if (gu_buf_length(state->found) == 0) {
// skip one character and try again
gu_utf8_decode((const uint8_t**) &spot.ptr);
spot.pos++;
gu_buf_heap_push(state->spots, pgf_cohort_spot_order, &spot);
// got a hit -> exit
if (gu_buf_length(state->found) > 0)
break;
if (needs_report) {
// no hit, but the word must be reported as unknown.
gu_buf_push(state->skip_spots, PgfCohortSpot, spot);
needs_report = false;
}
// skip one character
const uint8_t* ptr = (const uint8_t*) spot.ptr;
GuUCS c = gu_utf8_decode(&ptr);
if (gu_ucs_is_space(c)) {
// We have encountered a space and must report
// a new unknown word.
pgf_lookup_cohorts_report_skip(state, &spot);
spot.ptr = (GuString) ptr;
spot.pos++;
// Schedule the next search spot
while (*spot.ptr != 0) {
if (!skip_space(&spot.ptr, &spot.pos))
break;
}
gu_buf_heap_push(state->spots, pgf_cohort_spot_order, &spot);
break;
} else {
spot.ptr = (GuString) ptr;
spot.pos++;
}
}
}
PgfCohortSpot end_spot = {state->len, state->sentence+state->len};
pgf_lookup_cohorts_report_skip(state, &end_spot);
PgfCohortRange* pRes = (PgfCohortRange*)to;
if (gu_buf_length(state->found) == 0) {
@@ -349,15 +410,19 @@ pgf_lookup_cohorts_enum_next(GuEnum* self, void* to, GuPool* pool)
pRes->end.pos = 0;
pRes->end.ptr = NULL;
pRes->buf = NULL;
state->current = NULL;
return;
} else do {
} else for (;;) {
*pRes = gu_buf_pop(state->found, PgfCohortRange);
state->current = pRes->start.ptr;
pgf_morpho_iter(pRes->buf, state->callback, state->err);
} while (gu_buf_length(state->found) > 0 &&
gu_buf_index_last(state->found, PgfCohortRange)->end.ptr == pRes->end.ptr);
if (gu_buf_length(state->found) <= 0)
break;
PgfCohortRange* last =
gu_buf_index_last(state->found, PgfCohortRange);
if (last->start.ptr != pRes->start.ptr ||
last->end.ptr != pRes->end.ptr)
break;
}
}
PGF_API GuEnum*
@@ -374,15 +439,17 @@ pgf_lookup_cohorts(PgfConcr *concr, GuString sentence,
}
PgfCohortsState* state = gu_new(PgfCohortsState, pool);
state->en.next = pgf_lookup_cohorts_enum_next;
state->concr = concr;
state->sentence= sentence;
state->len = strlen(sentence);
state->callback= callback;
state->err = err;
state->case_sensitive = pgf_is_case_sensitive(concr);
state->spots = gu_new_buf(PgfCohortSpot, pool);
state->found = gu_new_buf(PgfCohortRange, pool);
state->en.next = pgf_lookup_cohorts_enum_next;
state->concr = concr;
state->sentence = sentence;
state->len = strlen(sentence);
state->callback = callback;
state->err = err;
state->case_sensitive= pgf_is_case_sensitive(concr);
state->spots = gu_new_buf(PgfCohortSpot, pool);
state->skip_spots = gu_new_buf(PgfCohortSpot, pool);
state->empty_buf = gu_new_buf(PgfProductionIdxEntry, pool);
state->found = gu_new_buf(PgfCohortRange, pool);
PgfCohortSpot spot = {0,sentence};
while (*spot.ptr != 0) {
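The skip-spot machinery added in this file buffers the start positions where matching failed and flushes them as "unknown word" ranges once the next cohort boundary is known. A minimal sketch of that buffer-and-flush pattern, with plain fixed-size arrays standing in for `GuBuf` and a `Range` struct standing in for `PgfCohortRange`:

```c
#include <assert.h>
#include <stddef.h>

typedef struct { size_t start, end; } Range;

typedef struct {
    size_t skip[8];  size_t n_skip;   /* pending unknown-word starts */
    Range  found[8]; size_t n_found;  /* reported ranges */
} State;

/* Mirrors pgf_lookup_cohorts_report_skip: emit one range per buffered
 * skip spot, all ending at the current boundary, then clear the buffer
 * (the gu_buf_flush step). In the real code each range also carries an
 * empty analysis buffer so the word is reported with no morpho hits. */
static void
report_skip(State *st, size_t end)
{
    for (size_t i = 0; i < st->n_skip; i++) {
        Range r = { st->skip[i], end };
        st->found[st->n_found++] = r;
    }
    st->n_skip = 0;
}
```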

File diff suppressed because it is too large


@@ -1,94 +0,0 @@
#ifndef SG_SG_H_
#define SG_SG_H_
typedef long long int SgId;
#include <gu/exn.h>
#include <pgf/pgf.h>
typedef struct SgSG SgSG;
SgSG*
sg_open(const char *filename, GuExn* err);
void
sg_close(SgSG *sg, GuExn* err);
void
sg_begin_trans(SgSG* sg, GuExn* err);
void
sg_commit(SgSG* sg, GuExn* err);
void
sg_rollback(SgSG* sg, GuExn* err);
SgId
sg_insert_expr(SgSG *sg, PgfExpr expr, int wrFlag, GuExn* err);
PgfExpr
sg_get_expr(SgSG *sg, SgId key, GuPool* out_pool, GuExn* err);
typedef struct SgQueryExprResult SgQueryExprResult;
SgQueryExprResult*
sg_query_expr(SgSG *sg, PgfExpr expr, GuPool* pool, GuExn* err);
PgfExpr
sg_query_next(SgSG *sg, SgQueryExprResult* ctxt, SgId* pKey, GuPool* pool, GuExn* err);
void
sg_query_close(SgSG* sg, SgQueryExprResult* ctxt, GuExn* err);
void
sg_update_fts_index(SgSG* sg, PgfPGF* pgf, GuExn* err);
GuSeq*
sg_query_linearization(SgSG *sg, GuString tok, GuPool* pool, GuExn* err);
typedef PgfExpr SgTriple[3];
SgId
sg_insert_triple(SgSG *sg, SgTriple triple, GuExn* err);
int
sg_get_triple(SgSG *sg, SgId key, SgTriple triple,
GuPool* out_pool, GuExn* err);
typedef struct SgTripleResult SgTripleResult;
SgTripleResult*
sg_query_triple(SgSG *sg, SgTriple triple, GuExn* err);
int
sg_triple_result_fetch(SgTripleResult* tres, SgId* pKey, SgTriple triple,
GuPool* out_pool, GuExn* err);
void
sg_triple_result_get_query(SgTripleResult* tres, SgTriple triple);
void
sg_triple_result_close(SgTripleResult* tres, GuExn* err);
typedef struct SgQueryResult SgQueryResult;
SgQueryResult*
sg_query(SgSG *sg, size_t n_triples, SgTriple* triples, GuExn* err);
size_t
sg_query_result_columns(SgQueryResult* qres);
int
sg_query_result_fetch_columns(SgQueryResult* qres, PgfExpr* res,
GuPool* out_pool, GuExn* err);
PgfExpr
sg_query_result_fetch_expr(SgQueryResult* qres, PgfExpr expr,
GuPool* out_pool, GuExn* err);
void
sg_query_result_close(SgQueryResult* qres, GuExn* err);
#endif

File diff suppressed because it is too large


@@ -1,705 +0,0 @@
/*
** 2001 September 15
**
** The author disclaims copyright to this source code. In place of
** a legal notice, here is a blessing:
**
** May you do good and not evil.
** May you find forgiveness for yourself and forgive others.
** May you share freely, never taking more than you give.
**
*************************************************************************
** This header file defines the interface that the sqlite B-Tree file
** subsystem. See comments in the source code for a detailed description
** of what each interface routine does.
*/
#ifndef _BTREE_H_
#define _BTREE_H_
/*
** The SQLITE_THREADSAFE macro must be defined as 0, 1, or 2.
** 0 means mutexes are permanently disable and the library is never
** threadsafe. 1 means the library is serialized which is the highest
** level of threadsafety. 2 means the library is multithreaded - multiple
** threads can use SQLite as long as no two threads try to use the same
** database connection at the same time.
**
** Older versions of SQLite used an optional THREADSAFE macro.
** We support that for legacy.
*/
#if !defined(SQLITE_THREADSAFE)
# if defined(THREADSAFE)
# define SQLITE_THREADSAFE THREADSAFE
# else
# define SQLITE_THREADSAFE 1 /* IMP: R-07272-22309 */
# endif
#endif
/*
** CAPI3REF: 64-Bit Integer Types
** KEYWORDS: sqlite_int64 sqlite_uint64
**
** Because there is no cross-platform way to specify 64-bit integer types
** SQLite includes typedefs for 64-bit signed and unsigned integers.
**
** The sqlite3_int64 and sqlite3_uint64 are the preferred type definitions.
** The sqlite_int64 and sqlite_uint64 types are supported for backwards
** compatibility only.
**
** ^The sqlite3_int64 and sqlite_int64 types can store integer values
** between -9223372036854775808 and +9223372036854775807 inclusive. ^The
** sqlite3_uint64 and sqlite_uint64 types can store integer values
** between 0 and +18446744073709551615 inclusive.
*/
#ifdef SQLITE_INT64_TYPE
typedef SQLITE_INT64_TYPE sqlite_int64;
typedef unsigned SQLITE_INT64_TYPE sqlite_uint64;
#elif defined(_MSC_VER) || defined(__BORLANDC__)
typedef __int64 sqlite_int64;
typedef unsigned __int64 sqlite_uint64;
#else
typedef long long int sqlite_int64;
typedef unsigned long long int sqlite_uint64;
#endif
typedef sqlite_int64 sqlite3_int64;
typedef sqlite_uint64 sqlite3_uint64;
/*
** Integers of known sizes. These typedefs might change for architectures
** where the sizes very. Preprocessor macros are available so that the
** types can be conveniently redefined at compile-type. Like this:
**
** cc '-DUINTPTR_TYPE=long long int' ...
*/
#ifndef UINT32_TYPE
# ifdef HAVE_UINT32_T
# define UINT32_TYPE uint32_t
# else
# define UINT32_TYPE unsigned int
# endif
#endif
#ifndef UINT16_TYPE
# ifdef HAVE_UINT16_T
# define UINT16_TYPE uint16_t
# else
# define UINT16_TYPE unsigned short int
# endif
#endif
#ifndef INT16_TYPE
# ifdef HAVE_INT16_T
# define INT16_TYPE int16_t
# else
# define INT16_TYPE short int
# endif
#endif
#ifndef UINT8_TYPE
# ifdef HAVE_UINT8_T
# define UINT8_TYPE uint8_t
# else
# define UINT8_TYPE unsigned char
# endif
#endif
#ifndef INT8_TYPE
# ifdef HAVE_INT8_T
# define INT8_TYPE int8_t
# else
# define INT8_TYPE signed char
# endif
#endif
#ifndef LONGDOUBLE_TYPE
# define LONGDOUBLE_TYPE long double
#endif
typedef sqlite_int64 i64; /* 8-byte signed integer */
typedef sqlite_uint64 u64; /* 8-byte unsigned integer */
typedef UINT32_TYPE u32; /* 4-byte unsigned integer */
typedef UINT16_TYPE u16; /* 2-byte unsigned integer */
typedef INT16_TYPE i16; /* 2-byte signed integer */
typedef UINT8_TYPE u8; /* 1-byte unsigned integer */
typedef INT8_TYPE i8; /* 1-byte signed integer */
/* TODO: This definition is just included so other modules compile. It
** needs to be revisited.
*/
#define SQLITE_N_BTREE_META 16
/*
** If defined as non-zero, auto-vacuum is enabled by default. Otherwise
** it must be turned on for each database using "PRAGMA auto_vacuum = 1".
*/
#ifndef SQLITE_DEFAULT_AUTOVACUUM
#define SQLITE_DEFAULT_AUTOVACUUM 0
#endif
#define BTREE_AUTOVACUUM_NONE 0 /* Do not do auto-vacuum */
#define BTREE_AUTOVACUUM_FULL 1 /* Do full auto-vacuum */
#define BTREE_AUTOVACUUM_INCR 2 /* Incremental vacuum */
/*
** CAPI3REF: Initialize The SQLite Library
**
** ^The sqlite3BtreeInitialize() routine initializes the
** SQLite library. ^The sqlite3BtreeShutdown() routine
** deallocates any resources that were allocated by sqlite3BtreeInitialize().
** These routines are designed to aid in process initialization and
** shutdown on embedded systems. Workstation applications using
** SQLite normally do not need to invoke either of these routines.
**
** A call to sqlite3BtreeInitialize() is an "effective" call if it is
** the first time sqlite3BtreeInitialize() is invoked during the lifetime of
** the process, or if it is the first time sqlite3BtreeInitialize() is invoked
** following a call to sqlite3BtreeShutdown(). ^(Only an effective call
** of sqlite3BtreeInitialize() does any initialization. All other calls
** are harmless no-ops.)^
**
** A call to sqlite3BtreeShutdown() is an "effective" call if it is the first
** call to sqlite3BtreeShutdown() since the last sqlite3BtreeInitialize(). ^(Only
** an effective call to sqlite3BtreeShutdown() does any deinitialization.
** All other valid calls to sqlite3BtreeShutdown() are harmless no-ops.)^
**
** The sqlite3BtreeInitialize() interface is threadsafe, but sqlite3BtreeShutdown()
** is not. The sqlite3BtreeShutdown() interface must only be called from a
** single thread. All open [database connections] must be closed and all
** other SQLite resources must be deallocated prior to invoking
** sqlite3BtreeShutdown().
**
** Among other things, ^sqlite3BtreeInitialize() will invoke
** sqlite3_os_init(). Similarly, ^sqlite3BtreeShutdown()
** will invoke sqlite3_os_end().
**
** ^The sqlite3BtreeInitialize() routine returns [SQLITE_OK] on success.
** ^If for some reason, sqlite3BtreeInitialize() is unable to initialize
** the library (perhaps it is unable to allocate a needed resource such
** as a mutex) it returns an [error code] other than [SQLITE_OK].
**
** ^The sqlite3BtreeInitialize() routine is called internally by many other
** SQLite interfaces so that an application usually does not need to
** invoke sqlite3BtreeInitialize() directly. For example, [sqlite3_open()]
** calls sqlite3BtreeInitialize() so the SQLite library will be automatically
** initialized when [sqlite3_open()] is called if it has not be initialized
** already. ^However, if SQLite is compiled with the [SQLITE_OMIT_AUTOINIT]
** compile-time option, then the automatic calls to sqlite3BtreeInitialize()
** are omitted and the application must call sqlite3BtreeInitialize() directly
** prior to using any other SQLite interface. For maximum portability,
** it is recommended that applications always invoke sqlite3BtreeInitialize()
** directly prior to using any other SQLite interface. Future releases
** of SQLite may require this. In other words, the behavior exhibited
** when SQLite is compiled with [SQLITE_OMIT_AUTOINIT] might become the
** default behavior in some future release of SQLite.
**
** The sqlite3_os_init() routine does operating-system specific
** initialization of the SQLite library. The sqlite3_os_end()
** routine undoes the effect of sqlite3_os_init(). Typical tasks
** performed by these routines include allocation or deallocation
** of static resources, initialization of global variables,
** setting up a default [sqlite3_vfs] module, or setting up
** a default configuration using [sqlite3_config()].
**
** The application should never invoke either sqlite3_os_init()
** or sqlite3_os_end() directly. The application should only invoke
** sqlite3BtreeInitialize() and sqlite3BtreeShutdown(). The sqlite3_os_init()
** interface is called automatically by sqlite3BtreeInitialize() and
** sqlite3_os_end() is called by sqlite3BtreeShutdown(). Appropriate
** implementations for sqlite3_os_init() and sqlite3_os_end()
** are built into SQLite when it is compiled for Unix, Windows, or OS/2.
** When [custom builds | built for other platforms]
** (using the [SQLITE_OS_OTHER=1] compile-time
** option) the application must supply a suitable implementation for
** sqlite3_os_init() and sqlite3_os_end(). An application-supplied
** implementation of sqlite3_os_init() or sqlite3_os_end()
** must return [SQLITE_OK] on success and some other [error code] upon
** failure.
*/
int sqlite3BtreeInitialize(void);
int sqlite3BtreeShutdown(void);
/*
** CAPI3REF: Result Codes
** KEYWORDS: {result code definitions}
**
** Many SQLite functions return an integer result code from the set shown
** here in order to indicate success or failure.
**
** New error codes may be added in future versions of SQLite.
**
** See also: [extended result code definitions]
*/
#define SQLITE_OK 0 /* Successful result */
/* beginning-of-error-codes */
#define SQLITE_ERROR 1 /* SQL error or missing database */
#define SQLITE_INTERNAL 2 /* Internal logic error in SQLite */
#define SQLITE_PERM 3 /* Access permission denied */
#define SQLITE_ABORT 4 /* Callback routine requested an abort */
#define SQLITE_BUSY 5 /* The database file is locked */
#define SQLITE_LOCKED 6 /* A table in the database is locked */
#define SQLITE_NOMEM 7 /* A malloc() failed */
#define SQLITE_READONLY 8 /* Attempt to write a readonly database */
#define SQLITE_INTERRUPT 9 /* Operation terminated by sqlite3_interrupt()*/
#define SQLITE_IOERR 10 /* Some kind of disk I/O error occurred */
#define SQLITE_CORRUPT 11 /* The database disk image is malformed */
#define SQLITE_NOTFOUND 12 /* Unknown opcode in sqlite3_file_control() */
#define SQLITE_FULL 13 /* Insertion failed because database is full */
#define SQLITE_CANTOPEN 14 /* Unable to open the database file */
#define SQLITE_PROTOCOL 15 /* Database lock protocol error */
#define SQLITE_EMPTY 16 /* Database is empty */
#define SQLITE_SCHEMA 17 /* The database schema changed */
#define SQLITE_TOOBIG 18 /* String or BLOB exceeds size limit */
#define SQLITE_CONSTRAINT 19 /* Abort due to constraint violation */
#define SQLITE_MISMATCH 20 /* Data type mismatch */
#define SQLITE_MISUSE 21 /* Library used incorrectly */
#define SQLITE_NOLFS 22 /* Uses OS features not supported on host */
#define SQLITE_AUTH 23 /* Authorization denied */
#define SQLITE_FORMAT 24 /* Auxiliary database format error */
#define SQLITE_RANGE 25 /* 2nd parameter to sqlite3_bind out of range */
#define SQLITE_NOTADB 26 /* File opened that is not a database file */
#define SQLITE_NOTICE 27 /* Notifications from sqlite3_log() */
#define SQLITE_WARNING 28 /* Warnings from sqlite3_log() */
#define SQLITE_ROW 100 /* sqlite3_step() has another row ready */
#define SQLITE_DONE 101 /* sqlite3_step() has finished executing */
/* end-of-error-codes */
/*
** CAPI3REF: Extended Result Codes
** KEYWORDS: {extended result code definitions}
**
** In its default configuration, SQLite API routines return one of 30 integer
** [result codes]. However, experience has shown that many of
** these result codes are too coarse-grained. They do not provide as
** much information about problems as programmers might like. In an effort to
** address this, newer versions of SQLite (version 3.3.8 and later) include
** support for additional result codes that provide more detailed information
** about errors. These [extended result codes] are enabled or disabled
** on a per database connection basis using the
** [sqlite3_extended_result_codes()] API. Or, the extended code for
** the most recent error can be obtained using
** [sqlite3_extended_errcode()].
*/
#define SQLITE_IOERR_READ (SQLITE_IOERR | (1<<8))
#define SQLITE_IOERR_SHORT_READ (SQLITE_IOERR | (2<<8))
#define SQLITE_IOERR_WRITE (SQLITE_IOERR | (3<<8))
#define SQLITE_IOERR_FSYNC (SQLITE_IOERR | (4<<8))
#define SQLITE_IOERR_DIR_FSYNC (SQLITE_IOERR | (5<<8))
#define SQLITE_IOERR_TRUNCATE (SQLITE_IOERR | (6<<8))
#define SQLITE_IOERR_FSTAT (SQLITE_IOERR | (7<<8))
#define SQLITE_IOERR_UNLOCK (SQLITE_IOERR | (8<<8))
#define SQLITE_IOERR_RDLOCK (SQLITE_IOERR | (9<<8))
#define SQLITE_IOERR_DELETE (SQLITE_IOERR | (10<<8))
#define SQLITE_IOERR_BLOCKED (SQLITE_IOERR | (11<<8))
#define SQLITE_IOERR_NOMEM (SQLITE_IOERR | (12<<8))
#define SQLITE_IOERR_ACCESS (SQLITE_IOERR | (13<<8))
#define SQLITE_IOERR_CHECKRESERVEDLOCK (SQLITE_IOERR | (14<<8))
#define SQLITE_IOERR_LOCK (SQLITE_IOERR | (15<<8))
#define SQLITE_IOERR_CLOSE (SQLITE_IOERR | (16<<8))
#define SQLITE_IOERR_DIR_CLOSE (SQLITE_IOERR | (17<<8))
#define SQLITE_IOERR_SHMOPEN (SQLITE_IOERR | (18<<8))
#define SQLITE_IOERR_SHMSIZE (SQLITE_IOERR | (19<<8))
#define SQLITE_IOERR_SHMLOCK (SQLITE_IOERR | (20<<8))
#define SQLITE_IOERR_SHMMAP (SQLITE_IOERR | (21<<8))
#define SQLITE_IOERR_SEEK (SQLITE_IOERR | (22<<8))
#define SQLITE_IOERR_DELETE_NOENT (SQLITE_IOERR | (23<<8))
#define SQLITE_IOERR_MMAP (SQLITE_IOERR | (24<<8))
#define SQLITE_IOERR_GETTEMPPATH (SQLITE_IOERR | (25<<8))
#define SQLITE_IOERR_CONVPATH (SQLITE_IOERR | (26<<8))
#define SQLITE_IOERR_VNODE (SQLITE_IOERR | (27<<8))
#define SQLITE_LOCKED_SHAREDCACHE (SQLITE_LOCKED | (1<<8))
#define SQLITE_BUSY_RECOVERY (SQLITE_BUSY | (1<<8))
#define SQLITE_BUSY_SNAPSHOT (SQLITE_BUSY | (2<<8))
#define SQLITE_CANTOPEN_NOTEMPDIR (SQLITE_CANTOPEN | (1<<8))
#define SQLITE_CANTOPEN_ISDIR (SQLITE_CANTOPEN | (2<<8))
#define SQLITE_CANTOPEN_FULLPATH (SQLITE_CANTOPEN | (3<<8))
#define SQLITE_CANTOPEN_CONVPATH (SQLITE_CANTOPEN | (4<<8))
#define SQLITE_CORRUPT_VTAB (SQLITE_CORRUPT | (1<<8))
#define SQLITE_READONLY_RECOVERY (SQLITE_READONLY | (1<<8))
#define SQLITE_READONLY_CANTLOCK (SQLITE_READONLY | (2<<8))
#define SQLITE_READONLY_ROLLBACK (SQLITE_READONLY | (3<<8))
#define SQLITE_READONLY_DBMOVED (SQLITE_READONLY | (4<<8))
#define SQLITE_ABORT_ROLLBACK (SQLITE_ABORT | (2<<8))
#define SQLITE_CONSTRAINT_CHECK (SQLITE_CONSTRAINT | (1<<8))
#define SQLITE_CONSTRAINT_COMMITHOOK (SQLITE_CONSTRAINT | (2<<8))
#define SQLITE_CONSTRAINT_FOREIGNKEY (SQLITE_CONSTRAINT | (3<<8))
#define SQLITE_CONSTRAINT_FUNCTION (SQLITE_CONSTRAINT | (4<<8))
#define SQLITE_CONSTRAINT_NOTNULL (SQLITE_CONSTRAINT | (5<<8))
#define SQLITE_CONSTRAINT_PRIMARYKEY (SQLITE_CONSTRAINT | (6<<8))
#define SQLITE_CONSTRAINT_TRIGGER (SQLITE_CONSTRAINT | (7<<8))
#define SQLITE_CONSTRAINT_UNIQUE (SQLITE_CONSTRAINT | (8<<8))
#define SQLITE_CONSTRAINT_VTAB (SQLITE_CONSTRAINT | (9<<8))
#define SQLITE_CONSTRAINT_ROWID (SQLITE_CONSTRAINT |(10<<8))
#define SQLITE_NOTICE_RECOVER_WAL (SQLITE_NOTICE | (1<<8))
#define SQLITE_NOTICE_RECOVER_ROLLBACK (SQLITE_NOTICE | (2<<8))
#define SQLITE_WARNING_AUTOINDEX (SQLITE_WARNING | (1<<8))
#define SQLITE_AUTH_USER (SQLITE_AUTH | (1<<8))
/* Reserved: 0x00F00000 */
/*
** Forward declarations of structure
*/
typedef struct Btree Btree;
typedef struct BtCursor BtCursor;
typedef struct BtShared BtShared;
typedef struct Mem Mem;
typedef struct KeyInfo KeyInfo;
typedef struct UnpackedRecord UnpackedRecord;
int sqlite3BtreeOpen(
const char *zVfs, /* VFS to use with this b-tree */
const char *zFilename, /* Name of database file to open */
Btree **ppBtree, /* Return open Btree* here */
int flags, /* Flags */
int vfsFlags /* Flags passed through to VFS open */
);
/* The flags parameter to sqlite3BtreeOpen can be the bitwise or of the
** following values.
**
** NOTE: These values must match the corresponding PAGER_ values in
** pager.h.
*/
#define BTREE_OMIT_JOURNAL 1 /* Do not create or use a rollback journal */
#define BTREE_MEMORY 2 /* This is an in-memory DB */
#define BTREE_SINGLE 4 /* The file contains at most 1 b-tree */
#define BTREE_UNORDERED 8 /* Use of a hash implementation is OK */
/*
** CAPI3REF: Flags For File Open Operations
**
** These bit values are intended for use in the
** 3rd parameter to the [sqlite3_open_v2()] interface and
** in the 4th parameter to the [sqlite3_vfs.xOpen] method.
*/
#define SQLITE_OPEN_READONLY 0x00000001 /* Ok for sqlite3_open_v2() */
#define SQLITE_OPEN_READWRITE 0x00000002 /* Ok for sqlite3_open_v2() */
#define SQLITE_OPEN_CREATE 0x00000004 /* Ok for sqlite3_open_v2() */
#define SQLITE_OPEN_DELETEONCLOSE 0x00000008 /* VFS only */
#define SQLITE_OPEN_EXCLUSIVE 0x00000010 /* VFS only */
#define SQLITE_OPEN_AUTOPROXY 0x00000020 /* VFS only */
#define SQLITE_OPEN_URI 0x00000040 /* Ok for sqlite3_open_v2() */
#define SQLITE_OPEN_MEMORY 0x00000080 /* Ok for sqlite3_open_v2() */
#define SQLITE_OPEN_MAIN_DB 0x00000100 /* VFS only */
#define SQLITE_OPEN_TEMP_DB 0x00000200 /* VFS only */
#define SQLITE_OPEN_TRANSIENT_DB 0x00000400 /* VFS only */
#define SQLITE_OPEN_MAIN_JOURNAL 0x00000800 /* VFS only */
#define SQLITE_OPEN_TEMP_JOURNAL 0x00001000 /* VFS only */
#define SQLITE_OPEN_SUBJOURNAL 0x00002000 /* VFS only */
#define SQLITE_OPEN_MASTER_JOURNAL 0x00004000 /* VFS only */
#define SQLITE_OPEN_NOMUTEX 0x00008000 /* Ok for sqlite3_open_v2() */
#define SQLITE_OPEN_FULLMUTEX 0x00010000 /* Ok for sqlite3_open_v2() */
#define SQLITE_OPEN_SHAREDCACHE 0x00020000 /* Ok for sqlite3_open_v2() */
#define SQLITE_OPEN_PRIVATECACHE 0x00040000 /* Ok for sqlite3_open_v2() */
#define SQLITE_OPEN_WAL 0x00080000 /* VFS only */
int sqlite3BtreeClose(Btree*);
int sqlite3BtreeSetCacheSize(Btree*,int);
#if SQLITE_MAX_MMAP_SIZE>0
int sqlite3BtreeSetMmapLimit(Btree*,sqlite3_int64);
#endif
int sqlite3BtreeSetPagerFlags(Btree*,unsigned);
int sqlite3BtreeSyncDisabled(Btree*);
int sqlite3BtreeSetPageSize(Btree *p, int nPagesize, int nReserve, int eFix);
int sqlite3BtreeGetPageSize(Btree*);
int sqlite3BtreeMaxPageCount(Btree*,int);
u32 sqlite3BtreeLastPage(Btree*);
int sqlite3BtreeSecureDelete(Btree*,int);
int sqlite3BtreeGetOptimalReserve(Btree*);
int sqlite3BtreeGetReserveNoMutex(Btree *p);
int sqlite3BtreeSetAutoVacuum(Btree *, int);
int sqlite3BtreeGetAutoVacuum(Btree *);
int sqlite3BtreeBeginTrans(Btree*,int);
int sqlite3BtreeCommitPhaseOne(Btree*, const char *zMaster);
int sqlite3BtreeCommitPhaseTwo(Btree*, int);
int sqlite3BtreeCommit(Btree*);
int sqlite3BtreeRollback(Btree*,int,int);
int sqlite3BtreeBeginStmt(Btree*,int);
int sqlite3BtreeCreateTable(Btree*, int*, int flags);
int sqlite3BtreeIsInTrans(Btree*);
int sqlite3BtreeIsInReadTrans(Btree*);
int sqlite3BtreeIsInBackup(Btree*);
void *sqlite3BtreeSchema(Btree *, int, void(*)(void *));
int sqlite3BtreeSchemaLocked(Btree *pBtree);
int sqlite3BtreeLockTable(Btree *pBtree, int iTab, u8 isWriteLock);
int sqlite3BtreeSavepoint(Btree *, int, int);
int sqlite3BtreeFileFormat(Btree *);
const char *sqlite3BtreeGetFilename(Btree *);
const char *sqlite3BtreeGetJournalname(Btree *);
int sqlite3BtreeCopyFile(Btree *, Btree *);
int sqlite3BtreeIncrVacuum(Btree *);
/* The flags parameter to sqlite3BtreeCreateTable can be the bitwise OR
** of the flags shown below.
**
** Every SQLite table must have either BTREE_INTKEY or BTREE_BLOBKEY set.
** With BTREE_INTKEY, the table key is a 64-bit integer and arbitrary data
** is stored in the leaves. (BTREE_INTKEY is used for SQL tables.) With
** BTREE_BLOBKEY, the key is an arbitrary BLOB and no content is stored
** anywhere - the key is the content. (BTREE_BLOBKEY is used for SQL
** indices.)
*/
#define BTREE_INTKEY 1 /* Table has only 64-bit signed integer keys */
#define BTREE_BLOBKEY 2 /* Table has keys only - no data */
int sqlite3BtreeDropTable(Btree*, int, int*);
int sqlite3BtreeClearTable(Btree*, int, int*);
int sqlite3BtreeClearTableOfCursor(BtCursor*);
int sqlite3BtreeTripAllCursors(Btree*, int, int);
void sqlite3BtreeGetMeta(Btree *pBtree, int idx, u32 *pValue);
int sqlite3BtreeUpdateMeta(Btree*, int idx, u32 value);
int sqlite3BtreeNewDb(Btree *p);
/*
** The second parameter to sqlite3BtreeGetMeta or sqlite3BtreeUpdateMeta
** should be one of the following values. The integer values are assigned
** to constants so that the offset of the corresponding field in an
** SQLite database header may be found using the following formula:
**
** offset = 36 + (idx * 4)
**
** For example, the free-page-count field is located at byte offset 36 of
** the database file header. The incr-vacuum-flag field is located at
** byte offset 64 (== 36+4*7).
**
** The BTREE_DATA_VERSION value is not really a value stored in the header.
** It is a read-only number computed by the pager. But we merge it with
** the header value access routines since its access pattern is the same.
** Call it a "virtual meta value".
*/
#define BTREE_FREE_PAGE_COUNT 0
#define BTREE_SCHEMA_VERSION 1
#define BTREE_FILE_FORMAT 2
#define BTREE_DEFAULT_CACHE_SIZE 3
#define BTREE_LARGEST_ROOT_PAGE 4
#define BTREE_TEXT_ENCODING 5
#define BTREE_USER_VERSION 6
#define BTREE_INCR_VACUUM 7
#define BTREE_APPLICATION_ID 8
#define BTREE_DATA_VERSION 15 /* A virtual meta-value */
/*
** An instance of the following structure holds information about a
** single index record that has already been parsed out into individual
** values.
**
** A record is an object that contains one or more fields of data.
** Records are used to store the content of a table row and to store
** the key of an index. A blob encoding of a record is created by
** the OP_MakeRecord opcode of the VDBE and is disassembled by the
** OP_Column opcode.
**
** This structure holds a record that has already been disassembled
** into its constituent fields.
**
** The r1 and r2 member variables are only used by the optimized comparison
** functions vdbeRecordCompareInt() and vdbeRecordCompareString().
*/
struct UnpackedRecord {
KeyInfo *pKeyInfo; /* Collation and sort-order information */
u16 nField; /* Number of entries in apMem[] */
i8 default_rc; /* Comparison result if keys are equal */
u8 errCode; /* Error detected by xRecordCompare (CORRUPT or NOMEM) */
Mem *aMem; /* Values */
int r1; /* Value to return if (lhs > rhs) */
int r2; /* Value to return if (rhs < lhs) */
};
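The role of default_rc can be illustrated with a hedged sketch: the helper below is hypothetical and compares plain integer fields, whereas the real vdbeRecordCompareInt()/vdbeRecordCompareString() operate on serialized records under the collations in KeyInfo.

```c
#include <assert.h>

/* Hypothetical sketch: compare two keys of nField integer fields and,
** when every compared field is equal, return default_rc as the result,
** mirroring how UnpackedRecord.default_rc breaks ties when one key is
** a prefix of the other. */
static int compare_keys(const int *a, const int *b,
                        int nField, int default_rc){
  for (int i = 0; i < nField; i++){
    if (a[i] < b[i]) return -1;
    if (a[i] > b[i]) return +1;
  }
  return default_rc;
}
```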
/* One or more of the following flags are set to indicate the validOK
** representations of the value stored in the Mem struct.
**
** If the MEM_Null flag is set, then the value is an SQL NULL value.
** No other flags may be set in this case.
**
** If the MEM_Str flag is set then Mem.z points at a string representation.
** Usually this is encoded in the same unicode encoding as the main
** database (see below for exceptions). If the MEM_Term flag is also
** set, then the string is nul terminated. The MEM_Int and MEM_Real
** flags may coexist with the MEM_Str flag.
*/
#define MEM_Null 0x0001 /* Value is NULL */
#define MEM_Str 0x0002 /* Value is a string */
#define MEM_Int 0x0004 /* Value is an integer */
#define MEM_Real 0x0008 /* Value is a real number */
#define MEM_Blob 0x0010 /* Value is a BLOB */
#define MEM_Term 0x0200 /* String rep is nul terminated */
#define MEM_Dyn 0x0400 /* Need to call Mem.xDel() on Mem.z */
#define MEM_Static 0x0800 /* Mem.z points to a static string */
#define MEM_Ephem 0x1000 /* Mem.z points to an ephemeral string */
#define MEM_Zero 0x4000 /* Mem.i contains count of 0s appended to blob */
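The flag rules above can be sketched as plain bit tests; the helper names and MEMX_ constants below are made up for illustration, with values copied from the #defines:

```c
#include <assert.h>

/* Illustrative helpers (not SQLite API); flag values copied from above. */
#define MEMX_Null 0x0001
#define MEMX_Str  0x0002
#define MEMX_Int  0x0004
#define MEMX_Term 0x0200

/* MEM_Null may not be combined with any other flag. */
static int memx_is_null(unsigned flags){
  return flags == MEMX_Null;
}

/* A nul-terminated string representation requires both MEM_Str and
** MEM_Term; MEM_Int may coexist with MEM_Str. */
static int memx_has_terminated_string(unsigned flags){
  return (flags & (MEMX_Str | MEMX_Term)) == (MEMX_Str | MEMX_Term);
}
```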
/*
** Internally, the vdbe manipulates nearly all SQL values as Mem
** structures. Each Mem struct may cache multiple representations (string,
** integer etc.) of the same value.
*/
struct Mem {
union MemValue {
double r; /* Real value used when MEM_Real is set in flags */
i64 i; /* Integer value used when MEM_Int is set in flags */
int nZero; /* Used when bit MEM_Zero is set in flags */
} u;
u16 flags; /* Some combination of MEM_Null, MEM_Str, MEM_Dyn, etc. */
u8 enc; /* SQLITE_UTF8, SQLITE_UTF16BE, SQLITE_UTF16LE */
u8 eSubtype; /* Subtype for this value */
int n; /* Number of characters in string value, excluding '\0' */
char *z; /* String or BLOB value */
/* ShallowCopy only needs to copy the information above */
char *zMalloc; /* Space to hold MEM_Str or MEM_Blob if szMalloc>0 */
int szMalloc; /* Size of the zMalloc allocation */
u32 uTemp; /* Transient storage for serial_type in OP_MakeRecord */
Btree *pBtree; /* The associated database connection */
void (*xDel)(void*);/* Destructor for Mem.z - only valid if MEM_Dyn */
#ifdef SQLITE_DEBUG
Mem *pScopyFrom; /* This Mem is a shallow copy of pScopyFrom */
void *pFiller; /* So that sizeof(Mem) is a multiple of 8 */
#endif
};
/*
** Values that may be OR'd together to form the second argument of an
** sqlite3BtreeCursorHints() call.
**
** The BTREE_BULKLOAD flag is set on index cursors when the index is going
** to be filled with content that is already in sorted order.
**
** The BTREE_SEEK_EQ flag is set on cursors that will get OP_SeekGE or
** OP_SeekLE opcodes for a range search, but where the range of entries
** selected will all have the same key. In other words, the cursor will
** be used only for equality key searches.
**
*/
#define BTREE_BULKLOAD 0x00000001 /* Used to fill index in sorted order */
#define BTREE_SEEK_EQ 0x00000002 /* EQ seeks only - no range seeks */
int sqlite3BtreeCursor(
Btree*, /* BTree containing table to open */
int iTable, /* Index of root page */
int wrFlag, /* 1 for writing. 0 for read-only */
int N, int X, /* index of N key columns and X extra columns */
BtCursor **ppCursor /* Space to write cursor pointer */
);
int sqlite3BtreeCursorSize(void);
int sqlite3BtreeCloseCursor(BtCursor*);
void sqlite3BtreeInitUnpackedRecord(
UnpackedRecord *pUnKey,
BtCursor* pCur,
int nField,
int default_rc,
Mem* pMem);
int sqlite3BtreeMovetoUnpacked(
BtCursor*,
UnpackedRecord *pUnKey,
i64 intKey,
int bias,
int *pRes
);
int sqlite3BtreeCursorHasMoved(BtCursor*);
int sqlite3BtreeCursorRestore(BtCursor*, int*);
int sqlite3BtreeDelete(BtCursor*, int);
int sqlite3BtreeInsert(BtCursor*, const void *pKey, i64 nKey,
const void *pData, int nData,
int nZero, int bias, int seekResult);
int sqlite3BtreeFirst(BtCursor*, int *pRes);
int sqlite3BtreeLast(BtCursor*, int *pRes);
int sqlite3BtreeNext(BtCursor*, int *pRes);
int sqlite3BtreeEof(BtCursor*);
int sqlite3BtreePrevious(BtCursor*, int *pRes);
int sqlite3BtreeKeySize(BtCursor*, i64 *pSize);
int sqlite3BtreeKey(BtCursor*, u32 offset, u32 amt, void*);
const void *sqlite3BtreeKeyFetch(BtCursor*, u32 *pAmt);
const void *sqlite3BtreeDataFetch(BtCursor*, u32 *pAmt);
int sqlite3BtreeDataSize(BtCursor*, u32 *pSize);
int sqlite3BtreeData(BtCursor*, u32 offset, u32 amt, void*);
char *sqlite3BtreeIntegrityCheck(Btree*, int *aRoot, int nRoot, int, int*);
struct Pager *sqlite3BtreePager(Btree*);
int sqlite3BtreePutData(BtCursor*, u32 offset, u32 amt, void*);
void sqlite3BtreeIncrblobCursor(BtCursor *);
void sqlite3BtreeClearCursor(BtCursor *);
int sqlite3BtreeSetVersion(Btree *pBt, int iVersion);
void sqlite3BtreeCursorHints(BtCursor *, unsigned int mask);
#ifdef SQLITE_DEBUG
int sqlite3BtreeCursorHasHint(BtCursor*, unsigned int mask);
#endif
int sqlite3BtreeIsReadonly(Btree *pBt);
#ifndef NDEBUG
int sqlite3BtreeCursorIsValid(BtCursor*);
#endif
#ifndef SQLITE_OMIT_BTREECOUNT
int sqlite3BtreeCount(BtCursor *, i64 *);
#endif
#ifdef SQLITE_TEST
int sqlite3BtreeCursorInfo(BtCursor*, int*, int);
void sqlite3BtreeCursorList(Btree*);
#endif
#ifndef SQLITE_OMIT_WAL
int sqlite3BtreeCheckpoint(Btree*, int, int *, int *);
#endif
/*
** If we are not using shared cache, then there is no need to
** use mutexes to access the BtShared structures. So make the
** Enter and Leave procedures no-ops.
*/
#ifndef SQLITE_OMIT_SHARED_CACHE
void sqlite3BtreeEnter(Btree*);
#else
# define sqlite3BtreeEnter(X)
#endif
#if !defined(SQLITE_OMIT_SHARED_CACHE) && SQLITE_THREADSAFE
int sqlite3BtreeSharable(Btree*);
void sqlite3BtreeLeave(Btree*);
void sqlite3BtreeEnterCursor(BtCursor*);
void sqlite3BtreeLeaveCursor(BtCursor*);
#else
# define sqlite3BtreeSharable(X) 0
# define sqlite3BtreeLeave(X)
# define sqlite3BtreeEnterCursor(X)
# define sqlite3BtreeLeaveCursor(X)
#endif
u32 sqlite3BtreeSerialType(Mem *pMem, int file_format);
u32 sqlite3BtreeSerialTypeLen(u32);
u32 sqlite3BtreeSerialGet(const unsigned char*, u32, Mem *);
u32 sqlite3BtreeSerialPut(u8*, Mem*, u32);
/*
** Routines to read and write variable-length integers. These used to
** be defined locally, but now we use the varint routines in the util.c
** file.
*/
int sqlite3BtreePutVarint(unsigned char*, u64);
u8 sqlite3BtreeGetVarint(const unsigned char *, u64 *);
u8 sqlite3BtreeGetVarint32(const unsigned char *, u32 *);
int sqlite3BtreeVarintLen(u64 v);
/*
** The common case is for a varint to be a single byte. The following
** macros handle the common case without a procedure call, but then call
** the procedure for larger varints.
*/
#define getVarint32(A,B) \
(u8)((*(A)<(u8)0x80)?((B)=(u32)*(A)),1:sqlite3BtreeGetVarint32((A),(u32 *)&(B)))
#define putVarint32(A,B) \
(u8)(((u32)(B)<(u32)0x80)?(*(A)=(unsigned char)(B)),1:\
sqlite3BtreePutVarint((A),(B)))
#define getVarint sqlite3BtreeGetVarint
#define putVarint sqlite3BtreePutVarint
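The single-byte fast path in getVarint32/putVarint32 relies on SQLite's big-endian base-128 encoding. A hedged, self-contained sketch for values that fit in 32 bits follows (the names are hypothetical; the real routines handle up to 9-byte, 64-bit varints):

```c
#include <assert.h>
#include <stdint.h>

/* Illustrative 32-bit varint codec: big-endian groups of 7 bits, with
** the high bit set on every byte except the last. Values below 0x80
** occupy a single byte, which is the fast path the macros above inline. */
static int put_varint32_sketch(unsigned char *p, uint32_t v){
  unsigned char tmp[5];
  int n = 0;
  do {
    tmp[n++] = (unsigned char)(v & 0x7f);
    v >>= 7;
  } while (v);
  for (int i = 0; i < n; i++){
    p[i] = tmp[n-1-i] | (i < n-1 ? 0x80 : 0x00);
  }
  return n;
}

static int get_varint32_sketch(const unsigned char *p, uint32_t *pv){
  uint32_t v = 0;
  int n = 0;
  do {
    v = (v << 7) | (uint32_t)(p[n] & 0x7f);
  } while (p[n++] & 0x80);
  *pv = v;
  return n;
}
```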
int sqlite3BtreeIdxRowid(Btree*, BtCursor*, i64*);
int sqlite3BtreeRecordCompare(int,const void*,UnpackedRecord*);
const char *sqlite3BtreeErrName(int rc);
#endif /* _BTREE_H_ */


@@ -0,0 +1,22 @@
## 1.3.0
- Add completion support.
## 1.2.1
- Remove deprecated `pgf_print_expr_tuple`.
- Add an API for cloning expressions/types/literals.
## 1.2.0
- Stop `pgf-shell` from being built by default.
- `parseToChart` also returns the category.
- Fix a bug in `bracketedLinearize`.
## 1.1.0
- Remove SG library.
## 1.0.0
- Everything up until 2020-07-11.


@@ -0,0 +1,10 @@
# Instructions for uploading to Hackage
You will need a Hackage account for steps 4 & 5.
1. Bump the version number in `pgf2.cabal`
2. Add details in `CHANGELOG.md`
3. Run `stack sdist` (or `cabal sdist`)
4. Visit `https://hackage.haskell.org/upload` and upload the file `./.stack-work/dist/x86_64-osx/Cabal-2.2.0.1/pgf2-x.y.z.tar.gz` (or Cabal equivalent)
5. If successful, upload documentation with `./stack-haddock-upload.sh pgf2 x.y.z` (compilation on Hackage's servers will fail because of missing C libraries)
6. Commit and push to this repository (`gf-core`)


@@ -0,0 +1,165 @@
GNU LESSER GENERAL PUBLIC LICENSE
Version 3, 29 June 2007
Copyright (C) 2007 Free Software Foundation, Inc. <https://fsf.org/>
Everyone is permitted to copy and distribute verbatim copies
of this license document, but changing it is not allowed.
This version of the GNU Lesser General Public License incorporates
the terms and conditions of version 3 of the GNU General Public
License, supplemented by the additional permissions listed below.
0. Additional Definitions.
As used herein, "this License" refers to version 3 of the GNU Lesser
General Public License, and the "GNU GPL" refers to version 3 of the GNU
General Public License.
"The Library" refers to a covered work governed by this License,
other than an Application or a Combined Work as defined below.
An "Application" is any work that makes use of an interface provided
by the Library, but which is not otherwise based on the Library.
Defining a subclass of a class defined by the Library is deemed a mode
of using an interface provided by the Library.
A "Combined Work" is a work produced by combining or linking an
Application with the Library. The particular version of the Library
with which the Combined Work was made is also called the "Linked
Version".
The "Minimal Corresponding Source" for a Combined Work means the
Corresponding Source for the Combined Work, excluding any source code
for portions of the Combined Work that, considered in isolation, are
based on the Application, and not on the Linked Version.
The "Corresponding Application Code" for a Combined Work means the
object code and/or source code for the Application, including any data
and utility programs needed for reproducing the Combined Work from the
Application, but excluding the System Libraries of the Combined Work.
1. Exception to Section 3 of the GNU GPL.
You may convey a covered work under sections 3 and 4 of this License
without being bound by section 3 of the GNU GPL.
2. Conveying Modified Versions.
If you modify a copy of the Library, and, in your modifications, a
facility refers to a function or data to be supplied by an Application
that uses the facility (other than as an argument passed when the
facility is invoked), then you may convey a copy of the modified
version:
a) under this License, provided that you make a good faith effort to
ensure that, in the event an Application does not supply the
function or data, the facility still operates, and performs
whatever part of its purpose remains meaningful, or
b) under the GNU GPL, with none of the additional permissions of
this License applicable to that copy.
3. Object Code Incorporating Material from Library Header Files.
The object code form of an Application may incorporate material from
a header file that is part of the Library. You may convey such object
code under terms of your choice, provided that, if the incorporated
material is not limited to numerical parameters, data structure
layouts and accessors, or small macros, inline functions and templates
(ten or fewer lines in length), you do both of the following:
a) Give prominent notice with each copy of the object code that the
Library is used in it and that the Library and its use are
covered by this License.
b) Accompany the object code with a copy of the GNU GPL and this license
document.
4. Combined Works.
You may convey a Combined Work under terms of your choice that,
taken together, effectively do not restrict modification of the
portions of the Library contained in the Combined Work and reverse
engineering for debugging such modifications, if you also do each of
the following:
a) Give prominent notice with each copy of the Combined Work that
the Library is used in it and that the Library and its use are
covered by this License.
b) Accompany the Combined Work with a copy of the GNU GPL and this license
document.
c) For a Combined Work that displays copyright notices during
execution, include the copyright notice for the Library among
these notices, as well as a reference directing the user to the
copies of the GNU GPL and this license document.
d) Do one of the following:
0) Convey the Minimal Corresponding Source under the terms of this
License, and the Corresponding Application Code in a form
suitable for, and under terms that permit, the user to
recombine or relink the Application with a modified version of
the Linked Version to produce a modified Combined Work, in the
manner specified by section 6 of the GNU GPL for conveying
Corresponding Source.
1) Use a suitable shared library mechanism for linking with the
Library. A suitable mechanism is one that (a) uses at run time
a copy of the Library already present on the user's computer
system, and (b) will operate properly with a modified version
of the Library that is interface-compatible with the Linked
Version.
e) Provide Installation Information, but only if you would otherwise
be required to provide such information under section 6 of the
GNU GPL, and only to the extent that such information is
necessary to install and execute a modified version of the
Combined Work produced by recombining or relinking the
Application with a modified version of the Linked Version. (If
you use option 4d0, the Installation Information must accompany
the Minimal Corresponding Source and Corresponding Application
Code. If you use option 4d1, you must provide the Installation
Information in the manner specified by section 6 of the GNU GPL
for conveying Corresponding Source.)
5. Combined Libraries.
You may place library facilities that are a work based on the
Library side by side in a single library together with other library
facilities that are not Applications and are not covered by this
License, and convey such a combined library under terms of your
choice, if you do both of the following:
a) Accompany the combined library with a copy of the same work based
on the Library, uncombined with any other library facilities,
conveyed under the terms of this License.
b) Give prominent notice with the combined library that part of it
is a work based on the Library, and explaining where to find the
accompanying uncombined form of the same work.
6. Revised Versions of the GNU Lesser General Public License.
The Free Software Foundation may publish revised and/or new versions
of the GNU Lesser General Public License from time to time. Such new
versions will be similar in spirit to the present version, but may
differ in detail to address new problems or concerns.
Each version is given a distinguishing version number. If the
Library as you received it specifies that a certain numbered version
of the GNU Lesser General Public License "or any later version"
applies to it, you have the option of following the terms and
conditions either of that published version or of any later version
published by the Free Software Foundation. If the Library as you
received it does not specify a version number of the GNU Lesser
General Public License, you may choose any version of the GNU Lesser
General Public License ever published by the Free Software Foundation.
If the Library as you received it specifies that a proxy can decide
whether future versions of the GNU Lesser General Public License shall
apply, that proxy's public statement of acceptance of any version is
permanent authorization for you to choose that version for the
Library.


@@ -1,3 +0,0 @@
module PGF(module PGF2) where
import PGF2


@@ -1 +0,0 @@
module PGF.Internal where


@@ -15,6 +15,7 @@
#include <pgf/pgf.h>
#include <pgf/linearizer.h>
#include <pgf/data.h>
#include <gu/enum.h>
#include <gu/exn.h>
@@ -42,35 +43,35 @@ module PGF2 (-- * PGF
mkCId,
exprHash, exprSize, exprFunctions, exprSubstitute,
treeProbability,
-- ** Types
Type, Hypo, BindType(..), startCat,
readType, showType, showContext,
mkType, unType,
-- ** Type checking
-- | Dynamically-built expressions should always be type-checked before being used in other functions,
-- as the exceptions thrown by using invalid expressions may not be catchable.
checkExpr, inferExpr, checkType,
-- ** Computing
compute,
-- * Concrete syntax
ConcName,Concr,languages,concreteName,languageCode,
-- ** Linearization
linearize,linearizeAll,tabularLinearize,tabularLinearizeAll,bracketedLinearize,bracketedLinearizeAll,
FId, LIndex, BracketedString(..), showBracketedString, flattenBracketedString,
printName,
FId, BracketedString(..), showBracketedString, flattenBracketedString,
printName, categoryFields,
alignWords,
-- ** Parsing
ParseOutput(..), parse, parseWithHeuristics,
parseToChart, PArg(..),
complete,
-- ** Sentence Lookup
lookupSentence,
-- ** Generation
generateAll,
-- ** Morphological Analysis
MorphoAnalysis, lookupMorpho, lookupCohorts, fullFormLexicon,
filterBest, filterLongest,
-- ** Visualizations
GraphvizOptions(..), graphvizDefaults,
graphvizAbstractTree, graphvizParseTree, graphvizWordAlignment,
@@ -86,6 +87,7 @@ import Prelude hiding (fromEnum,(<>)) -- GHC 8.4.1 clash with Text.PrettyPrint
import Control.Exception(Exception,throwIO)
import Control.Monad(forM_)
import System.IO.Unsafe(unsafePerformIO,unsafeInterleaveIO)
import System.IO(fixIO)
import Text.PrettyPrint
import PGF2.Expr
import PGF2.Type
@@ -96,11 +98,11 @@ import Foreign.C
import Data.Typeable
import qualified Data.Map as Map
import Data.IORef
import Data.Char(isUpper,isSpace)
import Data.Char(isUpper,isSpace,isPunctuation)
import Data.List(isSuffixOf,maximumBy,nub)
import Data.Function(on)
import Data.Maybe(maybe)
-----------------------------------------------------------------------
-- Functions that take a PGF.
-- PGF has many Concrs.
@@ -168,8 +170,6 @@ languages p =
concr <- fmap (\ptr -> Concr ptr (touchPGF p)) $ peek (castPtr value)
writeIORef ref $! Map.insert name concr langs
-- | The abstract language name is the name of the top-level
-- abstract module
concreteName :: Concr -> ConcName
concreteName c = unsafePerformIO (peekUtf8CString =<< pgf_concrete_name (concr c))
@@ -178,7 +178,7 @@ languageCode c = unsafePerformIO (peekUtf8CString =<< pgf_language_code (concr c
-- | Generates an exhaustive, possibly infinite list of
-- all abstract syntax expressions of the given type.
-- The expressions are ordered by their probability.
generateAll :: PGF -> Type -> [(Expr,Float)]
generateAll p (Type ctype _) =
@@ -431,6 +431,7 @@ graphvizParseTree c opts e =
c_opts <- newGraphvizOptions tmpPl opts
pgf_graphviz_parse_tree (concr c) (expr e) c_opts out exn
touchExpr e
touchConcr c
s <- gu_string_buf_freeze sb tmpPl
peekUtf8CString s
@@ -466,21 +467,21 @@ newGraphvizOptions pool opts = do
-- Functions using Concr
-- Morpho analyses, parsing & linearization
-- | This triple is returned by all functions that deal with
-- the grammar's lexicon. Its first element is the name of an abstract
-- lexical function which can produce a given word or
-- a multiword expression (i.e. this is the lemma).
-- After that follows a string which describes
-- the particular inflection form.
--
-- The last element is the logarithm of
-- the probability of the function. The probability is not
-- conditionalized on the category of the function. This makes it
-- possible to compare the likelihood of two functions even if they
-- have different types.
type MorphoAnalysis = (Fun,String,Float)
-- | 'lookupMorpho' takes a string which must be a single word or
-- a multiword expression. It then computes the list of all possible
-- morphological analyses.
lookupMorpho :: Concr -> String -> [MorphoAnalysis]
@@ -505,7 +506,7 @@ lookupMorpho (Concr concr master) sent =
-- The list is sorted first by the @start@ position and after than
-- by the @end@ position. This can be used for instance if you want to
-- filter only the longest matches.
lookupCohorts :: Concr -> String -> [(Int,[MorphoAnalysis],Int)]
lookupCohorts :: Concr -> String -> [(Int,String,[MorphoAnalysis],Int)]
lookupCohorts lang@(Concr concr master) sent =
unsafePerformIO $
do pl <- gu_new_pool
@@ -516,9 +517,9 @@ lookupCohorts lang@(Concr concr master) sent =
c_sent <- newUtf8CString sent pl
enum <- pgf_lookup_cohorts concr c_sent cback pl nullPtr
fpl <- newForeignPtr gu_pool_finalizer pl
fromCohortRange enum fpl fptr ref
fromCohortRange enum fpl fptr 0 sent ref
where
fromCohortRange enum fpl fptr ref =
fromCohortRange enum fpl fptr i sent ref =
allocaBytes (#size PgfCohortRange) $ \ptr ->
withForeignPtr fpl $ \pl ->
do gu_enum_next enum ptr pl
@@ -532,8 +533,80 @@ lookupCohorts lang@(Concr concr master) sent =
end <- (#peek PgfCohortRange, end.pos) ptr
ans <- readIORef ref
writeIORef ref []
cohs <- unsafeInterleaveIO (fromCohortRange enum fpl fptr ref)
return ((start,ans,end):cohs)
let sent' = drop (start-i) sent
tok = take (end-start) sent'
cohs <- unsafeInterleaveIO (fromCohortRange enum fpl fptr start sent' ref)
return ((start,tok,ans,end):cohs)
filterBest :: [(Int,String,[MorphoAnalysis],Int)] -> [(Int,String,[MorphoAnalysis],Int)]
filterBest ans =
reverse (iterate (maxBound :: Int) [(0,0,[],ans)] [] [])
where
iterate v0 [] [] res = res
iterate v0 [] new res = iterate v0 new [] res
iterate v0 ((_,v,conf, []):old) new res =
case compare v0 v of
LT -> res
EQ -> iterate v0 old new (merge conf res)
GT -> iterate v old new conf
iterate v0 ((_,v,conf,an:ans):old) new res = iterate v0 old (insert (v+valueOf an) conf an ans [] new) res
valueOf (_,_,[],_) = 2
valueOf _ = 1
insert v conf an@(start,_,_,end) ans l_new [] =
match start v conf ans ((end,v,comb conf an,filter end ans):l_new) []
insert v conf an@(start,_,_,end) ans l_new (new@(end0,v0,conf0,ans0):r_new) =
case compare end0 end of
LT -> insert v conf an ans (new:l_new) r_new
EQ -> case compare v0 v of
LT -> match start v conf ans ((end,v, conf0,ans0): l_new) r_new
EQ -> match start v conf ans ((end,v,merge (comb conf an) conf0,ans0): l_new) r_new
GT -> match start v conf ans ((end,v,comb conf an, ans0): l_new) r_new
GT -> match start v conf ans ((end,v,comb conf an, filter end ans):new:l_new) r_new
match start0 v conf (an@(start,_,_,end):ans) l_new r_new
| start0 == start = insert v conf an ans l_new r_new
match start0 v conf ans l_new r_new = revOn l_new r_new
comb ((start0,w0,an0,end0):conf) (start,w,an,end)
| end0 == start && (unk w0 an0 || unk w an) = (start0,w0++w,[],end):conf
comb conf an = an:conf
filter end [] = []
filter end (next@(start,_,_,_):ans)
| end <= start = next:ans
| otherwise = filter end ans
revOn [] ys = ys
revOn (x:xs) ys = revOn xs (x:ys)
merge [] ans = ans
merge ans [] = ans
merge (an1@(start1,_,_,end1):ans1) (an2@(start2,_,_,end2):ans2) =
case compare (start1,end1) (start2,end2) of
GT -> an1 : merge ans1 (an2:ans2)
EQ -> an1 : merge ans1 ans2
LT -> an2 : merge (an1:ans1) ans2
filterLongest :: [(Int,String,[MorphoAnalysis],Int)] -> [(Int,String,[MorphoAnalysis],Int)]
filterLongest [] = []
filterLongest (an:ans) = longest an ans
where
longest prev [] = [prev]
longest prev@(start0,_,_,end0) (next@(start,_,_,end):ans)
| start0 == start = longest next ans
| otherwise = filter prev (next:ans)
filter prev [] = [prev]
filter prev@(start0,w0,an0,end0) (next@(start,w,an,end):ans)
| end0 == start && (unk w0 an0 || unk w an)
= filter (start0,w0++w,[],end) ans
| end0 <= start = prev : longest next ans
| otherwise = filter prev ans
unk w [] | any (not . isPunctuation) w = True
unk _ _ = False
fullFormLexicon :: Concr -> [(String, [MorphoAnalysis])]
fullFormLexicon lang =
@@ -571,31 +644,31 @@ getAnalysis ref self c_lemma c_anal prob exn = do
writeIORef ref ((lemma, anal, prob):ans)
-- | This data type encodes the different outcomes which you could get from the parser.
data ParseOutput
data ParseOutput a
= ParseFailed Int String -- ^ The integer is the position in number of unicode characters where the parser failed.
-- The string is the token at which the parser has failed.
| ParseOk [(Expr,Float)] -- ^ If the parsing and the type checking are successful we get a list of abstract syntax trees.
-- The list should be non-empty.
| ParseOk a -- ^ If the parsing and the type checking are successful
-- we get the abstract syntax trees as either a list or a chart.
| ParseIncomplete -- ^ The sentence is not complete.
parse :: Concr -> Type -> String -> ParseOutput
parse :: Concr -> Type -> String -> ParseOutput [(Expr,Float)]
parse lang ty sent = parseWithHeuristics lang ty sent (-1.0) []
parseWithHeuristics :: Concr -- ^ the language with which we parse
-> Type -- ^ the start category
-> String -- ^ the input sentence
-> Double -- ^ the heuristic factor.
-- A negative value tells the parser
-- to look up the default from
-- the grammar flags
-> [(Cat, Int -> Int -> Maybe (Expr,Float,Int))]
-> [(Cat, String -> Int -> Maybe (Expr,Float,Int))]
-- ^ a list of callbacks for literal categories.
-- The arguments of the callback are:
-- the index of the constituent for the literal category;
-- the input sentence; the current offset in the sentence.
-- If a literal has been recognized then the output should
-- be Just (expr,probability,end_offset)
-> ParseOutput
-> ParseOutput [(Expr,Float)]
parseWithHeuristics lang (Type ctype touchType) sent heuristic callbacks =
unsafePerformIO $
do exprPl <- gu_new_pool
@@ -637,7 +710,136 @@ parseWithHeuristics lang (Type ctype touchType) sent heuristic callbacks =
exprs <- fromPgfExprEnum enum parseFPl (touchConcr lang >> touchForeignPtr exprFPl)
return (ParseOk exprs)
mkCallbacksMap :: Ptr PgfConcr -> [(String, Int -> Int -> Maybe (Expr,Float,Int))] -> Ptr GuPool -> IO (Ptr PgfCallbacksMap)
parseToChart :: Concr -- ^ the language with which we parse
-> Type -- ^ the start category
-> String -- ^ the input sentence
-> Double -- ^ the heuristic factor.
-- A negative value tells the parser
-- to look up the default from
-- the grammar flags
-> [(Cat, String -> Int -> Maybe (Expr,Float,Int))]
-- ^ a list of callbacks for literal categories.
-- The arguments of the callback are:
-- the index of the constituent for the literal category;
-- the input sentence; the current offset in the sentence.
-- If a literal has been recognized then the output should
-- be Just (expr,probability,end_offset)
-> Int -- ^ the maximal number of roots
-> ParseOutput ([FId],Map.Map FId ([(Int,Int,String)],[(Expr,[PArg],Float)],Cat))
parseToChart lang (Type ctype touchType) sent heuristic callbacks roots =
unsafePerformIO $
withGuPool $ \parsePl -> do
do exn <- gu_new_exn parsePl
sent <- newUtf8CString sent parsePl
callbacks_map <- mkCallbacksMap (concr lang) callbacks parsePl
ps <- pgf_parse_to_chart (concr lang) ctype sent heuristic callbacks_map (fromIntegral roots) exn parsePl parsePl
touchType
failed <- gu_exn_is_raised exn
if failed
then do is_parse_error <- gu_exn_caught exn gu_exn_type_PgfParseError
if is_parse_error
then do c_err <- (#peek GuExn, data.data) exn
c_incomplete <- (#peek PgfParseError, incomplete) c_err
if (c_incomplete :: CInt) == 0
then do c_offset <- (#peek PgfParseError, offset) c_err
token_ptr <- (#peek PgfParseError, token_ptr) c_err
token_len <- (#peek PgfParseError, token_len) c_err
tok <- peekUtf8CStringLen token_ptr token_len
touchConcr lang
return (ParseFailed (fromIntegral (c_offset :: CInt)) tok)
else do touchConcr lang
return ParseIncomplete
else do is_exn <- gu_exn_caught exn gu_exn_type_PgfExn
if is_exn
then do c_msg <- (#peek GuExn, data.data) exn
msg <- peekUtf8CString c_msg
touchConcr lang
throwIO (PGFError msg)
else do touchConcr lang
throwIO (PGFError "Parsing failed")
else do c_roots <- pgf_get_parse_roots ps parsePl
let get_range c_ccat = pgf_ccat_to_range ps c_ccat parsePl
c_len <- (#peek GuSeq, len) c_roots
chart <- peekCCats get_range Map.empty (c_len :: CSizeT) (c_roots `plusPtr` (#offset GuSeq, data))
touchConcr lang
return (ParseOk chart)
where
peekCCats get_range chart 0 ptr = return ([],chart)
peekCCats get_range chart len ptr = do
(root, chart) <- deRef (peekCCat get_range chart) ptr
(roots,chart) <- peekCCats get_range chart (len-1) (ptr `plusPtr` (#size PgfCCat*))
return (root:roots,chart)
peekCCat get_range chart c_ccat = do
fid <- peekFId c_ccat
c_total_cats <- (#peek PgfConcr, total_cats) (concr lang)
if Map.member fid chart || fid < c_total_cats
then return (fid,chart)
else do c_cnccat <- (#peek PgfCCat, cnccat) c_ccat
c_abscat <- (#peek PgfCCat, cnccat) c_cnccat
c_name <- (#peek PgfCCat, cnccat) c_abscat
cat <- peekUtf8CString c_name
range <- get_range c_ccat >>= peekSequence peekRange (#size PgfParseRange)
c_prods <- (#peek PgfCCat, prods) c_ccat
if c_prods == nullPtr
then do return (fid,Map.insert fid (range,[],cat) chart)
else do c_len <- (#peek PgfCCat, n_synprods) c_ccat
(prods,chart) <- fixIO (\res -> peekProductions (Map.insert fid (range,fst res,cat) chart)
(fromIntegral (c_len :: CSizeT))
(c_prods `plusPtr` (#offset GuSeq, data)))
return (fid,chart)
where
peekProductions chart 0 ptr = return ([],chart)
peekProductions chart len ptr = do
(ps1,chart) <- deRef (peekProduction chart) ptr
(ps2,chart) <- peekProductions chart (len-1) (ptr `plusPtr` (#size GuVariant))
return (ps1++ps2,chart)
peekProduction chart p = do
tag <- gu_variant_tag p
dt <- gu_variant_data p
case tag of
(#const PGF_PRODUCTION_APPLY) -> do { c_cncfun <- (#peek PgfProductionApply, fun) dt ;
c_absfun <- (#peek PgfCncFun, absfun) c_cncfun ;
expr <- (#peek PgfAbsFun, ep.expr) c_absfun ;
p <- (#peek PgfAbsFun, ep.prob) c_absfun ;
c_args <- (#peek PgfProductionApply, args) dt ;
c_len <- (#peek GuSeq, len) c_args ;
(pargs,chart) <- peekPArgs chart (c_len :: CSizeT) (c_args `plusPtr` (#offset GuSeq, data)) ;
return ([(Expr expr (touchConcr lang), pargs, p)],chart) }
(#const PGF_PRODUCTION_COERCE) -> do { c_coerce <- (#peek PgfProductionCoerce, coerce) dt ;
(fid,chart) <- peekCCat get_range chart c_coerce ;
return (maybe [] snd3 (Map.lookup fid chart),chart) }
(#const PGF_PRODUCTION_EXTERN) -> do { c_ep <- (#peek PgfProductionExtern, ep) dt ;
expr <- (#peek PgfExprProb, expr) c_ep ;
p <- (#peek PgfExprProb, prob) c_ep ;
return ([(Expr expr (touchConcr lang), [], p)],chart) }
_ -> error ("Unknown production type "++show tag++" in the grammar")
snd3 (_,x,_) = x
peekPArgs chart 0 ptr = return ([],chart)
peekPArgs chart len ptr = do
(a, chart) <- peekPArg chart ptr
(as,chart) <- peekPArgs chart (len-1) (ptr `plusPtr` (#size PgfPArg))
return (a:as,chart)
peekPArg chart ptr = do
c_hypos <- (#peek PgfPArg, hypos) ptr
hypos <- if c_hypos /= nullPtr
then peekSequence (deRef peekFId) (#size int) c_hypos
else return []
c_ccat <- (#peek PgfPArg, ccat) ptr
(fid,chart) <- peekCCat get_range chart c_ccat
return (PArg hypos fid,chart)
peekRange ptr = do
s <- (#peek PgfParseRange, start) ptr
e <- (#peek PgfParseRange, end) ptr
f <- (#peek PgfParseRange, field) ptr >>= peekCString
return ((fromIntegral :: CSizeT -> Int) s, (fromIntegral :: CSizeT -> Int) e, f)
mkCallbacksMap :: Ptr PgfConcr -> [(String, String -> Int -> Maybe (Expr,Float,Int))] -> Ptr GuPool -> IO (Ptr PgfCallbacksMap)
mkCallbacksMap concr callbacks pool = do
callbacks_map <- pgf_new_callbacks_map concr pool
forM_ callbacks $ \(cat,match) -> do
@@ -647,23 +849,15 @@ mkCallbacksMap concr callbacks pool = do
hspgf_callbacks_map_add_literal concr callbacks_map ccat match predict pool
return callbacks_map
where
match_callback match clin_idx poffset out_pool = do
match_callback match c_ann poffset out_pool = do
coffset <- peek poffset
case match (fromIntegral clin_idx) (fromIntegral coffset) of
ann <- peekUtf8CString c_ann
case match ann (fromIntegral coffset) of
Nothing -> return nullPtr
Just (e,prob,offset') -> do poke poffset (fromIntegral offset')
-- here we copy the expression to out_pool
c_e <- withGuPool $ \tmpPl -> do
exn <- gu_new_exn tmpPl
(sb,out) <- newOut tmpPl
let printCtxt = nullPtr
pgf_print_expr (expr e) printCtxt 1 out exn
c_str <- gu_string_buf_freeze sb tmpPl
guin <- gu_string_in c_str tmpPl
pgf_read_expr guin out_pool tmpPl exn
c_e <- pgf_clone_expr (expr e) out_pool
ep <- gu_malloc out_pool (#size PgfExprProb)
(#poke PgfExprProb, expr) ep c_e
@@ -690,7 +884,7 @@ lookupSentence lang (Type ctype _) sent =
-- | The oracle is a triple of functions.
-- The first two take a category name and a linearization field name
-- and they should return True/False when the corresponding
-- and they should return True/False when the corresponding
-- prediction or completion is appropriate. The third function
-- is the oracle for literals.
type Oracle = (Maybe (Cat -> String -> Int -> Bool)
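The first two oracle components are plain predicates. A minimal, self-contained sketch (pure Haskell, with `Cat` assumed to be a `String` synonym as elsewhere in this API; the category name `"NP"` is hypothetical):

```haskell
type Cat = String

-- Hypothetical predicate for the first component of the Oracle triple:
-- given a category, a linearization field name and an offset, decide
-- whether prediction is appropriate. Here: only for category "NP".
predictNP :: Cat -> String -> Int -> Bool
predictNP cat _field _offset = cat == "NP"

main :: IO ()
main = do
  print (predictNP "NP" "s" 0)  -- True
  print (predictNP "VP" "s" 0)  -- False
```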
@@ -702,7 +896,7 @@ parseWithOracle :: Concr -- ^ the language with which we parse
-> Cat -- ^ the start category
-> String -- ^ the input sentence
-> Oracle
-> ParseOutput
-> ParseOutput [(Expr,Float)]
parseWithOracle lang cat sent (predict,complete,literal) =
unsafePerformIO $
do parsePl <- gu_new_pool
@@ -778,6 +972,67 @@ parseWithOracle lang cat sent (predict,complete,literal) =
return ep
Nothing -> do return nullPtr
-- | Returns possible completions of the current partial input.
complete :: Concr -- ^ the language with which we parse
-> Type -- ^ the start category
-> String -- ^ the input sentence (excluding token being completed)
-> String -- ^ prefix (partial token being completed)
-> ParseOutput [(String, CId, CId, Float)] -- ^ (token, category, function, probability)
complete lang (Type ctype _) sent pfx =
unsafePerformIO $ do
parsePl <- gu_new_pool
exn <- gu_new_exn parsePl
sent <- newUtf8CString sent parsePl
pfx <- newUtf8CString pfx parsePl
enum <- pgf_complete (concr lang) ctype sent pfx exn parsePl
failed <- gu_exn_is_raised exn
if failed
then do
is_parse_error <- gu_exn_caught exn gu_exn_type_PgfParseError
if is_parse_error
then do
c_err <- (#peek GuExn, data.data) exn
c_offset <- (#peek PgfParseError, offset) c_err
token_ptr <- (#peek PgfParseError, token_ptr) c_err
token_len <- (#peek PgfParseError, token_len) c_err
tok <- peekUtf8CStringLen token_ptr token_len
gu_pool_free parsePl
return (ParseFailed (fromIntegral (c_offset :: CInt)) tok)
else do
is_exn <- gu_exn_caught exn gu_exn_type_PgfExn
if is_exn
then do
c_msg <- (#peek GuExn, data.data) exn
msg <- peekUtf8CString c_msg
gu_pool_free parsePl
throwIO (PGFError msg)
else do
gu_pool_free parsePl
throwIO (PGFError "Parsing failed")
else do
fpl <- newForeignPtr gu_pool_finalizer parsePl
ParseOk <$> fromCompletions enum fpl
where
fromCompletions :: Ptr GuEnum -> ForeignPtr GuPool -> IO [(String, CId, CId, Float)]
fromCompletions enum fpl =
withGuPool $ \tmpPl -> do
cmpEntry <- alloca $ \ptr ->
withForeignPtr fpl $ \pl ->
do gu_enum_next enum ptr pl
peek ptr
if cmpEntry == nullPtr
then do
finalizeForeignPtr fpl
touchConcr lang
return []
else do
tok <- peekUtf8CString =<< (#peek PgfTokenProb, tok) cmpEntry
cat <- peekUtf8CString =<< (#peek PgfTokenProb, cat) cmpEntry
fun <- peekUtf8CString =<< (#peek PgfTokenProb, fun) cmpEntry
prob <- (#peek PgfTokenProb, prob) cmpEntry
toks <- unsafeInterleaveIO (fromCompletions enum fpl)
return ((tok, cat, fun, prob) : toks)
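A minimal sketch of consuming a completion result of this shape. The `ParseOutput` below is a simplified stand-in with only the two constructors used above (the real type lives in this module), and the example entry is invented for illustration:

```haskell
-- Simplified stand-in for ParseOutput, restricted to the constructors
-- relevant for completion results.
data ParseOutput a = ParseOk a | ParseFailed Int String

-- Render (token, category, function, probability) entries for display.
showCompletions :: ParseOutput [(String, String, String, Double)] -> [String]
showCompletions (ParseOk toks) =
  [ tok ++ " : " ++ cat ++ " (" ++ fun ++ ")" | (tok, cat, fun, _p) <- toks ]
showCompletions (ParseFailed offset tok) =
  [ "parse error at " ++ show offset ++ ": " ++ tok ]

main :: IO ()
main = mapM_ putStrLn
         (showCompletions (ParseOk [("walks", "V", "walk_V", 0.5)]))
```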
-- | Returns True if there is a linearization defined for that function in that language
hasLinearization :: Concr -> Fun -> Bool
hasLinearization lang id = unsafePerformIO $
@@ -851,7 +1106,7 @@ linearizeAll lang e = unsafePerformIO $
-- | Generates a table of linearizations for an expression
tabularLinearize :: Concr -> Expr -> [(String, String)]
tabularLinearize lang e =
tabularLinearize lang e =
case tabularLinearizeAll lang e of
(lins:_) -> lins
_ -> []
@@ -863,6 +1118,7 @@ tabularLinearizeAll lang e = unsafePerformIO $
exn <- gu_new_exn tmpPl
cts <- pgf_lzr_concretize (concr lang) (expr e) exn tmpPl
failed <- gu_exn_is_raised exn
touchConcr lang
if failed
then throwExn exn
else collect cts exn tmpPl
@@ -908,8 +1164,27 @@ tabularLinearizeAll lang e = unsafePerformIO $
throwIO (PGFError msg)
else do throwIO (PGFError "The abstract tree cannot be linearized")
type FId = Int
type LIndex = Int
categoryFields :: Concr -> Cat -> Maybe [String]
categoryFields lang cat =
unsafePerformIO $ do
withGuPool $ \tmpPl -> do
p_n_lins <- gu_malloc tmpPl (#size size_t)
c_cat <- newUtf8CString cat tmpPl
c_fields <- pgf_category_fields (concr lang) c_cat p_n_lins
if c_fields == nullPtr
then do touchConcr lang
return Nothing
else do len <- peek p_n_lins
fs <- peekFields len c_fields
touchConcr lang
return (Just fs)
where
peekFields 0 ptr = return []
peekFields len ptr = do
f <- peek ptr >>= peekUtf8CString
fs <- peekFields (len-1) (ptr `plusPtr` (#size GuString))
return (f:fs)
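The `peekFields` loop above follows a common FFI pattern: walk a length-counted C array, advancing the pointer by the element size each step. A self-contained sketch of the same pattern using only standard `Foreign` machinery (the array here is built on the Haskell side purely for demonstration):

```haskell
import Foreign
import Foreign.C.String

-- Read n C strings from a packed array, advancing by the pointer size;
-- the same shape as the peekFields helper above.
peekCStrings :: Int -> Ptr CString -> IO [String]
peekCStrings 0 _   = return []
peekCStrings n ptr = do
  f  <- peek ptr >>= peekCString
  fs <- peekCStrings (n - 1) (ptr `plusPtr` sizeOf (undefined :: CString))
  return (f : fs)

main :: IO ()
main = do
  cs <- mapM newCString ["s", "pl"]
  fields <- withArray cs (peekCStrings 2)
  mapM_ free cs
  print fields  -- ["s","pl"]
```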
-- | BracketedString represents a sentence that is linearized
-- as usual but we also want to retain the ''brackets'' that
@@ -917,22 +1192,22 @@ type LIndex = Int
data BracketedString
= Leaf String -- ^ this is the leaf i.e. a single token
| BIND -- ^ the surrounding tokens must be bound together
| Bracket CId {-# UNPACK #-} !FId {-# UNPACK #-} !LIndex CId [BracketedString]
| Bracket CId {-# UNPACK #-} !FId String CId [BracketedString]
-- ^ this is a bracket. The 'CId' is the category of
-- the phrase. The 'FId' is a unique identifier for
-- every phrase in the sentence. For context-free grammars
-- i.e. without discontinuous constituents this identifier
-- is also unique for every bracket. When there are discontinuous
-- is also unique for every bracket. When there are discontinuous
-- phrases then the identifiers are unique for every phrase but
-- not for every bracket since the bracket represents a constituent.
-- The different constituents could still be distinguished by using
-- the constituent index i.e. 'LIndex'. If the grammar is reduplicating
-- the analysis string. If the grammar is reduplicating
-- then the constituent indices will be the same for all brackets
-- that represent the same constituent.
-- The second 'CId' is the name of the abstract function that generated
-- this phrase.
-- | Renders the bracketed string as a string where
-- | Renders the bracketed string as a string where
-- the brackets are shown as @(S ...)@ where
-- @S@ is the category.
showBracketedString :: BracketedString -> String
@@ -940,7 +1215,7 @@ showBracketedString = render . ppBracketedString
ppBracketedString (Leaf t) = text t
ppBracketedString BIND = text "&+"
ppBracketedString (Bracket cat fid index _ bss) = parens (text cat <> colon <> int fid <+> hsep (map ppBracketedString bss))
ppBracketedString (Bracket cat fid _ _ bss) = parens (text cat <> colon <> int fid <+> hsep (map ppBracketedString bss))
-- | Extracts the sequence of tokens from the bracketed string
flattenBracketedString :: BracketedString -> [String]
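A simplified, self-contained sketch of the rendering and flattening behaviour described above (the real `Bracket` constructor also carries the annotation string and the abstract function name, omitted here):

```haskell
data BracketedString
  = Leaf String                           -- a single token
  | Bracket String Int [BracketedString]  -- category, phrase id, children

-- Render in the "(Cat:fid ...)" style used by showBracketedString.
showBS :: BracketedString -> String
showBS (Leaf t) = t
showBS (Bracket cat fid bss) =
  "(" ++ cat ++ ":" ++ show fid ++ " " ++ unwords (map showBS bss) ++ ")"

-- Extract the token sequence, as flattenBracketedString does.
flattenBS :: BracketedString -> [String]
flattenBS (Leaf t)          = [t]
flattenBS (Bracket _ _ bss) = concatMap flattenBS bss

main :: IO ()
main = do
  let np = Bracket "NP" 3 [Leaf "the", Leaf "cat"]
  putStrLn (showBS np)   -- (NP:3 the cat)
  print (flattenBS np)   -- ["the","cat"]
```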
@@ -950,7 +1225,7 @@ flattenBracketedString (Bracket _ _ _ _ bss) = concatMap flattenBracketedString
bracketedLinearize :: Concr -> Expr -> [BracketedString]
bracketedLinearize lang e = unsafePerformIO $
withGuPool $ \pl ->
withGuPool $ \pl ->
do exn <- gu_new_exn pl
cts <- pgf_lzr_concretize (concr lang) (expr e) exn pl
failed <- gu_exn_is_raised exn
@@ -976,7 +1251,7 @@ bracketedLinearize lang e = unsafePerformIO $
bracketedLinearizeAll :: Concr -> Expr -> [[BracketedString]]
bracketedLinearizeAll lang e = unsafePerformIO $
withGuPool $ \pl ->
withGuPool $ \pl ->
do exn <- gu_new_exn pl
cts <- pgf_lzr_concretize (concr lang) (expr e) exn pl
failed <- gu_exn_is_raised exn
@@ -1038,19 +1313,19 @@ withBracketLinFuncs ref exn f =
token <- peekUtf8CString c_token
writeIORef ref (stack,Leaf token : bs)
begin_phrase ref _ c_cat c_fid c_lindex c_fun = do
begin_phrase ref _ c_cat c_fid c_ann c_fun = do
(stack,bs) <- readIORef ref
writeIORef ref (bs:stack,[])
end_phrase ref _ c_cat c_fid c_lindex c_fun = do
end_phrase ref _ c_cat c_fid c_ann c_fun = do
(bs':stack,bs) <- readIORef ref
if null bs
then writeIORef ref (stack, bs')
else do cat <- peekUtf8CString c_cat
let fid = fromIntegral c_fid
let lindex = fromIntegral c_lindex
ann <- peekUtf8CString c_ann
fun <- peekUtf8CString c_fun
writeIORef ref (stack, Bracket cat fid lindex fun (reverse bs) : bs')
writeIORef ref (stack, Bracket cat fid ann fun (reverse bs) : bs')
symbol_ne exn _ = do
gu_exn_raise exn gu_exn_type_PgfLinNonExist
@@ -1245,13 +1520,13 @@ instance Exception PGFError
-----------------------------------------------------------------------
type LiteralCallback =
PGF -> (ConcName,Concr) -> String -> Int -> Int -> Maybe (Expr,Float,Int)
PGF -> (ConcName,Concr) -> String -> String -> Int -> Maybe (Expr,Float,Int)
-- | Callbacks for the App grammar
literalCallbacks :: [(AbsName,[(Cat,LiteralCallback)])]
literalCallbacks = [("App",[("PN",nerc),("Symb",chunk)])]
-- | Named entity recognition for the App grammar
-- | Named entity recognition for the App grammar
-- (based on ../java/org/grammaticalframework/pgf/NercLiteralCallback.java)
nerc :: LiteralCallback
nerc pgf (lang,concr) sentence lin_idx offset =

@@ -6,6 +6,7 @@ module PGF2.FFI where
#include <gu/hash.h>
#include <gu/utf8.h>
#include <pgf/pgf.h>
#include <pgf/data.h>
import Foreign ( alloca, peek, poke, peekByteOff )
import Foreign.C
@@ -102,7 +103,7 @@ foreign import ccall unsafe "gu/file.h gu_file_in"
foreign import ccall safe "gu/enum.h gu_enum_next"
gu_enum_next :: Ptr a -> Ptr (Ptr b) -> Ptr GuPool -> IO ()
foreign import ccall unsafe "gu/string.h gu_string_buf_freeze"
gu_string_buf_freeze :: Ptr GuStringBuf -> Ptr GuPool -> IO CString
@@ -237,6 +238,16 @@ newSequence elem_size pokeElem values pool = do
pokeElem ptr x
pokeElems (ptr `plusPtr` (fromIntegral elem_size)) xs
type FId = Int
data PArg = PArg [FId] {-# UNPACK #-} !FId deriving (Eq,Ord,Show)
peekFId :: Ptr a -> IO FId
peekFId c_ccat = do
c_fid <- (#peek PgfCCat, fid) c_ccat
return (fromIntegral (c_fid :: CInt))
deRef peekValue ptr = peek ptr >>= peekValue
------------------------------------------------------------------
-- libpgf API
@@ -245,6 +256,7 @@ data PgfApplication
data PgfConcr
type PgfExpr = Ptr ()
data PgfExprProb
data PgfTokenProb
data PgfExprParser
data PgfFullFormEntry
data PgfMorphoCallback
@@ -261,6 +273,7 @@ data PgfAbsCat
data PgfCCat
data PgfCncFun
data PgfProductionApply
data PgfParsing
foreign import ccall "pgf/pgf.h pgf_read"
pgf_read :: CString -> Ptr GuPool -> Ptr GuExn -> IO (Ptr PgfPGF)
@@ -301,6 +314,9 @@ foreign import ccall "pgf/pgf.h pgf_category_context"
foreign import ccall "pgf/pgf.h pgf_category_prob"
pgf_category_prob :: Ptr PgfPGF -> CString -> IO (#type prob_t)
foreign import ccall "pgf/pgf.h pgf_category_fields"
pgf_category_fields :: Ptr PgfConcr -> CString -> Ptr CSize -> IO (Ptr CString)
foreign import ccall "pgf/pgf.h pgf_iter_functions"
pgf_iter_functions :: Ptr PgfPGF -> Ptr GuMapItor -> Ptr GuExn -> IO ()
@@ -338,7 +354,7 @@ foreign import ccall "pgf/pgf.h pgf_lzr_get_table"
pgf_lzr_get_table :: Ptr PgfConcr -> Ptr PgfCncTree -> Ptr CSizeT -> Ptr (Ptr CString) -> IO ()
type SymbolTokenCallback = Ptr (Ptr PgfLinFuncs) -> CString -> IO ()
type PhraseCallback = Ptr (Ptr PgfLinFuncs) -> CString -> CInt -> CSizeT -> CString -> IO ()
type PhraseCallback = Ptr (Ptr PgfLinFuncs) -> CString -> CInt -> CString -> CString -> IO ()
type NonExistCallback = Ptr (Ptr PgfLinFuncs) -> IO ()
type BindCallback = Ptr (Ptr PgfLinFuncs) -> IO ()
type MetaCallback = Ptr (Ptr PgfLinFuncs) -> CInt -> IO ()
@@ -361,18 +377,27 @@ foreign import ccall "wrapper"
foreign import ccall "pgf/pgf.h pgf_align_words"
pgf_align_words :: Ptr PgfConcr -> PgfExpr -> Ptr GuExn -> Ptr GuPool -> IO (Ptr GuSeq)
foreign import ccall "pgf/pgf.h pgf_parse_to_chart"
pgf_parse_to_chart :: Ptr PgfConcr -> PgfType -> CString -> Double -> Ptr PgfCallbacksMap -> CSizeT -> Ptr GuExn -> Ptr GuPool -> Ptr GuPool -> IO (Ptr PgfParsing)
foreign import ccall "pgf/pgf.h pgf_get_parse_roots"
pgf_get_parse_roots :: Ptr PgfParsing -> Ptr GuPool -> IO (Ptr GuSeq)
foreign import ccall "pgf/pgf.h pgf_ccat_to_range"
pgf_ccat_to_range :: Ptr PgfParsing -> Ptr PgfCCat -> Ptr GuPool -> IO (Ptr GuSeq)
foreign import ccall "pgf/pgf.h pgf_parse_with_heuristics"
pgf_parse_with_heuristics :: Ptr PgfConcr -> PgfType -> CString -> Double -> Ptr PgfCallbacksMap -> Ptr GuExn -> Ptr GuPool -> Ptr GuPool -> IO (Ptr GuEnum)
foreign import ccall "pgf/pgf.h pgf_lookup_sentence"
pgf_lookup_sentence :: Ptr PgfConcr -> PgfType -> CString -> Ptr GuPool -> Ptr GuPool -> IO (Ptr GuEnum)
type LiteralMatchCallback = CSizeT -> Ptr CSizeT -> Ptr GuPool -> IO (Ptr PgfExprProb)
type LiteralMatchCallback = CString -> Ptr CSizeT -> Ptr GuPool -> IO (Ptr PgfExprProb)
foreign import ccall "wrapper"
wrapLiteralMatchCallback :: LiteralMatchCallback -> IO (FunPtr LiteralMatchCallback)
type LiteralPredictCallback = CSizeT -> CString -> Ptr GuPool -> IO (Ptr PgfExprProb)
type LiteralPredictCallback = CString -> CString -> Ptr GuPool -> IO (Ptr PgfExprProb)
foreign import ccall "wrapper"
wrapLiteralPredictCallback :: LiteralPredictCallback -> IO (FunPtr LiteralPredictCallback)
@@ -398,6 +423,9 @@ foreign import ccall
foreign import ccall "pgf/pgf.h pgf_parse_with_oracle"
pgf_parse_with_oracle :: Ptr PgfConcr -> CString -> CString -> Ptr PgfOracleCallback -> Ptr GuExn -> Ptr GuPool -> Ptr GuPool -> IO (Ptr GuEnum)
foreign import ccall "pgf/pgf.h pgf_complete"
pgf_complete :: Ptr PgfConcr -> PgfType -> CString -> CString -> Ptr GuExn -> Ptr GuPool -> IO (Ptr GuEnum)
foreign import ccall "pgf/pgf.h pgf_lookup_morpho"
pgf_lookup_morpho :: Ptr PgfConcr -> CString -> Ptr PgfMorphoCallback -> Ptr GuExn -> IO ()
@@ -489,9 +517,6 @@ foreign import ccall "pgf/expr.h pgf_compute"
foreign import ccall "pgf/expr.h pgf_print_expr"
pgf_print_expr :: PgfExpr -> Ptr PgfPrintContext -> CInt -> Ptr GuOut -> Ptr GuExn -> IO ()
foreign import ccall "pgf/expr.h pgf_print_expr_tuple"
pgf_print_expr_tuple :: CSizeT -> Ptr PgfExpr -> Ptr PgfPrintContext -> Ptr GuOut -> Ptr GuExn -> IO ()
foreign import ccall "pgf/expr.h pgf_print_type"
pgf_print_type :: PgfType -> Ptr PgfPrintContext -> CInt -> Ptr GuOut -> Ptr GuExn -> IO ()
@@ -507,12 +532,6 @@ foreign import ccall "pgf/pgf.h pgf_print"
foreign import ccall "pgf/expr.h pgf_read_expr"
pgf_read_expr :: Ptr GuIn -> Ptr GuPool -> Ptr GuPool -> Ptr GuExn -> IO PgfExpr
foreign import ccall "pgf/expr.h pgf_read_expr_tuple"
pgf_read_expr_tuple :: Ptr GuIn -> CSizeT -> Ptr PgfExpr -> Ptr GuPool -> Ptr GuExn -> IO CInt
foreign import ccall "pgf/expr.h pgf_read_expr_matrix"
pgf_read_expr_matrix :: Ptr GuIn -> CSizeT -> Ptr GuPool -> Ptr GuExn -> IO (Ptr GuSeq)
foreign import ccall "pgf/expr.h pgf_read_type"
pgf_read_type :: Ptr GuIn -> Ptr GuPool -> Ptr GuPool -> Ptr GuExn -> IO PgfType
@@ -533,3 +552,6 @@ foreign import ccall "pgf/data.h pgf_lzr_index"
foreign import ccall "pgf/data.h pgf_production_is_lexical"
pgf_production_is_lexical :: Ptr PgfProductionApply -> Ptr GuBuf -> Ptr GuPool -> IO (#type bool)
foreign import ccall "pgf/expr.h pgf_clone_expr"
pgf_clone_expr :: PgfExpr -> Ptr GuPool -> IO PgfExpr

@@ -35,7 +35,8 @@ import Control.Exception(Exception,throwIO)
import Control.Monad(foldM)
import qualified Data.Map as Map
type Token = String
type Token = String
type LIndex = Int
data Symbol
= SymCat {-# UNPACK #-} !Int {-# UNPACK #-} !LIndex
| SymLit {-# UNPACK #-} !Int {-# UNPACK #-} !LIndex
@@ -53,7 +54,6 @@ data Production
= PApply {-# UNPACK #-} !FunId [PArg]
| PCoerce {-# UNPACK #-} !FId
deriving (Eq,Ord,Show)
data PArg = PArg [FId] {-# UNPACK #-} !FId deriving (Eq,Ord,Show)
type FunId = Int
type SeqId = Int
data Literal =
@@ -186,10 +186,6 @@ concrProductions c fid = unsafePerformIO $ do
fid <- peekFId c_ccat
return (PArg hypos fid)
peekFId c_ccat = do
c_fid <- (#peek PgfCCat, fid) c_ccat
return (fromIntegral (c_fid :: CInt))
concrTotalFuns :: Concr -> FunId
concrTotalFuns c = unsafePerformIO $ do
c_cncfuns <- (#peek PgfConcr, cncfuns) (concr c)
@@ -271,8 +267,6 @@ concrSequence c seqid = unsafePerformIO $ do
forms <- peekForms (len-1) (ptr `plusPtr` (#size PgfAlternative))
return ((form,prefixes):forms)
deRef peekValue ptr = peek ptr >>= peekValue
fidString, fidInt, fidFloat, fidVar, fidStart :: FId
fidString = (-1)
fidInt = (-2)

@@ -1,26 +0,0 @@
This is a binding to the new GF runtime in C.
The files are:
PGF2.hsc -- a user API similar to Python and Java APIs
PGF2/FFI.hs -- an internal module with FFI definitions for
-- the relevant C functions
HOW TO COMPILE:
cabal install
HOW TO USE:
- Import PGF to the Haskell program that you're writing.
The Cabal infrastructure will make sure to tell the compiler
where to find the relevant modules. Example:
module Main where
import PGF2
import qualified Data.Map as Map
main = do
pgf <- readPGF "Foo.pgf"
let Just english = Map.lookup "FooEng" (languages pgf)
