| Author | Commit | Message | Date |
|--------|--------|---------|------|
| | 90e54d66d0 | Removed cosched support | 2024-03-25 01:18:37 +01:00 |
| niansa | fc5e4f5aa1 | Updated llama.cpp-mainline | 2023-10-04 22:13:48 +02:00 |
| niansa | 79cf49faae | Implemented grammar sampling and zero-temperature sampling | 2023-08-31 19:37:33 +02:00 |
| niansa | 01b0d059ed | Added pre_tick | 2023-06-15 18:14:09 +02:00 |
| niansa | 0199db02b7 | Added GPU support | 2023-06-10 00:49:21 +02:00 |
| niansa | 94953cd174 | Improve some error handling macros | 2023-06-09 23:53:01 +02:00 |
| niansa | 53a4623aef | Added mirostat support | 2023-05-26 00:43:07 +02:00 |
| niansa | 5feca59be7 | Fixed linebreaks and support latest llama.cpp | 2023-05-20 02:25:46 +02:00 |
| niansa | a98784aa53 | Minor MPT improvements | 2023-05-16 23:35:42 +02:00 |
| niansa | 60fe6b9c55 | Load implementations as shared objects | 2023-05-16 19:10:05 +00:00 |
| | f3a9092ca5 | Moved gptj/utils.* to g4a-common.* | 2023-05-15 09:44:04 +02:00 |
| | 36a6fa6290 | Warn about use of exceptions with cosched properly | 2023-05-15 09:40:41 +02:00 |
| | 81c8f39732 | Added missing LM_COTHROW define | 2023-05-15 09:38:49 +02:00 |
| niansa | 087fe1396b | Fixed all other known compilation issues | 2023-05-10 21:50:37 +02:00 |
| niansa | b61c751d33 | Reverted last commit, but fixed invalid ssize_t typedef on MSVC | 2023-05-10 16:39:44 +02:00 |
| niansa | bdb87534e8 | Eliminate use of ssize_t | 2023-05-10 16:37:43 +02:00 |
| niansa | 0f8ef2fd32 | Removed if_error because it wasn't needed | 2023-05-10 12:18:42 +02:00 |
| niansa | c9d261701d | (Hopefully) fixed msvc compilation | 2023-05-10 12:14:49 +02:00 |
| niansa | 6968f3459a | Made last commit more beautiful | 2023-05-09 22:25:04 +02:00 |
| niansa | 5e666d83db | Actually implemented error return values | 2023-05-09 22:20:01 +02:00 |
| niansa | 0d5cba0530 | Added LM_NOEXCEPT cmake option | 2023-05-09 21:33:49 +02:00 |
| niansa | ca33a27e05 | Fixed a warning | 2023-05-06 01:14:24 +02:00 |
| niansa | 7076f863d4 | Check for task termination | 2023-05-05 19:02:28 +02:00 |
| | 96076d4e9b | Updated for latest cosched | 2023-05-05 10:35:41 +02:00 |
| | 58b3a278d8 | Don't use weak pointers | 2023-05-04 15:53:19 +02:00 |
| | 5a57db5e75 | Added CoSched support | 2023-05-04 15:22:32 +02:00 |
| niansa | 3c62cbebaf | Inlined InferencePool constructor and added additional cleanup functions | 2023-04-29 23:43:45 +02:00 |
| niansa | d236e36d26 | Implemented proper scrolling | 2023-04-28 18:04:07 +02:00 |
| | 17f30e0761 | Removed uninitialized GPTJInference::State::n_ctx | 2023-04-27 10:41:56 +02:00 |
| | e640f093df | Removed context size exception | 2023-04-27 10:04:03 +02:00 |
| | 493186509a | Renamed function and updated Python bindings | 2023-04-27 09:48:44 +02:00 |
| | 94e4ca5874 | Set default context size limit | 2023-04-27 09:47:23 +02:00 |
| | ca4ad5f096 | Added context window scrolling with top bar | 2023-04-27 09:45:37 +02:00 |
| | 219186f4b6 | Take const string reference instead of string view in append() | 2023-04-27 09:31:22 +02:00 |
| | 4e74517bb5 | Changed parameter types to some that make more sense | 2023-04-27 09:27:09 +02:00 |
| | 0661b2e33d | Updated for latest llama.cpp and working gpt-j implementation | 2023-04-27 08:21:02 +02:00 |
| | 5f6cf17871 | Moved pool functions into separate file | 2023-04-26 11:18:12 +02:00 |
| | 1f75673523 | Fixed compilation | 2023-04-26 11:10:25 +02:00 |
| | aad1bd9ae4 | Made Inference class virtual | 2023-04-26 10:59:24 +02:00 |
| niansa | 55a310b005 | Synced top_p default with llama.cpp | 2023-04-25 17:08:55 +02:00 |
| niansa | d88dc5ad98 | Fixed InferencePool::store_all() storing empty slots | 2023-04-25 15:13:57 +02:00 |
| niansa | 97f7f2ebc5 | Removed debugging left-over | 2023-04-23 18:24:42 +02:00 |
| niansa | 138a9bde52 | Minor fixes and Python bindings | 2023-04-23 18:06:57 +02:00 |
| niansa | 0466774286 | Fully implemented InferencePool | 2023-04-23 15:31:16 +02:00 |
| niansa | 414554c69a | Use ios::binary for file streams | 2023-04-23 12:40:29 +02:00 |
| niansa | cf2fec84e0 | Added experimental InferencePool | 2023-04-23 12:31:39 +02:00 |
| niansa | e85556d94d | Added Inference::get_prompt | 2023-04-20 21:34:20 +02:00 |
| niansa | 677ac470e7 | Load/Restore prompt | 2023-04-20 21:32:45 +02:00 |
| niansa | 06d99a6950 | Allow generation without end string and fixed Python bug | 2023-04-18 01:17:32 +02:00 |
| niansa | 80d33df458 | Added is_valid to Savestate | 2023-04-17 22:55:28 +02:00 |