Commit Graph

28 Commits

SHA1 Message Date
cf2ee082b1 Merge branch 'main' of https://github.com/splunk/AIHoneypot 2024-11-07 13:42:02 -05:00
ff848b44ac Create LICENSE
Added MIT license
2024-11-07 13:37:02 -05:00
e643ac344d Updated TODO 2024-08-26 14:36:32 -04:00
a73fefa9c4 Moved SSH honeypot to subdirectory 2024-08-26 14:31:52 -04:00
2461b42e40 Improved end-of-session handling
Rather than explicitly checking whether the user typed a shell exit command, the LLM is now instructed to emit a specific token ("XXX-END-OF-SESSION-XXX") to indicate that the session should be closed. This allows the user to exit the shell in any way they see fit, and the LLM will still know when to end the session. It also means that typing 'exit' or similar commands in subshells or command interpreters (e.g. Python) is less likely to cause the session to end.
2024-08-23 15:28:42 -04:00
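A minimal sketch of how that token-based termination might look, assuming the honeypot inspects each LLM reply before echoing it to the client; the SESSION_END_TOKEN constant and handle_llm_reply() helper are illustrative names, not taken from the repository:

```python
# Illustrative sketch (not the repository's code): detect the LLM's
# end-of-session marker instead of parsing the user's exit command.
SESSION_END_TOKEN = "XXX-END-OF-SESSION-XXX"

def handle_llm_reply(reply: str) -> tuple[str, bool]:
    """Return the text to send to the client and whether to close the session."""
    if SESSION_END_TOKEN in reply:
        # Strip the marker so the attacker never sees it, then signal shutdown.
        return reply.replace(SESSION_END_TOKEN, "").rstrip(), True
    return reply, False
```

Scanning the model's output rather than the attacker's input keeps the decision with the LLM, which can tell whether 'exit' was typed at the emulated login shell or inside a nested interpreter.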
ed95eda824 Improved honeypot logging
Logs now include the protocol (SSH) and the src/dest IPs and ports on each log line.
2024-08-23 13:52:36 -04:00
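One way to attach that per-connection context to every log line with Python's standard logging module is a LoggerAdapter; the format string and field names below are assumptions, not the repository's actual log layout:

```python
import logging

# Assumed log layout: protocol plus src -> dst on every record.
logging.basicConfig(
    format="%(asctime)s %(levelname)s %(protocol)s %(src)s -> %(dst)s %(message)s",
    level=logging.INFO,
)

def session_logger(src_ip, src_port, dst_ip, dst_port):
    # LoggerAdapter injects the connection details into every record it emits.
    extra = {
        "protocol": "SSH",
        "src": f"{src_ip}:{src_port}",
        "dst": f"{dst_ip}:{dst_port}",
    }
    return logging.LoggerAdapter(logging.getLogger("honeypot"), extra)

log = session_logger("203.0.113.7", 52144, "10.0.0.5", 22)
log.info("login attempt username=root password=admin")
```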
e2854e960c Now log passwords for any login attempt (failures and successes). 2024-08-23 12:41:51 -04:00
4029df5cdd Removed a redundant import 2024-08-23 11:48:08 -04:00
b49e743e7c Cleaned up and organized imports 2024-08-23 11:43:15 -04:00
a180bb58a2 Minor code formatting changes 2024-08-23 11:18:14 -04:00
e385b8a4bb Removed extraneous debug prints 2024-08-22 15:39:49 -04:00
7e38c43dee Experimental support for changing LLM providers and models in the config file. 2024-08-22 14:39:47 -04:00
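A hedged sketch of what provider/model selection from the config file could look like; the [llm] section, key names, and the use of langchain_openai's ChatOpenAI are assumptions rather than the project's actual wiring:

```python
import configparser

from langchain_openai import ChatOpenAI  # assumed provider; others would plug in similarly

def build_llm(path: str = "config.ini"):
    # Read the provider and model from an assumed [llm] section of config.ini.
    cfg = configparser.ConfigParser()
    cfg.read(path)
    provider = cfg.get("llm", "provider", fallback="openai")
    model = cfg.get("llm", "model", fallback="gpt-4o")
    if provider == "openai":
        return ChatOpenAI(model=model)
    raise ValueError(f"Unsupported LLM provider: {provider}")
```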
df203a7a55 Log both successful and failed login attempts 2024-08-20 14:44:30 -04:00
c57cb0a240 Cleaned up sample prompt files. 2024-08-20 11:52:37 -04:00
8bb4cb3393 New file: TODO.txt 2024-08-20 09:43:54 -04:00
656872ab2c Initial SSH emulation prompt. 2024-08-20 09:17:19 -04:00
b72acb81be Changed name to HADES 2024-08-20 09:15:08 -04:00
ba5713d94c Now uses config.ini for all configuration parameters. 2024-08-16 17:11:41 -04:00
f84d0b2d37 Converted start_server() to a more modern idiom granting better control of server parameters. 2024-08-16 15:09:07 -04:00
0f5c4d1f69 Implement chat message history trimming to avoid overflowing the LLM context window. 2024-08-16 11:34:29 -04:00
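A rough sketch of one trimming strategy, keeping the system prompt plus the most recent turns; the message representation and the turn limit are assumptions, not the repository's implementation:

```python
def trim_history(messages, max_turns: int = 20):
    """Keep the system prompt plus only the most recent messages.

    `messages` is assumed to be a list of (role, content) tuples, e.g.
    [("system", "..."), ("human", "ls"), ("ai", "bin  etc  home")].
    """
    system = [m for m in messages if m[0] == "system"]
    rest = [m for m in messages if m[0] != "system"]
    # Dropping the oldest non-system turns bounds the context sent to the LLM.
    return system + rest[-max_turns:]
```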
c40444a6cc Command output now logged as base64 string to avoid multiline issues. 2024-08-16 09:13:41 -04:00
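A small sketch of the base64 approach, so multi-line command output can never split a log record; the helper name is illustrative:

```python
import base64
import logging

def log_command_output(log: logging.Logger, command: str, output: str) -> None:
    # Base64-encode the (possibly multi-line) fake command output so the
    # whole record stays on a single log line.
    encoded = base64.b64encode(output.encode("utf-8")).decode("ascii")
    log.info("command=%r output_b64=%s", command, encoded)
```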
eb4a67f094 Added a couple of conveniences:
* Moved the system prompt to its own file
* Added 'exit/logout/quit' support to realistically end the SSH session.
2024-08-15 16:50:42 -04:00
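A sketch of the exit-command matching this commit describes (later superseded by the end-of-session token in 2461b42e40); the names are illustrative:

```python
# Illustrative only: close the session when the user types a known exit command.
EXIT_COMMANDS = {"exit", "logout", "quit"}

def should_close_session(command: str) -> bool:
    return command.strip().lower() in EXIT_COMMANDS
```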
092ac94b05 Now a functional prototype with an LLM backend.
* Added langchain support (OpenAI's gpt-4o model)
* Created a system prompt that gives functional results
* Initial integration of logging for LLM responses (needs improvement)
2024-08-15 15:44:54 -04:00
759814f8c9 Added requirements.txt 2024-08-15 13:33:08 -04:00
62178679c6 Each SSH session now gets a unique ID in the log 2024-08-15 13:08:45 -04:00
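A sketch of per-session ID tagging using uuid; the function and logger names are illustrative, not the repository's:

```python
import logging
import uuid

log = logging.getLogger("honeypot")

def start_session(peer_addr: str) -> str:
    # A short random ID tags every log line belonging to one SSH session,
    # so concurrent sessions can be told apart in the shared log.
    session_id = uuid.uuid4().hex[:8]
    log.info("session=%s new connection from %s", session_id, peer_addr)
    return session_id
```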
f95ad39f32 Added centralized logging 2024-08-15 12:17:48 -04:00
12af949915 Initial commit of barebones SSH server. 2024-08-15 11:16:37 -04:00
e7d9a8ede6 Initial commit 2024-08-15 10:55:00 -04:00