Say I have a large .txt or CSV file with data I want to search, and say I have several such files.
What is the best way to index this data and make it searchable? I've been using grep, but it is not ideal.
Is there a self-hostable Docker container for indexing and searching this? Or should I use SQL instead?
You can import CSV files directly into an SQLite database. Then you are free to run whatever SQL queries or manipulations you want. In my experience both Python and Go have great SQLite libraries, and I'd be surprised if there is any language that doesn't have a decent one.
If you are running Linux, most distros ship a fairly powerful SQLite GUI browser in their repos.
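As a minimal sketch of the CSV-to-SQLite route using only Python's standard library (the file contents, table name, and column names here are made up for illustration; point `csv.reader` at your real files instead of the inline string):

```python
import csv
import io
import sqlite3

# Stand-in for one of your CSV files (hypothetical data).
csv_text = "name,city\nAlice,Berlin\nBob,Oslo\n"

conn = sqlite3.connect(":memory:")  # use a file path instead to persist the index

reader = csv.reader(io.StringIO(csv_text))  # for real files: csv.reader(open("data.csv", newline=""))
header = next(reader)
placeholders = ", ".join("?" for _ in header)
conn.execute(f"CREATE TABLE people ({', '.join(header)})")
conn.executemany(f"INSERT INTO people VALUES ({placeholders})", reader)
conn.commit()

# Any SQL search or manipulation now works:
hits = conn.execute("SELECT name FROM people WHERE city = ?", ("Berlin",)).fetchall()
print(hits)  # → [('Alice',)]
```

If you'd rather skip writing code entirely, the `sqlite3` command-line shell can also do the import in two lines (`.mode csv` then `.import data.csv people`).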
Excel / OnlyOffice?
I love self-hosted tools, but you can do a lot in a spreadsheet.
Btw, if the files are not too large, you can run SQL-style queries on them without hosting a database at all, just by loading them into Pandas. This avoids the problem of updating entries and handling migrations in case the CSVs change over time.
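A small sketch of that approach (column names and data are hypothetical; use `pd.read_csv("data.csv")` on your actual files). One caveat: Pandas doesn't parse full SQL itself — `DataFrame.query()` uses its own expression syntax — but it covers most of what you'd put in a WHERE clause:

```python
import io
import pandas as pd

# Stand-in CSV data (hypothetical columns).
csv_text = "name,city\nAlice,Berlin\nBob,Oslo\n"
df = pd.read_csv(io.StringIO(csv_text))  # pd.read_csv("data.csv") for real files

# Filter rows much like SELECT ... WHERE city = 'Berlin':
hits = df.query("city == 'Berlin'")
print(hits["name"].tolist())  # → ['Alice']
```

Everything lives in memory, so when a CSV changes you just reload it — no schema migrations to manage.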
Importing that data into an RDBMS would be ideal. I'd use PostgreSQL for this, but any other would work.