Elasticsearch doesn't care about timezone and creates indexes with UTC #7375
Sorry @3h4x, but I don't understand what you mean. Could you explain in more detail, please?
No worries. Every midnight, logrotate saves a new file on our servers. When, for whatever reason, we want to delete an Elasticsearch index, recreate it, and forward the logs again, I have to grep like crazy because the rotated log doesn't line up with the index.
OK - more details still :) Are you talking about index names? the
Okie dokie. Yes, there is a reason why I don't manipulate
@3h4x To ES, the index name is arbitrary. It's logstash that decides how to name the indices, and it does so in UTC. It seems the logstash team has already discussed this and voted not to implement it. Perhaps you can share your use case there and see whether that helps get this implemented? https://logstash.jira.com/browse/LOGSTASH-973 Since this is not really an ES issue, I'll close it for now.
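To illustrate the mismatch discussed above, here is a minimal Python sketch (the index prefix and the UTC+2 offset are made up for illustration): an event logged just after local midnight lands in the previous day's index when the name is derived from UTC, as logstash does, but in the current day's index when derived from local time, as daily logrotate expects.

```python
from datetime import datetime, timedelta, timezone

def index_name(ts, tz):
    """Daily index name, e.g. 'logstash-2014.08.21', for timestamp ts rendered in timezone tz."""
    return ts.astimezone(tz).strftime("logstash-%Y.%m.%d")

# An event logged shortly after local midnight in UTC+2:
local = timezone(timedelta(hours=2))
event = datetime(2014, 8, 21, 0, 30, tzinfo=local)

utc_index = index_name(event, timezone.utc)  # how logstash names it (UTC)
local_index = index_name(event, local)       # what daily log rotation expects

print(utc_index)    # logstash-2014.08.20
print(local_index)  # logstash-2014.08.21
```

The two names differ for any event falling between local midnight and UTC midnight, which is exactly the window that produces the "holes and duplicates" described below.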
@bleskes thanks for your input. I'll try to lobby it there ✌️
But I guess it's something different if you need this functionality for dynamic filenames when using the csv output module.
This could benefit all people using ELK.
Our logs are rotated daily, and this should match the Elasticsearch index. But when we do it now, we get holes and duplicates :( Deleting the index and simply feeding logstash the data again should be fine and dandy, but there is a timezone mismatch now.
There could be a configuration option to set a default timezone, so that the indexes would follow the log rotation.
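In the meantime, one commonly suggested workaround is to compute a local-date field with a ruby filter and use it in the index name instead of the UTC-based `%{+YYYY.MM.dd}` sprintf format. This is only a sketch: the `+02:00` offset and the `local_day` field name are placeholders, and the exact event API (`event['...']` vs. `event.get`/`event.set`, and whether `@timestamp` needs unwrapping via `.time`) varies across Logstash versions, so check against your version's documentation.

```
filter {
  ruby {
    # Render @timestamp in the local zone (placeholder offset +02:00)
    # and store the day as a plain event field.
    code => "event['local_day'] = event['@timestamp'].time.localtime('+02:00').strftime('%Y.%m.%d')"
  }
}

output {
  elasticsearch {
    # Plain field reference, so no UTC date formatting is involved.
    index => "logstash-%{local_day}"
  }
}
```

With this, the daily index boundary follows local midnight, matching a daily logrotate schedule, at the cost of an extra field on every event.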
All the best!