request or news (I can look) save to database for docker #265
Comments
Hi. Have you considered having the container share a volume with the host? That way the locale catalogs saved by Rosetta will be persisted on the host.
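The shared-volume suggestion can be sketched with a bind mount in Docker Compose. This is an illustrative config fragment only; the service name, image, and paths are hypothetical, and the container path must match wherever your Django `LOCALE_PATHS` points:

```yaml
# docker-compose.yml (illustrative sketch): bind-mount the locale
# directory so the .po/.mo catalogs Rosetta writes inside the
# container persist on the host across container recreations.
services:
  web:
    image: my-django-app        # hypothetical image name
    volumes:
      - ./locale:/app/locale    # host path : container path holding .po/.mo files
```

With this mount in place, files Rosetta saves under `/app/locale` survive a container rebuild, at the cost of tying the catalogs to a single host.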
What are you using to create or edit the translations stored in the database at run time?
Are you able to use volumes on your deploy platform? (For example, you cannot with the DigitalOcean App Platform.)
Hello, I apologize for the delay. To answer the question, here is how I proceed with Laravel and Docker:
The translations are generated as volatile files inside the container, which keeps them always up to date. If an update is made, the volatile files are updated, and at the next build of the container the new translations are applied directly at launch.
To answer the question about a shared volume: it does not seem like a good idea to me, because it leaves no possibility to scale the application across several nodes. The only viable way would be to store the files on Amazon S3 or MinIO. That implies exporting the translation files to S3 and fetching them back after the Docker build.
But thinking about it, storing them on S3 could be a solution. Adding a command that generates the translations and pushes the .po files to S3 would work; it would then be enough to retrieve them afterwards. 🧐
Using S3 as a translation storage for Rosetta is being discussed in #246.
Perfect, thank you. I'm looking at this.
Hi, I just came across your app, which is exactly what I'm looking for. However, I'm working with Docker, and since it's the client that browses and translates as it goes along, I can't save the .po/.mo files in the container, because they disappear immediately.
Is there an implementation in development to save the translations in the database?
Would this be a good idea for integration with your project?
In a Laravel application I developed, I used a library that reads the translations and stores them in a database table for translation, which allows this workflow:
- I create my container.
- I look up the completed translations in the table.
- I generate the translation files.
- The translations exist for the lifetime of the container.
- The administrators create new translations, which are saved in the database.
- I update the .po files.
- The new translations are used.
Scaling:
- A new container is created.
- The workflow runs with the existing translations + the new translations.
Destroying all containers and recreating them:
- The workflow runs with the existing translations + the new translations.
Thanks in advance.