This repository has been archived by the owner on Jun 10, 2020. It is now read-only.
Assuming an unstable connection: if the final request of a harvest (I think?), the one that sets the state to ready, fails, the recording is left in an aborted state.
Rerunning the harvester does not fix an aborted recording; instead it raises an exception like the one below. This breaks our eventual-consistency, run-it-till-it-works philosophy.
Queue: harvest_production
Class: BawWorkers::Harvest::Action
Arguments: {:file_path=>"/home/ubuntu/bioacoustics/qcif_storage_nfs/data_prod/harvester_to_do/Brad Law/Koalas in forests/Koalas in forests/K08/Data/K8_20160907_203648.wav", :file_name=>"K8_20160907_203648.wav", :extension=>"wav", :access_time=>"2016-10-01T01:58:38.225+00:00", :change_time=>"2016-09-28T06:33:44.924+00:00", :modified_time=>"2016-09-28T06:33:44.924+00:00", :data_length_bytes=>158672108, :project_id=>1062, :site_id=>1403, :uploader_id=>9, :utc_offset=>"+10", :metadata=>nil, :raw=>{:year=>"2016", :month=>"09", :day=>"07", :hour=>"20", :min=>"36", :sec=>"48", :offset=>"", :ext=>"wav"}, :recorded_date=>"2016-09-07T20:36:48.000+10:00", :prefix=>"K8_", :separator=>"_", :suffix=>"", :file_rel_path=>"Koalas in forests/K08/Data/K8_20160907_203648.wav", :copy_on_success=>false}
The error was:
Error: Request to create audio recording from /home/ubuntu/bioacoustics/qcif_storage_nfs/data_prod/harvester_to_do/Brad Law/Koalas in forests/Koalas in forests/K08/Data/K8_20160907_203648.wav failed: Code 422, Message: Unprocessable Entity, Body: {"meta":{"status":422,"message":"Unprocessable Entity","error":{"details":"Record could not be saved","info":{"file_hash":["has already been taken"]}}},"data":null}
Associated DB record:
"id","uuid","uploader_id","recorded_date","site_id","duration_seconds","sample_rate_hertz","channels","bit_rate_bps","media_type","data_length_bytes","file_hash","status","notes","creator_id","updater_id","deleter_id","created_at","updated_at","deleted_at","original_file_name","recorded_utc_offset"
"355851","f6b5dcf2-2e5b-4057-a563-fb9d08b31201","9","2016-09-07 10:36:48","1403","3596.0000","22050","1","352800","audio/wav","158672108","SHA256::0e69a1b6a7baaa00a7e1e7cb3ad0de0269aa9ad4772d267e98f4a5bb9245fcdc","aborted","{""relative_path"":""Koalas in forests/K08/Data/K8_20160907_203648.wav"",""duration_adjustment_for_overlap"":[{""changed_at"":""2016-10-01T01:58:50Z"",""overlap_amount"":1.978,""old_duration"":3597.978,""new_duration"":3596.0,""other_uuid"":""950a7fc5-91e1-4bfa-8342-1430d8027c2f""}]}","2","2","","2016-10-01 01:58:50.980307","2016-10-01 01:58:51.464595","","K8_20160907_203648.wav",""
More information: the database record seems fully formed; however, the copy of the source file into the original audio directory did not happen.
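The failure mode above suggests the re-run needs to be idempotent: when a fully formed record with the same file_hash already exists in the aborted state, resume it (copy the file, flip the status to ready) rather than POSTing a new record and hitting the 422. A minimal sketch of that behaviour, assuming a toy in-memory store; every name here (`Store`, `find_by_hash`, `create!`, `harvest`) is illustrative, not the harvester's real API:

```ruby
# Toy model of the duplicate-file_hash problem; NOT the real harvester API.
Recording = Struct.new(:file_hash, :status, keyword_init: true)

class Store
  def initialize(records = [])
    @records = records
  end

  def find_by_hash(hash)
    @records.find { |r| r.file_hash == hash }
  end

  # Mirrors the server's uniqueness validation on file_hash (the 422 above).
  def create!(hash)
    raise 'Code 422: file_hash has already been taken' if find_by_hash(hash)
    @records << Recording.new(file_hash: hash, status: 'new')
    @records.last
  end
end

# An idempotent harvest: resume an aborted record rather than re-create it.
def harvest(store, hash)
  existing = store.find_by_hash(hash)
  return existing if existing&.status == 'ready' # already harvested: no-op
  if existing && existing.status == 'aborted'
    existing.status = 'ready' # finish the interrupted final step
    existing
  else
    store.create!(hash)
  end
end
```

With this shape, rerunning the harvester on the same file converges to ready instead of raising, which is the run-it-till-it-works property the issue asks for.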
That method specifically queries for aborted audio recordings using a set of attributes. If exactly one row in the table matches, it resumes.
I'd suggest checking whether the attributes of the audio recording that failed match exactly one audio_recording row. If they do, then there's a bug in that build_by_file_hash method.
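For reference, the "exactly one aborted match" check described above might look roughly like this. The attribute names mirror the DB columns, but the body is a hedged guess at the shape of the logic, not the actual implementation of build_by_file_hash:

```ruby
# Hedged sketch of an "exactly one aborted match" resume check;
# the real build_by_file_hash lives in the harvester and may differ.
def find_resumable(recordings, attrs)
  matches = recordings.select do |rec|
    rec[:status] == 'aborted' && attrs.all? { |k, v| rec[k] == v }
  end
  case matches.size
  when 0 then nil             # nothing to resume; safe to create anew
  when 1 then matches.first   # unambiguous: resume this record
  else raise 'multiple aborted recordings match; refusing to resume'
  end
end
```

If the failed recording's attributes (file_hash, site_id, recorded_date, data_length_bytes, and so on) match exactly one row and the exception still fires, the bug is inside that query rather than in the data.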