Em Smith wrote:
> My guess is that you could change the curl line to:
I found that curl needs the "-f" option to get reliable exit codes (without it, e.g. an unauthorized response still exits 0).
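For example (a quick sketch, untested here; user:pass and the hostname are placeholders):

curl -f --silent --data @file.import.json 'http://user:pass@wetek_play2:9981/api/dvr/entry/create'
echo $?   # with -f, an HTTP error such as 401 Unauthorized exits 22 instead of 0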
> So after a successful import, the json file is no longer needed and can be removed.
Yes, eventually, but for testing it may be better to keep them for a while; I move them to a separate dir and mark them as done.
> Maybe also add the "--max-time 60" to the curl to timeout after 60 seconds in case you turn off the other machine at the wrong time.
Yes, even 30 seconds would probably be enough on a local network.
> So, then you need to find all the json files that exist (since they failed to import) and try to import them again:
Instead of retrying from the recording play, I would rather just run the retries on the play2 when it starts; see below.
> Finally, before running the first time, you need to remove the json files from your old imports (otherwise they'd import again as failed imports).
Actually it is worse than that: they would import just fine, since there is no duplicate checking. Then you have the problem of 2 (or more) duplicate entries pointing to the same recording file, and if you remove one, the recording itself gets removed (deleted!!!)... I found this out with my trial and error testing :) At that point you could move the duplicate entries to failed recordings, or you would have to stop tvheadend, remove all but one of the duplicate dvr/log files, then restart tvheadend. (It is helpful to have the uuids - again, see below.)
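If you do end up hunting duplicates, something like this can help (a rough sketch; the config path and the recording path are assumptions to adjust for your setup):

# each file in dvr/log is named by its entry uuid and contains the recording path
cd /home/hts/.hts/tvheadend/dvr/log   # adjust to your tvheadend config dir
grep -l '/path/to/recordings/show.ts' *   # lists the uuids of all entries pointing at that file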
In my case, I use a fixed "json" directory to put all the import related files. (/record/tvh/json)
Then my export script creates the .import.json files there, and rsync copies them to my other server, where my import script sees them.
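The rsync step is just something like this (untested as written; "otherserver" is a placeholder for the import machine):

rsync -av /record/tvh/json/ otherserver:/record/tvh/json/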
In your case, you can make a "json" directory in the base directory of your recordings, then modify the end of your export script to something like:
# Use logfile to create json data for import
dir="/path/to/recordings/json/"   # note important trailing /
filename=${file##*/}              # get just the filename
printf "conf=" | cat - "$logfile" > "$dir$filename".import.json
# wait to make sure the network file is written
sleep 5
# Create recording entry on play2 (-f so HTTP errors exit non-zero, --max-time in case the other machine is off)
curl -f --silent --max-time 60 --data @"$dir$filename".import.json 'http://user:pass@wetek_play2:9981/api/dvr/entry/create' > "$dir$filename".reply
if grep -q uuid "$dir$filename".reply; then
    echo "worked"
    mv "$dir$filename".import.json "$dir$filename".import.json.done
else
    echo "failed"
fi
If successful, the .reply file will contain the uuid of the NEWLY created entry on the play2 (it will be different from the uuid of the original recording on the play). (Also, if accidentally run again, it will create another entry with yet another new uuid...)
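The reply is a small piece of json; on success it looks roughly like this (the uuid value here is just made up):

{"uuid":"d41d8cd98f00b204e9800998ecf8427e"}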
Looks like you can have an "autostart.sh" script
https://www.wetekforums.com/v/discussion/20882/startup-boot-script
In that script add a line at the end with something like:
nohup /storage/.config/import.sh &
And make an import.sh script in /storage/.config:
#!/bin/bash
# wait to give time for tvheadend to start
sleep 180   # 3 min - not sure how much time is really needed???
cd /path/to/recordings/json
for json_file in *.import.json
do
    # -f so HTTP errors exit non-zero, --max-time in case the server is unreachable
    curl -f --silent --max-time 60 --data @"$json_file" 'http://user:pass@wetek_play2:9981/api/dvr/entry/create' > "$json_file".reply
    if grep -q uuid "$json_file".reply; then
        # worked
        mv "$json_file" "$json_file".done
    else
        # failed
        mv "$json_file" "$json_file".failed   # so it is not tried again
    fi
done
The last part (the .failed rename) is optional, but if it fails again there may be something wrong with that file, so it's best not to keep retrying it.
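If 3 minutes turns out to be wrong, an alternative to the fixed sleep (untested sketch; the credentials and the /api/serverinfo endpoint are assumptions to check against your setup) is to poll until tvheadend actually answers:

# poll for up to ~5 minutes until tvheadend's web interface responds
for i in $(seq 1 60); do
    curl -f --silent --max-time 5 'http://user:pass@wetek_play2:9981/api/serverinfo' > /dev/null && break
    sleep 5
done

To test the whole thing, you can also just run /storage/.config/import.sh once by hand before wiring it into autostart.sh.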
Again, untested as my scripts are different due to different setup.
Also, I wrote this post in pieces while I was working on a few different things, so quite possibly there are some errors. :)
But in addition to Em Smith's post it should be a good start.