====== DaVinci Resolve in the Cloud ======

  
I'm trying to keep up with my notes on [[:video_editing|Video Editing]] and [[video editing:DaVinci Resolve]]. Hardware choices for DaVinci Resolve are a subject for a small book (which needs updating every 6 months), but I've summarised my notes under [[Video Editing:DaVinci Resolve Specs]].

===== Summary =====

My questions were:

> Can I use cloud computing to run DaVinci Resolve?
> At what point does it become cheaper to buy a new laptop for "editing on the go" than to use cloud computing?

This came from the fact that I had an older (but not that old) laptop which could no longer keep up with running DaVinci Resolve (at least not comfortably). I didn't need to upgrade it for any productivity outside of running DaVinci Resolve.

Laptop prices showed that a laptop with an Nvidia card that would run DaVinci Resolve confidently for a few years would cost about £3,000. If I went with Apple, the new M1 chip brings the cost down: the MacBook Air comes in at just over £1,000 (or just under £1,500 for a more powerful MacBook Pro with the M1 chip).

After running some numbers (just for DaVinci Resolve), I calculated (at least for UK prices) that a £1,000 laptop used 10 days a year over a 5-year lifetime works out at £20 a day. The worst case is a £3,000 laptop used 5 days a year over a 3-year lifetime, which averages out at £200 a day. That's a lot of variance!

> If I can find a way of working which comes to less than £20 ($27.36) a day, then I'm on par with my cheapest solution for my predicted usage.
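The per-day arithmetic above can be sketched as a quick shell check (the helper name ''cost_per_day'' is just for illustration; both examples divide evenly, so integer arithmetic is enough):

```shell
# Per-day cost = purchase price / (days used per year * years of lifetime)
cost_per_day() {
  echo $(( $1 / ($2 * $3) ))
}

cost_per_day 1000 10 5   # 1000 / 50 -> 20  (pounds per day, best case)
cost_per_day 3000 5 3    # 3000 / 15 -> 200 (pounds per day, worst case)
```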

This can be achieved by using one of:
  * ''g4dn.2xlarge'' - [[https://aws.amazon.com/marketplace/pp/prodview-xrrke4dwueqv6|NVIDIA Gaming PC - Windows Server 2019]]
  * ''g4dn.4xlarge'' - [[https://aws.amazon.com/marketplace/pp/prodview-zzy5tef4cq6sg|DaVinci Resolve 17 Studio]]
  
===== Note =====

This looks at using powerful Windows virtual machines in the cloud. They will run both versions of DaVinci Resolve (Free and Studio), though Studio will be able to use more of the fancy GPU features. A warning from personal experience: getting a dongle to work remotely is hard/impossible (I couldn't do it), and I've been unable to test with a licence key (so much for the dongle for when I'm working without internet).

This mostly looks at two EC2 instances:
  * [[https://aws.amazon.com/marketplace/pp/prodview-xrrke4dwueqv6|NVIDIA Gaming PC - Windows Server 2019]]
  * [[https://aws.amazon.com/marketplace/pp/prodview-zzy5tef4cq6sg|DaVinci Resolve 17 Studio]] - New as of June 2021 (I couldn't find much extra information on this; [[https://web.archive.org/web/*/https://aws.amazon.com/marketplace/pp/prodview-zzy5tef4cq6sg|WayBackMachine]] first found the page on June 8th 2021).
  
===== Scenario =====
  
To compare, I looked at the EC2 prices based on the Paris region:

[[https://aws.amazon.com/marketplace/pp/prodview-xrrke4dwueqv6|NVIDIA Gaming PC - Windows Server 2019]]:
^              Model|  g4dn.xlarge  |  g4dn.2xlarge  |  g4dn.4xlarge  |
^              Cores|  4            |  8             |  16            |
  
Using the 16 core machine for 12 hours (a marathon day), I'm keeping (just) below the $27.36/£20 a day price I'm aiming for, although tax and data storage might push it over. So, to remain cost-effective I'm better off either using it for a shorter period of time or dropping down to a smaller machine. Fortunately, using cloud resources makes this scaling up and down easy.

[[https://aws.amazon.com/marketplace/pp/prodview-zzy5tef4cq6sg|DaVinci Resolve 17 Studio]]:
^              Model|  g4dn.xlarge  |  g4dn.2xlarge  |  g4dn.4xlarge  |  g4dn.8xlarge  |
^              Cores|  N/A          |  N/A           |  16            |  32            |
^        Memory (GB)|  N/A          |  N/A           |  64            |  128           |
^    Hour Cost (USD)|  N/A          |  N/A           |  1.409         |  2.546         |
^   8 Hour Day (USD)|  N/A          |  N/A           |  11.272        |  20.368        |
^  10 Hour Day (USD)|  N/A          |  N/A           |  14.09         |  25.46         |
^  12 Hour Day (USD)|  N/A          |  N/A           |  16.908        |  30.552        |

Using the 16 core machine for 12 hours (a marathon day), I'm keeping well below the $27.36/£20 a day price I'm aiming for. I could step up to the more powerful instance for shorter bursts of productivity if the requirement was there.
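As a sanity check on the table above, the hourly rate can be multiplied out against the $27.36 daily target (rates taken from the table; ''awk'' is used just for the floating-point arithmetic):

```shell
# g4dn.4xlarge at 1.409 USD/hour for a 12 hour day, vs the 27.36 USD/day target
awk -v rate=1.409 -v hours=12 -v target=27.36 'BEGIN {
  cost = rate * hours
  printf "%.3f USD: %s target\n", cost, (cost < target ? "under" : "over")
}'
# prints: 16.908 USD: under target
```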
  
When you stop (turn off) the virtual machine, you stop paying the prices above. Instead you only pay for the storage:
====== AWS DaVinci Resolve Workflow Reconnect ======
  
===== Upload new files =====

Use a .bat script to copy the files from the local machine to s3.
  * https://superuser.com/questions/444726/windows-how-to-add-batch-script-action-to-right-click-menu
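Following the superuser link above, the right-click entry boils down to a small registry file. A sketch only: the key name ''PushToS3'', the menu label, and the script path are all assumptions to adapt to your setup:

```
Windows Registry Editor Version 5.00

[HKEY_CLASSES_ROOT\Directory\shell\PushToS3]
@="Push projects to S3"

[HKEY_CLASSES_ROOT\Directory\shell\PushToS3\command]
@="cmd /c \"C:\\scripts\\push_projects_to_aws.bat\""
```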
  
<file bash push_projects_to_aws.bat>
@ECHO OFF
REM ===========================================================================
REM Set common variables
REM project_list is whatever directories you wish to copy from the local
REM projects directory up to s3
REM ===========================================================================
SET aws_s3_bucket=drh-video1
SET drive_location="C:\Users\david\Pictures\Adventures In Cloud Computing"

SET project_list=dir_name_1 dir_name_2

REM remove quotes
SET drive_location=%drive_location:"=%

REM ===========================================================================
REM Set variables used in script
REM ===========================================================================

REM SET local_project_location=%drive_location%\projects
SET local_project_location=%drive_location%
SET s3_project_location=s3://%aws_s3_bucket%/projects

echo Local project location: %local_project_location%
echo    S3 project location: %s3_project_location%

REM ===========================================================================
REM Push projects up
REM ===========================================================================

FOR %%i in (%project_list%) do (
echo Project: %%i
aws s3 sync "%local_project_location%\%%i" %s3_project_location%/%%i
)

pause
</file>

Remember before connecting:
  - Check new IP/DNS Name
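The IP/DNS check can be scripted from a local terminal. A sketch assuming the AWS CLI is configured and you know the instance id (''i-0123456789abcdef0'' below is a placeholder):

```shell
# Look up the current public DNS name after a start; without an Elastic IP
# it changes on every stop/start. The instance id is a placeholder.
aws ec2 describe-instances \
  --instance-ids i-0123456789abcdef0 \
  --query 'Reservations[0].Instances[0].PublicDnsName' \
  --output text
```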

===== Create D drive =====
The second disk isn't kept when the instance is stopped, but it can be recreated using a script.

After connecting:
  - Format/create hard disk: [[https://docs.microsoft.com/en-us/windows-server/administration/windows-commands/diskpart-scripts-and-examples]]
  
<file bash script_diskpart.txt>
select disk 1
clean
create partition primary
format quick
assign letter=d
</file>

<file bash d_drive_format.bat>
diskpart /s script_diskpart.txt
pause
</file>
  
===== Copy files from projects in s3 =====
  
<file bash copy_to_d.bat>
@ECHO OFF
REM ===========================================================================
REM Set common variables
REM ===========================================================================
SET aws_s3_bucket=drh-video1
SET drive_location=D:\projects
REM SET drive_location=C:\Users\david\projects

SET project_list=OVFM logo

REM ===========================================================================
REM Set variables used in script
REM ===========================================================================

SET local_project_location=%drive_location%
SET s3_project_location=s3://%aws_s3_bucket%/projects

echo Local project location: %local_project_location%
echo    S3 project location: %s3_project_location%

REM ===========================================================================
REM List available projects
REM ===========================================================================
echo ===========================================================================
echo Available Projects
echo ===========================================================================
aws s3 ls %s3_project_location%/

REM ===========================================================================
REM Create some local directories
REM ===========================================================================

mkdir %drive_location%
mkdir D:\MEDIA

REM ===========================================================================
REM Pull projects down
REM ===========================================================================
echo ===========================================================================
echo Downloading Configured Projects
echo ===========================================================================

FOR %%i in (%project_list%) do (
aws s3 cp %s3_project_location%/%%i %local_project_location%\%%i --recursive
)

pause
</file>
  
===== Copy files from projects to s3 =====

<file bash sync_to_s3.bat>
@ECHO OFF
SET aws_s3_bucket=drh-video1
SET drive_location=D:\projects

REM ===========================================================================
REM Set variables used in script
REM ===========================================================================

SET local_project_location=%drive_location%
SET s3_project_location=s3://%aws_s3_bucket%/projects

echo Local project location: %local_project_location%
echo    S3 project location: %s3_project_location%

REM ===========================================================================
REM Loop directories in projects directory and sync to s3
REM ===========================================================================
REM Delayed expansion is needed so !FullDirName! and !CurrDirName! expand
REM inside the loop body
SETLOCAL ENABLEDELAYEDEXPANSION
FOR /D %%D in (%local_project_location%\*) do (
SET FullDirName=%%D
SET CurrDirName=%%~nxD
aws s3 sync !FullDirName! %s3_project_location%/!CurrDirName!
)

pause
</file>
  • Last modified: 2021/11/22 09:55
  • by david