# Monoport

## How to Run the SIGGRAPH RTL Demo

#### 0. Set up the repo
```
python setup.py develop
```

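If the install succeeds, the package should be importable. A quick sanity check, assuming the package installs under the name `monoport` (an assumption; adjust if the repo uses a different name):
```
# Hypothetical sanity check: confirm the editable install is importable.
# The package name `monoport` is an assumption, not confirmed by this README.
import monoport

print(monoport.__file__)  # should point into this repo's source tree
```
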
#### 1. Start the main process as a server.
```
# if you want to use the input from a webcam:
python RTL/main.py --use_server --ip <YOUR_IP_ADDRESS> --port 5555 --camera -- netG.ckpt_path ./data/PIFu/net_G netC.ckpt_path ./data/PIFu/net_C

# or if you want to use the input from an image folder:
python RTL/main.py --use_server --ip <YOUR_IP_ADDRESS> --port 5555 --image_folder <IMAGE_FOLDER> -- netG.ckpt_path ./data/PIFu/net_G netC.ckpt_path ./data/PIFu/net_C

# or if you want to use the input from a video:
python RTL/main.py --use_server --ip <YOUR_IP_ADDRESS> --port 5555 --videos <VIDEO_PATH> -- netG.ckpt_path ./data/PIFu/net_G netC.ckpt_path ./data/PIFu/net_C
```
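
The bare `--` separates the command-line flags from config overrides; the trailing `netG.ckpt_path ...` / `netC.ckpt_path ...` pairs look like yacs-style key/value overrides. A minimal sketch of how such pairs are typically merged, assuming yacs (the config schema below is hypothetical):
```
# Sketch of yacs-style override merging (assumed mechanism, hypothetical schema).
from yacs.config import CfgNode as CN

cfg = CN()
cfg.netG = CN()
cfg.netG.ckpt_path = ''  # default checkpoint path for the geometry network
cfg.netC = CN()
cfg.netC.ckpt_path = ''  # default checkpoint path for the color network

# The pairs after `--` arrive as a flat list of alternating keys and values:
opts = ['netG.ckpt_path', './data/PIFu/net_G',
        'netC.ckpt_path', './data/PIFu/net_C']
cfg.merge_from_list(opts)  # yacs merges each (key, value) pair in order
print(cfg.netG.ckpt_path)  # -> ./data/PIFu/net_G
```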

If everything goes well, you should see logs like the following after a few seconds:

    loading networkG from ./data/PIFu/net_G ...
    loading networkC from ./data/PIFu/net_C ...
    initialize data streamer ...
    Using cache found in /home/rui/.cache/torch/hub/NVIDIA_DeepLearningExamples_torchhub
    Using cache found in /home/rui/.cache/torch/hub/NVIDIA_DeepLearningExamples_torchhub
     * Serving Flask app "main" (lazy loading)
     * Environment: production
       WARNING: This is a development server. Do not use it in a production deployment.
       Use a production WSGI server instead.
     * Debug mode: on
     * Running on http://<YOUR_IP_ADDRESS>:5555/ (Press CTRL+C to quit)

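Once the `Running on ...` line appears, you can confirm the endpoint is reachable before opening it in a browser. A minimal check in Python (substitute your actual address for the placeholder):
```
# Minimal reachability check for the Flask server started above.
# Replace <YOUR_IP_ADDRESS> with the address printed in the log.
import urllib.request

with urllib.request.urlopen('http://<YOUR_IP_ADDRESS>:5555/', timeout=5) as resp:
    print(resp.status)  # 200 means the demo page is being served
```
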
#### 2. Access the server to start.
Open the page `http://<YOUR_IP_ADDRESS>:5555/` in a web browser on any device (desktop/iPad/iPhone). You should see the **MonoPort VR Demo** page on that device, and at the same time a window should pop up on your desktop showing the reconstructed normal and texture images.

#### 3. Play with the VR demo. (TODO: because of the HTTPS cert, this step is not easy for the public to use)
As a VR prototype, this system also allows users to control the camera in the **MonoPort VR Demo** using the motion sensors of an iPad/iPhone. To achieve that, you need to start another server:
```
python RTL/VRweb/server_webxr.py --port 8000 --cert ruilong
```
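
For context, WebXR only runs in a secure (HTTPS) context, which is why this server needs a certificate. Below is a minimal sketch of an HTTPS static file server in Python; the certificate and key filenames are hypothetical, and this is an illustration rather than the actual `server_webxr.py`:
```
# Minimal HTTPS static-server sketch (hypothetical cert/key filenames).
# Illustrates why a certificate is needed; not the actual server_webxr.py.
import http.server
import ssl

httpd = http.server.HTTPServer(('0.0.0.0', 8000), http.server.SimpleHTTPRequestHandler)
ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
ctx.load_cert_chain(certfile='ruilong.crt', keyfile='ruilong.key')
httpd.socket = ctx.wrap_socket(httpd.socket, server_side=True)
httpd.serve_forever()  # serves the current directory over HTTPS on port 8000
```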

Then you can access `https://www.liruilong.codes:8000` from the **XRViewer** app and click the **Enter WebXR** button on the page. From that moment, your mobile device becomes a camera in our VR scene, and you can move it around to observe the reconstructed human.