<a href="#License">License</a>
</p>

# HyperPose
HyperPose is a library for building high-performance custom pose estimation systems.
## Features
HyperPose has two key features:
- **High-performance pose estimation with CPUs/GPUs**: HyperPose achieves real-time pose estimation through a high-performance pose estimation engine. This engine implements numerous system optimizations: pipeline parallelism, model inference with TensorRT, CPU/GPU hybrid scheduling, and many others. These optimizations allow HyperPose to run 4x FASTER than OpenPose and 10x FASTER than TF-Pose.
- **Flexibility for developing custom pose estimation models**: HyperPose provides flexible Python APIs for customising the pipeline used to develop various pose estimation models. HyperPose users can:
  * Make use of uniform pipelines for training, evaluation, visualization, pre-processing and post-processing across various models (e.g., OpenPose, PifPaf, PoseProposal Network);
  * Customise models and datasets for their own use (e.g., user-defined models, user-defined datasets, multiple-dataset combinations);
  * Train in parallel across multiple GPUs (using the *KungFu* adaptive distributed training library),
  thus building models specific to their real-world scenarios.
## Quick Start
The HyperPose library contains two parts:
* A C++ library for high-performance pose estimation model inference.
* A Python library for developing custom pose estimation models (e.g., OpenPose, PifPaf, PoseProposal).
### C++ inference library
The best way to try the inference library is using the official [Docker image](https://hub.docker.com/r/tensorlayer/hyperpose). To run this image, make sure you have [Docker](https://docs.docker.com/get-docker/) with [nvidia-docker](https://github.com/NVIDIA/nvidia-docker) functionality installed.
Once the pre-requisites are ready, you can pull the HyperPose Docker image as follows:
```bash
docker pull tensorlayer/hyperpose
```
> Note that your NVIDIA driver should be [compatible](https://docs.nvidia.com/deploy/cuda-compatibility/index.html#support-title) with CUDA 10.2.
We provide 4 examples to run with this image (the following commands have been tested on Ubuntu 18.04):
```bash
# [Example 1]: Run inference on the given video and copy the output.avi to the local path.
docker run --name quick-start --gpus all tensorlayer/hyperpose --runtime=stream
# Or start an interactive shell inside the image (overrides the default entrypoint):
# docker run --rm --gpus all -it --entrypoint /bin/bash tensorlayer/hyperpose
```
More information about the HyperPose Docker image can be found [here](https://hyperpose.readthedocs.io/en/latest/markdown/quick_start/prediction.html#table-of-flags-for-hyperpose-cli).
### Python training library
To install the Python training library, you can follow the steps [here](https://hyperpose.readthedocs.io/en/latest/markdown/install/training.html).
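
To get a feel for the training workflow before diving into the documentation, the sketch below outlines what a minimal training run might look like. It is only a sketch: the class and function names used here (`Config`, `Model`, `Dataset`, the `set_*` helpers, and `Model.get_train`) are illustrative and should be checked against the API reference in the documentation linked above.

```python
# A minimal sketch of a HyperPose training run (names are illustrative;
# confirm the exact API in the HyperPose documentation before use).
from hyperpose import Config, Model, Dataset

# Pick a model architecture and a dataset from the built-in options.
Config.set_model_name("my_openpose")          # experiment / checkpoint name (assumed)
Config.set_model_type(Config.MODEL.Openpose)  # e.g., an OpenPose-style model
Config.set_dataset_type(Config.DATA.MSCOCO)   # e.g., train on MS COCO

config = Config.get_config()
model = Model.get_model(config)               # build the network
dataset = Dataset.get_dataset(config)         # build the data pipeline

train = Model.get_train(config)               # uniform training pipeline
train(model, dataset)
```

This `Config`/`Model`/`Dataset` pattern is the uniform pipeline mentioned in the Features section; swapping the model or dataset type is how custom models and datasets would be plugged in.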
## Documentation
The APIs of the HyperPose training library and the inference library are both described in the [documentation](https://hyperpose.readthedocs.io/en/latest/).
## Performance
We compare the prediction performance of HyperPose with [OpenPose 1.6](https://github.com/CMU-Perceptual-Computing-Lab/openpose) and [TF-Pose](https://github.com/ildoonet/tf-pose-estimation). We implement the OpenPose algorithms with different configurations in HyperPose. The test-bed runs Ubuntu 18.04 with a 1070Ti GPU and an Intel i7 CPU (12 logical cores).