Paper Title
PYSKL: Towards Good Practices for Skeleton Action Recognition
Paper Authors
Paper Abstract
We present PYSKL: an open-source toolbox for skeleton-based action recognition based on PyTorch. The toolbox supports a wide variety of skeleton action recognition algorithms, including approaches based on GCN and CNN. In contrast to existing open-source skeleton action recognition projects that include only one or two algorithms, PYSKL implements six different algorithms under a unified framework with both the latest and original good practices to ease the comparison of efficacy and efficiency. We also provide an original GCN-based skeleton action recognition model named ST-GCN++, which achieves competitive recognition performance without any complicated attention schemes, serving as a strong baseline. Meanwhile, PYSKL supports the training and testing of nine skeleton-based action recognition benchmarks and achieves state-of-the-art recognition performance on eight of them. To facilitate future research on skeleton action recognition, we also provide a large number of trained models and detailed benchmark results to give some insights. PYSKL is released at https://github.com/kennymckormick/pyskl and is actively maintained. We will update this report when we add new features or benchmarks. The current version corresponds to PYSKL v0.2.
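To make the GCN-based family of methods mentioned above concrete, here is a minimal NumPy sketch of a spatial graph convolution over skeleton joints, the core operation in models such as ST-GCN and ST-GCN++. The function name, shapes, and normalization scheme are illustrative assumptions for exposition, not PYSKL's actual implementation.

```python
import numpy as np

def spatial_gcn(x, adj, weight):
    """Illustrative spatial graph convolution (not PYSKL's code).

    x:      (T, V, C_in) joint features over T frames and V joints
    adj:    (V, V) skeleton adjacency matrix with self-loops
    weight: (C_in, C_out) channel projection
    """
    # Symmetrically normalize the adjacency: D^{-1/2} A D^{-1/2}
    deg = adj.sum(axis=1)
    d_inv_sqrt = np.diag(1.0 / np.sqrt(deg))
    a_norm = d_inv_sqrt @ adj @ d_inv_sqrt
    # Aggregate each joint's neighbors per frame, then project channels.
    return np.einsum("uv,tvc->tuc", a_norm, x) @ weight

# Toy example: a 3-joint chain (e.g. shoulder-elbow-wrist) over 4 frames.
A = np.array([[1, 1, 0],
              [1, 1, 1],
              [0, 1, 1]], dtype=float)
x = np.random.randn(4, 3, 2)   # T=4 frames, V=3 joints, C_in=2
W = np.random.randn(2, 8)      # project to C_out=8 channels
out = spatial_gcn(x, A, W)
print(out.shape)  # (4, 3, 8)
```

In full models this spatial step is interleaved with temporal convolutions along the frame axis; ST-GCN++ (per the abstract) refines such building blocks rather than adding attention mechanisms.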