Paper Title

Scalene: Scripting-Language Aware Profiling for Python

Authors

Berger, Emery D.

Abstract

Existing profilers for scripting languages (a.k.a. "glue" languages) like Python suffer from numerous problems that drastically limit their usefulness. They impose order-of-magnitude overheads, report information at too coarse a granularity, or fail in the face of threads. Worse, past profilers---essentially variants of their counterparts for C---are oblivious to the fact that optimizing code in scripting languages requires information about code spanning the divide between the scripting language and libraries written in compiled languages. This paper introduces scripting-language aware profiling, and presents Scalene, an implementation of scripting-language aware profiling for Python. Scalene employs a combination of sampling, inference, and disassembly of byte-codes to efficiently and precisely attribute execution time and memory usage to either Python, which developers can optimize, or library code, which they cannot. It includes a novel sampling memory allocator that reports line-level memory consumption and trends with low overhead, helping developers reduce footprints and identify leaks. Finally, it introduces a new metric, copy volume, to help developers root out insidious copying costs across the Python/library boundary, which can drastically degrade performance. Scalene works for single or multi-threaded Python code, is precise, reporting detailed information at the line granularity, while imposing modest overheads (26%--53%).
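To make the copy-volume idea concrete, below is a minimal, hypothetical sketch (not taken from the paper) of the kind of insidious Python/library-boundary copying the abstract describes. The `normalize` function and its data are invented for illustration, with NumPy standing in for any compiled-language library; each round trip between Python objects and a library buffer copies the entire payload across the boundary, which is exactly the traffic a copy-volume metric would surface.

```python
import numpy as np

def normalize(rows):
    # Hypothetical hot loop: every call round-trips the whole payload
    # across the Python/library boundary.
    arr = np.array(rows)    # copy: Python objects -> compiled-library buffer
    arr = arr / arr.max()   # cheap: stays inside the compiled library
    return arr.tolist()     # copy: compiled-library buffer -> Python objects

if __name__ == "__main__":
    data = [list(range(1_000)) for _ in range(1_000)]
    for _ in range(50):
        data = normalize(data)  # ~50 full copies in each direction
```

Keeping the data as a NumPy array across iterations (converting once on entry and once on exit) would eliminate nearly all of this copy volume. Scalene itself is distributed on PyPI (`pip install scalene`) and is typically invoked on a program as `scalene program.py`, which reports line-level time, memory, and copy-volume information.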
