Background
In Python, enum.Flag appears to offer an ergonomic interface for representing an arbitrary combination of a fixed set of boolean flag values. But the following program seems to show that simply using int to represent such flags and manipulating them with bitwise operators, as is traditionally done, is 8-10x faster than performing the equivalent operations with enum.Flag types:
import enum
import random
import time

class FlagsEnum(enum.Flag):
    ONE = enum.auto()
    TWO = enum.auto()
    FOUR = enum.auto()
    EIGHT = enum.auto()
    SIXTEEN = enum.auto()

class FlagsInt:
    ONE = 0x01
    TWO = 0x02
    FOUR = 0x04
    EIGHT = 0x08
    SIXTEEN = 0x10

workload_ints = [random.randrange(31) for _ in range(10_000)]
workload_enums = [FlagsEnum(i) for i in workload_ints]

start = time.perf_counter()
for x in workload_ints:
    bool(x & FlagsInt.FOUR)   # check a bit
    x |= FlagsInt.SIXTEEN     # set a bit
    x &= ~FlagsInt.EIGHT      # clear a bit
    x ^= FlagsInt.TWO         # toggle a bit
print("ints took ", "%.4f" % (time.perf_counter() - start))

start = time.perf_counter()
for x in workload_enums:
    FlagsEnum.FOUR in x       # check a bit
    x |= FlagsEnum.SIXTEEN    # set a bit
    x &= ~FlagsEnum.EIGHT     # clear a bit
    x ^= FlagsEnum.TWO        # toggle a bit
print("FlagsEnums took ", "%.4f" % (time.perf_counter() - start))
Results:
ints took 0.0044
FlagsEnums took 0.0335
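The gap can be seen even on a single operation: `|` on plain ints is one C-level instruction in the interpreter loop, while `|` on Flag members dispatches to a Python-level __or__ that must look up or construct a Flag instance for the combined value. A minimal isolation of just that one operation (a hedged sketch using timeit; exact ratios vary by Python version):

```python
import enum
import timeit

class F(enum.Flag):
    A = enum.auto()
    B = enum.auto()

# Plain ints: the bitwise OR is handled directly in C.
t_int = timeit.timeit("a | b", setup="a, b = 1, 2", number=100_000)

# Flag members: `|` goes through a Python-level __or__ and produces
# a composed Flag object rather than a bare int.
t_flag = timeit.timeit("F.A | F.B", globals=globals(), number=100_000)

print(f"int op:  {t_int:.4f}s")
print(f"Flag op: {t_flag:.4f}s")
```

On any CPython version I am aware of, the Flag timing comes out well above the int timing, which matches the ratio the workload benchmark reports.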
Question
Why is the performance of enum.Flag so poor compared to int, and what is the point of using enum.Flag at all if it is so much slower than int? Surely only an extreme minority of programs are willing to take an 80-90% performance hit just to enjoy the (fairly minor) ergonomic and type-safety benefits that enum.Flag offers over int.
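For what it's worth, the two approaches are not mutually exclusive: a pattern sometimes suggested (a hedged sketch, not from the question itself) is to keep enum.Flag at API boundaries for its type safety and readability, but drop down to the underlying .value for hot loops, so the per-operation cost is paid only once per call rather than once per bit operation:

```python
import enum

class FlagsEnum(enum.Flag):
    ONE = enum.auto()
    TWO = enum.auto()
    FOUR = enum.auto()
    EIGHT = enum.auto()
    SIXTEEN = enum.auto()

def process(flags: FlagsEnum) -> FlagsEnum:
    # Drop to the plain int once, at the boundary...
    x = flags.value
    four = FlagsEnum.FOUR.value
    sixteen = FlagsEnum.SIXTEEN.value
    # ...do the bit twiddling on ints inside the hot path...
    if x & four:
        x |= sixteen
    # ...and convert back to a Flag only on the way out.
    return FlagsEnum(x)

print(process(FlagsEnum.ONE | FlagsEnum.FOUR))
```

Callers still get the ergonomic, type-safe interface, while the interior of the function runs at plain-int speed.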