
Commit 718c199

Merge branch 'master' into localparttype
2 parents dd193a6 + 3b39b1b

297 files changed: 21577 additions & 2719 deletions


.github/workflows/test.yml
Lines changed: 5 additions & 0 deletions

@@ -112,6 +112,11 @@ jobs:
           toxenv: py
           tox_extra_args: "-n 4 --mypy-num-workers=4 mypy/test/testcheck.py"
           test_mypyc: true
+        - name: Parallel tests with py314-windows-64, interpreted
+          python: '3.14'
+          os: windows-latest
+          toxenv: py
+          tox_extra_args: "-n 2 --mypy-num-workers=2 mypy/test/testcheck.py -k 'incremental or modules or classes'"

         - name: Type check our own code (py310-ubuntu)
           python: '3.10'

docs/source/command_line.rst
Lines changed: 22 additions & 3 deletions

@@ -598,8 +598,8 @@ of the above sections.
 .. option:: --allow-redefinition-new

     By default, mypy won't allow a variable to be redefined with an
-    unrelated type. This *experimental* flag enables the redefinition of
-    unannotated variables with an arbitrary type. You will also need to enable
+    unrelated type. This flag enables the redefinition of unannotated
+    variables with an arbitrary type. You will also need to enable
     :option:`--local-partial-types <mypy --local-partial-types>`.
     Example:

@@ -631,12 +631,31 @@ of the above sections.
         # Type of "x" is "str" here.
         ...

+    Function arguments are special, changing their type within function body
+    is allowed even if they are annotated, but that annotation is used to infer
+    types of r.h.s. of all subsequent assignments. Such middle-ground semantics
+    provides good balance for majority of common use cases. For example:
+
+    .. code-block:: python
+
+        def process(values: list[float]) -> None:
+            if not values:
+                values = [0, 0, 0]
+            reveal_type(values)  # Revealed type is list[float]
+
     Note: We are planning to turn this flag on by default in a future mypy
     release, along with :option:`--local-partial-types <mypy --local-partial-types>`.
     The feature is still experimental, and the semantics may still change.

 .. option:: --allow-redefinition

+    This is an alias to :option:`--allow-redefinition-old <mypy --allow-redefinition-old>`.
+    In mypy v2.0 this will point to
+    :option:`--allow-redefinition-new <mypy --allow-redefinition-new>`, and will
+    eventually became the default.
+
+.. option:: --allow-redefinition-old
+
     This is an older variant of
     :option:`--allow-redefinition-new <mypy --allow-redefinition-new>`.
     This flag enables redefinition of a variable with an

@@ -689,7 +708,7 @@ of the above sections.
         reveal_type(Foo().bar)  # 'int | None' without --local-partial-types

     Note: this option is always implicitly enabled in mypy daemon and will become
-    enabled by default for mypy in a future release.
+    enabled by default in mypy v2.0 release.

 .. option:: --no-implicit-reexport
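The semantics documented above can be seen in a tiny self-contained snippet (a hypothetical example, not from the diff; the comments describe the types mypy would infer with `--allow-redefinition-new` and `--local-partial-types` enabled):

```python
def describe(value: int) -> str:
    # "text" is unannotated, so under --allow-redefinition-new mypy lets
    # it be redefined with an unrelated type (int here, str below).
    text = value * 2         # inferred as int
    text = f"result={text}"  # redefined as str
    return text

print(describe(21))  # result=42
```

At runtime this is ordinary Python either way; the flag only changes whether mypy accepts the second assignment to `text`.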

docs/source/common_issues.rst
Lines changed: 8 additions & 0 deletions

@@ -148,6 +148,14 @@ error:
 The second line is now fine, since the ignore comment causes the name
 ``frobnicate`` to get an implicit ``Any`` type.

+The type ignore comment must be at the start of the comments on a line.
+This type ignore will not take effect:
+
+.. code-block:: python
+
+    import frobnicate #example other comment # type: ignore
+    frobnicate.start()
+
 .. note::

     You can use the form ``# type: ignore[<code>]`` to only ignore
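The placement rule added above (the ignore must lead the comment text) can be modeled mechanically. This is a simplified sketch of the check, not mypy's actual parser:

```python
def ignore_takes_effect(comment: str) -> bool:
    # Simplified model of the documented rule: mypy only honors
    # "# type: ignore" when it starts the comment text on the line.
    stripped = comment.lstrip("# ").strip()
    return stripped.startswith("type: ignore")

print(ignore_takes_effect("# type: ignore"))                         # True
print(ignore_takes_effect("#example other comment # type: ignore"))  # False
```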

docs/source/config_file.rst
Lines changed: 24 additions & 4 deletions

@@ -719,8 +719,8 @@ section of the command line docs.
     :default: False

     By default, mypy won't allow a variable to be redefined with an
-    unrelated type. This *experimental* flag enables the redefinition of
-    unannotated variables with an arbitrary type. You will also need to enable
+    unrelated type. This flag enables the redefinition of unannotated
+    variables with an arbitrary type. You will also need to enable
     :confval:`local_partial_types`.
     Example:

@@ -748,10 +748,22 @@ section of the command line docs.
         # Type of "x" is "str" here.
         ...

+    Function arguments are special, changing their type within function body
+    is allowed even if they are annotated, but that annotation is used to infer
+    types of r.h.s. of all subsequent assignments. Such middle-ground semantics
+    provides good balance for majority of common use cases. For example:
+
+    .. code-block:: python
+
+        def process(values: list[float]) -> None:
+            if not values:
+                values = [0, 0, 0]
+            reveal_type(values)  # Revealed type is list[float]
+
     Note: We are planning to turn this flag on by default in a future mypy
     release, along with :confval:`local_partial_types`.

-.. confval:: allow_redefinition
+.. confval:: allow_redefinition_old

     :type: boolean
     :default: False

@@ -777,14 +789,22 @@ section of the command line docs.
         items = "100"  # valid, items now has type str
         items = int(items)  # valid, items now has type int

+.. confval:: allow_redefinition
+
+    :type: boolean
+    :default: False
+
+    An alias to :confval:`allow_redefinition_old`, in mypy v2.0 this will point to
+    :confval:`allow_redefinition_new`, and will eventually became the default.
+
 .. confval:: local_partial_types

     :type: boolean
     :default: False

     Disallows inferring variable type for ``None`` from two assignments in different scopes.
     This is always implicitly enabled when using the :ref:`mypy daemon <mypy_daemon>`.
-    This will be enabled by default in a future mypy release.
+    This will be enabled by default in mypy v2.0 release.

 .. confval:: disable_error_code
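The "function arguments are special" behavior added to the docs above can be illustrated with a runnable variant of the documented `process` example (the comment reflects the inference the docs describe; at runtime the function behaves the same with or without the flag):

```python
def process(values: list[float]) -> float:
    # With --allow-redefinition-new, the parameter's annotation is used to
    # infer the type of the r.h.s. below, so "values" stays list[float]
    # even though the literal would otherwise be list[int].
    if not values:
        values = [0, 0, 0]
    return sum(values)

print(process([]))          # 0
print(process([1.5, 2.5]))  # 4.0
```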

docs/source/stubs.rst
Lines changed: 2 additions & 2 deletions

@@ -126,7 +126,7 @@ For example:
         """Some docstring"""
         pass

-    # Error: Incompatible default for argument "foo" (default has
-    # type "ellipsis", argument has type "list[str]")
+    # Error: Incompatible default for parameter "foo" (default has
+    # type "ellipsis", parameter has type "list[str]")
     def not_ok(self, foo: list[str] = ...) -> None:
         print(foo)

misc/apply-cache-diff.py
Lines changed: 19 additions & 4 deletions

@@ -3,6 +3,8 @@

 With some infrastructure, this can allow for distributing small cache diffs to users in
 many cases instead of full cache artifacts.
+
+Use diff-cache.py to generate a cache diff.
 """

 from __future__ import annotations

@@ -13,6 +15,10 @@

 sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))

+from librt import base64
+from librt.internal import ReadBuffer
+
+from mypy.cache import CacheMeta
 from mypy.metastore import FilesystemMetadataStore, MetadataStore, SqliteMetadataStore
 from mypy.util import json_dumps, json_loads

@@ -35,10 +41,19 @@ def apply_diff(cache_dir: str, diff_file: str, sqlite: bool = False) -> None:
         if data is None:
             cache.remove(file)
         else:
-            cache.write(file, data)
-            if file.endswith(".meta.json") and "@deps" not in file:
-                meta = json_loads(data)
-                old_deps["snapshot"][meta["id"]] = meta["hash"]
+            if file.endswith(".ff"):
+                data_bytes = base64.b64decode(data)
+            else:
+                data_bytes = data.encode() if isinstance(data, str) else data
+            cache.write(file, data_bytes)
+            if file.endswith(".meta.ff") and "@deps" not in file:
+                buf = ReadBuffer(data_bytes[2:])
+                meta = CacheMeta.read(buf, data_file="")
+                assert meta is not None
+                old_deps["snapshot"][meta.id] = meta.hash
+            elif file.endswith(".meta.json") and "@deps" not in file:
+                meta_dict = json_loads(data_bytes)
+                old_deps["snapshot"][meta_dict["id"]] = meta_dict["hash"]

     cache.write("@deps.meta.json", json_dumps(old_deps))
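The applier above base64-decodes binary fixed-format (`.ff`) entries before writing them, while JSON entries are written back as UTF-8. A standalone sketch of that decode step, using the stdlib `base64` module in place of `librt`'s implementation:

```python
import base64

def decode_entry(file: str, data: str) -> bytes:
    # Binary fixed-format (.ff) entries travel base64-encoded inside the
    # JSON diff file; everything else is JSON text, written back as UTF-8.
    if file.endswith(".ff"):
        return base64.b64decode(data)
    return data.encode()

payload = base64.b64encode(b"\x01\x02meta").decode()
print(decode_entry("mod.meta.ff", payload))   # b'\x01\x02meta'
print(decode_entry("mod.meta.json", "{}"))    # b'{}'
```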

misc/convert-cache.py
Lines changed: 1 addition & 1 deletion

@@ -50,7 +50,7 @@ def main() -> None:
     input, output = SqliteMetadataStore(input_dir), FilesystemMetadataStore(output_dir)

     for s in input.list_all():
-        if s.endswith(".json"):
+        if s.endswith((".json", ".ff")):
            assert output.write(
                s, input.read(s), input.getmtime(s)
            ), f"Failed to write cache file {s}!"
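The one-line change relies on `str.endswith` accepting a tuple of suffixes, so a single test covers both the legacy JSON cache files and the new fixed-format (`.ff`) ones:

```python
# str.endswith with a tuple matches if ANY listed suffix matches.
names = ["a.meta.json", "b.data.ff", "c.txt"]
kept = [n for n in names if n.endswith((".json", ".ff"))]
print(kept)  # ['a.meta.json', 'b.data.ff']
```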

misc/diff-cache.py
Lines changed: 110 additions & 24 deletions

@@ -15,6 +15,10 @@

 sys.path.insert(0, os.path.dirname(os.path.dirname(os.path.abspath(__file__))))

+from librt import base64
+from librt.internal import ReadBuffer, WriteBuffer
+
+from mypy.cache import CacheMeta
 from mypy.metastore import FilesystemMetadataStore, MetadataStore, SqliteMetadataStore
 from mypy.util import json_dumps, json_loads

@@ -31,40 +35,116 @@ def merge_deps(all: dict[str, set[str]], new: dict[str, set[str]]) -> None:
         all.setdefault(k, set()).update(v)


+def sort_deps(
+    dependencies: list[str], suppressed: list[str], dep_prios: list[int], dep_lines: list[int]
+) -> tuple[list[str], list[str], list[int], list[int]]:
+    """Sort dependencies and suppressed independently, keeping prios/lines aligned."""
+    all_deps = list(zip(dependencies + suppressed, dep_prios, dep_lines))
+    num_deps = len(dependencies)
+    sorted_deps = sorted(all_deps[:num_deps])
+    sorted_supp = sorted(all_deps[num_deps:])
+    if sorted_deps:
+        deps_t, prios1_t, lines1_t = zip(*sorted_deps)
+        deps_out = list(deps_t)
+        prios1 = list(prios1_t)
+        lines1 = list(lines1_t)
+    else:
+        deps_out = []
+        prios1 = []
+        lines1 = []
+    if sorted_supp:
+        supp_t, prios2_t, lines2_t = zip(*sorted_supp)
+        supp_out = list(supp_t)
+        prios2 = list(prios2_t)
+        lines2 = list(lines2_t)
+    else:
+        supp_out = []
+        prios2 = []
+        lines2 = []
+    return deps_out, supp_out, prios1 + prios2, lines1 + lines2
+
+
+def normalize_meta(meta: CacheMeta) -> None:
+    """Normalize a CacheMeta instance to avoid spurious diffs.
+
+    Zero out mtimes and sort dependencies deterministically.
+    """
+    meta.mtime = 0
+    meta.data_mtime = 0
+    meta.dependencies, meta.suppressed, meta.dep_prios, meta.dep_lines = sort_deps(
+        meta.dependencies, meta.suppressed, meta.dep_prios, meta.dep_lines
+    )
+
+
+def serialize_meta_ff(meta: CacheMeta, version_prefix: bytes) -> bytes:
+    """Serialize a CacheMeta instance back to fixed format binary."""
+    buf = WriteBuffer()
+    meta.write(buf)
+    return version_prefix + buf.getvalue()
+
+
+def normalize_json_meta(obj: dict[str, Any]) -> None:
+    """Normalize a JSON meta dict to avoid spurious diffs.
+
+    Zero out mtimes and sort dependencies deterministically.
+    """
+    obj["mtime"] = 0
+    obj["data_mtime"] = 0
+    if "dependencies" in obj:
+        obj["dependencies"], obj["suppressed"], obj["dep_prios"], obj["dep_lines"] = sort_deps(
+            obj["dependencies"], obj["suppressed"], obj["dep_prios"], obj["dep_lines"]
+        )
+
+
 def load(cache: MetadataStore, s: str) -> Any:
+    """Load and normalize a cache entry.
+
+    Returns:
+      - For .meta.ff: normalized binary bytes (with version prefix)
+      - For .data.ff: raw binary bytes
+      - For .meta.json/.data.json/.deps.json: parsed and normalized dict/list
+    """
     data = cache.read(s)
+    if s.endswith(".meta.ff"):
+        version_prefix = data[:2]
+        buf = ReadBuffer(data[2:])
+        meta = CacheMeta.read(buf, data_file="")
+        if meta is None:
+            # Can't deserialize (e.g. different mypy version). Fall back to
+            # raw bytes -- we lose mtime normalization but the diff stays correct.
+            return data
+        normalize_meta(meta)
+        return serialize_meta_ff(meta, version_prefix)
+    if s.endswith(".data.ff"):
+        return data
     obj = json_loads(data)
     if s.endswith(".meta.json"):
-        # For meta files, zero out the mtimes and sort the
-        # dependencies to avoid spurious conflicts
-        obj["mtime"] = 0
-        obj["data_mtime"] = 0
-        if "dependencies" in obj:
-            all_deps = obj["dependencies"] + obj["suppressed"]
-            num_deps = len(obj["dependencies"])
-            thing = list(zip(all_deps, obj["dep_prios"], obj["dep_lines"]))
-
-            def unzip(x: Any) -> Any:
-                return zip(*x) if x else ((), (), ())
-
-            obj["dependencies"], prios1, lines1 = unzip(sorted(thing[:num_deps]))
-            obj["suppressed"], prios2, lines2 = unzip(sorted(thing[num_deps:]))
-            obj["dep_prios"] = prios1 + prios2
-            obj["dep_lines"] = lines1 + lines2
+        normalize_json_meta(obj)
     if s.endswith(".deps.json"):
         # For deps files, sort the deps to avoid spurious mismatches
         for v in obj.values():
             v.sort()
     return obj


+def encode_for_diff(s: str, obj: object) -> str:
+    """Encode a cache entry value for inclusion in the JSON diff.
+
+    Fixed format binary entries are base64-encoded, JSON entries are
+    re-serialized as JSON strings.
+    """
+    if isinstance(obj, bytes):
+        return base64.b64encode(obj).decode()
+    return json_dumps(obj).decode()
+
+
 def main() -> None:
     parser = argparse.ArgumentParser()
     parser.add_argument("--verbose", action="store_true", default=False, help="Increase verbosity")
     parser.add_argument("--sqlite", action="store_true", default=False, help="Use a sqlite cache")
-    parser.add_argument("input_dir1", help="Input directory for the cache")
-    parser.add_argument("input_dir2", help="Input directory for the cache")
-    parser.add_argument("output", help="Output file")
+    parser.add_argument("input_dir1", help="Input directory for the original cache")
+    parser.add_argument("input_dir2", help="Input directory for the target cache")
+    parser.add_argument("output", help="Output file with the diff from original cache")
     args = parser.parse_args()

     cache1 = make_cache(args.input_dir1, args.sqlite)

@@ -73,7 +153,7 @@ def main() -> None:
     type_misses: dict[str, int] = defaultdict(int)
     type_hits: dict[str, int] = defaultdict(int)

-    updates: dict[str, bytes | None] = {}
+    updates: dict[str, str | None] = {}

     deps1: dict[str, set[str]] = {}
     deps2: dict[str, set[str]] = {}

@@ -96,10 +176,12 @@ def main() -> None:
             # so we can produce a much smaller direct diff of them.
             if ".deps." not in s:
                 if obj2 is not None:
-                    updates[s] = json_dumps(obj2)
+                    updates[s] = encode_for_diff(s, obj2)
                 else:
                     updates[s] = None
             elif obj2:
+                # This is a deps file, with json data
+                assert ".deps." in s
                 merge_deps(deps1, obj1)
                 merge_deps(deps2, obj2)
             else:

@@ -109,11 +191,15 @@ def main() -> None:
     cache1_all_set = set(cache1_all)
     for s in cache2.list_all():
         if s not in cache1_all_set:
-            updates[s] = cache2.read(s)
+            raw = cache2.read(s)
+            if s.endswith(".ff"):
+                updates[s] = base64.b64encode(raw).decode()
+            else:
+                updates[s] = raw.decode()

     # Compute what deps have been added and merge them all into the
     # @root deps file.
-    new_deps = {k: deps1.get(k, set()) - deps2.get(k, set()) for k in deps2}
+    new_deps = {k: deps2.get(k, set()) - deps1.get(k, set()) for k in deps2}
     new_deps = {k: v for k, v in new_deps.items() if v}
     try:
         root_deps = load(cache1, "@root.deps.json")

@@ -122,7 +208,7 @@ def main() -> None:
         merge_deps(new_deps, root_deps)

     new_deps_json = {k: list(v) for k, v in new_deps.items() if v}
-    updates["@root.deps.json"] = json_dumps(new_deps_json)
+    updates["@root.deps.json"] = json_dumps(new_deps_json).decode()

     # Drop updates to deps.meta.json for size reasons. The diff
     # applier will manually fix it up.
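The `sort_deps` helper introduced above keeps two parallel metadata lists aligned while sorting their owners. Its core algorithm can be exercised with a small standalone re-implementation (same logic, simplified types, no mypy imports):

```python
def sort_deps(dependencies, suppressed, dep_prios, dep_lines):
    # Sort "dependencies" and "suppressed" independently while keeping the
    # parallel dep_prios/dep_lines entries attached to their dependency.
    triples = list(zip(dependencies + suppressed, dep_prios, dep_lines))
    n = len(dependencies)

    def unzip(part):
        return ([t[0] for t in part], [t[1] for t in part], [t[2] for t in part])

    deps, p1, l1 = unzip(sorted(triples[:n]))
    supp, p2, l2 = unzip(sorted(triples[n:]))
    return deps, supp, p1 + p2, l1 + l2

print(sort_deps(["b", "a"], ["z", "y"], [1, 2, 3, 4], [10, 20, 30, 40]))
# (['a', 'b'], ['y', 'z'], [2, 1, 4, 3], [20, 10, 40, 30])
```

Sorting each half separately (rather than the concatenation) is what keeps the boundary between real and suppressed dependencies stable in the normalized output.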
