/*
 * Copyright (C) 2017 The Android Open Source Project
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *      http://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

#ifndef ART_RUNTIME_SUBTYPE_CHECK_H_
#define ART_RUNTIME_SUBTYPE_CHECK_H_

#include "subtype_check_bits_and_status.h"
#include "subtype_check_info.h"

#include "base/locks.h"
#include "mirror/class.h"
#include "runtime.h"

// Build flag for the bitstring subtype check runtime hooks.
constexpr bool kBitstringSubtypeCheckEnabled = false;

/**
 * Any node in a tree can have its path (from the root to the node) represented as a string by
 * concatenating the path of the parent to that of the current node.
 *
 * We can annotate each node with a `sibling-label`, which is some value unique amongst all of
 * the node's siblings. As a special case, the root's label is empty.
 *
 *             (none)
 *           /   |   \
 *          A    B    C
 *        /   \
 *       A’    B’
 *             |
 *             A’’
 *             |
 *             A’’’
 *             |
 *             A’’’’
 *
 * Given these sibling-labels, we can now encode the path from any node to the root by starting
 * at the node and going up to the root, marking each node with this `path-label`. The special
 * character $ means "end of path".
 *
 *                  $
 *              /   |   \
 *            A$    B$    C$
 *          /    \
 *       A’A$    B’A$
 *                 |
 *             A’’B’A$
 *                 |
 *          A’’’A’’B’A$
 *                 |
 *        A’’’’A’’’A’’B’A$
 *
 * Given the above `path-label`, we can express whether any two nodes are an offspring of the
 * other through an O(1) expression:
 *
 *    x <: y :=
 *      suffix(x, y) == y
 *
 * In the above example suffix(x,y) means the suffix of x that is as long as y (right-padded
 * with $s if x is shorter than y):
 *
 *    suffix(x,y) := x[x.length - y.length .. x.length)
 *                     + repeat($, max(y.length - x.length, 0))
 *
 * A few generalities here to elaborate:
 *
 * - There can be at most D levels in the tree.
 * - Each level L has an alphabet A, and |A| determines the maximum number of siblings at that
 *   level.
 * - The alphabet A can be a subset, superset, equal to, or disjoint from the other levels'
 *   alphabets without loss of generality. (In practice it would almost always be a subset of
 *   the previous level’s alphabet, as we assume most classes have fewer children the deeper
 *   they are.)
 * - The `sibling-label` doesn’t need to be stored as an explicit value. It can be a temporary
 *   when visiting every immediate child of a node. Only the `path-label` actually needs to be
 *   stored for every node.
 *
 * The path can also be reversed, using a prefix instead of a suffix to define the subchild
 * relation:
 *
 *                  $
 *              /   |   \   \
 *            A$    B$   C$  D$
 *          /    \
 *       AA’$    AB’$
 *                 |
 *             AB’A’’$
 *                 |
 *           AB’A’’A’’’$
 *                 |
 *         AB’A’’A’’’A’’’’$
 *
 *    x <: y :=
 *      prefix(x, y) == y
 *
 *    prefix(x,y) := x[0 .. y.length)
 *                     + repeat($, max(y.length - x.length, 0))
 *
 * In a dynamic tree, new nodes can be inserted at any time. This means that if a minimal
 * alphabet is selected to contain the initial tree hierarchy, later node insertions will be
 * illegal because there is no more room to encode the path.
 *
 * In this simple example with an alphabet {A,B,C} and a max level of 1:
 *
 *   Level
 *   0:           $
 *            /  |  \   \
 *   1:      A$  B$  C$  D$   (illegal)
 *           |
 *   2:     AA$               (illegal)
 *
 * Attempting to insert the sibling “D” at Level 1 would be illegal because Alphabet(1) is
 * {A,B,C}, and inserting an extra node would mean the `sibling-label` is no longer unique.
 * Attempting to insert “AA$” is illegal because its level (2) exceeds the max level (1).
 *
 * One solution to this would be to revisit the entire graph, select a larger alphabet so that
 * every `sibling-label` is unique, pick a larger max level count, and then store the updated
 * `path-label` accordingly.
 *
 * The more common approach would instead be to select a set of alphabets and max levels
 * statically, with large enough sizes, for example:
 *
 *    Alphabets = {{A,B,C,D}, {A,B,C}, {A,B}, {A}}
 *    Max Levels = |Alphabets|
 *
 * This would allow up to 4 levels, with each successive level allowing one fewer maximum
 * sibling.
 *
 * Attempting to insert a new node into the graph which does not fit into that level’s alphabet
 * would be represented by re-using the `path-label` of the parent. Such a `path-label` would be
 * considered truncated (because it would only have a prefix of the full path from the root to
 * the node).
 *
 *   Level
 *   0:           $
 *            /  |  \   \
 *   1:      A$  B$  C$  $    (same as parent)
 *           |
 *   2:      A$               (same as parent)
 *
 * The updated relation for offspring is then:
 *
 *    x <: y :=
 *      if !truncated_path(y):
 *        return prefix(x, y) == y              // O(1)
 *      else:
 *        return slow_check_is_offspring(x, y)  // worse than O(1)
 *
 * (Example definition of truncated_path -- any semantically equivalent way to check that the
 * sibling's `sibling-label` is not unique will do.)
 *
 *    truncated_path(y) :=
 *      return y == parent(y)
 *
 * (Example definition.
 Any slower-than-O(1) definition will do here. This is the traversing-the-superclass-hierarchy
 * solution.)
 *
 *    slow_check_is_offspring(x, y) :=
 *      if not x: return false
 *      else: return x == y || slow_check_is_offspring(parent(x), y)
 *
 * In which case slow_check_is_offspring is some non-O(1) way to check whether x is an offspring
 * of y.
 *
 * In addition, note that it doesn’t matter if the "x" from above is a unique sibling or not;
 * the relation will still be correct.
 *
 * ------------------------------------------------------------------------------------------------
 *
 * Leveraging truncated paths to minimize path lengths.
 *
 * As observed above, for any x <: y, it is sufficient to have a full path only for y, and x can
 * be truncated (to its nearest ancestor's full path).
 *
 * We call a node that stores a full path "Assigned", and a node that stores a truncated path
 * either "Initialized" or "Overflowed".
 *
 * "Initialized" means it is still possible to assign a full path to the node, and "Overflowed"
 * means there are not enough characters left in the alphabet to do so.
 *
 * In this example, assume that we attempt to "Assign" all non-leaf nodes if possible. Leaf
 * nodes always get truncated (as either Initialized or Overflowed).
 *
 *    Alphabets = {{A,B,C,D}, {A,B}}
 *    Max Levels = |Alphabets|
 *
 *   Level
 *   0:           $
 *            /  |  \   \    \
 *   1:      A$  B$  C$  D$   $   (Overflowed: Too wide)
 *           |       |
 *   2:     AA$      C$           (Initialized)
 *           |
 *   3:     AA$                   (Overflowed: Too deep)
 *
 * (All un-annotated nodes are "Assigned".)
 * Above, the node at level 3 becomes overflowed because it exceeds the max levels. The
 * right-most node at level 1 becomes overflowed because there are no characters left in that
 * level's alphabet.
 *
 * The "C$" node is Initialized at level 2, but it can still be promoted to "Assigned" later on
 * if we wanted to.
 *
 * In particular, this is the strategy we use in our implementation
 * (SubtypeCheck::EnsureInitialized, SubtypeCheck::EnsureAssigned).
 *
 * Since the # of characters in our alphabet (BitString) is very limited, we want to avoid
 * allocating a character to a node until it's absolutely necessary.
 *
 * All node targets (in `src <: target`) get Assigned, and any parent of an Initialized
 * node also gets Assigned.
 */
namespace art {

struct MockSubtypeCheck;  // Forward declaration for testing.

// This class uses a template parameter to enable testability without losing performance.
// ClassPtr is almost always `mirror::Class*` or `ObjPtr<mirror::Class>`.
template <typename ClassPtr /* Pointer-like type to Class */>
struct SubtypeCheck {
  // Force this class's SubtypeCheckInfo state into at least Initialized.
  // As a side-effect, all parent classes also become Assigned|Overflowed.
  //
  // Cost: O(Depth(Class))
  //
  // Post-condition: State is >= Initialized.
  // Returns: The precise SubtypeCheckInfo::State.
  static SubtypeCheckInfo::State EnsureInitialized(ClassPtr klass)
      REQUIRES(Locks::subtype_check_lock_)
      REQUIRES_SHARED(Locks::mutator_lock_) {
    return InitializeOrAssign(klass, /*assign=*/false).GetState();
  }

  // Force this class's SubtypeCheckInfo state into Assigned|Overflowed.
  // As a side-effect, all parent classes also become Assigned|Overflowed.
  //
  // Cost: O(Depth(Class))
  //
  // Post-condition: State is Assigned|Overflowed.
  // Returns: The precise SubtypeCheckInfo::State.
  static SubtypeCheckInfo::State EnsureAssigned(ClassPtr klass)
      REQUIRES(Locks::subtype_check_lock_)
      REQUIRES_SHARED(Locks::mutator_lock_) {
    return InitializeOrAssign(klass, /*assign=*/true).GetState();
  }

  // Resets the SubtypeCheckInfo into the Uninitialized state.
  //
  // Intended only for the AOT image writer.
  // This is a static function to avoid calling klass.Depth(), which is unsupported
  // in some portions of the image writer.
  //
  // Cost: O(1).
  //
  // Returns: A state that is always Uninitialized.
  static SubtypeCheckInfo::State ForceUninitialize(ClassPtr klass)
      REQUIRES(Locks::subtype_check_lock_)
      REQUIRES_SHARED(Locks::mutator_lock_) {
    // Trying to do this in a real runtime will break thread safety invariants
    // of existing live objects in the class hierarchy.
    // This is only safe as the last step when the classes are about to be
    // written out as an image and IsSubClass is never used again.
    DCHECK(Runtime::Current() == nullptr || Runtime::Current()->IsAotCompiler())
        << "This only makes sense when compiling an app image.";

    // Directly read/write the class field here.
    // As this method is used by image_writer on a copy,
    // the Class* there is not a real class and using it for anything
    // more complicated (e.g. an ObjPtr or a Depth call) will fail dchecks.

    // OK: zero-initializing subtype_check_info_ puts us into the kUninitialized state.
    SubtypeCheckBits scb_uninitialized = SubtypeCheckBits{};
    WriteSubtypeCheckBits(klass, scb_uninitialized);

    // Do not use the "SubtypeCheckInfo" API here since that requires Depth(),
    // which would cause a dcheck failure.
    return SubtypeCheckInfo::kUninitialized;
  }

  // Retrieve the state of this class's SubtypeCheckInfo.
  //
  // Cost: O(Depth(Class)).
  //
  // Returns: The precise SubtypeCheckInfo::State.
  static SubtypeCheckInfo::State GetState(ClassPtr klass)
      REQUIRES(Locks::subtype_check_lock_)
      REQUIRES_SHARED(Locks::mutator_lock_) {
    return GetSubtypeCheckInfo(klass).GetState();
  }

  // Retrieve the path-to-root bitstring as a plain uintN_t value that can be
  // used in a fast check "encoded_src & mask_target == encoded_target".
  //
  // Cost: O(Depth(Class)).
  //
  // Returns the encoded_src value. Must be >= Initialized (EnsureInitialized).
  static BitString::StorageType GetEncodedPathToRootForSource(ClassPtr klass)
      REQUIRES(Locks::subtype_check_lock_)
      REQUIRES_SHARED(Locks::mutator_lock_) {
    DCHECK_NE(SubtypeCheckInfo::kUninitialized, GetSubtypeCheckInfo(klass).GetState());
    return GetSubtypeCheckInfo(klass).GetEncodedPathToRoot();
  }

  // Retrieve the path-to-root bitstring as a plain uintN_t value that can be
  // used in a fast check "encoded_src & mask_target == encoded_target".
  //
  // Cost: O(Depth(Class)).
  //
  // Returns the encoded_target value. Must be Assigned (EnsureAssigned).
  static BitString::StorageType GetEncodedPathToRootForTarget(ClassPtr klass)
      REQUIRES(Locks::subtype_check_lock_)
      REQUIRES_SHARED(Locks::mutator_lock_) {
    SubtypeCheckInfo sci = GetSubtypeCheckInfo(klass);
    DCHECK_EQ(SubtypeCheckInfo::kAssigned, sci.GetState());
    return sci.GetEncodedPathToRoot();
  }

  // Retrieve the path-to-root bitstring mask as a plain uintN_t value that can be
  // used in a fast check "encoded_src & mask_target == encoded_target".
  //
  // Cost: O(Depth(Class)).
  //
  // Returns the mask_target value. Must be Assigned (EnsureAssigned).
  static BitString::StorageType GetEncodedPathToRootMask(ClassPtr klass)
      REQUIRES(Locks::subtype_check_lock_)
      REQUIRES_SHARED(Locks::mutator_lock_) {
    SubtypeCheckInfo sci = GetSubtypeCheckInfo(klass);
    DCHECK_EQ(SubtypeCheckInfo::kAssigned, sci.GetState());
    return sci.GetEncodedPathToRootMask();
  }

  // Is the source class a subclass of the target?
  //
  // The source state must be at least Initialized, and the target state
  // must be Assigned; otherwise kUnknownSubtypeOf is returned.
  //
  // See EnsureInitialized and EnsureAssigned. Ideally,
  // EnsureInitialized will be called previously on all possible sources,
  // and EnsureAssigned will be called previously on all possible targets.
  //
  // Runtime cost: O(Depth(Class)), but would be O(1) if the depth were known.
  //
  // If the result is known, returns kSubtypeOf or kNotSubtypeOf.
  static SubtypeCheckInfo::Result IsSubtypeOf(ClassPtr source, ClassPtr target)
      REQUIRES_SHARED(Locks::mutator_lock_) {
    SubtypeCheckInfo sci = GetSubtypeCheckInfo(source);
    SubtypeCheckInfo target_sci = GetSubtypeCheckInfo(target);

    return sci.IsSubtypeOf(target_sci);
  }

  // Print the SubtypeCheck bitstring and overflow flag to a stream (e.g. for oatdump).
  static std::ostream& Dump(ClassPtr klass, std::ostream& os)
      REQUIRES_SHARED(Locks::mutator_lock_) {
    return os << GetSubtypeCheckInfo(klass);
  }

  static void WriteStatus(ClassPtr klass, ClassStatus status)
      REQUIRES_SHARED(Locks::mutator_lock_) {
    WriteStatusImpl(klass, status);
  }

 private:
  static ClassPtr GetParentClass(ClassPtr klass)
      REQUIRES_SHARED(Locks::mutator_lock_) {
    DCHECK(klass->HasSuperClass());
    return ClassPtr(klass->GetSuperClass());
  }

  static SubtypeCheckInfo InitializeOrAssign(ClassPtr klass, bool assign)
      REQUIRES(Locks::subtype_check_lock_)
      REQUIRES_SHARED(Locks::mutator_lock_) {
    if (UNLIKELY(!klass->HasSuperClass())) {
      // The Object root always goes directly from Uninitialized -> Assigned.
      const SubtypeCheckInfo root_sci = GetSubtypeCheckInfo(klass);
      if (root_sci.GetState() != SubtypeCheckInfo::kUninitialized) {
        return root_sci;  // No change needed.
      }

      const SubtypeCheckInfo new_root_sci = root_sci.CreateRoot();
      SetSubtypeCheckInfo(klass, new_root_sci);

      // The object root is always in the Uninitialized|Assigned state.
      DCHECK_EQ(SubtypeCheckInfo::kAssigned, GetSubtypeCheckInfo(klass).GetState())
          << "Invalid object root state, must be Assigned";
      return new_root_sci;
    }

    // Force all ancestors to Assigned|Overflowed.
    ClassPtr parent_klass = GetParentClass(klass);
    size_t parent_depth = InitializeOrAssign(parent_klass, /*assign=*/true).GetDepth();
    if (kIsDebugBuild) {
      SubtypeCheckInfo::State parent_state = GetSubtypeCheckInfo(parent_klass).GetState();
      DCHECK(parent_state == SubtypeCheckInfo::kAssigned ||
             parent_state == SubtypeCheckInfo::kOverflowed)
          << "Expected parent Assigned|Overflowed, but was: " << parent_state;
    }

    // Read.
    SubtypeCheckInfo sci = GetSubtypeCheckInfo(klass, parent_depth + 1u);
    SubtypeCheckInfo parent_sci = GetSubtypeCheckInfo(parent_klass, parent_depth);

    // Modify.
    const SubtypeCheckInfo::State sci_state = sci.GetState();
    // Skip doing any work if the state is already up-to-date:
    // - assign == false -> Initialized or higher.
    // - assign == true  -> Assigned or higher.
    if (sci_state == SubtypeCheckInfo::kUninitialized ||
        (sci_state == SubtypeCheckInfo::kInitialized && assign)) {
      // Copy the parent's path into the child.
      //
      // If assign == true, this also appends the Parent.Next value to the end.
      // The Parent.Next value is then incremented to avoid allocating
      // the same value again to another node.
      sci = parent_sci.CreateChild(assign);  // Note: Parent could be mutated.
    } else {
      // Nothing to do, already >= Initialized.
      return sci;
    }

    // Post-condition: EnsureAssigned -> Assigned|Overflowed.
    // Post-condition: EnsureInitialized -> Not Uninitialized.
    DCHECK_NE(sci.GetState(), SubtypeCheckInfo::kUninitialized);

    if (assign) {
      DCHECK_NE(sci.GetState(), SubtypeCheckInfo::kInitialized);
    }

    // Write.
    SetSubtypeCheckInfo(klass, sci);                // self
    SetSubtypeCheckInfo(parent_klass, parent_sci);  // parent

    return sci;
  }

  static SubtypeCheckBitsAndStatus ReadField(ClassPtr klass)
      REQUIRES_SHARED(Locks::mutator_lock_) {
    SubtypeCheckBitsAndStatus current_bits_and_status;

    int32_t int32_data = klass->GetField32Volatile(klass->StatusOffset());
    current_bits_and_status.int32_alias_ = int32_data;

    if (kIsDebugBuild) {
      SubtypeCheckBitsAndStatus tmp;
      memcpy(&tmp, &int32_data, sizeof(tmp));
      DCHECK_EQ(0, memcmp(&tmp, &current_bits_and_status, sizeof(tmp))) << int32_data;
    }
    return current_bits_and_status;
  }

  static void WriteSubtypeCheckBits(ClassPtr klass, const SubtypeCheckBits& new_bits)
      REQUIRES(Locks::subtype_check_lock_)
      REQUIRES_SHARED(Locks::mutator_lock_) {
    // Use a "CAS" to write the SubtypeCheckBits in the class.
    // Although we have exclusive access to the bitstrings, because
    // ClassStatus and SubtypeCheckBits share the same word, another thread
    // could still potentially overwrite that word.

    SubtypeCheckBitsAndStatus new_value;
    ClassStatus old_status;
    SubtypeCheckBitsAndStatus full_old;
    while (true) {
      // TODO: Atomic compare-and-swap does not update the 'expected' parameter,
      // so we have to read it as a separate step instead.
      SubtypeCheckBitsAndStatus old_value = ReadField(klass);

      {
        SubtypeCheckBits old_bits = old_value.subtype_check_info_;
        if (memcmp(&old_bits, &new_bits, sizeof(old_bits)) == 0) {
          // Avoid dirtying memory when the data hasn't changed.
          return;
        }
      }

      full_old = old_value;
      old_status = old_value.status_;

      new_value = old_value;
      new_value.subtype_check_info_ = new_bits;

      if (kIsDebugBuild) {
        int32_t int32_data = 0;
        memcpy(&int32_data, &new_value, sizeof(int32_t));
        DCHECK_EQ(int32_data, new_value.int32_alias_) << int32_data;

        DCHECK_EQ(old_status, new_value.status_)
            << "full new: " << bit_cast<uint32_t>(new_value)
            << ", full old: " << bit_cast<uint32_t>(full_old);
      }

      if (CasFieldWeakSequentiallyConsistent32(klass,
                                               klass->StatusOffset(),
                                               old_value.int32_alias_,
                                               new_value.int32_alias_)) {
        break;
      }
    }
  }

  static void WriteStatusImpl(ClassPtr klass, ClassStatus status)
      REQUIRES_SHARED(Locks::mutator_lock_) {
    // Despite not having a lock annotation, this is done with mutual exclusion.
    // See Class::SetStatus for more details.
    SubtypeCheckBitsAndStatus new_value;
    ClassStatus old_status;
    while (true) {
      // TODO: Atomic compare-and-swap does not update the 'expected' parameter,
      // so we have to read it as a separate step instead.
      SubtypeCheckBitsAndStatus old_value = ReadField(klass);
      old_status = old_value.status_;

      if (memcmp(&old_status, &status, sizeof(status)) == 0) {
        // Avoid dirtying memory when the data hasn't changed.
        return;
      }

      new_value = old_value;
      new_value.status_ = status;

      if (CasFieldWeakSequentiallyConsistent32(klass,
                                               klass->StatusOffset(),
                                               old_value.int32_alias_,
                                               new_value.int32_alias_)) {
        break;
      }
    }
  }

  static bool CasFieldWeakSequentiallyConsistent32(ClassPtr klass,
                                                   MemberOffset offset,
                                                   int32_t old_value,
                                                   int32_t new_value)
      REQUIRES_SHARED(Locks::mutator_lock_) {
    if (Runtime::Current() != nullptr && Runtime::Current()->IsActiveTransaction()) {
      return klass->template CasField32</*kTransactionActive=*/true>(offset,
                                                                     old_value,
                                                                     new_value,
                                                                     CASMode::kWeak,
                                                                     std::memory_order_seq_cst);
    } else {
      return klass->template CasField32</*kTransactionActive=*/false>(offset,
                                                                      old_value,
                                                                      new_value,
                                                                      CASMode::kWeak,
                                                                      std::memory_order_seq_cst);
    }
  }

  // Get the SubtypeCheckInfo for a klass. O(Depth(Class)) since
  // it also requires calling klass->Depth().
  //
  // Anything calling this function will also be O(Depth(Class)).
  static SubtypeCheckInfo GetSubtypeCheckInfo(ClassPtr klass)
      REQUIRES_SHARED(Locks::mutator_lock_) {
    return GetSubtypeCheckInfo(klass, klass->Depth());
  }

  // Get the SubtypeCheckInfo for a klass with a known depth.
  static SubtypeCheckInfo GetSubtypeCheckInfo(ClassPtr klass, size_t depth)
      REQUIRES_SHARED(Locks::mutator_lock_) {
    DCHECK_EQ(depth, klass->Depth());
    SubtypeCheckBitsAndStatus current_bits_and_status = ReadField(klass);

    const SubtypeCheckInfo current =
        SubtypeCheckInfo::Create(current_bits_and_status.subtype_check_info_, depth);
    return current;
  }

  static void SetSubtypeCheckInfo(ClassPtr klass, const SubtypeCheckInfo& new_sci)
      REQUIRES(Locks::subtype_check_lock_)
      REQUIRES_SHARED(Locks::mutator_lock_) {
    SubtypeCheckBits new_bits = new_sci.GetSubtypeCheckBits();
    WriteSubtypeCheckBits(klass, new_bits);
  }

  // Tests can inherit this class. Normal code should use the static methods.
  SubtypeCheck() = default;
  SubtypeCheck(const SubtypeCheck& other) = default;
  SubtypeCheck(SubtypeCheck&& other) = default;
  ~SubtypeCheck() = default;

  friend struct MockSubtypeCheck;
};

}  // namespace art

#endif  // ART_RUNTIME_SUBTYPE_CHECK_H_