Remove gdbarch_bits_big_endian
From what I can tell, set_gdbarch_bits_big_endian has never been used.  That is, all architectures since its introduction have simply used the default, which is to check the architecture's byte-endianness.  Because this setting interferes with the scalar_storage_order code, this patch removes it entirely.  In some places, type_byte_order is now used rather than the plain gdbarch setting.

gdb/ChangeLog
2019-12-04  Tom Tromey  <tromey@adacore.com>

	* ada-lang.c (decode_constrained_packed_array)
	(ada_value_assign, value_assign_to_component): Update.
	* dwarf2loc.c (rw_pieced_value, access_memory)
	(dwarf2_compile_expr_to_ax): Update.
	* dwarf2read.c (dwarf2_add_field): Update.
	* eval.c (evaluate_subexp_standard): Update.
	* gdbarch.c, gdbarch.h: Rebuild.
	* gdbarch.sh (bits_big_endian): Remove.
	* gdbtypes.h (union field_location): Update comment.
	* target-descriptions.c (make_gdb_type): Update.
	* valarith.c (value_bit_index): Update.
	* value.c (struct value) <bitpos>: Update comment.
	(unpack_bits_as_long, modify_field): Update.
	* value.h (value_bitpos): Update comment.

Change-Id: I379b5e0c408ec8742f7a6c6b721108e73ed1b018
@@ -146,12 +146,6 @@ extern const struct target_desc * gdbarch_target_desc (struct gdbarch *gdbarch);
 
 /* The following are initialized by the target dependent code. */
 
-/* The bit byte-order has to do just with numbering of bits in debugging symbols
-   and such. Conceptually, it's quite separate from byte/word byte order. */
-
-extern int gdbarch_bits_big_endian (struct gdbarch *gdbarch);
-extern void set_gdbarch_bits_big_endian (struct gdbarch *gdbarch, int bits_big_endian);
-
 /* Number of bits in a short or unsigned short for the target machine. */
 
 extern int gdbarch_short_bit (struct gdbarch *gdbarch);