cfgrtl.c (fixup_reorder_chain): Do not emit barriers to BB_FOOTER.

	* cfgrtl.c (fixup_reorder_chain): Do not emit barriers to BB_FOOTER.

	* postreload-gcse.c (bb_has_well_behaved_predecessors): Correct test
	for table jump at the end of a basic block using tablejump_p.
	* targhooks.c (default_invalid_within_doloop): Likewise.
	* config/rs6000/rs6000.c (TARGET_INVALID_WITHIN_DOLOOP): Remove
	target hook implementation that is identical to the default hook.
	(rs6000_invalid_within_doloop): Remove.

	* bb-reorder.c (fix_crossing_unconditional_branches): Remove set but
	unused variable from tablejump_p call.

	* rtl.def (JUMP_TABLE_DATA): New RTX_INSN object.
	* rtl.h (RTX_PREV, RTX_NEXT): Adjust for new JUMP_TABLE_DATA.
	(INSN_DELETED_P): Likewise.
	(emit_jump_table_data): New prototype.
	* gengtype.c (adjust_field_rtx_def): Handle JUMP_TABLE_DATA fields
	after 4th as unused.
	* print-rtl.c (print_rtl): Handle JUMP_TABLE_DATA.
	* sched-vis.c (print_insn): Likewise.
	* emit-rtl.c (active_insn_p): Consider JUMP_TABLE_DATA an active
	insn for compatibility with back ends that use next_active_insn to
	identify jump table data.
	(set_insn_deleted): Remove no longer useful JUMP_TABLE_DATA_P check.
	(remove_insn): Likewise.
	(emit_insn): Do not accept JUMP_TABLE_DATA objects in insn chains
	to be emitted.
	(emit_debug_insn, emit_jump_insn, emit_call_insn, emit_label): Likewise.
	(emit_jump_table_data): New function.

	* cfgbuild.c (inside_basic_block_p): A JUMP_INSN is always inside a
	basic block, a JUMP_TABLE_DATA never is.
	(control_flow_insn_p): JUMP_TABLE_DATA is not a control flow insn.
	* cfgrtl.c (duplicate_insn_chain): Split handling of JUMP_TABLE_DATA
	off from code handling real insns.
	* final.c (get_attr_length_1): Simplify for JUMP_INSNs.
	* function.c (instantiate_virtual_regs): Remove JUMP_TABLE_DATA_P
	test, now redundant because JUMP_TABLE_DATA is not an INSN_P insn.
	* gcse.c (insert_insn_end_basic_block): Likewise, JUMP_TABLE_DATA_P
	is not a NONDEBUG_INSN_P.
	* ira-costs.c (scan_one_insn): Likewise.
	* jump.c (mark_all_labels): Likewise.
	(mark_jump_label_1): Likewise.
	* lra-eliminations.c (eliminate_regs_in_insn): Likewise.
	* lra.c (get_insn_freq): Expect all insns reaching here to be in
	a basic block.
	(check_rtl): Remove JUMP_TABLE_DATA_P test, not a NONDEBUG_INSN_P insn.
	* predict.c (expensive_function_p): Use FOR_BB_INSNS.
	* reload1.c (calculate_needs_all_insns): Call set_label_offsets for
	JUMP_TABLE_DATA_P insns.
	(calculate_elim_costs_all_insns): Likewise.
	(set_label_offsets): Recurse on the PATTERN of JUMP_TABLE_DATA insns.
	(elimination_costs_in_insn): Remove redundant JUMP_TABLE_DATA_P test.
	(delete_output_reload): Code style fixups.
	* reorg.c (dbr_schedule): Move JUMP_TABLE_DATA_P up to avoid setting
	insn flags on this non-insn.
	* sched-rgn.c (add_branch_dependences): Treat JUMP_TABLE_DATA insns
	as scheduling barriers, for pre-change compatibility.
	* stmt.c (emit_case_dispatch_table): Emit jump table data not as
	JUMP_INSN objects but instead as JUMP_TABLE_DATA objects.

	* config/alpha/alpha.c (alpha_does_function_need_gp): Remove
	redundant JUMP_TABLE_DATA_P test.
	* config/arm/arm.c (thumb_far_jump_used_p): Likewise.
	* config/frv/frv.c (frv_function_contains_far_jump): Likewise.
	(frv_for_each_packet): Likewise.
	* config/i386/i386.c (min_insn_size): Likewise.
	(ix86_avoid_jump_mispredicts): Likewise.
	* config/m32r/m32r.c (m32r_is_insn): Likewise.
	* config/mep/mep.c (mep_reorg_erepeat): Likewise.
	* config/mips/mips.c (USEFUL_INSN_P): Likewise.
	(mips16_insn_length): Robustify.
	(mips_has_long_branch_p): Remove redundant JUMP_TABLE_DATA_P test.
	(mips16_split_long_branches): Likewise.
	* config/pa/pa.c (pa_combine_instructions): Likewise.
	* config/rs6000/rs6000.c (get_next_active_insn): Treat
	JUMP_TABLE_DATA objects as active insns, like in active_insn_p.
	* config/s390/s390.c (s390_chunkify_start): Treat JUMP_TABLE_DATA
	as contributing to pool range lengths.
	* config/sh/sh.c (find_barrier): Restore check for ADDR_DIFF_VEC.
	Remove redundant JUMP_TABLE_DATA_P test.
	(sh_loop_align): Likewise.
	(split_branches): Likewise.
	(sh_insn_length_adjustment): Likewise.
	* config/spu/spu.c (get_branch_target): Likewise.
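
The core of the change is the rtl.def/rtl.h group above: jump table data used to be a JUMP_INSN whose pattern is an ADDR_VEC or ADDR_DIFF_VEC, and becomes a distinct RTX code, so JUMP_P no longer matches it. A minimal standalone C sketch of the two predicate styles (the enum and struct here are simplified stand-ins, not GCC's real rtx definitions):

```c
#include <assert.h>

/* Simplified stand-ins for GCC's rtx codes -- not the real definitions.  */
enum rtx_code { INSN, JUMP_INSN, JUMP_TABLE_DATA, ADDR_VEC, ADDR_DIFF_VEC };

struct rtx { enum rtx_code code; enum rtx_code pattern_code; };

/* Before the patch: table data is a JUMP_INSN with a vector pattern.  */
static int
old_jump_table_data_p (const struct rtx *x)
{
  return x->code == JUMP_INSN
	 && (x->pattern_code == ADDR_VEC || x->pattern_code == ADDR_DIFF_VEC);
}

/* After the patch: table data has its own code, so JUMP_P no longer
   matches it.  */
static int
new_jump_table_data_p (const struct rtx *x)
{
  return x->code == JUMP_TABLE_DATA;
}
```

With the new-style predicate, the many `JUMP_P (insn) && ! JUMP_TABLE_DATA_P (insn)` guards scattered through the back ends collapse to plain `JUMP_P (insn)`, which is what most of the target hunks in this commit do.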

From-SVN: r197266
Steven Bosscher <steven@gcc.gnu.org>  2013-03-30 14:26:42 +00:00
commit 397186076b (parent 6ab7e76a59)
36 changed files with 224 additions and 185 deletions

bb-reorder.c

@ -1998,14 +1998,14 @@ fix_crossing_unconditional_branches (void)
if (JUMP_P (last_insn)
&& (succ->flags & EDGE_CROSSING))
{
rtx label2, table;
rtx label2;
gcc_assert (!any_condjump_p (last_insn));
/* Make sure the jump is not already an indirect or table jump. */
if (!computed_jump_p (last_insn)
&& !tablejump_p (last_insn, &label2, &table))
&& !tablejump_p (last_insn, &label2, NULL))
{
/* We have found a "crossing" unconditional branch. Now
we must convert it to an indirect jump. First create

cfgbuild.c

@ -54,13 +54,12 @@ inside_basic_block_p (const_rtx insn)
|| ! JUMP_TABLE_DATA_P (insn));
case JUMP_INSN:
return (! JUMP_TABLE_DATA_P (insn));
case CALL_INSN:
case INSN:
case DEBUG_INSN:
return true;
case JUMP_TABLE_DATA:
case BARRIER:
case NOTE:
return false;
@ -84,8 +83,7 @@ control_flow_insn_p (const_rtx insn)
return false;
case JUMP_INSN:
/* Jump insn always causes control transfer except for tablejumps. */
return (! JUMP_TABLE_DATA_P (insn));
return true;
case CALL_INSN:
/* Noreturn and sibling call instructions terminate the basic blocks
@ -109,8 +107,9 @@ control_flow_insn_p (const_rtx insn)
return false;
break;
case JUMP_TABLE_DATA:
case BARRIER:
/* It is nonsense to reach barrier when looking for the
/* It is nonsense to reach this when looking for the
end of basic block, but before dead code is eliminated
this may happen. */
return false;
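
The cfgbuild.c change boils down to a cleaner classification: a JUMP_INSN now always lives inside a basic block and always transfers control, while JUMP_TABLE_DATA never does either. A standalone sketch of the resulting classification (mock enum, mirroring inside_basic_block_p and control_flow_insn_p after the patch; CALL_INSN is simplified here, since real calls are control flow only when noreturn or sibcall):

```c
#include <assert.h>

enum code { CODE_INSN, CODE_JUMP_INSN, CODE_CALL_INSN,
	    CODE_JUMP_TABLE_DATA, CODE_BARRIER, CODE_NOTE };

/* After the patch: only real insns live inside a basic block.  */
static int
inside_bb_p (enum code c)
{
  switch (c)
    {
    case CODE_INSN: case CODE_JUMP_INSN: case CODE_CALL_INSN:
      return 1;
    default:
      return 0;
    }
}

/* After the patch: every JUMP_INSN transfers control, table data never
   does (calls are deliberately left out of this simplified mock).  */
static int
control_flow_p (enum code c)
{
  return c == CODE_JUMP_INSN;
}
```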

cfgrtl.c

@ -2488,7 +2488,7 @@ rtl_verify_flow_info (void)
break;
case CODE_LABEL:
/* An addr_vec is placed outside any basic block. */
/* An ADDR_VEC is placed outside any basic block. */
if (NEXT_INSN (x)
&& JUMP_TABLE_DATA_P (NEXT_INSN (x)))
x = NEXT_INSN (x);
@ -3244,7 +3244,7 @@ fixup_reorder_chain (void)
{
gcc_assert (!onlyjump_p (bb_end_insn)
|| returnjump_p (bb_end_insn));
BB_FOOTER (bb) = emit_barrier_after (bb_end_insn);
emit_barrier_after (bb_end_insn);
continue;
}
@ -3604,7 +3604,7 @@ cfg_layout_can_duplicate_bb_p (const_basic_block bb)
rtx
duplicate_insn_chain (rtx from, rtx to)
{
rtx insn, last, copy;
rtx insn, next, last, copy;
/* Avoid updating of boundaries of previous basic block. The
note will get removed from insn stream in fixup. */
@ -3624,15 +3624,19 @@ duplicate_insn_chain (rtx from, rtx to)
case INSN:
case CALL_INSN:
case JUMP_INSN:
copy = emit_copy_of_insn_after (insn, get_last_insn ());
if (JUMP_P (insn) && JUMP_LABEL (insn) != NULL_RTX
&& ANY_RETURN_P (JUMP_LABEL (insn)))
JUMP_LABEL (copy) = JUMP_LABEL (insn);
maybe_copy_prologue_epilogue_insn (insn, copy);
break;
case JUMP_TABLE_DATA:
/* Avoid copying of dispatch tables. We never duplicate
tablejumps, so this can hit only in case the table got
moved far from original jump. */
if (JUMP_TABLE_DATA_P (insn))
{
/* Avoid copying following barrier as well if any
moved far from original jump.
Avoid copying following barrier as well if any
(and debug insns in between). */
rtx next;
for (next = NEXT_INSN (insn);
next != NEXT_INSN (to);
next = NEXT_INSN (next))
@ -3641,13 +3645,6 @@ duplicate_insn_chain (rtx from, rtx to)
if (next != NEXT_INSN (to) && BARRIER_P (next))
insn = next;
break;
}
copy = emit_copy_of_insn_after (insn, get_last_insn ());
if (JUMP_P (insn) && JUMP_LABEL (insn) != NULL_RTX
&& ANY_RETURN_P (JUMP_LABEL (insn)))
JUMP_LABEL (copy) = JUMP_LABEL (insn);
maybe_copy_prologue_epilogue_insn (insn, copy);
break;
case CODE_LABEL:
break;

config/alpha/alpha.c

@ -7454,7 +7454,6 @@ alpha_does_function_need_gp (void)
for (; insn; insn = NEXT_INSN (insn))
if (NONDEBUG_INSN_P (insn)
&& ! JUMP_TABLE_DATA_P (insn)
&& GET_CODE (PATTERN (insn)) != USE
&& GET_CODE (PATTERN (insn)) != CLOBBER
&& get_attr_usegp (insn))

config/arm/arm.c

@ -22654,11 +22654,7 @@ thumb_far_jump_used_p (void)
insn with the far jump attribute set. */
for (insn = get_insns (); insn; insn = NEXT_INSN (insn))
{
if (JUMP_P (insn)
/* Ignore tablejump patterns. */
&& ! JUMP_TABLE_DATA_P (insn)
&& get_attr_far_jump (insn) == FAR_JUMP_YES
)
if (JUMP_P (insn) && get_attr_far_jump (insn) == FAR_JUMP_YES)
{
/* Record the fact that we have decided that
the function does use far jumps. */

config/frv/frv.c

@ -1409,8 +1409,6 @@ frv_function_contains_far_jump (void)
rtx insn = get_insns ();
while (insn != NULL
&& !(JUMP_P (insn)
/* Ignore tablejump patterns. */
&& ! JUMP_TABLE_DATA_P (insn)
&& get_attr_far_jump (insn) == FAR_JUMP_YES))
insn = NEXT_INSN (insn);
return (insn != NULL);
@ -7480,7 +7478,7 @@ frv_for_each_packet (void (*handle_packet) (void))
frv_start_packet_block ();
}
if (INSN_P (insn) && ! JUMP_TABLE_DATA_P (insn))
if (INSN_P (insn))
switch (GET_CODE (PATTERN (insn)))
{
case USE:

config/i386/i386.c

@ -35116,8 +35116,6 @@ min_insn_size (rtx insn)
if (GET_CODE (PATTERN (insn)) == UNSPEC_VOLATILE
&& XINT (PATTERN (insn), 1) == UNSPECV_ALIGN)
return 0;
if (JUMP_TABLE_DATA_P (insn))
return 0;
/* Important case - calls are always 5 bytes.
It is common to have many calls in the row. */
@ -35208,9 +35206,7 @@ ix86_avoid_jump_mispredicts (void)
while (nbytes + max_skip >= 16)
{
start = NEXT_INSN (start);
if ((JUMP_P (start)
&& ! JUMP_TABLE_DATA_P (start))
|| CALL_P (start))
if (JUMP_P (start) || CALL_P (start))
njumps--, isjump = 1;
else
isjump = 0;
@ -35225,9 +35221,7 @@ ix86_avoid_jump_mispredicts (void)
if (dump_file)
fprintf (dump_file, "Insn %i estimated to %i bytes\n",
INSN_UID (insn), min_size);
if ((JUMP_P (insn)
&& ! JUMP_TABLE_DATA_P (insn))
|| CALL_P (insn))
if (JUMP_P (insn) || CALL_P (insn))
njumps++;
else
continue;
@ -35235,9 +35229,7 @@ ix86_avoid_jump_mispredicts (void)
while (njumps > 3)
{
start = NEXT_INSN (start);
if ((JUMP_P (start)
&& ! JUMP_TABLE_DATA_P (start))
|| CALL_P (start))
if (JUMP_P (start) || CALL_P (start))
njumps--, isjump = 1;
else
isjump = 0;
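
The i386 hunks simplify a sliding-window pass: ix86_avoid_jump_mispredicts tries to keep the number of jumps and calls starting in any 16-byte window at three or fewer (the `while (njumps > 3)` loop above), and with jump table data no longer JUMP_P the filter reduces to `JUMP_P || CALL_P`. A toy sketch of that window metric over a list of (size, is_jump) records — the data layout is invented for illustration, not GCC's:

```c
#include <assert.h>

/* One emitted object: its byte size and whether it is a jump or call.  */
struct slot { int size; int is_jump; };

/* Return the largest number of jumps/calls whose starting offsets fall
   inside one 16-byte window -- the quantity the pass tries to cap at 3.  */
static int
max_jumps_in_16_bytes (const struct slot *s, int n)
{
  int best = 0;
  for (int i = 0; i < n; i++)
    {
      int bytes = 0, njumps = 0;
      for (int j = i; j < n && bytes < 16; j++)
	{
	  if (s[j].is_jump)
	    njumps++;
	  bytes += s[j].size;	/* offset of the next slot within the window */
	}
      if (njumps > best)
	best = njumps;
    }
  return best;
}
```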

config/m32r/m32r.c

@ -1308,7 +1308,6 @@ static int
m32r_is_insn (rtx insn)
{
return (NONDEBUG_INSN_P (insn)
&& ! JUMP_TABLE_DATA_P (insn)
&& GET_CODE (PATTERN (insn)) != USE
&& GET_CODE (PATTERN (insn)) != CLOBBER);
}

config/mep/mep.c

@ -5511,7 +5511,6 @@ mep_reorg_erepeat (rtx insns)
for (insn = insns; insn; insn = NEXT_INSN (insn))
if (JUMP_P (insn)
&& ! JUMP_TABLE_DATA_P (insn)
&& mep_invertable_branch_p (insn))
{
if (dump_file)

config/mips/mips.c

@ -99,7 +99,6 @@ along with GCC; see the file COPYING3. If not see
moved to rtl.h. */
#define USEFUL_INSN_P(INSN) \
(NONDEBUG_INSN_P (INSN) \
&& ! JUMP_TABLE_DATA_P (INSN) \
&& GET_CODE (PATTERN (INSN)) != USE \
&& GET_CODE (PATTERN (INSN)) != CLOBBER)
@ -14654,8 +14653,10 @@ mips16_insn_length (rtx insn)
rtx body = PATTERN (insn);
if (GET_CODE (body) == ADDR_VEC)
return GET_MODE_SIZE (GET_MODE (body)) * XVECLEN (body, 0);
if (GET_CODE (body) == ADDR_DIFF_VEC)
else if (GET_CODE (body) == ADDR_DIFF_VEC)
return GET_MODE_SIZE (GET_MODE (body)) * XVECLEN (body, 1);
else
gcc_unreachable ();
}
return get_attr_length (insn);
}
@ -16184,7 +16185,6 @@ mips_has_long_branch_p (void)
for (insn = get_insns (); insn; insn = NEXT_INSN (insn))
FOR_EACH_SUBINSN (subinsn, insn)
if (JUMP_P (subinsn)
&& USEFUL_INSN_P (subinsn)
&& get_attr_length (subinsn) > normal_length
&& (any_condjump_p (subinsn) || any_uncondjump_p (subinsn)))
return true;
@ -16286,7 +16286,6 @@ mips16_split_long_branches (void)
something_changed = false;
for (insn = get_insns (); insn; insn = NEXT_INSN (insn))
if (JUMP_P (insn)
&& USEFUL_INSN_P (insn)
&& get_attr_length (insn) > 8
&& (any_condjump_p (insn) || any_uncondjump_p (insn)))
{

config/pa/pa.c

@ -9134,7 +9134,6 @@ pa_combine_instructions (void)
/* We only care about INSNs, JUMP_INSNs, and CALL_INSNs.
Also ignore any special USE insns. */
if ((! NONJUMP_INSN_P (anchor) && ! JUMP_P (anchor) && ! CALL_P (anchor))
|| JUMP_TABLE_DATA_P (anchor)
|| GET_CODE (PATTERN (anchor)) == USE
|| GET_CODE (PATTERN (anchor)) == CLOBBER)
continue;
@ -9159,8 +9158,7 @@ pa_combine_instructions (void)
continue;
/* Anything except a regular INSN will stop our search. */
if (! NONJUMP_INSN_P (floater)
|| JUMP_TABLE_DATA_P (floater))
if (! NONJUMP_INSN_P (floater))
{
floater = NULL_RTX;
break;
@ -9220,8 +9218,7 @@ pa_combine_instructions (void)
continue;
/* Anything except a regular INSN will stop our search. */
if (! NONJUMP_INSN_P (floater)
|| JUMP_TABLE_DATA_P (floater))
if (! NONJUMP_INSN_P (floater))
{
floater = NULL_RTX;
break;

config/rs6000/rs6000.c

@ -1290,9 +1290,6 @@ static const struct attribute_spec rs6000_attribute_table[] =
#undef TARGET_FUNCTION_OK_FOR_SIBCALL
#define TARGET_FUNCTION_OK_FOR_SIBCALL rs6000_function_ok_for_sibcall
#undef TARGET_INVALID_WITHIN_DOLOOP
#define TARGET_INVALID_WITHIN_DOLOOP rs6000_invalid_within_doloop
#undef TARGET_REGISTER_MOVE_COST
#define TARGET_REGISTER_MOVE_COST rs6000_register_move_cost
#undef TARGET_MEMORY_MOVE_COST
@ -18778,22 +18775,6 @@ rs6000_function_ok_for_sibcall (tree decl, tree exp)
return false;
}
/* NULL if INSN insn is valid within a low-overhead loop.
Otherwise return why doloop cannot be applied.
PowerPC uses the COUNT register for branch on table instructions. */
static const char *
rs6000_invalid_within_doloop (const_rtx insn)
{
if (CALL_P (insn))
return "Function call in the loop.";
if (JUMP_TABLE_DATA_P (insn))
return "Computed branch in the loop.";
return NULL;
}
static int
rs6000_ra_ever_killed (void)
{
@ -23940,7 +23921,7 @@ get_next_active_insn (rtx insn, rtx tail)
return NULL_RTX;
if (CALL_P (insn)
|| JUMP_P (insn)
|| JUMP_P (insn) || JUMP_TABLE_DATA_P (insn)
|| (NONJUMP_INSN_P (insn)
&& GET_CODE (PATTERN (insn)) != USE
&& GET_CODE (PATTERN (insn)) != CLOBBER

config/s390/s390.c

@ -6867,7 +6867,7 @@ s390_chunkify_start (void)
}
}
if (JUMP_P (insn) || LABEL_P (insn))
if (JUMP_P (insn) || JUMP_TABLE_DATA_P (insn) || LABEL_P (insn))
{
if (curr_pool)
s390_add_pool_insn (curr_pool, insn);

config/sh/sh.c

@ -5213,7 +5213,8 @@ find_barrier (int num_mova, rtx mova, rtx from)
if (found_si > count_si)
count_si = found_si;
}
else if (JUMP_TABLE_DATA_P (from))
else if (JUMP_TABLE_DATA_P (from)
&& GET_CODE (PATTERN (from)) == ADDR_DIFF_VEC)
{
if ((num_mova > 1 && GET_MODE (prev_nonnote_insn (from)) == VOIDmode)
|| (num_mova
@ -5247,7 +5248,7 @@ find_barrier (int num_mova, rtx mova, rtx from)
/* There is a possibility that a bf is transformed into a bf/s by the
delay slot scheduler. */
if (JUMP_P (from) && !JUMP_TABLE_DATA_P (from)
if (JUMP_P (from)
&& get_attr_type (from) == TYPE_CBRANCH
&& ! sequence_insn_p (from))
inc += 2;
@ -5973,7 +5974,6 @@ sh_loop_align (rtx label)
if (! next
|| ! INSN_P (next)
|| GET_CODE (PATTERN (next)) == ADDR_DIFF_VEC
|| recog_memoized (next) == CODE_FOR_consttable_2)
return 0;
@ -6494,9 +6494,7 @@ split_branches (rtx first)
so transform it into a note. */
SET_INSN_DELETED (insn);
}
else if (JUMP_P (insn)
/* Don't mess with ADDR_DIFF_VEC */
&& ! JUMP_TABLE_DATA_P (insn))
else if (JUMP_P (insn))
{
enum attr_type type = get_attr_type (insn);
if (type == TYPE_CBRANCH)
@ -10122,8 +10120,7 @@ sh_insn_length_adjustment (rtx insn)
if (((NONJUMP_INSN_P (insn)
&& GET_CODE (PATTERN (insn)) != USE
&& GET_CODE (PATTERN (insn)) != CLOBBER)
|| CALL_P (insn)
|| (JUMP_P (insn) && !JUMP_TABLE_DATA_P (insn)))
|| CALL_P (insn) || JUMP_P (insn))
&& ! sequence_insn_p (insn)
&& get_attr_needs_delay_slot (insn) == NEEDS_DELAY_SLOT_YES)
return 2;
@ -10131,7 +10128,7 @@ sh_insn_length_adjustment (rtx insn)
/* SH2e has a bug that prevents the use of annulled branches, so if
the delay slot is not filled, we'll have to put a NOP in it. */
if (sh_cpu_attr == CPU_SH2E
&& JUMP_P (insn) && !JUMP_TABLE_DATA_P (insn)
&& JUMP_P (insn)
&& get_attr_type (insn) == TYPE_CBRANCH
&& ! sequence_insn_p (insn))
return 2;

config/spu/spu.c

@ -2171,10 +2171,6 @@ get_branch_target (rtx branch)
if (GET_CODE (PATTERN (branch)) == RETURN)
return gen_rtx_REG (SImode, LINK_REGISTER_REGNUM);
/* jump table */
if (JUMP_TABLE_DATA_P (branch))
return 0;
/* ASM GOTOs. */
if (extract_asm_operands (PATTERN (branch)) != NULL)
return NULL;

emit-rtl.c

@ -3268,6 +3268,7 @@ int
active_insn_p (const_rtx insn)
{
return (CALL_P (insn) || JUMP_P (insn)
|| JUMP_TABLE_DATA_P (insn) /* FIXME */
|| (NONJUMP_INSN_P (insn)
&& (! reload_completed
|| (GET_CODE (PATTERN (insn)) != USE
@ -3900,7 +3901,7 @@ add_insn_before (rtx insn, rtx before, basic_block bb)
void
set_insn_deleted (rtx insn)
{
if (INSN_P (insn) && !JUMP_TABLE_DATA_P (insn))
if (INSN_P (insn))
df_insn_delete (insn);
PUT_CODE (insn, NOTE);
NOTE_KIND (insn) = NOTE_INSN_DELETED;
@ -3968,7 +3969,7 @@ remove_insn (rtx insn)
}
/* Invalidate data flow information associated with INSN. */
if (INSN_P (insn) && !JUMP_TABLE_DATA_P (insn))
if (INSN_P (insn))
df_insn_delete (insn);
/* Fix up basic block boundaries, if necessary. */
@ -4661,6 +4662,7 @@ emit_insn (rtx x)
break;
#ifdef ENABLE_RTL_CHECKING
case JUMP_TABLE_DATA:
case SEQUENCE:
gcc_unreachable ();
break;
@ -4707,6 +4709,7 @@ emit_debug_insn (rtx x)
break;
#ifdef ENABLE_RTL_CHECKING
case JUMP_TABLE_DATA:
case SEQUENCE:
gcc_unreachable ();
break;
@ -4749,6 +4752,7 @@ emit_jump_insn (rtx x)
break;
#ifdef ENABLE_RTL_CHECKING
case JUMP_TABLE_DATA:
case SEQUENCE:
gcc_unreachable ();
break;
@ -4785,6 +4789,7 @@ emit_call_insn (rtx x)
#ifdef ENABLE_RTL_CHECKING
case SEQUENCE:
case JUMP_TABLE_DATA:
gcc_unreachable ();
break;
#endif
@ -4809,6 +4814,20 @@ emit_label (rtx label)
return label;
}
/* Make an insn of code JUMP_TABLE_DATA
and add it to the end of the doubly-linked list. */
rtx
emit_jump_table_data (rtx table)
{
rtx jump_table_data = rtx_alloc (JUMP_TABLE_DATA);
INSN_UID (jump_table_data) = cur_insn_uid++;
PATTERN (jump_table_data) = table;
BLOCK_FOR_INSN (jump_table_data) = NULL;
add_insn (jump_table_data);
return jump_table_data;
}
/* Make an insn of code BARRIER
and add it to the end of the doubly-linked list. */
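
The new emit_jump_table_data above follows the standard emit-rtl.c recipe: allocate the object, assign it the next insn UID, attach the pattern, clear its block, and append it to the doubly-linked insn chain via add_insn. A hedged standalone sketch of just that chain-append logic (the struct and globals are illustrative stand-ins, not GCC's):

```c
#include <assert.h>
#include <stdlib.h>

/* Illustrative stand-in for an object on the doubly-linked insn chain.  */
struct insn
{
  int uid;
  struct insn *prev, *next;
};

static int cur_insn_uid = 1;
static struct insn *first_insn, *last_insn;

/* Mirror of the add_insn step: link a freshly allocated object at the
   end of the chain and give it the next UID.  */
static struct insn *
emit_chain_node (void)
{
  struct insn *x = calloc (1, sizeof *x);
  x->uid = cur_insn_uid++;
  x->prev = last_insn;
  if (last_insn)
    last_insn->next = x;
  else
    first_insn = x;
  last_insn = x;
  return x;
}
```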

final.c

@ -391,17 +391,7 @@ get_attr_length_1 (rtx insn, int (*fallback_fn) (rtx))
return 0;
case CALL_INSN:
length = fallback_fn (insn);
break;
case JUMP_INSN:
body = PATTERN (insn);
if (JUMP_TABLE_DATA_P (insn))
{
/* Alignment is machine-dependent and should be handled by
ADDR_VEC_ALIGN. */
}
else
length = fallback_fn (insn);
break;

function.c

@ -1915,8 +1915,7 @@ instantiate_virtual_regs (void)
{
/* These patterns in the instruction stream can never be recognized.
Fortunately, they shouldn't contain virtual registers either. */
if (JUMP_TABLE_DATA_P (insn)
|| GET_CODE (PATTERN (insn)) == USE
if (GET_CODE (PATTERN (insn)) == USE
|| GET_CODE (PATTERN (insn)) == CLOBBER
|| GET_CODE (PATTERN (insn)) == ASM_INPUT)
continue;

gcse.c

@ -2148,20 +2148,10 @@ insert_insn_end_basic_block (struct expr *expr, basic_block bb)
&& (!single_succ_p (bb)
|| single_succ_edge (bb)->flags & EDGE_ABNORMAL)))
{
#ifdef HAVE_cc0
rtx note;
#endif
/* If this is a jump table, then we can't insert stuff here. Since
we know the previous real insn must be the tablejump, we insert
the new instruction just before the tablejump. */
if (JUMP_TABLE_DATA_P (insn))
insn = prev_active_insn (insn);
#ifdef HAVE_cc0
/* FIXME: 'twould be nice to call prev_cc0_setter here but it aborts
if cc0 isn't set. */
note = find_reg_note (insn, REG_CC_SETTER, NULL_RTX);
rtx note = find_reg_note (insn, REG_CC_SETTER, NULL_RTX);
if (note)
insn = XEXP (note, 0);
else

gengtype.c

@ -1219,6 +1219,8 @@ adjust_field_rtx_def (type_p t, options_p ARG_UNUSED (opt))
t = scalar_tp, subname = "rt_int";
else if (i == SYMBOL_REF && aindex == 2)
t = symbol_union_tp, subname = "";
else if (i == JUMP_TABLE_DATA && aindex >= 5)
t = scalar_tp, subname = "rt_int";
else if (i == BARRIER && aindex >= 3)
t = scalar_tp, subname = "rt_int";
else if (i == ENTRY_VALUE && aindex == 0)

ira-costs.c

@ -1269,8 +1269,7 @@ scan_one_insn (rtx insn)
int i, k;
bool counted_mem;
if (!NONDEBUG_INSN_P (insn)
|| JUMP_TABLE_DATA_P (insn))
if (!NONDEBUG_INSN_P (insn))
return insn;
pat_code = GET_CODE (PATTERN (insn));

jump.c

@ -274,19 +274,13 @@ mark_all_labels (rtx f)
basic blocks. If those non-insns represent tablejump data,
they contain label references that we must record. */
for (insn = BB_HEADER (bb); insn; insn = NEXT_INSN (insn))
if (INSN_P (insn))
{
gcc_assert (JUMP_TABLE_DATA_P (insn));
if (JUMP_TABLE_DATA_P (insn))
mark_jump_label (PATTERN (insn), insn, 0);
}
for (insn = BB_FOOTER (bb); insn; insn = NEXT_INSN (insn))
if (INSN_P (insn))
{
gcc_assert (JUMP_TABLE_DATA_P (insn));
if (JUMP_TABLE_DATA_P (insn))
mark_jump_label (PATTERN (insn), insn, 0);
}
}
}
else
{
rtx prev_nonjump_insn = NULL;
@ -296,6 +290,8 @@ mark_all_labels (rtx f)
;
else if (LABEL_P (insn))
prev_nonjump_insn = NULL;
else if (JUMP_TABLE_DATA_P (insn))
mark_jump_label (PATTERN (insn), insn, 0);
else if (NONDEBUG_INSN_P (insn))
{
mark_jump_label (PATTERN (insn), insn, 0);

lra-eliminations.c

@ -767,8 +767,7 @@ eliminate_regs_in_insn (rtx insn, bool replace_p)
if (icode < 0 && asm_noperands (PATTERN (insn)) < 0 && ! DEBUG_INSN_P (insn))
{
lra_assert (JUMP_TABLE_DATA_P (insn)
|| GET_CODE (PATTERN (insn)) == USE
lra_assert (GET_CODE (PATTERN (insn)) == USE
|| GET_CODE (PATTERN (insn)) == CLOBBER
|| GET_CODE (PATTERN (insn)) == ASM_INPUT);
return;

lra.c

@ -1619,18 +1619,10 @@ add_regs_to_insn_regno_info (lra_insn_recog_data_t data, rtx x, int uid,
static int
get_insn_freq (rtx insn)
{
basic_block bb;
basic_block bb = BLOCK_FOR_INSN (insn);
if ((bb = BLOCK_FOR_INSN (insn)) != NULL)
gcc_checking_assert (bb != NULL);
return REG_FREQ_FROM_BB (bb);
else
{
lra_assert (lra_insn_recog_data[INSN_UID (insn)]
->insn_static_data->n_operands == 0);
/* We don't care about such insn, e.g. it might be jump with
addr_vec. */
return 1;
}
}
/* Invalidate all reg info of INSN with DATA and execution frequency
@ -1997,7 +1989,6 @@ check_rtl (bool final_p)
FOR_EACH_BB (bb)
FOR_BB_INSNS (bb, insn)
if (NONDEBUG_INSN_P (insn)
&& ! JUMP_TABLE_DATA_P (insn)
&& GET_CODE (PATTERN (insn)) != USE
&& GET_CODE (PATTERN (insn)) != CLOBBER
&& GET_CODE (PATTERN (insn)) != ASM_INPUT)

postreload-gcse.c

@ -918,7 +918,7 @@ bb_has_well_behaved_predecessors (basic_block bb)
if ((pred->flags & EDGE_ABNORMAL_CALL) && cfun->has_nonlocal_label)
return false;
if (JUMP_TABLE_DATA_P (BB_END (pred->src)))
if (tablejump_p (BB_END (pred->src), NULL, NULL))
return false;
}
return true;

predict.c

@ -2748,8 +2748,7 @@ expensive_function_p (int threshold)
{
rtx insn;
for (insn = BB_HEAD (bb); insn != NEXT_INSN (BB_END (bb));
insn = NEXT_INSN (insn))
FOR_BB_INSNS (bb, insn)
if (active_insn_p (insn))
{
sum += bb->frequency;

print-rtl.c

@ -778,6 +778,7 @@ print_rtl (FILE *outf, const_rtx rtx_first)
case CALL_INSN:
case NOTE:
case CODE_LABEL:
case JUMP_TABLE_DATA:
case BARRIER:
for (tmp_rtx = rtx_first; tmp_rtx != 0; tmp_rtx = NEXT_INSN (tmp_rtx))
{

reload1.c

@ -1490,7 +1490,7 @@ calculate_needs_all_insns (int global)
include REG_LABEL_OPERAND and REG_LABEL_TARGET), we need to see
what effects this has on the known offsets at labels. */
if (LABEL_P (insn) || JUMP_P (insn)
if (LABEL_P (insn) || JUMP_P (insn) || JUMP_TABLE_DATA_P (insn)
|| (INSN_P (insn) && REG_NOTES (insn) != 0))
set_label_offsets (insn, insn, 0);
@ -1620,7 +1620,7 @@ calculate_elim_costs_all_insns (void)
include REG_LABEL_OPERAND and REG_LABEL_TARGET), we need to see
what effects this has on the known offsets at labels. */
if (LABEL_P (insn) || JUMP_P (insn)
if (LABEL_P (insn) || JUMP_P (insn) || JUMP_TABLE_DATA_P (insn)
|| (INSN_P (insn) && REG_NOTES (insn) != 0))
set_label_offsets (insn, insn, 0);
@ -2404,6 +2404,10 @@ set_label_offsets (rtx x, rtx insn, int initial_p)
return;
case JUMP_TABLE_DATA:
set_label_offsets (PATTERN (insn), insn, initial_p);
return;
case JUMP_INSN:
set_label_offsets (PATTERN (insn), insn, initial_p);
@ -3234,11 +3238,10 @@ eliminate_regs_in_insn (rtx insn, int replace)
if (! insn_is_asm && icode < 0)
{
gcc_assert (JUMP_TABLE_DATA_P (insn)
gcc_assert (DEBUG_INSN_P (insn)
|| GET_CODE (PATTERN (insn)) == USE
|| GET_CODE (PATTERN (insn)) == CLOBBER
|| GET_CODE (PATTERN (insn)) == ASM_INPUT
|| DEBUG_INSN_P (insn));
|| GET_CODE (PATTERN (insn)) == ASM_INPUT);
if (DEBUG_INSN_P (insn))
INSN_VAR_LOCATION_LOC (insn)
= eliminate_regs (INSN_VAR_LOCATION_LOC (insn), VOIDmode, insn);
@ -3644,11 +3647,10 @@ elimination_costs_in_insn (rtx insn)
if (! insn_is_asm && icode < 0)
{
gcc_assert (JUMP_TABLE_DATA_P (insn)
gcc_assert (DEBUG_INSN_P (insn)
|| GET_CODE (PATTERN (insn)) == USE
|| GET_CODE (PATTERN (insn)) == CLOBBER
|| GET_CODE (PATTERN (insn)) == ASM_INPUT
|| DEBUG_INSN_P (insn));
|| GET_CODE (PATTERN (insn)) == ASM_INPUT);
return;
}
@ -8866,8 +8868,7 @@ delete_output_reload (rtx insn, int j, int last_reload_reg, rtx new_reload_reg)
since if they are the only uses, they are dead. */
if (set != 0 && SET_DEST (set) == reg)
continue;
if (LABEL_P (i2)
|| JUMP_P (i2))
if (LABEL_P (i2) || JUMP_P (i2))
break;
if ((NONJUMP_INSN_P (i2) || CALL_P (i2))
&& reg_mentioned_p (reg, PATTERN (i2)))
@ -8891,8 +8892,7 @@ delete_output_reload (rtx insn, int j, int last_reload_reg, rtx new_reload_reg)
delete_address_reloads (i2, insn);
delete_insn (i2);
}
if (LABEL_P (i2)
|| JUMP_P (i2))
if (LABEL_P (i2) || JUMP_P (i2))
break;
}

reorg.c

@ -3700,14 +3700,14 @@ dbr_schedule (rtx first)
{
rtx target;
if (JUMP_P (insn))
INSN_ANNULLED_BRANCH_P (insn) = 0;
INSN_FROM_TARGET_P (insn) = 0;
/* Skip vector tables. We can't get attributes for them. */
if (JUMP_TABLE_DATA_P (insn))
continue;
if (JUMP_P (insn))
INSN_ANNULLED_BRANCH_P (insn) = 0;
INSN_FROM_TARGET_P (insn) = 0;
if (num_delay_slots (insn) > 0)
obstack_ptr_grow (&unfilled_slots_obstack, insn);

rtl.def

@ -64,7 +64,8 @@ along with GCC; see the file COPYING3. If not see
RTX_BITFIELD_OPS
an rtx code for a bit-field operation (ZERO_EXTRACT, SIGN_EXTRACT)
RTX_INSN
an rtx code for a machine insn (INSN, JUMP_INSN, CALL_INSN)
an rtx code for a machine insn (INSN, JUMP_INSN, CALL_INSN) or
data that will be output as assembly pseudo-ops (DEBUG_INSN)
RTX_MATCH
an rtx code for something that matches in insns (e.g, MATCH_DUP)
RTX_AUTOINC
@ -137,6 +138,13 @@ DEF_RTL_EXPR(JUMP_INSN, "jump_insn", "iuuBeiie0", RTX_INSN)
All other fields ( rtx->u.fld[] ) have exact same meaning as INSN's. */
DEF_RTL_EXPR(CALL_INSN, "call_insn", "iuuBeiiee", RTX_INSN)
/* Placeholder for tablejump JUMP_INSNs. The pattern of this kind
of rtx is always either an ADDR_VEC or an ADDR_DIFF_VEC. These
placeholders do not appear as real instructions inside a basic
block, but are considered active_insn_p instructions for historical
reasons, when jump table data was represented with JUMP_INSNs. */
DEF_RTL_EXPR(JUMP_TABLE_DATA, "jump_table_data", "iuuBe0000", RTX_INSN)
/* A marker that indicates that control will not flow through. */
DEF_RTL_EXPR(BARRIER, "barrier", "iuu00000", RTX_EXTRA)
@ -214,8 +222,12 @@ DEF_RTL_EXPR(UNSPEC, "unspec", "Ei", RTX_EXTRA)
/* Similar, but a volatile operation and one which may trap. */
DEF_RTL_EXPR(UNSPEC_VOLATILE, "unspec_volatile", "Ei", RTX_EXTRA)
/* Vector of addresses, stored as full words. */
/* Each element is a LABEL_REF to a CODE_LABEL whose address we want. */
/* ----------------------------------------------------------------------
Table jump addresses.
---------------------------------------------------------------------- */
/* Vector of addresses, stored as full words.
Each element is a LABEL_REF to a CODE_LABEL whose address we want. */
DEF_RTL_EXPR(ADDR_VEC, "addr_vec", "E", RTX_EXTRA)
/* Vector of address differences X0 - BASE, X1 - BASE, ...
@ -240,7 +252,6 @@ DEF_RTL_EXPR(ADDR_VEC, "addr_vec", "E", RTX_EXTRA)
The third, fourth and fifth operands are only valid when
CASE_VECTOR_SHORTEN_MODE is defined, and only in an optimizing
compilation. */
DEF_RTL_EXPR(ADDR_DIFF_VEC, "addr_diff_vec", "eEee0", RTX_EXTRA)
/* Memory prefetch, with attributes supported on some targets.

rtl.h

@ -363,6 +363,7 @@ struct GTY((chain_next ("RTX_NEXT (&%h)"),
*/
#define RTX_PREV(X) ((INSN_P (X) \
|| NOTE_P (X) \
|| JUMP_TABLE_DATA_P (X) \
|| BARRIER_P (X) \
|| LABEL_P (X)) \
&& PREV_INSN (X) != NULL \
@ -469,9 +470,7 @@ struct GTY((variable_size)) rtvec_def {
#define BARRIER_P(X) (GET_CODE (X) == BARRIER)
/* Predicate yielding nonzero iff X is a data for a jump table. */
#define JUMP_TABLE_DATA_P(INSN) \
(JUMP_P (INSN) && (GET_CODE (PATTERN (INSN)) == ADDR_VEC || \
GET_CODE (PATTERN (INSN)) == ADDR_DIFF_VEC))
#define JUMP_TABLE_DATA_P(INSN) (GET_CODE (INSN) == JUMP_TABLE_DATA)
/* Predicate yielding nonzero iff X is a return or simple_return. */
#define ANY_RETURN_P(X) \
@ -849,8 +848,8 @@ extern void rtl_check_failed_flag (const char *, const_rtx, const char *,
/* 1 if RTX is an insn that has been deleted. */
#define INSN_DELETED_P(RTX) \
(RTL_FLAG_CHECK7("INSN_DELETED_P", (RTX), DEBUG_INSN, INSN, \
CALL_INSN, JUMP_INSN, \
(RTL_FLAG_CHECK8("INSN_DELETED_P", (RTX), DEBUG_INSN, INSN, \
CALL_INSN, JUMP_INSN, JUMP_TABLE_DATA, \
CODE_LABEL, BARRIER, NOTE)->volatil)
/* 1 if RTX is a call to a const function. Built from ECF_CONST and
@ -1881,6 +1880,7 @@ extern rtx emit_debug_insn (rtx);
extern rtx emit_jump_insn (rtx);
extern rtx emit_call_insn (rtx);
extern rtx emit_label (rtx);
extern rtx emit_jump_table_data (rtx);
extern rtx emit_barrier (void);
extern rtx emit_note (enum insn_note);
extern rtx emit_note_copy (rtx);

sched-rgn.c

@ -2449,7 +2449,7 @@ add_branch_dependences (rtx head, rtx tail)
insn = tail;
last = 0;
while (CALL_P (insn)
|| JUMP_P (insn)
|| JUMP_P (insn) || JUMP_TABLE_DATA_P (insn)
|| (NONJUMP_INSN_P (insn)
&& (GET_CODE (PATTERN (insn)) == USE
|| GET_CODE (PATTERN (insn)) == CLOBBER
@ -2536,7 +2536,7 @@ add_branch_dependences (rtx head, rtx tail)
possible improvement for handling COND_EXECs in this scheduler: it
could remove always-true predicates. */
if (!reload_completed || ! JUMP_P (tail))
if (!reload_completed || ! (JUMP_P (tail) || JUMP_TABLE_DATA_P (tail)))
return;
insn = tail;

sched-vis.c

@ -666,6 +666,11 @@ print_insn (pretty_printer *pp, const_rtx x, int verbose)
case CODE_LABEL:
pp_printf (pp, "L%d:", INSN_UID (x));
break;
case JUMP_TABLE_DATA:
pp_string (pp, "jump_table_data{\n");
print_pattern (pp, PATTERN (x), verbose);
pp_string (pp, "}");
break;
case BARRIER:
pp_string (pp, "barrier");
break;

stmt.c

@ -2025,12 +2025,13 @@ emit_case_dispatch_table (tree index_expr, tree index_type,
emit_label (table_label);
if (CASE_VECTOR_PC_RELATIVE || flag_pic)
emit_jump_insn (gen_rtx_ADDR_DIFF_VEC (CASE_VECTOR_MODE,
gen_rtx_LABEL_REF (Pmode, table_label),
emit_jump_table_data (gen_rtx_ADDR_DIFF_VEC (CASE_VECTOR_MODE,
gen_rtx_LABEL_REF (Pmode,
table_label),
gen_rtvec_v (ncases, labelvec),
const0_rtx, const0_rtx));
else
emit_jump_insn (gen_rtx_ADDR_VEC (CASE_VECTOR_MODE,
emit_jump_table_data (gen_rtx_ADDR_VEC (CASE_VECTOR_MODE,
gen_rtvec_v (ncases, labelvec)));
/* Record no drop-through after the table. */

targhooks.c

@ -474,7 +474,7 @@ default_invalid_within_doloop (const_rtx insn)
if (CALL_P (insn))
return "Function call in loop.";
if (JUMP_TABLE_DATA_P (insn))
if (tablejump_p (insn, NULL, NULL) || computed_jump_p (insn))
return "Computed branch in the loop.";
return NULL;