The Power Rating of a Resistor
The power rating of a resistor is the specification of the maximum amount of power the resistor can safely dissipate. Thus, if a resistor has a power rating of 1/4 watt, then 1/4 watt is the most power that should be dissipated in that resistor.
When current passes through electrical components, it generates heat. If the current is small enough, this heat has no effect on the circuit. But a large enough current can generate substantial heat, which can melt components and possibly create shorts in the circuit. This is why resistors are given power ratings: to specify the maximum power each resistor can safely dissipate. If this wattage is exceeded, the resistor may overheat and melt, creating a short that can lead to even greater hazards for the circuit.
In most electronic circuits, the power rating needs little thought, because the standard 0.25 W resistor is normally suitable: electronic circuits, for the vast majority, deal with low voltage and low current, and thus low power. But in circuits with high voltage and low resistance (high power), resistor power ratings must be chosen carefully, since much more power is dissipated in the circuit. Always choose a resistor with a power rating higher than the power it will actually dissipate, so that the resistor isn't destroyed by excess heat and doesn't cause further hazards or malfunctions in the circuit.
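To check whether a resistor's rating is adequate, you can compute the power it actually dissipates. A minimal sketch in Python, using the standard formula P = V²/R (the function name and example values here are illustrative, not from the original text):

```python
def dissipated_power(voltage_v, resistance_ohm):
    """Power dissipated in a resistor, in watts, from P = V^2 / R."""
    return voltage_v ** 2 / resistance_ohm

# Example: 9 V across a 100-ohm resistor dissipates
# 9^2 / 100 = 0.81 W, far too much for a 0.25 W resistor.
print(dissipated_power(9.0, 100.0))  # 0.81
```

The same check can be done from current with P = I²R when the current, rather than the voltage, is known.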
The common standard power ratings of resistors are 0.25 W, 0.5 W, 1 W, 2 W, 5 W, and 25 W, so the circuit designer must choose from these accordingly for the circuit.
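Picking from these standard values can be sketched as follows. This assumes a 2x safety margin over the computed dissipation, which is a common rule of thumb rather than anything stated above:

```python
# Standard ratings from the text, in watts.
STANDARD_RATINGS_W = [0.25, 0.5, 1.0, 2.0, 5.0, 25.0]

def choose_rating(power_w, margin=2.0):
    """Smallest standard rating at least `margin` times the dissipated power.

    The 2x default margin is an assumed rule of thumb, not a fixed standard.
    """
    required = power_w * margin
    for rating in STANDARD_RATINGS_W:
        if rating >= required:
            return rating
    raise ValueError("dissipation exceeds the largest standard rating")

# 0.81 W dissipated -> need at least 1.62 W -> choose the 2 W resistor.
print(choose_rating(0.81))  # 2.0
```

A larger margin simply pushes the choice up to the next standard value, trading cost and size for a cooler-running resistor.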