The ohm is defined as the electric resistance between two points of a conductor when a constant potential difference of 1 volt, applied to these points, produces in the conductor a current of 1 ampere, the conductor not being the seat of any electromotive force.
In many cases the resistance of a conductor in ohms is approximately constant within a certain range of voltages, temperatures, and other parameters; such devices are called linear resistors. In other cases resistance varies with conditions, as in a thermistor, whose resistance depends strongly on temperature.
Commonly used multiples and submultiples in electrical and electronic usage are the milliohm, ohm, kilohm, and megohm.
In alternating current circuits, electrical impedance is also measured in ohms.
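As a sketch of how impedance in ohms is computed in practice, the following uses Python's built-in complex numbers for a series RLC circuit; the function name `series_rlc_impedance` and the component values are illustrative, not from any standard library.

```python
import cmath
import math

def series_rlc_impedance(r_ohm, l_henry, c_farad, freq_hz):
    """Complex impedance of a series RLC circuit, in ohms (illustrative helper)."""
    omega = 2 * math.pi * freq_hz       # angular frequency in rad/s
    z_l = 1j * omega * l_henry          # inductive reactance, j*omega*L
    z_c = 1 / (1j * omega * c_farad)    # capacitive reactance, 1/(j*omega*C)
    return r_ohm + z_l + z_c

# Example values: 100 ohm resistor, 10 mH inductor, 1 uF capacitor at 50 Hz
z = series_rlc_impedance(r_ohm=100.0, l_henry=10e-3, c_farad=1e-6, freq_hz=50.0)
print(abs(z))          # impedance magnitude in ohms
print(cmath.phase(z))  # phase angle in radians
```

The real part of the result is the resistance and the imaginary part the reactance; both carry the unit ohm.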
This definition implies Ohm's law, R = V/I. For a device whose resistance varies with current, this ratio gives only the static resistance at a particular operating point, and no single resistance value describes the device.
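The distinction can be illustrated numerically; the helper `static_resistance` and the nonlinear I-V sample points below (loosely diode-like) are made up for the example.

```python
def static_resistance(v_volts, i_amperes):
    """Static resistance R = V / I at one operating point, in ohms."""
    return v_volts / i_amperes

# Linear resistor: the ratio V/I is the same at every operating point.
print(static_resistance(1.0, 0.001))  # 1000.0
print(static_resistance(5.0, 0.005))  # 1000.0

# Nonlinear device (made-up, diode-like I-V points): the ratio changes
# from point to point, so no single resistance value describes it.
print(static_resistance(0.6, 0.010))  # about 60 ohms at this point
print(static_resistance(0.7, 0.100))  # about 7 ohms at this point
```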
Care should be taken when preparing documents (including HTML documents) that use the symbol Ω. Some document-editing software substitutes the Symbol typeface to render the character; where that font is not available, a W is displayed instead (a "10 W" resistor instead of a "10 Ω" resistor, for instance). Since W is the symbol of the SI unit of power, the watt, not of resistance, this can lead to confusion.
Unicode encodes a separate ohm sign (U+2126, Ω) in its Letterlike Symbols block, distinct from the Greek omega, but it is included only for backward compatibility; the Greek capital letter omega (U+03A9, Ω) is the preferred character.
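The relationship between the two code points can be checked with Python's standard `unicodedata` module: U+2126 carries a canonical decomposition to U+03A9, so any Unicode normalization form replaces the compatibility character with the preferred Greek letter.

```python
import unicodedata

ohm_sign = "\u2126"  # OHM SIGN, Letterlike Symbols block
omega = "\u03a9"     # GREEK CAPITAL LETTER OMEGA, the preferred character

print(unicodedata.name(ohm_sign))  # OHM SIGN
print(unicodedata.name(omega))     # GREEK CAPITAL LETTER OMEGA

# U+2126 canonically decomposes to U+03A9, so normalization
# (NFC, NFD, NFKC, or NFKD) maps the ohm sign to the Greek letter:
print(unicodedata.normalize("NFC", ohm_sign) == omega)  # True
```

This is why text processed through a normalizing pipeline may silently replace an ohm sign with the Greek omega; the two render identically in most fonts.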