Definition - What does Foot Job mean?
A foot job refers to using the feet to stimulate a partner's genitals for sexual arousal and, possibly, orgasm. Foot jobs are most often performed by or for people who have a foot fetish or otherwise find feet sexually arousing.
Kinkly explains Foot Job
Most people assume the term only applies to using the feet to stimulate the penis, but stimulating the breasts or female genitalia in this way is also considered a foot job. A foot job is a form of outercourse, so it carries no risk of pregnancy and very little risk of STDs.