In LLMs, is it true that, for long prompts, the encoding (prefill) phase takes more time than the decoding phase?

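For anyone wanting to check this empirically, here is a minimal sketch, assuming PyTorch and the Hugging Face transformers library with "gpt2" as a placeholder model, that times the prefill forward pass over the whole prompt separately from token-by-token decoding with a KV cache:

```python
import time
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "gpt2"  # placeholder; any causal LM should behave similarly
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)
model.eval()

prompt = "Explain the difference between prefill and decode. " * 50  # long prompt
input_ids = tokenizer(prompt, return_tensors="pt").input_ids

with torch.no_grad():
    # Prefill (encoding): one forward pass over the entire prompt, caching keys/values.
    t0 = time.perf_counter()
    out = model(input_ids, use_cache=True)
    past = out.past_key_values
    prefill_s = time.perf_counter() - t0

    # Decode: generate tokens one at a time, reusing the KV cache.
    next_id = out.logits[:, -1:].argmax(dim=-1)
    t0 = time.perf_counter()
    for _ in range(64):
        out = model(next_id, past_key_values=past, use_cache=True)
        past = out.past_key_values
        next_id = out.logits[:, -1:].argmax(dim=-1)
    decode_s = time.perf_counter() - t0

print(f"prefill: {prefill_s:.3f}s for {input_ids.shape[1]} prompt tokens")
print(f"decode:  {decode_s:.3f}s for 64 generated tokens")
```

The prefill pass processes all prompt tokens in parallel in a single forward pass, while decoding is sequential, one token per step, so how the two phases compare in wall-clock time depends on the prompt length, the number of generated tokens, and the hardware.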