Prepare for the CompTIA Network+ exam. Use interactive quizzes and multiple-choice questions with explanations. Boost your readiness and achieve exam success now!

Each practice test/flash card set has 50 randomly selected questions from a bank of over 500. You'll get a new set of questions each time!

Practice this question and more.


What does jitter refer to in a network?

  1. The length of time taken for data to travel from source to destination

  2. The time lag between frames on a network

  3. The total delay in network response time

  4. The number of packets sent per second

The correct answer is: The time lag between frames on a network

Jitter in a network refers to the variation in delay between packets transmitted over a network. In other words, it measures the inconsistencies or fluctuations in latency, which can significantly degrade time-sensitive applications such as VoIP or video conferencing. When packets do not arrive at consistent intervals, audio or video can become choppy because playback cannot maintain a steady flow of data.

The other options describe different aspects of network performance. The length of time taken for data to travel from source to destination is latency. Total delay in network response time encompasses all types of delay but does not capture the variability that jitter measures. The number of packets sent per second relates to bandwidth or throughput, and says nothing about the timing of those packets.

Understanding jitter is crucial for ensuring a smooth, reliable experience on a network, especially for multimedia applications.
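To make the idea concrete, here is a minimal sketch of one common way to quantify jitter: measure the gap between consecutive packet arrivals, then average the absolute differences between consecutive gaps. The timestamps and function name below are illustrative assumptions, not from any particular tool.

```python
def interarrival_jitter(arrival_times):
    """Mean absolute difference between consecutive inter-arrival gaps (ms).

    arrival_times: packet arrival timestamps in milliseconds, in order.
    A perfectly steady stream yields 0; fluctuating gaps yield a positive value.
    """
    # Gaps between consecutive packet arrivals.
    gaps = [b - a for a, b in zip(arrival_times, arrival_times[1:])]
    # How much each gap differs from the previous one.
    diffs = [abs(b - a) for a, b in zip(gaps, gaps[1:])]
    return sum(diffs) / len(diffs)

# Hypothetical data: packets arriving every 20 ms exactly -> no jitter.
steady = [0, 20, 40, 60, 80]
# Hypothetical data: gaps of 20, 30, 10, 25 ms -> noticeable jitter.
uneven = [0, 20, 50, 60, 85]

print(interarrival_jitter(steady))  # 0.0
print(interarrival_jitter(uneven))  # 15.0
```

Note that both streams deliver the same number of packets in the same total time, so their throughput is identical; only the second one has jitter. Real protocols such as RTP use a smoothed variant of this calculation, but the core idea is the same: jitter captures variation in timing, not the delay itself.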