wood Posted September 14, 2003
7.1 On a computer system with a single 32-bit 100 MHz bus (10 ns cycle), the disk controller uses DMA to transfer data to/from memory at a rate of 40Mb per second. Assume the computer fetches and executes one 32-bit instruction every cycle when no cycles are stolen. By what percentage will the disk controller slow down instruction execution?
rafi_dery Posted September 15, 2003
The bus is capable of 3200 Mb/s (32 bits * 100M/s). The controller takes 40 Mb/s. Slowdown = 40/3200 = 1.25%.
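A short sketch of the arithmetic above, assuming "Mb" really does mean megabits (the reading this reply takes):

```python
# Megabit reading of the problem: bus moves 32 bits per cycle at 100 MHz,
# disk controller consumes 40 megabits per second of that bandwidth.
bus_bits_per_sec = 32 * 100_000_000   # 3200 Mbit/s total bus capacity
dma_bits_per_sec = 40 * 1_000_000     # 40 Mbit/s, reading "Mb" as megabits

slowdown = dma_bits_per_sec / bus_bits_per_sec
print(f"{slowdown:.2%}")  # 1.25%
```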
wood Posted September 15, 2003 (Author)
Good reasoning... but nope! Try again... Hint: a bit is not the same as a byte! :D:D .. I know, I know.. it sounds pretty dumb, but I believe this hint will help you figure out the problem in your solution!
rafi_dery Posted September 16, 2003
You mean 40Mb is 40 megabytes (MB)?
wood Posted September 16, 2003 (Author)
Actually, you have a point. The book says 40Mb, but I believe it means 40 MB. Here's the book's explanation: "Given the bus is 32 bits wide, the controller can transfer 4 bytes 10,000,000 times per second, or 4 bytes every 100 ns. The controller will steal a cycle once every 10 instruction fetches, for a slowdown of 10%."
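The book's numbers can be checked with a few lines, assuming MB = 10^6 bytes and one instruction fetch per bus cycle when nothing is stolen:

```python
# Megabyte reading: each DMA transfer moves one 32-bit word (4 bytes)
# and steals one bus cycle that the CPU would have used for a fetch.
bus_width_bytes = 4            # 32-bit bus
bus_clock_hz = 100_000_000     # 100 MHz -> 10 ns cycle
dma_bytes_per_sec = 40_000_000 # 40 MB/s disk transfer rate

# Cycles stolen per second: 40 MB/s at 4 bytes per transfer
# is 10,000,000 transfers/s, i.e. one steal every 10 fetches.
stolen_cycles = dma_bytes_per_sec // bus_width_bytes

slowdown = stolen_cycles / bus_clock_hz
print(f"{slowdown:.0%}")  # 10%
```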
rafi_dery Posted September 16, 2003
I guess you are right! Apparently it is 40 MB.