Conversation

Notices

    • Julia Evans (b0rk@social.jvns.ca)'s status on Wednesday, 08-Mar-2023 22:17:25 JST

    everyone keeps asking me why x86 uses an 8-bit byte, but I'm struggling to find an explanation that makes sense to me. Can any of you help?

    what I've found so far
    - it looks like x86 evolved from the intel 8008 (from 1972), which was an 8-bit CPU
    - the 8008 came after the 4004, which was 4-bit

    some questions I have:
    - was the reason to build an 8-bit CPU to increase the size of the instruction set? or something else?
    - did x86 really directly evolve from the intel 8008?

    would love any links!

    • Robin (eythian@teh.entar.net)'s status on Wednesday, 08-Mar-2023 22:17:15 JST
      in reply to
      • Palin

      @b0rk @palin
      Not an answer, but a conjecture: 4 bits can store one digit in BCD format, but 3 can't. And there was a time (probably still is, cos banks) that BCD was a thing.
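
      A minimal Python sketch of that conjecture (mine, not Robin's; helper names invented): a decimal digit 0-9 needs 4 bits, since 3 bits top out at 7, so packed BCD fits two digits per 8-bit byte.

        # Packed BCD: two decimal digits per 8-bit byte, one per nibble.
        # 3 bits only reach 0..7, so the digit 9 doesn't fit; 4 bits do.
        def bcd_pack(tens: int, ones: int) -> int:
            assert 0 <= tens <= 9 and 0 <= ones <= 9
            return (tens << 4) | ones

        def bcd_unpack(b: int) -> tuple[int, int]:
            return b >> 4, b & 0x0F

        assert bcd_pack(4, 2) == 0x42      # BCD 42 reads as hex 42
        assert bcd_unpack(0x99) == (9, 9)  # largest packed-BCD byte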

    • clacke (clacke@libranet.de)'s status on Wednesday, 08-Mar-2023 22:17:15 JST
      in reply to
      • Robin
      • Palin
      @eythian @palin @b0rk Yes, from playing with 6502 back in the day I've always had the impression (yeah, sorry) that the nibble was relevant because it was a hex or BCD digit.

    • Julia Evans (b0rk@social.jvns.ca)'s status on Wednesday, 08-Mar-2023 22:17:17 JST
      in reply to
      • Palin

      @palin I think we still have the distinction between bytes and words, but nobody really talks about nibbles anymore

      (I actually don't understand why people ever talked about nibbles in the first place -- I'd be curious to understand that too)

    • Palin (palin@livellosegreto.it)'s status on Wednesday, 08-Mar-2023 22:17:24 JST

      @b0rk well back in the day we had a distinction between bytes (8 bits), nibbles (4 bits), and words, which varied based on how many bits the CPU processed in a single instruction (8-bit CPUs had 8-bit words, 16-bit CPUs 16-bit words, and so on).

      It never occurred to me that there were non-8-bit bytes.

    • Peter Drake, he/him, LFHCfS 🔥 (peterdrake@qoto.org)'s status on Wednesday, 08-Mar-2023 22:17:28 JST

      @b0rk I don't have any useful links other than the one you probably looked at first:

      https://en.wikipedia.org/wiki/Byte

      I remember (but can't find) a list of alternate historical sizes. My favorite was the 13-bit "baker's byte".

      Just-so stories I can't back up with sources:

      A power of two is convenient as a memory size. It takes three bits to address the bits in an 8-bit byte, but it would take four (with the last one partially wasted) to address the bits in a 10-bit byte.

      If you want a character set that includes upper- and lower-case English letters, digits, and some punctuation marks, you're going to need at least 7 bits. I believe the 8th bit was originally used for error detection.
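
      Here's a sketch of that 8th-bit idea, assuming even parity (one common convention of the era; helper names are made up):

        # Even parity over 7-bit ASCII: the 8th bit makes the count of
        # 1 bits even, so any single-bit error becomes detectable.
        def add_even_parity(ch7: int) -> int:
            assert 0 <= ch7 < 128              # 7-bit ASCII code
            parity = bin(ch7).count("1") & 1
            return ch7 | (parity << 7)

        def parity_ok(byte: int) -> bool:
            return bin(byte).count("1") % 2 == 0

        b = add_even_parity(ord("A"))          # 0x41: two 1 bits
        assert parity_ok(b)
        assert not parity_ok(b ^ 0x04)         # one flipped bit is caught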

    • Julia Evans (b0rk@social.jvns.ca)'s status on Wednesday, 08-Mar-2023 22:17:29 JST

      or maybe the reason was that the 8008 was a popular microprocessor, and it happened to use an 8-bit byte so it became the foundation for all of intel’s future microprocessors, but in theory they could have also built a microprocessor with a 10-bit byte instead and that would have been fine too?

    • clacke (clacke@libranet.de)'s status on Wednesday, 08-Mar-2023 22:17:34 JST
      @b0rk As for the 8086 lineage, en.wikipedia.org/wiki/Intel_80… answers a lot of the question. 4004 -> 8008 -> 8080 -> 8085 -> 8086 were a sequence of processor generations, each building on the lessons from the previous one, but only 8085 was binary-compatible with the previous step in the chain.

    • Eʟʟ (c9a@cathode.church)'s status on Wednesday, 08-Mar-2023 22:18:49 JST

      @b0rk This is one of my favorite parts of this whole thing! I don't think there's a clear technical reason, given that we had 6-bit computers as well - even numbers are nice because you can split them into even nibbles, and powers of two feel familiar, but since bits aren't individually addressable, I don't think there are any material reasons to prefer powers of two.

      I don't have any links on this, but one pressure keeping bytes from 10 bits is wasted space - how often are you going to use those top two bits? 256 is a pretty good balance, you have room for Latin text with some accents

    • Steve Bellovin (stevebellovin@mastodon.lawprofs.org)'s status on Wednesday, 08-Mar-2023 22:18:52 JST
      in reply to
      • Norman Wilson

      @b0rk @oclsc You might want to read this bio of Fred Brooks (https://en.wikipedia.org/wiki/Fred_Brooks)—he said that the decision to use 8-bit bytes on the IBM S/360, to permit use of lower-case letters, was his most important. (Aside: Brooks, who died last November, was one of my mentors. Saturday, I was in North Carolina for a "professional memorial" to him, about which I may post more later. For more on my relationship with him, see https://www.cs.columbia.edu/~smb/blog/2022-11/2022-11-18.html).

    • clacke (clacke@libranet.de)'s status on Wednesday, 08-Mar-2023 22:19:07 JST
      in reply to
      • Steve Bellovin
      @SteveBellovin A beautiful read. Thank you for writing it. ❤️

    • Diane 🕵 (alienghic@octodon.social)'s status on Wednesday, 08-Mar-2023 22:19:08 JST

      @b0rk early machines had other wacky word sizes. Like 10, 12, or 36

      https://en.m.wikipedia.org/wiki/Word_(computer_architecture)

      The page also suggests the 8 bit byte became popular after ASCII was standardized

    • Julia Evans (b0rk@social.jvns.ca)'s status on Wednesday, 08-Mar-2023 22:19:10 JST

      so far the reasoning I'm getting for the 8-bit byte seems to be:

      1. you want your byte size to be a power of 2. This is EXTREMELY believable to me, but I don't understand why exactly you want this, my understanding of CPU design is very bad. Maybe it's because of busses? (what's a bus?)
      2. 4 bits is too small, you can't fit a character into 4 bits
      3. you also don't want your bytes to be too big, and 8 bits was working well, so the byte size never got bigger after the move from 4 to 8

    • clacke (clacke@libranet.de)'s status on Wednesday, 08-Mar-2023 22:19:14 JST
      in reply to
      • Steve Bellovin
      @SteveBellovin My favorite anecdote of his is when he was on a flight and noticed the person next to him reading "The Mythical Man-Month".

      Being careful not to reveal who he was, he asked if the book was any good.
      "Yeah, I don't know", said the reader, "it's mostly a bunch of stuff that everybody already knows".
      On hearing this, Brooks was pleased.

    • clacke (clacke@libranet.de)'s status on Wednesday, 08-Mar-2023 22:19:16 JST
      The direct quote: "The most important single decision I ever made was to change the IBM 360 series from a 6-bit byte to an 8-bit byte, thereby enabling the use of lowercase letters. That change propagated everywhere."

      www.wired.com/2010/07/ff-fred-…
      /via en.wikipedia.org/wiki/Fred_Bro…

      and I found it referenced again
      /via www.infoq.com/news/2022/12/fre…

    • Dan Lyke (danlyke@researchbuzz.masto.host)'s status on Wednesday, 08-Mar-2023 22:19:25 JST

      @b0rk Place where this got super expensive: The Apple ][ used 7-bit words in its hi-res framebuffer (even pixels one color, odd pixels another, and the high bit chose the palette for those pixels). The hoops we jumped through to try to divide by 7, or build sprite animations around 7- or 14-pixel cycles, were legendary. Pretty much everybody did it as a lookup table, and that's 280 bytes.

      2/fin
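
      A sketch of that lookup-table trick (my reconstruction in Python, not Dan's actual 6502 code):

        # Apple ][ hi-res row: 280 pixels at 7 pixels per byte = 40 bytes.
        # The 6502 has no divide, so x // 7 and x % 7 become table lookups.
        DIV7 = [(x // 7, x % 7) for x in range(280)]

        def plot(row: bytearray, x: int) -> None:
            byte, bit = DIV7[x]        # lookup instead of division
            row[byte] |= 1 << bit      # low bit is the leftmost pixel

        row = bytearray(40)
        plot(row, 13)                  # 13 // 7 == 1, 13 % 7 == 6
        assert row[1] == 1 << 6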

    • Dan Lyke (danlyke@researchbuzz.masto.host)'s status on Wednesday, 08-Mar-2023 22:19:27 JST

      @b0rk I thought I saw this addressed earlier, and my head is fuzzy from allergies or a cold, so apologies if this reply is redundant, but something like 10 bits means that to, say, find the 13th bit you actually have to divide by 10. Dividing by 8 you can do with an AND (for the remainder) and an SHR for the divide.

      And in hardware, those operations map to transistors and circuit complexity.

      1/n
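
      The AND/SHR point in miniature, as a sketch (helper names invented):

        # Which byte, and which bit within it, holds bit number n?
        def locate_8bit(n: int) -> tuple[int, int]:
            return n >> 3, n & 0b111   # SHR for the divide, AND for the rest

        def locate_10bit(n: int) -> tuple[int, int]:
            return n // 10, n % 10     # no shift/mask shortcut for 10

        assert locate_8bit(13) == (1, 5)   # the 13th bit: byte 1, bit 5
        assert locate_10bit(13) == (1, 3)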

    • Julia Evans (b0rk@social.jvns.ca)'s status on Wednesday, 08-Mar-2023 22:19:28 JST

      my main question about this 8-bit byte thing right now is -- everyone says that it's good to have a byte size that's a power of 2. That sounds very plausible, but why exactly is it important to use a power of 2?

      (obviously it's not 100% _necessary_ -- there have been computers in the past that used byte sizes that weren’t a power of 2)

    • Robin (eythian@teh.entar.net)'s status on Thursday, 09-Mar-2023 16:41:37 JST
      in reply to
      • Palin

      @b0rk @palin
      Oh, and I also remember "nibble copiers" from the C64 days, but I didn't know enough at the time to understand why working in 4 bits rather than 8 would be an advantage for getting past copy protection. And I haven't looked it up since.

    • Vertigo #$FF (vertigo@hackers.town)'s status on Thursday, 09-Mar-2023 16:41:37 JST
      in reply to
      • Robin
      • Palin

      @eythian @b0rk @palin The disk encoding uses "group coded recording", or GCR, on the disk surface. These are 5-bit codes with certain properties useful for magnetic media. Of the possible 32 binary values that have these properties, only 16 are valid. Hence, each five bit group records a single nybble.

      If you copied a disk one group at a time, you could overcome some copy protection schemes that rely upon invalid group representations.

      Software that copied disks this way was called a nybbler, because it copied data one group at a time, in effect performing error correction at the individual group (hence, nybble) level.
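
      A sketch of those constraints (my approximation of the rules Vertigo describes, not actual drive firmware): ban three consecutive zeros within a code, and two zeros at either end so runs stay legal across adjacent codes.

        # Enumerate 5-bit GCR candidates under the no-long-zero-run rules.
        codes = [c for c in range(32)
                 if "000" not in f"{c:05b}"
                 and not f"{c:05b}".startswith("00")
                 and not f"{c:05b}".endswith("00")]
        print(len(codes))  # 17 survivors; the real table uses 16 of them
        # (0b11111 is the one left out, I believe because long runs of
        # ones are reserved for sync marks on the disk.)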

    • 🇺🇦 haxadecimal (brouhaha@mastodon.social)'s status on Thursday, 09-Mar-2023 18:42:41 JST

      @b0rk Many early computers used word sizes that were multiples of six bits, and used five or six bit character codes (predating ASCII and EBCDIC). 36 bits was common for big computers, 18 for medium, and 12 bits for small.
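
      A sketch of that layout (illustrative codes only, not a real character set): six 6-bit characters pack exactly into one 36-bit word.

        # Six 6-bit character codes per 36-bit word, as on many
        # pre-ASCII machines.
        def pack36(codes: list[int]) -> int:
            assert len(codes) == 6 and all(0 <= c < 64 for c in codes)
            word = 0
            for c in codes:
                word = (word << 6) | c   # shift in one character
            return word                  # always fits in 36 bits

        def unpack36(word: int) -> list[int]:
            return [(word >> s) & 0o77 for s in range(30, -1, -6)]

        w = pack36([1, 2, 3, 4, 5, 6])
        assert w < 2**36 and unpack36(w) == [1, 2, 3, 4, 5, 6]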

    • clacke (clacke@libranet.de)'s status on Thursday, 09-Mar-2023 18:42:54 JST
      in reply to
      • Vertigo #$FF
      • 🇺🇦 haxadecimal
      • Advent of Computing
      • Toby Jaffey 🏳️‍🌈
      > Both of these first microprocessors were complete CPUs-on-a-chip and had similar characteristics. But because the 4004 was designed for serial BCD arithmetic while the 8008 was made for 8-bit character handling, their instruction sets were quite different.
      Sounds like by 1971 the 8-bit byte was a foregone conclusion already unless you were making a calculator (HP calculators were 4-bit even into the 90s).

      The S/360 had consolidated the 8-bit byte in 1964 and it seems it really "propagated everywhere" from there[0], even though the 36-bit-word PDP-10 had been released in 1966.

      See the mastodon.social/@brouhaha/1099… subthread on the STRETCH's influence on the S/360.
      @brouhaha

      @vertigo @tobyjaffey @adventofcomputing @b0rk

      [0] libranet.de/display/0b6b25a8-1…

    • Vertigo #$FF (vertigo@hackers.town)'s status on Thursday, 09-Mar-2023 18:42:55 JST
      in reply to
      • Advent of Computing
      • Toby Jaffey 🏳️‍🌈

      @adventofcomputing @tobyjaffey @b0rk We do, though. See this document by the 8086's inventor, Steve Morse.

      Intel Microprocessors: 8008 to 8086 - SteveMorse.org https://stevemorse.org/8086history/8086history.pdf

    • Advent of Computing (adventofcomputing@bitbang.social)'s status on Thursday, 09-Mar-2023 18:42:56 JST
      in reply to
      • Toby Jaffey 🏳️‍🌈

      @tobyjaffey @b0rk I've been summoned! I'm also a little perplexed by the standardization on 8-bit bytes in the x86 architecture specifically. Since the 8086 was initially a stop-gap project, we don't have super good details on its development history. That said, I think the place to look is the 8008. That chip was designed as part of a contract for CTC, which was a terminal company. The 8008 would have powered the Datapoint 2200. That machine called for 8-bit bytes:
      https://history-computer.com/Library/2200_Reference_Manual.pdf

    • Toby Jaffey 🏳️‍🌈 (tobyjaffey@mastodon.me.uk)'s status on Thursday, 09-Mar-2023 18:43:03 JST
      in reply to
      • Advent of Computing

      @b0rk On the "4 bits is too small" point: Baudot's 5-bit code was the standard for the telegraph, and 8 bits is the next power of 2 big enough to hold it. https://en.wikipedia.org/wiki/Baudot_code
      @adventofcomputing Had a great episode on the history of it, https://www.youtube.com/watch?v=ixPDr5mX8dg

    • Graham Sutherland / Polynomial (gsuberland@chaos.social)'s status on Thursday, 09-Mar-2023 18:46:08 JST

      @b0rk another reason for using power of two sizes is that clock dividers that work by halving are trivial to design: a flip flop whose input is driven by the inverse of its output will produce a square wave at half the clock frequency you drive it at. So if you're designing a circuit where some data is sent serially (bit by bit) down a single wire and you want to trigger some event at the end of each byte, you can use three /2 clock dividers in series to get /8.
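
      A little simulation of that chain, as a sketch (not a real circuit model): each stage toggles on the falling edge of the stage before it, so three stages yield clock/2, clock/4, clock/8.

        # Three divide-by-2 stages in series: q[0] is clock/2, q[1] is
        # clock/4, q[2] is clock/8.
        def simulate(half_periods: int, stages: int = 3) -> list[list[int]]:
            clk, q, trace = 0, [0] * stages, []
            for _ in range(half_periods):
                prev_old, prev_new = clk, clk ^ 1  # raw clock toggles
                clk = prev_new
                for i in range(stages):
                    q_old = q[i]
                    if (prev_old, prev_new) == (1, 0):  # falling edge
                        q[i] ^= 1                       # toggle flip-flop
                    prev_old, prev_new = q_old, q[i]    # feeds next stage
                trace.append([clk] + list(q))
            return trace

        for row in simulate(16):   # 16 half-periods = 8 clock cycles
            print(row)             # the /8 output completes one full cycle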

    • Graham Sutherland / Polynomial (gsuberland@chaos.social)'s status on Thursday, 09-Mar-2023 18:46:09 JST

      @b0rk there are lots of cases where computers perform operations around power-of-2 sized "objects" (e.g. page table translation, bitmaps, etc.) that rely on a single element being representable by a single bit inside a packed array. so having the base elements also fit within that sizing scheme makes a lot of sense.

    • Graham Sutherland / Polynomial (gsuberland@chaos.social)'s status on Thursday, 09-Mar-2023 18:46:11 JST

      @b0rk the bus size shouldn't matter too much - many things are non-power-of-2 sized in terms of data and address buses inside processors and other electronics.

      one benefit of power-of-2 sizes is that the size of the element in bits itself encodes as a single set bit. 8 in binary is 0b00001000. 16 is 0b00010000. if you had a 10-bit byte the size (10 in decimal) would encode as 0b0000001010. so reasoning about the size of things in a computer for non-power-of-2 bytes gets trickier.
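
      That single-set-bit property is also the classic power-of-two test, sketched here:

        def is_power_of_two(n: int) -> bool:
            return n > 0 and n & (n - 1) == 0  # exactly one bit set

        assert is_power_of_two(8)        # 0b00001000
        assert not is_power_of_two(10)   # 0b00001010: two bits set

        # And "n bytes -> bits" is a pure shift only when the byte size
        # is a power of two:
        assert (1000 << 3) == 1000 * 8   # 8-bit bytes: shift by 3
        assert 1000 * 10 == 10000        # 10-bit bytes: a real multiply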

    • Graham Sutherland / Polynomial (gsuberland@chaos.social)'s status on Thursday, 09-Mar-2023 18:46:12 JST

      @b0rk if you're struggling to visualise the clock divider thing I can smash together a quick Falstad sim to show it off if you want? (It's web based, I just post a link and you don't need to install anything)

    • Graham Sutherland / Polynomial (gsuberland@chaos.social)'s status on Thursday, 09-Mar-2023 18:46:14 JST

      @b0rk but if your data is 10 bits long then you need a way to divide the clock rate by 10, which is doable but far less simple, and leads to more problems with things like propagation delay.

    • Dan Lyke (danlyke@researchbuzz.masto.host)'s status on Thursday, 09-Mar-2023 18:46:16 JST

      @b0rk hey, just a note to tell you I appreciate these cool trips down memory lane. It's fun to have all of these reminders about what I used to find so compelling about computing.

    • Julia Evans (b0rk@social.jvns.ca)'s status on Thursday, 09-Mar-2023 18:46:18 JST
      in reply to
      • Graham Sutherland / Polynomial

      @gsuberland no pressure -- i really appreciate all the answers, thank you!

    • Graham Sutherland / Polynomial (gsuberland@chaos.social)'s status on Thursday, 09-Mar-2023 18:46:19 JST

      @b0rk I'm heading to bed now but will put something together in the morning :)

    • Julia Evans (b0rk@social.jvns.ca)'s status on Thursday, 09-Mar-2023 18:46:20 JST
      in reply to
      • Graham Sutherland / Polynomial

      @gsuberland yes I'd love that!

    • Willem Van den Ende - Writing (mostalive@mastodon.social)'s status on Thursday, 09-Mar-2023 18:46:24 JST
      in reply to
      • Steve Bellovin
      • Norman Wilson

      @SteveBellovin @b0rk @oclsc That is a great and loving obituary. Thank you. (I've never met Fred Brooks, but have benefited from his writings).
      And a great answer. Didn't know that about 8-bit.

    • The Blue Wizard (thebluewizard@hackers.town)'s status on Thursday, 09-Mar-2023 18:46:43 JST
      in reply to
      • Vertigo #$FF
      • Palin

      @vertigo @b0rk @palin In the early days of electronic computing, machines came in various sizes: 12 bits, 36 bits, etc. Then IBM rolled out the seminal S/360, which standardized on 8-bit, 16-bit, etc. groups of bits. That had a huge influence on subsequent development. The 4004 used 4 bits because it was hard to put more complicated circuitry on silicon back then; besides, it was meant for use in calculators (speed isn't important... humans are slow!)

      See https://en.wikipedia.org/wiki/IBM_System/360#Influential_features

      See also https://retrocomputing.stackexchange.com/questions/15305/why-did-ibm-7030-or-ibm-360-use-byte-and-word-addressing-simultaneously

    • Vertigo #$FF (vertigo@hackers.town)'s status on Thursday, 09-Mar-2023 18:46:44 JST
      in reply to
      • Palin

      @b0rk @palin For the same reason we talk about "digits" in decimal numbers.

      A "bit" in a binary (base-2) number can be 0 or 1.

      An "triad" in an octal (base-8) number can be 0, 1, 2, ..., 6, or 7.

      A "digit" in a decimal (base-10) number can be 0, 1, 2, ..., 8, or 9.

      A "quartet" or "nybble" in a hex (base-16) number can be 0, 1, 2, ..., E, or F. Nybble is "half a byte," making it something of a joke.

      An alternative name for what we call a "byte" today is, as you'd probably guess, "octet."

      Basically, these terms are ways of communicating two pieces of information at the same time: that you're talking about a single digit in some numerical base of some kind, and what the base actually is.
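
      The nybble/hex-digit correspondence in a couple of lines, as a sketch:

        b = 0xC3
        hi, lo = b >> 4, b & 0x0F        # the byte's two nybbles...
        assert (hi, lo) == (0xC, 0x3)    # ...are exactly its hex digits
        assert f"{b:02X}" == "C3"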

    • 🇺🇦 haxadecimal (brouhaha@mastodon.social)'s status on Thursday, 09-Mar-2023 18:47:27 JST
      in reply to
      • clacke
      • Vertigo #$FF
      • Advent of Computing
      • Toby Jaffey 🏳️‍🌈

      @clacke @vertigo @tobyjaffey @adventofcomputing @b0rk The PDP-10 was a reimplementation of the PDP-6, with only minor architectural changes, so it's a pre-IBM/360 architecture, not post.

    • clacke (clacke@libranet.de)'s status on Thursday, 09-Mar-2023 18:47:28 JST
      in reply to
      • Vertigo #$FF
      • 🇺🇦 haxadecimal
      • Advent of Computing
      • Toby Jaffey 🏳️‍🌈
      @brouhaha @vertigo @tobyjaffey @adventofcomputing @b0rk Thanks! Good point.

    • Matt Austern (austern@sfba.social)'s status on Thursday, 09-Mar-2023 18:48:38 JST

      @b0rk I wonder to what extent it's less about some specific engineering trade-off than just because people like round numbers. Numbers like 8 and 32 sure feel rounder and more natural to me than numbers like 7 or 11 or 36.

    • Vertigo #$FF (vertigo@hackers.town)'s status on Thursday, 09-Mar-2023 18:53:08 JST
      in reply to
      • Jens Finkhäuser
      • Palin

      @jens @b0rk @palin According to ISO standards, an octet is explicitly defined to always be eight bits.

      I'd love to know the context in which "a-group-of-eight" (octet) isn't. ;)

      (Brought to you by the same people who made December the 12th month.)

    • Jens Finkhäuser (jens@social.finkhaeuser.de)'s status on Thursday, 09-Mar-2023 18:53:10 JST
      in reply to
      • Vertigo #$FF
      • Palin

      @vertigo @b0rk @palin Ironically, it's been pointed out to me that in RFCs, an octet isn't guaranteed to be 8 bits. A good RFC should clarify that.

      I forget the context. There's at least one RFC where it matters, it seems.

    • CMDR Yojimbosan UTC+(12|13) (yojimbo@hackers.town)'s status on Thursday, 09-Mar-2023 18:53:12 JST
      in reply to
      • Jens Finkhäuser
      • Vertigo #$FF
      • Palin

      @jens @vertigo @b0rk @palin Certainly a 'byte' isn't necessarily 8 bits (qv https://retrocomputing.stackexchange.com/questions/15512/did-any-computer-use-a-7-bit-byte) but I can't see "octet" being anything except 8, IIRC that's the whole reason the word was coined.

