Why are "degrees" and "bytes" not considered base units?
The radian (not the degree) is the SI unit of angle, and it's defined in terms of lengths: it is that angle for which the length of a circular arc subtending that angle is equal to the radius of the circle. Since this definition refers to the relative ratio of two lengths, the SI considers it to be a "dimensionless derived unit", rather than a base unit.1
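Since the radian is just a ratio of two lengths, it can be illustrated with plain arithmetic; a minimal sketch in Python (the specific numbers are illustrative, not part of any definition):

```python
import math

# An angle in radians is the ratio of two lengths: arc length over radius.
radius = 2.0                    # metres
arc_length = 2.0                # metres; an arc equal in length to the radius
angle = arc_length / radius     # the metres cancel: a dimensionless 1 radian
print(angle)                    # 1.0

# A full circle subtends an arc of 2*pi*r, i.e. 2*pi radians:
full_turn = (2 * math.pi * radius) / radius
print(full_turn)                # 6.283185307179586
```

The point of the sketch is that the units cancel in the division, which is exactly why the SI treats the result as dimensionless.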
As far as bytes go: Defining a unit amounts to specifying a certain amount of a quantity that we call "one unit". Physical quantities such as mass, length, time, etc., are (effectively) continuous quantities, and so there is no "natural" unit for us to use. We therefore have to make an arbitrary choice about how much of each quantity is equal to one unit.
Digital information, on the other hand, is inherently discrete. All methods of quantifying data simply amount to counting bits; and you don't need to make an arbitrary choice of unit if you can simply count a quantity. There is therefore no need to define a unit for digital information, because there is already a natural unit (the bit).
It's important to note that not every measurable quantity is expressible in terms of SI base units. If I count the number of people in my office building and tell you that there are "12 people" in the building right now, then "people" is not expressible in terms of meters, kilograms, and seconds. But I don't need to worry that you're going to come along and use some different unit to count the people in this building, because a natural unit (1 person) exists. It's only when we are measuring a quantity that can take on any real-numbered value (e.g., the mass of all the people in this building) that it becomes important to define a unit; otherwise, you and I would have no basis for comparison. Any system of units is essentially a set of these arbitrary choices; "natural" units for inherently discrete quantities are unnecessary simply because they're understood to be the obvious choice.
1 It's worth noting that the radian was officially a "supplementary unit" in the SI until 1995, when supplementary units were reclassified as "dimensionless derived units". A bit of the discussion surrounding this change can be found on p. 210 of the Proceedings of the 20th Conférence Générale des Poids et Mesures (warning: large PDF). Reading between the lines, I suspect that the name "dimensionless derived unit" was something of a compromise between those who thought the radian should be thought of as a derived unit and those who didn't think it should be thought of as a unit at all; but I wouldn't want to speculate further than that.
Another answer (and a linked question) addresses the fact that the SI derived unit for angles is the radian, which is a ratio of lengths.
The bit/byte question is interesting. In information theory, the bit is a unit of entropy. A system which is equally likely to be in one of two states has a thermodynamic entropy of $$S = k_B\ln \Omega = k_B\ln 2 = \rm1\,bit \approx 10^{-23}\,J/K,$$ which must be reduced to zero if you "write" to the bit so that its state is no longer uncertain. That's such a tiny amount of entropy that no one (apart from textbook authors) really thinks about its thermodynamic consequences, which is fine.
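The numerical claim above is easy to check; a quick sketch using the exact 2019 SI value of the Boltzmann constant:

```python
import math

k_B = 1.380649e-23          # Boltzmann constant in J/K (exact since the 2019 SI)

# Thermodynamic entropy of one bit of uncertainty: S = k_B * ln(2)
S_bit = k_B * math.log(2)
print(S_bit)                # ≈ 9.57e-24 J/K, i.e. of order 1e-23 J/K
```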
A byte is a particular number of bits -- usually eight nowadays, but some computers in the past have used a different number. So when you say "I have two bytes of data" what you mean is "these bits of data: I have sixteen of them." The SI does have a unit for expressing collections of many identical objects: it's the mole, which is just like a dozen, only bigger. So I suppose that you could say that one eight-bit byte is the same as roughly $\rm 13\, yoctomoles$ of bits. I would not recommend this.
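The yoctomole figure follows directly from the Avogadro constant; a quick check of the conversion (the arithmetic, not a recommendation to use it):

```python
N_A = 6.02214076e23        # Avogadro constant in 1/mol (exact since the 2019 SI)

bits_per_byte = 8
byte_in_moles = bits_per_byte / N_A     # "amount of substance" of bits in a byte
print(byte_in_moles * 1e24)             # ≈ 13.3 yoctomoles of bits
```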
Units are required to quantify something which is not obviously countable.
You do not need units to count apples, because you can just count: one apple, two apples, three apples, ... . Replace "apple" by "bit" and you can count bits just as easily. A "byte" is just a word we invented to refer to a group of eight bits, as we invented the word "dozen" to refer to twelve objects. Technically, "bits" and "bytes" are as much units as "apples" or "cats". I would recommend considering them as countable objects instead. And of course, countable does not rule out fractions. Half an apple is obviously meaningful, but half a bit is also perfectly fine and useful, e.g. in information theory.
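The "half a bit" remark can be made concrete with the Shannon entropy of a biased coin; a minimal sketch (the bias value 0.11 is chosen only because it happens to land near half a bit):

```python
import math

def shannon_entropy(p):
    """Entropy in bits of a binary source with P(heads) = p."""
    if p in (0.0, 1.0):
        return 0.0   # a certain outcome carries no information
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

print(shannon_entropy(0.5))    # 1.0 bit: a fair coin
print(shannon_entropy(0.11))   # ≈ 0.5 bits: a heavily biased coin
```

A fair coin flip carries exactly one bit; a sufficiently biased coin carries a fraction of a bit, which is why fractional bits are routine in information theory.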
However, you cannot count distance/mass/etc., as they are inherently continuous, without an obvious subdivision. There is no "one distance, two distances, ..."; you need to split distances into finite comparable parts to make them countable. That is what units are for. Historically, this was done with "arbitrary" subdivisions like $1/40{,}000$ of the circumference of the earth's equator ($\approx$ one kilometer). But the modern way is to look for fundamentally given subdivisions, e.g. the distance light travels in a second, or the mass of an elementary particle.
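The two styles of subdivision can be contrasted numerically; a rough sketch (the equator figure is a modern value, used here only for illustration):

```python
# Arbitrary subdivision: 1/40,000 of the equator's circumference.
equator_m = 40_075_017          # modern equatorial circumference in metres
old_unit = equator_m / 40_000
print(old_unit)                 # ≈ 1002 m: roughly one kilometre

# Fundamental subdivision: fix the speed of light exactly and derive length.
c = 299_792_458                 # m/s, exact by definition since 1983
one_light_second = c * 1.0      # distance light travels in one second, in metres
print(one_light_second)
```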
Angles, while also continuous, have a natural subdivision: we can count them in chunks and fractions of "whole turns".