Consider a hypothetical algorithm that looked at video camera images from a cop car and advised the cop how to treat the person based on how they were driving, the type of car, and so on. If it turns out that cops are advised to be extra alert for people who happen to be from a certain racial group, that's interesting, but it's much less likely to be racist than if the cop just has a hunch. That's really the whole point of the book, and I'm surprised he missed this.
Okay dude, there's a term for this. It's called statistical discrimination, and while it might let us make easier or faster predictions about people, it is unethical and, more importantly, unlawful in cases of racial profiling. You are an idiot.
If there were some way for me to send you a message about this on Amazon instead of making passive-aggressive comments on my lj, I WOULD'VE.