Over the past decade, auto insurance companies have made great strides toward improving their image, largely through polished public relations campaigns and memorable television commercials. While these companies may seem kinder and gentler than their counterparts of the past, make no mistake: they are major corporations trying to maximize their profits. There's nothing wrong with that, as long as these insurance companies play by the rules.