What do universities teach in business as a major? I don't know much about it because my background is in PCM (Physics, Chemistry, Mathematics) and Economics.
Hi,
A business major typically covers subjects such as the business environment, public relations, human resources (HR), finance, and project management, among others. These courses give you a broad overview of management and how businesses operate, and many programs also include practical exposure to real-world business settings.
Thank you.