NVIDIA's dominance of the AI semiconductor market continues, but challengers are growing more active. Google's TPUs, Amazon's Inferentia and Trainium chips, and numerous startups are entering the market with new architectures. Particularly noteworthy are Groq's LPU (Language Processing Unit) and Cerebras's Wafer-Scale Engine, which take an entirely different approach from traditional GPUs to maximize AI inference performance. Analysts expect the AI chip market to diverge into distinct training and inference segments, and see the inference side as the most likely place for strong new players to emerge.