kimdonga Thank you for sharing your opinion! At the moment we are not very enthusiastic about adding this feature, as Spinebot responded, because we do not think we could easily add a lip sync feature to the Spine editor that would be more useful than Rhubarb.
Coincidentally, Erika demonstrated creating lip sync with Rhubarb on a Twitch stream about two months ago, and the video may be of interest to you:
In the video, smooth mouth movements are achieved by importing the lip sync results generated by Rhubarb and then making further adjustments to the animation, and those adjustments were completed in a fairly short time. The overall process may look complicated because the rig took a long time to build, but Rhubarb is quite good at what it does.
Also, if lip syncing is required in a game, it should be possible to program animations to play according to the results of audio analysis. There are a variety of ways to do this, so if you have any questions about it, you might want to post them in a new thread along with which runtime your project is using.
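Just to illustrate the idea, here is a minimal sketch of one possible approach, not a definitive implementation. It assumes Rhubarb's JSON export (a `mouthCues` array of `{start, end, value}` entries), a hypothetical one-pose Spine animation per mouth shape named like `mouth-A`, and an AnimationState whose `setAnimation(trackIndex, name, loop)` behaves as in the Spine runtimes:

```ts
// Sketch: drive mouth-shape animations from Rhubarb's JSON output.
// Assumptions: one Spine animation per mouth shape ("mouth-A", "mouth-B", ...),
// the mouth plays on its own track above the body animation, and audioTime is
// the elapsed playback time of the voice clip in seconds.

interface MouthCue { start: number; end: number; value: string; }
interface RhubarbResult { mouthCues: MouthCue[]; }

// Minimal view of the runtime API used here (AnimationState has this method).
interface MouthAnimationState {
    setAnimation(trackIndex: number, name: string, loop: boolean): unknown;
}

const MOUTH_TRACK = 1; // track above the body animation so both play together
let lastShape = "";    // last applied shape, so we only switch on cue changes

function findCue(cues: MouthCue[], time: number): MouthCue | undefined {
    // Cues are sorted by start time, so a simple scan is fine for short clips.
    return cues.find(c => time >= c.start && time < c.end);
}

// Call every frame while the voice clip is playing.
function updateMouth(state: MouthAnimationState, result: RhubarbResult, audioTime: number) {
    const cue = findCue(result.mouthCues, audioTime);
    const shape = cue ? cue.value : "X"; // "X" is Rhubarb's rest/closed shape
    if (shape !== lastShape) {
        lastShape = shape;
        state.setAnimation(MOUTH_TRACK, "mouth-" + shape, false);
    }
}
```

This only switches the mouth track when the cue changes, so the body animation keeps playing on track 0, and you could configure mixing between the mouth animations if you want smoother transitions. Again, this is just one way to do it; the details depend on your runtime and rig.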