#ifacialmocap
Explore tagged Tumblr posts
Text
Hand tracking!!
#xabane#Vtuber#3d vtuber#vtube model#vtuber uprising#vtubers#xabanevt#Twitch#Vtuber clips#hand tracking#ifacialmocap
2 notes
·
View notes
Text
Any vtubers know how to resolve this error in VSeeFace? :<
I've been working on this avatar all week and haven't had any issues until the end of my day yesterday. I was using VSeeFace all day when this error message popped up randomly, and it hasn't gone away since.
I've followed all the troubleshooting steps outlined in the error message multiple times, tried running the program as an administrator, updated my iFacialMocap app, checked that my iPhone/iFacialMocap work in Warudo, etc., but still this error persists.
It's so frustrating bc I was hoping to finish this project today, but I really can't do any more work on it until this issue is fixed.
I'm scared bc all the threads I can find about this error message on Reddit seem to end with the OP not finding a solution, even months later oaifjd
UPDATE: Solved! Some combination of changing the camera in my general settings a few times, resetting motion weights in iFacialMocap, updating the app, and restarting my iPhone did the trick.
17 notes
·
View notes
Text
COMMISSION FAQ AND WARNINGS!
FREQUENTLY ASKED QUESTIONS!
What kind of vtubers do you sell? VROIDS MODIFIED IN BLENDER! It's the cheapest and fastest option!
What's the price? Thanks for your interest! The price of a full avatar depends a lot on the complexity of your design!
What are Blendshapes? In short: facial muscle movements that avatars do NOT have on their own; adding them is work that has to be done in Blender. By default, a VRoid only has the vowels A, E, I, O, U and emotions that are triggered MANUALLY. With blendshapes you can control a ton of extra facial muscles, so your emotions and intentions come across more effectively using just your face! I still ALWAYS add manual emotions in case you feel the need to exaggerate the moment.
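For the technically curious: a blendshape is what Blender calls a shape key, a stored set of vertex offsets on the mesh. Below is a minimal sketch of adding one through Blender's Python API, assuming a hypothetical mesh named "Face"; sculpting the actual movement is still manual work.

```python
import bpy

obj = bpy.data.objects["Face"]  # hypothetical: the VRoid model's face mesh
if obj.data.shape_keys is None:
    obj.shape_key_add(name="Basis", from_mix=False)  # a base shape must exist first
key = obj.shape_key_add(name="jawForward", from_mix=False)  # one ARKit-style blendshape
key.value = 0.0  # the tracking app drives this between 0 and 1 at runtime
# now sculpt the key's vertex offsets in Edit Mode (or via key.data) to define the movement
```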
What style can I choose? That's up to you! Gather as many references as possible before commissioning!
Examples:
The overall final style of your character
The overall final style of the CLOTHING
General body shape and height
Face shape FROM THE FRONT AND IN PROFILE
Final hair shape FROM THE FRONT, IN PROFILE, AND FROM BEHIND
The EXACT colors you want for EVERYTHING
The type of shading, and whether it has outlines (2D style) or not.
Can I buy just BlendShapes and textures for an existing VRoid?
Of course!
How long does a commission take? A maximum of 2 weeks for a full character! It depends a lot on how many changes you ask for along the way, but I usually don't take long.
A maximum of 3 days if it's just work on an existing VRoid.
How many changes can I make? You can start requesting changes once I send you the final texture previews! You're entitled to:
3 changes to the face shape
2 changes to the hair
2 changes to the clothing
Each new change after you've used up your included changes costs an extra $5 USD.
How can I pay? If you're in Mexico, I accept deposits and bank transfers! From the rest of the world, PayPal!
Do I have to pay everything up front? No! I start working once you've paid 50% of the final price! When it's time to deliver the final file, you must pay the remaining 50% in full.
Do you do refunds? No. Be completely sure about this investment before making the deposit; I start working the same day you pay.
What do you use to run vtubers? Face tracking: an iPhone 12. Tracking app on the phone: iFacialMocap. PC program: VSEEFACE.
Can I use my vtuber on Android? Yes, it's possible! The tracking quality isn't as precise as on iPhone, but it's definitely functional!
Can I use my vtuber with a webcam? Yes! Programs like:
Webcam Motion Capture
VUP
read MOST of the blendshapes even with just a webcam. Try different tracking options at home!
ATTENTION:
By commissioning, you give 8l00my permission to use the character's image and name in a work portfolio, and to make a NON-MONETIZED SHOWCASE of the model on YouTube (see THIS VIDEO for an example). 8L00MY commits NOT to USE YOUR MODEL maliciously and NOT to RESELL OR SHARE the character file.
7 notes
·
View notes
Text
youtube
Vtuber Cinnameg Chibi Live2d Model
Recorded using VtubeStudio, iFacialMocap, and Vbridger
Vtuber: Cinnameg
• https://www.twitch.tv/Cinnamegart_
• https://twitter.com/Cinnamegart_
Model Illustration and L2D Rig: Cinnameg_
• https://twitter.com/Cinnamegart_
Art Programs: CSP, Live2D, Davinci Resolve
BGM: Waiting For You composed by Kei Morimoto https://www.youtube.com/watch?v=sHPwO...
2 notes
·
View notes
Video
youtube
I was in charge of Live2D Rigging for Abyzab! Full Body Rigging only - check credits for the wonderful Artist / Video Editor / Composer!
Vtuber: https://www.twitch.tv/abyzab https://twitter.com/Abyzab_ https://linktr.ee/Abyzab
Video Editor: https://twitter.com/KaeVeo Model Artist: https://twitter.com/KANADE_616 Music Composer: https://twitter.com/BonesNoize
Tracking: VBridger / VTubeStudio / iFacialMocap
6 notes
·
View notes
Text
expression test with my new vtuber model plus a comparison to the old model... here are some lessons learned and some thoughts down below...
quite frankly i didn't do a great test because i forgot to run through certain expressions with my face, but oh well. for this model, it was all about challenging myself and applying what i've learned... anyways let's get down to what is New and Improved:
Better proportions - the old face felt very small and scrunched up. this time i kinda based it off of Punishing Gray Raven's proportions for their male characters (Lee & Noctis & Chrome were some of my references lol...) i tried a new thing with making ears by the way (i still hate making ears)
Hair - referenced how fuyumidori makes hair for their 3D models... their stuff is so fucking good it's crazy... anyways i followed what they did and how they used the Solidify modifier to get the job done and idk how i didn't think of that either. anime hair is pretty hard to do and short hair is even more challenging.
Eyebrow - addressed how the eyebrows would clip through in the old model. also my eyelashes are prettier now yaay
Painting - after some brush hunting in Clip Studio Paint and some referencing off of tutorials and other artists... challenged myself to paint hair like to shade it and everything considering what I usually do for like every model is leave it all flat hrjghjfg... the eyes are usually the only thing I'd try really hard on. anyways the lesson here... is to not be afraid of Trying...
Physics - i understand how physics work YAAAAY... a tip here... if you were suffering like me... be mindful of the scale of your model because if it is too big... it just. won't work i don't know why spring bones and colliders are dependent on that but it is what it is i guess... i guess adding physics was also useless considering how i don't plan on doing any full body tracking anytime soon (unless someone here wants to give me money for LeapMotion teehee :3)
New Design - okay well. the video doesn't show it but i've redesigned my 3D model... perhaps someday i'll make a character sheet. he is no longer wearing a dress F
now some.....Critiques... and some potential issues... if you are reading perhaps you can offer advice...
Bad lip movement? - not sure if this has to do with my settings in VSeeFace, or if it's my phone that is fucked up, or my microphone, or whatnot. however... i feel like compared to the old model, my new model is Very Stiff... like i said, i don't know if i need to adjust settings for that lipsync thing in VSeeFace or if it has something to do with iFacialMocap... i am moving my mouth in silly ways, or in the same ways as with the old model, but it's only going off of what's being picked up by my Yeti microphone so... idk. ngl it could also potentially be the way i set up the shapekeys, but idk
Teeth clipping? - related to the above... very ironically, now that i have fixed the eyebrow clipping, the teeth will sometimes clip through when i speak...
Wide mouth movement - ok i feel like this has more to do with the way i set up certain shape keys, because i know for some i probably stretched the mouth a little too far, but it is what it is i guess.
Shirt sleeve clipping - when I adjust the arms in VSeeFace, the sleeves of the shirt come out. i'll probably just cheat and delete vertices teehee
that being said... another year is coming to an end, and in regards to modeling, i feel like i have learned so much. i overcame what i found intimidating. who knew i would be making mmd + vtuber models like this..... this year, i made a short video for a friend's birthday (hiii dako if you're reading this) and then challenged myself further with that miku 16th birthday video by animating individual shots and editing a bit of the motion data. yaaay for learning animation, video editing, and direction
i hope to reopen commissions someday... i opened and then closed them this year LOL, but i didn't really get anything anyways, and it's fine because on one hand i do feel like i should improve a bit and get it right before opening commissions again, but it did suck not getting anything hrjghfjg. since this pandemic started i basically was forced to just rush into 3d modeling, so in some ways... i felt like i was not adequate.
i think i am capable and i know what to do, i just need to practice. i have a bunch of tutorials + references backed up, so it's important to refer back to those too. i do enjoy making art and i enjoy making 3d models; it feels like solving a puzzle, and it is satisfying to see them come to life by animating something small. i just think it sucks not having income
so uhmmm... lessons learned i guess:
don't be hard on yourself
try
always look back at references, seek out tutorials, tips, etc.
speaking of challenges... a challenge for me is to be more organized and to double check things. in this case i had to redo some work because i forgot one iPhone shapekey and it was jawForward lol... yay for going back to past saves (see the little script sketched after this list). also i'd say i'm good at organization, it's more like just making sure everything is in order before moving on to the next step
practice
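on the jawForward thing: a tiny script run inside Blender could have caught that before export. this is just a sketch i'd try, not a guaranteed fix: only a few of the 52 standard ARKit shape key names are listed, and the mesh name "Face" is a placeholder for whatever your model's face mesh is called.

```python
import bpy

# only a handful of the 52 standard ARKit names are listed here; extend as needed
ARKIT_KEYS = {
    "jawOpen", "jawForward", "eyeBlinkLeft", "eyeBlinkRight",
    "browInnerUp", "mouthSmileLeft", "mouthSmileRight",
}

obj = bpy.data.objects["Face"]  # placeholder: your model's face mesh
keys = obj.data.shape_keys
have = {kb.name for kb in keys.key_blocks} if keys else set()
missing = sorted(ARKIT_KEYS - have)
print("missing ARKit shape keys:", missing or "none")
```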
maybe it's because of this year ending i'm feeling sentimental and emotional but thank you for supporting me and watching me on this 3d modeling journey. see you next time with a new model... it is time to let go of perfection and set this model/project free yaaay (and hopefully i can actually stream with it sometime LOL)
1 note
·
View note
Text
Hey have you tried vnyan? warudo? live3d? Luppet? Tracking World? VMagicMirror? Oh you use VSeeFace, so you need to install iFacialMocap or Waidayo or Facemotion3d or etc so you can track your face from your phone then you need to use a program like VRM Posing Desktop to position your character but then they'll be stiff you better use a different program also you're going to need a different one if you want tts and oh you want throwable objects or chat interaction? Well you better install a 3rd or 4th program, don't forget you need obs open so you can stream everything what do you MEAN you want to play a modern game have you tried turning all settings to low or maybe buying a better computer?
original x
1K notes
·
View notes
Text
and that's our stream! It feels good to be back to my base model! We got some good progress on the ear rigging and physics, and a few other things. Also got a chance to test out iFacialMocap, but I think I need to tune my model a bit better to take advantage of it
2 notes
·
View notes
Text
testing a new hair, emotes installed, base created
the ARKit blendshapes still need tuning; they're a little too heavy right now, but that's intentional, so you can dial them in to your liking.
the emotes
the blendshapes
pics taken using VSeeFace and iFacialMocap
4 notes
·
View notes
Text
VSeeFace for Linux

VSeeFace is a free, highly configurable face and hand tracking VRM and VSFAvatar avatar puppeteering program for virtual youtubers with a focus on robust tracking and high image quality. VSeeFace offers functionality similar to Luppet, 3tene, Wakaru and similar programs. VSeeFace runs on Windows 8 and above (64 bit only). Perfect sync is supported through iFacialMocap / FaceMotion3D / VTube Studio / MeowFace. VSeeFace can send, receive and combine tracking data using the VMC protocol, which also allows support for tracking through Virtual Motion Capture, Tracking World, Waidayo and more.

Face tracking, including eye gaze, blink, eyebrow and mouth tracking, is done through a regular webcam. For the optional hand tracking, a Leap Motion device is required. Capturing with native transparency is supported through OBS's game capture, Spout2 and a virtual camera. You can see a comparison of the face tracking performance compared to other popular vtuber applications here: running four face tracking programs (OpenSeeFaceDemo, Luppet, Wakaru, Hitogata) at once with the same camera input. In this comparison, VSeeFace is still listed under its former name OpenSeeFaceDemo.

Please note that Live2D models are not supported. For those, please check out VTube Studio or PrprLive. If you have any questions or suggestions, please first check the FAQ. If that doesn't help, feel free to contact me, Emiliana_vt!

If you use a Leap Motion, update your Leap Motion software to V5.2 or newer! Just make sure to uninstall any older versions of the Leap Motion software first. To update VSeeFace, just delete the old folder or overwrite it when unpacking the new version. Old versions can be found in the release archive here. If necessary, V4 compatibility can be enabled from VSeeFace's advanced settings.
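Since the VMC protocol comes up above: it is OSC over UDP, so exchanging tracking data can be sketched in a few lines of Python. This is a rough illustration only, assuming the python-osc package, the commonly cited default receive port 39539, and the standard blendshape message addresses; check the VMC protocol spec and your app's settings before relying on it.

```python
from pythonosc.udp_client import SimpleUDPClient  # pip install python-osc

client = SimpleUDPClient("127.0.0.1", 39539)  # assumed default VMC receive port

# set one blendshape value for this frame, then tell the receiver to apply it
client.send_message("/VMC/Ext/Blend/Val", ["Joy", 0.8])
client.send_message("/VMC/Ext/Blend/Apply", [])
```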
0 notes
Note
If you don’t mind my asking how do you find working in the vtuber program you use? I have an ancient copy of facerig but when I saw I had to make like over 50 animations that it blends between I was like “well I don’t have time for that” but never really looked around any further because I like don’t stream often At All and it was a passing “it’d be fun to make something like that” curiosity but man. I’m thinking about it again
howdy! :D
I use VSeeFace on my PC and iFacialMocap on my iPhone for vtuber tracking and streaming! I really like it, but it's lots and lots of trial and error haha- that's partly because all my avatars are non-humanoid/are literally animals with few to no human features, and vtuber tracking programs are designed to measure the movements of a human face and translate them onto a human vtuber avatar, so I have to make lots of little workarounds. If you're modeling a human/humanoid avatar, I would imagine things are more straightforward
I think as long as you're comfortable with 3D modeling and are ready to be patient, it's definitely worth exploring! I really like using my model for streaming and narration of YouTube videos, Tiktoks, etc. For me it's just a lot of fun! :D It's a bit like being a puppeteer haha
If you're looking to get started making vtubers, I really recommend this series of tutorials by MakoRay on YouTube, it's super informative and goes step-by-step:
youtube
Best of luck! :D
16 notes
·
View notes
Text
The only way we've found to minimize (not avoid) the clipping and the reaction issues in this case is to manually edit the values of each problematic movement inside MeowFace, or in the most extreme case, to make certain facial movements (like sticking out the tongue or puffing the cheeks) manually triggerable from the keyboard.
There isn't much I can do on my end once it gets to this point.
For now, I recommend using Apple products to avoid these problems.
Note: A more recent Android phone will probably work better than this example, but I can't guarantee anything.
3 notes
·
View notes
Text
youtube
Vtuber: ClaricalVT
• https://www.twitch.tv/Claricalvt
• https://twitter.com/Claricalvt
Model Illustration and L2D Rig: Cinnamegart_
• https://twitter.com/cinnamegart_
Art Programs: CSP, Live2D, VTS, Vbridger, iFacialMocap, Davinci Resolve
________________________
Cinnameg’s Commission Info:
•Website: https://cinnmegl2d.carrd.co
Cinnameg’s Socials:
•Twitter: https://twitter.com/Cinnamegart_
0 notes
Video
youtube
Blender Facial tracking with iFacialMocap (PC+Android)
0 notes
Text
also, here's the video tutorial i was following :3 (it only works if you have an RTX card btw!!!)
i was trying something new: there's a facial tracker now in NVIDIA Broadcast, and a program that can translate it into the iFacialMocap protocol.
it looks really cool and promising, but when i connected it to VSeeFace it was lagging like hell and could only track the general facing direction. no expressions whatsoever. but that might be a me / my model problem.
buuut it's cool that an iPhone might not be required in the future for detailed tracking!!!
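for the curious, here's roughly what a translator program like that has to produce. this is a loose sketch based on community descriptions of the iFacialMocap protocol (pipe-separated name-value pairs plus a head transform, sent over UDP to port 49983); the exact field layout is an assumption, so verify against the iFacialMocap docs before using it.

```python
import socket

VSEEFACE_HOST = "192.168.1.50"  # hypothetical: the PC running VSeeFace
IFM_PORT = 49983                # port commonly cited for iFacialMocap data

def build_packet(blendshapes, head_rot, head_pos):
    """blendshapes: ARKit-style names mapped to 0..100 ints."""
    parts = [f"{name}-{int(val)}" for name, val in blendshapes.items()]
    # head rotation (degrees) followed by position, delimited like the rest
    parts.append("=head#" + ",".join(f"{v:.6f}" for v in (*head_rot, *head_pos)))
    return ("|".join(parts) + "|").encode("utf-8")

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
packet = build_packet(
    {"jawOpen": 35, "mouthSmile_L": 20, "mouthSmile_R": 20},
    head_rot=(2.5, -10.0, 0.0), head_pos=(0.0, 0.0, 0.0),
)
sock.sendto(packet, (VSEEFACE_HOST, IFM_PORT))
```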
5 notes
·
View notes