BadMouth Pcomp/ICM

So, finally done with finals. Well, almost done… We start off with a certain idea and spend a good amount of time imagining the outcome. It's usually great in theory; when it comes to making it happen, challenges arise in different aspects of the project.

In my first blog for BadMouth, I broke the project down into five categories.

I will go through the five sections to show how I ended up developing the project: methods I changed, methods I learned, methods I dropped…

Speech/Voice:

Speech recognition

After research and time spent checking out speech/voice recognition and trying different libraries, the easiest way for me was to go back to p5.js. It's not just about ease; it's about what I can accomplish in a certain amount of time with my knowledge. The speech library is a bit challenging: it is less developed than other libraries and has fewer usage examples… It took some time to get it started (thanks to the ICM help session support, I was able to set it off). Continuous recognition also only works in Chrome, so the p5.js desktop editor was not a possibility; my only choice was the web editor version, which I like less.
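For anyone going down the same path, the basic continuous-recognition setup with the p5.speech library looks roughly like this; a minimal sketch, assuming the library is loaded in index.html, and it only runs in Chrome:

```js
let rec; // p5.SpeechRec object from the p5.speech library

function setup() {
  noCanvas();
  // the callback fires every time the recognizer returns a result
  rec = new p5.SpeechRec('en-US', gotSpeech);
  rec.continuous = true;      // keep listening instead of stopping after one phrase
  rec.interimResults = false; // only report finalized results
  rec.start();
}

function gotSpeech() {
  if (rec.resultValue) {
    console.log(rec.resultString); // the recognized text
  }
}
```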
Through the process of trying out libraries, doing research, and also trying to work through Python instead of p5.js, I got to learn a lot about speech recognition, how it differs from voice recognition, some history of how it all started, and the AI possibilities/limitations of the future… all amazing stuff to read and learn about. If nothing else, learning about the limitations of what's out there, what can be done and what can't, is a very good start.
On that path, one great reference was of course given by the amazing Allison Parrish: wit.ai. Simple and easy to follow, it lets you develop a scenario easily and gives it back to you as JSON files, in formats that are easy to use in code. So with more time in hand, one could really go into creating a full personal library based on personal scenarios.
So that process made me go, again, through all the tutorials out there on JSON, APIs, and how to get data into code. I had to go through many examples and apply the tutorials over and over to really get it.
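For reference, the pattern those tutorials kept drilling is roughly this; a minimal sketch, where the file name scenario.json is just a hypothetical stand-in for whatever wit.ai exports:

```js
let scenario; // will hold the parsed JSON data

function preload() {
  // load the file before setup() runs so the data is ready to use
  scenario = loadJSON('scenario.json');
}

function setup() {
  noCanvas();
  console.log(scenario); // a plain object we can index into
}
```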
When I first started working with p5.js, I only knew how to repeat the examples given, work through them a bit, develop some… but it was a lot of copy-pasting. What makes me happy, though it isn't much, is that for the first time, while doing the speech coding for the final, I felt I could finally apply some logic of my own: now I can use an if statement or a true/false check, something I wasn't able to apply logically before. Good to know it gets better.

Other than making the library work, there was the part about connecting it serially. I had done serial communication several times in Pcomp homeworks, but I guess it also has to do with the different sensors we use: the setup is the same, but the logic sometimes differs from one sensor to another, and getting the right values into p5.js was tiring.
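For the record, this is roughly the receiving pattern on the p5.js side, sketched with the p5.serialport library; it assumes the p5.serialcontrol server is running, and the port name below is hypothetical:

```js
let serial;       // p5.SerialPort object
let distance = 0; // latest value sent by the Arduino

function setup() {
  createCanvas(400, 200);
  serial = new p5.SerialPort();
  serial.open('/dev/cu.usbmodem1411'); // hypothetical port name; use your own
  serial.on('data', gotData);          // runs whenever new bytes arrive
}

// read one newline-terminated line and turn it into a number
function gotData() {
  let line = serial.readLine();
  if (line.length > 0) {
    distance = Number(line);
  }
}

function draw() {
  background(220);
  text('distance: ' + distance + ' cm', 20, 100);
}
```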

I had worked on different sketches for BadMouth; below are the two sketches presented in ICM as finals.
The first is a speech BadMouth delivers: talking random shit to people (a bit long):

https://alpha.editor.p5js.org/renanou/sketches/ryz4jmPmg

The second is a made-up conversation between BadMouth and myself:

audio sample of the conversation

https://alpha.editor.p5js.org/renanou/sketches/SygueGqQl

When it came to combining them serially, I had to change the strategy of how I use the scenarios. I am using an ultrasonic sensor, which not only detects presence but, more importantly, distance, so it was only logical to work with what the sensor does, meaning assigning text linked to the distance people are standing from BadMouth. BadMouth will still be bad, but if people are close he will talk about this proximity, mentioning that they are close; if people are far, he'd call them to come closer… This logic makes more sense with an ultrasonic sensor than just assigning a random speech without taking the role of the sensor into consideration. Below is the sketch I worked out by creating different arrays of words for the different ranges the ultrasonic detects, depending on the distance or proximity of a person:
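As a simplified sketch of that range logic (the phrases and the cutoff distances here are placeholders, not the actual scenario text):

```js
// placeholder phrase banks for each distance range
let closePhrases = ['Whoa, back up, you are way too close.',
                    'Personal space, ever heard of it?'];
let farPhrases = ['Hey you, come closer!',
                  'I can barely see you from over here.'];
let speech;        // p5.Speech text-to-speech object
let distance = 80; // distance in cm, normally updated from the serial port

function setup() {
  createCanvas(400, 200);
  speech = new p5.Speech();
}

function speakForDistance() {
  // pick a phrase bank based on how far the person is standing
  if (distance < 50) {
    speech.speak(random(closePhrases));
  } else if (distance < 200) {
    speech.speak(random(farPhrases));
  }
  // beyond 200 cm nobody is there, so stay quiet
}

function mousePressed() {
  speakForDistance(); // trigger manually while testing
}
```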

Video below: testing.

ranges ultrasonic

Below, the sketch in p5.js:

https://alpha.editor.p5js.org/renanou/sketches/Sy8ArBRXl

As for the conversation with BadMouth, I had presented in ICM a made-up scenario to show speech recognition… but that scenario doesn't make sense for the project itself, so I decided to rework it to respond to one thing people might actually do or say. In this scenario, it was logical to insult BadMouth: "Fuck you", "Fuck you BadMouth", or "Fuck you BadMouth, you are a loser!". This conversation is not linked to the sensor, as it's the people's output and initiative and not the role of the sensor.
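A stripped-down sketch of that idea, combining the recognition and speech objects from p5.speech (the comeback line is a placeholder, not the real scenario):

```js
let rec;    // p5.SpeechRec recognition object
let speech; // p5.Speech text-to-speech object

function setup() {
  noCanvas();
  speech = new p5.Speech();
  rec = new p5.SpeechRec('en-US', gotSpeech);
  rec.continuous = true; // keep listening (Chrome only)
  rec.start();
}

function gotSpeech() {
  if (!rec.resultValue) return;
  let heard = rec.resultString.toLowerCase();
  // if someone insults BadMouth, he answers back
  if (heard.includes('fuck you')) {
    speech.speak('Oh, how original. Did you stay up all night working on that?');
  }
}
```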

Below, the sketch in p5.js:

https://alpha.editor.p5js.org/renanou/sketches/HybnhbDml


Motion detector:

Initially I had the PIR in mind. Once I discussed it in class with my great teacher Benedetta Piantella, she suggested the ultrasonic sensor and camera detection instead of the PIR, and of course, as she knows better, she was right. I went through different tests posted on the blog earlier… Working with the PIR was really boring and not satisfying: it takes a lot of time to reset after detecting, so a hand could be moving and the responses wouldn't really follow, because between one movement and another it has to reset.

Moving to the ultrasonic, it was great indeed. The values are detailed, and it leaves a really good margin for getting creative with it (if time is available). I tried different sketches on the Arduino and always got great values, which is why I decided to adopt it for this project.

The camera ideally would have been great as well because, as Benedetta Piantella suggested, you could have BadMouth make comments when it sees a certain color. Say someone wearing a yellow shirt is standing at a certain proximity to BadMouth; BadMouth could comment on the yellow shirt. But when I tried testing with the camera I found it complicated, and I had decided to keep it till last because I was running out of time and needed to settle on and build the project. To make up for that color-detecting idea, I also tested the color sensor; I thought it could be cool to combine the ultrasonic and the color sensor instead of the camera. The color sensor values and tests were satisfying, but the issue is that you really need to be at very close proximity to make it work… I don't think that makes sense in the scenario of BadMouth, where people aren't supposed to be that close. Nevertheless, working with it added another cool learning to all that.
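For the record, the camera color idea could be sketched in p5.js by sampling the webcam pixels and checking how much of the frame looks yellow; the thresholds below are rough guesses, not tested values:

```js
let video; // webcam capture

function setup() {
  createCanvas(320, 240);
  video = createCapture(VIDEO);
  video.size(320, 240);
  video.hide();
}

function draw() {
  image(video, 0, 0);
  video.loadPixels();
  let yellowCount = 0;
  // count pixels that look roughly yellow (high red, high green, low blue)
  for (let i = 0; i < video.pixels.length; i += 4) {
    let r = video.pixels[i];
    let g = video.pixels[i + 1];
    let b = video.pixels[i + 2];
    if (r > 150 && g > 150 && b < 100) {
      yellowCount++;
    }
  }
  // if enough of the frame is yellow, BadMouth could comment on the shirt
  if (yellowCount > 2000) {
    text('yellow detected!', 10, 20);
  }
}
```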

Ultrasonic details:

HC-SR04 Ultrasonic Distance Measuring Sensor Module for Arduino / NewPing library

The ultrasonic sensor has a NewPing library that makes the values, and working with the sensor in general, much better. I went through using it as well.

Below are different sketches I tried out from different sources, then changed and worked through, for the ultrasonic sensor, which has two pins to consider: the trigger pin and the echo pin.

One with the NewPing library and one without:

sketch one

sketch two

Mouth Motion:

In this section I wanted to make the lips of BadMouth move while he talks. Unfortunately, due to time constraints, I spent so much time on the motion detection and speech recognition sections that this one was jeopardized. I am planning to complete it during winter break, if nothing else at least for my own satisfaction and my own learning.

The challenge was to work through the mechanics of making the lips move up and down. I thought that if I mounted a clip at the end of each of two motors and attached each clip to a lip, the physical restraint would keep the motors from spinning completely, and they would probably produce some kind of random movement (trying to spin their way out). That would be exactly what I need: just a small movement of the lips, good enough to make the impression. But unfortunately, realistically, my imagination of the process was off: the motors worked, but once attached to the lips they didn't make a small random movement; instead they didn't move at all… Because I experienced this at the last minute, I didn't have the chance to try plan B, which I think could be the way to do it: a gripper kit tool and a servo motor. The gripper kit is simply a kind of small clip wired through a servo; unlike the motors I was using, it doesn't spin but instead does exactly the motion I need, the opening and closing of both ends of the clip.

samples of what I tried to do:

That was a sample powered by batteries, not yet linked to the Arduino, just to test whether it would work with BadMouth.

Video of the motor/clip spinning (before attaching it to the lips):

IMG_2462

The plan I think I should experiment with next:

Gripper tool kit and servo

Here's a reference from ServoCity on how it works…

Mouth Design

The mouth would have been designed from scratch if time were available; for now I had to find solutions to get the feel of a realistic-looking mouth.
I found a realistic latex mask and decided to work with it for the time being. I laser-cut a panel with an opening for the mouth to sit in (the stuck-in-the-wall symbol) and another above it for the ultrasonic motion sensor.

Pictures below:

Scenario for BadMouth

Ideally, the scenario for BadMouth should be written personally and recorded in a voice specially assigned to BadMouth's character. For now I just worked with clichéd, cheesy, funny quotes you can find on the net and assembled them so they make sense together, injecting in between some of the human sounds and words we say in our daily routine to give BadMouth a more humanistic feel, and also adding some sentences I made up, to be able to put a draft scenario together for now.
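A toy version of that assembly step might look like this in p5.js (the quotes and fillers below are placeholders for the real scenario text):

```js
// placeholder source material for the draft scenario
let quotes = ['I would agree with you, but then we would both be wrong.',
              'I am not lazy, I am on energy-saving mode.'];
let fillers = ['hmmm…', 'ugh,', 'you know what,'];

// glue a human-sounding filler onto a found quote
function buildLine() {
  return random(fillers) + ' ' + random(quotes);
}

function setup() {
  noCanvas();
  console.log(buildLine()); // e.g. "ugh, I am not lazy, I am on energy-saving mode."
}
```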

Thank you

-Allison Parrish and Benedetta Piantella, for the time you give, the great teaching, the help and support, but mostly for being who you are: as individuals, a great inspiration to us all.

-Ben Light, for the advice and help you give even when it's not linked to your course.

-Mithru Vigneshwara, for supporting and helping out with the coding, and Manning Qu, for brainstorming with me over motor mechanics and fabrication.

Bad Mouth Final

BadMouth started off as an installation that has no purpose in life but to talk shit to people. That is still the scenario; however, when it comes to execution, reality checks the gap between how you imagine a project will go and how it ends. Not that the ending is negative, but there are definitely detours, challenges, and limitations to our expectations.

I probably had artificial intelligence in mind. I ended up with a fun text-to-speech in p5.js. Eventually it's all a learning and experimentation process that one has to go through to reach the best and simplest form in which to code the project.
First I experimented a bit with Python: went through its tutorials and through speech recognition sketches. It turned out to be an impossible mission. I then researched JS, APIs, and the different libraries available online… till I went back to p5.js. Working with the speech library in p5.js was a challenge; it has some aspects that need reviewing before people can comfortably work with it.
One great site the amazing Allison Parrish told me about is wit.ai, effectively a great site to create your own scenario for speech recognition and assign it to any bot or application you are starting to work on. It gives you clean JSON files that can be used with the p5.js library.

I have prepared two sketches: one that dictates the speech BadMouth will deliver when he senses people approaching through the ultrasonic sensor, and another that is a scripted conversation with BadMouth. Eventually this sketch will be set up so people can insult BadMouth and get different results.

Final videos for the serial communication and the BadMouth installation will be available next week as soon as the Pcomp part is over.

For now, below are the two p5.js sketches I will be using for BadMouth.

conversation with BadMouth

https://alpha.editor.p5js.org/renanou/sketches/HybnhbDml

BadMouth speech

https://alpha.editor.p5js.org/renanou/sketches/ryz4jmPmg


Bad Mouth – Final Project

The Bad Mouth project will be developed for my ICM and Pcomp classes.

Description:

Bad Mouth is a purposeless installation of a pissed-off mouth who talks shit to people all day long (excuse my language) because he is stuck behind a wall. Bad Mouth will interact with people through a limited set of dialogue. As long as he is receiving comments and sensing talking input, he will generate expressions.

Bad Mouth has a character: like an angry stand-up comedian, the expressions will be mean, nasty, vulgar (sometimes), but always funny and smart.

Bad Mouth is a pair of silicone lips, 10″ to 12″, coming out of a wall. Behind the wall, the lips are attached to a motor that moves them slowly, mimicking the action of a mouth when we speak.
Bad Mouth will have some sort of sensor, or a live camera, that detects the presence of people when they are near him.


Below, some pictures as a mood board for the possibilities of how Bad Mouth might look:

il_570xn-257678124

details-about-new-silicone-rubber-face-slimmer-mouth-muscle-tightener-anti-wrinkle-anti-aging-1
creativando-home-humidifier-red-lips-1
humumgakisstoy-2

Times square’s insanity

screen-shot-2016-11-03-at-3-15-16-pm

So this is Times Square's madness. It's an interactive piece people can play around with. I worked with sound, images, videos, slides, shapes, loops, random… planning to add more buttons and clicks for people to play with. It is still in progress…

Trump/Hillary (week5-6)

This post is a fun piece that subliminally suggests Hillary over Trump. When you click on Trump, it turns into Hillary.

http://localhost:3000/

This post dragged over two weeks because when I started coding/designing it in p5.js I couldn't finish it; I got stuck making the Trump text turn into Hillary. Eventually it was turning into Hillary, but coded the wrong way (I should have used an if statement with a true/false flag). Thanks to the amazing Allison Parrish, who showed me how to do it properly in class. So I went back to it and was able to rework it properly, enhance the design aspect, add the (rotating) stars, and go all the way to make it a fun patriotic piece with the flag, inspired by the crazy events around the 2016 elections.
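The fix boils down to a true/false flag plus an if statement; a minimal sketch of the pattern (the names here are made up for illustration):

```js
let showHillary = false; // true/false flag toggled by clicking

function setup() {
  createCanvas(400, 200);
  textSize(48);
  textAlign(CENTER, CENTER);
}

function draw() {
  background(255);
  // the if statement picks which name to draw based on the flag
  if (showHillary) {
    text('Hillary', width / 2, height / 2);
  } else {
    text('Trump', width / 2, height / 2);
  }
}

function mousePressed() {
  showHillary = !showHillary; // flip the flag on every click
}
```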


Phase 3 | final phase: link in p5.js:
http://localhost:3000/


screen-shot-2016-10-20-at-9-40-54-pm

screen-shot-2016-10-20-at-9-40-25-pm

screen-shot-2016-10-20-at-9-40-41-pm

Below, sample links and screenshots of the previous draft sketches:

Phase 1:
https://alpha.editor.p5js.org/projects/SkU7FaU1e

Phase 2:
https://alpha.editor.p5js.org/projects/SkU7FaU1e

Phase 2 (pics):

screen-shot-2016-10-25-at-10-36-00-pm

screen-shot-2016-10-25-at-10-36-18-pm

Phase 1 (pics):

screen-shot-2016-10-25-at-10-50-55-pm

Could be slot machines in some very abstract world…

In this project I tried to work with functions while incorporating other techniques we used before, like nested loops, bouncing balls, color mapping with mouseX, etc. However, the balls eventually left and didn't come back after bouncing :0, don't know why…
https://alpha.editor.p5js.org/projects/Sk14RE7R
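For future reference, the usual way to keep a ball from escaping is to flip its velocity whenever it crosses an edge; a minimal sketch of my guess at what was missing:

```js
let x = 100, y = 100;       // ball position
let xSpeed = 3, ySpeed = 2; // velocity

function setup() {
  createCanvas(400, 300);
}

function draw() {
  background(220);
  x += xSpeed;
  y += ySpeed;
  // reverse direction at each wall so the ball always comes back
  if (x < 0 || x > width) xSpeed = -xSpeed;
  if (y < 0 || y > height) ySpeed = -ySpeed;
  ellipse(x, y, 30, 30);
}
```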

screen-shot-2016-10-06-at-11-20-24-am

This is how I started at first: below, I worked on nested loops, making patterns and grids, playing with mouseX and map() too…
https://alpha.editor.p5js.org/projects/rkJyIE7R
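The core of those pattern sketches is a nested loop over rows and columns; a minimal sketch (the grid size and shapes are arbitrary):

```js
function setup() {
  createCanvas(400, 400);
}

function draw() {
  background(255);
  // nested loops walk the canvas as a grid
  for (let x = 0; x < width; x += 40) {
    for (let y = 0; y < height; y += 40) {
      // map() ties the circle size to the mouse position
      let d = map(mouseX, 0, width, 5, 35);
      ellipse(x + 20, y + 20, d, d);
    }
  }
}
```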

screen-shot-2016-10-06-at-11-22-50-am

Animated patterns: many and random

_icm-patterns-animated

These are my funky animated patterns; I enjoyed playing around with them. Links below to each animation:

music notes (I guess)
* https://alpha.editor.p5js.org/projects/SkAYb8Yp

chosen option
* https://alpha.editor.p5js.org/projects/SkaN1Wda
* https://alpha.editor.p5js.org/projects/HkeSCuva

bold, soft and blue
* https://alpha.editor.p5js.org/projects/Byx4LYDp
* https://alpha.editor.p5js.org/projects/S1UKHtwp
* https://alpha.editor.p5js.org/projects/rk3ZQFDp

playful 1 & 2
* https://alpha.editor.p5js.org/projects/ByOJV_wp
* https://alpha.editor.p5js.org/projects/SkO3Twwp

minimal bubbles and stripes
* https://alpha.editor.p5js.org/projects/rJ320Jvp
* https://alpha.editor.p5js.org/projects/Hyy4prF6

Blowing bubbles

https://alpha.editor.p5js.org/projects/SJdTJPZp

blowing-bubbles

So Blowing Bubbles is an attempt to blow bubbles in p5.js.
It was kind of fun to animate; however, I wasn't able to stop the action or loop it. It feels like the actions are still out of control. I used variables to make the bubbles and to assign actions to them.
I was also able to apply random motion to some of the bubbles. It was interesting to give them this shaky feel: because I constrained the randomness to a certain space, the effect ended up like a shake, perfect for this scenario.
Also, when I played a bit with the shadow of the blower, it gave a better feel, more depth to the illustration.
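The shaky-bubble trick is just redrawing each bubble at its base position plus a small constrained random offset; a minimal sketch of that idea:

```js
let bx = 200, by = 150; // the bubble's base position

function setup() {
  createCanvas(400, 300);
}

function draw() {
  background(200, 230, 255);
  // keep the randomness within a few pixels so it reads as a shake,
  // not a teleport
  let jitterX = random(-3, 3);
  let jitterY = random(-3, 3);
  noFill();
  ellipse(bx + jitterX, by + jitterY, 60, 60);
}
```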

ICM week 1

screen-shot-2016-09-15-at-11-31-31-am

It looks simple but took plenty of time!
It was hard to compose in code for the first time; however, I found some joy in doing it. The idea of eventually being able to compose, not just with a set of tools someone assigned to you, is pretty exciting.
I used translate and rotate transformations to repeat patterns and change angles. I struggled to figure things out, and I'm still not sure how to use them. After I made my composition, I wasn't able to delete elements I didn't want anymore: every time I did, the whole composition shifted and rotated differently. I eventually figured out that I had to use push() and pop() to control the transformations. I tried to insert them at the end, but I would have had to rework horrible details that took time to color, place, and rotate…
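For future me: the pattern that fixes this is wrapping each element's transformations in push()/pop() so they don't leak into the next element; a minimal sketch:

```js
function setup() {
  createCanvas(400, 400);
  background(240);

  // first element: a rotated square
  push();               // save the current coordinate system
  translate(100, 100);
  rotate(PI / 6);
  rect(-25, -25, 50, 50);
  pop();                // restore it, so the next shape is unaffected

  // second element: drawn in untouched coordinates
  ellipse(300, 300, 60, 60);
}
```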
Another issue: I understand the x and y axes in design, but I'm finding it very hard to place a shape at a certain spot. It was all done with trial and error, not because I could directly spot the right location. I mean, no rulers, no guides whatsoever for the axes?
Same with the rotation angles…
As for colors: to get my dirty color palette mixed with greys, I had to use the Photoshop color picker, pick a color, take the number, and put it in the code. It worked, but is there a color palette in p5.js?
Then I wanted some transparency, so I used the extra alpha value in the color, I guess…