BadMouth Pcomp/ICM

So, finally done with finals. Well, almost done… We start off with a certain idea and spend a good amount of time imagining the outcome. It's usually great in theory; when it comes to making it happen, challenges arise in different aspects of the project.

In my first blog post for BadMouth, I broke the project down into 5 categories.

I will go through the five sections to show how I ended up developing the project: methods I changed, methods I learned, methods I dropped…

Speech/Voice:

Speech recognition

After putting time into researching speech/voice recognition and trying out different libraries, the easiest way for me was to go back to P5js. It's not just about easy; it's about what I can accomplish in a certain amount of time with my knowledge. The speech library is a bit challenging: it is less developed than other libraries and has fewer examples of usage. It took some time to get it started (thanks to the ICM help session support I was able to set it off). It also only works in Chrome for continuous recognition, so the desktop P5js editor was not an option; I had to use the web editor version, which I like less.
Through the process of trying out libraries, doing research, and also trying to work through Python instead of P5js, I got to learn a lot about speech recognition, how it differs from voice recognition, some of the history of how it all started, and the AI possibilities and limitations ahead. All amazing stuff to read and learn about. If nothing else, learning about the limitations of what's out there, what can be done and what can't, is a very good start.
On that path, one great reference came, of course, from the amazing Allison Parish: wit.ai. It's simple and easy to follow, lets you build the scenario easily, and hands it back to you as JSON files in formats that are easy to use in code. With more time in hand, one could really go into creating a full personal library based on personal scenarios.
That process made me go, once again, through all the tutorials out there on JSON, APIs, and how to get data into code. I had to go through many examples and apply tutorials over and over to really get it.
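For what it's worth, this is roughly what those tutorials boil down to: loading a JSON file of scenario lines into a P5js sketch with loadJSON(). The file name and structure here are placeholders, not the actual wit.ai export:

```javascript
// hypothetical intents.json, in the spirit of a wit.ai export:
// { "insult": ["you are rude", "go away"], "greeting": ["hey there"] }
let intents;

function preload() {
  intents = loadJSON('intents.json'); // placeholder file name
}

function setup() {
  noCanvas();
  // pick a random line from one intent's array
  let line = random(intents.insult);
  console.log(line);
}
```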
When I first started working with P5js, I only knew how to repeat given examples, work a bit through them, develop some, but it was a lot of copy-pasting. What makes me happy, though it's not much, is that while doing the speech coding for the final I felt for the first time that I could apply some logic of my own: now I could use an if statement or a true/false check, something I wasn't able to apply logically before. Good to know it gets better.
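As a rough idea of where the speech part ended up, here is a minimal sketch of continuous recognition with the p5.speech library (continuous recognition only works in Chrome, the exact property names may differ slightly by version, and the trigger word and response are placeholders):

```javascript
// assumes p5.speech.js is loaded next to p5.js
let recognizer; // speech-to-text
let mouth;      // text-to-speech

function setup() {
  noCanvas();
  mouth = new p5.Speech();
  recognizer = new p5.SpeechRec('en-US', gotSpeech);
  recognizer.continuous = true;      // keep listening (Chrome only)
  recognizer.interimResults = false; // only fire on final results
  recognizer.start();
}

function gotSpeech() {
  if (!recognizer.resultValue) return;       // nothing useful recognized
  let heard = recognizer.resultString.toLowerCase();
  // the "logic" part: a simple if statement deciding what BadMouth says back
  if (heard.indexOf('hello') !== -1) {
    mouth.speak('Oh great, another one.');   // placeholder comeback
  }
}
```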

Beyond making the library work, there was the part of connecting it serially. I had done serial communication several times for Pcomp homework, but I guess it also depends on the sensor you use: the setup is the same, yet the logic sometimes differs from one sensor to another, and getting the right values into P5 was tiring.
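For reference, the P5js side of the serial part looks more or less like this: the p5.serialport library reads whatever the Arduino prints with Serial.println(), and the sketch just has to trim it and turn it into a number. The port name is a placeholder:

```javascript
// assumes p5.serialport.js (the ITP serial library) and the serial server are running
let serial;
let sensorValue = 0;

function setup() {
  createCanvas(400, 200);
  serial = new p5.SerialPort();
  serial.open('/dev/cu.usbmodem1421'); // placeholder port name
  serial.on('data', serialEvent);
}

function serialEvent() {
  let inString = serial.readLine();    // read up to the newline from Serial.println()
  inString = trim(inString);
  if (inString.length > 0) {
    sensorValue = Number(inString);    // e.g. distance in cm from the ultrasonic
  }
}

function draw() {
  background(0);
  fill(255);
  text('sensor: ' + sensorValue, 20, 100);
}
```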

I worked on different sketches for BadMouth; below are the two sketches presented as my ICM final:
The first is a speech BadMouth gives: talking random shit to people (a bit long):

https://alpha.editor.p5js.org/renanou/sketches/ryz4jmPmg

The second is a made-up conversation between BadMouth and myself:

audio sample of the conversation

https://alpha.editor.p5js.org/renanou/sketches/SygueGqQl

When it came to combining them serially, I had to change my strategy for using the scenarios. Since I'm using an ultrasonic sensor, which not only detects presence but, more importantly, distance, it was only logical to work with what the sensor actually does, meaning assigning text linked to the distance people are standing from BadMouth. BadMouth will still be bad, but if people are close he will talk about that proximity, mentioning how close they are; if people are far, he'll call them to come closer. This logic makes more sense with an ultrasonic sensor than assigning random speech without taking the role of the sensor into consideration. Below is the sketch I worked on, creating different arrays of words for the different ranges the ultrasonic detects, depending on the distance or proximity of a person:
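The core of that logic is tiny: one array of lines for people standing close, one for people hanging back, and the incoming distance decides which pool to pull a random line from. The thresholds and phrases below are placeholders (and random() is p5's), not the actual scenario:

```javascript
// placeholder phrase pools
let closeLines = ['Back up. You are fogging my lips.', 'Too close. Way too close.'];
let farLines   = ['Hey, you over there. Come closer.', 'I do not bite. Much.'];

function pickLine(distanceCm) {
  if (distanceCm > 0 && distanceCm < 60) {
    return random(closeLines);   // someone is right in front of BadMouth
  } else if (distanceCm < 250) {
    return random(farLines);     // someone is hanging back
  }
  return null;                   // nobody in range: stay quiet
}
```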

video below testing

ranges ultrasonic

Below, the sketch in P5js:

https://alpha.editor.p5js.org/renanou/sketches/Sy8ArBRXl

As for the conversation with BadMouth, I had presented in ICM a made-up scenario to show speech recognition, but that scenario doesn't make sense for the project itself, so I decided to rework it to respond to one thing people might actually do or say. In this scenario it was logical to insult BadMouth: “Fuck you”, “Fuck you BadMouth”, or “Fuck you BadMouth, you are a loser!”. This conversation is not linked to the sensor, as it comes from people's own initiative rather than from the role of the sensor.
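In code this is just a check on the recognized string, reusing the recognizer and speech objects from the sketch earlier in this post. The comeback line here is a stand-in, not the one in the actual sketch:

```javascript
function gotSpeech() {
  if (!recognizer.resultValue) return;
  let heard = recognizer.resultString.toLowerCase();
  // any variation that contains the key phrase triggers a comeback
  if (heard.indexOf('fuck you') !== -1) {
    mouth.speak('How original. Did that take you all day?'); // placeholder reply
  }
}
```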

Below, the sketch in P5js:

https://alpha.editor.p5js.org/renanou/sketches/HybnhbDml

 

 

Motion detector:

Initially I had the PIR in mind. Once I discussed it in class with my great teacher Benedetta Piantella, she suggested an ultrasonic sensor or camera detection instead of the PIR, and of course, as she knows better, she was right. I went through different tests posted on the blog earlier. Working with the PIR was really boring and unsatisfying: it takes a long time to reset after detecting, so a hand could be moving and the responses don't really follow, because between one movement and another it has to reset. Moving to the ultrasonic was great indeed. The values are detailed and it leaves a lot of room to get creative (if time is available); I tried different sketches in Arduino and always got good values, which is why I decided to adopt it for this project. The camera would ideally have been great as well because, as Benedetta Piantella suggested, you could have BadMouth make comments when it sees a certain color. Say someone wearing a yellow shirt is standing at a certain proximity to BadMouth: BadMouth could comment on the yellow shirt. But when I tried testing with the camera I found it complicated, and I had left it for last because I was running out of time and needed to settle on something and build the project. To make up for that color-detecting idea I also tested a color sensor; I thought it could be cool to combine the ultrasonic with a color sensor instead of the camera. The color sensor values and tests were satisfying, but the issue is that you really need to be at very close proximity to make it work, and I don't think that makes sense in the scenario of BadMouth, where people aren't supposed to be that close. Nevertheless, working with it added another cool bit of learning to it all.

Ultrasonic details:

HC – SR04 Ultrasonic Distance Measuring Sensor Module for Arduino / NewPing Library

The ultrasonic sensor has a NewPing library that makes the values, and working with the sensor in general, much better. I went through using it as well.

Below are different sketches I tried out from various sources, then changed and reworked, for the ultrasonic sensor, which has two pins to consider: the trigger pin and the echo pin.

One with the NewPing library and one without:

sketch one

sketch two
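Under the hood, both sketches do the same thing: time the echo pulse and convert microseconds into centimeters. Just to spell out the math (the real code lives in the Arduino sketches above), here it is as a plain JavaScript helper:

```javascript
// sound travels at roughly 343 m/s, i.e. about 0.0343 cm per microsecond;
// the pulse goes out to the target and back, so divide by two
function echoMicrosecondsToCm(durationMicros) {
  return (durationMicros * 0.0343) / 2; // roughly durationMicros / 58
}
```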

Mouth Motion:

In this section I wanted to make BadMouth's lips move while he talks. Unfortunately, due to time constraints, I spent so much time on the motion detection and speech recognition sections that this one was jeopardized. I am planning to complete it during winter break, if for nothing else then for my own satisfaction and learning.

The challenge was working through the mechanics of making the lips move up and down. I thought that if I mounted two motors with clips at their ends and hooked each clip to one lip, the physical restraint would keep the motors from spinning completely, and they would probably produce some kind of random movement while trying to spin their way out; that would be exactly what I need, just a small movement of the lips, good enough to make the impression. But realistically my imagination of the process was off: the motors worked, but once attached to the lips they didn't make a small random movement, they didn't move at all. Because this was tried at the last minute, I didn't have the chance to try plan B, which I think could be the way to do it: a gripper kit and a servo motor. The gripper kit is simply a kind of small clip driven by a servo; unlike the motors I was using, it doesn't spin but instead does exactly the motion I need, the opening and closing of both ends of the clip.

samples of what I tried to do:

That was a sample running on batteries, not yet linked to the Arduino, just to test whether it would work with BadMouth.

video of the motor/clip spinning (before attaching it to the lips).

IMG_2462

The plan I think I should experiment with next:

Gripper tool kit and servo

Here's a reference from ServoCity on how it works…

Mouth Design

The design of the mouth would have been made from scratch if time were available; for now I had to find solutions to get the feel of a realistic-looking mouth.
I found a realistic latex mask and decided to work with it for the time being. I laser-cut a panel with an opening for the mouth to sit in (the stuck-in-the-wall symbol) and another above it for the ultrasonic sensor to be placed.

Pictures below:

Scenario for BadMouth

The scenario for BadMouth should ideally be written from scratch and recorded with a specific voice character assigned to BadMouth. For now I worked with clichéd, cheesy, funny quotes you can find on the net and assembled them so they make sense together, injecting in between some of the human sounds and words we use in our daily routine to give BadMouth a more human feel, along with some sentences I made up, so I could put a draft scenario together for now.

Thank you

-Allison Parish and Benedetta Piantella, for the time you give, the great teaching, the help and support, but mostly for being who you are as individuals: a great inspiration to us all.

-Ben Light, for the advice and help you give even when it's not linked to your course.

-Mithru Vigneshwara, for supporting and helping out with the coding, and Manning Qu, for brainstorming with me over motor mechanics and fabrication.

Final Project part II: testing and coding

PIR SENSOR + PIEZO + LED:

This sensor is not the best: it is very limiting, and the responses are slow because it needs time to reset every time.

In the first trials the PIR was unresponsive because it was wrongly wired.
VIDEO LINK PIR: pir-wired-wrong

In the second trials it worked, but it was too slow and not an exciting sensor to work with:

VIDEO LINK PIR:img_2322
screen-shot-2016-11-30-at-10-51-24-am

ULTRASONIC SENSOR:

Working with the ultrasonic sensor after the PIR was great, though it's a harder sensor because there are many values to handle in code. I had to download the NewPing library to get better results, and for that I needed to update all my systems, but eventually it worked. The serial communication is also hard because of the values and numbers, but at least the sensor is clearly responsive.

VIDEO links with different codes:

img_2340

ultrasonic-dist

 

screen-shot-2016-11-30-at-6-24-57-am

 

COLOR SENSOR:

The color sensor is actually cool to work with; it gives good readings and numbers.

Video Link: color-sensor

screen-shot-2016-11-30-at-7-27-24-am

Camera with Arduino:

The camera is really hard to work with. So instead I decided to work with the ultrasonic sensor and colors, assigning to each color something for BadMouth to say that is linked to the color it senses.

In parallel to all the sensor wiring, testing, and serial communication, I've been working on the speech and sound code, trying Python first. It was too hard to code with since I don't know Python, so I moved back to doing it in P5js. I am experimenting with assigning the speech as sounds, and also working on a second code with chatterbot APIs. The code and the speech recognition are really hard and complex, so it's taking some time to learn. I will soon upload it with the sensors.

final project

The Bad Mouth project will be developed for my ICM and Pcomp classes.

Description:

Bad Mouth is a purposeless installation: a pissed-off mouth that talks shit to people all day long (excuse my language) because he is stuck in the wall. Bad Mouth will interact with them through a limited set of dialogue; as long as he is receiving a comment and sensing talking input, he will generate expressions.

Bad Mouth has a character: like an angry stand-up comedian, his expressions will be mean, nasty, vulgar (sometimes), but always funny and smart.

Bad Mouth is a pair of silicone lips, 10″ to 12″, coming out of a wall. Behind the wall the lips are attached to a motor that moves them slowly, mimicking the motion of a mouth when we speak.
Bad Mouth will have some sort of sensor that detects the presence of people, or a live camera that detects people when they are near him.

Below are some pictures as a moodboard of the possibilities for how Bad Mouth could look:

il_570xn-257678124

details-about-new-silicone-rubber-face-slimmer-mouth-muscle-tightener-anti-wrinkle-anti-aging-1creativando-home-humidifier-red-lips-1humumgakisstoy-2

Breakdown of the project into 5 sections:

1. Speech/voice:
In this section I will explore how to make BadMouth talk and talk back (with speech recognition). I spent most of this week on this section because it is the major one for this project; as soon as I figure out exactly the right process to follow, I can move on to explore the other sections. Right now I will start testing two different approaches to see the best possible results while taking each process's limitations into consideration:
The first option is to use speech recognition through Arduino:
-voice recognition module TS1299
-Mp3 SD shield
-USB TTL module
However, the TS1299 only takes 15 commands divided into 3 sets. I can assign the voice I want the mouth to speak through the Mp3 SD shield.

Or use:
-Python SpeechRecognition 3.1.3
-a chatterbot API (Cleverbot, Pandorabot)

That latter option seems good; however, I don't know Python, so I'm not sure I can pull it off. I have to try and test…

2. Motion Detector:

In this section I will explore how to get the mouth to detect the presence of people around, so he'd start talking bullshit to them…

I worked briefly on this section this week. Testing will be through a motion sensor with Arduino or through a live camera sensor:

_PIR motion sensor

or
_Camera module for Arduino ov7670

3. Mouth Motion:

I didn't get to this section this week, but overall it will be about how to physically make the mouth move slightly when it speaks to people (ideally some lip-syncing, or at least syncing silent periods with talking periods).

4. Mouth Installation/Design/Fabrication:

In this section I will explore the design of the mouth and how to make it.
I already have an idea of how the design will look; a brief moodboard is added in the description.
The mouth will be made of silicone, coming out of a wall (with no face): a surreal installation.

5. Scenario for BadMouth:
In this section I will research and come up with different scenarios for what BadMouth will say to people and what he will reply when he hears certain words from them.
BadMouth is a pissed-off character. His comments are mean, vulgar, nasty, but always funny and smart; ideally like an ironic stand-up comedian. This section is being explored and researched every week, in parallel with the other sections.

Below is a chart of the sections (the chart will be updated each week):

diagram-for-pcomp

 

Serial communication Midterm project (team of 2)

So my partner in crime for this midterm Pcomp project is Michael Kripchak. It was amazing working with him; since we come from different backgrounds, I got to learn a lot from him. It's probably one of the great assets of ITP that we get to work with and learn from each other, as everyone has different work and life experience.

Michael’s Blog:

PhysComp: MidTerm – Let it Snow! – Michael Kripchak’s NYU-Tisch Blog
kripchak.com/2016/10/26/physcomp-midterm

We initially decided to build this project from a beautiful sketch Kripchak coded in ICM class: https://alpha.editor.p5js.org/projects/SJw7eev1g

This project went through different phases. We first agreed on connecting this cool sketch serially to a sound sensor, so that people could change the wind direction and the snow flow by singing into the sensor. We went with this initial plan and chose to work with the MAX9812 microphone amplifier.
Below, pics:
s-l1600

After many trials trying to make it work and get logical readings through the Arduino, and many more trying to make it work serially in P5js, it kept going nuts on us. We then decided to shift to a different sensor that we could blow into instead of singing (suggested by our friend Carlie), keeping the wind metaphor working in parallel in P5js. We also decided to push the idea further and add a fan physically blowing fake snow inside a transparent glass jar.

After giving up on the microphone amplifier, I experimented serially with a pot, made some enhancements to the sketch, added a background and sound, and eventually connected it serially.
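As a very stripped-down illustration (not Kripchak's sketch), this is the shape of the idea: a value comes in over serial, gets mapped to a wind amount, and the wind pushes the snowflakes sideways while they fall. The port name and numbers are placeholders:

```javascript
let serial;
let wind = 0;       // -1 … 1, steered by the incoming value
let flakes = [];

function setup() {
  createCanvas(400, 400);
  for (let i = 0; i < 100; i++) {
    flakes.push(createVector(random(width), random(height)));
  }
  serial = new p5.SerialPort();
  serial.open('/dev/cu.usbmodem1421'); // placeholder port name
  serial.on('data', serialEvent);
}

function serialEvent() {
  let s = trim(serial.readLine());
  if (s.length > 0) {
    wind = map(Number(s), 0, 1023, -1, 1); // pot (or other sensor) steers the wind
  }
}

function draw() {
  background(20, 30, 60);
  noStroke();
  fill(255);
  for (let f of flakes) {
    f.x += wind * 2; // drift with the wind
    f.y += 1;        // fall
    if (f.y > height) f.y = 0;
    if (f.x > width) f.x = 0;
    if (f.x < 0) f.x = width;
    ellipse(f.x, f.y, 4, 4);
  }
}
```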

Initial Sketch connected to a pot:

Trying out different backgrounds and adding sound:

img_1826

Meanwhile, Kripchak worked deeply on connecting our circuit to a motor, to give the fan enough boost to blow the fake snow, and on getting the right readings through Arduino. That gave us a hard time and kept interfering with the different sensors we were trying over and over again. It's worth mentioning that the potentiometer was the smoothest of all; it worked amazingly:

img_1828 img_1811img_1811
img_1814 img_1812 img_1830

We ended up choosing the piezo vibration sensor, which we could blow on or tap to change the wind's direction serially. It was hard to fix all the noise we were getting in the circuit, but it eventually worked.
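One way to tame noisy readings on the P5js side (not necessarily what we ended up doing in the final circuit) is to smooth the incoming values and only react once they clear a threshold. This assumes the same serial setup as the snow sketch above; the numbers are made up:

```javascript
let smoothed = 0;

function serialEvent() {
  let s = trim(serial.readLine());
  if (s.length === 0) return;
  let raw = Number(s);
  // low-pass filter: move only a little toward each new reading,
  // so a single noisy spike barely registers
  smoothed = lerp(smoothed, raw, 0.1);
  // only treat it as a blow/tap once the smoothed value clears a threshold
  if (smoothed > 100) {
    wind = -wind; // flip the wind direction, as in the snow sketch above
  }
}
```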

Piezo sensor
690635_bb_00_fb-eps-1

Throughout the trial and error we also tried the flex sensor and two motors, one 5V and one 12V, as well as different fans. We also had to do some soldering to connect the motors (a first for me); it was good to go through the experience and learn it, though the result was a bit funny…

Flex Sensor:

screen-shot-2016-10-26-at-1-04-57-am

Soldering:

img_1810

Finally, we got to the end result: we were able to make the serial communication happen through the piezo sensor, tapping it or blowing on it to drive the sketch of the blowing snow in P5js, while simultaneously making the fan spin (through a motor) so the fake snow flies around in the jar. Below, a demonstration:

img_1853


Serial input with P5js

 

The first option is manipulated with a potentiometer. In P5js I chose to work with a nested loop and added a sliding layer on top of it to complete the pattern. This layer is controlled through the potentiometer, which simultaneously controls the mapped color from blue to pink-red; the color of the pattern changes as we play with the pot.
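Here is a bare-bones reconstruction of that idea (not the finished sketch): a nested loop draws the pattern, and the pot value coming in over serial is mapped both to a color between blue and pink-red and to the position of the sliding layer. The port name is a placeholder:

```javascript
let serial;
let potVal = 0; // raw 0–1023 from the Arduino

function setup() {
  createCanvas(400, 400);
  serial = new p5.SerialPort();
  serial.open('/dev/cu.usbmodem1421'); // placeholder port name
  serial.on('data', () => {
    let s = trim(serial.readLine());
    if (s.length > 0) potVal = Number(s);
  });
}

function draw() {
  background(255);
  // the pot picks a color somewhere between blue and pink-red
  let amt = map(potVal, 0, 1023, 0, 1);
  fill(lerpColor(color(0, 80, 255), color(255, 40, 120), amt));
  noStroke();
  // nested loop draws the base pattern
  for (let x = 20; x < width; x += 40) {
    for (let y = 20; y < height; y += 40) {
      ellipse(x, y, 25, 25);
    }
  }
  // the sliding layer: a translucent strip whose position follows the pot
  fill(0, 40);
  rect(map(potVal, 0, 1023, 0, width - 80), 0, 80, height);
}
```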

moodboard-pcomp-sheet-1

screen-shot-2016-10-19-at-5-34-20-am screen-shot-2016-10-19-at-5-33-13-am

The second option is manipulated with a potentiometer as well. In P5js I experimented with the random function, which was harder to control; because it is random, it gives unexpected results, but eventually it worked out after MANY trials…
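One way to keep random() on a leash, again just as an illustration of the idea rather than the actual sketch, is to let the pot set the range the randomness is allowed to wander in:

```javascript
// assumes potVal is updated over serial as in the sketch above
function draw() {
  let jitter = map(potVal, 0, 1023, 0, 50);     // the pot decides how wild it gets
  let x = width / 2 + random(-jitter, jitter);  // random, but bounded
  let y = height / 2 + random(-jitter, jitter);
  ellipse(x, y, 10, 10);
}
```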

moodboard-pcomp-sheet-2

screen-shot-2016-10-19-at-5-35-26-am

 

Labs for Input into P5js:

While it was fun to do a creative input in P5js, the labs with the graph gave weird, horrible results. At first my potentiometer wasn't responding well in P5js: shaky number values, then too slow and not following. Then, weirdly, when the lab asked us to play with the delay in Arduino from 1 to 100, my potentiometer stabilized completely and I was finally able to work on the creative output in P5js.

However, though I checked the code over and over, the screenshot included in the lab for the graph didn't match my results at all. At first it gave a weird-looking pulse that didn't even match the potentiometer; after playing with the delay it followed the pot, but the visual of the graph stayed weird. I don't understand why…
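For reference, the graphing part of the lab boils down to something like this (assuming sensorValue holds the latest number read over serial, as in the earlier serial sketch):

```javascript
let xPos = 0;

function draw() {
  // map the incoming value to a line height and plot it at the current x position
  let h = map(sensorValue, 0, 1023, 0, height);
  stroke(0);
  line(xPos, height, xPos, height - h);
  xPos++;
  if (xPos > width) { // wrap around and clear when we hit the right edge
    xPos = 0;
    background(255);
  }
}
```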
Below, two videos (of many trials) trying to fix it.

Video of Pot going nuts

img_1780resized

Video of Pot going less nuts

img_1783-resied

 

 

Labs | Servo || Tone || Melody

Below are the labs for this week, Servo and Tone. Both made perfect sense when reworking and coding them. However, when it came to playing with Tone and creating a melody, although I understand the process, coding it is still an impossible mission. I eventually took a preset code from Arduino and went through it to learn it. I think it will take some time before I'm able to do it all from scratch.

 

1. Servo Lab

2. Tone Lab

3. Tone-Melody Lab

Observation- Paying at stores

I chose to talk about how we pay at stores because I think it's one of those recurring daily annoyances we endure, sometimes several times a day.

*"Swipe your card, Sir"
(man tries)
*"No, from the other side"
(man tries again)
*"No, flip it, Sir"
(man tries one more time)
*"Here, let me help you"
(she takes the card and swipes it)
*"Oh! It's a debit card, Sir?"
(man: yes…)
*"OK Sir, please insert it"
*"Don't remove it"
*"Now please enter your PIN"
*"Sir, please sign"

Does this scenario seem familiar to you? Most people face this hassle at cashiers, and most of the time the line piles up because this scenario plays out with almost everyone. Sometimes it's a bit shorter, but still: if you want to sign with your finger they give you the pen, and if you ask for the pen they tell you to sign with your finger. Sometimes it's a credit card but you have to insert it like a debit card, and sometimes it's a debit card but you can swipe it like a credit card… endless crazy guessing of when to swipe or insert, how to swipe, where to sign: digitally or on paper? With your finger or a pen? Not to forget how often the server slows down when it's very crowded and you have to wait for that too…
I went into different places: Walgreens, Macy's, TJ Maxx, grocery stores, etc. Small groceries and restaurants are actually hassle-free because they take your card and do it for you; you don't have to interact directly with the machine. Which makes me think that before the technology of having to swipe your card yourself, things were easier: you handed over your card, they ran it for you and had you sign a paper. Whereas now it's crazy every time we have to pay. The swiping machine looks very modern with its blue light indicators showing where to swipe, but is showing people where to swipe enough? Maybe the solution doesn't have to be on the machine itself; maybe a small indicator on the credit card. Maybe we need to move to simply paying with Android or Apple phones; it's probably easier, but we can't find the service everywhere, and weirdly in many big stores you can't pay with your phone. They probably enjoy the sight of people lining up to pay everywhere.
There must be a better solution for paying in 2016. With all these new technologies, we really need to figure out a better interaction, or simply less interaction.

Pcomp week 3 - Sensors

Blink and play:

 

 

Trying out the different labs: blinking LEDs, sensors, switches, etc.

Potentiometer and an RGB LED:

The two videos below show how I experimented with controlling an RGB LED with a potentiometer; after a certain number of hours I succeeded. However, I tried two different potentiometers and they gave different, incomplete results. The code is set to make the LED turn from red to green to blue. Weirdly (at least to me), the first potentiometer gave only red and green but no blue, and the second gave red, green, and violet but still not the full set of colors. No matter how much I played with the color values and numbers in the code, the potentiometers wouldn't cover the full range of colors.

Potentiometer and a Fan:

After working with the RGB LED, I got very excited to see how I could control a fan's speed with the potentiometer; a bit too excited maybe, because this failed miserably even after 4 hours. Below, video and pictures. I think the first circuit in the video is horribly wrong. In the last picture I guess I reached a better place, and at some point I thought it worked, but something wasn't right: I could clearly see the speed slowing down when I turned the potentiometer, but then, instead of speeding back up again, it just stopped.

img_1236 img_1237 img_1240img_1239

Pcomp Week Two Labs

img_1024 img_1025-1img_1048 img_1031

Below, a simple circuit in series and in parallel. Then I felt like experimenting with something organic: lemons, apples, leaves… (I was thinking about the switch). It was kinda fun to be able to light the LED with a lemon!

But for my switch I ended up making a puppet. A "sock" puppet. I like that it gives the feeling that when the puppet is talking, the LED is blinking on and off. I put the wires in what is supposedly "the mouth" of the puppet. Below, some reference.

 

Readings summary

The Art of Interactive Design, Crawford, ch. 1 and 2
A Brief Rant on the Future of Interaction Design (blog)

 

Class 1: summary of the readings.

Crawford defines interaction as an action-reaction system that usually involves two actors engaging with each other. To be able to talk about a real and rightful interaction, we need to explicitly discern three major components of this back-and-forth game happening between the two players, "listen, think and speak", a metaphor for what we call input, process, and output.
“we must use the terms listen, think, and speak metaphorically. If we want to get academic, I suppose we could replace listen, think, and speak with input, process, and output, but that’s so gauchely techie.”
Crawford also explains the different degrees of interactivity in design and the difference between interactivity designers and user interface designers.

Bret Victor encourages interaction designers and invites them to think and research beyond what seems trendy now, the finger manipulation of flat screens, which he calls "pictures under glass". Victor reminds people of the powerful tool they possess within themselves, the hands; he thinks the future resides in searching for interaction through this very smart and powerful tool, the hand (and eventually the whole body). He makes a valid point in showing readers how candid and silly it is to think interaction can be limited to a finger touch, whereas human nature calls for much more complex research: our need to manipulate with our hands will surface, and it cannot be constrained to a single touch.