# Video Tuneup

# Screenshot

![](https://github.com/bcjordan/Video-Tuneup/raw/master/screenshot.png)

# Generating music based on video
* use an existing mp3 and add audio ducking (potentially covered by iMovie)
* MIDI generation (algorithmically tough; maybe more of an April project or an entire thesis)
* very hard overall, with a high likelihood of bad-sounding music
  
* get a movie, then extract frames from the movie (see the sketch below)
  * derive a number of attribute variables from the video
  * (maybe just analyze still images?)
  * build music from those images
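
A minimal sketch of the frame-extraction step, assuming AVFoundation on iOS (Swift). The function name, file URL, and frame count are placeholders, not anything from this repo:

```swift
import AVFoundation
import UIKit

// Sketch: pull a handful of still frames out of a movie so they can be
// analyzed for image-based variables (color, fade-outs).
func extractFrames(from movieURL: URL, frameCount: Int = 10) -> [UIImage] {
    let asset = AVURLAsset(url: movieURL)
    let generator = AVAssetImageGenerator(asset: asset)
    generator.appliesPreferredTrackTransform = true

    let duration = CMTimeGetSeconds(asset.duration)
    var frames: [UIImage] = []
    for i in 0..<frameCount {
        // Sample evenly across the clip.
        let seconds = duration * Double(i) / Double(frameCount)
        let time = CMTime(seconds: seconds, preferredTimescale: 600)
        if let cgImage = try? generator.copyCGImage(at: time, actualTime: nil) {
            frames.append(UIImage(cgImage: cgImage))
        }
    }
    return frames
}
```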

# Options:
## Build the movie/image piece first, do the music later

* Use the EchoNest API to chop songs up into bars and let users mix and match them, creating a new remixed song ("Pick n' Mixer"). A bar-fetching sketch follows this list.
  * Pros: potentially fun to play with; musically oriented
  * Cons: similar to GarageBand? It might not split songs up automatically, though...
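
A rough sketch of the bar-chopping half of that idea, assuming the (now retired) EchoNest analysis format in which a track's analysis URL pointed at JSON containing a "bars" array of start/duration pairs. The field names and types here are assumptions, not a verified integration:

```swift
import Foundation

// Hypothetical structures mirroring the EchoNest analysis JSON:
// each bar has a start time and a duration, in seconds.
struct Bar: Decodable {
    let start: Double
    let duration: Double
}

struct Analysis: Decodable {
    let bars: [Bar]
}

// Fetch the analysis JSON and hand back the bar list.
func fetchBars(analysisURL: URL, completion: @escaping ([Bar]) -> Void) {
    URLSession.shared.dataTask(with: analysisURL) { data, _, _ in
        guard let data = data,
              let analysis = try? JSONDecoder().decode(Analysis.self, from: data)
        else { completion([]); return }
        completion(analysis.bars)
    }.resume()
}
```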

# EchoNest technique:

## Movie
1. User picks a video
2. User specifies a song from their library or a URL (or we offer a list of available remote songs)
3. (automatically) Make an API call to EchoNest and beat-match the song
4. (automatically or user-driven) Build an order of bars to fit the video length (longer or shorter)
5. (automatically) Duck the music volume based on the video's volume
6. Export back to the library

Bar layouts for three example songs (a bar-selection sketch in Swift follows below):

a. song bars:      1  2  3  [ … n-3 ]  n-2  n-1  n
   fit to length:  1  2  3  n-2  n-1  n  (correct length)
b. 1  2  3  4  5
c.

(interface)
movie -> image -> variables -> music
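
One reading of example (a), as a rough Swift sketch with hypothetical names (the real selection strategy is still open): keep bars from the start and end of the song and drop middle bars until the total is roughly the video length.

```swift
import Foundation

// Sketch of step 4: given per-bar durations, choose bar indices so the
// remix roughly matches the target video length, keeping the song's intro
// and outro and dropping bars from the middle (as in example a).
func fitBars(barDurations: [Double], toVideoLength target: Double) -> [Int] {
    var chosen: [Int] = []
    var total = 0.0
    var head = 0
    var tail = barDurations.count - 1
    var takeFromHead = true
    // Alternate taking a bar from the front and from the back.
    while total < target, head <= tail {
        let index = takeFromHead ? head : tail
        chosen.append(index)
        total += barDurations[index]
        if takeFromHead { head += 1 } else { tail -= 1 }
        takeFromHead.toggle()
    }
    return chosen.sorted()  // play the kept bars in their original order
}
```

Run against example (a), this produces the 1 2 3 … n-2 n-1 n shape: the front and back of the song survive, the middle is dropped.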

# Variables from music (via EchoNest):
* time signature
* key signature
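
A hedged sketch of pulling those two variables, assuming the retired EchoNest v4 track/profile endpoint with the audio_summary bucket. The endpoint, parameters, and field names follow the old docs as best remembered, and the API key and track ID are placeholders:

```swift
import Foundation

// Sketch: read time signature and key from an EchoNest track profile.
// Treat the endpoint and JSON shape as assumptions, not a working integration.
func fetchAudioSummary(apiKey: String, trackID: String,
                       completion: @escaping (_ timeSignature: Int?, _ key: Int?) -> Void) {
    var components = URLComponents(string: "http://developer.echonest.com/api/v4/track/profile")!
    components.queryItems = [
        URLQueryItem(name: "api_key", value: apiKey),
        URLQueryItem(name: "id", value: trackID),
        URLQueryItem(name: "bucket", value: "audio_summary"),
        URLQueryItem(name: "format", value: "json"),
    ]
    URLSession.shared.dataTask(with: components.url!) { data, _, _ in
        guard let data = data,
              let obj = try? JSONSerialization.jsonObject(with: data),
              let json = obj as? [String: Any],
              let response = json["response"] as? [String: Any],
              let track = response["track"] as? [String: Any],
              let summary = track["audio_summary"] as? [String: Any]
        else { completion(nil, nil); return }
        completion(summary["time_signature"] as? Int, summary["key"] as? Int)
    }.resume()
}
```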


# Variables from movie:
## Video import/headers
* length of film (MPMediaItemPropertyPlaybackDuration, an NSString * const from MediaPlayer)?
  - Tracks -> Time -> [1][value]
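
The same value can also be read straight off an AVAsset instead of the MPMediaItem property; a minimal sketch (the URL and function name are placeholders):

```swift
import AVFoundation

// Sketch: clip length in seconds, read from the asset's duration.
func movieDuration(of movieURL: URL) -> TimeInterval {
    let asset = AVURLAsset(url: movieURL)
    return CMTimeGetSeconds(asset.duration)
}
```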

## With image analysis
* fade-out
* color content
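
One cheap color-content variable per extracted frame could be its average color, computed with Core Image's CIAreaAverage filter; a run of frames whose average darkens toward black could then be treated as a fade-out. A sketch with hypothetical names:

```swift
import UIKit
import CoreImage

// Sketch: average color of a frame via CIAreaAverage, returned as RGB in 0...1.
func averageColor(of frame: UIImage) -> (red: CGFloat, green: CGFloat, blue: CGFloat)? {
    guard let input = CIImage(image: frame),
          let filter = CIFilter(name: "CIAreaAverage",
                                parameters: [kCIInputImageKey: input,
                                             kCIInputExtentKey: CIVector(cgRect: input.extent)]),
          let output = filter.outputImage
    else { return nil }

    // Render the single averaged pixel and read back its RGBA bytes.
    var pixel = [UInt8](repeating: 0, count: 4)
    CIContext().render(output, toBitmap: &pixel, rowBytes: 4,
                       bounds: CGRect(x: 0, y: 0, width: 1, height: 1),
                       format: .RGBA8, colorSpace: CGColorSpaceCreateDeviceRGB())
    return (CGFloat(pixel[0]) / 255, CGFloat(pixel[1]) / 255, CGFloat(pixel[2]) / 255)
}
```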

# Effects (output)
* duck audio
* pitch transpose
* switch clips
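
A sketch of the duck-audio effect, assuming AVFoundation's audio mix API; the loud time ranges would come from a separate analysis of the video's soundtrack, and all names here are placeholders:

```swift
import AVFoundation

// Sketch: ramp the music track's volume down during loud stretches of the
// video, using AVMutableAudioMix volume points.
func duckingMix(for musicTrack: AVAssetTrack,
                over loudRanges: [CMTimeRange],
                duckedVolume: Float = 0.2) -> AVAudioMix {
    let params = AVMutableAudioMixInputParameters(track: musicTrack)
    params.setVolume(1.0, at: .zero)
    for range in loudRanges {
        params.setVolume(duckedVolume, at: range.start)  // dip when the video gets loud
        params.setVolume(1.0, at: range.end)             // recover afterwards
    }
    let mix = AVMutableAudioMix()
    mix.inputParameters = [params]
    return mix
}
```

The resulting mix could then be attached to an AVPlayerItem for preview or to an AVAssetExportSession for the export step.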

## Video:
* shake orientation
* discolor
* slow down / speed up
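
A sketch of slow down / speed up via an AVMutableComposition: copy the source tracks into a composition and rescale the whole time range, where rate > 1 speeds the clip up and rate < 1 slows it down (names are placeholders, error handling omitted):

```swift
import AVFoundation

// Sketch: copy the source tracks into a composition, then rescale the
// whole time range to change playback speed.
func retimed(asset: AVAsset, rate: Double) throws -> AVMutableComposition {
    let composition = AVMutableComposition()
    let range = CMTimeRange(start: .zero, duration: asset.duration)

    for sourceTrack in asset.tracks {
        let track = composition.addMutableTrack(withMediaType: sourceTrack.mediaType,
                                                preferredTrackID: kCMPersistentTrackID_Invalid)
        try track?.insertTimeRange(range, of: sourceTrack, at: .zero)
    }
    let newDuration = CMTimeMultiplyByFloat64(asset.duration, multiplier: 1.0 / rate)
    composition.scaleTimeRange(range, toDuration: newDuration)
    return composition
}
```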


# Resources

## EchoNest / Remix / Chopping code
* Scissor - auto -  https://github.com/youpy/scissor

* Python remix examples (see reverse) https://github.com/echonest/remix

Maybe we can build a web-based version and API…


## iOS sample code

* AVMovieExporter - imports from the Asset/Media library, changes some metadata, and re-exports as a different file type - https://developer.apple.com/library/ios/#samplecode/AVMovieExporter/Introduction/Intro.html#//apple_ref/doc/uid/DTS40011364

* MoviePlayer - movie playback, playback controls, scaling and repeat - https://developer.apple.com/library/ios/#samplecode/MoviePlayer_iPhone/Introduction/Intro.html#//apple_ref/doc/uid/DTS40007798

* AVPlayer - play video from Camera roll - https://developer.apple.com/library/ios/#samplecode/AVPlayerDemo/Introduction/Intro.html#//apple_ref/doc/uid/DTS40010101

* aurioTouch - waveform display in OpenGL - https://developer.apple.com/library/ios/#samplecode/aurioTouch2/Introduction/Intro.html#//apple_ref/doc/uid/DTS40011369

* StopNGo - captures images from the live camera stream and re-exports them as a movie - https://developer.apple.com/library/ios/#samplecode/StopNGo/Introduction/Intro.html#//apple_ref/doc/uid/DTS40011123

## Pizzazz
* SquareCam - live camera, face detection and drawing - https://developer.apple.com/library/ios/#samplecode/SquareCam/Introduction/Intro.html#//apple_ref/doc/uid/DTS40011190


## AssetLibrary
* http://www.icodeblog.com/2010/07/08/asset-libraries-and-blocks-in-ios-4/
* http://www.youtube.com/watch?v=-5kAPVGYMf4

## Multipart NSMutableURLRequest
* http://stackoverflow.com/questions/8042360/nsdata-and-uploading-images-via-post-in-ios
* http://zcentric.com/2008/08/29/post-a-uiimage-to-the-web/
* http://stackoverflow.com/questions/10051150/multiform-data-send-to-server-using-iphone-sdk
* http://www.iphonedevsdk.com/forum/iphone-sdk-development/14919-how-upload-download-http-server.html
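
A minimal Swift sketch of the same multipart idea those links cover with NSMutableURLRequest; the server URL, field name, and filename here are placeholders:

```swift
import Foundation

// Sketch: build a multipart/form-data POST request for uploading a movie.
func multipartUpload(movieData: Data, to url: URL, fieldName: String = "file") -> URLRequest {
    let boundary = "Boundary-\(UUID().uuidString)"
    var request = URLRequest(url: url)
    request.httpMethod = "POST"
    request.setValue("multipart/form-data; boundary=\(boundary)",
                     forHTTPHeaderField: "Content-Type")

    var body = Data()
    body.append("--\(boundary)\r\n".data(using: .utf8)!)
    body.append("Content-Disposition: form-data; name=\"\(fieldName)\"; filename=\"movie.mov\"\r\n".data(using: .utf8)!)
    body.append("Content-Type: video/quicktime\r\n\r\n".data(using: .utf8)!)
    body.append(movieData)
    body.append("\r\n--\(boundary)--\r\n".data(using: .utf8)!)
    request.httpBody = body
    return request
}
```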