Dmitriy Stepanets / Material / Commits / 0f4f4ba0

Commit 0f4f4ba0 authored Sep 18, 2015 by Daniel Dahan

    removed commented out AV Capture library

parent 8b825830

Showing 2 changed files with 916 additions and 916 deletions:

    Source/Capture.swift         +656  -656
    Source/CapturePreview.swift  +260  -260

Source/Capture.swift (view file @ 0f4f4ba0)
//
// Copyright (C) 2015 GraphKit, Inc. <http://graphkit.io> and other GraphKit contributors.
//
// This program is free software: you can redistribute it and/or modify
// it under the terms of the GNU Affero General Public License as published
// by the Free Software Foundation, either version 3 of the License, or
// (at your option) any later version.
//
// This program is distributed in the hope that it will be useful,
// but WITHOUT ANY WARRANTY; without even the implied warranty of
// MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
// GNU Affero General Public License for more details.
//
// You should have received a copy of the GNU Affero General Public License
// along with this program located at the root of the software package
// in a file called LICENSE. If not, see <http://www.gnu.org/licenses/>.
//

import UIKit
import AVFoundation
import AssetsLibrary

@objc(CaptureDelegate)
public protocol CaptureDelegate {
	optional func captureDeviceConfigurationFailed(capture: Capture, error: NSError!)
	optional func captureMediaCaptureFailed(capture: Capture, error: NSError!)
	optional func captureAsetLibraryWriteFailed(capture: Capture, error: NSError!)
	optional func capture(capture: Capture, assetLibraryDidWrite image: UIImage!)
}

public class Capture: NSObject, AVCaptureFileOutputRecordingDelegate {
	//
	// :name: activeVideoInput
	// :description: The video input that is currently active.
	//
	private var activeVideoInput: AVCaptureDeviceInput?

	//
	// :name: imageOutput
	// :description: When the session is taking a photo, this is the output manager.
	//
	private lazy var imageOutput: AVCaptureStillImageOutput = AVCaptureStillImageOutput()

	//
	// :name: movieOutput
	// :description: When the session is shooting a video, this is the output manager.
	//
	private lazy var movieOutput: AVCaptureMovieFileOutput = AVCaptureMovieFileOutput()

	//
	// :name: movieOutputURL
	// :description: The output URL of the movie file.
	//
	private var movieOutputURL: NSURL?

	//
	// :name: queue
	// :description: Async job queue.
	//
	private lazy var queue: dispatch_queue_t = {
		return dispatch_queue_create("io.graphkit.Capture", nil)
	}()

	//
	// :name: CaptureAdjustingExposureContext
	// :description: Used for KVO observation context.
	//
	public var CaptureAdjustingExposureContext: NSString?

	/**
	* cameraCount
	* The number of available cameras on the device.
	*/
	public var cameraCount: Int {
		return AVCaptureDevice.devicesWithMediaType(AVMediaTypeVideo).count
	}

	/**
	* session
	* An AVCaptureSession that manages all inputs and outputs in that session.
	*/
	public lazy var session: AVCaptureSession = AVCaptureSession()

	/**
	* delegate
	* An optional instance of CaptureDelegate to handle events that are triggered during various
	* stages in the session.
	*/
	public weak var delegate: CaptureDelegate?

	/**
	* prepareSession
	* A helper method that prepares the session with the various available inputs and outputs.
	* @param preset: String, default: AVCaptureSessionPresetHigh
	* @return A boolean value, true if successful, false otherwise.
	*/
	public func prepareSession(preset: String = AVCaptureSessionPresetHigh) -> Bool {
		session.sessionPreset = preset

		// setup default camera device
		let videoDevice: AVCaptureDevice = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeVideo)
		let videoInput: AVCaptureDeviceInput? = try? AVCaptureDeviceInput(device: videoDevice)

		if nil == videoInput {
			return false
		}

		if session.canAddInput(videoInput) {
			session.addInput(videoInput)
			activeVideoInput = videoInput
		}

		let audioDevice: AVCaptureDevice = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeAudio)
		let audioInput: AVCaptureDeviceInput? = try? AVCaptureDeviceInput(device: audioDevice)

		if nil == audioInput {
			return false
		}

		if session.canAddInput(audioInput) {
			session.addInput(audioInput)
		}

		imageOutput.outputSettings = [AVVideoCodecKey: AVVideoCodecJPEG]

		if session.canAddOutput(imageOutput) {
			session.addOutput(imageOutput)
		}

		if session.canAddOutput(movieOutput) {
			session.addOutput(movieOutput)
		}

		return true
	}

	/**
	* startSession
	* Starts the capture session if it is not already running.
	*/
	public func startSession() {
		if !session.running {
			dispatch_async(queue) {
				self.session.startRunning()
			}
		}
	}

	/**
	* stopSession
	* Stops the capture session if it is already running.
	*/
	public func stopSession() {
		if session.running {
			dispatch_async(queue) {
				self.session.stopRunning()
			}
		}
	}

	/**
	* cameraWithPosition
	* @param position: AVCaptureDevicePosition
	* @return An AVCaptureDevice optional.
	*/
	public func cameraWithPosition(position: AVCaptureDevicePosition) -> AVCaptureDevice? {
		for device in AVCaptureDevice.devicesWithMediaType(AVMediaTypeVideo) {
			if position == device.position {
				return device as? AVCaptureDevice
			}
		}
		return nil
	}

	/**
	* activeCamera
	* @return The active camera's video input device.
	*/
	public var activeCamera: AVCaptureDevice {
		get {
			return activeVideoInput!.device
		}
	}

	/**
	* inactiveCamera
	* @return The inactive camera's video input device.
	*/
	public var inactiveCamera: AVCaptureDevice? {
		get {
			var device: AVCaptureDevice?
			if 1 < cameraCount {
				device = activeCamera.position == .Back ? cameraWithPosition(.Front) : cameraWithPosition(.Back)
			}
			return device
		}
	}

	/**
	* canSwitchCameras
	* Checks whether the camera can be switched. This would require at least two cameras.
	* @return A boolean of the result, true if yes, false otherwise.
	*/
	public var canSwitchCameras: Bool {
		return 1 < cameraCount
	}

	/**
	* switchCamera
	* If it is possible to switch cameras, then the camera will be switched to the opposite facing camera.
	* @return A boolean of the result, true if switched, false otherwise.
	* @delegate If the configuration fails, the capture(capture: Capture!, deviceConfigurationFailed error: NSError!) is called.
	*/
	public func switchCamera() -> Bool {
		if !canSwitchCameras {
			return false
		}

		let videoDevice: AVCaptureDevice? = inactiveCamera
		let videoInput: AVCaptureDeviceInput? = try? AVCaptureDeviceInput(device: videoDevice)

		if nil == videoInput {
			session.beginConfiguration()
			session.removeInput(activeVideoInput)

			if session.canAddInput(videoInput) {
				activeVideoInput = videoInput
			} else {
				session.addInput(activeVideoInput)
			}

			session.commitConfiguration()
		} else {
			delegate?.captureDeviceConfigurationFailed?(self, error: nil)
			return false
		}

		return true
	}

	/**
	* cameraHasFlash
	* Checks whether the camera supports flash.
	* @return A boolean of the result, true if yes, false otherwise.
	*/
	public var cameraHasFlash: Bool {
		return activeCamera.hasFlash
	}

	/**
	* flashMode
	* A mutator and accessor for the flashMode property.
	* @delegate If the configuration fails, the capture(capture: Capture!, deviceConfigurationFailed error: NSError!) is called.
	*/
	public var flashMode: AVCaptureFlashMode {
		get {
			return activeCamera.flashMode
		}
		set(value) {
			let device: AVCaptureDevice = activeCamera
			if flashMode != device.flashMode && device.isFlashModeSupported(flashMode) {
				var error: NSError?
				do {
					try device.lockForConfiguration()
					device.flashMode = flashMode
					device.unlockForConfiguration()
				} catch let error1 as NSError {
					error = error1
					delegate?.captureDeviceConfigurationFailed?(self, error: error)
				}
			}
		}
	}

	/**
	* cameraHasTorch
	* Checks whether the device supports the torch feature.
	* @return A boolean of the result, true if yes, false otherwise.
	*/
	public var cameraHasTorch: Bool {
		get {
			return activeCamera.hasTorch
		}
	}

	/**
	* torchMode
	* A mutator and accessor for the torchMode property.
	* @delegate If the configuration fails, the capture(capture: Capture!, deviceConfigurationFailed error: NSError!) is called.
	*/
	public var torchMode: AVCaptureTorchMode {
		get {
			return activeCamera.torchMode
		}
		set(value) {
			let device: AVCaptureDevice = activeCamera
			if torchMode != device.torchMode && device.isTorchModeSupported(torchMode) {
				var error: NSError?
				do {
					try device.lockForConfiguration()
					device.torchMode = torchMode
					device.unlockForConfiguration()
				} catch let error1 as NSError {
					error = error1
					delegate?.captureDeviceConfigurationFailed?(self, error: error)
				}
			}
		}
	}

	/**
	* cameraSupportsTapToFocus
	* Checks whether the device supports tap to focus.
	* @return A boolean of the result, true if yes, false otherwise.
	*/
	public var cameraSupportsTapToFocus: Bool {
		get {
			return activeCamera.focusPointOfInterestSupported
		}
	}

	/**
	* focusAtPoint
	* Sets the point to focus at on the screen.
	* @param point: CGPoint
	* @delegate If the configuration fails, the capture(capture: Capture!, deviceConfigurationFailed error: NSError!) is called.
	*/
	public func focusAtPoint(point: CGPoint) {
		let device: AVCaptureDevice = activeCamera
		if device.focusPointOfInterestSupported && device.isFocusModeSupported(.AutoFocus) {
			var error: NSError?
			do {
				try device.lockForConfiguration()
				device.focusPointOfInterest = point
				device.focusMode = .AutoFocus
				device.unlockForConfiguration()
			} catch let error1 as NSError {
				error = error1
				delegate?.captureDeviceConfigurationFailed?(self, error: error)
			}
		}
	}

	/**
	* cameraSupportsTapToExpose
	* Checks whether the device supports tap to expose.
	* @return A boolean of the result, true if yes, false otherwise.
	*/
	public var cameraSupportsTapToExpose: Bool {
		get {
			return activeCamera.exposurePointOfInterestSupported
		}
	}

	/**
	* exposeAtPoint
	* Sets a point for exposure.
	* @delegate If the configuration fails, the capture(capture: Capture!, deviceConfigurationFailed error: NSError!) is called.
	*/
	public func exposeAtPoint(point: CGPoint) {
		let device: AVCaptureDevice = activeCamera
		let exposureMode: AVCaptureExposureMode = .ContinuousAutoExposure

		if device.exposurePointOfInterestSupported && device.isExposureModeSupported(exposureMode) {
			var error: NSError?
			do {
				try device.lockForConfiguration()
				device.exposurePointOfInterest = point
				device.exposureMode = exposureMode

				if device.isExposureModeSupported(.Locked) {
					device.addObserver(self, forKeyPath: "adjustingExposure", options: .New, context: &CaptureAdjustingExposureContext)
				}
				device.unlockForConfiguration()
			} catch let error1 as NSError {
				error = error1
				delegate?.captureDeviceConfigurationFailed?(self, error: error)
			}
		}
	}

	/**
	* override to set observeValueForKeyPath and handle exposure observance.
	* @delegate If the configuration fails, the capture(capture: Capture!, deviceConfigurationFailed error: NSError!) is called.
	*/
	override public func observeValueForKeyPath(keyPath: String?, ofObject object: AnyObject?, change: [String : AnyObject]?, context: UnsafeMutablePointer<Void>) {
		if context == &CaptureAdjustingExposureContext {
			let device: AVCaptureDevice = object as! AVCaptureDevice

			if device.adjustingExposure && device.isExposureModeSupported(.Locked) {
				object!.removeObserver(self, forKeyPath: "adjustingExposure", context: &CaptureAdjustingExposureContext)
				dispatch_async(queue) {
					var error: NSError?
					do {
						try device.lockForConfiguration()
						device.unlockForConfiguration()
					} catch let e as NSError {
						error = e
						self.delegate?.captureDeviceConfigurationFailed?(self, error: error)
					} catch {
						fatalError()
					}
				}
			}
		} else {
			super.observeValueForKeyPath(keyPath, ofObject: object, change: change, context: context)
		}
	}

	/**
	* resetFocusAndExposureModes
	* Resets to default configuration for device focus and exposure mode.
	* @delegate If the configuration fails, the capture(capture: Capture!, deviceConfigurationFailed error: NSError!) is called.
	*/
	public func resetFocusAndExposureModes() {
		let device: AVCaptureDevice = activeCamera

		let exposureMode: AVCaptureExposureMode = .ContinuousAutoExposure
		let canResetExposure: Bool = device.focusPointOfInterestSupported && device.isExposureModeSupported(exposureMode)

		let focusMode: AVCaptureFocusMode = .ContinuousAutoFocus
		let canResetFocus: Bool = device.focusPointOfInterestSupported && device.isFocusModeSupported(focusMode)

		let centerPoint: CGPoint = CGPointMake(0.5, 0.5)

		var error: NSError?
		do {
			try device.lockForConfiguration()
			if canResetFocus {
				device.focusMode = focusMode
				device.focusPointOfInterest = centerPoint
			}
			if canResetExposure {
				device.exposureMode = exposureMode
				device.exposurePointOfInterest = centerPoint
			}
			device.unlockForConfiguration()
		} catch let error1 as NSError {
			error = error1
			delegate?.captureDeviceConfigurationFailed?(self, error: error)
		}
	}

	/**
	* captureStillImage
	* Captures the image and writes the photo to the user's asset library.
	* @delegate If successful, the capture(capture: Capture!, assetLibraryDidWrite image: UIImage!) is called.
	* @delegate If it fails, capture(capture: Capture!, assetLibraryWriteFailed error: NSError!) is called.
	*/
	public func captureStillImage() {
		let connection: AVCaptureConnection = imageOutput.connectionWithMediaType(AVMediaTypeVideo)
		if connection.supportsVideoOrientation {
			connection.videoOrientation = currentVideoOrientation
		}
		imageOutput.captureStillImageAsynchronouslyFromConnection(connection) { (sampleBuffer: CMSampleBufferRef?, error: NSError?) in
			if nil == sampleBuffer {
				self.delegate?.captureAsetLibraryWriteFailed?(self, error: error)
			} else {
				let imageData: NSData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(sampleBuffer)
				let image: UIImage = UIImage(data: imageData)!
				self.writeImageToAssetsLibrary(image)
			}
		}
	}

	/**
	* isRecording
	* Checks whether the device is currently recording.
	* @return A boolean of the result, true if yes, false otherwise.
	*/
	public var isRecording: Bool {
		get {
			return movieOutput.recording
		}
	}

	/**
	* startRecording
	* If the device is not currently recording, this starts the movie recording.
	* @delegate If the configuration fails, the capture(capture: Capture!, deviceConfigurationFailed error: NSError!) is called.
	*/
	public func startRecording() {
		if !isRecording {
			let connection: AVCaptureConnection = movieOutput.connectionWithMediaType(AVMediaTypeVideo)
			if connection.supportsVideoOrientation {
				connection.videoOrientation = currentVideoOrientation
			}
			if connection.supportsVideoStabilization {
				connection.preferredVideoStabilizationMode = .Auto
			}

			let device: AVCaptureDevice = activeCamera

			if device.smoothAutoFocusSupported {
				var error: NSError?
				do {
					try device.lockForConfiguration()
					device.smoothAutoFocusEnabled = false
					device.unlockForConfiguration()
				} catch let error1 as NSError {
					error = error1
					delegate?.captureDeviceConfigurationFailed?(self, error: error)
				}
			}
			movieOutputURL = uniqueURL
			movieOutput.startRecordingToOutputFileURL(movieOutputURL, recordingDelegate: self)
		}
	}

	/**
	* stopRecording
	* If the device is currently recording, this stops the movie recording.
	*/
	public func stopRecording() {
		if isRecording {
			movieOutput.stopRecording()
		}
	}

	/**
	* recordedDuration
	* Retrieves the movie recorded duration.
	* @return A CMTime value.
	*/
	public var recordedDuration: CMTime {
		get {
			return movieOutput.recordedDuration
		}
	}

	/**
	* currentVideoOrientation
	* Retrieves the current orientation of the device.
	* @return A AVCaptureVideoOrientation value, [Portrait, LandscapeLeft, PortraitUpsideDown, LandscapeRight].
	*/
	public var currentVideoOrientation: AVCaptureVideoOrientation {
		var orientation: AVCaptureVideoOrientation?
		switch UIDevice.currentDevice().orientation {
		case .Portrait:
			orientation = .Portrait
			break
		case .LandscapeRight:
			orientation = .LandscapeLeft
			break
		case .PortraitUpsideDown:
			orientation = .PortraitUpsideDown
			break
		default:
			orientation = .LandscapeRight
		}
		return orientation!
	}

	/**
	* uniqueURL
	* A unique URL generated for the movie video.
	* @return An optional NSURL value.
	*/
	private var uniqueURL: NSURL? {
		let fileManager: NSFileManager = NSFileManager.defaultManager()
		let tempDirectoryTemplate: String = (NSTemporaryDirectory() as NSString).stringByAppendingPathComponent("FocusLibrary")
// try fileManager.createDirectoryAtPath(tempDirectoryTemplate, withIntermediateDirectories: true, attributes: nil)
orientation
=
.
PortraitUpsideDown
// return NSURL.fileURLWithPath(tempDirectoryTemplate + "/test.mov")
break
// } catch {}
default
:
// return nil
orientation
=
.
LandscapeRight
// }
}
//
return
orientation
!
// /**
}
// * postAssetLibraryNotification
// * Fires an asynchronous call to the capture(capture: Capture!, assetLibraryDidWrite image: UIImage!) delegate.
/**
// * @param image: UIImage!
* uniqueURL
// * @delegate An asynchronous call to capture(capture: Capture!, assetLibraryDidWrite image: UIImage!) delegate.
* A unique URL generated for the movie video.
// */
* @return An optional NSURL value.
// private func postAssetLibraryNotification(image: UIImage!) {
*/
// dispatch_async(queue) {
private
var
uniqueURL
:
NSURL
?
{
// self.delegate?.capture?(self, assetLibraryDidWrite: image)
let
fileManager
:
NSFileManager
=
NSFileManager
.
defaultManager
()
// }
let
tempDirectoryTemplate
:
String
=
(
NSTemporaryDirectory
()
as
NSString
)
.
stringByAppendingPathComponent
(
"FocusLibrary"
)
// }
do
{
//
try
fileManager
.
createDirectoryAtPath
(
tempDirectoryTemplate
,
withIntermediateDirectories
:
true
,
attributes
:
nil
)
// /**
return
NSURL
.
fileURLWithPath
(
tempDirectoryTemplate
+
"/test.mov"
)
// * writeImageToAssetsLibrary
}
catch
{}
// * Writes the image file to the user's asset library.
return
nil
// * @param image: UIImage!
}
// * @delegate If successful, an asynchronous call to capture(capture: Capture!, assetLibraryDidWrite image: UIImage!) delegate.
// * @delegate If failure, capture(capture: Capture!, assetLibraryWriteFailed error: NSError!) is called.
/**
// */
* postAssetLibraryNotification
// private func writeImageToAssetsLibrary(image: UIImage) {
* Fires an asynchronous call to the capture(capture: Capture!, assetLibraryDidWrite image: UIImage!) delegate.
// let library: ALAssetsLibrary = ALAssetsLibrary()
* @param image: UIImage!
// library.writeImageToSavedPhotosAlbum(image.CGImage, orientation: ALAssetOrientation(rawValue: image.imageOrientation.rawValue)!) { (path: NSURL!, error: NSError?) -> Void in
* @delegate An asynchronous call to capture(capture: Capture!, assetLibraryDidWrite image: UIImage!) delegate.
// if nil == error {
*/
// self.postAssetLibraryNotification(image)
private
func
postAssetLibraryNotification
(
image
:
UIImage
!
)
{
// } else {
dispatch_async
(
queue
)
{
// self.delegate?.captureAsetLibraryWriteFailed?(self, error: error)
self
.
delegate
?
.
capture
?(
self
,
assetLibraryDidWrite
:
image
)
// }
}
// }
}
// }
//
/**
// /**
* writeImageToAssetsLibrary
// * writeVideoToAssetsLibrary
* Writes the image file to the user's asset library.
// * Writes the video file to the user's asset library.
* @param image: UIImage!
// * @param videoURL: NSURL!
* @delegate If successful, an asynchronous call to capture(capture: Capture!, assetLibraryDidWrite image: UIImage!) delegate.
// * @delegate If successful, an asynchronous call to capture(capture: Capture!, assetLibraryDidWrite image: UIImage!) delegate.
* @delegate If failure, capture(capture: Capture!, assetLibraryWriteFailed error: NSError!) is called.
// * @delegate If failure, capture(capture: Capture!, assetLibraryWriteFailed error: NSError!) is called.
*/
// */
private
func
writeImageToAssetsLibrary
(
image
:
UIImage
)
{
// private func writeVideoToAssetsLibrary(videoURL: NSURL!) {
let
library
:
ALAssetsLibrary
=
ALAssetsLibrary
()
// let library: ALAssetsLibrary = ALAssetsLibrary()
library
.
writeImageToSavedPhotosAlbum
(
image
.
CGImage
,
orientation
:
ALAssetOrientation
(
rawValue
:
image
.
imageOrientation
.
rawValue
)
!
)
{
(
path
:
NSURL
!
,
error
:
NSError
?)
->
Void
in
// if library.videoAtPathIsCompatibleWithSavedPhotosAlbum(videoURL) {
if
nil
==
error
{
// library.writeVideoAtPathToSavedPhotosAlbum(videoURL) { (path: NSURL!, error: NSError?) in
self
.
postAssetLibraryNotification
(
image
)
// if nil == error {
}
else
{
// self.generateThumbnailForVideoAtURL(videoURL)
self
.
delegate
?
.
captureAsetLibraryWriteFailed
?(
self
,
error
:
error
)
// } else {
}
// self.delegate?.captureAsetLibraryWriteFailed?(self, error: error)
}
// }
}
// }
// }
/**
// }
* writeVideoToAssetsLibrary
//
* Writes the video file to the user's asset library.
// /**
* @param videoURL: NSURL!
// * generateThumbnailForVideoAtURL
* @delegate If successful, an asynchronous call to capture(capture: Capture!, assetLibraryDidWrite image: UIImage!) delegate.
// * Generates a thumbnail for the video URL specified.
* @delegate If failure, capture(capture: Capture!, assetLibraryWriteFailed error: NSError!) is called.
// * @param videoURL: NSURL!
*/
// * @delegate An asynchronous call to capture(capture: Capture!, assetLibraryDidWrite image: UIImage!) delegate.
private
func
writeVideoToAssetsLibrary
(
videoURL
:
NSURL
!
)
{
// */
let
library
:
ALAssetsLibrary
=
ALAssetsLibrary
()
// private func generateThumbnailForVideoAtURL(videoURL: NSURL!) {
if
library
.
videoAtPathIsCompatibleWithSavedPhotosAlbum
(
videoURL
)
{
// dispatch_async(queue) {
library
.
writeVideoAtPathToSavedPhotosAlbum
(
videoURL
)
{
(
path
:
NSURL
!
,
error
:
NSError
?)
in
// do {
if
nil
==
error
{
// let asset: AVAsset = AVAsset(URL: videoURL)
self
.
generateThumbnailForVideoAtURL
(
videoURL
)
// let imageGenerator: AVAssetImageGenerator = AVAssetImageGenerator(asset: asset)
}
else
{
// imageGenerator.maximumSize = CGSizeMake(100, 0)
self
.
delegate
?
.
captureAsetLibraryWriteFailed
?(
self
,
error
:
error
)
// imageGenerator.appliesPreferredTrackTransform = true
}
//
}
// let imageRef: CGImageRef = try imageGenerator.copyCGImageAtTime(kCMTimeZero, actualTime: nil)
}
// let image: UIImage = UIImage(CGImage: imageRef)
}
//
// dispatch_async(dispatch_get_main_queue()) {
/**
// self.postAssetLibraryNotification(image)
* generateThumbnailForVideoAtURL
// }
* Generates a thumbnail for the video URL specified.
// } catch {}
* @param videoURL: NSURL!
// }
* @delegate An asynchronous call to capture(capture: Capture!, assetLibraryDidWrite image: UIImage!) delegate.
// }
*/
//
private
func
generateThumbnailForVideoAtURL
(
videoURL
:
NSURL
!
)
{
// /**
dispatch_async
(
queue
)
{
// * delegate method for capturing video file.
do
{
// * @delegate If successful, an asynchronous call to capture(capture: Capture!, assetLibraryDidWrite image: UIImage!) delegate.
let
asset
:
AVAsset
=
AVAsset
(
URL
:
videoURL
)
// * @delegate If failure, capture(capture: Capture!, mediaCaptureFailed error: NSError!) is called.
let
imageGenerator
:
AVAssetImageGenerator
=
AVAssetImageGenerator
(
asset
:
asset
)
// */
imageGenerator
.
maximumSize
=
CGSizeMake
(
100
,
0
)
// public func captureOutput(captureOutput: AVCaptureFileOutput!, didFinishRecordingToOutputFileAtURL outputFileURL: NSURL!, fromConnections connections: [AnyObject]!, error: NSError!) {
imageGenerator
.
appliesPreferredTrackTransform
=
true
// if nil == error {
// writeVideoToAssetsLibrary(movieOutputURL!.copy() as! NSURL)
let
imageRef
:
CGImageRef
=
try
imageGenerator
.
copyCGImageAtTime
(
kCMTimeZero
,
actualTime
:
nil
)
// } else {
let
image
:
UIImage
=
UIImage
(
CGImage
:
imageRef
)
// delegate?.captureMediaCaptureFailed?(self, error: error)
// }
dispatch_async
(
dispatch_get_main_queue
())
{
// movieOutputURL = nil
self
.
postAssetLibraryNotification
(
image
)
// }
}
//}
}
catch
{}
\ No newline at end of file
}
}
/**
* delegate method for capturing video file.
* @delegate If successful, an asynchronous call to capture(capture: Capture!, assetLibraryDidWrite image: UIImage!) delegate.
* @delegate If failure, capture(capture: Capture!, mediaCaptureFailed error: NSError!) is called.
*/
public
func
captureOutput
(
captureOutput
:
AVCaptureFileOutput
!
,
didFinishRecordingToOutputFileAtURL
outputFileURL
:
NSURL
!
,
fromConnections
connections
:
[
AnyObject
]
!
,
error
:
NSError
!
)
{
if
nil
==
error
{
writeVideoToAssetsLibrary
(
movieOutputURL
!.
copy
()
as!
NSURL
)
}
else
{
delegate
?
.
captureMediaCaptureFailed
?(
self
,
error
:
error
)
}
movieOutputURL
=
nil
}
}
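The recording API above follows a simple start/stop flow. A minimal usage sketch, assuming a configured `Capture` instance (the `Capture` initializer, session setup, and delegate wiring are outside this excerpt, so the `capture` value below is illustrative):

```swift
// Hypothetical caller; assumes Capture's session is already configured and running.
let capture: Capture = Capture()

// Begin recording; startRecording() itself is a no-op while already recording.
if !capture.isRecording {
	capture.startRecording()
}

// ... later, stop and read back the captured duration.
capture.stopRecording()
let duration: CMTime = capture.recordedDuration
```

Success and failure are reported through the optional `CaptureDelegate` callbacks shown in the doc comments rather than return values.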
Source/CapturePreview.swift
View file @
0f4f4ba0
//
// Copyright (C) 2015 GraphKit, Inc. <http://graphkit.io> and other GraphKit contributors.
//
// This program is free software: you can redistribute it and/or modify
// it under the terms of the GNU Affero General Public License as published
// by the Free Software Foundation, either version 3 of the License, or
// (at your option) any later version.
//
// This program is distributed in the hope that it will be useful,
// but WITHOUT ANY WARRANTY; without even the implied warranty of
// MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
// GNU Affero General Public License for more details.
//
// You should have received a copy of the GNU Affero General Public License
// along with this program located at the root of the software package
// in a file called LICENSE. If not, see <http://www.gnu.org/licenses/>.
//

import UIKit
import AVFoundation

@objc(PreviewDelegate)
public protocol PreviewDelegate {
	optional func previewTappedToFocusAt(preview: Preview, point: CGPoint)
	optional func previewTappedToExposeAt(preview: Preview, point: CGPoint)
	optional func previewTappedToReset(preview: Preview, focus: UIView, exposure: UIView)
}

public class Preview: UIView {
	/**
	:name: boxBounds
	:description: A static property that sets the initial size of the focusBox and exposureBox properties.
	*/
	static public var boxBounds: CGRect = CGRectMake(0, 0, 150, 150)
	
	/**
	:name: delegate
	:description: An optional instance of PreviewDelegate to handle events that are triggered during various
	stages of engagement.
	*/
	public weak var delegate: PreviewDelegate?
	
	/**
	:name: tapToFocusEnabled
	:description: A mutator and accessor that enables and disables the tap to focus gesture.
	*/
	public var tapToFocusEnabled: Bool {
		get {
			return singleTapRecognizer!.enabled
		}
		set(value) {
			singleTapRecognizer!.enabled = value
		}
	}
	
	/**
	:name: tapToExposeEnabled
	:description: A mutator and accessor that enables and disables the tap to expose gesture.
	*/
	public var tapToExposeEnabled: Bool {
		get {
			return doubleTapRecognizer!.enabled
		}
		set(value) {
			doubleTapRecognizer!.enabled = value
		}
	}
	
	//
	//	override for layerClass
	//
	override public class func layerClass() -> AnyClass {
		return AVCaptureVideoPreviewLayer.self
	}
	
	/**
	:name: session
	:description: A mutator and accessor for the preview AVCaptureSession value.
	*/
	public var session: AVCaptureSession {
		get {
			return (layer as! AVCaptureVideoPreviewLayer).session
		}
		set(value) {
			(layer as! AVCaptureVideoPreviewLayer).session = value
		}
	}
	
	/**
	:name: focusBox
	:description: An optional UIView for the focusBox animation. This is used when the
	tapToFocusEnabled property is set to true.
	*/
	public var focusBox: UIView?
	
	/**
	:name: exposureBox
	:description: An optional UIView for the exposureBox animation. This is used when the
	tapToExposeEnabled property is set to true.
	*/
	public var exposureBox: UIView?
	
	//
	//	:name: singleTapRecognizer
	//	:description: Gesture recognizer for single tap.
	//
	private var singleTapRecognizer: UITapGestureRecognizer?
	
	//
	//	:name: doubleTapRecognizer
	//	:description: Gesture recognizer for double tap.
	//
	private var doubleTapRecognizer: UITapGestureRecognizer?
	
	//
	//	:name: doubleDoubleTapRecognizer
	//	:description: Gesture recognizer for double/double tap.
	//
	private var doubleDoubleTapRecognizer: UITapGestureRecognizer?
	
	required public init?(coder aDecoder: NSCoder) {
		super.init(coder: aDecoder)
		prepareView()
	}
	
	public override init(frame: CGRect) {
		super.init(frame: frame)
		prepareView()
	}
	
	public init() {
		super.init(frame: CGRectZero)
		translatesAutoresizingMaskIntoConstraints = false
		prepareView()
	}
	
	//
	//	:name: handleSingleTap
	//
	internal func handleSingleTap(recognizer: UIGestureRecognizer) {
		let point: CGPoint = recognizer.locationInView(self)
		runBoxAnimationOnView(focusBox, point: point)
		delegate?.previewTappedToFocusAt?(self, point: captureDevicePointForPoint(point))
	}
	
	//
	//	:name: handleDoubleTap
	//
	internal func handleDoubleTap(recognizer: UIGestureRecognizer) {
		let point: CGPoint = recognizer.locationInView(self)
		runBoxAnimationOnView(exposureBox, point: point)
		delegate?.previewTappedToExposeAt?(self, point: captureDevicePointForPoint(point))
	}
	
	//
	//	:name: handleDoubleDoubleTap
	//
	internal func handleDoubleDoubleTap(recognizer: UIGestureRecognizer) {
		runResetAnimation()
	}
	
	//
	//	:name: prepareView
	//	:description: Common setup for view.
	//
	private func prepareView() {
		let captureLayer: AVCaptureVideoPreviewLayer = layer as! AVCaptureVideoPreviewLayer
		captureLayer.videoGravity = AVLayerVideoGravityResizeAspectFill
		
		singleTapRecognizer = UITapGestureRecognizer(target: self, action: "handleSingleTap:")
		singleTapRecognizer!.numberOfTapsRequired = 1
		
		doubleTapRecognizer = UITapGestureRecognizer(target: self, action: "handleDoubleTap:")
		doubleTapRecognizer!.numberOfTapsRequired = 2
		
		doubleDoubleTapRecognizer = UITapGestureRecognizer(target: self, action: "handleDoubleDoubleTap:")
		doubleDoubleTapRecognizer!.numberOfTapsRequired = 2
		doubleDoubleTapRecognizer!.numberOfTouchesRequired = 2
		
		addGestureRecognizer(singleTapRecognizer!)
		addGestureRecognizer(doubleTapRecognizer!)
		addGestureRecognizer(doubleDoubleTapRecognizer!)
		singleTapRecognizer!.requireGestureRecognizerToFail(doubleTapRecognizer!)
		
		focusBox = viewWithColor(.redColor())
		exposureBox = viewWithColor(.blueColor())
		addSubview(focusBox!)
		addSubview(exposureBox!)
	}
	
	//
	//	:name: viewWithColor
	//	:description: Initializes a UIView with a set UIColor.
	//
	private func viewWithColor(color: UIColor) -> UIView {
		let view: UIView = UIView(frame: Preview.boxBounds)
		view.backgroundColor = MaterialTheme.clear.color
		view.layer.borderColor = color.CGColor
		view.layer.borderWidth = 5
		view.hidden = true
		return view
	}
	
	//
	//	:name: runBoxAnimationOnView
	//	:description: Runs the animation used for focusBox and exposureBox on single and double
	//	taps respectively at a given point.
	//
	private func runBoxAnimationOnView(view: UIView!, point: CGPoint) {
		view.center = point
		view.hidden = false
		UIView.animateWithDuration(0.15, delay: 0, options: .CurveEaseInOut, animations: { _ in
			view.layer.transform = CATransform3DMakeScale(0.5, 0.5, 1)
		}) { _ in
			let delayInSeconds: Double = 0.5
			let popTime: dispatch_time_t = dispatch_time(DISPATCH_TIME_NOW, Int64(delayInSeconds * Double(NSEC_PER_SEC)))
			dispatch_after(popTime, dispatch_get_main_queue()) {
				view.hidden = true
				view.transform = CGAffineTransformIdentity
			}
		}
	}
	
	//
	//	:name: captureDevicePointForPoint
	//	:description: Interprets the correct point from touch to preview layer.
	//
	private func captureDevicePointForPoint(point: CGPoint) -> CGPoint {
		let previewLayer: AVCaptureVideoPreviewLayer = layer as! AVCaptureVideoPreviewLayer
		return previewLayer.captureDevicePointOfInterestForPoint(point)
	}
	
	//
	//	:name: runResetAnimation
	//	:description: Executes the reset animation for focus and exposure.
	//
	private func runResetAnimation() {
		if !tapToFocusEnabled && !tapToExposeEnabled {
			return
		}
		let previewLayer: AVCaptureVideoPreviewLayer = layer as! AVCaptureVideoPreviewLayer
		let centerPoint: CGPoint = previewLayer.pointForCaptureDevicePointOfInterest(CGPointMake(0.5, 0.5))
		focusBox!.center = centerPoint
		exposureBox!.center = centerPoint
		exposureBox!.transform = CGAffineTransformMakeScale(1.2, 1.2)
		focusBox!.hidden = false
		exposureBox!.hidden = false
		
		UIView.animateWithDuration(0.15, delay: 0, options: .CurveEaseInOut, animations: { _ in
			self.focusBox!.layer.transform = CATransform3DMakeScale(0.5, 0.5, 1)
			self.exposureBox!.layer.transform = CATransform3DMakeScale(0.7, 0.7, 1)
		}) { _ in
			let delayInSeconds: Double = 0.5
			let popTime: dispatch_time_t = dispatch_time(DISPATCH_TIME_NOW, Int64(delayInSeconds * Double(NSEC_PER_SEC)))
			dispatch_after(popTime, dispatch_get_main_queue()) {
				self.focusBox!.hidden = true
				self.exposureBox!.hidden = true
				self.focusBox!.transform = CGAffineTransformIdentity
				self.exposureBox!.transform = CGAffineTransformIdentity
				self.delegate?.previewTappedToReset?(self, focus: self.focusBox!, exposure: self.exposureBox!)
			}
		}
	}
}
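A minimal sketch of wiring `Preview` into a view controller, assuming an already configured and running `AVCaptureSession` (the `captureSession` value and the delegate conformance below are illustrative, not part of this file):

```swift
// Hypothetical wiring inside a UIViewController that adopts PreviewDelegate.
let preview: Preview = Preview()
preview.session = captureSession // an AVCaptureSession configured elsewhere
preview.tapToFocusEnabled = true
preview.tapToExposeEnabled = true
preview.delegate = self // receives previewTappedToFocusAt / previewTappedToExposeAt / previewTappedToReset
view.addSubview(preview)
```

Single tap drives the focus box, double tap the exposure box, and a two-finger double tap triggers the reset animation, matching the gesture recognizers set up in `prepareView()`.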