Accessing Hikvision Cameras from iOS

A few days ago, the company required that Hikvision surveillance video be viewable on the phone. After searching online for a long time without finding a ready-to-use demo, a Hikvision engineer provided one; this post summarizes the approach. If you need the demo, you can contact me.

1. Import the three files shown in the figure

  • VideoPlaySDK plays a stream address (usually an RTSP URL) through the streaming-media service or a MAG stream, and implements playback: live preview and remote playback.
  • VMSNetSDK provides the interfaces for querying platform resources, such as device information and region information.

2. Add the required libraries

  • When using libVMSNetSDK.a from VMSNetSDK, you need to add SystemConfiguration.framework.
    If compilation fails with “'libxml/tree.h' file not found”, add /usr/include/libxml2/** to Header Search Paths in Build Settings.
  • When using VideoPlaySDK, you need to add AVFoundation.framework.

3. Using the SDKs in a project

  • Step 1: In application:didFinishLaunchingWithOptions:, initialize both SDKs:

        InitLib();      // initialize VMSNetSDK
        VP_InitSDK();   // initialize the player library
  • Step 2: Log in to the platform.
    2.1 Obtain the login information (mspInfo receives the session info and is required for all later calls):

        [vmsNetSDK login:_serverAddressTextField.text   // platform server address
              toUserName:userName                        // platform account
              toPassword:password                        // account password
                toLineID:lineID                          // ID of an available line
           passwordLevel:3     // password strength, judged by the developer; 0, 1, 2, 3 in increasing strength
              toServInfo:mspInfo];

    2.2 Obtain the line list (which line to use depends on the network environment, e.g. the test network):

        _lineList = [NSMutableArray array];
        [vmsNetSDK getLineList:_serverAddressTextField.text   // server address
                toLineInfoList:_lineList];
  • Step 3: Obtain all resources at the current level of the resource tree. The current object determines whether to fetch resources under a control unit or under a region:

        - (NSMutableArray *)getAllResources {
            VMSNetSDK *vmsNetSDK = [VMSNetSDK shareInstance];
            _allResorceList = [NSMutableArray array];
            NSMutableArray *tempArray = [NSMutableArray array];
            // Decide whether to fetch resources under a control unit or under a region
            if (nil == _regionInfo) {
                if (nil == _controlUnitInfo) {
                    // Root level: fetch the top-level control units
                    [vmsNetSDK getControlUnitList:_serverAddress
                                      toSessionID:_mspInfo.sessionID
                                  toControlUnitID:0
                                     toNumPerOnce:50
                                        toCurPage:1
                                toControlUnitList:tempArray];
                    [_allResorceList addObjectsFromArray:tempArray];
                    [tempArray removeAllObjects];
                } else {
                    // Control units under the current control unit
                    [vmsNetSDK getControlUnitList:_serverAddress
                                      toSessionID:_mspInfo.sessionID
                                  toControlUnitID:_controlUnitInfo.controlUnitID
                                     toNumPerOnce:50
                                        toCurPage:1
                                toControlUnitList:tempArray];
                    [_allResorceList addObjectsFromArray:tempArray];
                    [tempArray removeAllObjects];
                    // Regions under the current control unit
                    [vmsNetSDK getRegionListFromCtrlUnit:_serverAddress
                                             toSessionID:_mspInfo.sessionID
                                         toControlUnitID:_controlUnitInfo.controlUnitID
                                            toNumPerOnce:50
                                               toCurPage:1
                                            toRegionList:tempArray];
                    [_allResorceList addObjectsFromArray:tempArray];
                    [tempArray removeAllObjects];
                    // Cameras (devices) under the current control unit
                    [vmsNetSDK getCameraListFromCtrlUnit:_serverAddress
                                             toSessionID:_mspInfo.sessionID
                                         toControlUnitID:_controlUnitInfo.controlUnitID
                                            toNumPerOnce:50
                                               toCurPage:1
                                            toCameraList:tempArray];
                    [_allResorceList addObjectsFromArray:tempArray];
                    [tempArray removeAllObjects];
                }
            } else {
                // Sub-regions under the current region
                [vmsNetSDK getRegionListFromRegion:_serverAddress
                                       toSessionID:_mspInfo.sessionID
                                        toRegionID:_regionInfo.regionID
                                      toNumPerOnce:50
                                         toCurPage:1
                                      toRegionList:tempArray];
                [_allResorceList addObjectsFromArray:tempArray];
                [tempArray removeAllObjects];
                // Cameras (devices) under the current region
                [vmsNetSDK getCameraListFromRegion:_serverAddress
                                       toSessionID:_mspInfo.sessionID
                                        toRegionID:_regionInfo.regionID
                                      toNumPerOnce:50
                                         toCurPage:1
                                      toCameraList:tempArray];
                [_allResorceList addObjectsFromArray:tempArray];
                [tempArray removeAllObjects];
            }
            return _allResorceList;
        }

    Traversing this recursively yields every device and playable channel. If an object in the resource array is an instance of CCameraInfo (or a subclass), it is a device that can be played; pass it on together with the required parameters: the server address, the login information (mspInfo), and the camera information (cameraInfo).
  • Step 4: Get the play address and fill in the play information:

        // Get the live-play URL
        _realPlayURL = [[CRealPlayURL alloc] init];
        BOOL result = [vmsNetSDK getRealPlayURL:_serverAddress
                                    toSessionID:_mspInfo.sessionID
                                     toCameraID:_cameraInfo.cameraID
                                  toRealPlayURL:_realPlayURL
                                   toStreamType:STREAM_SUB];
        // streamType = 0 returns the main-stream and MAG addresses,
        // streamType = 1 returns the sub-stream and MAG addresses

        // Get the device information
        CDeviceInfo *deviceInfo = [[CDeviceInfo alloc] init];
        result = [vmsNetSDK getDeviceInfo:_serverAddress
                              toSessionID:_mspInfo.sessionID
                               toDeviceID:_cameraInfo.deviceID
                             toDeviceInfo:deviceInfo];

        // Fill in the play information
        VideoPlayInfo *videoInfo = [[VideoPlayInfo alloc] init];
        videoInfo.strID = _cameraInfo.cameraID;        // monitoring-point ID
        videoInfo.protocalType = PROTOCAL_UDP;         // take the stream over UDP or TCP
        videoInfo.playType = REAL_PLAY;                // play mode: live preview or remote playback
        videoInfo.streamMethod = STREAM_METHOD_VTDU;   // stream source; currently only streaming media (VTDU) is supported
        VP_STREAM_TYPE streamType = STREAM_SUB;        // main stream or sub stream
        videoInfo.streamType = streamType;
        videoInfo.pPlayHandle = (id)self.playView;     // UIView the video will render into
        videoInfo.bSystransform = NO;                  // whether to de-encapsulate the stream
        videoInfo.strPlayUrl = _realPlayURL.url1;      // the play address
  • Step 5: Start playback with VideoPlaySDK:

        // Get a play handle
        if (_vpHandle == NULL) {
            _vpHandle = VP_Login(videoInfo);
        }
        // Set the status callback
        if (_vpHandle != NULL) {
            VP_SetStatusCallBack(_vpHandle, StatusCallBack, (__bridge void *)self);
        }
        // Start the live preview
        if (_vpHandle != NULL) {
            if (!VP_RealPlay(_vpHandle)) {
                NSLog(@"start VP_RealPlay failed");
            }
        }
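Note that each call in step 3 fetches only one level of the resource tree; the recursion that collects every device is left to the caller. A minimal, language-agnostic sketch of that walk in Python (the dict-based tree and the `collect_cameras` helper are hypothetical stand-ins for the SDK's control-unit/region/camera objects and its paged list calls):

```python
# Sketch of the recursive traversal described in step 3.
# Control units and regions are containers; cameras (the CCameraInfo
# analogue) are the playable leaves. The tree is a plain dict here so
# the sketch is runnable without the SDK.

def collect_cameras(node):
    """Depth-first walk returning the IDs of all playable channels."""
    cameras = []
    if node["type"] == "camera":
        cameras.append(node["id"])
    for child in node.get("children", []):
        cameras.extend(collect_cameras(child))
    return cameras

tree = {
    "type": "control_unit", "id": 0, "children": [
        {"type": "region", "id": 10, "children": [
            {"type": "camera", "id": "cam-101"},
            {"type": "camera", "id": "cam-102"},
        ]},
        {"type": "camera", "id": "cam-001"},
    ],
}

print(collect_cameras(tree))  # → ['cam-101', 'cam-102', 'cam-001']
```

In the real project, the recursive step would call getAllResources again with `_controlUnitInfo` or `_regionInfo` set to the child just fetched, and would also need to advance toCurPage when a level holds more than the 50 items returned per call.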

4. Summary

  • Although _mspInfo.sessionID and _cameraInfo.cameraID are obtained dynamically, they are not needed at play time if you do not integrate the platform part of the project: once you have the RTSP stream address you can hard-code it, initialize a VideoPlayInfo without filling in the monitoring-point ID, fill in the other parameters, and start playing.
  • Playback stutter is related to the camera's network connection: it is best to have the camera upload to the streaming-media server over a wired network; uploading over a 3G/4G card makes the delay noticeably worse.
  • On optimization:
    VideoPlaySDK offers little control over playback, so if you need to optimize playback fluency you will have to write your own player.

Questions and discussion are welcome in QQ group: 257011323.