
Using Apple Core ML for On‑Device Machine Learning: A Practical Guide with ResNet‑50 Example

Core ML is Apple’s on‑device machine‑learning framework. It lets iOS apps run deep‑learning, decision‑tree, SVM, and linear models locally, building on Accelerate, BNNS, and Metal Performance Shaders. This article demonstrates its use with a ResNet‑50 model, the Vision APIs, and complete Objective‑C code.

Hujiang Technology

Core ML is the machine‑learning framework Apple announced at WWDC 2017. It is designed to let developers embed a variety of models—deep neural networks (with support for more than 30 layer types), decision‑tree ensembles, SVMs, and linear models—directly into iOS applications.

The framework is built on low‑level technologies such as Accelerate, BNNS and Metal Performance Shaders, allowing it to exploit both CPU and GPU resources for maximum performance while keeping all data on the device for privacy.

Three higher‑level frameworks sit on top of Core ML:

Vision – high‑performance image analysis and recognition (face tracking, OCR, object detection, QR‑code scanning, etc.).

Natural Language Processing – language identification, tokenization, part‑of‑speech tagging, and more.

GameplayKit – utilities for game development such as random number generation, AI agents, path‑finding, and behavior trees.

Apple offers a collection of ready‑to‑use models (e.g., ResNet‑50) in its official machine‑learning portal, and third‑party models can be converted to the Core ML format with the coremltools Python package, available on PyPI.
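As a hedged sketch of what that conversion looks like (the model path below is a placeholder, and the unified ct.convert entry point assumes coremltools 4 or later):

```python
# Illustrative sketch: converting a third-party model to Core ML format
# with Apple's coremltools package (pip install coremltools). The input
# path is a placeholder, not a real model.
try:
    import coremltools as ct
except ImportError:
    ct = None  # coremltools may not be installed in every environment

def convert_to_coreml(saved_model_path, output_path="Converted.mlmodel"):
    """Convert a saved model (TF/Keras/PyTorch source) to a .mlmodel file."""
    if ct is None:
        return "coremltools-not-installed"
    mlmodel = ct.convert(saved_model_path)  # unified converter, coremltools 4+
    mlmodel.save(output_path)
    return output_path
```

The resulting .mlmodel file is what you drag into Xcode, which then generates the wrapper classes described below.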

When the ResNet‑50 model is added to an Xcode project, it generates three Objective‑C classes: Resnet50, Resnet50Input, and Resnet50Output. The input class expects a CVPixelBufferRef, which can be created from an image via the Vision framework.

// Load the generated model wrapper and hand it to Vision.
Resnet50 *resnetModel = [[Resnet50 alloc] init];
UIImage *image = self.selectedImageView.image;
NSError *modelError = nil;
VNCoreMLModel *vnCoreModel = [VNCoreMLModel modelForMLModel:resnetModel.model error:&modelError];

VNCoreMLRequest *vnCoreMlRequest = [[VNCoreMLRequest alloc] initWithModel:vnCoreModel completionHandler:^(VNRequest *request, NSError *error) {
    // Pick the classification with the highest confidence.
    CGFloat confidence = 0.0f;
    VNClassificationObservation *tempClassification = nil;
    for (VNClassificationObservation *classification in request.results) {
        if (classification.confidence > confidence) {
            confidence = classification.confidence;
            tempClassification = classification;
        }
    }
    // The completion handler is not guaranteed to run on the main thread,
    // so hop back to it before touching UIKit.
    dispatch_async(dispatch_get_main_queue(), ^{
        self.recognitionResultLabel.text = [NSString stringWithFormat:@"Result: %@", tempClassification.identifier];
        self.confidenceResult.text = [NSString stringWithFormat:@"Confidence: %@", @(tempClassification.confidence)];
    });
}];

// The options dictionary is nonnull; pass an empty dictionary rather than nil.
VNImageRequestHandler *vnImageRequestHandler = [[VNImageRequestHandler alloc] initWithCGImage:image.CGImage options:@{}];
NSError *error = nil;
[vnImageRequestHandler performRequests:@[vnCoreMlRequest] error:&error];
if (error) {
    NSLog(@"%@", error.localizedDescription);
}

The request is executed with [vnImageRequestHandler performRequests:@[vnCoreMlRequest] error:&error]; , and the completion handler receives an array of VNClassificationObservation objects. Each observation contains a confidence value and an identifier that represents the predicted class; the code selects the observation with the highest confidence and displays its identifier and confidence.
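That selection step is simply an argmax over (identifier, confidence) pairs. A minimal, framework-free sketch in Python, with made-up observation values standing in for Vision's results:

```python
# Stand-ins for VNClassificationObservation results: each entry is an
# (identifier, confidence) pair. The values are illustrative only.
observations = [
    ("tabby cat", 0.62),
    ("Egyptian cat", 0.21),
    ("tiger cat", 0.11),
]

def best_classification(results):
    """Return the (identifier, confidence) pair with the highest confidence,
    mirroring the loop in the Objective-C completion handler."""
    best = None
    best_conf = 0.0
    for identifier, conf in results:
        if conf > best_conf:
            best_conf = conf
            best = (identifier, conf)
    return best

print(best_classification(observations))  # → ('tabby cat', 0.62)
```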

For a complete working demo, see the GitHub repository at https://github.com/spykerking/TestCoreML.git .

Artificial Intelligence · iOS · machine learning · vision · Objective-C · Core ML · ResNet50
Written by

Hujiang Technology

We focus on the real-world challenges developers face, delivering authentic, practical content and a direct platform for technical networking among developers.