how do i create a combine publisher

Generated on 8/2/2024

To create a Combine publisher, you can either use one of the convenience publishers the Combine framework provides or conform a type to the Publisher protocol yourself. Here is a basic example that uses the publisher property every sequence gains, which emits the sequence's elements in order:

import Combine

// Create a simple publisher that emits a sequence of integers
let myPublisher = [1, 2, 3, 4, 5].publisher

// Subscribe to the publisher
let subscription = myPublisher.sink { value in
    print("Received value: \(value)")
}

In this example, the publisher property of the array creates a publisher that emits each element of the array in sequence. The sink method subscribes to the publisher and handles the emitted values; it returns an AnyCancellable, which must be kept alive (as the subscription constant does here) or the subscription is torn down immediately.
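Before reaching for a custom publisher, it's worth knowing that Combine ships with several ready-made publishers that cover most one-off needs. Here is a minimal sketch of a few common ones (all standard Combine APIs):

import Combine
import Foundation

// Just emits a single value, then finishes.
let justSubscription = Just("Hello").sink { value in
    print("Just emitted: \(value)")
}

// PassthroughSubject is a publisher you drive imperatively with send(_:).
let subject = PassthroughSubject<Int, Never>()
let subjectSubscription = subject.sink { value in
    print("Subject emitted: \(value)")
}
subject.send(42)

// Future wraps a one-shot asynchronous operation in a publisher.
let future = Future<String, Never> { promise in
    DispatchQueue.global().asyncAfter(deadline: .now() + 1) {
        promise(.success("Finished"))
    }
}
let futureSubscription = future.sink { value in
    print("Future emitted: \(value)")
}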

For more advanced use cases, you can create custom publishers by conforming to the Publisher protocol. Here is an example of a custom publisher that emits a single value after a delay:

import Combine
import Foundation

struct DelayedPublisher: Publisher {
    typealias Output = String
    typealias Failure = Never

    let value: String
    let delay: TimeInterval

    func receive<S>(subscriber: S) where S: Subscriber, S.Failure == Failure, S.Input == Output {
        let subscription = DelayedSubscription(subscriber: subscriber, value: value, delay: delay)
        subscriber.receive(subscription: subscription)
    }

    private final class DelayedSubscription<S: Subscriber>: Subscription where S.Input == String, S.Failure == Never {
        private var subscriber: S?
        private let value: String
        private let delay: TimeInterval
        private var hasScheduled = false

        init(subscriber: S, value: String, delay: TimeInterval) {
            self.subscriber = subscriber
            self.value = value
            self.delay = delay
        }

        func request(_ demand: Subscribers.Demand) {
            // Only schedule the emission once the subscriber has signaled demand;
            // emitting from init would bypass Combine's backpressure contract.
            guard !hasScheduled, demand > .none else { return }
            hasScheduled = true
            DispatchQueue.main.asyncAfter(deadline: .now() + delay) { [weak self] in
                guard let self = self, let subscriber = self.subscriber else { return }
                _ = subscriber.receive(self.value)
                subscriber.receive(completion: .finished)
            }
        }

        func cancel() {
            // Dropping the subscriber prevents delivery if the value has not fired yet.
            subscriber = nil
        }
    }
}

// Usage
let delayedPublisher = DelayedPublisher(value: "Hello, Combine!", delay: 2.0)
let delayedSubscription = delayedPublisher.sink { value in
    print("Received value: \(value)")
}

In this custom publisher example, DelayedPublisher emits a string value after a specified delay. The DelayedSubscription class waits for the subscriber to request demand, schedules the delayed emission, and then delivers the value followed by a .finished completion.
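In real code you would typically retain the AnyCancellable returned by sink rather than a loose constant, most commonly in a Set<AnyCancellable> via the standard store(in:) method. A brief sketch using the DelayedPublisher above:

import Combine

var cancellables = Set<AnyCancellable>()

DelayedPublisher(value: "Hello, Combine!", delay: 2.0)
    .sink { value in
        print("Received value: \(value)")
    }
    .store(in: &cancellables) // keeps the subscription alive until cancellables is emptied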

For more detailed information on Combine and creating custom publishers, see Apple's Combine documentation. The WWDC sessions below do not cover Combine directly, but sessions such as "What’s new in Xcode 16" cover build and debugging improvements that can be useful when working with Combine code.

Here are some relevant sessions that might be of interest:

  1. What’s new in Xcode 16 - Adopting Swift 6 data-race safety guarantees
  2. Demystify explicitly built modules - Using modules

If you need more specific information or examples, feel free to ask!

Support real-time ML inference on the CPU

Discover how you can use BNNSGraph to accelerate the execution of your machine learning model on the CPU. We will show you how to use BNNSGraph to compile and execute a machine learning model on the CPU and share how it provides real-time guarantees such as no runtime memory allocation and single-threaded running for audio or signal processing models.

Deploy machine learning and AI models on-device with Core ML

Learn new ways to optimize speed and memory performance when you convert and run machine learning and AI models through Core ML. We’ll cover new options for model representations, performance insights, execution, and model stitching which can be used together to create compelling and private on-device experiences.

Enhance the immersion of media viewing in custom environments

Extend your media viewing experience using Reality Composer Pro components like Docking Region, Reverb, and Virtual Environment Probe. Find out how to further enhance immersion using Reflections, Tint Surroundings Effect, SharePlay, and the Immersive Environment Picker.

Create custom environments for your immersive apps in visionOS

Discover how to create visually rich and performant customized app environments for Apple Vision Pro. Learn design guidelines, get expert recommendations, and explore techniques you can use in any digital content creation tool to begin building your immersive environment.

Explore object tracking for visionOS

Find out how you can use object tracking to turn real-world objects into virtual anchors in your visionOS app. Learn how you can build spatial experiences with object tracking from start to finish. Find out how to create a reference object using machine learning in Create ML and attach content relative to your target object in Reality Composer Pro, RealityKit or ARKit APIs.

What’s new in Xcode 16

Discover the latest productivity and performance improvements in Xcode 16. Learn about enhancements to code completion, diagnostics, and Xcode Previews. Find out more about updates in builds and explore improvements in debugging and Instruments.

Demystify explicitly built modules

Explore how builds are changing in Xcode 16 with explicitly built modules. Discover how modules are used to build your code, how explicitly built modules improve transparency in compilation tasks, and how you can optimize your build by sharing modules across targets.

Bring your machine learning and AI models to Apple silicon

Learn how to optimize your machine learning and AI models to leverage the power of Apple silicon. Review model conversion workflows to prepare your models for on-device deployment. Understand model compression techniques that are compatible with Apple silicon, and at what stages in your model deployment workflow you can apply them. We’ll also explore the tradeoffs between storage size, latency, power usage and accuracy.

Support semantic search with Core Spotlight

Learn how to provide semantic search results in your app using Core Spotlight. Understand how to make your app’s content available in the user’s private, on-device index so people can search for items using natural language. We’ll also share how to optimize your app’s performance by scheduling indexing activities. To get the most out of this session, we recommend first checking out Core Spotlight documentation on the Apple Developer website.