Handling networking responses using Combine in Swift allows for a more reactive approach to managing asynchronous operations. Combine provides a powerful way to work with publishers and subscribers, making it easier to handle responses, errors, and cancellation.
import Combine
import Foundation

// Define a struct to model your data
struct MyData: Codable {
    let id: Int
    let name: String
}

// Function to perform a network request.
// The failure type is Error because decode(type:decoder:) erases
// the URLError from dataTaskPublisher into a plain Error.
func fetchData() -> AnyPublisher<MyData, Error> {
    let url = URL(string: "https://api.example.com/data")!
    return URLSession.shared.dataTaskPublisher(for: url)
        .map { $0.data }
        .decode(type: MyData.self, decoder: JSONDecoder())
        .receive(on: DispatchQueue.main)
        .eraseToAnyPublisher()
}

// Example usage: store the subscription so it isn't deallocated
// (and cancelled) as soon as sink returns.
var cancellables = Set<AnyCancellable>()

fetchData()
    .sink(receiveCompletion: { completion in
        switch completion {
        case .finished:
            print("Request finished.")
        case .failure(let error):
            print("Error occurred: \(error)")
        }
    }, receiveValue: { data in
        print("Received data: \(data)")
    })
    .store(in: &cancellables)
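Because `sink` returns an `AnyCancellable`, an in-flight request can also be cancelled explicitly, which tears down the underlying URLSession task. The sketch below builds on the `fetchData()` function above and adds `retry` and `timeout` operators as one plausible way to harden the pipeline; the retry count and timeout duration are illustrative choices, not requirements.

```swift
import Combine
import Foundation

// Hold the subscription directly instead of storing it in a set,
// so we can cancel this specific request later.
let subscription = fetchData()
    .retry(2)  // re-subscribe (re-issue the request) up to 2 times on failure
    .timeout(.seconds(10), scheduler: DispatchQueue.main)  // fail if no value within 10s
    .sink(receiveCompletion: { completion in
        if case .failure(let error) = completion {
            print("Request failed: \(error)")
        }
    }, receiveValue: { data in
        print("Received: \(data.name)")
    })

// Calling cancel() stops the pipeline; no further values or
// completion events are delivered after this point.
subscription.cancel()
```

Note that `retry` re-subscribes to the upstream publisher, so each attempt performs a fresh network request; for non-idempotent endpoints you may want error-specific retry logic instead.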