<?xml version="1.0" encoding="UTF-8"?>
<!--Generated by Site-Server v@build.version@ (http://www.squarespace.com) on Tue, 07 Apr 2026 19:53:34 GMT
--><rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:wfw="http://wellformedweb.org/CommentAPI/" xmlns:itunes="http://www.itunes.com/dtds/podcast-1.0.dtd" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:media="http://www.rssboard.org/media-rss" version="2.0"><channel><title>The Prudent Protocol - Prudent Leap Software</title><link>https://www.prudentleap.com/prudent-protocol/</link><lastBuildDate>Tue, 24 Feb 2026 22:26:27 +0000</lastBuildDate><language>en-US</language><generator>Site-Server v@build.version@ (http://www.squarespace.com)</generator><description><![CDATA[]]></description><item><title>Using and customizing Text Input Controls</title><category>SwiftUI</category><dc:creator>Samwise Prudent</dc:creator><pubDate>Sun, 22 Feb 2026 22:04:43 +0000</pubDate><link>https://www.prudentleap.com/prudent-protocol/2026/2/using-and-customizing-text-input-controls</link><guid isPermaLink="false">6721f9295c6d593f58a1c57b:68d86448421a7f587b06ce19:699b1d425cc57628fc629414</guid><description><![CDATA[Learn how SwiftUI Text Input Controls work, from basic functionality to 
advanced use cases. Explore the security benefits of using a SecureField 
over a TextField.]]></description><content:encoded><![CDATA[<p class="">Since text input is a primary control for many applications, it’s especially important to become familiar with text-based controls and the mechanisms used to modify their behavior.</p><p class="">SwiftUI provides three native controls you can use to capture input text, plus the option of wrapping UIKit/AppKit views:</p><ul data-rte-list="default"><li><p class=""><a href="https://developer.apple.com/documentation/swiftui/textfield" title="SwiftUI Documentation - TextField control"><span><strong><em>TextField</em></strong></span></a>, which is generally used to capture a single line of text as input (a name, an e-mail address, etc.). It can be used to capture large amounts of text, but it will collapse the input into one single line, dropping all newline characters, list markers, text decorations, links, etc.</p></li><li><p class=""><a href="https://developer.apple.com/documentation/swiftui/securefield" title="SwiftUI Documentation - SecureField"><span><strong><em>SecureField</em></strong></span></a>, which is generally used to capture confidential and private information. This field takes your unmodified input, but displays a redacted version, replacing characters with circles or asterisks. You should use <strong><em>SecureField</em></strong> to capture <em>passwords</em>, <em>Card Verification Values</em>, etc.</p></li><li><p class=""><a href="https://developer.apple.com/documentation/swiftui/texteditor" title="SwiftUI Documentation - TextEditor"><span><strong><em>TextEditor</em></strong></span></a>, used to capture paragraphs of text. You would typically use this control to capture notes in a notes application. With iOS 26, the TextEditor control received a lot of <a href="https://developer.apple.com/videos/play/wwdc2025/280" title="WWDC 2025 - Code-along: Cook up a rich text experience in SwiftUI with AttributedString"><span><strong><em>extra functionality</em></strong></span></a>, allowing it to work with AttributedStrings. Unless you need to support older OS versions, you can use this control to build a full rich text editor in native SwiftUI code.</p></li><li><p class=""><strong><em>UIKit/AppKit</em></strong> views wrapped in a <a href="https://developer.apple.com/documentation/swiftui/uiviewrepresentable" title="SwiftUI Documentation -UIViewRepresentable"><span><strong><em>UIViewRepresentable</em></strong></span></a> (or, for AppKit views, its counterpart NSViewRepresentable), such as <a href="https://developer.apple.com/documentation/uikit/uitextview" title="UIKit Documentation - UITextView"><span><strong><em>UITextView</em></strong></span></a> and <a href="https://developer.apple.com/documentation/appkit/nstextview" title="AppKit Documentation - NSTextView"><span><strong><em>NSTextView</em></strong></span></a>, used to capture rich text data. Prior to iOS 26 and macOS 26, you would use these controls to create rich text editors in SwiftUI.</p></li></ul><p class="">All SwiftUI text input controls require a <a href="https://developer.apple.com/documentation/swiftui/binding" title="SwiftUI - Binding Property Wrapper"><span><strong><em>binding</em></strong></span></a> to a <a href="https://developer.apple.com/documentation/swiftui/state" title="SwiftUI - State property wrapper"><span><strong><em>state</em></strong></span></a> property. The controls themselves provide the mechanisms your end-users need in order to interact with that state property. As a result, these controls make it easy to validate and/or process the input on the fly (for example, using the <code>.onChange</code> modifier). Additionally, the text controls provide enough customization options to fit into most designs.</p><p class="">For context, <strong>login screens</strong> and user <strong>registration forms</strong> both use text input controls, grouped and styled appropriately, to take user information, such as user name and password, and pass it to some other component or service.
Another very common use case is the <strong>payment details</strong> form, where users provide credit card information.</p>
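<p class="">To make that pattern concrete, the sketch below pairs a <strong><em>TextField</em></strong> with a <strong><em>SecureField</em></strong> in a minimal login form. This is an illustrative sketch, not code from a real application: the <code>LoginForm</code> view and the <code>canSubmit</code> helper are hypothetical names introduced here.</p>

```swift
import SwiftUI

// Illustrative sketch: a minimal login form. All names here are hypothetical.
struct LoginForm: View {
    @State private var username = ""
    @State private var password = ""

    var body: some View {
        VStack {
            // Captures a single line of plain text.
            TextField("User name", text: $username)
            // Same input mechanics, but the characters are redacted on screen.
            SecureField("Password", text: $password)
            Button("Log In") {
                // Hand the credentials to an authentication service here.
            }
            .disabled(!canSubmit(username: username, password: password))
        }
        .padding()
    }
}

// Pure helper, kept outside the view so it is easy to test:
// both fields must be non-empty before the form can be submitted.
func canSubmit(username: String, password: String) -> Bool {
    !username.isEmpty && !password.isEmpty
}
```

<p class="">Because both fields are plain bindings to <code>@State</code> properties, the button’s enabled state is re-evaluated automatically on every keystroke.</p>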


  




&nbsp;
  
  <p class=""><span class="sqsrte-text-color--white"><strong><em>When connected to a debugger</em></strong><em>, both in the simulator and on a connected device, all text input controls can display errors and they can block the interface when they load. In general, these issues are safe to ignore, since </em><strong><em>they will not occur when the application runs in standard mode</em></strong><em> (when the debugger is not attached to the process). If you encounter general slowness with text input controls, before investing time to debug, try running the application by directly loading it from the device interface, instead of pushing it from Xcode (so that the debugger does not connect).</em></span></p>


  




&nbsp;
  
  <p class="">To showcase some of the customization possibilities, as well as the way they interact with the rest of the interface you would build, some views in this section are going to be displayed in a <strong><em>ZStack</em></strong>, on top of an animated background view. We are going to explore this in far more detail later but, for now, it’s useful to know that, in SwiftUI, you can animate a vast majority of the views’ properties, including their scale and color. SwiftUI provides several tools you can use to create animations, such as the <a href="https://developer.apple.com/documentation/swiftui/withanimation(_:_:)" title="SwiftUI documentation - The `withAnimation` method"><span><strong><em>withAnimation function</em></strong></span></a>, the <a href="https://developer.apple.com/documentation/swiftui/view/animation(_:)" title="SwiftUI documentation - The `.animation` View Modifier"><span><strong><em>.animation view modifier</em></strong></span></a> or the <a href="https://developer.apple.com/documentation/swiftui/binding/animation(_:)" title="SwiftUI documentation - The `.animation` Binding method"><span><strong><em>.animation Binding method</em></strong></span></a>. Apple also hosted a few presentations regarding SwiftUI’s <a href="https://developer.apple.com/documentation/swiftui/animations" title="SwiftUI documentation - The Animation API Collection"><span><strong><em>Animation APIs</em></strong></span></a>, such as <a href="https://developer.apple.com/videos/play/wwdc2023/10156/" title="WWDC 2023 - Explore SwiftUI animation"><span><strong><em>Explore SwiftUI animations</em></strong></span></a> and <a href="https://developer.apple.com/videos/play/wwdc2023/10157/" title="WWDC 2023 - Wind your way through advanced animations in SwiftUI"><span><strong><em>Wind your way through advanced animations in SwiftUI</em></strong></span></a>. 
Apple’s SwiftUI team also maintains a brief <a href="https://developer.apple.com/tutorials/swiftui/animating-views-and-transitions" title="SwiftUI Animation Tutorial"><span><strong><em>SwiftUI animation tutorial</em></strong></span></a>. </p><p class="">The animated background view provided below creates a set of randomly colored circles, scales them up and down automatically and moves them on the screen. To animate each circle’s color and position, we tie them to a <strong><em>scale</em></strong> <code>@State</code> property. Using the <code>onAppear</code> modifier, we force SwiftUI to transition between the initial value of the scale property, assigned on initialization, and the new value, provided in the modifier’s closure. To keep the animation running indefinitely, we can simply use the <a href="https://developer.apple.com/documentation/swiftui/animation/repeatforever(autoreverses:)" title="SwiftUI documentation - The `.repeatForever` method"><span><strong><em>repeatForever method</em></strong></span></a>. The <code>blur</code> effect creates the illusion of a thick material between the screen and the underlying animated circles; it was very commonly used to give the impression of translucent glass or plastic, before iOS 26 and Apple’s Liquid Glass.</p><p class="">To be useful, the animated background view needs to create and move circles in an area that is visible. You can use <a href="https://docs.swift.org/swift-book/documentation/the-swift-programming-language/statements/#Conditional-Compilation-Block" title="Swift Documentation - Compilation Directives"><span><strong><em>compiler directives</em></strong></span></a> to condition the area in which the colored circles can move, based on the operating system. There are other, more precise mechanisms (such as the <strong><em>GeometryReader</em></strong> container), but this will suffice for our current requirements.</p>


  




<pre><code class="language-swift">struct AnimatedBackground: View {
    // Driving value for every animation; flipping it in onAppear starts them all.
    @State private var scale: CGFloat = 1
    let nbOfCircles = 25
    let bubbleColors: [Color] = [.red, .mint, .purple, .pink, .purple, .blue, .cyan, .orange, .red, .teal]

    var body: some View {
        ZStack {
            ForEach(0..&lt;nbOfCircles, id: \.self) { index in
                Circle()
                    // Pick a different palette entry on each side of the scale change,
                    // so the color transition is animated along with everything else.
                    .foregroundColor(scale &lt; 2
                                     ? bubbleColors[min(index, bubbleColors.count - 1)]
                                     : bubbleColors[index % bubbleColors.count])
                    // Animates the color change above.
                    .animation(.easeInOut(duration: 1)
                        .repeatForever(autoreverses: true)
                        .speed(.random(in: 0.05...0.5))
                        .delay(.random(in: 0...1)), value: scale)
                    .scaleEffect(scale * .random(in: 0.5...3))
                    .frame(width: .random(in: 50...100),
                           height: .random(in: 50...100),
                           alignment: .center)
#if os(iOS)
                    .position(x: scale &lt; 2 ? .random(in: 50...200) + CGFloat(index) : .random(in: 100...300) - CGFloat(index),
                              y: scale &lt; 2 ? .random(in: 50...980) + CGFloat(index) : .random(in: 100...800) - CGFloat(index))
#else
                    .position(x: scale &lt; 2 ? .random(in: 10...2000) + CGFloat(index) : .random(in: 100...1300) - CGFloat(index),
                              y: scale &lt; 2 ? .random(in: 10...600) + CGFloat(index) : .random(in: 100...800) - CGFloat(index))
#endif
                    // A second animation modifier, so the position change is animated too.
                    .animation(.easeInOut(duration: 1)
                        .repeatForever(autoreverses: true)
                        .speed(.random(in: 0.05...0.5))
                        .delay(.random(in: 0...1)), value: scale)
                    .blendMode(.destinationOver)
            }
        }
        .onAppear {
            // Transition from the initial value (1) to 2 to kick off the animations.
            scale = 2
        }
        .background(
            Rectangle()
                .foregroundColor(.gray))
        .blur(radius: 60)
        .ignoresSafeArea()
    }
}</code></pre>


  
  <p class="">The <strong><em>TextInputControls</em></strong> view below acts as the main test view, used to showcase the text input controls and their customization options. The input controls are going to be added in the <strong><em>VStack</em></strong>.</p>


  




<pre><code class="language-swift">struct TextInputControls: View {
    @State private var input = ""
    var body: some View {
        ZStack{
            AnimatedBackground()
            VStack{
                Text("The **raw** input value is: \(input)")
            }
        }
    }
}</code></pre>

&nbsp;
  
  <h3>The TextField Control</h3><p class="">The <strong><em>TextField</em></strong> <em>control</em> is used to capture <em>a single, unformatted line of text</em>. It is an <strong>expansive</strong> view (it grows to fill the width available to it), unlike the <strong>fixed-size <em>Text</em></strong> primitive. The image below showcases some common use cases for the <strong><em>TextField</em></strong> control (or its UIKit equivalent, <strong><em>UITextField</em></strong>), in Apple’s first-party applications.</p>


  















































  

    
  
    

      

      
        <figure class="
              sqs-block-image-figure
              intrinsic
            "
        >
          
        
        

        
          
            
              
              
          
            
                
                
                
                
                
                
                
                <img data-stretch="false" data-image="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/13d586d4-ff0c-4110-aacf-50fce7483048/TextFieldExamples.webp" data-image-dimensions="2504x1020" data-image-focal-point="0.5,0.5" alt="" data-load="false" elementtiming="system-image-block" src="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/13d586d4-ff0c-4110-aacf-50fce7483048/TextFieldExamples.webp?format=1000w" width="2504" height="1020" sizes="(max-width: 640px) 100vw, (max-width: 767px) 100vw, 100vw" onload="this.classList.add(&quot;loaded&quot;)" srcset="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/13d586d4-ff0c-4110-aacf-50fce7483048/TextFieldExamples.webp?format=100w 100w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/13d586d4-ff0c-4110-aacf-50fce7483048/TextFieldExamples.webp?format=300w 300w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/13d586d4-ff0c-4110-aacf-50fce7483048/TextFieldExamples.webp?format=500w 500w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/13d586d4-ff0c-4110-aacf-50fce7483048/TextFieldExamples.webp?format=750w 750w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/13d586d4-ff0c-4110-aacf-50fce7483048/TextFieldExamples.webp?format=1000w 1000w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/13d586d4-ff0c-4110-aacf-50fce7483048/TextFieldExamples.webp?format=1500w 1500w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/13d586d4-ff0c-4110-aacf-50fce7483048/TextFieldExamples.webp?format=2500w 2500w" loading="lazy" decoding="async" data-loader="sqs">

            
          
        
            
          
        

        
          
          <figcaption class="image-caption-wrapper">
            <p data-rte-preserve-empty="true"><em>TextField controls. From left to right, the applications are Contacts, Reminders, Spotlight, Passwords and the Apple Account section in Settings</em></p>
          </figcaption>
        
      
        </figure>
      

    
  


  



  
  <p class="">The snippet below exemplifies a <strong><em>TextField</em></strong> control in its most basic form. It takes a <em>localized string</em> as a <strong><em>label</em></strong> and a <em>binding to a State property</em> as the <strong><em>text</em></strong> parameter. Because the control expands to fill the available width, it is common to surround the <strong><em>TextField</em></strong> with some padding, to create some space between the edges of the screen and the control.</p>


  




<pre><code class="language-swift">TextField("Card Title", text:$input).padding()</code></pre>


  
  <p class="">The screenshots below represent the same view, but rendered on various platforms. Notice how, on <strong>iOS</strong> and on <strong>visionOS</strong>, the <strong><em>TextField</em></strong> controls do not provide, in their default styling, any background. There are multiple mechanisms you can use to add a background, depending on the OS version your application targets, and we are going to explore some of them shortly. It’s also useful to note that the animated background may make sense on some devices and in some contexts, but not on others.</p>


  















































  

    
  
    

      

      
        <figure class="
              sqs-block-image-figure
              intrinsic
            "
        >
          
        
        

        
          
            
              
              
          
            
                
                
                
                
                
                
                
                <img data-stretch="false" data-image="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/a795a9ed-1106-4eb6-b489-ae6820fe94aa/TextFieldDefaultMultiplatform.webp" data-image-dimensions="3516x927" data-image-focal-point="0.5,0.5" alt="" data-load="false" elementtiming="system-image-block" src="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/a795a9ed-1106-4eb6-b489-ae6820fe94aa/TextFieldDefaultMultiplatform.webp?format=1000w" width="3516" height="927" sizes="(max-width: 640px) 100vw, (max-width: 767px) 100vw, 100vw" onload="this.classList.add(&quot;loaded&quot;)" srcset="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/a795a9ed-1106-4eb6-b489-ae6820fe94aa/TextFieldDefaultMultiplatform.webp?format=100w 100w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/a795a9ed-1106-4eb6-b489-ae6820fe94aa/TextFieldDefaultMultiplatform.webp?format=300w 300w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/a795a9ed-1106-4eb6-b489-ae6820fe94aa/TextFieldDefaultMultiplatform.webp?format=500w 500w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/a795a9ed-1106-4eb6-b489-ae6820fe94aa/TextFieldDefaultMultiplatform.webp?format=750w 750w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/a795a9ed-1106-4eb6-b489-ae6820fe94aa/TextFieldDefaultMultiplatform.webp?format=1000w 1000w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/a795a9ed-1106-4eb6-b489-ae6820fe94aa/TextFieldDefaultMultiplatform.webp?format=1500w 1500w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/a795a9ed-1106-4eb6-b489-ae6820fe94aa/TextFieldDefaultMultiplatform.webp?format=2500w 2500w" loading="lazy" decoding="async" data-loader="sqs">

            
          
        
            
          
        

        
          
          <figcaption class="image-caption-wrapper">
            <p data-rte-preserve-empty="true"><em>TextField controls, with the default styling, on iPhone(left), Mac(center) and Apple Vision Pro(right)</em></p>
          </figcaption>
        
      
        </figure>
      

    
  


  



  
  <p class="">Depending on your particular use case, you may want to change the visual style of the control. For example, you may want to make it more visible on compact platforms, like the iPhone and Vision Pro. With <strong>iOS 26</strong>, on the iPhone, you can use the <a href="https://developer.apple.com/documentation/swiftui/view/glasseffect(_:in:)" title="SwiftUI documentation - The `.glassEffect` modifier"><span><strong><em>glassEffect modifier</em></strong></span></a> to quickly apply <a href="https://developer.apple.com/documentation/SwiftUI/Applying-Liquid-Glass-to-custom-views" title="SwiftUI documentation - Applying LiquidGlass to views"><span><strong><em>Liquid Glass</em></strong></span></a>. If you feel Liquid Glass doesn’t match your application, you could use the <code>textFieldStyle</code> modifier, or you could simply add a background modifier directly. The snippet below showcases a few examples, both with and without Liquid Glass and using various initializers.</p>


  




<pre><code class="language-swift">TextField("Card Title", text: $input).padding().glassEffect().padding()
TextField("Card Title", text: $input).padding().textFieldStyle(.roundedBorder)
TextField("Card Title", text: $input).padding()
    .background {
        Capsule(style: .circular).foregroundStyle(.ultraThinMaterial)
    }
    .tint(.red)
    .foregroundColor(.red)
    .padding()
TextField(text: $input) {
    Text("Card Title").foregroundStyle(.red)
}
.padding()
.background {
    Capsule(style: .circular).foregroundStyle(.ultraThinMaterial)
}
.padding()
TextField("Card Title", text: $input, prompt: Text("Card Title").foregroundStyle(.white), axis: .vertical)
    .padding()
    .glassEffect(.regular.tint(.mint))
    .padding()</code></pre>


  
  <p class="">The image below showcases the resulting interface, both on <strong>iOS</strong> and on <strong>macOS</strong>.</p>


  















































  

    
  
    

      

      
        <figure class="
              sqs-block-image-figure
              intrinsic
            "
        >
          
        
        

        
          
            
              
              
          
            
                
                
                
                
                
                
                
                <img data-stretch="false" data-image="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/01fc8b21-f528-47d9-b99c-6990f97b265e/TextFieldStylingExamples.webp" data-image-dimensions="2402x1314" data-image-focal-point="0.5,0.5" alt="" data-load="false" elementtiming="system-image-block" src="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/01fc8b21-f528-47d9-b99c-6990f97b265e/TextFieldStylingExamples.webp?format=1000w" width="2402" height="1314" sizes="(max-width: 640px) 100vw, (max-width: 767px) 100vw, 100vw" onload="this.classList.add(&quot;loaded&quot;)" srcset="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/01fc8b21-f528-47d9-b99c-6990f97b265e/TextFieldStylingExamples.webp?format=100w 100w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/01fc8b21-f528-47d9-b99c-6990f97b265e/TextFieldStylingExamples.webp?format=300w 300w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/01fc8b21-f528-47d9-b99c-6990f97b265e/TextFieldStylingExamples.webp?format=500w 500w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/01fc8b21-f528-47d9-b99c-6990f97b265e/TextFieldStylingExamples.webp?format=750w 750w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/01fc8b21-f528-47d9-b99c-6990f97b265e/TextFieldStylingExamples.webp?format=1000w 1000w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/01fc8b21-f528-47d9-b99c-6990f97b265e/TextFieldStylingExamples.webp?format=1500w 1500w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/01fc8b21-f528-47d9-b99c-6990f97b265e/TextFieldStylingExamples.webp?format=2500w 2500w" loading="lazy" decoding="async" data-loader="sqs">

            
          
        
            
          
        

        
          
          <figcaption class="image-caption-wrapper">
            <p data-rte-preserve-empty="true"><em>TextField controls, with various styling configurations, on iPhone(left) and Mac(center)</em></p>
          </figcaption>
        
      
        </figure>
      

    
  


  


&nbsp;
  
  <p class=""><span class="sqsrte-text-color--white"><em>Note that Vision Pro does not currently support Liquid Glass (at least not with </em><strong><em>visionOS 26</em></strong><em>). If your multiplatform application uses Liquid Glass, you should ensure there is a style to fall back on. Otherwise, the application will not build for the visionOS target.</em></span></p>


  




&nbsp;&nbsp;
  
  <h3>Customizing the Virtual Keyboard, the Quick Type Bar and the Keyboard Toolbar</h3><p class="">A distinct characteristic of text input controls is that, on platforms with a virtual keyboard, such as iOS, iPadOS, visionOS, watchOS and tvOS, when the control is in focus (active, selected), you also get a pop-up or a modal keyboard. Using the <a href="https://developer.apple.com/documentation/swiftui/view/keyboardtype(_:)" title="SwiftUI documentation - The `.keyboardType` modifier"><span><strong><em>.keyboardType</em></strong></span></a> modifier, you can change the default keyboard to a style that better matches the type of information the text field is meant to capture. The snippet and image below showcase an example you would use when the input you request from your end-users is a phone number. There are other styles, which suit other types of input better (<code>.numbersAndPunctuation</code>, <code>.emailAddress</code>, etc.).</p><p class="">You could use the default keyboard, but the dedicated style provides a much better experience, and many Apple users expect this type of care from their apps.</p>


  




&nbsp;
  
  <p class=""><span class="sqsrte-text-color--white"><em>Since macOS does not support the </em></span><code><span class="sqsrte-text-color--white"><strong>keyboardType</strong></span></code><span class="sqsrte-text-color--white"><em> modifier, we can simply condition the addition of the modifier with a </em><strong><em>compiler directive</em></strong><em>.</em></span></p><p class=""><span class="sqsrte-text-color--white"><em>When working on multiplatform applications, there will often be cases where specific interface elements need their own implementation. Sometimes, it’s sufficient to keep a single struct and condition modifiers based on the OS, as shown in this example. In more complex cases, though, you will be better off creating </em><strong><em>individual</em></strong><em> structs (in the same file, or in dedicated files) and using compiler directives to choose which view to render.</em></span></p>
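<p class="">As a sketch of that second approach (the <code>PhoneNumberField</code> struct and the <code>digitsOnly</code> helper are hypothetical names, introduced here for illustration), each platform gets its own struct, and a compiler directive selects which one is compiled:</p>

```swift
import SwiftUI

// Shared, testable helper: strip everything except the decimal digits.
func digitsOnly(_ text: String) -> String {
    text.filter { $0.isNumber }
}

#if os(macOS)
// macOS variant: no virtual keyboard, so no keyboardType modifier.
struct PhoneNumberField: View {
    @Binding var input: String
    var body: some View {
        TextField("Phone Number", text: $input)
            .padding(20)
    }
}
#else
// iOS / iPadOS / visionOS variant: request the dedicated phone pad.
struct PhoneNumberField: View {
    @Binding var input: String
    var body: some View {
        TextField("Phone Number", text: $input)
            .padding(20)
            .keyboardType(.phonePad)
    }
}
#endif
```

<p class="">Call sites simply use <code>PhoneNumberField(input: $input)</code>; the directive guarantees that exactly one definition exists per platform.</p>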


  




&nbsp;<pre><code class="language-swift">import SwiftUI

struct TextInputControls: View {
    @State private var input = ""
    var body: some View {
        ZStack{
            AnimatedBackground()
            VStack{
                TextField("Phone Number", text: $input, prompt: Text("Phone Number").foregroundStyle(.black))
                    .padding(20)

#if (os(iOS) || os(macOS))
                .glassEffect()
                .padding(20)
#endif

#if os(visionOS)
                    .background{
                        Capsule(style: .circular)
                            .stroke(lineWidth: 5)
                            .foregroundStyle(.ultraThinMaterial)
                    }
                    .padding(50)
#endif // os(visionOS)

#if !os(macOS)
                    .keyboardType(.phonePad)
#endif
                
                Text("The **raw** input value is: \(input)").padding()
            }
        }
    }
}

#Preview {
    TextInputControls()
}</code></pre>












































  

    
  
    

      

      
        <figure class="
              sqs-block-image-figure
              intrinsic
            "
        >
          
        
        

        
          
            
              
              
          
            
                
                
                
                
                
                
                
                <img data-stretch="false" data-image="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/3d2ab5f9-ac7e-463d-96b7-8007774a8526/PhonePadKeyboard.webp" data-image-dimensions="2377x1130" data-image-focal-point="0.5,0.5" alt="" data-load="false" elementtiming="system-image-block" src="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/3d2ab5f9-ac7e-463d-96b7-8007774a8526/PhonePadKeyboard.webp?format=1000w" width="2377" height="1130" sizes="(max-width: 640px) 100vw, (max-width: 767px) 100vw, 100vw" onload="this.classList.add(&quot;loaded&quot;)" srcset="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/3d2ab5f9-ac7e-463d-96b7-8007774a8526/PhonePadKeyboard.webp?format=100w 100w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/3d2ab5f9-ac7e-463d-96b7-8007774a8526/PhonePadKeyboard.webp?format=300w 300w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/3d2ab5f9-ac7e-463d-96b7-8007774a8526/PhonePadKeyboard.webp?format=500w 500w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/3d2ab5f9-ac7e-463d-96b7-8007774a8526/PhonePadKeyboard.webp?format=750w 750w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/3d2ab5f9-ac7e-463d-96b7-8007774a8526/PhonePadKeyboard.webp?format=1000w 1000w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/3d2ab5f9-ac7e-463d-96b7-8007774a8526/PhonePadKeyboard.webp?format=1500w 1500w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/3d2ab5f9-ac7e-463d-96b7-8007774a8526/PhonePadKeyboard.webp?format=2500w 2500w" loading="lazy" decoding="async" data-loader="sqs">

            
          
        
            
          
        

        
          
          <figcaption class="image-caption-wrapper">
            <p data-rte-preserve-empty="true"><em>PhonePad Keyboard Styling, on iOS(left) and visionOS(right)</em></p>
          </figcaption>
        
      
        </figure>
      

    
  


  



  
  <p class="">You can further improve the experience you provide to your end-users by specifying the type of data associated with the text field. For example, by using the <a href="https://developer.apple.com/documentation/swiftui/view/textcontenttype(_:)-ufdv" title="SwiftUI documentation - textContentType Modifier"><span><strong><em>.textContentType modifier</em></strong></span></a>, you can control the options your end-users receive in the quick type bar (the autocomplete area above the keyboard). You can also modify the way the Submit (return) key looks on the keyboard, using the <a href="https://developer.apple.com/documentation/swiftui/submitlabel" title="SwiftUI documentation - submitLabel Modifier"><span><strong><em>.submitLabel modifier</em></strong></span></a>. Apple provides a set of SubmitLabels you can use, but you cannot easily add your own. Finally, using the <a href="https://developer.apple.com/documentation/swiftui/view/toolbar(content:)" title="SwiftUI documentation - .toolbar Modifier"><span><strong><em>.toolbar modifier</em></strong></span></a>, you can add custom buttons, or other types of custom views, right above the keyboard. In the image below, you can see some of the common use cases of each modifier (although they are not necessarily combined this way in practice).</p>
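<p class="">The sketch below combines the three modifiers on a one-time-code field (the <code>CodeEntryField</code> view and the <code>isCompleteCode</code> helper are hypothetical names; in a real application, the toolbar content and the validation rules would match your own design):</p>

```swift
import SwiftUI

// Illustrative sketch: a six-digit verification-code field.
struct CodeEntryField: View {
    @State private var code = ""

    var body: some View {
        TextField("Verification Code", text: $code)
            .padding()
            // Lets the system suggest incoming codes in the quick type bar.
            .textContentType(.oneTimeCode)
            // Relabels the keyboard's return key as "Done".
            .submitLabel(.done)
            // Adds custom views in the bar directly above the keyboard.
            .toolbar {
                ToolbarItemGroup(placement: .keyboard) {
                    Spacer()
                    Button("Clear") { code = "" }
                }
            }
            .onSubmit {
                guard isCompleteCode(code) else { return }
                // Forward the code to a verification service here.
            }
    }
}

// Pure helper: a complete code is exactly six decimal digits.
func isCompleteCode(_ code: String) -> Bool {
    code.count == 6 && code.allSatisfy(\.isNumber)
}
```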


  















































  

    
  
    

      

      
        <figure class="
              sqs-block-image-figure
              intrinsic
            "
        >
          
        
        

        
          
            
              
              
          
            
                
                
                
                
                
                
                
                <img data-stretch="false" data-image="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/2536a837-3941-4650-8f15-15637ca1a45e/CustomizedKeyboard.webp" data-image-dimensions="3594x1951" data-image-focal-point="0.5,0.5" alt="" data-load="false" elementtiming="system-image-block" src="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/2536a837-3941-4650-8f15-15637ca1a45e/CustomizedKeyboard.webp?format=1000w" width="3594" height="1951" sizes="(max-width: 640px) 100vw, (max-width: 767px) 100vw, 100vw" onload="this.classList.add(&quot;loaded&quot;)" srcset="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/2536a837-3941-4650-8f15-15637ca1a45e/CustomizedKeyboard.webp?format=100w 100w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/2536a837-3941-4650-8f15-15637ca1a45e/CustomizedKeyboard.webp?format=300w 300w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/2536a837-3941-4650-8f15-15637ca1a45e/CustomizedKeyboard.webp?format=500w 500w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/2536a837-3941-4650-8f15-15637ca1a45e/CustomizedKeyboard.webp?format=750w 750w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/2536a837-3941-4650-8f15-15637ca1a45e/CustomizedKeyboard.webp?format=1000w 1000w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/2536a837-3941-4650-8f15-15637ca1a45e/CustomizedKeyboard.webp?format=1500w 1500w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/2536a837-3941-4650-8f15-15637ca1a45e/CustomizedKeyboard.webp?format=2500w 2500w" loading="lazy" decoding="async" data-loader="sqs">

            
          
        
            
          
        

        
          
          <figcaption class="image-caption-wrapper">
            <p data-rte-preserve-empty="true"><em>Example of a custom keyboard with Password autocomplete, a toolbar and a custom submit label</em></p>
          </figcaption>
        
      
        </figure>
      

    
  


  


&nbsp;
  
  <h3>Adding custom behavior on updates</h3><p class="">As an input control, the main purpose of the <strong><em>TextField</em></strong> is to effect a change in your application’s state. In the previous example, it updates the <strong><em>input</em></strong> state property. In some interfaces, you may want to validate the input your end-user provides (for example, ensure your input property contains no more than 10 characters, like a US phone number), or you may want to reformat or modify the input (for example, reformat the text as a phone number, as seen in the Contacts application). There are multiple mechanisms you can use, but the easiest to use are the <a href="https://developer.apple.com/documentation/swiftui/view/onchange(of:initial:_:)"><span><strong><em>.onChange</em></strong></span></a> (updated with iOS 17) and <a href="https://developer.apple.com/documentation/swiftui/view/onsubmit(of:_:)"><span><strong><em>.onSubmit</em></strong></span></a> modifiers.</p><h4>Limiting the length of the input to 10 characters</h4><p class="">When used strictly for simple validation or formatting, the logic can be maintained inside the view modifier. For example, the snippet below ensures that your end user cannot insert an 11th character. Note that the key presses are still registered; in the simulator, rapid typing on a physical keyboard can cause the closure to miss updates.</p>


  




<pre><code class="language-swift">TextField("Name", text: $input, prompt: Text("Name").foregroundStyle(.black))
    .padding(20)
    .submitLabel(.send)
    .onChange(of: input) { oldValue, newValue in
        // Revert to the previous value once the input exceeds 10 characters
        guard newValue.count &lt;= 10 else {
            input = oldValue
            return
        }
    }</code></pre>

&nbsp;
  
    
  
  <p class=""><span class="sqsrte-text-color--white"><em>As a reminder, the </em></span><a href="https://docs.swift.org/swift-book/documentation/the-swift-programming-language/statements/#Guard-Statement" title="Swift Documentation - The guard Statement"><span><span class="sqsrte-text-color--white"><strong><em>guard</em></strong></span></span></a><span class="sqsrte-text-color--white"><em> statement is typically used as an early return mechanism. Its purpose is to return from the current scope if a set of conditions is not met. The onChange closure above can also be written as:</em></span></p><blockquote><p class="sqsrte-small"><span class="sqsrte-text-color--white">onChange(of: input) { oldValue, newValue in       </span></p><p class="sqsrte-small"><span class="sqsrte-text-color--white">if newValue.count &gt; 10 { input = oldValue }      </span></p><p class="sqsrte-small"><span class="sqsrte-text-color--white">}</span></p></blockquote><p class="sqsrte-small"><span class="sqsrte-text-color--white"><em>In general, with very few exceptions, logic that </em><strong><em>controls</em></strong><em> or </em><strong><em>affects</em></strong><em> the input (limiting the number of characters, adding or removing characters etc.) should be placed in </em></span><span data-text-attribute-id="768ec4c7-7d0e-4b77-84b9-f8f81c60671a" class="sqsrte-text-highlight"><code><span class="sqsrte-text-color--white"><strong>onChange</strong></span></code></span><span class="sqsrte-text-color--white"><em> closures and not in </em></span><code><span class="sqsrte-text-color--white"><strong>onSubmit</strong></span></code><span class="sqsrte-text-color--white"><em> closures. Otherwise, you may hide the effects of the closure from your end user. This is especially true if your interface jumps to another text field on submit, because your end user may miss the changes your application made to their input. For example, your end user may type 11 or more characters, but their input would only get truncated after the value is submitted.</em></span></p>
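  <p class="">To make the distinction concrete, the sketch below keeps the per-keystroke constraint in <code>.onChange</code> and puts submission-time side effects in <code>.onSubmit</code>. The property names follow the earlier examples; the whitespace trimming is our own illustrative choice, not a SwiftUI requirement.</p>
<pre><code class="language-swift">TextField("Name", text: $input)
    .onChange(of: input) { oldValue, newValue in
        // Runs on every keystroke: enforce constraints here
        if newValue.count &gt; 10 { input = oldValue }
    }
    .onSubmit {
        // Runs only when the user presses the submit key:
        // one-time side effects, like trimming, fit here
        input = input.trimmingCharacters(in: .whitespaces)
    }</code></pre>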


  




&nbsp;&nbsp;
  
  <h4>Using TextField validation to enable and disable Buttons (e-mail validation)</h4><p class="">Another common requirement in interfaces is to tie the interactivity of a button (whether it’s active or not) to the validity of the <strong><em>TextField</em></strong>’s input. For example, if a <strong><em>TextField</em></strong> takes in an e-mail address, you may want to allow the end user to save the input only if the value contains the <code>@</code> sign and, at least one character after it, a <code>.</code> symbol. With version <a href="https://www.swift.org/blog/swift-5.7-released/" title="Swift 5.7 Release Notes"><span><strong><em>5.7</em></strong></span></a>, Swift introduced dedicated types for <em>regular expressions</em>, or <em>regexes</em>. They also added a <a href="https://developer.apple.com/documentation/regexbuilder" title="Apple Documentation - Regex Builder"><span><strong><em>framework</em></strong></span></a> you can use to build regexes in a more human readable way. In prior Swift versions, you would use the String struct’s <a href="https://developer.apple.com/documentation/swift/stringprotocol/range(of:options:range:locale:)" title="Swift Documentation - String.range() method"><span><strong><em>range()</em></strong></span></a> instance method. Regardless of the approach you choose, the mechanism remains the same: you build a regular expression using one of the rules in the table below, then try to match the input in the TextField against the regex.</p>


  





  
    <table>
  <tr>
    <th>Matching API</th>
    <th>Example for e-mail validation</th>
    <th>Expression  Meaning</th>
    <th>Rules for regex</th>
  </tr>
  <tr>
    <td wrap="soft"><code>range(of:options:range:locale:)</code></td>
    <td><span><code>"^[A-Z0-9a-z._%+-]+@[A-Za-z0-9.-]+.[A-Za-z]{2,}$"</code></span></td>
    <td>Starts with (<code>^</code>) one or more group of characters (<code>[A-Z0-9a-z.%+-]+</code>), then the <code>@</code> symbol, followed by one or more groups of alphanumeric characters, as well as <code>.</code> or <code>-</code> (<code>[A-Za-z0-9.-]+</code>), a <code>.</code> and two or more uppercase or lowercase letters. The <code>$</code> sign essentially indicates that the input needs to end in a pattern that matches <code>.[A-Za-z]{2,}</code>. </td>
    <td><a href="https://unicode-org.github.io/icu/userguide/strings/regexp.html">International Components of Unicode Regex rules</a></td>
  </tr>
  <tr>
    <td><code>.firstMatch(of:)</code></td>
    <td><span><code>/^[A-Z0-9a-z._%+-]+@[A-Za-z0-9.-]+.[A-Za-z]{2,}$/</code></span></td>
    <td>Same as above</td>
    <td>Same as above</td>
  </tr>  
  <tr>
    <td><code>Regex {…}</code></td>
    <td><span><code>Regex {/^/;OneOrMore { CharacterClass(.anyOf("._%+-"),("A"..."Z"),("0"..."9"),("a"..."z"))};"@";OneOrMore {CharacterClass(.anyOf(".-"),("A"..."Z"),("a"..."z"),("0"..."9"))};/./;Repeat(2...) {CharacterClass(("A"..."Z"),("a"..."z"))};/$/}</code></span></td>
    <td>The <code>/^/</code> group of symbols indicates the line needs to start with the first classifier that succeeds it (in this case, <code>OneOrMore</code>). Conversely, the <code>/$/</code> set of symbols requires the line to end with the classifier that precedes it (in this case, the <code>Repeat(2…)</code> block).</td>
    <td><a href="https://developer.apple.com/documentation/regexbuilder">Regex Builder Documentation</a></td>
  </tr>
</table>
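  <p class="">Before wiring a regex into a view, it can help to try the first row’s approach on its own in a playground. The snippet below is a small standalone sketch; the <code>isValidEmail</code> helper name is ours, not Apple’s.</p>
<pre><code class="language-swift">import Foundation

// Returns true when the whole input matches the ICU e-mail pattern from the table
func isValidEmail(_ input: String) -&gt; Bool {
    input.range(of: "^[A-Z0-9a-z._%+-]+@[A-Za-z0-9.-]+.[A-Za-z]{2,}$",
                options: .regularExpression) != nil
}</code></pre>
  <p class="">Calling <code>isValidEmail("josh.tea@prudentleap.com")</code> returns <strong>true</strong>, while an input without an <code>@</code> sign returns <strong>false</strong>.</p>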


  


  
  <p class="">In the example below, the <strong><em>isEmailValid</em></strong> computed property returns <strong>true</strong> if <strong><em>input</em></strong> matches the regex, or <strong>false</strong> if it does not. The commented regexes are equivalent to the uncommented one. Then, we use the <code>.disabled</code> modifier to determine if the button is <em>active</em>. Since <strong><em>isEmailValid</em></strong> returns <strong>true</strong> for a valid e-mail, we use the negation operator <code>!</code> to reverse the value (<code>.disabled</code> is <strong>true</strong> when <strong><em>isEmailValid</em></strong> is <strong>false</strong>).</p><p class="">Alternatively, the value of <strong><em>isEmailValid</em></strong> could have been re-evaluated in a <code>.onChange</code> modifier closure.</p>


  




<pre><code class="language-swift">import SwiftUI
import RegexBuilder

struct Regexes: View {
    @State var input:String = ""
    var isEmailValid: Bool {
        // ^ = starts with, $ = ends with, + after ] or } = one or more, * after ] or } = zero or more
//        input.range(of: "^[A-Z0-9a-z._%+-]+@[A-Za-z0-9.-]+.[A-Za-z]{2,}$", options: .regularExpression) != nil
//        input.firstMatch(of: /^[A-Z0-9a-z._%+-]+@[A-Za-z0-9.-]+.[A-Za-z]{2,}$/) != nil
        input.firstMatch(of: Regex {/^/
            OneOrMore {
                CharacterClass(
                    .anyOf("._%+-"),
                    ("A"..."Z"),
                    ("0"..."9"),
                    ("a"..."z")
                )}
            "@"
            OneOrMore {CharacterClass(
                .anyOf(".-"),("A"..."Z"),("a"..."z"),("0"..."9"))}
            /./
            Repeat(2...) {CharacterClass(("A"..."Z"),("a"..."z"))}
            /$/
        }) != nil
    }

    var body: some View {
        
        TextField("e-mail address", text: $input)
        Button("Submit"){
            
        }.disabled(!isEmailValid)
        
        
    }
}</code></pre>












































  

    
  
    

      

      
        <figure class="
              sqs-block-image-figure
              intrinsic
            "
        >
          
        
        

        
          
            
              
              
          
            
                
                
                
                
                
                
                
                <img data-stretch="false" data-image="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/e7aad46f-2912-41f2-85f8-b376492e402b/TextFieldValidation.webp" data-image-dimensions="1480x1378" data-image-focal-point="0.5,0.5" alt="" data-load="false" elementtiming="system-image-block" src="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/e7aad46f-2912-41f2-85f8-b376492e402b/TextFieldValidation.webp?format=1000w" width="1480" height="1378" sizes="(max-width: 640px) 100vw, (max-width: 767px) 100vw, 100vw" onload="this.classList.add(&quot;loaded&quot;)" srcset="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/e7aad46f-2912-41f2-85f8-b376492e402b/TextFieldValidation.webp?format=100w 100w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/e7aad46f-2912-41f2-85f8-b376492e402b/TextFieldValidation.webp?format=300w 300w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/e7aad46f-2912-41f2-85f8-b376492e402b/TextFieldValidation.webp?format=500w 500w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/e7aad46f-2912-41f2-85f8-b376492e402b/TextFieldValidation.webp?format=750w 750w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/e7aad46f-2912-41f2-85f8-b376492e402b/TextFieldValidation.webp?format=1000w 1000w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/e7aad46f-2912-41f2-85f8-b376492e402b/TextFieldValidation.webp?format=1500w 1500w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/e7aad46f-2912-41f2-85f8-b376492e402b/TextFieldValidation.webp?format=2500w 2500w" loading="lazy" decoding="async" data-loader="sqs">

            
          
        
            
          
        

        
          
          <figcaption class="image-caption-wrapper">
            <p data-rte-preserve-empty="true"><em>Disabled (left) and Enabled (right) Button, based on Input Validation</em></p>
          </figcaption>
        
      
        </figure>
      

    
  


  


&nbsp;
  
  <p class=""><span class="sqsrte-text-color--white"><em>Although </em><strong><em>RegexBuilder</em></strong><em> seems more human readable at first, especially when spaced properly across new lines, I would </em><strong><em>strongly</em></strong><em> recommend you become familiar with the C-style </em><strong><em>Unicode</em></strong><em> version. RegexBuilder is unique to Apple, whereas Unicode regexes are almost universal. In other words, Unicode regexes are transferable. Additionally, Unicode regexes are a lot more succinct.</em></span></p>


  




&nbsp;&nbsp;
  
  <h4>Changing input using Formatters</h4><p class="">There may be cases where, for aesthetic or clarity reasons, you need to modify your end-users’ input for display. For example, if the input represents a <em>phone number</em>, you may want to surround the first three digits with <em>parentheses</em>, to represent the <em>Area Code</em>, then set off the next three digits with a space or <code>-</code> symbol to represent the <em>central office code</em>. This is work your application should do, not the user. In other words, your end user would type <code>1231231234</code>, but the phone number would be displayed as <code>(123) 123 - 1234</code>. There are other similar examples, such as credit card numbers, dates, numbers with various decimal places and so on.</p><p class="">Apple provides several tools you can use to rearrange data to be displayed in text form. One of the older such tools is the <a href="https://developer.apple.com/documentation/foundation/formatter" title="Foundation - Formatter class"><span><strong><em>Formatter</em></strong></span></a> class, with its many subclasses, included in the Foundation framework since <strong>iOS 2.0</strong>. Since it’s a legacy component, it has both an <em>Objective-C</em> and a <em>Swift</em> version, and you can use either of the two. With <strong>iOS 15</strong>, Apple introduced a redesigned, Swift-native replacement for the <strong><em>Formatter</em></strong> class, in the form of the <a href="https://developer.apple.com/documentation/foundation/formatstyle" title="Foundation - FormatStyle Protocol"><span><strong><em>FormatStyle</em></strong></span></a>, <a href="https://developer.apple.com/documentation/foundation/parseableformatstyle" title="Foundation - ParseableFormatStyle Protocol"><span><strong><em>ParseableFormatStyle</em></strong></span></a> and <a href="https://developer.apple.com/documentation/foundation/parsestrategy" title="Foundation - ParseStrategy"><span><strong><em>ParseStrategy</em></strong></span></a> protocols.</p><p class="">Both the old and the new formatting tools function on similar principles. They provide a way for you to describe how to transform text input to a different concrete type and vice versa. They can also be passed as arguments to the <strong><em>TextField</em></strong> control initializer. You can pass a <strong><em>Formatter</em></strong> subclass as an argument to <code>TextField(_ titleKey: LocalizedStringKey, value: Binding&lt;V&gt;, formatter: Formatter)</code>, or a <strong><em>FormatStyle</em></strong> as an argument to <code>TextField(_ titleKey: LocalizedStringKey, value: Binding&lt;F.FormatInput&gt;, format: F)</code>.</p><p class="">Although tutorials often demonstrate the functionality by converting between string formats, their purpose is broader. <em>Formatters</em>, together with <em>FormatStyles</em>, represent a mechanism to convert (or format) any type of data (ranging from currency, dates and textual sentences to complex objects) into a <em>localized string for presentation</em>, and to parse user-entered text back into those concrete types. Their main advantage is that, under the hood, they interact with other APIs on Apple devices. As Apple introduces new localization and formatting capabilities, they become available to your application with no additional effort on your part.</p><p class="">To demonstrate both approaches in a SwiftUI <strong><em>TextField</em></strong> control, we are going to assume our application runs on a device set to the US region and that the phone numbers we are going to store are typed by a user in the US, in the domestic format <code>(123) 123 - 1234</code>. We are not going to handle the <code>+</code> symbol.</p>


  




&nbsp;&nbsp;
  
  <p class=""><span class="sqsrte-text-color--white"><em>If formatting the phone numbers is a core function of your application, it would be useful to choose the formatting rules based on the device’s settings. This way, you make sure the formatting is aligned to your end user’s customs. To do so, you would create a more complex data model and you could, for example, base your formatting decision on the value of </em></span><code><span class="sqsrte-text-color--white">Locale.current.region</span></code><span class="sqsrte-text-color--white"><em>. For example:</em></span></p><p class="sqsrte-small"><span class="sqsrte-text-color--white">if Locale.current.region == .unitedStates {        // add formatting logic      }</span></p><p class=""><span class="sqsrte-text-color--white"><em>Correct and complete localization in general is a complex topic and is out of scope for now, even when it comes to the format of a phone number.</em></span></p>
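  <p class="">A minimal sketch of that idea, assuming a hypothetical <code>phoneMask(for:)</code> helper and an arbitrary fallback mask (<code>Locale.Region</code> requires iOS 16 or later):</p>
<pre><code class="language-swift">import Foundation

// Hypothetical helper: choose a formatting mask based on the device region
func phoneMask(for region: Locale.Region?) -&gt; String {
    switch region {
    case .unitedStates:
        return "(###) ### - ####"   // US domestic format
    default:
        return "##########"         // assumed fallback: plain digits
    }
}

// At the call site, you would pass Locale.current.region
</code></pre>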


  




&nbsp;&nbsp;
  
  <p class="">The “legacy” approach relies on subclassing (inheriting) the <strong><em>Formatter</em></strong> class and on <em>overriding</em> a set of methods that get called when SwiftUI performs specific actions (when loading the view or when the bound value changes). You would essentially define <em>how the data should be shown</em> (by overriding the <a href="https://developer.apple.com/documentation/foundation/formatter/string(for:)" title="Formatter - string method"><span><strong><em>string</em></strong></span></a> method) and <em>how the underlying data should be saved</em> (by overriding the <a href="https://developer.apple.com/documentation/foundation/formatter/getobjectvalue(_:for:errordescription:)" title="Formatter - getObjectValue method"><span><strong><em>getObjectValue</em></strong></span></a> method).</p><p class="">A common approach uses a <em>mask</em>. First, you choose a placeholder (or control) symbol to represent a phone digit (for example, <code>#</code>), then lay out the structure of the formatted result. For a phone number, you could use <code>(###) ### - ####</code> as the mask. You can then write a simple method that parses the mask and replaces each placeholder with the appropriate digit from your end-user’s input. The snippet below represents the starting point.</p><p data-rte-preserve-empty="true" class=""></p>


  




<pre><code class="language-swift">class PhoneFormatter: Formatter {
    let mask = "(###) ### - ####"
    

    override func string(for obj: Any?) -&gt; String? {
    // Converts the raw digits to the formatted version (for display)
    }
    

    override func getObjectValue(_ obj: AutoreleasingUnsafeMutablePointer&lt;AnyObject?&gt;?, for string: String, errorDescription error: AutoreleasingUnsafeMutablePointer&lt;NSString?&gt;?) -&gt; Bool {
    // Converts the formatted string to raw digits (for the Binding)
    }
    
    func format(numbers: String) -&gt; String {
    // Although you could put the formatting code inside the string method, it is customary to put it in a dedicated method.
    }
}</code></pre>


  
  <p data-rte-preserve-empty="true" class=""></p><p class="">Since it’s straightforward, we can start with the <code>getObjectValue</code> method. As shown in the comment, its purpose is to take whatever input the end-user provides and convert it to raw digits. The <strong><em>string</em></strong> parameter represents the <em>input</em> that needs to be converted (the text displayed in the <strong><em>TextField</em></strong>), and the <strong><em>obj</em></strong> parameter is a reference to the destination object (in this case, the <strong><em>TextField</em></strong>’s <em>Binding</em>). This is required because the end user can paste an already formatted phone number, so we need to be able to convert it to the underlying storage format (in this case, <em>digits</em>). The snippet below represents the updated method. Note that the commented version uses the <code>string.components</code> method, which avoids regular expressions entirely; it is a useful alternative if you prefer not to maintain a regex.</p><p data-rte-preserve-empty="true" class=""></p>


  




<pre><code class="language-swift">class PhoneFormatter: Formatter {
    //...
    

    override func getObjectValue(_ obj: AutoreleasingUnsafeMutablePointer&lt;AnyObject?&gt;?, for string: String, errorDescription error: AutoreleasingUnsafeMutablePointer&lt;NSString?&gt;?) -&gt; Bool {
    //    obj?.pointee = string.components(separatedBy: CharacterSet.decimalDigits.inverted).joined() as AnyObject
        obj?.pointee = string.replacingOccurrences(of: "[^0-9]", with: "", options: .regularExpression) as AnyObject
        return true
    }
    //...
}</code></pre>
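  <p class="">As a quick standalone check of the digit-stripping logic, rewritten as a free function for illustration:</p>
<pre><code class="language-swift">import Foundation

// Strips every non-digit character, mirroring the getObjectValue implementation above
func rawDigits(_ string: String) -&gt; String {
    string.replacingOccurrences(of: "[^0-9]", with: "", options: .regularExpression)
}

// rawDigits("(123) 123 - 1234") returns "1231231234"
</code></pre>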


  
  <p data-rte-preserve-empty="true" class=""></p><p class="">The <code>string</code> method’s purpose is the opposite of <code>getObjectValue</code>. It’s meant to take the raw value (the <em>digits</em> stored in the TextField binding) and convert it to the formatted value, matching the <strong><em>mask</em></strong> we defined previously. First, we need to get a <strong><em>String</em></strong> from the <strong><em>obj</em></strong> parameter. To do so, we could <em>cast</em> it using a <em>type cast operator</em> (<code>as?</code> if the cast may fail or <code>as!</code> if the cast will surely succeed). This changes the type of the <strong><em>obj</em></strong> parameter from the existential type <strong><em>Any</em></strong> to a concrete type, without making changes to the underlying data. It is generally faster and, if you use the optional cast operator <code>as?</code>, it is safe in Swift, though it may not be in other languages. Since the <strong><em>TextField</em></strong>’s <em>Binding</em> is a <strong><em>String</em></strong>, and we are going to use the formatter to work with that binding, this mechanism suffices. Alternatively, we could <em>convert</em> <strong><em>obj</em></strong> to a <strong><em>String</em></strong>, for example using a <strong><em>String</em></strong> initializer (<code>String(data:encoding:)</code>). This would actually perform a conversion, making it generally safer to use if there’s no way to predict the type of the value passed to the Formatter.</p><p class="">Then, we can use a separate method (in this example, we call it <strong><em>format</em></strong>) to do the actual processing. In the end, this method needs to <em>return the converted value</em> (the formatted phone number).</p><p data-rte-preserve-empty="true" class=""></p>


  




<pre><code class="language-swift">class PhoneFormatter: Formatter {
    //...
    
    override func string(for obj: Any?) -&gt; String? {
        guard let string = obj as? String else { return nil }
//        guard let string = String(data: obj as! Data, encoding: .utf8)  else { return nil }

        return format(numbers: string)
    }
    //...
}</code></pre>


  
  <p data-rte-preserve-empty="true" class=""></p><p class="">Due to the way the <a href="https://developer.apple.com/documentation/Swift/String" title="Swift Documentation - The String Struct"><span><strong><em>String</em></strong></span></a> struct is <a href="https://github.com/swiftlang/swift/blob/main/stdlib/public/core/String.swift" title="Swift Github Project - String struct"><span><strong><em>implemented</em></strong></span></a>, you can iterate through the characters of <strong><em>String</em></strong> variables just as you would a Swift Array. Because of this detail, the formatting mask problem is, essentially, a problem of operations over arrays:</p><ul data-rte-list="default"><li><p class="">The <strong><em>mask</em></strong> <strong><em>String</em></strong> is, in essence, the array <code>['(', '#', '#', '#', ')', ' ', '#', '#', '#', ' ', '-', ' ', '#', '#', '#', '#'] </code></p></li><li><p class="">A full phone number entered in the input field is represented by the array <code>['1', '2', '3', '1', '2', '3', '1', '2', '3', '4']</code></p></li></ul><p class="">To apply the mask, you iterate through its elements and, when a control character is encountered, you replace it with the next element of the phone number array. Note that, because the mask contains additional characters compared to the phone number, the indexes of the corresponding elements in each array will not match. Therefore, you need to keep a separate index into the phone number while iterating through the mask String. The snippet below represents the <code>format</code> function.</p><p data-rte-preserve-empty="true" class=""></p>


  




<pre><code class="language-swift">class PhoneFormatter: Formatter {
    //...
    func format(numbers: String) -&gt; String {
        var result = ""
        var index = numbers.startIndex
        for char in mask {
            if index &gt;= numbers.endIndex { break }
            if char == "#" {
                result.append(numbers[index])
                index = numbers.index(after: index)
            } else {
                result.append(char)
            }
        }
        return result
    }
    //...
}</code></pre>
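  <p class="">As a quick sanity check of the masking logic (using a free-function copy of the method above, handy for experimenting in a playground), note that partial input yields a partially applied mask:</p>
<pre><code class="language-swift">// Free-function copy of PhoneFormatter.format, for illustration
func applyMask(_ mask: String, to numbers: String) -&gt; String {
    var result = ""
    var index = numbers.startIndex
    for char in mask {
        if index &gt;= numbers.endIndex { break }
        if char == "#" {
            result.append(numbers[index])
            index = numbers.index(after: index)
        } else {
            result.append(char)
        }
    }
    return result
}

// applyMask("(###) ### - ####", to: "1231231234") returns "(123) 123 - 1234"
// applyMask("(###) ### - ####", to: "123") returns "(123" — the mask stops with the input
</code></pre>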

&nbsp;&nbsp;
  
  <p class=""><span class="sqsrte-text-color--white"><em>Because </em></span><code><span class="sqsrte-text-color--white">String.Index</span></code><span class="sqsrte-text-color--white"><em> is not an integer, you cannot advance it with </em></span><code><span class="sqsrte-text-color--white">index += 1</span></code><span class="sqsrte-text-color--white"><em>; the </em></span><code><span class="sqsrte-text-color--white">index(after:)</span></code><span class="sqsrte-text-color--white"><em> method is the idiomatic way to move to the next character. The out of bounds check is performed in the conditional </em></span><code><span class="sqsrte-text-color--white">if index &gt;= numbers.endIndex { break }</span></code><span class="sqsrte-text-color--white"><em>.</em></span></p>


  




&nbsp;&nbsp;
  
  <p class="">You can then use the <strong><em>PhoneFormatter</em></strong> class in a regular <strong><em>TextField</em></strong> SwiftUI control, as shown in the snippet below.</p><p data-rte-preserve-empty="true" class=""></p>


  




<pre><code class="language-swift">import SwiftUI

struct SampleWithFormatter: View {
    @State var input: String = ""
    var body: some View {
        TextField("Phone Number", value: $input, formatter: PhoneFormatter()).padding().glassEffect().padding(30)
        Text(input).padding().glassEffect()
    }
}

#Preview {
    SampleWithFormatter()
}</code></pre>


  
  <p data-rte-preserve-empty="true" class=""></p><p class="">If you test this out, you will quickly discover that the text is <em>not</em> reformatted as you type. This is because SwiftUI does not call the formatter’s <code>string</code> method until <em>the view goes out of focus</em>. This means your end-user can input a lot more than 10 digits with no visual feedback until they either submit the new value or select another control. In the section <a href="#changing-input-through-an-intermediate-binding" title="Changing input through an intermediate binding"><span><strong><em>Changing input through an intermediate binding</em></strong></span></a>, we are going to explore a potential workaround for this.</p>


  




&nbsp;&nbsp;
  
  <p class=""><span class="sqsrte-text-color--white"><em>As you find new ways to accomplish specific tasks, you may find it valuable to properly test the implementation’s limitations, if you have the time and energy. Even though I am trying to outline the main limitations of each approach, you may encounter limitations in use cases I may not have considered. This is why, in software development, experience is still more valuable than pure syntactic knowledge.</em></span></p>


  




&nbsp;&nbsp;
  
  <h4>More examples of format functions</h4><p class="">To expand on the idea of the formatting functions presented in the previous section, for more complex types of input, you could also use more control characters. For example, you could use one control character to indicate an uppercase letter (<code>C</code>) and another to indicate a lowercase letter (<code>c</code>). In that case, a switch statement is likely the better fit. The snippet below represents an example. It would take the input <code>abcdef</code> and turn it into <code>abc-DEF</code>.</p><p data-rte-preserve-empty="true" class=""></p>


  




<pre><code class="language-swift">func format(input: String) -&gt; String {
    var result = ""
    var index = input.startIndex
    let mask = "ccc-CCC"
    for char in mask {
        if index &gt;= input.endIndex { break }
        switch char {
        case "c":
            result.append(input[index].lowercased())
            index = input.index(after: index)
        case "C":
            result.append(input[index].uppercased())
            index = input.index(after: index)
        default:
            result.append(char)
        }
    }
    return result
}</code></pre>


  
  <p data-rte-preserve-empty="true" class=""></p><p class="">A less flexible (but perhaps more concise) option would be to use <em>regexes</em>. Instead of using a mask, you would capture groups of digits, then use them to construct the number. The snippet below represents one such implementation. Notice how, as a fallback, if the regex fails, you should return the unmodified input.</p><p data-rte-preserve-empty="true" class=""></p>


  




<pre><code class="language-swift">class PhoneFormatter: Formatter {
    //...
    func format(numbers: String) -&gt; String {
        guard let matched = numbers.firstMatch(of: /(\d{0,3})(\d{0,3})(\d{0,4})$/) else {
            return numbers
        }
        return "(\(matched.1)) \(matched.2) - \(matched.3)"
    }
    //...
}</code></pre>


  
  <p data-rte-preserve-empty="true" class=""></p><p class="">You would often use regexes in the pattern shown above when you need to operate on specific portions of a string. For example, you may want to capture the domain of an e-mail and capitalize it. In that case, the function would look similar to the snippet below.</p><p data-rte-preserve-empty="true" class=""></p>


  




<pre><code class="language-swift">func format(input: String) -&gt; String {
    guard let matched = input.firstMatch(of: /^.*@(\w+).*$/) else {
        return input
    }
    return "\(matched.1.uppercased())"
}
//print(format(input: "josh.tea@prudent.leap.com"))
//PRUDENT
//print(format(input: "josh.tea@prudentleap.com"))
//PRUDENTLEAP</code></pre>

&nbsp;&nbsp;
  
  <p class=""><span class="sqsrte-text-color--white"><em>As seen above, regular expressions are powerful. However, they have also been the source of numerous service outages throughout the years. You should always validate regexes carefully, and you should make them as restrictive as you possibly can.</em></span></p>


  




&nbsp;&nbsp;
  
  <h4>Changing input using FormatStyle, ParseStrategy and ParseableFormatStyle</h4><p class="">Apple’s more modern formatting APIs rely on a set of protocols, instead of the <strong><em>Formatter</em></strong> class, to accomplish the same functionality in a more concise way. A SwiftUI <strong><em>TextField</em></strong> using this approach has the same limitations as one using a <strong><em>Formatter</em></strong>, as the view lifecycle does not change.</p><ul data-rte-list="default"><li><p class=""><a href="https://developer.apple.com/documentation/foundation/formatstyle" title="Foundation - FormatStyle Protocol"><span><strong><em>FormatStyle</em></strong></span></a>, which ensures that conforming types take an input of an underlying type and convert it to a formatted output. Conforming types implement a <code>format(_ value: FormatInput) -&gt; FormatOutput</code> method, which is the equivalent of the <code>string</code> method in the <strong><em>Formatter</em></strong> class.</p></li><li><p class=""><a href="https://developer.apple.com/documentation/foundation/parsestrategy" title="Foundation - ParseStrategy"><span><strong><em>ParseStrategy</em></strong></span></a>, which ensures that conforming types take a formatted value and convert it back to the underlying data type. Conforming types implement a <code>parse(_ value: ParseInput) throws -&gt; ParseOutput</code> method, which is the equivalent of the <code>getObjectValue</code> method in the <strong><em>Formatter</em></strong> class.</p></li><li><p class=""><a href="https://developer.apple.com/documentation/foundation/parseableformatstyle" title="Foundation - ParseableFormatStyle Protocol"><span><strong><em>ParseableFormatStyle</em></strong></span></a>, which acts as a wrapper over <strong><em>FormatStyle</em></strong> and <strong><em>ParseStrategy</em></strong>. Conforming types implement a <strong><em>parseStrategy</em></strong> read-only computed property that instantiates a ParseStrategy, as well as a <code>format</code> method.</p></li><li><p class="">To make the style easier to use, you can extend the <strong><em>FormatStyle</em></strong> base protocol with <em>a static accessor</em>. This is known as <a href="https://github.com/swiftlang/swift-evolution/blob/main/proposals/0299-extend-generic-static-member-lookup.md#extending-static-member-lookup-in-generic-contexts" title="Swift Evolution - static member lookup evolution proposal (Github)"><span><strong><em>static member lookup in generic contexts</em></strong></span></a>, and it was introduced in Swift 5.5.</p></li></ul><p class="">The snippet below provides the same formatting capabilities as the previous Formatter-based example, but in a more modern way. For modern code bases, it should be the preferred implementation.</p><p data-rte-preserve-empty="true" class=""></p>


  




<pre><code class="language-swift">struct PhoneNumberStyle: ParseableFormatStyle {
    var parseStrategy: PhoneParseStrategy {
        PhoneParseStrategy()
    }
    
    func format(_ value: String) -&gt; String {
        let mask = "(###) ### - ####"
        var result = ""
        var index = value.startIndex
        for char in mask {
            if index &gt;= value.endIndex { break }
            if char == "#" {
                result.append(value[index])
                index = value.index(after: index)
            } else {
                result.append(char)
            }
        }
        return result
    }
}


struct PhoneParseStrategy: ParseStrategy {
    func parse(_ value: String) throws -&gt; String {
        return value.replacingOccurrences(of: "[^0-9]", with: "", options: .regularExpression)
    }    
}

extension FormatStyle where Self == PhoneNumberStyle {
    static var phoneNumber: PhoneNumberStyle { PhoneNumberStyle() }
}


struct SampleWithFormatter: View {
    @State var input: String = ""
    var body: some View {
        TextField("Phone Number", value: $input, format: .phoneNumber).padding().glassEffect().padding(30)
        TextField("Sample", text: $input).padding().glassEffect().padding(30)
        Text(input).padding().glassEffect()
    }
}</code></pre>
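<p class="">To see the round trip without any UI involved, you can exercise the style and its parse strategy from the snippet above directly. The expected values in the comments follow from the mask logic shown above:</p>
<pre><code class="language-swift">let style = PhoneNumberStyle()

// Apply the "(###) ### - ####" mask to a raw digit string.
let formatted = style.format("5551234567")
// formatted == "(555) 123 - 4567"

// Strip every non-digit character to recover the raw value.
let digits = try! style.parseStrategy.parse(formatted)
// digits == "5551234567"</code></pre>
<p class="">This is also a convenient shape for unit tests, since neither call depends on a view hierarchy.</p>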

&nbsp;
  
  <h4>Changing input through an intermediate binding</h4><p class="">There are other ways you can obtain a similar effect to formatters. As an example, you could use a <em>computed property that wraps a binding</em> to handle the translation logic. This would, essentially, act as a proxy between the TextField and the state property it should control. The mechanisms remain essentially the same:</p><ul data-rte-list="default"><li><p class="">A mechanism to take the input and change it to an underlying, cleaned-up format. In the case of a binding wrapped in a computed property, we can use the <code>set</code> closure.</p></li><li><p class="">A mechanism to provide the formatted value, when SwiftUI needs to display the updated view. Here, this is the function of the <code>get</code> closure.</p></li></ul><p class="">The snippet below showcases this type of implementation. Notice how the <strong><em>phoneFilterBinding</em></strong> computed property itself is a <em>read-only property</em> and it <em>implicitly</em> returns a <strong><em>Binding</em></strong> via the <code>Binding(get: () -&gt; Value, set: (Value) -&gt; Void)</code> initializer.</p><p data-rte-preserve-empty="true" class=""></p>


  




<pre><code class="language-swift">struct SampleWithBinding: View {
    @State var input: String = ""
    private var phoneFilterBinding : Binding&lt;String&gt; {
        Binding(
            get: {
                let mask = "(###) ### - ####"
                var result = ""
                var index = input.startIndex
                for char in mask {
                    if index &gt;= input.endIndex { break }
                    if char == "#" {
                        result.append(input[index])
                        index = input.index(after: index) 
                    } else {
                        result.append(char)
                    }
                }
                return result
            },
            set: { newValue in
                input = newValue.replacingOccurrences(of: "[^0-9]", with: "", options: .regularExpression)
            }
        )
    }
    var body: some View {
        TextField("Phone Number", text: phoneFilterBinding, prompt: Text("(###) ### - ####").foregroundStyle(.background))
        .padding().glassEffect().padding(30)
    }
}</code></pre>
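<p class="">The same proxy pattern works for any transformation, not just phone masks. As a minimal sketch (the type and property names below are hypothetical, chosen for illustration), the binding here ensures the text stored in state is always uppercase, regardless of what is typed:</p>
<pre><code class="language-swift">struct UppercaseFieldSample: View {
    @State private var name = ""

    // A proxy binding: reads pass through unchanged, while writes
    // are uppercased before they reach the underlying state property.
    private var uppercaseBinding: Binding&lt;String&gt; {
        Binding(
            get: { name },
            set: { name = $0.uppercased() }
        )
    }

    var body: some View {
        TextField("Name", text: uppercaseBinding)
        Text(name)
    }
}</code></pre>
<p class="">Because the <code>get</code> closure simply reads the state property, this variant does not suffer from the off-by-a-few-characters bug discussed next; the divergence only appears when <code>get</code> returns something different from what <code>set</code> stored.</p>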


  
  <p data-rte-preserve-empty="true" class=""></p><p class="">This approach has the same limitations as all of the previous ones, but it introduces an additional bug. The mask is applied to the <em>computed property</em>, rather than directly to the <em>state property</em>. This means the state property will temporarily hold one or two additional characters, until you either submit the value or another view comes into focus. </p><p class="">However, this can be fixed. The <strong><em>TextField</em></strong> view automatically updates when its text binding changes. Since the <strong><em>TextField</em></strong> is bound to a <em>computed property</em> instead of the actual state property, any updates done to the <strong><em>input</em></strong> <em>state property</em> will force the <strong><em>TextField</em></strong> view to update.</p><p class="">As a reminder, when a view displays a computed property, <em>any update to the value it’s based on will invalidate the view, forcing a view update</em>. When the view updates, it re-evaluates the computed property. </p><p class="">We can leverage this behavior by using the <code>.onChange(of:)</code> modifier to react to changes to the <strong><em>input</em></strong> state property’s value. Any assignment to <strong><em>input</em></strong> would invalidate the view, which will cause the computed property to be re-evaluated. The most obvious operation to perform there is to limit the number of characters <strong><em>input</em></strong> can take, via the <code>String.prefix</code> method, as shown in the snippet below.</p><p data-rte-preserve-empty="true" class=""></p>


  




<pre><code class="language-swift">TextField("Phone Number", text: phoneFilterBinding, prompt: Text("(###) ### - ####").foregroundStyle(.background))
    .onChange(of: input) { _, newValue in // the closure receives (oldValue, newValue)
        input = String(newValue.prefix(10))
    }
    .padding().glassEffect().padding(30)</code></pre>

&nbsp;&nbsp;
  
  <p class=""><span class="sqsrte-text-color--white"><em>To clarify, the reason this works here, but does not work when you bind the TextField </em><strong><em>directly</em></strong><em> to the </em><strong><em>State</em></strong><em> property, is that the </em><strong><em>value of input is only changed on value submission</em></strong><em>. The value submission operation occurs implicitly when the view goes out of focus, or when the Submit operation is explicitly called (for example, by touching the submit button on the phone’s keyboard, or the return key on a Mac keyboard).</em></span></p>


  




&nbsp;&nbsp;
  
  <h3>The SecureField Control and its limitations</h3><p class="">Sometimes, your application needs to accept an end-user’s password, which is highly sensitive. It’s common practice to <em>replace</em> the characters your users type with <code>*</code> or with other symbols. Although you could apply one of the techniques we explored previously, Apple already provides a control you can use. In addition to masking characters and providing a quick link to Passwords as auto-fill options, the <em>SecureField</em> control <em>also completely blanks out its content when you take screenshots or screen recordings, and it blocks the ability to copy its contents</em>, among other security measures. The snippet below and the adjoining screenshots exemplify a typical use case. Notice how the SecureField’s redacted characters are visible in normal use (left) but are hidden when a phone screen grab is taken (right).</p><p data-rte-preserve-empty="true" class=""></p>


  




<pre><code class="language-swift">struct SecureFieldSample: View {
    @State private var password = ""
    var body: some View {
        Text("Login Form")
        SecureField("Password", text: $password)
            .textContentType(.newPassword)
            .padding()
            .glassEffect()
            .padding()
    }
}</code></pre>

&nbsp;










































  

    
  
    

      

      
        <figure class="
              sqs-block-image-figure
              intrinsic
            "
        >
          
        
        

        
          
            
              
              
          
            
                
                
                
                
                
                
                
                <img data-stretch="false" data-image="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/2f1119ce-00e6-49e1-9d8e-f42ac4038f22/SecureFieldScreenshot.webp" data-image-dimensions="1456x1368" data-image-focal-point="0.5,0.5" alt="" data-load="false" elementtiming="system-image-block" src="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/2f1119ce-00e6-49e1-9d8e-f42ac4038f22/SecureFieldScreenshot.webp?format=1000w" width="1456" height="1368" sizes="(max-width: 640px) 100vw, (max-width: 767px) 100vw, 100vw" onload="this.classList.add(&quot;loaded&quot;)" srcset="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/2f1119ce-00e6-49e1-9d8e-f42ac4038f22/SecureFieldScreenshot.webp?format=100w 100w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/2f1119ce-00e6-49e1-9d8e-f42ac4038f22/SecureFieldScreenshot.webp?format=300w 300w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/2f1119ce-00e6-49e1-9d8e-f42ac4038f22/SecureFieldScreenshot.webp?format=500w 500w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/2f1119ce-00e6-49e1-9d8e-f42ac4038f22/SecureFieldScreenshot.webp?format=750w 750w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/2f1119ce-00e6-49e1-9d8e-f42ac4038f22/SecureFieldScreenshot.webp?format=1000w 1000w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/2f1119ce-00e6-49e1-9d8e-f42ac4038f22/SecureFieldScreenshot.webp?format=1500w 1500w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/2f1119ce-00e6-49e1-9d8e-f42ac4038f22/SecureFieldScreenshot.webp?format=2500w 2500w" loading="lazy" decoding="async" data-loader="sqs">

            
          
        
            
          
        

        
          
          <figcaption class="image-caption-wrapper">
            <p data-rte-preserve-empty="true"><em>SecureField control, in normal use (left) and during a screen capture (right)</em></p>
          </figcaption>
        
      
        </figure>
      

    
  


  


&nbsp;
  
  <p class="">The control does have its own limitations, though:</p><ul data-rte-list="default"><li><p class="">At the time of writing, there is no built-in mechanism to show the password.</p></li><li><p class="">You cannot customize the redaction characters away from the default <code>*</code>.</p></li><li><p class="">If the control loses focus, it no longer supports in-place edits. In other words, any attempted changes will overwrite the existing text entirely.</p></li></ul><p class="">If needed, you could implement a variant of the show/hide function using a <em>state variable</em> to signify whether the password should be visible. If <strong>true</strong>, you would display a <strong><em>TextField</em></strong> and, if <strong>false</strong>, you would display a <strong><em>SecureField</em></strong>. The snippet below represents such an example.</p><p data-rte-preserve-empty="true" class=""></p>


  




<pre><code class="language-swift">struct SecureField_ShowPassword: View {
    @State var showPassword = false
    @State var password = ""
    var body: some View {
        HStack {
            if showPassword {
                TextField("Password", text: $password)
                    .padding()
            } else {
                SecureField("Password", text: $password)
                    .padding()
            }

            Button(action: {
                showPassword.toggle()
            }, label: {
                Image(systemName: showPassword ? "eye.slash" : "eye")
                    .foregroundStyle(showPassword ? .gray : .accentColor)
            })
        }
        .padding(.horizontal)
        .glassEffect()
        .padding(.horizontal, 40)
    }
}</code></pre>


  
  <p data-rte-preserve-empty="true" class=""></p><p class="">However, this implementation brings its own limitations:</p><ul data-rte-list="default"><li><p class="">Pressing the show/hide password button will cause the <strong><em>SecureField</em></strong> to re-render. As mentioned previously, the control loses in-place edit capabilities when it loses focus.</p></li><li><p class="">The <strong><em>TextField</em></strong> control is not blocked from screen capture; its content will be picked up by screenshots and screen recordings.</p></li></ul><p class="">The images below showcase the difference.</p>


  















































  

    
  
    

      

      
        <figure class="
              sqs-block-image-figure
              intrinsic
            "
        >
          
        
        

        
          
            
              
              
          
            
                
                
                
                
                
                
                
                <img data-stretch="false" data-image="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/9b775cc5-fdbb-4050-8fb3-20c73c8c8809/SecureFieldTextFieldWithScreenshotComparison.webp" data-image-dimensions="2680x1301" data-image-focal-point="0.5581585084554659,0.38879670397143296" alt="" data-load="false" elementtiming="system-image-block" src="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/9b775cc5-fdbb-4050-8fb3-20c73c8c8809/SecureFieldTextFieldWithScreenshotComparison.webp?format=1000w" width="2680" height="1301" sizes="(max-width: 640px) 100vw, (max-width: 767px) 100vw, 100vw" onload="this.classList.add(&quot;loaded&quot;)" srcset="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/9b775cc5-fdbb-4050-8fb3-20c73c8c8809/SecureFieldTextFieldWithScreenshotComparison.webp?format=100w 100w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/9b775cc5-fdbb-4050-8fb3-20c73c8c8809/SecureFieldTextFieldWithScreenshotComparison.webp?format=300w 300w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/9b775cc5-fdbb-4050-8fb3-20c73c8c8809/SecureFieldTextFieldWithScreenshotComparison.webp?format=500w 500w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/9b775cc5-fdbb-4050-8fb3-20c73c8c8809/SecureFieldTextFieldWithScreenshotComparison.webp?format=750w 750w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/9b775cc5-fdbb-4050-8fb3-20c73c8c8809/SecureFieldTextFieldWithScreenshotComparison.webp?format=1000w 1000w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/9b775cc5-fdbb-4050-8fb3-20c73c8c8809/SecureFieldTextFieldWithScreenshotComparison.webp?format=1500w 1500w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/9b775cc5-fdbb-4050-8fb3-20c73c8c8809/SecureFieldTextFieldWithScreenshotComparison.webp?format=2500w 2500w" loading="lazy" decoding="async" data-loader="sqs">

            
          
        
            
          
        

        
          
          <figcaption class="image-caption-wrapper">
            <p data-rte-preserve-empty="true"><em>SecureField and TextField differences for screenshots.</em></p>
          </figcaption>
        
      
        </figure>
      

    
  


  


&nbsp;
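<p class="">Before reaching for UIKit, it is worth noting a workaround commonly shared in the SwiftUI community for the first limitation only (the focus loss). The sketch below, with hypothetical type names, keeps both fields mounted in a <strong><em>ZStack</em></strong> and toggles their opacity, while a <code>@FocusState</code> property hands focus to whichever field just became visible. It does nothing for the screen-capture concern:</p>
<pre><code class="language-swift">struct SecureField_FocusPreserving: View {
    enum Field { case plain, secure }

    @State private var showPassword = false
    @State private var password = ""
    @FocusState private var focusedField: Field?

    var body: some View {
        HStack {
            // Both fields stay in the hierarchy; only one is visible.
            ZStack {
                TextField("Password", text: $password)
                    .focused($focusedField, equals: .plain)
                    .opacity(showPassword ? 1 : 0)
                SecureField("Password", text: $password)
                    .focused($focusedField, equals: .secure)
                    .opacity(showPassword ? 0 : 1)
            }
            Button {
                showPassword.toggle()
                // Hand focus to whichever field just became visible.
                focusedField = showPassword ? .plain : .secure
            } label: {
                Image(systemName: showPassword ? "eye.slash" : "eye")
            }
        }
        .padding()
    }
}</code></pre>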
  
  <p class="">If the first issue is more of a <em>nuisance</em>, the second is a <em>security concern</em>. We can mitigate both by using UIKit controls wrapped in a SwiftUI view, and we are going to explore this next. However, <em>just because a solution is technically possible or even feasible doesn’t mean it should be implemented</em>. If a control lacks the type of functionality you have in mind, you should always consider <em>why</em>. </p><p class="">In the case of UIKit’s <a href="https://developer.apple.com/documentation/uikit/uitextfield" title="UIKit Documentation - UITextField control"><span><strong><em>UITextField</em></strong></span></a>, the same control is used both for normal text and for passwords. The security features are all linked to the instance property <a href="https://developer.apple.com/documentation/uikit/uitextinputtraits/issecuretextentry" title="UIKit Documentation - isSecureTextEntry trait"><span><strong><em>isSecureTextEntry</em></strong></span></a>. If you were to implement the same show/hide functionality, you would set that property to false in order to show the password in plain text.</p><p class="">In the case of SwiftUI, Apple specifically chose to create two distinct controls. You would use <strong><em>SecureField</em></strong> for things you need to protect and <strong><em>TextField</em></strong> for anything else. </p><p class="">From a security standpoint, if the user can see the password, so can any potential bystanders. This type of threat is commonly known as <em>shoulder surfing</em>. Most likely, Apple specifically did not implement a show/hide password functionality for <strong><em>SecureField</em></strong> controls for this reason. Given this constraint, in-place edits could cause password mismatches, since you cannot visually confirm the changes. 
</p><p class="">A counter-argument to the <em>shoulder surfing</em> scenario is that the input mechanism itself (a keyboard on the screen) would pose a similar threat. This, however, is a false equivalence. A password displayed in plain text, even for a moment, is immediately and passively readable. Conversely, deciphering what someone is typing requires active, sustained effort. Having said that, this potential threat is likely one of the reasons Apple introduced Keychain and Password Manager integration for secure fields. By eliminating the need to physically type in the password, you remove the entire threat. Accessing secure features on the device using biometric authentication serves the same purpose. Not having to type or see the password makes it harder to leak information, even if the phone screen is accidentally captured by a CCTV camera. Ultimately, security is not a binary property, but rather a spectrum, and these features collectively push the experience towards the more secure end of that range, without sacrificing usability.</p>]]></content:encoded><media:content type="image/jpeg" url="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/1771797330054-NO0MMYAVKWV3VWOVL8Z9/unsplash-image-8YRWIy8QGpI.jpg?format=1500w" medium="image" isDefault="true" width="1500" height="1500"><media:title type="plain">Using and customizing Text Input Controls</media:title></media:content></item><item><title>From a CLI task to a run (main) loop</title><category>Intro to Apple</category><dc:creator>Samwise Prudent</dc:creator><pubDate>Sat, 04 Oct 2025 16:38:26 +0000</pubDate><link>https://www.prudentleap.com/prudent-protocol/2025/10/from-a-cli-task-to-a-run-main-loop</link><guid isPermaLink="false">6721f9295c6d593f58a1c57b:68d86448421a7f587b06ce19:68e14ab275f32945d74913f4</guid><description><![CDATA[In this short post, we are going to explore the basic mechanisms by 
which you can create a very basic application, which keeps running until 
you close it. The idea is to explore how a rudimentary loop can be used to 
control the lifecycle of an application. This technique is foundational to 
almost every application you would run and build, from iOS apps to browsers 
to video games.]]></description><content:encoded><![CDATA[<p class="">In the <a href="https://www.prudentleap.com/byte-the-apple/2025/10/exploring-swift-creating-a-simple-cli-tool" target="_blank">previous post</a>, we explored the process of creating a simple Swift Command Line Tool with one-time execution. It starts up, processes a task, then closes. This is a useful and very common pattern - but applications wouldn’t be very useful if they always closed after executing a single task. Any application with a <em>Graphical User Interface</em> runs until it’s told to stop or until it crashes - and the same applies to some server-side applications. We can change the way this application behaves by adding a simple <strong><em>event loop</em></strong> (also known as a <em>main loop</em> or <em>run loop</em>). In essence, the mechanism remains the same, regardless of the underlying frameworks (SwiftUI, UIKit, or technologies outside of Apple’s infrastructure):</p><ol data-rte-list="default"><li><p class="">Your application performs some setup activities, either in a serial, synchronous manner, or asynchronously and concurrently</p></li><li><p class="">Once the setup process is complete, the final instruction in the application’s entry point creates an infinite loop, with some very clear (and mandatory) exit/return/continue conditions. To see this in action, simply replace the content of the <strong><em>cli-example</em></strong> <strong><em>main.swift</em></strong> file with the snippet below:</p></li></ol>


  




<pre><code class="language-swift">print("Enter some text and I will repeat it, or enter 'q' to quit\n")

while true {
        print("what should I repeat?: ", terminator: "")
        
        guard let input = readLine()?.trimmingCharacters(in: .whitespacesAndNewlines) else {
            continue
        }
        
        // Check if user wants to quit
        if input.lowercased() == "q" {
            print("Goodbye!")
            break
        }
        
        // Skip empty input
        if input.isEmpty {
            print("Please enter some text or 'q' to quit\n")
            continue
        }
        
        print("Input was: \(input)")

}
</code></pre>


  
  <p class="">If you run the new executable, you essentially have a process with an event loop (in an overly simplified manner, this mimics how more complex applications work).</p>


  




<pre><code class="language-bash">$\&gt; swift run cli-example
Building for debugging...
[7/7] Applying cli-example
Build of product 'cli-example' complete! (0.81s)
Enter some text and I will repeat it, or enter 'q' to quit

what should I repeat?: Something
Input was: Something
what should I repeat?: Something else
Input was: Something else
what should I repeat?: 
Please enter some text or 'q' to quit

what should I repeat?: q
Goodbye!</code></pre>
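<p class="">Our <code>while true</code> loop blocks on <code>readLine()</code>; real GUI applications instead hand control to a framework-provided run loop that sleeps until an event arrives. As a minimal sketch of that hand-off (the timer and tick count below are illustrative, not part of the cli-example tool), the Foundation snippet schedules work on the main run loop and then blocks on it:</p>
<pre><code class="language-swift">import Foundation

var ticks = 0

// Setup phase: register an event source (here, a repeating timer).
Timer.scheduledTimer(withTimeInterval: 1.0, repeats: true) { timer in
    ticks += 1
    print("tick \(ticks)")
    if ticks == 3 {
        timer.invalidate()
        exit(0) // the only way out of the loop below
    }
}

// Final instruction: hand control to the main run loop.
// It sleeps until an event (the timer) needs servicing.
RunLoop.main.run()</code></pre>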


  
  <h3><br>A Swift Program - from source code to executable file</h3><p class="">When developing applications for Apple’s ecosystem, you would spend a good portion of your time writing Swift source code files. Sometimes, you may need to use Objective-C, C or C++, but even then, you would mostly write source code. While writing Swift source code, you use components (libraries) included in the Swift versions of Apple’s numerous frameworks - one of which is SwiftUI - and you ensure your code respects the requirements of Apple’s frameworks, <em>into which your code would need to integrate</em>. </p><p class="">Since Swift is a <em>compiled language</em>, when your program’s code is ready, you need to compile it. So far, we have seen how a simple Swift program is created by writing text into files - and then how Swift’s toolset transforms that text into another set of files, such as <em>object files</em>, <em>dynamic library files</em> and, importantly, the <em>executable file</em>. We have explored the main mechanisms you can use to <em>describe</em> an application, using <em>concepts</em> (<strong>symbols</strong> and <strong>abstractions</strong>) you find in the Swift programming language. We have then looked at the build process and how it translates those source files into binary object files, then bundles everything together into one <strong><em>Mach-O executable</em></strong> file you can use to execute the program. 
</p><p class="">Between the source code files and the final executable files, the LLVM toolset (Compiler, Assembler, Linker) performs various intermediary tasks:</p><ul data-rte-list="default"><li><p class="">Perform <em>Lexical</em> and <em>Syntactic</em> <em>Analysis</em> (this is where compilation would fail due to syntax errors);</p></li><li><p class="">Perform <em>Semantic</em> <em>Analysis</em> (for type checking);</p></li><li><p class="">Generate intermediary <em>LLVM IR</em> files (LLVM Intermediate Representation files, which are target-independent files written in the <a href="https://llvm.org/docs/LangRef.html#abstract" title="LLVM - Abstract"><span><strong><em>LLVM Assembly Language</em></strong></span></a>)</p></li><li><p class="">Generate platform-specific assembly language files</p></li><li><p class="">Generate the binary object files from the platform-specific assembly language files</p></li><li><p class="">Bundle everything into an executable file, usable by the Operating System’s kernel (in this particular case, the <strong>Mach-O executable file</strong> for the <strong>XNU kernel</strong>)</p></li></ul><p class="">More complex projects (such as an application with a GUI) would also include many other components, which are used while the application is running (from icons to pre-rendered sprites to <a href="https://developer.apple.com/library/archive/documentation/Cocoa/Conceptual/LoadingResources/Introduction/Introduction.html" title="Apple Archive - nib files"><span><strong><em>interface builder files</em></strong></span></a>).</p><p class="">The diagram below represents a simplified overview of the numerous transformations your Swift Code will go through.</p>


  




&nbsp;










































  

    
  
    

      

      
        <figure class="
              sqs-block-image-figure
              intrinsic
            "
        >
          
        
        

        
          
            
              
              
          
            
                
                
                
                
                
                
                
                <img data-stretch="false" data-image="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/3265de03-7f68-4d5e-9da6-54731183c855/SwiftBuilds.webp" data-image-dimensions="1627x800" data-image-focal-point="0.5,0.5" alt="" data-load="false" elementtiming="system-image-block" src="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/3265de03-7f68-4d5e-9da6-54731183c855/SwiftBuilds.webp?format=1000w" width="1627" height="800" sizes="(max-width: 640px) 100vw, (max-width: 767px) 83.33333333333334vw, 83.33333333333334vw" onload="this.classList.add(&quot;loaded&quot;)" srcset="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/3265de03-7f68-4d5e-9da6-54731183c855/SwiftBuilds.webp?format=100w 100w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/3265de03-7f68-4d5e-9da6-54731183c855/SwiftBuilds.webp?format=300w 300w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/3265de03-7f68-4d5e-9da6-54731183c855/SwiftBuilds.webp?format=500w 500w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/3265de03-7f68-4d5e-9da6-54731183c855/SwiftBuilds.webp?format=750w 750w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/3265de03-7f68-4d5e-9da6-54731183c855/SwiftBuilds.webp?format=1000w 1000w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/3265de03-7f68-4d5e-9da6-54731183c855/SwiftBuilds.webp?format=1500w 1500w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/3265de03-7f68-4d5e-9da6-54731183c855/SwiftBuilds.webp?format=2500w 2500w" loading="lazy" decoding="async" data-loader="sqs">

            
          
        
            
          
        

        
          
          <figcaption class="image-caption-wrapper">
            <p data-rte-preserve-empty="true"><em>A Swift Program - from the Human Domain to the CPU Domain</em></p>
          </figcaption>
        
      
        </figure>
      

    
  


  


&nbsp;
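<p class="">You can observe several of these intermediate representations yourself by asking the Swift compiler to stop at a given stage. Assuming a single <strong><em>main.swift</em></strong> file, the <code>swiftc</code> flags below each emit one of the artifacts described above:</p>
<pre><code class="language-bash">swiftc -emit-ir main.swift        # target-independent LLVM IR
swiftc -emit-assembly main.swift  # platform-specific assembly
swiftc -emit-object main.swift    # binary object file (main.o)
swiftc main.swift -o main         # linked Mach-O executable</code></pre>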
  
  <p class="">Once your program is bundled into a Mach-O executable file, it can be started by the Operating System.</p>]]></content:encoded><media:content type="image/jpeg" url="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/1759595448705-ZGEU3FGEAMMFZWJ5HYMM/unsplash-image-Q_gpG90dW7c.jpg?format=1500w" medium="image" isDefault="true" width="1500" height="844"><media:title type="plain">From a CLI task to a run (main) loop</media:title></media:content></item><item><title>Exploring Swift - Creating a simple CLI tool</title><category>Intro to Apple</category><dc:creator>Samwise Prudent</dc:creator><pubDate>Sat, 04 Oct 2025 16:26:19 +0000</pubDate><link>https://www.prudentleap.com/prudent-protocol/2025/10/exploring-swift-creating-a-simple-cli-tool</link><guid isPermaLink="false">6721f9295c6d593f58a1c57b:68d86448421a7f587b06ce19:68e10813c03be95a4ded8bbf</guid><description><![CDATA[Although it’s not the primary use of the language, Swift can be used to 
accomplish a wide range of tasks, from CLI tools to macOS and iOS 
applications, to web servers. In this section, we are going to explore how 
to build a very simple CLI in Swift, while also looking at Swift’s package 
manager and build tools.]]></description><content:encoded><![CDATA[<p class="">Swift is a powerful, modern and relatively easy language to pick up. It was publicly unveiled in 2014 at <a href="https://developer.apple.com/videos/swift" title="Swift Presentations - WWDC (reverse chronological order)"><span><strong><em>WWDC</em></strong></span></a>, and it interoperates closely with the C family of languages (for this reason, you can easily run C, Objective-C and, more recently, C++ code directly in your Swift projects). The language supports <a href=""><span><strong><em>Object Oriented</em></strong></span></a><strong> constructs</strong> (such as <strong>classes</strong> and, with them, <strong>inheritance</strong>), but it was designed around the paradigm known as <a href=""><span><strong><em>Protocol Oriented Programming</em></strong></span></a>.</p><p class="">Although it’s easy to pick up and become productive, <em>mastering Swift</em> is a process that takes time and effort. For this reason, it's usually a good idea to become familiar with the official documentation. You can take a <a href="https://docs.swift.org/swift-book/documentation/the-swift-programming-language/guidedtour/" title="Swift Documentation - Language Tour"><span><strong><em>Swift Language Tour</em></strong></span></a> or further consult the <a href="https://www.swift.org/documentation/tspl/" title="Swift Documentation - Language Reference"><span><strong><em>Language Reference</em></strong></span></a> and, eventually, read the <a href="https://www.swift.org/documentation/api-design-guidelines/" title="Swift Documentation - API Design Guidelines"><span><strong><em>API Design Guidelines</em></strong></span></a>. 
These are all bundled under Swift’s <a href="https://www.swift.org/documentation/" title="Swift Documentation - Home Page"><span><strong><em>Documentation</em></strong></span></a> section.</p><p class="">You can conceivably do very good work without consulting these resources periodically, but it's useful to know they exist. The Swift project team also maintains a <a href="https://www.swift.org/blog/" title="Swift Documentation - Blog"><span><strong><em>Blog</em></strong></span></a>, where you can stay up to date with newer developments, including some that haven't yet made it into WWDC talks.</p><p class="">If you would like to understand how Swift evolves as a language - and how the Language team decides which features to add and which not to add - you can check the <a href="https://github.com/swiftlang/swift-evolution/tree/main/proposals" title="Swift Evolution - Github Repository"><span><strong><em>Swift Evolution</em></strong></span></a> GitHub repository. You can also see what’s in store for upcoming versions of Swift by checking the <a href="https://www.swift.org/swift-evolution/" title="Swift Evolution - Upcoming Proposals"><span><strong><em>Swift Evolution Page</em></strong></span></a> directly. Both sources contain highly technical information, but you have the opportunity to see <em>why</em> Swift works the way it does. 
There are many gems hidden in the <em>answers</em> to the proposals, not just the proposals themselves, so they are all great resources if you want to understand Swift’s language design philosophy straight from the language team.</p><p class="">Out of the box, Swift supports <strong><em>C/Objective-C interoperability</em></strong> (as Apple maintains a large amount of Objective-C legacy code) and, since Swift 5.9, opt-in <a href="https://www.swift.org/documentation/cxx-interop/#importing-c-into-swift" title="Swift Documentation - C++ Interop"><span><strong><em>C++ interoperability</em></strong></span></a>.</p><p class=""> </p>


  




<hr />
  
  <p class="">Similar to many other programming languages, you can use Swift to create anything from simple <strong>CLI tools</strong> (like the one in this section), to locally executed <strong>desktop applications</strong>, all the way to complex <strong>web servers</strong>.</p><p class="">Throughout this post, we are going to create a very simple CLI tool which takes a list of numbers separated by the “<strong><em>,</em></strong>” (comma) symbol, and then returns those numbers in ascending order. The tool itself is not particularly useful, but it’s going to help us explore Swift a bit more closely, from the ground up.</p><h3>Problem Decomposition</h3><p class="">In order to capture the arguments of the CLI application, we can use Apple’s <strong><em>swift-argument-parser</em></strong> <strong><em>package</em></strong>.</p><p class="">The main steps required to obtain the output are going to be:</p><ul data-rte-list="default"><li><p class="">In a dedicated <strong><em>struct</em></strong> entity, we are going to save the argument provided to the CLI at call time, as a <strong><em>String</em></strong> variable named <strong><em>input</em></strong>. For brevity, we are not going to perform any input validation (skipping validation is a very bad practice in real-world applications: you should always validate user input, even for offline applications, not just for security reasons, but also to ensure that your application does not end up in an undefined state);</p></li><li><p class="">Using the <strong><em>split</em></strong> <strong><em>function</em></strong> associated with the <strong><em>String</em></strong> type, we are going to divide the <strong><em>input</em></strong> string into individual elements, based on the separator “<strong><em>,</em></strong>”;</p></li><li><p class="">Using the <strong><em>compactMap</em></strong> <strong><em>function</em></strong> associated with <strong><em>Arrays</em></strong>, we are going to convert every element obtained from the split function into an <strong><em>Int</em></strong>;</p></li><li><p class="">Finally, using the <strong><em>sorted</em></strong> function associated with <strong><em>Arrays</em></strong>, we can store a copy of the sorted array in a <strong><em>temporary variable</em></strong>, which can then be printed to the console.</p></li></ul>
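<p class="">The four transformation steps above can be sketched in isolation, before any CLI plumbing exists. The snippet below is a standalone illustration (the literal input string simply stands in for the future command-line argument):</p>

```swift
// Standalone sketch of the pipeline described above.
let input = "3,1,5,2"                 // stands in for the future CLI argument

let result = input
    .split(separator: ",")            // ["3", "1", "5", "2"] as Substrings
    .compactMap { Int($0) }           // [3, 1, 5, 2]; non-numeric entries are dropped
    .sorted()                         // [1, 2, 3, 5]

print("The array of sorted numbers is: \(result)")
// prints "The array of sorted numbers is: [1, 2, 3, 5]"
```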


  




&nbsp;
  
  <p class=""><span class="sqsrte-text-color--white"><em>Before you start solving a problem, it’s usually a good idea to break it down into smaller, more manageable pieces. This helps you (and, if you work in a team, your colleagues) establish a common understanding. It’s not always possible to design an application (especially a complex one) unless you have already built it once. Similarly, it’s not always possible to tackle complex problems unless you break them down into smaller pieces first. As you solve more problems, it becomes a lot easier to find the smaller chunks - and this step becomes less important if you work alone.</em></span></p>


  




&nbsp;
  
  <h3>Creating a new Package</h3><p class="">In Swift, code is organized in <a href="https://developer.apple.com/videos/play/wwdc2024/10171?time=34" title="WWDC24 - Demystify explicitly built modules"><span><strong><em>Modules</em></strong></span></a>. Unlike other languages (e.g. Java, C# or Go), you do not explicitly define <em>module</em> names at the beginning of a Swift file, nor do you explicitly define the <em>namespace</em>.</p><p class="">One or more modules, together with <em>a single manifest file</em>, make up a Swift <a href="https://developer.apple.com/documentation/xcode/swift-packages" title="Xcode Documentation - About Swift Packages"><span><strong><em>Package</em></strong></span></a>. Within the Package’s manifest file, you can define the <a href="https://www.swift.org/documentation/package-manager/" title="Swift Documentation - Package Manager"><span><strong><em>Package Dependencies</em></strong></span></a> (functionality that is defined in other, external packages) and one or more <a href="https://developer.apple.com/documentation/xcode/configuring-a-new-target-in-your-project" title="Swift Documentation - Target"><span><strong><em>Targets</em></strong></span></a> for your application to be compiled for. Finally, each target builds a <a href="https://developer.apple.com/documentation/packagedescription/product" title="Swift Documentation - Product"><span><strong><em>Product</em></strong></span></a>, which is either a <em>library</em> (to be used in other projects) or an <em>executable</em> (an actual file).</p><p class="">To standardize the structure of Swift projects, but also to support the creation of various tools (such as build and CI/CD tools), all Swift projects are maintained as a Swift <a href="https://developer.apple.com/documentation/bundleresources/placing-content-in-a-bundle" title="Swift Documentation - Bundles"><span><strong><em>Bundle</em></strong></span></a>.</p><p class="">To start a new <em>project</em>, you can use a Swift utility to <em>bootstrap</em> a package for you. The utility initializes your package by creating the main directories and files for you. For example, in a dedicated directory called <strong><em>cli-example</em></strong>, you can run the command below (in the terminal):</p>


  




<pre><code class="language-bash">$&gt; swift package init --name cli-example --type executable</code></pre>


  
  <p class="">This command prints the following output to the terminal’s standard output stream:</p>


  




&nbsp;
  
  <p class=""><em>Creating executable package: cli-example</em></p><p class=""><em>Creating Package.swift</em></p><p class=""><em>Creating Sources/</em></p><p class=""><em>Creating Sources/main.swift</em></p>


  




&nbsp;
  
  <p class="">The tool created a file for package management, named <strong><em>Package.swift</em></strong>, as well as a directory for source files, named <strong><em>Sources</em></strong>. Within the Sources directory, you can find a Swift file named <strong><em>main.swift</em></strong>. By convention, the <strong><em>main.swift</em></strong> file is the only file that can contain top-level instructions.</p>


  




&nbsp;
  
  <p class=""><span class="sqsrte-text-color--white"><em>Alternatively, you could designate a type as the top-level entry point for the program by using the </em></span><a href="https://docs.swift.org/swift-book/documentation/the-swift-programming-language/attributes/#main" title="Swift - Main Attribute"><span><span class="sqsrte-text-color--white"><strong><em>@main</em></strong></span></span></a><span><span class="sqsrte-text-color--white"><strong><em> </em></strong></span></span><span class="sqsrte-text-color--white"><em>attribute, in a file that is </em><strong><em>not</em></strong><em> called </em><strong><em>main.swift</em></strong><em>. The two options are mutually exclusive: you cannot declare a </em><strong><em>@main</em></strong><em> type in a file named </em><strong><em>main.swift</em></strong><em>.</em></span></p>


  




&nbsp;
  
  <p class="">A tree representation of the directory structure would be:</p>


  




<pre><code class="language-bash">.
└── Package.swift
└── Sources
    └── main.swift</code></pre>


  
  <h3>Adding Dependencies</h3><p class="">Using the <strong><em>Package.swift</em></strong> file, you can also add external dependencies to your project. For example, Apple already provides a package you can use to capture and parse command-line arguments, named <a href="https://github.com/apple/swift-argument-parser" title="Swift Argument Parser - Source Repository"><span><strong><em>swift-argument-parser</em></strong></span></a>. We can add this package as a dependency to cli-example, by modifying the Swift Package Manifest:</p>


  




<pre><code class="language-swift">// swift-tools-version: 6.0
// The swift-tools-version declares the minimum version of Swift required to build this package.

import PackageDescription

let package = Package(
    name: "cli-example",
    dependencies: [
        .package(url: "https://github.com/apple/swift-argument-parser", from: "1.0.2"),
    ],
    targets: [
        // Targets are the basic building blocks of a package, defining a module or a test suite.
        // Targets can depend on other targets in this package and products from dependencies.
        .executableTarget(
            name: "cli-example",
            dependencies: [
                .product(name: "ArgumentParser", package: "swift-argument-parser"),
            ]
        )
    ]
)</code></pre>


  
  <p class="">Throughout this example, we used Swift’s Package Manager, but you don’t necessarily have to. You can also use external package managers and their repositories, such as <strong>CocoaPods</strong> - which was, for many years, the industry standard for Apple development. </p><p class="">For reference, this is what a <strong>CocoaPods</strong> dependency manifest (known as a <strong><em>Podfile</em></strong>) would look like:</p>


  




<pre><code class="language-ruby"># Example Podfile
platform :ios, '8.0'
use_frameworks!

target 'MyApp' do
  pod 'AFNetworking', '~&gt; 2.6'
  pod 'ORStackView', '~&gt; 3.0'
  pod 'SwiftyJSON', '~&gt; 2.3'
end</code></pre>


  
  <p class="">Alternatively, you can simply use Xcode to manage dependencies, as shown in <a href="https://developer.apple.com/documentation/xcode/adding-package-dependencies-to-your-app" title="XCode Documentation - Adding External Packages"><span><strong><em>Apple’s Xcode documentation</em></strong></span></a>.</p><h3>Implementing the CLI Tool</h3><p class="">Before proceeding with the actual implementation, it would be useful to be aware of a few rules and conventions (enforced by Swift’s tooling, but not always well documented in literature):</p><ol data-rte-list="default"><li><p class="">In a Swift Package, source code files must end with the <strong><em>.swift</em></strong> extension.</p></li><li><p class="">Swift source code needs to be placed in a directory named <strong><em>Sources</em></strong> or <strong><em>Src</em></strong> (or variants of these directory names, such as <strong><em>Source</em></strong> or <strong><em>src</em></strong>), or in one of its subdirectories (e.g. Sources/MainApplication or Sources/Services).</p></li><li><p class="">Within the <strong><em>Swift Package Description Manifest</em></strong>, you can specify custom source directories for individual targets, using the path parameter of the <strong><em>executableTarget</em></strong> initializer.</p></li><li><p class="">In Swift, directories act as <em>helpers for you, the developer.</em> They don’t really hold much meaning (unlike other languages where directories are effectively namespaces or packages), unless you attribute meaning in the Swift Package Description manifest, as mentioned previously. On compilation, all Swift files are loaded in a flat structure. For this reason, you cannot have two files with the same name in a single Swift project (unless they are part of different modules).</p></li><li><p class="">The entry point of an application is, by convention, the <strong><em>main.swift</em></strong> file. 
There are two main exceptions to this rule:</p><ul data-rte-list="default"><li><p class="">The package contains only one source file, in which case the name doesn’t matter;</p></li><li><p class="">The package does not contain a <strong><em>main.swift</em></strong> file and, instead, it has a <strong>class</strong> or <strong>structure</strong> marked with the <strong><em>@main</em></strong> <strong><em>attribute</em></strong>. In this case, the entry point is the <strong><em>main</em></strong> function belonging to the marked <strong><em>type</em></strong> (functions associated with the type, rather than an instance of the type, are identified by the <strong><em>static</em></strong> keyword). This is the more common pattern if you develop regular applications in Xcode. The snippet below is an example of such an implementation.</p></li></ul></li></ol>


  




<pre><code class="language-swift">@main
struct TestStruct {
    var sample = ""
    //This is the Instance Function
    func run() {
        print("The sample is: \(sample)")
    }
    //This is the entry point. It is a static function with no input and no returns (() -&gt; Void, but Void can be omitted)
    static func main() {
        let s = TestStruct(sample: "A String provided within the static main function")
        s.run()
    }
}</code></pre>


  
  <p class="">With these conventions and rules in mind, we can now work on our simple Command Line Tool implementation. Although the exact structure of your project will likely vary, it’s a good idea to separate the main sections of your application into their areas of responsibility. For example, you would use an <strong><em>App</em></strong> directory to keep the application’s entry point, while the rest of the business logic would be maintained in separate directories (such as <strong><em>Services</em></strong>, <strong><em>Networking</em></strong>, <strong><em>Shaders</em></strong> etc.). Since the directory structure is more important to you than it is to the Swift tools, these conventions are mainly meant to help you organize your code.</p><p class="">To start, add two new directories inside the <strong><em>Sources</em></strong> directory - and name them <strong><em>App</em></strong> and <strong><em>Services</em></strong>, respectively (though you could use any other names). Within the <strong><em>Services</em></strong> directory, add a new <strong><em>Swift file</em></strong> with the name <strong><em>CLIService.swift</em></strong>. Then, move the original <strong><em>main.swift</em></strong> file from <strong><em>Sources</em></strong> to <strong><em>Sources/App</em></strong>. Your project structure should now resemble the tree below:</p>


  




<pre><code class="language-bash">.
└── Package.swift
└── Sources
    └── App
        └── main.swift
    └── Services
        └── CLIService.swift</code></pre>
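<p class="">This nested layout builds because the package declares a single target, whose sources are discovered under <strong><em>Sources</em></strong>. If you ever want to make (or restrict) a target’s source location explicit, you could use the <strong><em>path</em></strong> parameter mentioned in the conventions above. The fragment below is a hypothetical illustration, not a change you need to make for this project:</p>

```swift
// Hypothetical manifest fragment: making a target's source location explicit.
// The path value below is illustrative; by default, SwiftPM discovers the
// sources of this single-target package under Sources on its own.
.executableTarget(
    name: "cli-example",
    dependencies: [
        .product(name: "ArgumentParser", package: "swift-argument-parser"),
    ],
    path: "Sources"  // compile everything under Sources (App/ and Services/) into this target
)
```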


  
  <p class="">In the <strong><em>App/main.swift</em></strong> file, we would typically define instructions that are relevant to the application’s life cycle. For example, this is where you would execute the function that captures input from the user. For now, a set of print instructions would suffice. Replace the content of the <strong><em>main.swift</em></strong> file with the following print instructions:</p>


  




<pre><code class="language-swift">print("Welcome to our simple CLI!")
print("We have finished the execution, I will close now")</code></pre>


  
  <p class="">Within the <strong><em>Services/CLIService.swift</em></strong> file, use the <strong><em>import</em></strong> keyword to add the <strong><em>ArgumentParser</em></strong> module to this file. This allows the compiler to recognize and use the functions, protocols, and types provided by that module within the file where it's imported. Then, define a <strong><em>struct</em></strong> to act as a wrapper for our CLI functionality. It should be called <strong><em>CLIService</em></strong> and it should conform to the <a href="https://github.com/apple/swift-argument-parser/blob/main/Sources/ArgumentParser/Parsable%20Types/ParsableCommand.swift" title="ArgumentParser SCM - ParsableCommand"><span><strong><em>ParsableCommand</em></strong></span></a> <strong><em>protocol</em></strong> (we are going to look closely at protocols in a later post; for now, know that this protocol is part of the ArgumentParser module).</p>


  




<pre><code class="language-swift">import ArgumentParser

struct CLIService: ParsableCommand {
}</code></pre>


  
  <p class="">First, we need to <em>store</em> the <em>user input</em>. Structs can contain <em>properties</em> (the equivalent of <em>fields</em> in other languages) and <em>functions</em>. Therefore, you can declare a <em>public variable</em> <strong><em>input</em></strong>, of <em>type</em> <strong><em>String</em></strong>, as the first property of the <strong><em>CLIService</em></strong> struct. To mark the variable as the storage for the user’s input, we can use the <strong><em>@Option</em></strong> <em>property wrapper</em> (more on this later).</p>


  




<pre><code class="language-swift">import ArgumentParser

struct CLIService: ParsableCommand {

  @Option(help: "Specify the input")
  public var input: String

}</code></pre>


  
  <p class="">The <strong><em>ParsableCommand</em></strong> <strong><em>protocol</em></strong> includes a <strong><em>main</em></strong> function, which can be executed by your application’s <em>entry point</em>. It captures the arguments passed in the command line and, using them, it executes the <em>instance function</em> named <strong><em>run</em></strong>. We can, therefore, write our sorting logic <em>within</em> the <strong><em>run</em></strong> function - or in <em>a separate function</em>, which is <em>executed (called) by</em> the <strong><em>run</em></strong> function.</p><p class="">There are benefits and drawbacks to both approaches. If you keep the sorting functionality inside the run function, the code is <em>easier to read</em>, but <em>slightly harder to extend</em>. Conversely, if you use a <em>dedicated function</em> for sorting, you could add more functionality (such as removing an element), at the potential cost of resource consumption (function calls require resources) and a <em>slight decrease in readability</em> (how easily a human can understand the code and its purpose), especially if the functionality extends over a larger number of files.</p><p class="">Here is an example where the sorting logic is performed directly in the <strong><em>run</em></strong> function:</p>


  




<pre><code class="language-swift">import ArgumentParser

struct CLIService: ParsableCommand {

  @Option(help: "Specify the input")
  public var input: String

  //This function is called by CLIExample.main()
  public func run() throws {
    let result = input
      .split(separator:",")
      .compactMap{
          Int($0)
      }
      .sorted()
    print("The array of sorted numbers is: \(result)")
  }
}</code></pre>


  
  <p class="">Alternatively, here is the same logic, but with the sorting performed by another, <em>dedicated</em> function, named <strong><em>sortInput</em></strong>:</p>


  




<pre><code class="language-swift">import ArgumentParser

struct CLIService: ParsableCommand {

  @Option(help: "Specify the input")
  public var input: String
  //This function is called by run()
  private func sortInput() -&gt; [Int]{
    input
      .split(separator:",")
      .compactMap{
          Int($0)
      }
      .sorted()
  }
  //This function is called by CLIExample.main()
  public func run() throws {
    print("The array of sorted numbers is: \(sortInput())")
  }
}</code></pre>
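<p class="">Whichever variant you choose, the sorting pipeline itself can be exercised without ArgumentParser at all. The sketch below pulls the logic into a free function purely for illustration (<strong><em>sortNumbers</em></strong> is a hypothetical name, not part of the project above):</p>

```swift
// Hypothetical sketch: extracting the pipeline into a free function
// makes it checkable in isolation, with no CLI plumbing involved.
func sortNumbers(in input: String) -> [Int] {
    input
        .split(separator: ",")     // split omits empty subsequences by default
        .compactMap { Int($0) }    // non-numeric entries are discarded
        .sorted()
}

assert(sortNumbers(in: "3,1,2") == [1, 2, 3])
assert(sortNumbers(in: "a,5,,1") == [1, 5])   // malformed entries are silently dropped
```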


  
  <p class="">You may keep either of the two examples; both work just as well, because the <strong><em>struct</em></strong> is named <strong><em>CLIService</em></strong> and the function is named <strong><em>run</em></strong> in both. Since we separated the project into two files, each with its own purpose, you can easily replace or modify each of the two pieces independently - and revert your decisions easily. You can also add more functionality to either of the two without affecting the other. Just as importantly, you can test the two components individually.</p><p class="">Update the <strong><em>main.swift</em></strong> file and, between the two print instructions, call the <strong><em>main()</em></strong> function of the <strong><em>CLIService</em></strong> struct.</p>


  




<pre><code class="language-swift">print("Welcome to our simple CLI!")
CLIService.main()
print("We have finished the execution, I will close now")</code></pre>


  
  <p class="">When the <strong><em>cli-example</em></strong> <em>project</em> is executed, the <strong><em>main.swift</em></strong> file acts as the <em>entry point</em> - and executes the <em>static</em> function <strong><em>main()</em></strong> associated with the <strong><em>CLIService</em></strong> <strong><em>struct</em></strong>.</p><p class="">You can test the example using the <em>swift run</em> command, as shown below.</p>


  




<pre><code class="language-bash">$&gt; swift run cli-example --input 3,1,5,7,6,4,90,78,45,32,15</code></pre>


  
  <p class="">Running that command would result in the following output (together with some other messages related to building and debugging, recorded by the run tool itself):</p>


  




<pre><code class="language-bash">Fetching https://github.com/apple/swift-argument-parser from cache
Fetched https://github.com/apple/swift-argument-parser from cache (1.29s)
Computing version for https://github.com/apple/swift-argument-parser
Computed https://github.com/apple/swift-argument-parser at 1.5.0 (1.82s)
Computed https://github.com/apple/swift-argument-parser at 1.5.0 (0.00s)
Creating working copy for https://github.com/apple/swift-argument-parser
Working copy of https://github.com/apple/swift-argument-parser resolved at 1.5.0
Building for debugging...
56/56 Applying cli-example
Build of product 'cli-example' complete! (6.24s)
Welcome to our simple CLI!
The array of sorted numbers is: [1, 3, 4, 5, 6, 7, 15, 32, 45, 78, 90]
We have finished the execution, I will close now</code></pre>


  
  <p class="">If you run the same command again, you will still see build output - indicating that the project is built again. However, the build takes a lot less time.</p>


  




<pre><code class="language-bash">$&gt; swift run cli-example --input 3,1,5,7,6,4,90,78,45,32,15
[1/1] Planning build
Building for debugging...
[1/1] Write swift-version--58304C5D6DBC2206.txt
Build of product 'cli-example' complete! (0.20s)
Welcome to our simple CLI!
The array of sorted numbers is: [1, 3, 4, 5, 6, 7, 15, 32, 45, 78, 90]
We have finished the execution, I will close now</code></pre>


  
  <p class="">This is because, when we executed the initial swift run command, Swift’s toolset also built the project. When this occurs, the toolset creates a hidden directory (with a “<strong>.</strong>” in front of its name). In order to see hidden directories in the terminal, you would need to use the <code>-a</code> flag of <code>ls</code>. For example:</p>


  




<pre><code class="language-bash">$&gt; ls -la</code></pre>


  
  <p class="">A truncated representation of the directory tree would look something like this:</p>


  




<pre><code class="language-bash">.
└── .build
    └── artifacts
    └── checkouts
        └── swift-argument-parser
            └── ....
    └── repositories
        └── swift-argument-parser-54a11a8d
            └── ...
    └── arm64-apple-macosx
        └── debug
            └── ...
    └── debug  # This is where the built cli-example binary resides.
└── Package.resolved
└── Package.swift
└── Sources
    └── App
        └── main.swift
    └── Services
        └── CLIService.swift</code></pre>


  
  <p class="">From the project’s directory, you can also run the built binary. This time, it runs directly, as it’s the final, built executable.</p>


  




<pre><code class="language-bash">$&gt; ./.build/debug/cli-example --input 3,1,5,7,6,4,90,78,45,32,15
Welcome to our simple CLI!
The array of sorted numbers is: [1, 3, 4, 5, 6, 7, 15, 32, 45, 78, 90]
We have finished the execution, I will close now</code></pre>


  
  <p class="">You can also see the <em>file type</em>, to confirm that it is, indeed, a <strong><em>Mach-O executable</em></strong>:</p>


  




<pre><code class="language-bash">$&gt; file ./.build/debug/cli-example
./.build/debug/cli-example: Mach-O 64-bit executable arm64</code></pre>


  
  <p class="">Notice that the build is, at this point, a <strong>debug</strong> build. As shown in the <a href="https://www.swift.org/documentation/server/guides/building.html" title="Swift for Server - Build Guide"><span><strong><em>Swift Build System Guide</em></strong></span></a>, though, you can also build a <strong>release</strong> version, directly:</p>


  




<pre><code class="language-bash">$&gt; swift build -c release            
[1/1] Planning build
Building for production...
[10/10] Linking cli-example
Build complete! (11.58s)</code></pre>


  
  <p class="">Once you run this command, your released binary can be found in <code>.build/release</code>. You can also run the release binary.</p>


  




<pre><code class="language-bash">$&gt; .build/release/cli-example --input 3,1,5,7,6,4,90,9,8,2,3,3,5,6
Welcome to our simple CLI!
The array of sorted numbers is: [1, 2, 3, 3, 3, 4, 5, 5, 6, 6, 7, 8, 9, 90]
We have finished the execution, I will close now</code></pre>


  
  <p class="">If you check the contents of the <strong><em>release</em></strong> directory, though, you will find many other items. All of these files were generated when the Swift toolset (including the compiler) analyzed the source code files and built the entire project.</p>


  




<pre><code class="language-bash">.
└── ArgumentParser-tool.build
    └── ...
└── ArgumentParserToolInfo-tool.build
    └── ...
└── cli-example.dSYM --&gt; debug symbols
    └── ...
└── cli-example.product
    └── Objects.LinkFileList --&gt; text reference to other binary object files
└── cli_example.build
    └── CLIService.swift.o --&gt; binary object files
    └── cli_example.d --&gt; text reference to make dependency files
    └── main.swift.o
    └── sources 
    └── output-file-map.json
    └── master.swiftdeps
└── swift-version--58304C5D6DBC2206.txt
└── ...
└── ...
└── ArgumentParserToolInfo.build
    └── ToolInfo.swift.o
    └── ArgumentParserToolInfo.d
    └── ArgumentParserToolInfo-Swift.h
    └── sources
    └── output-file-map.json
    └── module.modulemap
    └── master.swiftdeps
└── ArgumentParser.build
    └── ...
└── ...
└── Modules 
    ...
    └── cli_example.abi.json
    └── ArgumentParserToolInfo.abi.json</code></pre>]]></content:encoded><media:content type="image/jpeg" url="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/1759594398158-NML7QO8JIXNGNAJ2ECIB/unsplash-image-Q_gpG90dW7c.jpg?format=1500w" medium="image" isDefault="true" width="1500" height="844"><media:title type="plain">Exploring Swift - Creating a simple CLI tool</media:title></media:content></item><item><title>The Journey of a Touch - Part V</title><category>Intro to Apple</category><dc:creator>Samwise Prudent</dc:creator><pubDate>Tue, 30 Sep 2025 17:15:37 +0000</pubDate><link>https://www.prudentleap.com/prudent-protocol/2025/9/the-journey-of-a-touch-part-ii</link><guid isPermaLink="false">6721f9295c6d593f58a1c57b:68d86448421a7f587b06ce19:68dbeebbe94a70156a65cabc</guid><description><![CDATA[Having explored the way an SPI-based digitizer would be integrated into 
iOS, we can now explore the mechanisms through which the operating system 
would transfer the touch event to the relevant application running on an 
Apple Device.]]></description><content:encoded><![CDATA[<p class="">Having explored the way an SPI-based digitizer would be integrated into iOS, we can now explore the mechanisms through which the operating system would transfer the touch event to the relevant application running on an Apple Device.</p>


  




<hr />
  
  <h3>Surfacing the touch event from Kernel Space to User Space</h3><p class="">For security reasons, direct access to hardware, ranging from memory to peripherals, is only allowed in the <em>kernel space</em>. To access hardware, a process has to either be <em>running in the kernel space</em> (as is the case with drivers running as <em>kernel extensions</em>) or <em>have a kernel delegate</em>, which does have access to hardware (such as <em>driver extensions</em> in DriverKit).</p><p class="">Since IOKit drivers and their WorkLoops all execute in the kernel, in various contexts and with various restrictions, the events they work with are still within the <strong>kernel’s boundary</strong>. For those events to be visible to end-user-facing applications, they need to be surfaced to the <strong>user space</strong>. In other words, they need to <em>cross the kernel boundary</em>. Surfacing these events is a complex, carefully coordinated process.</p>


  















































  

    
  
    

      

      
        <figure class="
              sqs-block-image-figure
              intrinsic
            "
        >
          
        
        

        
          
            
              
              
          
            
                
                
                
                
                
                
                
                <img data-stretch="false" data-image="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/5b5e93f9-457d-4520-bf0d-22e22e19f728/JourneyOfATouch_IOKit_Backboadd.webp" data-image-dimensions="3840x1469" data-image-focal-point="0.5,0.5" alt="" data-load="false" elementtiming="system-image-block" src="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/5b5e93f9-457d-4520-bf0d-22e22e19f728/JourneyOfATouch_IOKit_Backboadd.webp?format=1000w" width="3840" height="1469" sizes="(max-width: 640px) 100vw, (max-width: 767px) 100vw, 100vw" onload="this.classList.add(&quot;loaded&quot;)" srcset="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/5b5e93f9-457d-4520-bf0d-22e22e19f728/JourneyOfATouch_IOKit_Backboadd.webp?format=100w 100w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/5b5e93f9-457d-4520-bf0d-22e22e19f728/JourneyOfATouch_IOKit_Backboadd.webp?format=300w 300w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/5b5e93f9-457d-4520-bf0d-22e22e19f728/JourneyOfATouch_IOKit_Backboadd.webp?format=500w 500w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/5b5e93f9-457d-4520-bf0d-22e22e19f728/JourneyOfATouch_IOKit_Backboadd.webp?format=750w 750w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/5b5e93f9-457d-4520-bf0d-22e22e19f728/JourneyOfATouch_IOKit_Backboadd.webp?format=1000w 1000w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/5b5e93f9-457d-4520-bf0d-22e22e19f728/JourneyOfATouch_IOKit_Backboadd.webp?format=1500w 1500w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/5b5e93f9-457d-4520-bf0d-22e22e19f728/JourneyOfATouch_IOKit_Backboadd.webp?format=2500w 2500w" loading="lazy" decoding="async" data-loader="sqs">

            
          
        
            
          
        

        
          
          <figcaption class="image-caption-wrapper">
            <p data-rte-preserve-empty="true"><em>Routing events from IOKit to Backboardd</em></p>
          </figcaption>
        
      
        </figure>
      

    
  


  



  
  <p class="">After the IOKit driver consumes the processed touch information from the Ping-Pong buffers, using the structures described in the post <a href="https://www.prudentleap.com/byte-the-apple/2025/9/starting-with-the-basics-part-iii" target="_blank"><span><strong><em>Exploring Apple’s drivers ecosystem</em></strong></span></a>, it persists the message in a dedicated <em>shared memory ring buffer</em>, implemented as an <a href="https://github.com/apple-oss-distributions/xnu/blob/main/iokit/Kernel/IOSharedDataQueue.cpp" title="XNU IOSharedDataQueue class"><span><strong><em>IOSharedDataQueue</em></strong></span></a>. When data is added to this buffer, registered clients (particularly the <strong><em>backboardd</em></strong> daemon) receive a <strong>Mach notification</strong>, which in turn wakes the daemon’s <strong><em>event dispatcher thread</em></strong>. <strong>Backboardd</strong> then <em>dequeues</em> events from the shared memory buffer and further processes them. </p><p class="">The Mach notification model, with gated access to shared memory, ensures the lowest possible latency for high-throughput scenarios, as is the case with device-generated events. It works well because it’s carefully synchronized by Apple, using low-level synchronization mechanisms, such as <a href="https://developer.apple.com/documentation/kernel/iocommandgate" title="Apple IOCommandGate" target=""><span><strong><em>IOCommandGates</em></strong></span></a>. And yet, as seen in <a href="https://nvd.nist.gov/vuln/detail/CVE-2017-7162" title="Report concerning a vulnerability caused by a race condition in IOKit"><span><strong><em>CVE-2017-7162</em></strong></span></a>, even with Apple’s careful design, mistakes can happen.</p>
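<p class="">The shared-memory-plus-notification pattern can be reduced to a few moving parts. The sketch below is a toy single-producer/single-consumer ring buffer guarded by a condition variable (standing in for the Mach notification); it illustrates the model only, and is not Apple’s <em>IOSharedDataQueue</em> — all names here are hypothetical.</p>

```swift
import Foundation

// Toy single-producer/single-consumer ring buffer, loosely mirroring the
// IOSharedDataQueue model: the producer enqueues fixed-size events and
// "notifies" the consumer, which drains everything available on wake-up.
// Illustrative sketch only, not Apple's actual implementation.
final class EventRingBuffer {
    private var storage: [Int?]
    private var head = 0   // next slot to read
    private var tail = 0   // next slot to write
    private let lock = NSCondition()

    init(capacity: Int) {
        storage = Array(repeating: nil, count: capacity)
    }

    /// Producer side: returns false when the buffer is full (event dropped).
    func enqueue(_ event: Int) -> Bool {
        lock.lock(); defer { lock.unlock() }
        let next = (tail + 1) % storage.count
        guard next != head else { return false }   // full: one slot kept empty
        storage[tail] = event
        tail = next
        lock.signal()           // the "Mach notification": wake the consumer
        return true
    }

    /// Consumer side: blocks until at least one event exists, then drains all.
    func drain() -> [Int] {
        lock.lock(); defer { lock.unlock() }
        while head == tail { lock.wait() }
        var events: [Int] = []
        while head != tail {
            events.append(storage[head]!)
            head = (head + 1) % storage.count
        }
        return events
    }
}
```

<p class="">Note how <em>drain()</em> empties everything available on a single wake-up, which is what keeps the notification traffic low under bursts of events.</p>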


  




&nbsp;
  
  <p class=""><span class="sqsrte-text-color--white"><em>In higher level programming, communicating through shared memory is often discouraged. Even though it’s less common, it is still useful to understand this model, especially when optimizing for performance.</em></span></p>


  




&nbsp;
  
  <p class="">Next, <strong>backboardd</strong> forwards the event to multiple subsystems. First, it assumes <strong>System Gestures</strong> (such as minimizing an application, lowering the status bar etc.) are possible, so it sends the event to the <strong>SpringBoard</strong> process, which manages the Home Screen and System UI (also via <em>Mach Ports</em>). If it needs to, <strong>SpringBoard</strong> assumes control and handles the system events.</p><p class="">Second, since it keeps a record of all <em>application frames</em>, together with the <em>state of these applications</em> (whether they are running, whether they are in the foreground or background etc.), <strong>backboardd</strong> also locates the <strong>application</strong> it should send the event to, by determining whether the touch coordinates fall within the frame of an <em>active</em> application in the <em>foreground</em>. Once the <strong>hit tests</strong> find the frame that contains the touch point, <strong>backboardd</strong> identifies the <em>process that owns the frame</em> and forwards the touch event information to its associated <em>listening Mach Port</em>.</p>
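<p class="">Conceptually, the frame lookup <strong>backboardd</strong> performs is a point-in-rectangle search over the foreground windows, with z-order breaking ties. Here is a minimal sketch of that idea; the types and names are made up for illustration and do not correspond to Apple’s internals.</p>

```swift
// Minimal hit-testing sketch: given the frames of foreground applications,
// find the topmost frame containing a touch point, mirroring (conceptually)
// how backboardd picks the destination process. Illustrative types only.
struct Frame {
    let x, y, width, height: Double
    func contains(_ px: Double, _ py: Double) -> Bool {
        px >= x && px < x + width && py >= y && py < y + height
    }
}

struct ForegroundApp {
    let name: String
    let frame: Frame
    let zOrder: Int   // higher means closer to the user
}

// Returns the hit application, or nil if the touch landed on no frame.
func hitTest(x: Double, y: Double, apps: [ForegroundApp]) -> ForegroundApp? {
    apps.filter { $0.frame.contains(x, y) }
        .max { $0.zOrder < $1.zOrder }
}
```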


  















































  

    
  
    

      

      
        <figure class="
              sqs-block-image-figure
              intrinsic
            "
        >
          
        
        

        
          
            
              
              
          
            
                
                
                
                
                
                
                
                <img data-stretch="false" data-image="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/10f36323-d0c0-4014-9eb5-890c8aac3897/JourneyOfATouch_BBoard_SpringBoard_App.webp" data-image-dimensions="4439x2785" data-image-focal-point="0.5,0.5" alt="" data-load="false" elementtiming="system-image-block" src="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/10f36323-d0c0-4014-9eb5-890c8aac3897/JourneyOfATouch_BBoard_SpringBoard_App.webp?format=1000w" width="4439" height="2785" sizes="(max-width: 640px) 100vw, (max-width: 767px) 100vw, 100vw" onload="this.classList.add(&quot;loaded&quot;)" srcset="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/10f36323-d0c0-4014-9eb5-890c8aac3897/JourneyOfATouch_BBoard_SpringBoard_App.webp?format=100w 100w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/10f36323-d0c0-4014-9eb5-890c8aac3897/JourneyOfATouch_BBoard_SpringBoard_App.webp?format=300w 300w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/10f36323-d0c0-4014-9eb5-890c8aac3897/JourneyOfATouch_BBoard_SpringBoard_App.webp?format=500w 500w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/10f36323-d0c0-4014-9eb5-890c8aac3897/JourneyOfATouch_BBoard_SpringBoard_App.webp?format=750w 750w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/10f36323-d0c0-4014-9eb5-890c8aac3897/JourneyOfATouch_BBoard_SpringBoard_App.webp?format=1000w 1000w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/10f36323-d0c0-4014-9eb5-890c8aac3897/JourneyOfATouch_BBoard_SpringBoard_App.webp?format=1500w 1500w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/10f36323-d0c0-4014-9eb5-890c8aac3897/JourneyOfATouch_BBoard_SpringBoard_App.webp?format=2500w 2500w" loading="lazy" decoding="async" data-loader="sqs">

            
          
        
            
          
        

        
          
          <figcaption class="image-caption-wrapper">
            <p data-rte-preserve-empty="true"><em>Routing Events from Backboardd to applications</em></p>
          </figcaption>
        
      
        </figure>
      

    
  


  



  
  <h3>Handling the Touch Events</h3><p class="">When any application starts up, it is encapsulated within a process running on the operating system. This process spawns an <em>initial thread</em> which, on Apple systems, is named <strong>Thread 1</strong>, and it’s known as the <em>Main Thread</em>. It executes the instructions found in the <em>executable’s top-level code</em> (the <em>entry point of the application</em>, or the <em>main</em> function, in most languages) and, since a process ends when its main thread completes, the main thread usually runs in a loop. More accurately, it starts a <a href="https://developer.apple.com/documentation/corefoundation/cfrunloop" title="Apple Documentation - Core Foundation RunLoop"><span><strong><em>CFRunLoop</em></strong></span></a>, executes its setup instructions, then blocks (sleeps), waiting for various events. Unlike the example in “<a href=""><span><strong><em>From a CLI task to a run (main) loop</em></strong></span></a>”, though, SwiftUI (or most other UI frameworks designed for efficiency) does not trigger timer events (or any events) unless it needs to. Instead, when an application starts up, it executes an initial setup process, to load the data it requires to display the first scene (<em>a lot more on this in the SwiftUI sections</em>) and then, <em>if the application is well written</em>, it blocks quickly, waiting for new events.</p>
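<p class="">The “set up, then block” shape of the main thread can be captured in a few lines. The sketch below is a drastically simplified stand-in for CFRunLoop — just a queue plus a condition variable — showing how the thread sleeps at zero CPU cost until an event is posted. All names are hypothetical.</p>

```swift
import Foundation

// A minimal "run loop" sketch: the thread performs its setup, then blocks
// waiting for events instead of spinning. This mirrors the CFRunLoop idea
// (sleep until a source fires); it is not CFRunLoop itself.
final class MiniRunLoop {
    private var pending: [String] = []
    private var running = true
    private let condition = NSCondition()
    private(set) var handled: [String] = []

    /// Called from any thread to deliver an event.
    func post(_ event: String) {
        condition.lock()
        pending.append(event)
        condition.signal()          // wake the sleeping loop
        condition.unlock()
    }

    /// Blocks between events; returns once a "quit" event is handled.
    func run() {
        while running {
            condition.lock()
            while pending.isEmpty { condition.wait() }   // sleep, zero CPU
            let event = pending.removeFirst()
            condition.unlock()
            if event == "quit" { running = false } else { handled.append(event) }
        }
    }
}
```

<p class="">A real run loop also manages timers, port sources and observers per mode, but the blocking structure is the same.</p>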


  















































  

    
  
    

      

      
        <figure class="
              sqs-block-image-figure
              intrinsic
            "
        >
          
        
        

        
          
            
              
              
          
            
                
                
                
                
                
                
                
                <img data-stretch="false" data-image="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/d9856583-d2ce-47e1-bcc9-70ef37a537c4/JourneyOfATouch_RunLoop.webp" data-image-dimensions="3840x1281" data-image-focal-point="0.5,0.5" alt="" data-load="false" elementtiming="system-image-block" src="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/d9856583-d2ce-47e1-bcc9-70ef37a537c4/JourneyOfATouch_RunLoop.webp?format=1000w" width="3840" height="1281" sizes="(max-width: 640px) 100vw, (max-width: 767px) 100vw, 100vw" onload="this.classList.add(&quot;loaded&quot;)" srcset="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/d9856583-d2ce-47e1-bcc9-70ef37a537c4/JourneyOfATouch_RunLoop.webp?format=100w 100w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/d9856583-d2ce-47e1-bcc9-70ef37a537c4/JourneyOfATouch_RunLoop.webp?format=300w 300w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/d9856583-d2ce-47e1-bcc9-70ef37a537c4/JourneyOfATouch_RunLoop.webp?format=500w 500w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/d9856583-d2ce-47e1-bcc9-70ef37a537c4/JourneyOfATouch_RunLoop.webp?format=750w 750w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/d9856583-d2ce-47e1-bcc9-70ef37a537c4/JourneyOfATouch_RunLoop.webp?format=1000w 1000w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/d9856583-d2ce-47e1-bcc9-70ef37a537c4/JourneyOfATouch_RunLoop.webp?format=1500w 1500w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/d9856583-d2ce-47e1-bcc9-70ef37a537c4/JourneyOfATouch_RunLoop.webp?format=2500w 2500w" loading="lazy" decoding="async" data-loader="sqs">

            
          
        
            
          
        

        
          
          <figcaption class="image-caption-wrapper">
            <p data-rte-preserve-empty="true"><em>Handling touch events in an Application</em></p>
          </figcaption>
        
      
        </figure>
      

    
  


  



  
  <p class="">As shown in the previous section, when <strong>backboardd</strong> identifies the application that should receive the event, it sends the touch information, serialized into a <em>Mach message</em>, to its <em>dedicated Mach port</em>. This, in turn, marks the application thread listening on the port as <strong>runnable</strong> and, in a mechanism similar to the one explored previously, the application’s event handling thread <em>drains the port</em>. Each event is dequeued in sequence and then processed by the <strong>UI Framework</strong>. </p><p class="">First, the framework performs hit tests to determine which <a href="https://developer.apple.com/documentation/uikit/uiresponder" title="UIKit - UIResponder"><span><strong><em>UI Responder</em></strong></span></a> should receive the touch event (similar to how <em>backboardd</em> identified the application to route the event towards). Then, the touch information (packaged as a <strong>UITouch Event</strong> in <em>UIKit</em> or a <strong>Gesture</strong> in <em>SwiftUI</em>) triggers the execution of a few functions. The exact implementation is not particularly relevant at this point, but the general idea can serve as inspiration for your own implementations.</p><p class="">Of the numerous events a control can react to, there are two types of <em>touch events</em> that are particularly relevant for a <strong>button</strong> control. First, the <a href="https://developer.apple.com/documentation/uikit/uicontrol/event/touchdown" title="Apple Documentation - touchDown Event"><span><strong><em>touchDown</em></strong></span></a> event, which indicates that the control has been touched, <em>triggers the execution of an activation animation</em> (usually, this highlights the button). 
Additionally, the <a href="https://developer.apple.com/documentation/uikit/uicontrol/event/touchupinside" title="Apple Documentation - touchUpInside event"><span><strong><em>touchUpInside</em></strong></span></a> or <a href="https://developer.apple.com/documentation/uikit/uicontrol/event/touchupoutside" title="Apple Documentation - touchUpOutside event"><span><strong><em>touchUpOutside</em></strong></span></a> events, which indicate that the button has been released, <em>trigger the execution of a deactivation animation</em>. A <em>touchUpInside</em> event (a release within the control’s bounds) additionally signals the framework to execute the instructions found in the closure of the <strong>SwiftUI Button</strong> control view.</p>
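<p class="">The division of labor between these two events can be modeled as a tiny state machine. The following is a sketch of the contract described above, not UIKit’s or SwiftUI’s actual implementation; the type names are invented for illustration.</p>

```swift
// Sketch of the two-event button contract: touchDown starts the highlight,
// touchUpInside plays the release animation and runs the action closure,
// touchUpOutside releases without firing the action. Simplified model only.
enum TouchEvent { case touchDown, touchUpInside, touchUpOutside }

final class SketchButton {
    private(set) var isHighlighted = false
    private let action: () -> Void

    init(action: @escaping () -> Void) { self.action = action }

    func handle(_ event: TouchEvent) {
        switch event {
        case .touchDown:
            isHighlighted = true          // activation animation starts here
        case .touchUpInside:
            isHighlighted = false         // deactivation animation
            action()                      // only now does the closure run
        case .touchUpOutside:
            isHighlighted = false         // released outside: no action
        }
    }
}
```

<p class="">Note that the action closure fires only on a release inside the control; dragging off the button and releasing deactivates the highlight without running the action.</p>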


  




&nbsp;
  
  <p class=""><span class="sqsrte-text-color--white"><em>You can easily check this separation (and how SwiftUI handles these events) by touching the “Please press the button” button in the example application at the start of this post and holding it down. You should notice the button light up (activate), while the message underneath it still reads “The button has been pressed 0 times”. This indicates that the Button’s action closure has not executed yet. Once you release the button, the control’s deactivation animation plays and the text updates to reflect the change.</em></span></p>


  




&nbsp;
  
  <p class="">In our example, since the end-user just touched and then released the button, the first event sent by <strong>backboardd</strong> is translated to a <strong>touchDown</strong> event. Since the touch occurred within the frame of a <em>button</em>, the UI framework runs the code associated with the control’s activation animation. Typically, this animation extends over several frames, depending on the way the animation is configured. To prepare the first frame in the animation, the <strong>Application</strong> (more specifically the code that handles the UI, such as SwiftUI or UIKit) updates the characteristics of the <em>Button</em> control. For example, it changes the color to a <em>lighter shade</em>, to act as a <em>highlight</em>, while also <em>scaling</em> the shape down and perhaps adding some changes to the button’s <em>shadows</em> and <em>outlines</em>. After all changes required for the first frame of the animation are processed, they result in an update to the underlying <a href="https://developer.apple.com/documentation/quartzcore/calayer" title="Apple Documentation - CALayer"><span><strong><em>CALayer</em></strong></span></a> construct (within the <a href="https://developer.apple.com/documentation/quartzcore" title="Apple Documentation - CoreAnimation Framework"><span><strong><em>CoreAnimation</em></strong></span></a> framework). Whenever a view needs to change its appearance, the underlying CoreAnimation CALayer structure triggers a <a href="https://developer.apple.com/documentation/quartzcore/calayer/setneedslayout()" title="Core Animation - setNeedsLayout function"><span><strong><em>setNeedsLayout</em></strong></span></a> call. The function invalidates the view (more on this in the <a href=""><span><strong><em>SwiftUI sections</em></strong></span></a>), which results in an update in the Application’s <strong>CALayer Tree</strong>. The diagram below outlines this process.</p>
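<p class="">Invalidation is cheap precisely because <em>setNeedsLayout</em> only marks the layer dirty; the expensive tree update is deferred and coalesced into a single pass per frame. Here is a minimal sketch of that coalescing, using hypothetical types rather than CoreAnimation itself.</p>

```swift
// Sketch of invalidation coalescing: several property changes within one
// frame set a single "needs layout" flag, and the layer tree is recomputed
// once per frame, not once per change. Hypothetical type, not CALayer.
final class SketchLayer {
    private(set) var needsLayout = false
    private(set) var layoutPasses = 0

    /// Cheap: just mark the layer dirty.
    func setNeedsLayout() { needsLayout = true }

    /// Called once per frame by the "render loop".
    func layoutIfNeeded() {
        guard needsLayout else { return }
        layoutPasses += 1        // the expensive tree update happens here
        needsLayout = false
    }
}
```

<p class="">This is why changing the highlight color, scale and shadow of the button in the same frame still costs only one layout pass.</p>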


  















































  

    
  
    

      

      
        <figure class="
              sqs-block-image-figure
              intrinsic
            "
        >
          
        
        

        
          
            
              
              
          
            
                
                
                
                
                
                
                
                <img data-stretch="false" data-image="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/9723ef13-3fc0-4fd3-890f-98aa97048956/JourneyOfATouch_ActionExecution.webp" data-image-dimensions="3840x1887" data-image-focal-point="0.5,0.5" alt="" data-load="false" elementtiming="system-image-block" src="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/9723ef13-3fc0-4fd3-890f-98aa97048956/JourneyOfATouch_ActionExecution.webp?format=1000w" width="3840" height="1887" sizes="(max-width: 640px) 100vw, (max-width: 767px) 100vw, 100vw" onload="this.classList.add(&quot;loaded&quot;)" srcset="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/9723ef13-3fc0-4fd3-890f-98aa97048956/JourneyOfATouch_ActionExecution.webp?format=100w 100w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/9723ef13-3fc0-4fd3-890f-98aa97048956/JourneyOfATouch_ActionExecution.webp?format=300w 300w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/9723ef13-3fc0-4fd3-890f-98aa97048956/JourneyOfATouch_ActionExecution.webp?format=500w 500w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/9723ef13-3fc0-4fd3-890f-98aa97048956/JourneyOfATouch_ActionExecution.webp?format=750w 750w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/9723ef13-3fc0-4fd3-890f-98aa97048956/JourneyOfATouch_ActionExecution.webp?format=1000w 1000w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/9723ef13-3fc0-4fd3-890f-98aa97048956/JourneyOfATouch_ActionExecution.webp?format=1500w 1500w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/9723ef13-3fc0-4fd3-890f-98aa97048956/JourneyOfATouch_ActionExecution.webp?format=2500w 2500w" loading="lazy" decoding="async" data-loader="sqs">

            
          
        
            
          
        

        
          
          <figcaption class="image-caption-wrapper">
            <p data-rte-preserve-empty="true"><em>Updating the CALayer Tree and the Application’s State</em></p>
          </figcaption>
        
      
        </figure>
      

    
  


  



  
  <p class="">The next event sent by <strong>backboardd</strong> is translated to a <strong>touchUpInside</strong> event, which marks the beginning of the button deactivation animation, as well as the execution of the <code>action</code> closure’s instructions (in this case, <code>counter += 1</code>).</p><h3>Reacting to UI Update Events</h3><p class="">When the visual content of an application needs to be updated, the UI framework schedules rendering updates as part of a mechanism commonly referred to as the <strong>Render Loop</strong>. This process is typically synchronized with the device’s display (or, if more than one display is connected, the <em>fastest</em> display) using a <a href="https://developer.apple.com/documentation/quartzcore/cadisplaylink" title="Core Animation - CADisplayLink"><span><strong><em>CADisplayLink</em></strong></span></a> object.</p>
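<p class="">The per-frame time budget implied by the render loop is simply the reciprocal of the display’s refresh rate. A small helper makes the arithmetic explicit (illustrative function names, not a real API):</p>

```swift
// Frame budget arithmetic for the render loop: at a given refresh rate,
// event processing plus the CALayer commit must fit within one V-Sync
// interval, or the frame is late (a "hitch").
func frameBudgetMilliseconds(refreshRateHz: Double) -> Double {
    1000.0 / refreshRateHz
}

func isFrameLate(workMilliseconds: Double, refreshRateHz: Double) -> Bool {
    workMilliseconds > frameBudgetMilliseconds(refreshRateHz: refreshRateHz)
}
```

<p class="">At 60Hz the budget is roughly 16.67 ms per frame; at 120Hz it halves to about 8.33 ms, which is why higher refresh rates are far less forgiving of slow event handling.</p>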


  















































  

    
  
    

      

      
        <figure class="
              sqs-block-image-figure
              intrinsic
            "
        >
          
        
        

        
          
            
              
              
          
            
                
                
                
                
                
                
                
                <img data-stretch="false" data-image="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/bbdef66e-5586-4093-abc4-3e85443f8d76/JourneyOfATouch_RenderLoop.webp" data-image-dimensions="4120x1510" data-image-focal-point="0.5,0.5" alt="" data-load="false" elementtiming="system-image-block" src="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/bbdef66e-5586-4093-abc4-3e85443f8d76/JourneyOfATouch_RenderLoop.webp?format=1000w" width="4120" height="1510" sizes="(max-width: 640px) 100vw, (max-width: 767px) 100vw, 100vw" onload="this.classList.add(&quot;loaded&quot;)" srcset="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/bbdef66e-5586-4093-abc4-3e85443f8d76/JourneyOfATouch_RenderLoop.webp?format=100w 100w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/bbdef66e-5586-4093-abc4-3e85443f8d76/JourneyOfATouch_RenderLoop.webp?format=300w 300w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/bbdef66e-5586-4093-abc4-3e85443f8d76/JourneyOfATouch_RenderLoop.webp?format=500w 500w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/bbdef66e-5586-4093-abc4-3e85443f8d76/JourneyOfATouch_RenderLoop.webp?format=750w 750w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/bbdef66e-5586-4093-abc4-3e85443f8d76/JourneyOfATouch_RenderLoop.webp?format=1000w 1000w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/bbdef66e-5586-4093-abc4-3e85443f8d76/JourneyOfATouch_RenderLoop.webp?format=1500w 1500w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/bbdef66e-5586-4093-abc4-3e85443f8d76/JourneyOfATouch_RenderLoop.webp?format=2500w 2500w" loading="lazy" decoding="async" data-loader="sqs">

            
          
        
            
          
        

        
          
          <figcaption class="image-caption-wrapper">
            <p data-rte-preserve-empty="true"><em>Reacting to Events and Rendering - the Render Loop</em></p>
          </figcaption>
        
      
        </figure>
      

    
  


  



  
  <p class="">Internally, the <strong>CADisplayLink</strong> is tied to the system’s <em>V-SYNC (Vertical Synchronization)</em> events, which occur during the display’s <em>vertical blanking interval</em> (<em>VBLANK</em>), the brief moment when the display is being refreshed and not actively drawing. On a standard <em>60Hz</em> display, this interval recurs roughly every <em>16.67 milliseconds</em> (the system needs to render <em>60 frames per second</em>, or <em>one frame every 16.67 milliseconds</em>). For a <em>120Hz</em> <strong>ProMotion</strong> display, the V-Sync interval is even shorter, around <em>8.33 milliseconds</em>. </p><p class="">To prevent screen tearing and other potential issues, the application is required to <em>receive and process the event</em>, then <em>update the underlying CALayer Tree</em> (Apple calls this the <strong>event phase</strong>), all <em>within a single V-Sync interval</em>. Once this <em>synchronous</em>, <em>blocking</em> process is complete, the application moves to the <strong>commit phase</strong>, where it sends the new CALayer Tree information to another process, known as the <strong>Render Server</strong>. Since this is an inter-process communication flow, it’s also implemented through Mach Ports. </p><p class="">On iOS, the <em>Render Server</em> functionality is implemented in the <strong>backboardd</strong> process, which is why it’s marked as the <em>Render Server</em> in the Instruments application. </p><p class="">On every V-Sync interval, the <strong>backboardd</strong> process needs to complete its own tasks, split into two phases: the <strong>rendering preparation phase</strong>, followed by the <strong>render execution phase</strong>. 
In the rendering preparation phase, which runs on the CPU portion of the SoC, backboardd collects the information submitted by all applications in the foreground, then composes a final image data structure, in the form of GPU render instructions (for the Metal framework). The CPU then forwards the frame information to the GPU, which in turn draws the final image and saves it into a <strong>frame buffer</strong>. </p><p class="">By default, the render loop uses two frame buffers, in a setup known as <strong>double buffering</strong>. One buffer contains the image that is visible on the display (the <em>front buffer</em>), while the other contains the image that is being rendered (the <em>back buffer</em>). On every screen refresh, the two buffers are swapped. <strong><em>Whatever image is present in the front buffer at the time of the display’s VBLANK is what appears on the screen</em></strong>. This is why the GPU never renders directly into the front buffer (<em>if the image were incomplete, the display would simply show the incomplete image</em>). </p><p class="">As a safety mechanism, if the image in the back buffer is not completely drawn, the system falls back to a <em>triple buffering</em> mechanism, which uses two back buffers instead of one. It loads the image currently present in the front buffer into a second back buffer (usually called the <em>spare buffer</em>), while continuing to render the image in the initial back buffer. When the V-Sync event is triggered, the system swaps the front buffer with the spare buffer. </p><p class="">At the next V-Sync event, the front buffer should be swapped with the back buffer, and the spare buffer is usually destroyed. Any deviation from this process causes a <em>hitch</em>, or an issue with the animations. 
Apple describes this process in great detail in their Tech Talks, such as <a href="https://developer.apple.com/videos/play/tech-talks/10855" title="Apple Tech Talks - Explore UI Animation and the Render Loop"><span><strong><em>Explore UI animation hitches and the render loop</em></strong></span></a>. The linked talk contains detailed explanations and troubleshooting guidance that will become valuable to you later, so I am purposefully not diving into more detail here. Instead, I strongly recommend you watch that talk. Together with the information presented in this section, you should have a detailed understanding of how events are captured, processed and rendered on your device.</p>]]></content:encoded><media:content type="image/jpeg" url="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/1759252589435-ZOII8D7RG2OBUESD24FW/unsplash-image-QweeHI91iVY.jpg?format=1500w" medium="image" isDefault="true" width="1500" height="844"><media:title type="plain">The Journey of a Touch - Part V</media:title></media:content></item><item><title>The Journey of a Touch - Part IV</title><category>Intro to Apple</category><dc:creator>Samwise Prudent</dc:creator><pubDate>Sun, 28 Sep 2025 21:47:41 +0000</pubDate><link>https://www.prudentleap.com/prudent-protocol/2025/9/the-journey-of-a-touch</link><guid isPermaLink="false">6721f9295c6d593f58a1c57b:68d86448421a7f587b06ce19:68d9a3273bd03063d66200ae</guid><description><![CDATA[To deliver even the most basic functionality to your end users, a large 
number of systems need to work well, in concert. This is, of course, 
without even diving into Apple’s numerous frameworks. This time, we’ll 
explore the mechanisms through which a touch event on a phone’s screen 
reaches the Operating System.]]></description><content:encoded><![CDATA[<p class="">The main goal behind the “<a href="https://www.prudentleap.com/byte-the-apple/2025/9/about-swift-applications-and-apple-operating-systems" target="_blank">Starting with the Basics</a>” series was to show that, to deliver even the most basic functionality to your end users, a large number of systems need to work well, in concert. This is, of course, without even diving into Apple’s numerous frameworks. This was all to set the stage for what I believe to be a much more interesting topic: <em>the mechanisms through which a touch event on a phone’s screen becomes an action in an app</em>.</p>


  




<hr />
  
  <p class="">For each type of device they designed, Apple selected the most fitting operating system and the best-suited input and output mechanisms, to deliver unique experiences appropriate for the host device’s form factor. On <em>iPhone</em>, you use <strong>iOS</strong> and you interact with applications through <em>touch gestures</em>. On a <em>Mac</em>, with <strong>macOS</strong>, you typically use a <em>combination of keyboard and mouse/trackpad, as well as various other peripherals (decks, drawing tablets etc.)</em>. On an <em>Apple Watch</em>, you use a combination of the <em>Digital Crown and touch gestures</em> to interact with applications running on <strong>watchOS</strong>. On a newer <em>Vision Pro</em>, with <strong>visionOS</strong>, you use a <em>combination of eye tracking and hand gestures</em> and, at times, various <em>buttons</em>. And finally, on an <em>Apple TV</em>, you use a <em>remote control with a built-in trackpad</em> to interact with <strong>tvOS</strong>. On top of those, you have a wide variety of additional accessories you can connect to your device, as well as voice commands. </p><p class="">Regardless of the OS and the exact input technology, the main mechanisms involved are generally the same:</p><ul data-rte-list="default"><li><p class="">An <strong>input device</strong>, which captures input and translates it into a binary data packet. Then, the binary data is transmitted to the device’s SoC, through dedicated lines on a dedicated bus;</p></li><li><p class="">A <strong>driver</strong> running on the host device’s operating system kernel, which manages the communication between the input component and the operating system. It instructs the operating system how to interact with the device and how to control it. 
It also converts the device-specific data into a more generic <em>Human Interface Device (HID) package</em> (the <strong>IOKit</strong> <em>IOHIDEvent</em>);</p></li><li><p class="">An <strong>IO Manager</strong>, which aggregates and routes IO events from various connected devices to other software components. On iOS, this is <strong><em>backboardd</em></strong>; </p></li><li><p class="">An <strong>Application and Window Manager/Server</strong>, which receives <em>IOHIDEvents</em> from the IO Manager and routes them to the relevant application. The relevant application is generally determined by keeping a record of which applications are in the foreground and/or in focus, as well as the frames of their windows (their coordinates). This work is divided between the <strong><em>SpringBoard</em></strong> and <strong><em>backboardd</em></strong> processes on <strong>iOS</strong>, or the <strong><em>WindowServer</em></strong> process, on <strong>macOS</strong>;</p></li><li><p class=""><strong>Applications and Frameworks</strong>, which ultimately receive the input event and effect some change in their own state, based on the input. Here, the <strong>IOKit</strong> <em>IOHIDEvent</em> is converted to framework-specific formats, such as <strong>UITouch</strong> (<em>UIKit</em>) or an appropriate <strong>Gesture</strong> type (<em>SwiftUI</em>);</p></li><li><p class="">A <strong>Window Compositor</strong> and a <strong>Render Server</strong>, which render the images to be displayed on the screen(s), with the frequency dictated by the display with the highest refresh rate;</p></li><li><p class="">Optionally, depending on the design and technology, there can be various <strong>buffers</strong> and/or <strong>shared memory locations</strong>, on any of the previously mentioned components. 
Their purpose may vary, from ensuring that events are not lost in case of failures to various memory, speed or power consumption optimizations.</p></li></ul><p class="">Throughout this article, we are going to follow the entire journey of a touch event, from the moment an <em>end-user presses a button</em> in an application’s UI, to the moment the button’s effects are visible on the screen. </p><p class="">For reference, the application will render a single view, as shown in the screenshot below.</p>
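<p class="">As a preview of the format conversions listed above, here is a toy sketch of a packet moving through the pipeline, from a device-specific record to a generic HID event to a framework-level touch. Every type name here is a made-up placeholder, not an actual Apple type.</p>

```swift
// Sketch of the conversions along the input pipeline. Hypothetical types:
// a raw digitizer packet becomes a generic HID event (the driver's job),
// which the UI framework later wraps in its own touch type.
struct RawTouchPacket { let rawX: Int; let rawY: Int }   // from the digitizer

struct HIDTouchEvent { let x: Double; let y: Double }    // generic driver output

struct FrameworkTouch {                                  // framework-level type
    let location: (x: Double, y: Double)
    let phase: String
}

// The driver normalizes device units into logical coordinates.
func makeHIDEvent(from packet: RawTouchPacket, scale: Double) -> HIDTouchEvent {
    HIDTouchEvent(x: Double(packet.rawX) / scale, y: Double(packet.rawY) / scale)
}

// The UI framework wraps the HID event for delivery to responders.
func makeFrameworkTouch(from event: HIDTouchEvent) -> FrameworkTouch {
    FrameworkTouch(location: (event.x, event.y), phase: "began")
}
```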


  




&nbsp;










































  

    
  
    

      

      
        <figure class="
              sqs-block-image-figure
              intrinsic
            "
        >
          
        
        

        
          
            
              
              
          
            
                
                
                
                
                
                
                
                <img data-stretch="false" data-image="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/31cdd52c-aa40-491c-858b-0fb5422188a2/JourneyOfATouch_SampleScreenshot.webp" data-image-dimensions="1958x1839" data-image-focal-point="0.5,0.5" alt="" data-load="false" elementtiming="system-image-block" src="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/31cdd52c-aa40-491c-858b-0fb5422188a2/JourneyOfATouch_SampleScreenshot.webp?format=1000w" width="1958" height="1839" sizes="(max-width: 640px) 100vw, (max-width: 767px) 50vw, 50vw" onload="this.classList.add(&quot;loaded&quot;)" srcset="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/31cdd52c-aa40-491c-858b-0fb5422188a2/JourneyOfATouch_SampleScreenshot.webp?format=100w 100w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/31cdd52c-aa40-491c-858b-0fb5422188a2/JourneyOfATouch_SampleScreenshot.webp?format=300w 300w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/31cdd52c-aa40-491c-858b-0fb5422188a2/JourneyOfATouch_SampleScreenshot.webp?format=500w 500w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/31cdd52c-aa40-491c-858b-0fb5422188a2/JourneyOfATouch_SampleScreenshot.webp?format=750w 750w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/31cdd52c-aa40-491c-858b-0fb5422188a2/JourneyOfATouch_SampleScreenshot.webp?format=1000w 1000w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/31cdd52c-aa40-491c-858b-0fb5422188a2/JourneyOfATouch_SampleScreenshot.webp?format=1500w 1500w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/31cdd52c-aa40-491c-858b-0fb5422188a2/JourneyOfATouch_SampleScreenshot.webp?format=2500w 2500w" loading="lazy" decoding="async" data-loader="sqs">

            
          
        
            
          
        

        
          
          <figcaption class="image-caption-wrapper">
            <p data-rte-preserve-empty="true"><em>“The Journey of a Touch”. The initial state (left) and the final state, after the button had been pressed 10 times (right)</em></p>
          </figcaption>
        
      
        </figure>
      

    
  


  


&nbsp;
  
  <p class="">The following snippet contains the view’s code. We are going to explore SwiftUI in detail in the following sections; for now, the code is provided for reference, in case you would like to explore it further.</p>


  




<pre><code>//   JourneyOfATouch.swift
//===================
//   Created by Samwise Prudent on 23.06.2025
//   Copyright (c) 2025 Prudent Leap Software SRL. All rights reserved.
//   


import SwiftUI

struct JourneyOfATouch: View {
    @State private var counter = 0
    var body: some View {
        Spacer()
            .frame(maxHeight: 40)
        Text("The journey of a touch")
            .font(.largeTitle)
        Spacer()
        Button(action: {
            counter += 1
        }, label: {
            Text("Please press the button")
        })
        .buttonStyle(.borderedProminent)
        Text("The button has been pressed \(counter) times")
        Spacer()
    }
}

#Preview {
    JourneyOfATouch()
}</code></pre>


  
  <p class="">The diagram below offers a high-level overview of the main elements that work together to capture, interpret and transform a touch on the screen into an action that effects change in an application.</p>


  




&nbsp;










































  

    
  
    

      

      
        <figure class="
              sqs-block-image-figure
              intrinsic
            "
        >
          
        
        

        
          
            
              
              
          
            
                
                
                
                
                
                
                
                <img data-stretch="false" data-image="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/81386c86-ef96-47f8-ad5b-21b6bae899ba/JourneyOfATouch_Overview.webp" data-image-dimensions="3940x1654" data-image-focal-point="0.5,0.5" alt="" data-load="false" elementtiming="system-image-block" src="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/81386c86-ef96-47f8-ad5b-21b6bae899ba/JourneyOfATouch_Overview.webp?format=1000w" width="3940" height="1654" sizes="(max-width: 640px) 100vw, (max-width: 767px) 83.33333333333334vw, 83.33333333333334vw" onload="this.classList.add(&quot;loaded&quot;)" srcset="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/81386c86-ef96-47f8-ad5b-21b6bae899ba/JourneyOfATouch_Overview.webp?format=100w 100w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/81386c86-ef96-47f8-ad5b-21b6bae899ba/JourneyOfATouch_Overview.webp?format=300w 300w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/81386c86-ef96-47f8-ad5b-21b6bae899ba/JourneyOfATouch_Overview.webp?format=500w 500w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/81386c86-ef96-47f8-ad5b-21b6bae899ba/JourneyOfATouch_Overview.webp?format=750w 750w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/81386c86-ef96-47f8-ad5b-21b6bae899ba/JourneyOfATouch_Overview.webp?format=1000w 1000w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/81386c86-ef96-47f8-ad5b-21b6bae899ba/JourneyOfATouch_Overview.webp?format=1500w 1500w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/81386c86-ef96-47f8-ad5b-21b6bae899ba/JourneyOfATouch_Overview.webp?format=2500w 2500w" loading="lazy" decoding="async" data-loader="sqs">

            
          
        
            
          
        

        
          
          <figcaption class="image-caption-wrapper">
            <p data-rte-preserve-empty="true"><em>Journey of a Touch - High Level Overview of the main components</em></p>
          </figcaption>
        
      
        </figure>
      

    
  


  


&nbsp;
  
  <p class="sqsrte-small"><span class="sqsrte-text-color--white"><em>Note how, in the diagram above, some elements exceed the outlines of their parent structures. This is intentional, because those elements (for example, the controllers) are used as interfaces with external components.</em></span></p>


  




&nbsp;
  
  <h3>Understanding the devices</h3><p class="">The announcement of the first-generation iPhone, at <a href="https://www.apple.com/newsroom/2007/01/09Apple-Reinvents-the-Phone-with-iPhone/" title="Apple Newsroom - iPhone Launch"><span><strong><em>MacWorld 2007</em></strong></span></a>, is still considered one of the most impactful keynote presentations in recent history. With the original iPhone, Apple entered the mobile phone market in a landscape dominated by two telecommunications giants, <strong>Nokia</strong> (Finland) and <strong>Motorola</strong> (United States of America). At the time, Nokia led the market by a large margin. Its flagship <em>multimedia computer</em>, the <a href="https://en.wikipedia.org/wiki/Nokia_N95" title="Nokia"><span><strong><em>Nokia N95</em></strong></span></a>, also aimed to deliver a rich experience. Its design, however, was still heavily rooted in the classic, time-tested and resilient button-based form factor.</p><p class="">Apple’s original <strong>iPhone</strong> marked several shifts in the mobile phone market. First, it encouraged the release and adoption of a new generation of <em>Operating Systems</em>, primarily focused on touch-based mobile devices. At the time, the main Operating Systems were Nokia’s <strong>Symbian</strong> and Microsoft’s <strong>Windows Mobile</strong>, together with various proprietary Operating Systems, such as <strong>BlackBerry OS</strong>. Secondly, it solidified a <em>design language</em> around the modern smartphone, with a <em>large screen and only a few hardware buttons</em>. The latter was also made possible by the manufacturers’ move away from <em>resistive</em> (pressure-sensitive) touchscreens to a different technology, the <em>capacitive</em> touch screen. 
This technology is described in Apple’s patents, <a href="https://patentimages.storage.googleapis.com/da/d5/cb/716a79f0f8794c/US8243027.pdf" title="Apple Patent hosted on Google - US8243027"><span><strong><em>US8243027 - Touch Screen Liquid Crystal Display</em></strong></span></a>, <a href="https://patentimages.storage.googleapis.com/f3/47/11/1d06ba4060d70a/US7479949.pdf" title="Apple Patent hosted on Google - US7479949"><span><strong><em>US7479949 - Touch Screen Device, Method, and Graphical User Interface for determining commands by applying heuristics</em></strong></span></a>, <a href="https://patentimages.storage.googleapis.com/6b/5f/7b/e8c6fb038332c0/US8432371.pdf" title="Apple Patent hosted on Google - US8432371"><span><strong><em>US8432371 - Touch Screen Liquid Crystal Display</em></strong></span></a> and <a href="https://patentimages.storage.googleapis.com/2f/2b/32/40c76d7fe3c964/US7663607.pdf" title="Apple Patent hosted on Google - US7663607"><span><strong><em>US7663607 - Multipoint Touchscreen</em></strong></span></a>. The development of capacitive touch screen technology, combined with a dedicated, touch-focused User Interface, sparked a new generation of devices. This combination enabled a stylus-free experience, enriched by multitouch capabilities. If you experienced the transition, you may recall that some Android devices either didn’t support multitouch, or implemented it inconsistently. Apple’s multitouch was leaps and bounds ahead.</p><p class="">On an iPhone, iPad or Apple Watch, the screen assembly is, simultaneously, both an <strong>input</strong> and an <strong>output</strong> device. Underneath the screen’s glass, you would find a <strong>conductive layer</strong> (usually consisting of <em>Indium Tin Oxide</em> - <em>ITO</em>) and the screen’s <strong>digitizer</strong>. These components make up the main assembly that acts as the <em>input</em> side of the phone’s screen. 
Underneath those, you would find the <em>LCD</em> or <em>OLED</em> assembly, or the <em>output</em> part of the screen.</p><p class="">The screen’s input assembly, the <strong><em>Touch</em></strong><em> </em><strong><em>A</em></strong><em>pplication </em><strong><em>S</em></strong><em>pecific </em><strong><em>I</em></strong><em>ntegrated </em><strong><em>C</em></strong><em>ircuit</em> (or, in short, the <strong>digitizer</strong>), consists of an <strong>array of sensors</strong> and a <strong>touch controller</strong>. As seen in Apple’s Patents <em>US8243027B2</em> and <em>US8432371B2</em> (sheet 17, Fig. 25 of the patent material), Apple may have preferred digitizers compliant with the <strong>SPI</strong> specification, but this is not necessarily indicative of the actual implementation. SPI is slightly more complex than other similar specs (such as <strong>I2C</strong>), but it does ensure higher bandwidth. </p><p class="">The <em>Touch IC</em> itself usually resides on the phone’s <em>main logic board</em> and connects to the sensors array via a flat flexible cable. The main logic board consists of a multi-layered PCB assembly and also hosts the SoC, Modem and GSM modules, the Input/Output connectors, the main memory modules, dedicated security chips and various other modules.</p><p class="">In 2014, Apple submitted a new patent for an <strong>in-cell touch</strong> technology. As a result, the old technology became known as <strong>on-cell touch</strong>. In essence, <strong>in-cell</strong> technology eliminates the need to overlay the display and the sensing layers, by interweaving the touch sensors with the pixel cells of the display. 
Apple holds a patent for an implementation of the technology, under <a href="https://patents.google.com/patent/US20140225838A1/en" title="Apple Patent hosted on Google  - US20140225838A1"><span><strong><em>US20140225838A1 - In-Cell Touch for LED</em></strong></span></a>. </p><p class="">The iPhone hardware is controlled by its dedicated firmware and by the Apple iOS Operating System, which is based on Mac OS X.</p>


  




&nbsp;
  
  <p class=""><span class="sqsrte-text-color--white"><em>Apple does not typically publish much information about the internal components of their devices, nor do the manufacturers of these components. For this reason, the sections discussing the digitizer and its connection to the SoC are </em><strong><em>speculative</em></strong><em>. It is likely that Apple is using some variant of SPI for the bus between the digitizer and the SoC, it’s likely they are using DMA, and it’s possible that they use out-of-band hardware interrupts.</em></span></p>


  




&nbsp;
  
  <h3>Capturing a touch event and generating an Interrupt (in a hypothetical SPI-based flow)</h3><p class="">When it comes to events directly initiated by end users’ interaction with any device, from any manufacturer, the trigger is always some form of change in the <em>state</em> of an <em>input device</em>. In most cases, end users interact with smartphones by touching a region of the screen. Because the human body is a relatively good conductor, the touch disrupts the electrostatic field of the capacitive screen’s <strong>digitizer</strong>, causing a change in the state of its <em>sensors array</em> (affecting the mutual capacitance between the sensors). This change is detected by the <em>touch controller</em>, which eventually sends the change information to the <em>SoC</em> for processing. The diagram below depicts a general implementation of the touch section of a touchscreen as an input device. It abstracts away the data bus used for communication between the <em>Peripheral Touch Controller</em> (which controls the sensors array) and the <em>Host Touch Controller</em> (which orchestrates similar peripheral controllers that share the bus but serve different purposes). You can find schematics for various devices online or in dedicated software and, with enough experience, you can infer some of the design choices, but this is out of scope for now.</p>


  




&nbsp;










































  

    
  
    

      

      
        <figure class="
              sqs-block-image-figure
              intrinsic
            "
        >
          
        
        

        
          
            
              
              
          
            
                
                
                
                
                
                
                
                <img data-stretch="false" data-image="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/5c42a313-9673-4bdc-8a94-edb9328227df/JourneyOfATouch_CapturingATouchEvent.webp" data-image-dimensions="3940x1380" data-image-focal-point="0.5,0.5" alt="" data-load="false" elementtiming="system-image-block" src="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/5c42a313-9673-4bdc-8a94-edb9328227df/JourneyOfATouch_CapturingATouchEvent.webp?format=1000w" width="3940" height="1380" sizes="(max-width: 640px) 100vw, (max-width: 767px) 83.33333333333334vw, 83.33333333333334vw" onload="this.classList.add(&quot;loaded&quot;)" srcset="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/5c42a313-9673-4bdc-8a94-edb9328227df/JourneyOfATouch_CapturingATouchEvent.webp?format=100w 100w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/5c42a313-9673-4bdc-8a94-edb9328227df/JourneyOfATouch_CapturingATouchEvent.webp?format=300w 300w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/5c42a313-9673-4bdc-8a94-edb9328227df/JourneyOfATouch_CapturingATouchEvent.webp?format=500w 500w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/5c42a313-9673-4bdc-8a94-edb9328227df/JourneyOfATouch_CapturingATouchEvent.webp?format=750w 750w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/5c42a313-9673-4bdc-8a94-edb9328227df/JourneyOfATouch_CapturingATouchEvent.webp?format=1000w 1000w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/5c42a313-9673-4bdc-8a94-edb9328227df/JourneyOfATouch_CapturingATouchEvent.webp?format=1500w 1500w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/5c42a313-9673-4bdc-8a94-edb9328227df/JourneyOfATouch_CapturingATouchEvent.webp?format=2500w 2500w" loading="lazy" decoding="async" data-loader="sqs">

            
          
        
            
          
        

        
          
          <figcaption class="image-caption-wrapper">
            <p data-rte-preserve-empty="true"><em>Capturing a touch event and sending it to the SoC - Generic (in/out-cell, no bus specifications)</em></p>
          </figcaption>
        
      
        </figure>
      

    
  


  


&nbsp;
  
  <p class="">To conserve power while also maintaining the overall responsiveness of the device, <em>the digitizer’s touch controller polls the sensors array at a fixed frequency of </em><strong><em>120 Hz</em></strong><em>, regardless of the display’s refresh rate</em> (as opposed to constantly reading the array). In older devices, the touch controller polled at <strong>60 Hz</strong>. This means that, in modern devices, a <em>ProMotion</em> digitizer’s controller would cycle through its detection routine (draw power from the battery, power up the sensors array, read the mutual capacitance of the array, then power off) <em>120 times every second</em>. In other words, the controller takes a snapshot of the sensors array <em>once every 8.33 milliseconds</em>.</p>


  




&nbsp;
  
  <p class="sqsrte-small"><span class="sqsrte-text-color--white"><em>For comparison, </em><strong><em>Apple Pencil devices</em></strong><em> poll their controller at a frequency of </em><strong><em>240 Hz</em></strong><em>, while </em><strong><em>gaming mice</em></strong><em> poll their controllers at much higher frequencies, up to </em><strong><em>4000-8000 Hz</em></strong><em>. This variance in poll rates is handled by Apple’s Operating Systems and UI Frameworks, for example by use of </em></span><a href="https://developer.apple.com/documentation/uikit/getting-high-fidelity-input-with-coalesced-touches" title="Apple UIKit - Coalesced Touches"><span><span class="sqsrte-text-color--white"><strong><em>Coalesced UITouch events</em></strong></span></span></a><span class="sqsrte-text-color--white"><em>.</em></span></p>


  




&nbsp;
  
  <p class="">When the touch controller detects a change in characteristics (<em>a signal</em>), it performs an initial analysis to determine if the change represents a <em>valid</em> touch event. It performs <em>noise filtering</em>, to ensure that the detected signal is not triggered by electrical noise (such as interference with other devices) and, in some cases, it may also periodically perform various <em>calibration tasks</em> (for example, to account for <em>changes in temperature</em>, which may also affect the characteristics of the sensors array). </p><p class="">If the signal matches the parameters of a valid touch event, the digitizer converts and serializes the event’s characteristics into a special data structure, which would later be processed by other components of the host device (in our case, an iPhone).</p><p class="">The serialized touch data structure, which includes the coordinates of the touched points, as well as whether the touch gesture started or ended, together with some other potentially useful information, is then persisted into a <strong><em>F</em></strong><em>irst-</em><strong><em>I</em></strong><em>n </em><strong><em>F</em></strong><em>irst-</em><strong><em>O</em></strong><em>ut </em><strong><em>Queue</em></strong>, located in the controller’s memory. This allows the controller to queue up multiple touch events until they are processed by the host device, keeping the entire system responsive and ensuring no events are lost. </p><p class="">With its limited processing capabilities and focus on power efficiency, the digitizer can’t do much more (nor should it). Therefore, after the touch information is persisted, it needs to be transferred to the SoC for further processing. 
</p><p class="">There are many mechanisms Apple could use to transfer the touch information from the <strong><em>Touch</em></strong><em> </em><strong><em>I</em></strong><em>ntegrated </em><strong><em>C</em></strong><em>ircuit</em> to the <strong>SoC</strong>. Their initial patents mention <em>I2C</em> and <em>SPI</em> as possible buses to use for the touch screen assembly - and the schematics available online do indicate that Apple SoCs support both I2C and SPI. However, Apple is likely using a modified variant of an existing bus or a proprietary bus altogether.</p><p class=""><strong>As a hypothetical example</strong>, we are going to assume Apple uses an <strong>SPI bus</strong> with an <strong>out-of-band interrupt</strong> mechanism. While Apple didn’t publicly share the exact implementation details, it’s a plausible scenario, considering <em>MacBook trackpads do use an SPI bus</em>. Apple may prefer <a href="https://developer.apple.com/documentation/pcidriverkit/creating-custom-pcie-drivers-for-thunderbolt-devices#Support-Message-Signaled-Interrupts-in-Your-Device" title="Apple PCIDriverKit - Custom PCIe drivers for ThunderBolt Devices"><span><strong><em>Message Signaled Interrupts</em></strong></span></a>, but it’s not necessarily the case. Both approaches have advantages and disadvantages, and Apple can optimize the entire stack, from hardware to software and everything in-between. </p><p class="">Regardless of the exact implementation, though, the general mechanisms remain unchanged. 
The <em>peripheral touch controller</em> needs to <em>communicate</em> with its <em>host controller on the SoC</em> via some <em>data bus </em>and the <em>SoC’s CPU requires a signal</em> to stop its current execution and focus on processing the touch event.</p><p class="">The <a href="https://en.wikipedia.org/wiki/Serial_Peripheral_Interface" title="Wikipedia - Serial Peripheral Interface"><span><strong><em>SPI Specification</em></strong></span></a>, originally developed by Motorola, includes a dedicated data bus used for synchronous, bi-directional (full-duplex) communication between a <em>Main Node</em> (the Host Controller) and <em>one or multiple Peripheral Nodes</em>. This data bus consists of <a href="https://oshwa.org/resources/a-resolution-to-redefine-spi-signal-names/" title="Open Source Hardware Association - Updated SPI Names"><span><strong><em>four line types</em></strong></span></a> (wires):</p><ul data-rte-list="default"><li><p class=""><strong>CS</strong> (Chip Select) <strong>Line(s)</strong>, used by the Main Node to select the peripheral it talks to. In the old specification, this was called the <strong>Slave Select</strong> line. Usually, there is a dedicated line for each peripheral. </p></li><li><p class=""><strong>SCLK</strong> (Serial Clock) <strong>Line</strong>, used by the Main Node to synchronize clocks with the peripheral it selects using the CS Line</p></li><li><p class=""><strong>PICO</strong> (Peripheral In, Controller Out) <strong>Line</strong>, used by the Main Node to communicate with its selected Peripheral. In the old nomenclature, this line was <strong>MOSI</strong></p></li><li><p class=""><strong>POCI</strong> (Peripheral Out, Controller In) <strong>Line</strong>, used by the peripheral to send data to the Main Node. In the old nomenclature, this line was <strong>MISO</strong></p></li></ul><p class="">In SPI, there is no dedicated message or packet format. 
Instead, the communication is governed by a command-response pattern, where the Host controller indicates what it needs to read (e.g., a <strong>status register</strong> to know how many bytes are present in a buffer, or the <strong>content</strong> of the buffer) and the Peripheral controller provides the response. </p><p class="">By design, <strong>SPI</strong> is <em>slightly more secure</em> than <strong>I2C</strong>, another commonly used peripheral bus. SPI Hosts <em>select a specific peripheral by pulling the dedicated Chip Select (CS) line </em>(each peripheral has a dedicated pin on the host). While the data lines (PICO, POCI and SCLK) are still shared among devices, a <em>peripheral only listens if its CS line is active</em>, reducing the risk of unintended data exposure.</p><p class="">In contrast, I2C Hosts <em>broadcast packets that include the address of the intended peripheral</em>. By convention, only the controller that has the transmitted address would act on the message. Since the two I2C wires (Clock and Data) are shared across all peripherals on the bus, it’s much easier to eavesdrop on I2C traffic or spoof peripherals. </p><p class="">Regardless of interface, it’s good practice to secure the transmission with various protection mechanisms. Some common approaches are encrypted packets, peripheral authentication via dedicated security chips and/or bus zone isolation.</p><p class="">Although the SPI specification does not include an <strong>Interrupt Line</strong>, one is commonly added when designing responsive, power-efficient devices. In most cases, it is a dedicated <em>out-of-band</em> line (not part of the standard SPI bus specification).</p>


  




&nbsp;










































  

    
  
    

      

      
        <figure class="
              sqs-block-image-figure
              intrinsic
            "
        >
          
        
        

        
          
            
              
              
          
            
                
                
                
                
                
                
                
                <img data-stretch="false" data-image="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/08210e54-2e93-4cf6-9392-5dc5b97242fe/JourneyOfATouch_SOC_SPI_OOB-IRQ.webp" data-image-dimensions="3940x1380" data-image-focal-point="0.5,0.5" alt="" data-load="false" elementtiming="system-image-block" src="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/08210e54-2e93-4cf6-9392-5dc5b97242fe/JourneyOfATouch_SOC_SPI_OOB-IRQ.webp?format=1000w" width="3940" height="1380" sizes="(max-width: 640px) 100vw, (max-width: 767px) 83.33333333333334vw, 83.33333333333334vw" onload="this.classList.add(&quot;loaded&quot;)" srcset="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/08210e54-2e93-4cf6-9392-5dc5b97242fe/JourneyOfATouch_SOC_SPI_OOB-IRQ.webp?format=100w 100w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/08210e54-2e93-4cf6-9392-5dc5b97242fe/JourneyOfATouch_SOC_SPI_OOB-IRQ.webp?format=300w 300w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/08210e54-2e93-4cf6-9392-5dc5b97242fe/JourneyOfATouch_SOC_SPI_OOB-IRQ.webp?format=500w 500w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/08210e54-2e93-4cf6-9392-5dc5b97242fe/JourneyOfATouch_SOC_SPI_OOB-IRQ.webp?format=750w 750w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/08210e54-2e93-4cf6-9392-5dc5b97242fe/JourneyOfATouch_SOC_SPI_OOB-IRQ.webp?format=1000w 1000w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/08210e54-2e93-4cf6-9392-5dc5b97242fe/JourneyOfATouch_SOC_SPI_OOB-IRQ.webp?format=1500w 1500w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/08210e54-2e93-4cf6-9392-5dc5b97242fe/JourneyOfATouch_SOC_SPI_OOB-IRQ.webp?format=2500w 2500w" loading="lazy" decoding="async" data-loader="sqs">

            
          
        
            
          
        

        
          
          <figcaption class="image-caption-wrapper">
            <p data-rte-preserve-empty="true"><em>Capturing a touch event and sending it to the SoC - SPI and Out-of-Band Hardware Interrupt</em></p>
          </figcaption>
        
      
        </figure>
      

    
  


  


&nbsp;
  
  <p class="">As an example scenario, let us assume an end-user touches the <code>"Please press the button"</code> button in the interface presented at the start of this post. As long as the user touches the screen for slightly more than 8 milliseconds, the <em>Touch IC</em> will detect a change in the mutual capacitance values of its sensors array. Having confirmed the readings indicate a valid touch, it will convert the analog readings into digital information and then store the details in its local memory.</p><p class="">Having persisted the touch information into its buffer, the <strong>peripheral touch controller</strong> sends a signal on its <em>interrupt line output pin</em>. The interrupt signal is characterized by the presence of a higher voltage (relative to the ground). In slightly more technical terms, the digitizer <em>asserts the interrupt line</em>, signaling that it encountered a scenario that needs to be handled (<em>serviced</em>) by the circuit on the other end of the interrupt line. This line is connected to a dedicated <em>General Purpose Input/Output</em> connector (<strong>GPIO</strong>), which carries the signal further to the <a href="https://developer.arm.com/documentation/198123/0302/What-is-a-Generic-Interrupt-Controller-" title="ARM Documentation - The Generic Interrupt Controller"><span><strong><em>Generic Interrupt Controller</em></strong></span></a> (<strong><em>A</em></strong><em>pple </em><strong><em>I</em></strong><em>nterrupt </em><strong><em>C</em></strong><em>ontroller</em> in our case).</p><p class="">When the <strong>AIC</strong> receives the interrupt signal, it generates a request known as the <strong><em>Hardware Interrupt Request (IRQ)</em></strong>. 
The purpose of an IRQ is to halt any task the processor may be currently doing and trigger the <a href="https://developer.arm.com/documentation/ihi0048/a/Software-Examples-for-the-GIC/Processor-response-to-an-initial-interrupt?lang=en" title="ARM Documentation - Response to initial Interrupt"><span><strong><em>Interrupt Response</em></strong></span></a> process. </p><p class="">Based on system load and other criteria, the <strong>AIC</strong> then assigns a <strong>CPU Core</strong> to process the <strong>IRQ</strong>. At this moment, the AIC creates an interrupt <a href="https://developer.arm.com/documentation/ihi0048/a/Programmers-Model/CPU-interface-register-descriptions/Interrupt-Priority-Mask-Register--ICCPMR-?lang=en" title="ARM Documentation - Interrupt Priority Mask Register"><span><strong><em>mask</em></strong></span></a>, to ensure that subsequent IRQs are only directed to that core <em>in specific circumstances</em> (for example, higher-priority IRQs of a different class). Otherwise, the Core is allowed to process the interrupt until it completes the task. This process has two main advantages:</p><ul data-rte-list="default"><li><p class="">It helps prevent possible data corruption and it ensures that <em>Kernel panics</em>, for example, which would have a very high priority, would still be processed by a core that is currently engaged in an IRQ process, avoiding deadlocks;</p></li><li><p class="">It allows other cores to potentially pick up other interrupts from the same digitizer, increasing the system’s responsiveness</p></li></ul><p class="">During this time, other cores may handle other interrupts from the same digitizer and/or perform other tasks. For this reason, access to shared memory areas is controlled through a carefully designed system of mutual-exclusion locks and semaphores. This guarantees events are handled in order and ensures one core doesn’t accidentally overwrite the data another core needs. 
</p><h3>Handling the (primary) interrupt</h3><p class="">Compared to the previous model, where the <em>digitizer polls its sensors at a set frequency</em>, interrupts are handled <em>as they occur</em>. When a peripheral controller asserts its interrupt line, it maintains the voltage until its host controller signals it to <em>de-assert the line</em>. Since this signal is the <em>initial trigger</em> of any action on the CPU side, it is referred to as the <strong><em>primary</em></strong><em> interrupt</em>. Since it’s also a physical signal (not software), it’s a <em>primary </em><strong><em>hardware</em></strong><em> interrupt</em>. </p><p class="">The host controller is configured to instruct peripherals to clear the interrupt <em>after the events are dequeued</em>. This model allows the CPU Cores (which end up servicing the interrupts) to go into a <em>deep sleep state</em>. Requiring the CPU to poll every peripheral to determine if it should do anything would be highly inefficient.</p>


  




&nbsp;










































  

    
  
    

      

      
        <figure class="
              sqs-block-image-figure
              intrinsic
            "
        >
          
        
        

        
          
            
              
              
          
            
                
                
                
                
                
                
                
                <img data-stretch="false" data-image="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/11d8c9ed-e534-439e-a14e-0e566491ca61/JourneyOfATouch_ServicingTheInterrupt.webp" data-image-dimensions="3788x1342" data-image-focal-point="0.5,0.5" alt="" data-load="false" elementtiming="system-image-block" src="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/11d8c9ed-e534-439e-a14e-0e566491ca61/JourneyOfATouch_ServicingTheInterrupt.webp?format=1000w" width="3788" height="1342" sizes="(max-width: 640px) 100vw, (max-width: 767px) 83.33333333333334vw, 83.33333333333334vw" onload="this.classList.add(&quot;loaded&quot;)" srcset="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/11d8c9ed-e534-439e-a14e-0e566491ca61/JourneyOfATouch_ServicingTheInterrupt.webp?format=100w 100w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/11d8c9ed-e534-439e-a14e-0e566491ca61/JourneyOfATouch_ServicingTheInterrupt.webp?format=300w 300w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/11d8c9ed-e534-439e-a14e-0e566491ca61/JourneyOfATouch_ServicingTheInterrupt.webp?format=500w 500w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/11d8c9ed-e534-439e-a14e-0e566491ca61/JourneyOfATouch_ServicingTheInterrupt.webp?format=750w 750w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/11d8c9ed-e534-439e-a14e-0e566491ca61/JourneyOfATouch_ServicingTheInterrupt.webp?format=1000w 1000w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/11d8c9ed-e534-439e-a14e-0e566491ca61/JourneyOfATouch_ServicingTheInterrupt.webp?format=1500w 1500w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/11d8c9ed-e534-439e-a14e-0e566491ca61/JourneyOfATouch_ServicingTheInterrupt.webp?format=2500w 2500w" loading="lazy" decoding="async" data-loader="sqs">

            
          
        
            
          
        

        
          
          <figcaption class="image-caption-wrapper">
            <p data-rte-preserve-empty="true"><em>Handling (servicing) the hardware (primary) interrupt</em></p>
          </figcaption>
        
      
        </figure>
      

    
  


  


&nbsp;
  
  <p class="">During normal operation, any given <strong>CPU Core</strong> is either in a deep sleep state or it’s executing instructions from one of the many programs that might be running on the device. When an interrupt signal arrives, the core needs to temporarily suspend this work (or if it’s sleeping, it would wake up) and handle the interrupt request. This process is known as <a href="https://developer.arm.com/documentation/den0013/0400/Exception-Handling?lang=en" title="ARM Documentation - Exception Handling"><span><strong><em>Exception Handling</em></strong></span></a>(IRQs are a subtype of Exception) and it generally follows a clear pattern. First, the core switches from <em>Exception Level 0</em> (<strong>EL0</strong>, for Application execution) to <em>Exception Level 1</em> (<strong>EL1</strong>, for Rich OS or Kernel execution), to gain access to privileged Kernel instructions. After it gains the elevated privileges, it also <em>automatically</em> saves the <strong><em>C</em></strong><em>urrent </em><strong><em>P</em></strong><em>rogram </em><strong><em>S</em></strong><em>tatus </em><strong><em>R</em></strong><em>egister</em>, as well as the <strong><em>P</em></strong><em>rogram </em><strong><em>C</em></strong><em>ounter</em> (the next instruction it should execute when it resumes the interrupted task) and other special registries, in a dedicated memory location. Next, the core jumps to a special area of <em>Kernel Memory</em>, known as the <a href="https://developer.arm.com/documentation/den0013/0400/Exception-Handling/Exception-priorities/The-Vector-table" title="ARM Documentation - Interrupt Vector Table"><span><strong><em>Interrupt Vector Table</em></strong></span></a>(or, on x84, <em>Interrupt Descriptor Table</em>). 
The <strong>IVT</strong> stores short branching instructions, generally known as <em>trampolines</em> or <em>redirects</em>, which instruct the CPU to jump to another area in memory, where the appropriate long form, interrupt-specific <strong>Interrupt Service Routine</strong> is stored. The ISR runs in a <em>highly restricted kernel space, in an interrupt context</em>, and its purpose is to identify the type of device that raised the interrupt, then schedule the more complex operations required to actually service the interrupt to a device-specific handler. Code that executes in an interrupt context can only use specific memory addresses, cannot acquire locks, cannot create new memory structures, and it completely blocks the CPU Core while it runs.</p><p class="">The <em>ISR</em> is also known as <em>the top half of a driver</em>, while the device-specific handler is known as the <em>bottom half of the driver</em>. The top half of the driver handles the <em>primary (or direct) hardware interrupt</em>.</p><p class="">In their <a href="https://developer.apple.com/library/archive/documentation/Darwin/Conceptual/KernelProgramming/performance/performance.html#//apple_ref/doc/uid/TP30000905-CH207-TPXREF108" title="Apple Archives - Kernel Programming Guide"><span><strong><em>Kernel Programming Guide</em></strong></span></a> and <a href="https://developer.apple.com/library/archive/documentation/DeviceDrivers/Conceptual/IOKitFundamentals/HandlingEvents/HandlingEvents.html#//apple_ref/doc/uid/TP0000018-TPXREF106" title="Apple Archives - IOKit Foundations"><span><strong><em>Handling Interrupts in IOKit</em></strong></span></a> guides, Apple describes how, in OS X and, by extension, in iOS, this <strong>ISR</strong> is a <strong>generic low level interrupt handling routine</strong>, which Apple programs and maintains. 
Its purpose is to hand off the <em>direct interrupt</em> to a device handler (in this case, the digitizer’s handler), then schedule an <em>indirect interrupt</em> (an interrupt handler written as an <a href="https://developer.apple.com/library/archive/documentation/DeviceDrivers/Conceptual/IOKitFundamentals/HandlingEvents/HandlingEvents.html#//apple_ref/doc/uid/TP0000018-BAJFICDI" title="Apple Archives - IO Kit WorkLoop"><span><strong><em>IO Kit Work Loop</em></strong></span></a> <em>Kernel Extension (kext)</em>) and then <em>immediately clear the interrupt bit</em> in the <strong>Apple Interrupt Controller</strong>. By doing so, the <strong>Operating System Kernel</strong> (specifically the <em>Mach Scheduler</em>) gets the chance to further arrange the execution of other tasks (other IRQs or perhaps some other urgent work). As a result, the CPU Core can reload its previous state and resume execution within a reasonably short timeframe.</p><h3>Handling the indirect (secondary) interrupt</h3><p class="">Shortly after handling the primary interrupt, the system proceeds to execute the scheduled touch driver event handler’s <strong>IOKit WorkLoop</strong>. In the hypothetical case of a SPI bus, the driver could issue a request to the <em>Main SPI Controller</em> (this is the controller on the SoC), to retrieve events from the appropriate <em>Subnode (Peripheral) Controller</em>. The <em>Peripheral SPI Controller</em> would then <em>dequeue</em> the messages it has stored and transmit the binary (serialized) representation of <em>every</em> touch event it currently holds in its buffer, potentially as a batched touch-event report.</p>
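<p class="">The top half / bottom half split described above can be sketched as a toy simulation (this is illustrative Python, not Apple’s kernel code; the vector table contents, the IRQ line number and the handler names are invented for the example):</p>

```python
from collections import deque

# Toy model of the top-half / bottom-half split. The "top half" (generic ISR)
# runs in interrupt context: it only identifies the source, queues deferred
# work, and clears the interrupt bit. The "bottom half" runs later, outside
# interrupt context, and performs the real device-specific processing.

deferred_work = deque()                     # stands in for the scheduled WorkLoop
interrupt_controller = {"pending": set()}   # stands in for the interrupt controller

def raise_interrupt(irq_line):
    """The hardware asserts an interrupt line, forcing the generic ISR to run."""
    interrupt_controller["pending"].add(irq_line)
    generic_isr(irq_line)

def generic_isr(irq_line):
    # Interrupt context: no locks, no allocations, minimal work.
    handler = vector_table.get(irq_line, lambda: None)
    deferred_work.append(handler)                      # schedule the bottom half
    interrupt_controller["pending"].discard(irq_line)  # clear the interrupt bit

def digitizer_bottom_half():
    return "touch events drained from digitizer FIFO"

vector_table = {7: digitizer_bottom_half}   # 7: hypothetical digitizer IRQ line

def run_scheduler():
    # Later, the kernel scheduler runs the deferred handlers in normal context.
    return [work() for work in deferred_work]
```

<p class="">Note how the interrupt-context code does nothing beyond dispatch and acknowledgement; all substantial work waits for the scheduler.</p>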


  




&nbsp;










































  

    
  
    

      

      
        <figure class="
              sqs-block-image-figure
              intrinsic
            "
        >
          
        
        

        
          
            
              
              
          
            
                
                
                
                
                
                
                
                <img data-stretch="false" data-image="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/ebc00576-a982-42b7-ad01-55dcb6fbf116/JourneyOfATouch_SecondaryInterrupt.webp" data-image-dimensions="3789x1606" data-image-focal-point="0.5,0.5" alt="" data-load="false" elementtiming="system-image-block" src="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/ebc00576-a982-42b7-ad01-55dcb6fbf116/JourneyOfATouch_SecondaryInterrupt.webp?format=1000w" width="3789" height="1606" sizes="(max-width: 640px) 100vw, (max-width: 767px) 66.66666666666666vw, 66.66666666666666vw" onload="this.classList.add(&quot;loaded&quot;)" srcset="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/ebc00576-a982-42b7-ad01-55dcb6fbf116/JourneyOfATouch_SecondaryInterrupt.webp?format=100w 100w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/ebc00576-a982-42b7-ad01-55dcb6fbf116/JourneyOfATouch_SecondaryInterrupt.webp?format=300w 300w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/ebc00576-a982-42b7-ad01-55dcb6fbf116/JourneyOfATouch_SecondaryInterrupt.webp?format=500w 500w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/ebc00576-a982-42b7-ad01-55dcb6fbf116/JourneyOfATouch_SecondaryInterrupt.webp?format=750w 750w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/ebc00576-a982-42b7-ad01-55dcb6fbf116/JourneyOfATouch_SecondaryInterrupt.webp?format=1000w 1000w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/ebc00576-a982-42b7-ad01-55dcb6fbf116/JourneyOfATouch_SecondaryInterrupt.webp?format=1500w 1500w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/ebc00576-a982-42b7-ad01-55dcb6fbf116/JourneyOfATouch_SecondaryInterrupt.webp?format=2500w 2500w" loading="lazy" decoding="async" data-loader="sqs">

            
          
        
            
          
        

        
          
          <figcaption class="image-caption-wrapper">
            <p data-rte-preserve-empty="true"><em>Handling the secondary Interrupt</em></p>
          </figcaption>
        
      
        </figure>
      

    
  


  


&nbsp;
  
  <p class="">As Apple <a href="https://developer.apple.com/library/archive/documentation/DeviceDrivers/Conceptual/IOKitFundamentals/HandlingEvents/HandlingEvents.html#//apple_ref/doc/uid/TP0000018-BAJFICDI" title="IOKit FUndamentals - WorkLoops"><span><strong><em>describes</em></strong></span></a> them and as shown in “<a href=""><span><strong><em>Exploring Apple’s drivers ecosystem</em></strong></span></a>”, <strong>IOKit WorkLoops</strong> are essentially <em>gating mechanisms</em> that ensure <em>single-threaded access</em> to data structures used by the hardware. This is especially useful with driver constructs, which can be accessed concurrently by multiple threads for a variety of reasons (primary interrupts, timeout events and others). <a href="https://developer.apple.com/library/archive/documentation/DeviceDrivers/Conceptual/WritingDeviceDriver/Introduction/Intro.html#//apple_ref/doc/uid/TP30000694" title="Apple Archives - Device Driver Design Guidelines"><span><strong><em>Device driver design and implementation</em></strong></span></a> are complex topics, which we are <em>not</em> going to explore further. For now, it’s useful to know that IOKit WorkLoops run on <em>dedicated kernel threads</em> and they run in the <em>normal kernel space</em>, which allows them to allocate memory, acquire locks and perform more complex logic. When the event assigned to an <em>IOKit Workloop</em> is processed by the system, a more comprehensive analysis of the touch data is performed. 
The final task of the driver is to package the processed information into a standard <a href="https://developer.apple.com/documentation/iokit/iohideventstruct?language=objc" title="IOKit - IOHIDEvent Structure"><span><strong><em>IOHIDEvent</em></strong></span></a>, which represents the processed touch input in Apple’s <strong>HID Event System</strong> (you can find some examples <a href="https://github.com/apple-oss-distributions/IOHIDFamily/blob/main/IOHIDFamily/IOHIDEvent.cpp" title="Apple Open Source Repository - IOHIDFamily"><span><strong><em>here</em></strong></span></a>).</p>
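<p class="">The gating idea can be illustrated with a minimal sketch (illustrative Python, not the IOKit API; the class and method names are invented): events from any producer thread are funneled through a single queue drained by one dedicated thread, so the driver state itself needs no fine-grained locking.</p>

```python
import queue
import threading

# Toy sketch of work-loop gating: many producers (primary interrupts, timers)
# may fire concurrently, but all mutation of the driver's state happens on one
# dedicated thread that drains a FIFO queue, serializing access by design.

class ToyWorkLoop:
    def __init__(self):
        self._events = queue.Queue()
        self._state = []          # driver data, touched by exactly one thread
        self._thread = threading.Thread(target=self._run, daemon=True)
        self._thread.start()

    def _run(self):
        while True:
            action = self._events.get()
            if action is None:               # sentinel: shut the loop down
                break
            self._state.append(action())     # serialized: never concurrent

    def submit(self, action):
        """Safe to call from any thread; only enqueues, never touches state."""
        self._events.put(action)

    def stop(self):
        self._events.put(None)
        self._thread.join()
        return list(self._state)
```

<p class="">Because the queue preserves submission order and only one thread ever reads <code>_state</code>, the outcome is deterministic without per-field locks.</p>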


  




&nbsp;
  
  <p class="sqsrte-small"><span class="sqsrte-text-color--white"><em>The mechanism could be slightly different on a Mac, where Kernel Extensions are no longer supported and have been replaced with the </em></span><a href="https://developer.apple.com/documentation/driverkit" title="Apple Documentation - DriverKit"><span><span class="sqsrte-text-color--white"><strong><em>DriverKit</em></strong></span></span></a><span class="sqsrte-text-color--white"><em> </em></span><a href="https://developer.apple.com/documentation/systemextensions" title="Apple Documentation - System Extensions"><span><span class="sqsrte-text-color--white"><strong><em>System Extensions</em></strong></span></span></a><span class="sqsrte-text-color--white"><em>. System and Driver extensions were announced in WWDC19, in a </em></span><a href="https://developer.apple.com/videos/play/wwdc2019/702/" title="WWDC19 - System Extensions and DriverKit"><span><span class="sqsrte-text-color--white"><strong><em>dedicated set of talks</em></strong></span></span></a><span class="sqsrte-text-color--white"><em>.</em></span></p>


  




&nbsp;
  
  <h3>Improving the hypothetical SPI model</h3><p class="">Another, more complex (and arguably <em>more likely</em>) model, still within the hypothetical SPI scenario, involves the use of <strong>DMA</strong> (<a href="https://ww1.microchip.com/downloads/en/DeviceDoc/70223b.pdf" title="Example of a DMA Implementation"><span><strong><em>Direct Memory Access</em></strong></span></a>) enabled <strong>SPI</strong> flows. In this scenario, the CPU handles two interrupt sources: the <strong>SPI Peripheral</strong> and the <strong>DMA Controller</strong>. The driver initialization phase, which occurs when the Operating System starts up, is slightly more complex, because it involves configuring the DMA context. This process requires loading the DMA controller’s drivers, setting up the controller configuration and initializing its memory-mapped registers, among other tasks.</p>
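<p class="">A hypothetical sketch of that extra initialization step (the register names <code>SRC_ADDR</code>, <code>DST_ADDR</code>, <code>XFER_LEN</code> and <code>CTRL</code> are invented for illustration; a real controller documents its own register map):</p>

```python
# Invented register map standing in for a DMA controller's memory-mapped
# registers. Programming it once at boot is the "configuring the DMA context"
# step; after that, the controller can move data without CPU involvement.

DMA_REGS = {"SRC_ADDR": 0, "DST_ADDR": 0, "XFER_LEN": 0, "CTRL": 0}
CTRL_ENABLE = 1 << 0        # start accepting transfer requests
CTRL_IRQ_ON_DONE = 1 << 1   # raise an interrupt when a transfer completes

def init_dma_context(spi_rx_fifo_addr, ping_buffer_addr, frame_len):
    """Program the (simulated) DMA registers for SPI-to-memory transfers."""
    DMA_REGS["SRC_ADDR"] = spi_rx_fifo_addr   # read side: SPI host RX_FIFO
    DMA_REGS["DST_ADDR"] = ping_buffer_addr   # write side: system memory buffer
    DMA_REGS["XFER_LEN"] = frame_len          # bytes per touch-data frame
    DMA_REGS["CTRL"] = CTRL_ENABLE | CTRL_IRQ_ON_DONE
    return dict(DMA_REGS)
```

<p class="">The essential point is that the CPU writes this configuration once; afterwards it only reacts to completion interrupts rather than shuttling each byte itself.</p>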


  




&nbsp;










































  

    
  
    

      

      
        <figure class="
              sqs-block-image-figure
              intrinsic
            "
        >
          
        
        

        
          
            
              
              
          
            
                
                
                
                
                
                
                
                <img data-stretch="false" data-image="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/98329723-5942-4b5d-b741-f50849bc1142/JourneyOfATouch_CapturingATouch_DMA_SPI.webp" data-image-dimensions="3940x1390" data-image-focal-point="0.5,0.5" alt="" data-load="false" elementtiming="system-image-block" src="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/98329723-5942-4b5d-b741-f50849bc1142/JourneyOfATouch_CapturingATouch_DMA_SPI.webp?format=1000w" width="3940" height="1390" sizes="(max-width: 640px) 100vw, (max-width: 767px) 83.33333333333334vw, 83.33333333333334vw" onload="this.classList.add(&quot;loaded&quot;)" srcset="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/98329723-5942-4b5d-b741-f50849bc1142/JourneyOfATouch_CapturingATouch_DMA_SPI.webp?format=100w 100w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/98329723-5942-4b5d-b741-f50849bc1142/JourneyOfATouch_CapturingATouch_DMA_SPI.webp?format=300w 300w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/98329723-5942-4b5d-b741-f50849bc1142/JourneyOfATouch_CapturingATouch_DMA_SPI.webp?format=500w 500w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/98329723-5942-4b5d-b741-f50849bc1142/JourneyOfATouch_CapturingATouch_DMA_SPI.webp?format=750w 750w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/98329723-5942-4b5d-b741-f50849bc1142/JourneyOfATouch_CapturingATouch_DMA_SPI.webp?format=1000w 1000w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/98329723-5942-4b5d-b741-f50849bc1142/JourneyOfATouch_CapturingATouch_DMA_SPI.webp?format=1500w 1500w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/98329723-5942-4b5d-b741-f50849bc1142/JourneyOfATouch_CapturingATouch_DMA_SPI.webp?format=2500w 2500w" loading="lazy" decoding="async" data-loader="sqs">

            
          
        
            
          
        

        
          
          <figcaption class="image-caption-wrapper">
            <p data-rte-preserve-empty="true"><em>Capturing and processing Touch Data with a DMA-enabled SPI interface</em></p>
          </figcaption>
        
      
        </figure>
      

    
  


  


&nbsp;
  
  <p class="">The trade-off is that DMA enables the <em>transfer of data from the SPI peripheral to the system memory</em> (to be processed and encoded as an <strong>IOHIDEvent</strong>) with <a href="https://www.ti.com/lit/wp/spna105/spna105.pdf" title="Texas Instruments - DMA with High Performance Peripherals"><span><strong><em>minimal CPU intervention</em></strong></span></a>. In contrast to the previous model, which requires the <strong>CPU</strong> to <em>constantly read data</em> from the controller’s dedicated MMIO register, <strong>DMA</strong> <em>offloads most of this work</em> by coordinating the transfer of data from the host controller to the dedicated system memory area. This allows the CPU to work on other tasks in the meantime. </p><p class="">In DMA-enabled flows, the CPU is responsible for initiating the DMA-supervised data transfer, and later for reading the data from the appropriate system memory buffer, once the DMA controller signals that the transfer is complete.</p><p class="">When the <strong>SPI Peripheral</strong> detects a valid touch event, it still buffers the data - and it still asserts an interrupt to the <strong>CPU</strong>. The <em>generic ISR</em> executes, but instead of issuing a read command to the <strong>SPI Host</strong> controller, the driver’s <em>IOKit WorkLoop</em>instructs the <strong>DMA controller</strong> to perform the transfer. After doing so, the WorkLoop waits for the DMA to signal completion.</p><p class="">Next, the <strong>SPI Host</strong> controller retrieves data from the peripheral controller and store it in a dedicated <strong>RX_FIFO</strong> memory area.</p><p class="">The <strong>DMA controller</strong> then transfers data from the SPI Host controller’s RX_FIFO to a dedicated <em>System Memory Buffer</em>. Many <em>Audio</em> and <em>Touch Screen</em> implementations rely on <em>ping-pong buffers</em>. 
In these implementations, the DMA controller reads data from the peripheral, then writes to a buffer, usually named the <em>Ping buffer</em>. When the transfer for the Ping buffer is complete, the <strong>DMA controller</strong> <em>raises an interrupt</em>, signaling the <strong>CPU</strong> to <em>read from that buffer</em>. The driver’s interrupt handler responds by waking up the previously blocked WorkLoop, which had been put to sleep when the DMA transfer process was initiated.</p><p class="">The driver then reads the data from system memory, processes it through a <strong>Driver Interface</strong> and encodes it into an <strong>IOHIDEvent</strong> structure, which is then processed as a Human Interface Device Event.</p><p class="">While the CPU processes the Ping buffer, the DMA controller continues reading packets from the peripheral and writes them to <em>another area in system memory</em>, usually known as the <em>Pong buffer</em>. </p><p class="">When that transfer completes, the DMA controller raises another interrupt, to signal the CPU that the Pong buffer is ready for processing. The IOKit WorkLoop wakes up, consumes and processes the message, then blocks again. <em>This alternating process repeats as long as there is data to be transferred</em>.</p>
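<p class="">The alternating scheme can be modeled with a small simulation (illustrative Python; the buffer names and the completion callback are stand-ins for the real DMA machinery):</p>

```python
# Toy simulation of ping-pong buffering: the DMA fills one buffer while the
# CPU drains the other, and the roles swap on every completion interrupt.

class PingPong:
    def __init__(self):
        self.buffers = {"ping": [], "pong": []}
        self.dma_target = "ping"   # buffer the DMA currently writes into
        self.processed = []        # packets the CPU has consumed so far

    def dma_complete(self, packets):
        """Called when the DMA finishes filling the current target buffer."""
        filled = self.dma_target
        self.buffers[filled] = packets
        # The DMA immediately switches to the other buffer...
        self.dma_target = "pong" if filled == "ping" else "ping"
        # ...while the woken work loop drains the one that just completed.
        self.processed.extend(self.buffers[filled])
        self.buffers[filled] = []
```

<p class="">Each completion interrupt therefore hands the CPU a full buffer to process while the DMA keeps streaming into the other one, so neither side ever stalls on the other.</p>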


  




<hr />
  
  <p class="sqsrte-large"><em>To Be Continued…</em></p>]]></content:encoded><media:content type="image/jpeg" url="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/1759096088459-Z55J31PPEBMB0L63SPUR/unsplash-image-QweeHI91iVY.jpg?format=1500w" medium="image" isDefault="true" width="1500" height="844"><media:title type="plain">The Journey of a Touch - Part IV</media:title></media:content></item><item><title>Starting with the basics - Part III</title><category>Intro to Apple</category><dc:creator>Samwise Prudent</dc:creator><pubDate>Sun, 28 Sep 2025 21:05:19 +0000</pubDate><link>https://www.prudentleap.com/prudent-protocol/2025/9/starting-with-the-basics-part-iii</link><guid isPermaLink="false">6721f9295c6d593f58a1c57b:68d86448421a7f587b06ce19:68d97d6661cffb4328467ac8</guid><description><![CDATA[Soon, we’re going to explore how an input event, such as a touch on an 
iPhone screen, is eventually translated into User Interface events. To make 
it all possible, however, the Operating System needs to be able to talk to 
hardware input devices. This post will focus on Apple’s Drivers Frameworks 
and it will briefly explore the modern approach to User Level driver 
development.]]></description><content:encoded><![CDATA[<p class="">Soon, we’re going to explore how an input event, such as a touch on an iPhone screen, is eventually translated into User Interface events. To make it all possible, however, the Operating System needs to be able to talk to hardware input devices. This post will focus on Apple’s Drivers Frameworks and it will briefly explore the modern approach to User Level driver development. </p>


  




<hr />
  
  <h3>Exploring Apple’s drivers ecosystem</h3><p class="">When an Apple Operating System starts up, it loads the <strong><em>drivers</em></strong> required to interact with hardware components: connected displays, keyboards and headsets, as well as integrated hardware devices, such as an iPhone’s screen and its digitizer, or a MacBook’s trackpad. </p><p class="">At first, Apple represented physical devices using a complex set of system frameworks, libraries and tools, known as <a href="https://developer.apple.com/library/archive/documentation/DeviceDrivers/Conceptual/IOKitFundamentals/Features/Features.html#//apple_ref/doc/uid/TP0000012-TPXREF101" title="IOKit Introduction"><span><strong><em>I/O Kit</em></strong></span></a>. Later, Apple chose to extend the set of tools by introducing <a href="https://developer.apple.com/documentation/driverkit" title="DriverKit Introduction"><span><strong><em>DriverKit</em></strong></span></a>. With macOS 15, Apple further enriched the toolset with <a href="https://developer.apple.com/documentation/corehid" title="CoreHID Introduction"><span><strong><em>CoreHID</em></strong></span></a>. Each new addition made the creation and maintenance of device drivers slightly more accessible and safer. <strong><em>IOKit</em></strong> components are, in their vast majority, entities running in the <em>kernel space</em>, as <em>Kernel Extensions (kext)</em>. They have the potential to bring the entire operating system down, if bugs occur. With macOS 10.15, <strong><em>DriverKit</em></strong> introduced <em>System Extensions</em> and <em>Driver Extensions (</em><strong><em>dext</em></strong><em>)</em>, which run in the <em>user space</em>. Unlike kernel extensions, when a driver extension crashes, the operating system (<code><strong>launchd</strong></code>) simply restarts its process, with limited impact to the overall system. Both frameworks are primarily written in <strong>C++</strong> and <strong>Objective-C</strong>. 
<strong><em>CoreHID</em></strong>, introduced with macOS 15, is a <strong>Swift</strong>-based framework. </p><p class="">As mentioned in the post “<a href="https://www.prudentleap.com/byte-the-apple/2025/9/about-swift-applications-and-apple-operating-systems" target="_blank"><span><strong><em>About (Swift) Applications and (Apple) Operating Systems</em></strong></span></a>”, it’s generally a good idea to analyze complex systems in a layered approach, especially if they are unfamiliar. In the context of device driver components, the <em>highest level</em> framework is (currently) <em>CoreHID</em>, followed by <em>DriverKit</em> and ending with <em>IOKit</em>, the latter actually controlling the interactions with hardware devices.</p><p class="">At the <em>lowest level</em>, Apple built IOKit by extending <strong><em>libkern</em></strong>, the base C, Objective-C and C++ foundation of kernel modules. It consists of low level <a href="https://developer.apple.com/library/archive/documentation/DeviceDrivers/Conceptual/IOKitFundamentals/ArchitectOverview/ArchitectOverview.html#//apple_ref/doc/uid/TP0000013-TPXREF104" title="IOKit - OS Classes"><span><strong><em>OS classes</em></strong></span></a>, which you can find in the <a href="https://github.com/apple-oss-distributions/xnu/tree/main" title="Apple OSS - XNU Kernel"><span><strong><em>XNU</em></strong></span></a> repository. These classes provide implementations for base <strong>ring buffers</strong>, <strong>meta classes</strong> (classes which define the behavior and structure of other classes and their instances), <strong>mach port interfaces</strong> and so on.</p><p class="">On a <em>slightly higher level</em>, Apple abstracts devices as <a href="https://developer.apple.com/library/archive/documentation/DeviceDrivers/Conceptual/IOKitFundamentals/ArchitectOverview/ArchitectOverview.html#//apple_ref/doc/uid/TP0000013-TPXREF105" title="IOKit Base Classes"><span><strong><em>IOKit classes</em></strong></span></a>. 
These classes represent drivers as elements in a driver registry, categorize components as services and clients, and define IOKit abstractions on top of OS Classes (for example, the OS Class <a href="https://github.com/apple-oss-distributions/xnu/blob/main/iokit/Kernel/IOSharedDataQueue.cpp" title="XNU IOSharedDataQueue class"><span><strong><em>IOSharedDataQueue</em></strong></span></a> is abstracted as <a href="https://github.com/apple-oss-distributions/IOHIDFamily/blob/main/IOHIDFamily/IOHIDEventQueue.cpp" title="IOHIDEventQueue class"><span><strong><em>IOHIDEventQueue</em></strong></span></a> and <a href="https://github.com/apple-oss-distributions/IOHIDFamily/blob/main/IOHIDFamily/IOHIDEventServiceQueue.cpp" title="IOHIDEventServiceQueue class"><span><strong><em>IOHIDEventServiceQueue</em></strong></span></a>) and so on. The classes we’re interested in, in the context of Human Interface Devices, can be found in the <a href="https://github.com/apple-oss-distributions/IOHIDFamily/tree/main" title="IOHID Open Source Repository"><span><strong><em>IOHIDFamily</em></strong></span></a> and <a href="https://github.com/apple-oss-distributions/IOKitUser/tree/main" title="IOKitUser Open Source Repository"><span><strong><em>IOKitUser</em></strong></span></a> repositories.</p><p class="">With <strong>DriverKit</strong> and later <strong>CoreHID</strong>, Apple further abstracts these lower level components by providing user space accessible counterparts, via <a href="https://developer.apple.com/documentation/driverkit#Services" title="DriverKit Base Classes"><span><strong><em>DriverKit base classes</em></strong></span></a>, representing Services, Dispatch Queues, Servers and Clients.</p><p class=""><strong>CoreHID</strong> provides Swift-based abstractions on top of Apple’s lower level Human Interface Device Classes, with its own <a href="https://developer.apple.com/documentation/corehid#Interaction" title="CoreHID Base Components"><span><strong><em>Swift actors, enums, structs and 
protocols</em></strong></span></a>.</p><p class="">To see these systems in action, you can use <code><strong>ioreg -l</strong></code> to explore the <strong>Kernel IO Registry</strong>. You can also use <code><span class="sqsrte-text-color--white"><strong>hidutil list</strong></span></code> to see the <strong>HID Event System</strong> <em>services</em> and <em>devices</em> running on your Mac. You can extract a tree of the known drivers on your computer (similar to Windows’ Device Manager, but in the terminal), allowing you to take a structured approach to understanding the ecosystem. </p><p class="">The diagram below showcases the elements used by the operating system to interact with a <em>mouse</em> connected to the system. You could start by identifying the <strong>IOHIDUserDevice</strong> element for which the <strong>Product</strong> field matches the name of your mouse (look for <code>"Product" = </code> under an <code><strong>IOHIDUserDevice</strong> <strong>&lt;class IOHIDUserDevice...&gt;</strong></code> entry). You can then find related classes by going up and down the tree. This is also a useful technique to add to your tool belt.</p>
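<p class="">The “find a node, then walk up and down the tree” technique can be sketched with a toy registry (illustrative Python; the node classes and the <code>Product</code> value are invented stand-ins for real IORegistry entries):</p>

```python
# Toy stand-in for the IORegistry tree printed by `ioreg -l`. A real tree is
# far deeper; the shape here is just enough to show the search technique.

class Node:
    def __init__(self, cls, props=None, children=()):
        self.cls = cls
        self.props = props or {}
        self.parent = None
        self.children = list(children)
        for child in self.children:
            child.parent = self

def find(node, predicate):
    """Depth-first search, mirroring a top-to-bottom scan of ioreg output."""
    if predicate(node):
        return node
    for child in node.children:
        hit = find(child, predicate)
        if hit is not None:
            return hit
    return None

registry = Node("IOHIDSystem", children=[
    Node("IOHIDUserDevice", {"Product": "USB Optical Mouse"}, children=[
        Node("IOHIDInterface"),
    ]),
])

# Locate the device by its "Product" field, then walk up (parent) and
# down (children) to discover the related classes.
mouse = find(registry, lambda n: n.props.get("Product") == "USB Optical Mouse")
```

<p class="">Once the matching node is in hand, its <code>parent</code> and <code>children</code> links are exactly the “up and down the tree” moves described above.</p>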


  















































  

    
  
    

      

      
        <figure class="
              sqs-block-image-figure
              intrinsic
            "
        >
          
        
        

        
          
            
              
              
          
            
                
                
                
                
                
                
                
                <img data-stretch="false" data-image="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/f1ebef97-0d81-40cf-86a7-6aa58f39c16e/Drivers_MouseTree.webp" data-image-dimensions="4156x1834" data-image-focal-point="0.5,0.5" alt="" data-load="false" elementtiming="system-image-block" src="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/f1ebef97-0d81-40cf-86a7-6aa58f39c16e/Drivers_MouseTree.webp?format=1000w" width="4156" height="1834" sizes="(max-width: 640px) 100vw, (max-width: 767px) 100vw, 100vw" onload="this.classList.add(&quot;loaded&quot;)" srcset="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/f1ebef97-0d81-40cf-86a7-6aa58f39c16e/Drivers_MouseTree.webp?format=100w 100w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/f1ebef97-0d81-40cf-86a7-6aa58f39c16e/Drivers_MouseTree.webp?format=300w 300w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/f1ebef97-0d81-40cf-86a7-6aa58f39c16e/Drivers_MouseTree.webp?format=500w 500w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/f1ebef97-0d81-40cf-86a7-6aa58f39c16e/Drivers_MouseTree.webp?format=750w 750w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/f1ebef97-0d81-40cf-86a7-6aa58f39c16e/Drivers_MouseTree.webp?format=1000w 1000w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/f1ebef97-0d81-40cf-86a7-6aa58f39c16e/Drivers_MouseTree.webp?format=1500w 1500w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/f1ebef97-0d81-40cf-86a7-6aa58f39c16e/Drivers_MouseTree.webp?format=2500w 2500w" loading="lazy" decoding="async" data-loader="sqs">

            
          
        
            
          
        

        
          
          <figcaption class="image-caption-wrapper">
            <p data-rte-preserve-empty="true"><em>Device Tree for a Mouse, on MacOS</em></p>
          </figcaption>
        
      
        </figure>
      

    
  


  



  
  <p class="">For comparison, the MacBook Trackpad, which is an <em>integrated</em> device connected to the SoC via an SPI bus, appears in the IORegistry as shown in the diagram below. </p>


  















































  

    
  
    

      

      
        <figure class="
              sqs-block-image-figure
              intrinsic
            "
        >
          
        
        

        
          
            
              
              
          
            
                
                
                
                
                
                
                
                <img data-stretch="false" data-image="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/cd0fa083-ec90-457c-b730-db0065048cf1/Drivers_SPITrackpad.webp" data-image-dimensions="3940x837" data-image-focal-point="0.5,0.5" alt="" data-load="false" elementtiming="system-image-block" src="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/cd0fa083-ec90-457c-b730-db0065048cf1/Drivers_SPITrackpad.webp?format=1000w" width="3940" height="837" sizes="(max-width: 640px) 100vw, (max-width: 767px) 100vw, 100vw" onload="this.classList.add(&quot;loaded&quot;)" srcset="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/cd0fa083-ec90-457c-b730-db0065048cf1/Drivers_SPITrackpad.webp?format=100w 100w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/cd0fa083-ec90-457c-b730-db0065048cf1/Drivers_SPITrackpad.webp?format=300w 300w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/cd0fa083-ec90-457c-b730-db0065048cf1/Drivers_SPITrackpad.webp?format=500w 500w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/cd0fa083-ec90-457c-b730-db0065048cf1/Drivers_SPITrackpad.webp?format=750w 750w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/cd0fa083-ec90-457c-b730-db0065048cf1/Drivers_SPITrackpad.webp?format=1000w 1000w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/cd0fa083-ec90-457c-b730-db0065048cf1/Drivers_SPITrackpad.webp?format=1500w 1500w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/cd0fa083-ec90-457c-b730-db0065048cf1/Drivers_SPITrackpad.webp?format=2500w 2500w" loading="lazy" decoding="async" data-loader="sqs">

            
          
        
            
          
        

        
          
          <figcaption class="image-caption-wrapper">
            <p data-rte-preserve-empty="true"><em>Apple SPI trackpad, as seen by the IO Registry</em></p>
          </figcaption>
        
      
        </figure>
      

    
  


  


&nbsp;
  
  <p class="sqsrte-small"><span class="sqsrte-text-color--white"><em>Keep in mind that IORegistry expands the inheritance chain of both IOKit and DriverKit classes. Not everything you see there is a producer or consumer of events, or even a standalone service. It essentially presents an illustrative view of object-oriented design, with deep dependency chains operating at low levels of the system.</em></span></p>


  




&nbsp;
  
  <p class="">The main purpose of a driver is to manage the flow of data <em>to</em> and <em>from</em> a connected device. In its most basic form, a device driver needs a few basic components to accomplish its purpose:</p><ul data-rte-list="default"><li><p class=""><em>a way to describe a hardware device</em> and the mechanisms by which the driver can interact with it (<strong>Device Service</strong>)</p></li><li><p class="">a <em>memory area</em> where it can store data it receives from the device, until the higher order components can use the data (<strong>Memory Buffers</strong>)</p></li><li><p class=""><em>a way to describe higher order operating system elements</em>, which would consume data from the device, as well as mechanisms to interact with those higher order components (<strong>Operating System Services</strong>)</p></li><li><p class="">a mechanism to <em>signal</em> the Device or OS Services when data in the memory buffers is ready to be consumed</p></li><li><p class=""><em>a mechanism to properly orchestrate access to memory</em>, in <em>very tightly synchronized</em> order. Otherwise, the OS could read the wrong data at the wrong time, or the device could try to overwrite data the OS has only partly read.</p></li></ul><p class="">With these basic components, you can manage the process of collecting data from the device, storing it into a memory buffer, then passing it to the operating system clients. You can also manage the reverse process, of writing instructions from the OS Services to a memory buffer, then forwarding those commands to the device. All of this happens relatively safely, in the context of a very busy, parallel and concurrent system.</p><p class="">Modern Operating Systems rely on a large variety of devices and connections to deliver the functionality we’re accustomed to as end users. All operating systems rely on some form of a Device Tree, to discover and manage devices connected to the system. 
</p><p class="">The diagram below showcases the main base classes used by IOKit, with the exception of low level work handling classes (<strong><em>IOWorkLoop</em></strong>, <strong><em>IOCommandGate</em></strong>, <strong><em>OSAction</em></strong> etc). IOKit is a purely object-oriented framework, based on the conventions and language features provided by the C family of languages (C, C++ and, more notably, Objective-C). It relies heavily on inheritance and composition, where objects are structured as descendants of parent classes, while their functionality is composed through integration with other classes.</p>


  















































  

    
  
    

      

      
        <figure class="
              sqs-block-image-figure
              intrinsic
            "
        >
          
        
        

        
          
            
              
              
          
            
                
                
                
                
                
                
                
                <img data-stretch="false" data-image="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/479b855f-8a20-4c7b-92ff-82d74058b2ec/Drivers_Device-UserSpace.webp" data-image-dimensions="3940x1931" data-image-focal-point="0.5,0.5" alt="" data-load="false" elementtiming="system-image-block" src="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/479b855f-8a20-4c7b-92ff-82d74058b2ec/Drivers_Device-UserSpace.webp?format=1000w" width="3940" height="1931" sizes="(max-width: 640px) 100vw, (max-width: 767px) 100vw, 100vw" onload="this.classList.add(&quot;loaded&quot;)" srcset="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/479b855f-8a20-4c7b-92ff-82d74058b2ec/Drivers_Device-UserSpace.webp?format=100w 100w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/479b855f-8a20-4c7b-92ff-82d74058b2ec/Drivers_Device-UserSpace.webp?format=300w 300w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/479b855f-8a20-4c7b-92ff-82d74058b2ec/Drivers_Device-UserSpace.webp?format=500w 500w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/479b855f-8a20-4c7b-92ff-82d74058b2ec/Drivers_Device-UserSpace.webp?format=750w 750w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/479b855f-8a20-4c7b-92ff-82d74058b2ec/Drivers_Device-UserSpace.webp?format=1000w 1000w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/479b855f-8a20-4c7b-92ff-82d74058b2ec/Drivers_Device-UserSpace.webp?format=1500w 1500w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/479b855f-8a20-4c7b-92ff-82d74058b2ec/Drivers_Device-UserSpace.webp?format=2500w 2500w" loading="lazy" decoding="async" data-loader="sqs">

            
          
        
            
          
        

        
          
          <figcaption class="image-caption-wrapper">
            <p data-rte-preserve-empty="true"><em>Main classes involved in the communication between a device and a user space application</em></p>
          </figcaption>
        
      
        </figure>
      

    
  


  



  
  <p class="">In IOKit, both <em>drivers</em> and <em>services</em> that interact with drivers inherit from the <a href="https://developer.apple.com/documentation/DriverKit/IOService" title="Apple DriverKit Documentation - IOService"><span><strong><em>IOService</em></strong></span></a> class, which is a subclass of <a href="https://developer.apple.com/documentation/kernel/ioregistryentry" title="Apple Kernel Documentation - IORegistry"><span><strong><em>IORegistryEntry</em></strong></span></a>. Together, these two classes define how an object fits into the IOKit device tree, as well as the resources it requires in order to deliver its functionality. Device drivers are all grouped into <a href="https://developer.apple.com/library/archive/documentation/DeviceDrivers/Conceptual/IOKitFundamentals/Families_Ref/Families_Ref.html#//apple_ref/doc/uid/TP0000021-BABCCBIJ" title="Apple Archives - Driver Family Reference"><span><strong><em>Driver Families</em></strong></span></a>, which offer abstractions common to specific types of hardware devices. Typically, drivers are clients of the abstractions that describe their underlying bus (for example, <em>USB</em>) and they inherit from abstractions that describe their purpose (for example, <em>HID</em>).</p><p class="">Using <a href="https://developer.apple.com/library/archive/documentation/DeviceDrivers/Conceptual/HID/intro/intro.html#//apple_ref/doc/uid/TP40000970-CH202-SW1" title="Apple Archive - Driver Interfaces"><span><strong><em>Driver Interfaces</em></strong></span></a>, IOKit describes the flow of data from transport level drivers (such as <strong><em>AppleUSBHIDDriver</em></strong>) to higher order EventServices. 
In IOKit jargon, driver interfaces decode device reports and bundle their components into <strong><em>IOElement</em></strong> or <a href="https://developer.apple.com/documentation/hiddriverkit/iohidelement" title="Apple HIDDriverKit Documentation - IOHIDElement"><span><strong><em>IOHIDElement</em></strong></span></a> objects, to be further processed by <a href="https://developer.apple.com/documentation/hiddriverkit/iohiddevice" title="Apple HIDDriverKit Documentation - IOHIDDevice"><span><strong><em>IOHIDDevice</em></strong></span></a> objects. With <a href="https://developer.apple.com/documentation/hiddriverkit/iouserhideventservice" title="Apple HIDDriverKit Documentation - IOUserHIDEventService"><span><strong><em>IOUserHIDEventService</em></strong></span></a> (a service based on <a href="https://github.com/apple-oss-distributions/IOHIDFamily/blob/main/IOHIDFamily/IOHIDEventService.cpp" title="Base class for IOHIDEventService subclasses"><span><strong><em>IOHIDEventService</em></strong></span></a>), drivers gain the ability to process device-specific report messages into generic, IOKit-native <a href="https://github.com/apple-oss-distributions/IOHIDFamily/blob/main/IOHIDFamily/IOHIDEvent.cpp" title="Base class for IOHIDEvents"><span><strong><em>IOHIDEvents</em></strong></span></a>.</p><p class="">Finally, leveraging <a href="https://github.com/apple-oss-distributions/IOHIDFamily/blob/main/IOHIDFamily/IOHIDEventQueue.cpp" title="Base class for HID Event Queues"><span><strong><em>IOHIDEventQueues</em></strong></span></a> and <a href="https://developer.apple.com/documentation/kernel/iouserclient" title="Apple Kernel Documentation - IOUserClients"><span><strong><em>IOUserClients</em></strong></span></a>, drivers gain the ability to efficiently surface those IOHIDEvents to higher order services.</p><p class="">Both <strong>iOS</strong> and <strong>macOS</strong> rely on processes running in the <em>user space</em> to act as a bridge between the kernel and various 
end-user facing applications. On <strong>macOS</strong>, this process is the <strong><em>WindowServer</em></strong>. On <strong>iOS</strong>, it’s the <strong><em>backboardd</em></strong> daemon. Both <strong><em>WindowServer</em></strong> and <strong><em>backboardd</em></strong> use the system-wide singleton <a href="https://github.com/apple-oss-distributions/IOHIDFamily/blob/main/IOHIDSystem/IOHIDUserClient.cpp" title="Base class for IOHIDUserClient"><span><strong><em>IOHIDUserClient</em></strong></span></a> (a subclass of <a href="https://developer.apple.com/documentation/driverkit/iouserclient" title="DriverKit Documentation - IOUserClient"><span><strong><em>IOUserClient</em></strong></span></a>) to register as clients that consume <em>IOHIDEvents</em>.</p><p class="">For every Human Interface Device, <strong><em>backboardd</em></strong> and <strong><em>WindowServer</em></strong> also have associated <a href="https://github.com/apple-oss-distributions/IOHIDFamily/blob/main/IOHIDFamily/IOHIDEventServiceUserClient.cpp" title="Base class for dedicated device driver event clients"><span><strong><em>IOHIDEventServiceUserClient</em></strong></span></a> instances, which set up a <strong>shared memory</strong> area, implemented as a <strong>ring buffer</strong>. </p><p class="">Traces captured in Instruments also expose part of this model. They indicate that <em>backboardd</em> receives input events via <strong>IOHIDEventServicePlugin</strong>, as shown in the screenshot below.</p>


  















































  

    
  
    

      

      
        <figure class="
              sqs-block-image-figure
              intrinsic
            "
        >
          
        
        

        
          
            
              
              
          
            
                
                
                
                
                
                
                
                <img data-stretch="false" data-image="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/067f8b84-532d-4759-a776-27053603ae98/IOHIDEventServicePlugin.webp" data-image-dimensions="3546x1195" data-image-focal-point="0.5,0.5" alt="" data-load="false" elementtiming="system-image-block" src="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/067f8b84-532d-4759-a776-27053603ae98/IOHIDEventServicePlugin.webp?format=1000w" width="3546" height="1195" sizes="(max-width: 640px) 100vw, (max-width: 767px) 100vw, 100vw" onload="this.classList.add(&quot;loaded&quot;)" srcset="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/067f8b84-532d-4759-a776-27053603ae98/IOHIDEventServicePlugin.webp?format=100w 100w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/067f8b84-532d-4759-a776-27053603ae98/IOHIDEventServicePlugin.webp?format=300w 300w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/067f8b84-532d-4759-a776-27053603ae98/IOHIDEventServicePlugin.webp?format=500w 500w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/067f8b84-532d-4759-a776-27053603ae98/IOHIDEventServicePlugin.webp?format=750w 750w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/067f8b84-532d-4759-a776-27053603ae98/IOHIDEventServicePlugin.webp?format=1000w 1000w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/067f8b84-532d-4759-a776-27053603ae98/IOHIDEventServicePlugin.webp?format=1500w 1500w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/067f8b84-532d-4759-a776-27053603ae98/IOHIDEventServicePlugin.webp?format=2500w 2500w" loading="lazy" decoding="async" data-loader="sqs">

            
          
        
            
          
        

        
          
          <figcaption class="image-caption-wrapper">
            <p data-rte-preserve-empty="true"><em>IOHIDEventServicePlugin in an IOKit stack trace</em></p>
          </figcaption>
        
      
        </figure>
      

    
  


  



  
  <p class="">To transfer data from the kernel space to the user space, Apple relies heavily on <strong>Mach Messages</strong> and <strong>Mach Ports</strong> (provided by the Mach Kernel). In Apple’s ecosystem, this is the preferred <a href="https://developer.apple.com/library/archive/documentation/Darwin/Conceptual/KernelProgramming/boundaries/boundaries.html#//apple_ref/doc/uid/TP30000905-CH217-BABDECEG" title="Apple Archives - Mach Messaging"><span><strong><em>Inter-Process Communication mechanism</em></strong></span></a>. This is especially true for cross-boundary scenarios (user space to kernel or vice-versa). Apple chose these constructs for its lower level, foundational architecture because they enable <em>asynchronous</em> information delivery <em>with built-in security and flow control</em>. Because this is a pattern you will encounter often in Apple’s ecosystem, especially while debugging, it’s useful to know how it works.</p>


  




&nbsp;
  
  <p class="sqsrte-small"><span class="sqsrte-text-color--white"><em>You will likely never use Mach Ports in applications you would release on the App Store, because Apple provides dedicated, higher level abstractions (such as </em></span><a href="https://developer.apple.com/documentation/corefoundation/cfmachport" title="CoreFoundation - CFMachPort"><span><span class="sqsrte-text-color--white"><strong><em>CFMachPort</em></strong></span></span></a><span class="sqsrte-text-color--white"><em> and </em></span><a href="https://developer.apple.com/documentation/xpc" title="Apple XPC Documentation"><span><span class="sqsrte-text-color--white"><strong><em>XPC</em></strong></span></span></a><span class="sqsrte-text-color--white"><em>).</em></span></p>


  




&nbsp;
  
  <p class="">There are two main models for using <a href="https://developer.apple.com/library/archive/documentation/Darwin/Conceptual/KernelProgramming/Mach/Mach.html"><span><strong><em>Mach ports</em></strong></span></a> for inter-process communication. The first is a <strong><em>notification-based</em></strong> model, where a <em>client</em> process sends a <em>notification</em> (an empty message) to a <em>server</em> process <em>listening on a dedicated port</em>, as a signal that data is available or that a specific event has occurred. This was also the very first mechanism supported by the operating system. These notifications do not carry any payload; instead, they prompt the receiver to take further action, typically involving reading from the memory area associated with the port. </p><p class="">The second model is <strong><em>message-based</em></strong>, resembling higher level communication patterns. In this case, a Mach port <em>client</em> communicates with a Mach port <em>server</em> through a <em>full message, with a payload</em>. This mechanism is more commonly used for higher order components, such as user space process-to-process communication.</p><p class="">Access to Mach ports is controlled through integer-based flags known as <a href="https://www.gnu.org/software/hurd/gnumach-doc/Port-Rights.html" title="GNU Documentation - Mach Port Rights"><span><strong><em>Mach Port Rights</em></strong></span></a>, which indicate the operations a process may perform on the port.</p><p class="">If you are familiar with Linux constructs, Mach Ports are loosely analogous to Unix pipes or message queues, although Mach messages are structured records (which can carry payloads and port rights) rather than raw byte streams.</p>


  




&nbsp;
  
  <p class="sqsrte-small"><span class="sqsrte-text-color--white"><em>Apple also provides a higher level framework, which it recommends when developing modern software requiring </em><strong><em>Inter Process Communication</em></strong><em>. This is the </em></span><a href="https://developer.apple.com/documentation/xpc" title="Apple Documentation - XPC"><span><span class="sqsrte-text-color--white"><strong><em>XPC Services</em></strong></span></span></a><span class="sqsrte-text-color--white"><em> framework, which leverages the orchestration capabilities of the </em><strong><em>launchd</em></strong><em> daemon and the Mach Kernel’s </em><strong><em>Mach Ports</em></strong><em>.</em></span></p>


  




&nbsp;
  
  <p class="">Regardless of the model used for communication (notification or message), the kernel manages interactions via Mach Ports in the same fundamental way. When a port is registered, the thread that will use it performs its setup, then eventually <strong>blocks</strong> <em>on the port</em>. In other words, it enters a waiting, <strong>SUSPENDED</strong> state, until the kernel reactivates it to resume the execution of its loop. Only a single task can listen on a given Mach port at a time, so only a single thread is ever blocked on it. </p><p class="">When data is written into a Mach port, the kernel marks the associated listener thread as <strong>RUNNABLE</strong>. Then, the Mach Scheduler assigns the thread to a CPU Core. When the core is ready to take new work and change contexts, it picks up the runnable thread and executes its instructions. Generally, the thread <em>dequeues</em> data structures, in an operation known as <em>draining the port</em>. Apple also describes this process in the <a href="https://developer.apple.com/library/archive/documentation/Darwin/Conceptual/KernelProgramming/Mach/Mach.html#//apple_ref/doc/uid/TP30000905-CH209-CEGJEIAG" title="Apple Kernel Programming Guide - IPC"><span><strong><em>Kernel Programming Guide</em></strong></span></a>.</p>


  





  
  <p class="sqsrte-large"><em>To Be Continued…</em></p>]]></content:encoded><media:content type="image/jpeg" url="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/1759093455117-QJSLM4JQOPY1825377WO/unsplash-image-QweeHI91iVY.jpg?format=1500w" medium="image" isDefault="true" width="1500" height="844"><media:title type="plain">Starting with the basics - Part III</media:title></media:content></item><item><title>Starting with the basics - Part II</title><category>Intro to Apple</category><dc:creator>Samwise Prudent</dc:creator><pubDate>Sun, 28 Sep 2025 21:05:03 +0000</pubDate><link>https://www.prudentleap.com/prudent-protocol/2025/9/g5qv4lnf57yoyo4fx6xoq3nryjmirx</link><guid isPermaLink="false">6721f9295c6d593f58a1c57b:68d86448421a7f587b06ce19:68d96c7cab2e535c3fe23d69</guid><description><![CDATA[In the previous post, I covered some of the core concepts behind operating 
systems, in general. To better understand how these systems fit together, 
it’s useful to see them in action. Learn how to explore the user space on 
MacOS, as well as how to use Instruments on a Mac.]]></description><content:encoded><![CDATA[<p class="">In the previous post, I covered some of the core concepts behind operating systems, in general. To better understand how these systems fit together, it’s useful to see them in action.</p>


  




<hr />
  
  <h2>Exploring the User Space in MacOS X</h2><p class="">Before diving into how a SwiftUI application runs on an Apple operating system, it would be useful to become familiar with some of the main entities that constantly run on a Mac (simply because it’s easier to see there, compared to an iPhone or iPad). </p><p class="">When you turn your computer on, it has no concept of files, memory, operating system or applications. However, as it boots up, more and more services start working in concert, to provide the logical support for those concepts. The exact steps are different between <a href="https://support.apple.com/guide/security/boot-process-secac71d5623/web" title="Apple Silicon Boot Process"><span><strong><em>Apple Silicon based Macs</em></strong></span></a> and <a href="https://support.apple.com/guide/security/boot-process-sec5d0fab7c6/web" title="Intel Mac Boot Process"><span><strong><em>Intel based Macs</em></strong></span></a>, but the general approach is the same: the <strong><em>S</em></strong><em>ystem </em><strong><em>O</em></strong><em>n A </em><strong><em>C</em></strong><em>hip</em> (which, on ARM platforms, integrates the CPU, GPU and other components, rather than shipping them as separate chips) executes the instructions it finds on the <strong><em>Boot</em></strong><em> </em><strong><em>R</em></strong><em>ead </em><strong><em>O</em></strong><em>nly </em><strong><em>M</em></strong><em>emory</em> (known as the <em>Boot ROM</em>. If you remember the old days of custom jailbroken Android ROMs, this is the concept). Generally, one of the first operations is to verify the signature of the <strong>Bootloader</strong> (as a security mechanism), and, if the signature is valid, it starts it. This, in turn, executes a series of exchanges to gradually verify and load various other pieces of low level software (firmware) from every connected piece of hardware that is relevant to the base boot process (such as Security Chips, RAM, integrated Storage). Towards the end of this process, MacOS finally begins to start up. 
The very first MacOS process to execute in the <em>user space</em> is the <em>System Manager</em>, or <strong><em>launchd</em></strong>. </p><p class="">You can easily check this if you open the <a href="https://support.apple.com/guide/terminal/open-or-quit-terminal-apd5265185d-f365-44cb-8b09-71a064a42125/mac" title="Apple Support - Opening and closing the Terminal app"><span><strong><em>Terminal</em></strong></span></a> app on your Mac. Thanks to its BSD roots, MacOS supports BSD commands, many of which are very similar to Linux commands. You can run the <code>ps -eaf</code> (<a href="https://man7.org/linux/man-pages/man1/ps.1.html" title="Linux Manual - ps command"><span><strong><em>process status</em></strong></span></a>) command in your terminal. The output is ordered nicely, so you can even get a rough idea of which processes start first. You will see many lines similar to the one below:</p>


  




&nbsp;
  
  <pre><code> UID   PID  PPID   C STIME   TTY           TIME CMD
    0     1     0   ...  ... ...        ... /sbin/launchd</code></pre>


  




&nbsp;
  
  <p class="">The first row in the output is the output header. There, <strong>UID</strong> represents the <em>User ID</em> (<strong>0</strong> is <strong>root</strong>, the user with full rights in the OS), <strong>PID</strong> represents the <em>process ID</em>, <strong>PPID</strong> is the <em>process’ parent’s PID</em>. <strong>CMD</strong> represents the <em>command issued to start the process</em>. This shows that <strong><em>launchd</em></strong>, which is the system process manager for <strong>MacOS</strong>, <em>is one of the very first processes to start</em> on your Mac. It is started directly by the <strong><em>kernel_task</em></strong> process, which is the only process that has <strong>PID 0</strong> (which is why the <strong>PPID</strong> of <strong><em>launchd</em></strong> is <strong>0</strong>) - and it starts as a <strong>privileged process</strong> (as <strong>root</strong>). It does so because it needs to be able to issue requests to the Kernel (something not allowed at a non-root level, which is why you are sometimes asked to input your “root” password for very specific tasks).</p><p class="">If you analyze the full output of the command, you will see that many of the processes running on the system (particularly in the first part of the output) are started by launchd (they have <em>PPID</em> <strong>1</strong>). Eventually, after the group of processes that started with <em>PPID</em> <strong>1</strong>, you will start seeing processes that are started with another PPID. These are typically processes that other applications started. For example, you would likely see the terminal process.</p>


  




&nbsp;
  
  <pre><code>	...  1468     1   ...  ... ...         0:00.33 /System/Applications/Utilities/Terminal.app/Contents/MacOS/Terminal
	...  1469  1468   ...  ... ...    0:00.01 login -pf &lt;...&gt;
	...  1471  1469   ...  ... ...    0:00.02 -zsh
	...  1477  1471   ...  ... ...    0:00.01 ps -eaf</code></pre>


  




&nbsp;
  
  <p class="">It’s useful to know that PIDs are assigned by the operating system to processes, in consecutive order, as they are launched. The OS keeps a special PID counter, which is incremented every time a new process is started. In the example above, you can see that between <code>-zsh</code> and <code>ps -eaf</code> there are 5 missing PIDs (1472 through 1476). This indicates that, in the meantime, 5 processes likely started and finished their execution.</p><p class="">It is common, when troubleshooting applications on a computer, to check the <em>list of running processes</em>, then walk back up the <em>PID</em> -&gt; <em>PPID</em> chain, to identify relationships between processes. For example, <code>ps -eaf</code> was started by <code>-zsh</code> (Unix Z-Shell), which was started by the login command, which was itself issued by the Terminal application.</p><p class="">If you open any application (for example, the <strong>Music Player</strong>), you would see a set of dedicated entries (you can use <code>ps -eaf | grep -i music</code>). For reference, you use the pipe symbol (“|”) to take the output of one command and pass it as the input of another command. You can use <code>grep</code> to filter; <code>-i</code> is the flag that makes the filtering case insensitive and, in this case, <code>music</code> is the search pattern. In other words, the command lists the processes that are currently running on your Mac, but only shows you the filtered ones. This is very similar to a Swift filter on a collection (<code>let b = a.filter { $0.localizedCaseInsensitiveContains("music") }</code>):</p>


  




&nbsp;
  
  <pre><code>  ...  4357     1   ...  ... ...         0:02.02 /System/Applications/Music.app/Contents/MacOS/Music
  ...  4358     1   ...  ... ...         0:00.66 /System/Applications/Music.app/Contents/XPCServices/VisualizerService_x86.xpc/Contents/MacOS/VisualizerService_x86
  ...  4359     1   ...  ... ...         0:00.05 /System/Applications/Music.app/Contents/XPCServices/VisualizerService.xpc/Contents/MacOS/VisualizerService</code></pre>


  




&nbsp;
  
  <p class="">The <strong>PID</strong> for the <strong><em>Music</em></strong> application is <strong>4357</strong>. We can confirm it was indeed started by the launchd daemon, because its <strong>PPID</strong> is <strong>1</strong> (which is the PID for launchd). Additionally, right after <strong><em>launchd</em></strong> started Music, it also started two other processes: the <strong><em>VisualizerService_x86</em></strong> (PID <strong>4358</strong>) and the <strong><em>VisualizerService</em></strong> (PID <strong>4359</strong>). </p><p class="">This is another very good example of how Operating System <em>design philosophies</em>, together with <em>Kernel functionality</em>, can influence the way we write applications. When running in an Operating System, each application is <a href="https://developer.apple.com/documentation/security/app-sandbox#//apple_ref/doc/uid/TP40011183" title="Apple Security Guide - Sandbox Design"><span><strong><em>sandboxed</em></strong></span></a>. It receives <em>its own area of memory</em> and, as we’ve already seen, it receives <em>its own OS Process</em> with a dedicated Process ID. This means that, in general, one application cannot directly (willingly or accidentally) access another application’s internal memory space. However, there are cases where you need one application to have multiple isolated components. They remain sandboxed from one another, but <em>still function as one application</em> and, just as importantly, they <em>are </em><a href="https://developer.apple.com/library/archive/documentation/CoreFoundation/Conceptual/CFBundles/AboutBundles/AboutBundles.html" title="Apple Archive Documentation - Bundles"><span><strong><em>bundled</em></strong></span></a><em> together</em>. 
They are installed, updated and uninstalled together, and the separation into isolated components is not relevant to the end-user.</p><p class="">In this case, the <strong>Music</strong> application, which is the main music playback application on the Mac, also comes bundled with two <em>music visualizer components</em> (which you can bring to the foreground when the Music Application is <strong>in focus</strong>, by going to the <strong>Window</strong> menu and clicking on <strong>Visualizer</strong>). This is a common <em>application design pattern</em>, where developers can break down complex applications into <a href="https://developer.apple.com/library/archive/documentation/MacOSX/Conceptual/BPSystemStartup/Chapters/CreatingXPCServices.html" title="Apple Archives - XPC Services"><span><strong><em>separate components</em></strong></span></a>, which can then be managed independently by the OS System Manager (<em>launchd</em> in this case). The <strong>Music</strong> <em>process</em> starts as the <em>main executable</em> of the Music App. The two <strong>Visualizers</strong> are implemented as <a href="https://developer.apple.com/documentation/xpc/creating-xpc-services" title="Apple Documentation - XPC Services"><span><strong><em>XPC Services</em></strong></span></a>, and they each receive their own OS process. Besides increased security, this also ensures that, if the visualizers encounter issues (they are more prone to crashes, due to potential decoding issues or memory problems), they can crash without bringing down the Music Player. They can also be managed independently by <em>launchd</em>, which can then restart them. 
</p><p class="">When it needs to, the <strong>Music</strong> process can then communicate with the <strong>Visualizer</strong> <em>XPC Service</em>, <em>via XPC messages</em> (which are wrapped over <a href="https://developer.apple.com/library/archive/documentation/Darwin/Conceptual/KernelProgramming/Mach/Mach.html#//apple_ref/doc/uid/TP30000905-CH209-CEGJEIAG" title="Apple Archives - Mach IPC"><span><strong><em>Mach Kernel IPC Abstractions</em></strong></span></a>). Essentially, as it streams the music information, the Music process bundles a copy (or some subset) of the binary data into a <a href="https://developer.apple.com/documentation/Swift/Codable" title="Codable Struct"><span><strong><em>Codable</em></strong></span></a> struct, then sends it to the <em>Visualizer</em>. The visualizer processes the data it receives and generates the animations it needs. </p><p class="">Of the numerous processes and subsystems that start up with the Operating System, many work in concert to support the underlying functionality required for Application Graphical User Interfaces. One of those is the <strong><em>WindowServer</em></strong> process (the <strong><em>SpringBoard</em></strong> process in iOS). After <strong><em>launchd</em></strong> starts this process, there are others which follow shortly - such as the <strong><em>loginwindow</em></strong> process (which starts the login procedure) and then, some time later, the <strong><em>UserEventAgent</em></strong> process for MacOS X’s higher level system events. Shortly after, Apple’s <em>WindowManager</em> (often referred to as the <em>Stage Manager</em>) process starts up. The <em>WindowManager</em> process is responsible for arranging windows in various workspaces (desktop windows).</p><p class="">As described in Apple’s manual (in the terminal app, run the command <code>man WindowServer</code>), the <em>WindowServer</em> process is in charge of “window management, content compositing, and event routing”. 
It runs these services on multiple threads. All of these services are essential - no application would function without them - but the event routing service is perhaps the least obvious, so it merits a dedicated discussion. </p><p class="">Device drivers react to events in real time. To put this in perspective, the screen of an iPad could potentially send <em>one event every 8.33 milliseconds</em> (if its refresh rate is 120Hz). An Apple Pencil device has a polling rate of 240Hz (sending <em>one event every 4.16 ms</em>). A mouse could potentially send <em>an event once every 0.4 ms</em> (if the controller’s polling rate is 2.5 kHz). To prevent applications from being flooded with events, the WindowServer also coalesces and buffers these events. High-fidelity applications (such as drawing applications) can still access the entire array of events, by leveraging the <a href="https://developer.apple.com/documentation/uikit/uievent/coalescedtouches(for:)" title="UIKit - coalescedTouches(for:)"><span><strong><em>coalescedTouches(for:)</em></strong></span></a> API. Generally, however, applications retrieve the last touch event recorded at the beginning of the V-Sync cycle (more on this later).</p>
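To make the coalescing behavior more concrete, here is a minimal, self-contained Swift sketch. The types, names, and fixed-size frame window are assumptions for illustration only (the real WindowServer pipeline is private and far more elaborate): raw samples that arrive faster than the display refreshes are grouped into one event per refresh interval, with the intermediate samples kept available - much like UIKit exposes them through coalescedTouches(for:).

```swift
import Foundation

// Hypothetical types for illustration; the real event pipeline is private.
struct TouchSample {
    let timestamp: TimeInterval   // seconds
    let x: Double
    let y: Double
}

struct CoalescedEvent {
    let primary: TouchSample      // the last sample in the frame window
    let coalesced: [TouchSample]  // every sample received in that window
}

/// Groups raw samples (assumed sorted by timestamp) into one event per
/// refresh interval, e.g. 1.0 / 120.0 for a 120 Hz display.
func coalesce(_ samples: [TouchSample],
              refreshInterval: TimeInterval) -> [CoalescedEvent] {
    guard let first = samples.first else { return [] }
    var events: [CoalescedEvent] = []
    var window: [TouchSample] = []
    var windowEnd = first.timestamp + refreshInterval

    for sample in samples {
        // Close (and emit) every frame window that ends before this sample.
        while sample.timestamp >= windowEnd {
            if let last = window.last {
                events.append(CoalescedEvent(primary: last, coalesced: window))
                window = []
            }
            windowEnd += refreshInterval
        }
        window.append(sample)
    }
    if let last = window.last {
        events.append(CoalescedEvent(primary: last, coalesced: window))
    }
    return events
}

// A 240 Hz pencil sampled over two 120 Hz frames: four raw samples
// become two coalesced events, each carrying two samples.
let pencil = stride(from: 0.0, to: 4.0, by: 1.0).map {
    TouchSample(timestamp: $0 / 240.0, x: $0, y: $0)
}
let coalescedFrames = coalesce(pencil, refreshInterval: 1.0 / 120.0)
print(coalescedFrames.count)                         // 2
print(coalescedFrames.map { $0.coalesced.count })    // [2, 2]
```

The key design point this sketch mirrors is that coalescing loses no data: the application that only needs the primary event stays responsive, while a drawing application can still walk the full `coalesced` array for smoother strokes.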


  




&nbsp;
  
  <p class="sqsrte-small"><span class="sqsrte-text-color--white"><em>Another important part of the ecosystem consists of the </em><strong><em>device drivers</em></strong><em>. On macOS, you can see the status of the Drivers Registry by opening the Terminal and running the command </em></span><code><span class="sqsrte-text-color--white">ioreg</span></code><span class="sqsrte-text-color--white"><em>. If you would like to research further, you can check the main classes (such as </em></span><a href="https://developer.apple.com/documentation/kernel/ioregistryentry" title="Apple Kernel Documentation - IORegistryEntry"><span><span class="sqsrte-text-color--white"><strong><em>IORegistryEntry</em></strong></span></span></a><span class="sqsrte-text-color--white"><em> or </em></span><a href="https://developer.apple.com/documentation/iosurface" title="Apple Documentation - IOSurfaceRoot Class"><span><span class="sqsrte-text-color--white"><strong><em>IOSurfaceRoot</em></strong></span></span></a><span class="sqsrte-text-color--white"><em>) and build a more detailed understanding, based on the elements that run on your computer. </em></span></p><p class="sqsrte-small"><span class="sqsrte-text-color--white"><em>Finally, you can also check the connected Human Input Devices your OS recognizes at any given time, by running the command </em></span><code><span class="sqsrte-text-color--white">hidutil list</span></code><span class="sqsrte-text-color--white"><em>. Apple explains how to analyze this information in their </em></span><a href="https://developer.apple.com/documentation/corehid/discoveringhiddevicesfromterminal" title="Apple Documentation - Discovering HID Devices"><span><span class="sqsrte-text-color--white"><strong><em>“Discovering HID Devices in the Terminal”</em></strong></span></span></a><span class="sqsrte-text-color--white"><em> guide.</em></span></p>


  




&nbsp;
  
  <p class="">The <strong>Activity Monitor</strong> is another particularly useful tool for exploration and troubleshooting. You can launch it from the <em>Launchpad</em>, as a generic <a href="https://support.apple.com/guide/activity-monitor/view-information-about-processes-actmntr1001/10.14/mac/15.0" title="Apple Developer Guides - Activity Monitor as a Task Manager"><span><strong><em>Task Manager</em></strong></span></a> (similar to Windows’ Task Manager), or from the Menu Bar (&lt;Application Name&gt; -&gt; Services -&gt; Activity Monitor), as a <a href="https://developer.apple.com/tutorials/instruments/analyzing-main-thread-activity" title="Apple Developer Guides - Activity Monitor as an Instrument"><span><strong><em>Development Instrument</em></strong></span></a>. Depending on how you launch the Activity Monitor, the interface will look slightly different.</p><p class="">If you run the Activity Monitor as a <em>Task Manager</em>, you can also sample any running process. As shown in the screenshot below, you would select the process you’re interested in (WindowServer in this case).</p>


  















































  

    
  
    

      

      
        <figure class="
              sqs-block-image-figure
              intrinsic
            "
        >
          
        
        

        
          
            
              
              
          
            
                
                
                
                
                
                
                
                <img data-stretch="false" data-image="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/fafc87d4-9c78-4cc4-bd3e-8d0ec1c9e9f2/OS_App_Internals_ActivityMonitor_ProcSelection.webp" data-image-dimensions="2343x197" data-image-focal-point="0.5,0.5" alt="" data-load="false" elementtiming="system-image-block" src="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/fafc87d4-9c78-4cc4-bd3e-8d0ec1c9e9f2/OS_App_Internals_ActivityMonitor_ProcSelection.webp?format=1000w" width="2343" height="197" sizes="(max-width: 640px) 100vw, (max-width: 767px) 100vw, 100vw" onload="this.classList.add(&quot;loaded&quot;)" srcset="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/fafc87d4-9c78-4cc4-bd3e-8d0ec1c9e9f2/OS_App_Internals_ActivityMonitor_ProcSelection.webp?format=100w 100w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/fafc87d4-9c78-4cc4-bd3e-8d0ec1c9e9f2/OS_App_Internals_ActivityMonitor_ProcSelection.webp?format=300w 300w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/fafc87d4-9c78-4cc4-bd3e-8d0ec1c9e9f2/OS_App_Internals_ActivityMonitor_ProcSelection.webp?format=500w 500w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/fafc87d4-9c78-4cc4-bd3e-8d0ec1c9e9f2/OS_App_Internals_ActivityMonitor_ProcSelection.webp?format=750w 750w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/fafc87d4-9c78-4cc4-bd3e-8d0ec1c9e9f2/OS_App_Internals_ActivityMonitor_ProcSelection.webp?format=1000w 1000w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/fafc87d4-9c78-4cc4-bd3e-8d0ec1c9e9f2/OS_App_Internals_ActivityMonitor_ProcSelection.webp?format=1500w 1500w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/fafc87d4-9c78-4cc4-bd3e-8d0ec1c9e9f2/OS_App_Internals_ActivityMonitor_ProcSelection.webp?format=2500w 2500w" loading="lazy" decoding="async" data-loader="sqs">

            
          
        
            
          
        

        
          
          <figcaption class="image-caption-wrapper">
            <p data-rte-preserve-empty="true"><em>Activity Monitor - Process Selection</em></p>
          </figcaption>
        
      
        </figure>
      

    
  


  



  
  <p class="">Then, using the System Diagnostics menu, you can sample the process. When you do, the OS repeatedly takes snapshots of the process’s running threads and their call stacks (similar to what you see when hitting a breakpoint while debugging your application).</p>
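You can get a feel for what a single stack snapshot contains from within your own Swift code: Foundation exposes the current thread’s symbolicated frames via Thread.callStackSymbols. A sampler like Activity Monitor’s captures the equivalent for every thread of the target process, repeatedly, then merges the snapshots into a call graph. The function names below are placeholders for illustration.

```swift
import Foundation

// Capture the current call stack from a few nested calls - roughly the
// unit of data a sampler records on each pass.
func leaf() -> [String] { Thread.callStackSymbols }
func branch() -> [String] { leaf() }

let stackFrames = branch()
// Each entry is a frame description (index, module, address, symbol);
// the process entry point sits toward the bottom of the array.
print(stackFrames.count)
```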


  















































  

    
  
    

      

      
        <figure class="
              sqs-block-image-figure
              intrinsic
            "
        >
          
        
        

        
          
            
              
              
          
            
                
                
                
                
                
                
                
                <img data-stretch="false" data-image="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/94f1f05f-d945-4de5-914a-a04c48b0e7ce/OS_App_Internals_ActivityMonitor_SampleProcess.webp" data-image-dimensions="1920x314" data-image-focal-point="0.5,0.5" alt="" data-load="false" elementtiming="system-image-block" src="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/94f1f05f-d945-4de5-914a-a04c48b0e7ce/OS_App_Internals_ActivityMonitor_SampleProcess.webp?format=1000w" width="1920" height="314" sizes="(max-width: 640px) 100vw, (max-width: 767px) 100vw, 100vw" onload="this.classList.add(&quot;loaded&quot;)" srcset="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/94f1f05f-d945-4de5-914a-a04c48b0e7ce/OS_App_Internals_ActivityMonitor_SampleProcess.webp?format=100w 100w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/94f1f05f-d945-4de5-914a-a04c48b0e7ce/OS_App_Internals_ActivityMonitor_SampleProcess.webp?format=300w 300w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/94f1f05f-d945-4de5-914a-a04c48b0e7ce/OS_App_Internals_ActivityMonitor_SampleProcess.webp?format=500w 500w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/94f1f05f-d945-4de5-914a-a04c48b0e7ce/OS_App_Internals_ActivityMonitor_SampleProcess.webp?format=750w 750w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/94f1f05f-d945-4de5-914a-a04c48b0e7ce/OS_App_Internals_ActivityMonitor_SampleProcess.webp?format=1000w 1000w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/94f1f05f-d945-4de5-914a-a04c48b0e7ce/OS_App_Internals_ActivityMonitor_SampleProcess.webp?format=1500w 1500w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/94f1f05f-d945-4de5-914a-a04c48b0e7ce/OS_App_Internals_ActivityMonitor_SampleProcess.webp?format=2500w 2500w" loading="lazy" decoding="async" data-loader="sqs">

            
          
        
            
          
        

        
          
          <figcaption class="image-caption-wrapper">
            <p data-rte-preserve-empty="true"><em>Activity Monitor - Sample Process</em></p>
          </figcaption>
        
      
        </figure>
      

    
  


  



  
  <p class="">Once the sampling process is complete, Activity Monitor opens a dedicated <em>Sample Window</em>, where you can choose from multiple Display Options and analyze the process call graph. You can then cross-reference the calls with Apple’s documentation, to determine what each call does. Notice how, for example, <strong>2220</strong> represents the number of times the thread/call was found during the sampling period - it is not a Process ID.</p>


  















































  

    
  
    

      

      
        <figure class="
              sqs-block-image-figure
              intrinsic
            "
        >
          
        
        

        
          
            
              
              
          
            
                
                
                
                
                
                
                
                <img data-stretch="false" data-image="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/fe84f195-12d3-4952-bb9e-f24374c99625/OS_App_Internals_ActivityMonitor_SampleResults.webp" data-image-dimensions="1920x1285" data-image-focal-point="0.5,0.5" alt="" data-load="false" elementtiming="system-image-block" src="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/fe84f195-12d3-4952-bb9e-f24374c99625/OS_App_Internals_ActivityMonitor_SampleResults.webp?format=1000w" width="1920" height="1285" sizes="(max-width: 640px) 100vw, (max-width: 767px) 100vw, 100vw" onload="this.classList.add(&quot;loaded&quot;)" srcset="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/fe84f195-12d3-4952-bb9e-f24374c99625/OS_App_Internals_ActivityMonitor_SampleResults.webp?format=100w 100w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/fe84f195-12d3-4952-bb9e-f24374c99625/OS_App_Internals_ActivityMonitor_SampleResults.webp?format=300w 300w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/fe84f195-12d3-4952-bb9e-f24374c99625/OS_App_Internals_ActivityMonitor_SampleResults.webp?format=500w 500w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/fe84f195-12d3-4952-bb9e-f24374c99625/OS_App_Internals_ActivityMonitor_SampleResults.webp?format=750w 750w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/fe84f195-12d3-4952-bb9e-f24374c99625/OS_App_Internals_ActivityMonitor_SampleResults.webp?format=1000w 1000w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/fe84f195-12d3-4952-bb9e-f24374c99625/OS_App_Internals_ActivityMonitor_SampleResults.webp?format=1500w 1500w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/fe84f195-12d3-4952-bb9e-f24374c99625/OS_App_Internals_ActivityMonitor_SampleResults.webp?format=2500w 2500w" loading="lazy" decoding="async" data-loader="sqs">

            
          
        
            
          
        

        
          
          <figcaption class="image-caption-wrapper">
            <p data-rte-preserve-empty="true"><em>Activity Monitor - Sample Results</em></p>
          </figcaption>
        
      
        </figure>
      

    
  


  



  
  <p class="">You can later use these tools to explore, to some extent, how other processes and applications work. You will not see implementation details and, in many cases, the method calls are intentionally undocumented. With time, however, you will become proficient at interpreting call graphs and stack traces. </p>


  




&nbsp;
  
  <h3>Using Instruments to explore the User Space on an iPhone</h3><p class="">You can also explore the main processes running on iPhone and iPad devices, as well as the way the Operating System interacts with the applications you create. It is, however, a slightly more involved process, since you need to connect the devices to a Mac, then use Xcode’s <a href="https://developer.apple.com/tutorials/instruments" title="Apple Developer Help - Instruments"><span><strong><em>Instruments</em></strong></span></a> suite.</p><p class="">For example, you could use the Logging instrument to capture the log entries created by various processes running on the phone, grouped by the subsystem (framework) that created them. To make the initial experience easier, switch the iPhone to Airplane Mode, to minimize the number of messages the Logging instrument would intercept. As you become more comfortable with the toolset, this step will become unnecessary.</p><p class="">There are other Instruments you may want to use, such as:</p><ul data-rte-list="default"><li><p class="">The <strong>Runloops</strong> instrument, to explore the execution details of running processes’ various input-processing threads, known as <a href="https://developer.apple.com/documentation/foundation/runloop" title="Apple Foundations documentation - RunLoop"><span><strong><em>runloops</em></strong></span></a></p></li><li><p class="">The <strong>Virtual Memory Trace</strong> instrument, to explore how various structures in memory evolve as your application changes</p></li><li><p class="">The <strong>GPU</strong> and/or <strong>CPU</strong> <strong>Profiler</strong> instruments, to analyze the activities performed by the GPU and CPU sides of the SoC, respectively</p></li></ul>
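To see what feeds the Logging instrument, an app can emit its own entries through Apple’s unified logging system. The sketch below uses the real os.Logger API, but the subsystem and category strings are hypothetical placeholders - the Logging instrument groups captured entries by exactly these strings.

```swift
import os

// Hypothetical subsystem/category names - replace them with your own
// reverse-DNS identifiers. The Logging instrument groups entries by them.
let logger = Logger(subsystem: "com.example.musicapp", category: "playback")

// By default, the unified logging system treats levels differently:
logger.debug("Decoder initialized")              // captured only while streaming logs
logger.info("User tapped play")                  // kept in in-memory buffers
logger.error("Track metadata failed to decode")  // persisted to the log store
```

This code requires an Apple platform (iOS 14 / macOS 11 or later, where os.Logger is available); during a recording session, these entries show up in the Logging instrument’s timeline under the chosen subsystem.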


  




&nbsp;
  
  <p class=""><span class="sqsrte-text-color--white"><em>Although you can add multiple instruments to a single recording session and capture data from all processes, the available bandwidth is limited. Sometimes, events are dropped by various rate limiters. </em></span></p><p class=""><span class="sqsrte-text-color--white"><em>Therefore, it helps to create a simple test scenario (e.g., pressing a specific button in your application), then run it multiple times: with one instrument at a time, but across all processes - or with all instruments, but against one specific process at a time.</em></span></p><p class=""><span class="sqsrte-text-color--white"><em>The purpose of this section is to show some of the main processes running on macOS and iOS, to give you a general idea of where to look, given your specific needs.</em></span></p>


  




&nbsp;
  
  <p class="">To get started, from an Xcode project, go to <strong>Xcode -&gt; Open Developer Tool -&gt; Instruments</strong>. Then, select <strong>Logging</strong> as the initial instrument. You can add multiple other instruments - in fact, you can select a different set of instruments for each recording, to make the process easier to follow.</p>


  















































  

    
  
    

      

      
        <figure class="
              sqs-block-image-figure
              intrinsic
            "
        >
          
        
        

        
          
            
              
              
          
            
                
                
                
                
                
                
                
                <img data-stretch="false" data-image="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/01e17ccd-e66e-4baa-9614-58fa8999d079/Instruments_Open.webp" data-image-dimensions="3491x1230" data-image-focal-point="0.5,0.5" alt="" data-load="false" elementtiming="system-image-block" src="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/01e17ccd-e66e-4baa-9614-58fa8999d079/Instruments_Open.webp?format=1000w" width="3491" height="1230" sizes="(max-width: 640px) 100vw, (max-width: 767px) 100vw, 100vw" onload="this.classList.add(&quot;loaded&quot;)" srcset="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/01e17ccd-e66e-4baa-9614-58fa8999d079/Instruments_Open.webp?format=100w 100w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/01e17ccd-e66e-4baa-9614-58fa8999d079/Instruments_Open.webp?format=300w 300w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/01e17ccd-e66e-4baa-9614-58fa8999d079/Instruments_Open.webp?format=500w 500w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/01e17ccd-e66e-4baa-9614-58fa8999d079/Instruments_Open.webp?format=750w 750w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/01e17ccd-e66e-4baa-9614-58fa8999d079/Instruments_Open.webp?format=1000w 1000w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/01e17ccd-e66e-4baa-9614-58fa8999d079/Instruments_Open.webp?format=1500w 1500w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/01e17ccd-e66e-4baa-9614-58fa8999d079/Instruments_Open.webp?format=2500w 2500w" loading="lazy" decoding="async" data-loader="sqs">

            
          
        
            
          
        

        
          
          <figcaption class="image-caption-wrapper">
            <p data-rte-preserve-empty="true"><em>Xcode - Opening the Instruments App</em></p>
          </figcaption>
        
      
        </figure>
      

    
  


  



  
  <p class="">The Instruments application's User Interface is designed to streamline the viewing, organizing, and filtering of captured events while serving as a centralized repository for recording sessions across multiple test cases. The screenshot below demonstrates a scenario where logs were captured across various test cases, which are displayed in the Recorded Sessions section.</p>


  















































  

    
  
    

      

      
        <figure class="
              sqs-block-image-figure
              intrinsic
            "
        >
          
        
        

        
          
            
              
              
          
            
                
                
                
                
                
                
                
                <img data-stretch="false" data-image="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/32b4d6a7-ec55-4a60-bbb1-a1eeb24d5765/Instruments_InterfaceOverview.webp" data-image-dimensions="3840x2045" data-image-focal-point="0.5,0.5" alt="" data-load="false" elementtiming="system-image-block" src="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/32b4d6a7-ec55-4a60-bbb1-a1eeb24d5765/Instruments_InterfaceOverview.webp?format=1000w" width="3840" height="2045" sizes="(max-width: 640px) 100vw, (max-width: 767px) 100vw, 100vw" onload="this.classList.add(&quot;loaded&quot;)" srcset="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/32b4d6a7-ec55-4a60-bbb1-a1eeb24d5765/Instruments_InterfaceOverview.webp?format=100w 100w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/32b4d6a7-ec55-4a60-bbb1-a1eeb24d5765/Instruments_InterfaceOverview.webp?format=300w 300w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/32b4d6a7-ec55-4a60-bbb1-a1eeb24d5765/Instruments_InterfaceOverview.webp?format=500w 500w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/32b4d6a7-ec55-4a60-bbb1-a1eeb24d5765/Instruments_InterfaceOverview.webp?format=750w 750w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/32b4d6a7-ec55-4a60-bbb1-a1eeb24d5765/Instruments_InterfaceOverview.webp?format=1000w 1000w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/32b4d6a7-ec55-4a60-bbb1-a1eeb24d5765/Instruments_InterfaceOverview.webp?format=1500w 1500w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/32b4d6a7-ec55-4a60-bbb1-a1eeb24d5765/Instruments_InterfaceOverview.webp?format=2500w 2500w" loading="lazy" decoding="async" data-loader="sqs">

            
          
        
            
          
        

        
          
          <figcaption class="image-caption-wrapper">
            <p data-rte-preserve-empty="true"><em>Instruments Application - Interface Overview</em></p>
          </figcaption>
        
      
        </figure>
      

    
  


  



  
  <p class="">The Instruments UI also features a central <strong>Toolbar</strong>, similar to Xcode, allowing you to select your target device and specify which process to monitor during recording sessions. You can configure the recorder to capture all processes system-wide or focus on a specific application, system process, or app extension. The toolbar also displays currently running applications on the connected device, functioning similarly to the <code>ps -eaf</code> command in the macOS Terminal.</p><p class="">As shown in the screenshot below, you have several monitoring options to choose from. Like macOS, iOS assigns process IDs (PIDs) sequentially. In other words, lower PIDs indicate processes that started closer to system boot time. Note that certain processes cannot be monitored through Instruments due to security restrictions.</p>


  















































  

    
  
    

      

      
        <figure class="
              sqs-block-image-figure
              intrinsic
            "
        >
          
        
        

        
          
            
              
              
          
            
                
                
                
                
                
                
                
                <img data-stretch="false" data-image="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/9a024dbe-f114-4c35-92d0-1df33db07777/Instruments_ProcessBindSelection.webp" data-image-dimensions="3840x2045" data-image-focal-point="0.5,0.5" alt="" data-load="false" elementtiming="system-image-block" src="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/9a024dbe-f114-4c35-92d0-1df33db07777/Instruments_ProcessBindSelection.webp?format=1000w" width="3840" height="2045" sizes="(max-width: 640px) 100vw, (max-width: 767px) 100vw, 100vw" onload="this.classList.add(&quot;loaded&quot;)" srcset="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/9a024dbe-f114-4c35-92d0-1df33db07777/Instruments_ProcessBindSelection.webp?format=100w 100w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/9a024dbe-f114-4c35-92d0-1df33db07777/Instruments_ProcessBindSelection.webp?format=300w 300w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/9a024dbe-f114-4c35-92d0-1df33db07777/Instruments_ProcessBindSelection.webp?format=500w 500w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/9a024dbe-f114-4c35-92d0-1df33db07777/Instruments_ProcessBindSelection.webp?format=750w 750w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/9a024dbe-f114-4c35-92d0-1df33db07777/Instruments_ProcessBindSelection.webp?format=1000w 1000w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/9a024dbe-f114-4c35-92d0-1df33db07777/Instruments_ProcessBindSelection.webp?format=1500w 1500w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/9a024dbe-f114-4c35-92d0-1df33db07777/Instruments_ProcessBindSelection.webp?format=2500w 2500w" loading="lazy" decoding="async" data-loader="sqs">

            
          
        
            
          
        

        
          
          <figcaption class="image-caption-wrapper">
            <p data-rte-preserve-empty="true"><em>Instruments Application - Process Bind Selection</em></p>
          </figcaption>
        
      
        </figure>
      

    
  


  



  
  <p class="">As an example, we could analyze the processes involved in starting the Music application on an iPhone running iOS 26, when pressing its icon on the Dock (the area at the bottom of the Home Screen on an iOS device). </p><p class="">To make the session easier to follow, switch the device to Airplane Mode. Since Airplane Mode turns off all wireless services, make sure the device is paired with your Mac via a cable. Also, make sure the device is unlocked and that True Tone is off (again, to minimize the number of events the logger would capture). Finally, switch the Recording Mode to <strong>Deferred</strong>, so the logger doesn’t drop events.</p>


  















































  

    
  
    

      

      
        <figure class="
              sqs-block-image-figure
              intrinsic
            "
        >
          
        
        

        
          
            
              
              
          
            
                
                
                
                
                
                
                
                <img data-stretch="false" data-image="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/7696c38a-2ce8-4aa1-9e18-a35b79b8f38e/Instruments_RecordingSessionSetup.webp" data-image-dimensions="3840x2044" data-image-focal-point="0.5,0.5" alt="" data-load="false" elementtiming="system-image-block" src="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/7696c38a-2ce8-4aa1-9e18-a35b79b8f38e/Instruments_RecordingSessionSetup.webp?format=1000w" width="3840" height="2044" sizes="(max-width: 640px) 100vw, (max-width: 767px) 100vw, 100vw" onload="this.classList.add(&quot;loaded&quot;)" srcset="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/7696c38a-2ce8-4aa1-9e18-a35b79b8f38e/Instruments_RecordingSessionSetup.webp?format=100w 100w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/7696c38a-2ce8-4aa1-9e18-a35b79b8f38e/Instruments_RecordingSessionSetup.webp?format=300w 300w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/7696c38a-2ce8-4aa1-9e18-a35b79b8f38e/Instruments_RecordingSessionSetup.webp?format=500w 500w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/7696c38a-2ce8-4aa1-9e18-a35b79b8f38e/Instruments_RecordingSessionSetup.webp?format=750w 750w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/7696c38a-2ce8-4aa1-9e18-a35b79b8f38e/Instruments_RecordingSessionSetup.webp?format=1000w 1000w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/7696c38a-2ce8-4aa1-9e18-a35b79b8f38e/Instruments_RecordingSessionSetup.webp?format=1500w 1500w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/7696c38a-2ce8-4aa1-9e18-a35b79b8f38e/Instruments_RecordingSessionSetup.webp?format=2500w 2500w" loading="lazy" decoding="async" data-loader="sqs">

            
          
        
            
          
        

        
          
          <figcaption class="image-caption-wrapper">
            <p data-rte-preserve-empty="true"><em>Instruments Application - Prepare Recording Session</em></p>
          </figcaption>
        
      
        </figure>
      

    
  


  



  
  <p class="">In the Instruments application, start recording, then touch the Music icon and, finally, stop the recording. The faster you execute this chain of events (start recording, start the Music app, stop recording), the shorter the session will be and the easier it will be to parse.</p><p class="">While your mouse cursor hovers over the timeline, you can zoom in and out using the <strong>option (⎇) key + the mouse wheel up or down</strong>. Alternatively, you can use <strong>cmd (⌘)</strong> and <strong>+/-</strong> to zoom in and out, respectively. Zooming is particularly useful when, besides the logger, the recording contains other instruments, because you can more clearly correlate events across processes with various CPU/GPU/memory operations. In the screenshot below, the highlighted area represents an area of interest: it starts around the moment the Music icon was touched and ends around the moment the Music application finishes starting up.</p>


  















































  

    
  
    

      

      
        <figure class="
              sqs-block-image-figure
              intrinsic
            "
        >
          
        
        

        
          
            
              
              
          
            
                
                
                
                
                
                
                
                <img data-stretch="false" data-image="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/187a5ac2-8568-49ec-a4c1-9af249f35f7b/Instruments_RecordingSessionExample.webp" data-image-dimensions="3840x2044" data-image-focal-point="0.5,0.5" alt="" data-load="false" elementtiming="system-image-block" src="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/187a5ac2-8568-49ec-a4c1-9af249f35f7b/Instruments_RecordingSessionExample.webp?format=1000w" width="3840" height="2044" sizes="(max-width: 640px) 100vw, (max-width: 767px) 100vw, 100vw" onload="this.classList.add(&quot;loaded&quot;)" srcset="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/187a5ac2-8568-49ec-a4c1-9af249f35f7b/Instruments_RecordingSessionExample.webp?format=100w 100w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/187a5ac2-8568-49ec-a4c1-9af249f35f7b/Instruments_RecordingSessionExample.webp?format=300w 300w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/187a5ac2-8568-49ec-a4c1-9af249f35f7b/Instruments_RecordingSessionExample.webp?format=500w 500w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/187a5ac2-8568-49ec-a4c1-9af249f35f7b/Instruments_RecordingSessionExample.webp?format=750w 750w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/187a5ac2-8568-49ec-a4c1-9af249f35f7b/Instruments_RecordingSessionExample.webp?format=1000w 1000w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/187a5ac2-8568-49ec-a4c1-9af249f35f7b/Instruments_RecordingSessionExample.webp?format=1500w 1500w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/187a5ac2-8568-49ec-a4c1-9af249f35f7b/Instruments_RecordingSessionExample.webp?format=2500w 2500w" loading="lazy" decoding="async" data-loader="sqs">

            
          
        
            
          
        

        
          
          <figcaption class="image-caption-wrapper">
            <p data-rte-preserve-empty="true"><em>Instruments Application - Recorded Session Example</em></p>
          </figcaption>
        
      
        </figure>
      

    
  


  



  
  <p class="">According to the log timestamps, the entire process, from the moment the touch was registered until the application fully loaded, took under 200 milliseconds (or a little under 12 frames at a 60 Hz refresh rate).</p>
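<p class="">As a quick sanity check on that frame count (assuming the common 60 Hz display refresh rate):</p>

```swift
// One frame at 60 Hz lasts 1/60 s ≈ 16.7 ms.
let refreshRate = 60.0                      // frames per second (assumed)
let launchDuration = 0.2                    // seconds - roughly the span of the log timestamps
let frames = launchDuration * refreshRate   // = 12 frames
```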


  





  
     <table>
        
        <thead>
            <tr>
                <th>
                				Timestamp
                			</th>
                <th>
                				Process
                			</th>
                <th>
                				Subsystem
                			</th>
                <th>
                				Event (Observations)
                			</th>
                <th>
                				Message
                			</th>
            </tr>
        </thead>
        <tbody>
            <tr>
                <td>
                				00:01.769.459
                			</td>
                <td>
                				backboardd
                			</td>
                <td>
                				com.apple.Multitouch
                			</td>
                <td>
                				TouchEvent
                			</td>
                <td>
                				Dispatching event with 1 children, _eventMask=0x23 _childEventMask=0x3 Cancel=0 Touching=1 inRange=1 (deviceID 0x…)
                			</td>
            </tr>
            <tr>
                <td>
                				00:01.769.934
                			</td>
                <td>
                				backboardd
                			</td>
                <td>
                				com.apple.BackBoard
                			</td>
                <td>
                				TouchEvents
                			</td>
                <td>
                				Touch entered (insideExclusive) &lt;BKSHitTestRegion: 0x77e7bfb80; {{0, 0}, {428, 926}}; exclusive: {{0, 0}, {428, 926}}&gt;
                			</td>
            </tr>
            <tr>
                <td>
                				00:01.783.911
                			</td>
                <td>
                				SpringBoard
                			</td>
                <td>
                				com.apple.UIKit
                			</td>
                <td>
                				EventDispatch
                			</td>
                <td>
                				Evaluating dispatch of UIEvent: 0xd0cf42c00; type: 0; subtype: 0; backing type: 11; shouldSend: 1; ignoreInteractionEvents: 0, systemGestureStateChange: 0
                			</td>
            </tr>
            <tr>
                <td>
                				00:01.784.005
                			</td>
                <td>
                				SpringBoard
                			</td>
                <td>
                				com.apple.UIKit
                			</td>
                <td>
                				EventDispatch (send to application window)
                			</td>
                <td>
                				Sending UIEvent type: 0; subtype: 0; to window: &lt;_UISystemGestureWindow: 0xd0ce22300&gt;; contextId: 0xa0e094e4
                			</td>
            </tr>
            <tr>
                <td>
                				00:01.793.924
                			</td>
                <td>
                				SpringBoard
                			</td>
                <td>
                				com.apple.SpringBoard
                			</td>
                <td>
                				Icon (register tap on Icon)
                			</td>
                <td>
                				Allowing tap for icon view &#39;com.apple.Music&#39;
                			</td>
            </tr>
            <tr>
                <td>
                				00:01.795.476
                			</td>
                <td>
                				SpringBoard
                			</td>
                <td>
                				com.apple.UIKit
                			</td>
                <td>
                				EventDispatch (send to application window)
                			</td>
                <td>
                				Sending UIEvent type: 0; subtype: 0; to window: &lt;SBHomeScreenWindow: 0xd0b129c00&gt;; contextId: 0xe51eda8f
                			</td>
            </tr>
            <tr>
                <td>
                				00:01.799.194
                			</td>
                <td>
                				SpringBoard 
                			</td>
                <td>
                				com.apple.SpringBoard
                			</td>
                <td>
                				Icon (process Touch)
                			</td>
                <td>

                    				SBIconView touches began with event: &lt;UITouchesEvent: 0xd0c20f200&gt; timestamp: 6200.01 touches: {(&lt;UITouch: 0xd097501c0&gt; type: Direct; phase: Began; is pointer: NO; tap count: 1; … 
                    <strong>location in window</strong>
                    : {373, 859.33333333333326}; …
                    			
                </td>
            </tr>
            <tr>
                <td>
                				00:01.869.166
                			</td>
                <td>
                				SpringBoard
                			</td>
                <td>
                				com.apple.SpringBoard
                			</td>
                <td>
                				Icon (Start Launching)
                			</td>
                <td>
                				Launching application com.apple.Music from icon &lt;private&gt;, location: SBIconLocationDock
                			</td>
            </tr>
            <tr>
                <td>
                				00:01.878.367
                			</td>
                <td>
                				SpringBoard
                			</td>
                <td>
                				com.apple.runningboard
                			</td>
                <td>
                				General (Send Launch Request to runningboardd)
                			</td>
                <td>
                				Sending launch request: &lt;RBSLaunchRequest| app&lt;com.apple.Music(…)&gt;; &quot;FBApplicationProcess&quot;&gt;
                			</td>
            </tr>
            <tr>
                <td>
                				00:01.881.191
                			</td>
                <td>
                				runningboardd
                			</td>
                <td>
                				com.apple.runningboard
                			</td>
                <td>
                				Job (Start application)
                			</td>
                <td>
                				Creating and launching job for: app&lt;com.apple.Music(…)&gt;
                			</td>
            </tr>
            <tr>
                <td>
                				00:01.889.972
                			</td>
                <td>
                				launchd
                			</td>
                <td>
                				-
                			</td>
                <td>
                				-
                			</td>
                <td>
                				Starting Music App
                			</td>
            </tr>
            <tr>
                <td>
                				00:01.894.277
                			</td>
                <td>
                				runningboardd
                			</td>
                <td>
                				com.apple.runningboard
                			</td>
                <td>
                				Assertion (Record PID)
                			</td>
                <td>
                				Added pid 1.098 to RBSAssertionManagerStore; count 21; size 4.096
                			</td>
            </tr>
            <tr>
                <td>
                				00:01.894.651
                			</td>
                <td>
                				SpringBoard
                			</td>
                <td>
                				com.apple.FrontBoard
                			</td>
                <td>
                				Process (Register PID with Window Server)
                			</td>
                <td>
                				[app&lt;com.apple.Music&gt;:1098] Bootstrap success!
                			</td>
            </tr>
            <tr>
                <td>
                				00:01.938.164
                			</td>
                <td>
                				Music
                			</td>
                <td>
                				com.apple.amp.Music
                			</td>
                <td>
                				Application (App Started)
                			</td>
                <td>
                				Welcome to MusicX!
                			</td>
            </tr>
            <tr>
                <td>
                				00:01.949.997
                			</td>
                <td>
                				Music
                			</td>
                <td>
                				com.apple.amp.mediaremote
                			</td>
                <td>
                				MediaRemote (Initialization)
                			</td>
                <td>
                				MediaRemote server initializing
                			</td>
            </tr>
            <tr>
                <td>
                				00:01.951.237
                			</td>
                <td>
                				SpringBoard
                			</td>
                <td>
                				com.apple.FrontBoard
                			</td>
                <td>
                				ProcessScene (App is now In Focus)
                			</td>
                <td>
                				[0xd0a890300:(FBSceneManager):sceneID:com.apple.Music-default] Scene lifecycle state did change: Foreground
                			</td>
            </tr>
            <tr>
                <td>
                				00:01.964.835
                			</td>
                <td>
                				mediaremoted
                			</td>
                <td>
                				com.apple.amp.mediaremote
                			</td>
                <td>
                				NowPlaying
                			</td>
                <td>
                				Set: 【 􁊸 LOCL (iPhone) ❯ 􀘪 com.apple.Music (1098) Music ❯ 􀃪 Music 】 setting playbackQueueCapabilities from &lt;request&gt; to &lt;private&gt;
                			</td>
            </tr>
        </tbody>
    </table>
  


  
  <p class="">Finally, the Instruments application supports <strong>Input Filters</strong> to help you quickly locate specific log entries. The filtering mechanism provides two input fields, which can be combined for more precise results. </p><p class="">First, you can use the filter search box with the syntax <code>&lt;filter category&gt;:&lt;filter value&gt;</code> to isolate specific log entries. For example, <code>process:SpringBoard</code> would show SpringBoard-related logs and nothing else. You can also search for any text string across all categories, simply by typing it out without a category identifier. For example, the string <code>Welcome to MusicX</code> would identify the event that signals the start of the Music Application. </p><p class="">Second, although not clearly marked as such, you can press the <strong>Input Filter</strong> button to <em>filter results by thread</em>. Note that you can specify the filters independently. Contradictory filters (<code>runningboardd</code> in the <em>thread filter</em> and <code>process:Music</code> in the <em>search box</em>) are valid selections, but they will simply display an empty list of results. The screenshot below showcases an example of these filters.</p>
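<p class="">For context, the <em>Process</em>, <em>Subsystem</em> and category values these filters match against originate in the logging calls made by each process. A minimal, hypothetical sketch of how an application could emit such an entry with the unified logging system (the subsystem and category names below are illustrative, not Apple’s actual ones):</p>

```swift
import os

// The subsystem (conventionally a reverse-DNS identifier) and category
// become the fields that Instruments' filters can match against.
let logger = Logger(subsystem: "com.example.MusicX", category: "Application")

// This statement would show up in the recorded stream much like the
// "Welcome to MusicX!" entry captured in the table above.
logger.info("Welcome to MusicX!")
```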


  















































  

    
  
    

      

      
        <figure class="
              sqs-block-image-figure
              intrinsic
            "
        >
          
        
        

        
          
            
              
              
          
            
                
                
                
                
                
                
                
                <img data-stretch="false" data-image="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/0144d3df-08d2-4e63-974c-694e5ff6d115/Instruments_RecordingSessionFilters.webp" data-image-dimensions="3840x2205" data-image-focal-point="0.5,0.5" alt="" data-load="false" elementtiming="system-image-block" src="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/0144d3df-08d2-4e63-974c-694e5ff6d115/Instruments_RecordingSessionFilters.webp?format=1000w" width="3840" height="2205" sizes="(max-width: 640px) 100vw, (max-width: 767px) 100vw, 100vw" onload="this.classList.add(&quot;loaded&quot;)" srcset="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/0144d3df-08d2-4e63-974c-694e5ff6d115/Instruments_RecordingSessionFilters.webp?format=100w 100w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/0144d3df-08d2-4e63-974c-694e5ff6d115/Instruments_RecordingSessionFilters.webp?format=300w 300w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/0144d3df-08d2-4e63-974c-694e5ff6d115/Instruments_RecordingSessionFilters.webp?format=500w 500w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/0144d3df-08d2-4e63-974c-694e5ff6d115/Instruments_RecordingSessionFilters.webp?format=750w 750w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/0144d3df-08d2-4e63-974c-694e5ff6d115/Instruments_RecordingSessionFilters.webp?format=1000w 1000w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/0144d3df-08d2-4e63-974c-694e5ff6d115/Instruments_RecordingSessionFilters.webp?format=1500w 1500w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/0144d3df-08d2-4e63-974c-694e5ff6d115/Instruments_RecordingSessionFilters.webp?format=2500w 2500w" loading="lazy" decoding="async" data-loader="sqs">

            
          
        
            
          
        

        
          
          <figcaption class="image-caption-wrapper">
            <p data-rte-preserve-empty="true"><em>Instruments Application - Recorded Session Example with Filters</em></p>
          </figcaption>
        
      
        </figure>
      

    
  


  


&nbsp;
  
  <p class=""><span class="sqsrte-text-color--white"><em>This section introduced you to some of the tools that will help you better understand the vast ecosystem Apple has put together over the years. These techniques </em><strong><em>do not</em></strong><em> replace perusing Apple’s documentation or their WWDC sessions. In fact, you will find references to many of them, scattered throughout this book. Instead, together with the </em></span><a href=""><span><span class="sqsrte-text-color--white"><strong><em>Xcode Debugger</em></strong></span></span></a><span class="sqsrte-text-color--white"><em>, they will help you verify your assumptions and put the information in a more practical context.</em></span></p>


  




&nbsp;
  
  <p class="sqsrte-large"><em>To be Continued…</em></p>]]></content:encoded><media:content type="image/jpeg" url="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/1759083839101-4NNLCADJE2LAMJX56D9H/unsplash-image-QweeHI91iVY.jpg?format=1500w" medium="image" isDefault="true" width="1500" height="844"><media:title type="plain">Starting with the basics - Part II</media:title></media:content></item><item><title>Starting with the basics - Part I</title><category>Intro to Apple</category><dc:creator>Samwise Prudent</dc:creator><pubDate>Sun, 28 Sep 2025 16:29:09 +0000</pubDate><link>https://www.prudentleap.com/prudent-protocol/2025/9/about-swift-applications-and-apple-operating-systems</link><guid isPermaLink="false">6721f9295c6d593f58a1c57b:68d86448421a7f587b06ce19:68d94e2cf71e0a60450e84a1</guid><description><![CDATA[During a conversation at work, it became apparent just how disconnected 
modern software development is from the underlying hardware, or even from 
the Operating Systems the application processes run on. Coincidentally, I 
was getting ready to finish up my first SwiftUI introductory book. I 
decided to push the release of that book a bit further, and capture the 
information I wanted to convey in that conversation, in a better 
articulated manner and through the lens of Apple’s systems.]]></description><content:encoded><![CDATA[<h3>A foreword of sorts…</h3><p class="">Some months ago, I had a conversation with a colleague of mine about cloud applications and <em>why</em> Kubernetes workloads need to scale <em>out</em> (more individual pods, or <em>horizontal scaling</em>) and not just <em>up</em> (more powerful pods, or <em>vertical scaling</em>). This went on a tangent about distributed systems and, eventually, on why creating a large number of threads and even processes to be handled by a processor eventually hits a ceiling. Whether that ceiling is truly relevant in the context of modern hardware and software is an entirely separate topic. </p><p class="">During our conversation, it became apparent just how disconnected modern software development is from the underlying hardware, or even from the Operating Systems the application processes run on. Coincidentally, I was getting ready to finish up my first SwiftUI introductory book. I decided to push the release of that book a bit further, and capture the information I wanted to convey in that conversation, in a better-articulated manner and through the lens of Apple’s systems.</p><p class="">In this post, I will cover the first two sections of that introductory chapter. The rest will be included in a few other posts.</p><h3>About (Swift) Applications and (Apple) Operating Systems</h3><p class="">Today, <strong>Software Development</strong> as a profession is often performed on very high-level software frameworks, which usually hide away many of the more complex tasks. </p><p class="">Some frameworks are developed by large companies, such as Apple, Google or Microsoft, to help developers create software for their ecosystems, or to solve very specific problems they encounter in their business. 
Others are created by individual developers or small teams to solve a problem they encounter (such as <em>Laravel</em>, <em>Ruby on Rails</em>, <em>Vue.js</em> etc.). This model works very well because it allows framework <em>users</em> to focus on their <em>application’s core logic</em>, while framework <em>developers</em> handle the <em>underlying technical challenges</em>.</p><p class="">Applications work well when both sides understand their roles: users must grasp the framework’s rules, and developers must understand the types of applications their users are building. However, this mutual understanding can break down. Sometimes, the framework evolves in a direction that excludes specific types of applications. Other times, we use the wrong framework to solve a particular problem, influenced by hype or by an incomplete understanding of the framework’s purpose. More often still, framework creators lock users into their particular vision of how applications should function. The degree to which framework maintainers establish and enforce their own views (opinions) over how their users should build applications determines whether a framework is <em>opinionated</em>. Apple’s <strong>Combine</strong> framework, together with the <strong>SwiftUI</strong> framework, for example, are <em>opinionated frameworks</em>. They expect applications to be constructed by <em>declaring how</em> data is transformed over time and <em>how</em> the User Interface should use the data, without requiring the users of the frameworks to specifically call a <code>render</code> or an <code>update</code> function, for example. They are <em>declarative</em>, <em>reactive</em> frameworks which, by their very nature, require a specific structure. </p><p class="">Since frameworks hide complex lower-level interactions, developers often make assumptions that are partially or completely incorrect, leading to various issues. 
Meanwhile, framework maintainers, focused on their own development priorities, may create constraints that make certain application requirements difficult to implement. When these constraints are not properly documented, the resulting applications are more likely to behave incorrectly. </p><p class="">With experience and as frameworks evolve, users become familiar enough with the architecture of the frameworks they use and align their mental models with the framework maintainers’. At the same time, the framework documentation improves, leaving less room for incorrect assumptions.</p><p class="">Because this model generally works well and allows application developers to <em>move faster</em>, it also allows developers to become <em>completely disconnected</em> from mechanisms and notions that are not made visible (or usable) by their frameworks of choice.</p><p class="">Eventually, some developers (hopefully most of them) reach a point where they wonder what actually lies beneath the surface exposed by those frameworks. They may, for example, wonder how a <em>touch on their screen</em> (or <em>click of their mouse</em>) results in the <em>actions</em> they see on their screens - whether it’s their favorite application starting up or their favorite game character moving. Or, perhaps, they would start wondering how macOS works and why it’s built the way it is. 
Not just in terms of design and polish, but the <em>internal mechanisms</em>.</p><p class="">As a developer, you can spend most of your career writing applications for the Apple platform without knowing what the <a href="https://www.cs.cmu.edu/afs/cs/project/mach/public/www/mach.html" title="Carnegie Mellon University - the Mach (microkernel) project"><span><strong><em>Mach kernel</em></strong></span></a> is or ever seeing an <a href="https://web.mit.edu/darwin/src/modules/xnu/osfmk/man/mach_msg.html" title="MIT - a Mach Inter Process Communication Message"><span><strong><em>IPC message</em></strong></span></a>. You can write fast and safe concurrent multithreaded applications without knowing anything about <a href="https://www.cs.cmu.edu/afs/cs/academic/class/15492-f07/www/pthreads.html" title="Carnegie Mellon University - POSIX Thread Library"><span><strong><em>POSIX Threads</em></strong></span></a>. In fact, <a href="https://github.com/swiftlang/swift-evolution/blob/main/proposals/0296-async-await.md" title="Swift Evolution - Async/Await"><span><strong><em>since Swift 5.5</em></strong></span></a>, you may not even need to know what <a href="https://developer.apple.com/library/archive/documentation/General/Conceptual/ConcurrencyProgrammingGuide/ConcurrencyandApplicationDesign/ConcurrencyandApplicationDesign.html#//apple_ref/doc/uid/TP40008091-CH100-SW1" title="Apple Archives - Concurrency Programming Guide (GCD)"><span><strong><em>Grand Central Dispatch</em></strong></span></a> is.</p><p class="">However, <em>if and when you do</em>, you suddenly start to understand <em>why</em> <strong>Operating Systems</strong>, <strong>Application Frameworks</strong> and <strong>Programming Languages</strong> work the way they do. You start understanding the techniques others used to solve problems - and you gain the ability to apply those notions independently. 
You begin to realize just how complex and vast the technology space is, even in areas that seem specialized, such as the development of applications for the iPhone.</p><p class="">The purpose of this section is to explore how the Swift source code we write fits into the wider and more complex context of <em>Apple Operating Systems</em> running on Apple devices. This will help form a mental model of the way your own application’s code fits within this entire ecosystem. Then, we are going to follow the events that take place when <em>an end-user touches the screen of their phone</em> (or clicks a button on their mouse) <em>to activate a button in an application’s user interface</em>.</p><p class="">I find this helpful, though not necessarily mandatory, for a few key reasons:</p><ul data-rte-list="default"><li><p class="">You gain the <strong><em>vocabulary</em></strong> you need to <em>express concepts and implementations</em>. This is particularly useful when collaborating with other developers, or during your own research. Many technical terms carry specific implementation details, so a shared understanding between you and your interlocutors can go a long way. Additionally, the richer your vocabulary becomes, the more precisely you can ask the questions you need, to get the answers you seek;</p></li><li><p class="">All Operating Systems essentially solve the same problem: they act as a <em>bridge</em> between the <em>end-user</em>, the <em>applications</em> they need to use and the <em>hardware</em> those applications run on. This is all orchestrated in a <strong>safe</strong>&nbsp;(relative to the system itself, not your data) and hopefully <strong>intuitive</strong> manner. The mechanisms they use to accomplish these tasks differ in certain implementation details, but the general <em>ideas</em> and <em>concepts</em> remain the same across all ecosystems (partly for convenience, partly because the ideas were that good to begin with). 
For example, all Operating Systems use <strong>drivers</strong> to describe <strong>devices</strong> (such as a mouse or printer) in ways that are useful to the software running on the operating system. However, each OS may (and usually does) have its own systems used to build those drivers.</p></li><li><p class="">Generally, <em>low level systems do not change in fundamental ways and, if they do, they rarely change quickly and suddenly</em>. Therefore, if you know how a specific <strong><em>kernel</em></strong> worked 5 years ago, it will likely work the same now and 5 years from now. The same goes for <strong><em>kernel extensions</em></strong>. As this section will show, there are portions of the macOS Operating System that were written in the 1980s and 1990s. They are still relevant and important, close to half a century later. This is why some of the references provided throughout this chapter are sourced from Apple’s archives website.</p></li></ul><p class="">When referring to software components and concepts, we can generally use the terms “<em>low level</em>” and “<em>high level</em>”. For example, <strong>SwiftUI</strong> is the <em>higher level framework</em>, while <strong>UIKit</strong> is the <em>lower level framework</em>. Even <em>lower level</em> framework examples would be <strong>CoreAnimation</strong> or <strong>CoreGraphics</strong>. Regardless of the framework, the <strong>direction</strong> remains consistent. <em>Lower level</em> indicates that a concept is closer to the <strong>device’s hardware space</strong>, while <strong>higher level</strong> indicates that the concept is closer to the <strong>end-user space</strong>. This is a useful way to describe frameworks and programming languages because, the closer we get to the hardware, the fewer protection mechanisms we generally have. 
As a result, the impact of potential programming errors is generally higher in lower level environments.</p><p class="">When analyzing complex software systems, it’s important to set a <em>context</em> or <em>abstraction domain</em> and then ground the analysis in that context. This is especially useful because a single term can mean multiple things, depending on the context.</p><p class="">For example, the diagram below presents a <strong><em>Button</em></strong> in various contexts. While “<em>Button</em>” conceptually means an interactive control, its technical definition and characteristics vary based on the domain (or even frameworks of the same domain). Specifically:</p><ul data-rte-list="default"><li><p class="">In <strong>SwiftUI</strong>, the button exists as a <em>Button</em> view built upon the <em>UIControl</em> class from <em>UIKit</em>, which itself derives from the <em>UIResponder</em> class. </p></li><li><p class="">At a lower level, the button is expressed in the context of the screen (the display device) and it can be represented by a <em>Touch Area</em> and a <em>CoreAnimation Layer</em>.</p></li><li><p class="">In <strong>Memory</strong>, it simply becomes a block of data, representing its <em>state</em> and, potentially, <em>references to other objects</em> (such as the associated <em>functions</em> to be executed when touched).</p></li></ul><p class="">All of these perspectives are valid and individually accurate, but no single one tells the whole story.</p>
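<p class="">To anchor the first of those contexts, here is a minimal sketch of how such a button is declared in SwiftUI; the visual element and the logical element are bundled into one high-level value (the names and the action are illustrative):</p>

```swift
import SwiftUI

struct PlayButton: View {
    var body: some View {
        Button {
            // Logical element: the closure to run when the tap is recognized.
            print("Play tapped")
        } label: {
            // Visual element: the content the framework renders
            // (and, further down, a CoreAnimation layer displays).
            Label("Play", systemImage: "play.fill")
        }
    }
}
```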


  




&nbsp;










































  

    
  
    

      

      
        <figure class="
              sqs-block-image-figure
              intrinsic
            "
        >
          
        
        

        
          
            
              
              
          
            
                
                
                
                
                
                
                
                <img data-stretch="false" data-image="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/8c237047-a557-4bb3-9084-2fb24a2745d6/Button_Contexts.webp" data-image-dimensions="2919x810" data-image-focal-point="0.5,0.5" alt="" data-load="false" elementtiming="system-image-block" src="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/8c237047-a557-4bb3-9084-2fb24a2745d6/Button_Contexts.webp?format=1000w" width="2919" height="810" sizes="(max-width: 640px) 100vw, (max-width: 767px) 83.33333333333334vw, 83.33333333333334vw" onload="this.classList.add(&quot;loaded&quot;)" srcset="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/8c237047-a557-4bb3-9084-2fb24a2745d6/Button_Contexts.webp?format=100w 100w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/8c237047-a557-4bb3-9084-2fb24a2745d6/Button_Contexts.webp?format=300w 300w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/8c237047-a557-4bb3-9084-2fb24a2745d6/Button_Contexts.webp?format=500w 500w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/8c237047-a557-4bb3-9084-2fb24a2745d6/Button_Contexts.webp?format=750w 750w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/8c237047-a557-4bb3-9084-2fb24a2745d6/Button_Contexts.webp?format=1000w 1000w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/8c237047-a557-4bb3-9084-2fb24a2745d6/Button_Contexts.webp?format=1500w 1500w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/8c237047-a557-4bb3-9084-2fb24a2745d6/Button_Contexts.webp?format=2500w 2500w" loading="lazy" decoding="async" data-loader="sqs">

            
          
        
            
          
        

        
          
          <figcaption class="image-caption-wrapper">
            <p data-rte-preserve-empty="true"><em>The concept of a Button, represented in various contexts</em></p>
          </figcaption>
        
      
        </figure>
      

    
  


  


&nbsp;
  
  <p class="">When referring to a button in <strong>UI Frameworks</strong>, you are referring to the <em>control</em>, which is a high-level abstraction that contains a visual element (the rendered button) and a logical element (what should happen when the button’s action is called). These concepts exist entirely in the Application’s User Interface domain. In the context of the <strong>screen’s display component</strong> (the OLED), you have <em>an area of pixels</em>. In the context of the <strong>screen’s digitizer</strong> (the assembly that converts physical interactions to digital signals), you have various <em>signals from a sensor array</em> (changes in the electrical/magnetic field, recorded for each sensor), which the Hardware Controller registers and processes to extract information about <em>touched areas</em>. In the context of <strong>Memory</strong>, you have the <em>frame buffer</em>, which contains the information related to the visual representation of the Button, so that it can be drawn on the screen - and so on.</p>
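<p class="">On the application side of that pipeline, UIKit eventually delivers the processed touch information to responder objects, much like the <code>SBIconView touches began</code> entry in the log table earlier. A hedged sketch of the API involved (the view class here is hypothetical):</p>

```swift
import UIKit

// A UIResponder subclass receives UITouch objects once the digitizer's
// sensor signals have been processed and dispatched through the system.
class TappableView: UIView {
    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        super.touchesBegan(touches, with: event)
        if let touch = touches.first {
            // Comparable to the "location in window" value logged by SpringBoard.
            print("Touch began at \(touch.location(in: self.window))")
        }
    }
}
```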


  




&nbsp;
  
  <p class="sqsrte-small"><span class="sqsrte-text-color--white"><em>The terms </em><strong><em>framework</em></strong><em> and </em><strong><em>library</em></strong><em> are sometimes used interchangeably, especially in front end development and in discussions about </em><strong><em>React</em></strong><em>. This is usually because they both provide code that you can reuse and because some developers have very specific expectations from frameworks.</em></span></p><p class="sqsrte-small"><span class="sqsrte-text-color--white"><em>However, the two terms describe </em><strong><em>different requirements</em></strong><em> - and it’s useful to clarify them here, even though you will likely also use the terms loosely.</em></span></p>


  




&nbsp;
  
  <p class=""><strong>Libraries</strong>, such as the <strong>ArgumentParser</strong> library, provide reusable code you can integrate into your application. Their purpose is to add additional functionality, by allowing you to import their files and use them directly, as an integral part of your code. In other words,<strong> libraries integrate into your project</strong>. </p><p class=""><strong>Frameworks</strong>, such as <strong>SwiftUI</strong>, <strong>UIKit</strong> and others also provide some reusable code you can integrate into your application, but with a <em>different purpose</em>. The code exposed by frameworks acts as a <em>connection point between the framework and your own code</em>. In the case of SwiftUI, you use the <strong><em>View</em></strong> <em>protocol</em> to define views. When SwiftUI builds or updates views, it looks at structures <em>conforming</em> to that <em>protocol</em>. When it needs to render a view, it checks the structure’s <strong><em>body</em></strong> property, which is required by the <strong><em>View</em></strong> protocol. In other words, <em>your code provides specialized behavior, conforming to the framework’s requirements, which the framework uses in certain cases</em>. This is known as <strong>inversion of control</strong>.</p><p class="">Put simply, <strong>libraries</strong> <em>integrate into your code</em>, whereas <em>your code integrates into</em> <strong>frameworks</strong>.</p>


  




&nbsp;

  

    
  
    

      

      
        <figure class="
              sqs-block-image-figure
              intrinsic
            "
        >
          
        
        

        
          
            
              
              
          
            
                
                
                
                
                
                
                
                <img data-stretch="false" data-image="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/c14e7482-9268-4569-bc46-bcb50dd41e6a/FrameworksVsLibraries.webp" data-image-dimensions="1988x1158" data-image-focal-point="0.5,0.5" alt="" data-load="false" elementtiming="system-image-block" src="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/c14e7482-9268-4569-bc46-bcb50dd41e6a/FrameworksVsLibraries.webp?format=1000w" width="1988" height="1158" sizes="(max-width: 640px) 100vw, (max-width: 767px) 66.66666666666666vw, 66.66666666666666vw" onload="this.classList.add(&quot;loaded&quot;)" srcset="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/c14e7482-9268-4569-bc46-bcb50dd41e6a/FrameworksVsLibraries.webp?format=100w 100w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/c14e7482-9268-4569-bc46-bcb50dd41e6a/FrameworksVsLibraries.webp?format=300w 300w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/c14e7482-9268-4569-bc46-bcb50dd41e6a/FrameworksVsLibraries.webp?format=500w 500w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/c14e7482-9268-4569-bc46-bcb50dd41e6a/FrameworksVsLibraries.webp?format=750w 750w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/c14e7482-9268-4569-bc46-bcb50dd41e6a/FrameworksVsLibraries.webp?format=1000w 1000w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/c14e7482-9268-4569-bc46-bcb50dd41e6a/FrameworksVsLibraries.webp?format=1500w 1500w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/c14e7482-9268-4569-bc46-bcb50dd41e6a/FrameworksVsLibraries.webp?format=2500w 2500w" loading="lazy" decoding="async" data-loader="sqs">

            
          
        
            
          
        

        
          
          <figcaption class="image-caption-wrapper">
            <p data-rte-preserve-empty="true"><em>Frameworks vs Libraries</em></p>
          </figcaption>
        
      
        </figure>
      

    
  


  


&nbsp;
  
  <h3>Brief introduction to Operating Systems</h3><p class="">If you own a personal computer, it is likely running an operating system that is either <strong>Apple’s</strong> <strong><em>macOS</em></strong>, <strong>Microsoft’s <em>Windows</em></strong> or a variant of <strong><em>Linux</em></strong>. Of the three, two use a <em>kernel</em> (a complex set of abstractions that bridges the device’s hardware with the rest of the software) that represents a variant of <strong>POSIX/Unix</strong> (<em>Portable Operating System Interface</em>/ <strong><em>UNI</em></strong><em>plexed </em><strong><em>I</em></strong><em>nformation and </em><strong><em>C</em></strong><em>omputing </em><strong><em>S</em></strong><em>ervice</em>) and the other uses the <strong>Windows NT</strong> <em>kernel</em>.</p><p class="">There are numerous criteria for classifying <strong>Operating Systems</strong> - one of them is the way they structure <em>Input and Output abstractions</em>. Another is the <em>promises they make about Response Times</em> (Real-Time Operating Systems vs General-Purpose Operating Systems). </p><p class="">In <strong>POSIX</strong>, the core philosophy is that <em>everything is a stream of bytes</em> (or a <em>file</em>). In <strong>Windows NT</strong>, on the other hand, <em>everything is a specialized, securable object</em>. This may seem like a small difference, but it essentially dictated the path of the operating systems built on them. For example:</p><ul data-rte-list="default"><li><p class="">As its name suggests, the <strong>POSIX</strong> approach is to ensure the conforming <em>Kernels are portable</em>. By abstracting everything as a stream of bytes, you can effectively configure anything in a <em>file</em> (in Apple’s case, two files, because you typically include a .plist file as well). With <strong>Windows</strong>, on the other hand, you use the <em>Windows Registry</em>. 
The NT kernel is not as easily ported to other platforms.</p></li><li><p class="">On <strong>POSIX</strong>, you use <em>Native, Lower Level Abstractions</em>. You can write native C tools to interact with the POSIX APIs, or you can use Apple’s C-level Core APIs - all the way up to Objective-C or Swift code. On the <strong>NT Kernel</strong>, however, since it is proprietary from the ground up, you usually interact with <em>Higher Level Abstractions</em> provided by Microsoft (.NET Core, UWP/WPF). This does not necessarily make one approach safer or easier to use than the other; they just require different mindsets. </p></li></ul><p class="">Overall, both ecosystems encourage developers to use the highest-level APIs possible and have invested heavily in toolsets that help developers build more complex, stable, and safe applications faster.</p><p class="">Despite these differences in approach, <em>every Operating System</em> (<strong><em>macOS</em></strong>, <strong><em>iOS</em></strong>, <strong><em>watchOS</em></strong> etc. for <strong>Apple</strong> or <strong><em>Windows</em></strong> for <strong>Microsoft</strong>) comprises, among many other complex systems, the same main components: an <strong><em>OS Kernel</em></strong>, a <strong><em>Service Manager</em></strong>, one or several <strong><em>Event Manager(s)</em></strong> and a <strong><em>User Interface Manager</em></strong>:</p><ul data-rte-list="default"><li><p class="">The <strong><em>OS Kernel</em></strong> is the piece of software that acts as the bridge between the Hardware and the Operating System’s higher order components (daemons, UI and end-user facing applications). It is a collection of services (often C functions, objects and data structures) that provide fundamental capabilities (assigning work to the CPU, allocating memory, reading from memory etc.) to higher level components. 
A key concept to grasp is that kernel services are loaded in memory when the operating system boots up and, unlike higher order constructs (like applications or web services), kernel services never stop. If an error is encountered in the kernel functions, it has the potential to bring the whole machine down, requiring a complete reboot and reload of the kernel. Apple uses the <a href="https://github.com/apple-oss-distributions/xnu" title="Github - XNU"><span><strong><em>XNU</em></strong></span></a> kernel, which is a combination of <strong><em>FreeBSD</em></strong> and <strong><em>Mach</em></strong>. Together with other core components in Apple’s ecosystem, the XNU kernel is part of the <a href="https://developer.apple.com/library/archive/documentation/Darwin/Conceptual/KernelProgramming/Architecture/Architecture.html" title="Apple Archive Documentation - Darwin Kernel Architecture"><span><strong><em>Darwin Operating System</em></strong></span></a>. Over the years, Apple developed numerous <a href="https://developer.apple.com/documentation/kernel" title="Apple Documentation - Kernel"><span><strong><em>Kernel Extensions</em></strong></span></a> to enhance its functionality, and organized them in various ways, to be bundled with their various devices. </p></li><li><p class="">The <strong><em>Service Manager</em></strong> is a <strong>process</strong> (a workload container in which a program’s code runs, identified by a <strong><em>process ID</em></strong> - <strong><em>PID</em></strong>, for short) which starts after the OS completes its loading process. For Apple’s ecosystem, it’s <strong><em>launchd</em></strong> and for <strong>Linux</strong>, it’s <strong><em>init</em></strong> (or, on most modern distributions, <strong><em>systemd</em></strong>). Sometimes, it’s called a <strong><em>System Manager</em></strong> and it starts with the <strong><em>Operating System</em></strong>, as the very first <em>User Mode</em> process created, which is why it receives the <em>Process ID</em> (<strong>PID</strong>) value of <strong>1</strong>. 
It runs as a <a href="https://developer.apple.com/library/archive/documentation/MacOSX/Conceptual/BPSystemStartup/Chapters/CreatingLaunchdJobs.html" title="Apple Documentation Archive - Creating launchd Daemons"><span><strong><em>daemon process</em></strong></span></a> (a background process which runs in an infinite loop and performs tasks without user interaction) and it starts the applications that should load up after the Operating System boots. Finally, once the startup level applications are started, the system manager daemon process continues running and waiting for instructions, for as long as the Operating System runs. </p></li><li><p class="">The <strong><em>Event Manager</em></strong> is responsible for handling various types of events. There are usually multiple event handlers, each handling specific types of interactions - and in various activity spaces (file system events, system events, touch events and so on). For <strong>macOS</strong>, mouse clicks are registered via <strong><em>WindowServer</em></strong>, while on <strong>iOS</strong>, touches are handled by <strong><em>backboardd</em></strong> (which was introduced in iOS 6, as a break-out daemon from <strong><em>SpringBoard</em></strong>). They all receive the events from the low-level <a href="https://developer.apple.com/library/archive/documentation/DeviceDrivers/Conceptual/IOKitFundamentals/HandlingEvents/HandlingEvents.html" title="Apple Archive - IOKit"><span><strong><em>IOKit</em></strong></span></a> kernel extensions.</p></li><li><p class="">The <strong><em>User Interface Server</em></strong> is responsible for managing the User Interface elements. For <strong>macOS</strong>, it is <strong><em>WindowServer</em></strong>, whose main function is to open <strong><em>CGXServer</em></strong> (Core Graphics X Server). For <strong>iOS</strong>, it’s <strong><em>SpringBoard</em></strong>. 
Additionally, there is a <strong><em>WindowManager</em></strong> component, as well - and it is responsible for grouping and managing window positions in various configurations (for example, on multiple virtual desktops). There is also a <strong><em>RenderServer</em></strong> component, which ensures that the correct data is processed by the GPU at the correct time, so that the connected displays can show the UI correctly.</p></li></ul><p class="">In most scenarios, you will rarely (if at all) interact with these components directly - especially the OS Kernel. Instead, you would typically write code that integrates into Apple’s frameworks which, in turn, would interact with the OS Kernel on your behalf.</p><p class="">Since the Operating System is responsible for the management of hardware resources and all applications that try to use them, providing higher level frameworks for lower level interactions is not sufficient to ensure its integrity and performance. The OS is not inherently secure simply by separating high level from low level code, nor by dividing work among more processes. For this reason, as <em>a bare minimum</em>, the software running on a device is divided into two main spaces:</p><ul data-rte-list="default"><li><p class="">The <strong>User Space</strong>, where <em>applications external to the OS</em>, such as eBook Readers, browsers, games etc. run. Software running in this space needs to be protected from other software running in the same space. This is accomplished by ensuring that each application receives its own memory address space that it can read from and write to. Generally, one application cannot read another application’s memory directly.</p></li><li><p class="">The <strong>Kernel Space</strong>, where the Kernel runs. The software running in this space can potentially access any location in memory, can run any operations and can control all input/output address spaces. 
Since it essentially has unrestricted control everywhere, this space needs to be separated from the <strong>User Space</strong> - and access to its resources is tightly controlled.</p></li></ul><p class="">This separation ensures that an application running in the user space cannot take up more resources than the Operating System considers to be safe - and that an application crash does not take the whole operating system down. There are cases where these issues do still occur - but the separation between the User Space and the Kernel Space aims to lower the number of occurrences.</p><p class="">In general, <em>User Space</em> applications use <strong>System Calls</strong> (wrapper functions within the <strong><em>libSystem</em></strong> dynamic library) to prepare and submit requests to the software running in the <em>Kernel Space</em>. The Kernel (in the case of XNU, either the BSD components or the Mach components) can then request the CPU to execute the commands. Lastly, the CPU executes kernel-issued commands only when it receives a specific trap, which signals its transition from the <strong>User Mode</strong> (<strong><em>Ring 3</em></strong> on <strong>x86/64</strong> architectures or <strong><em>EL0</em></strong> - Exception Level 0 - on <strong>ARM</strong>) to the <strong>Kernel Mode</strong> (<strong><em>Ring 0</em></strong> on <strong>x86/64</strong> architectures or <strong><em>EL1</em></strong> on <strong>ARM</strong>). </p><p class="">Besides the logical separation between the kernel and end-user-facing applications, the design of an operating system includes other concerns. To function effectively, applications need to interact with memory. From the application’s executable code, to dynamic libraries, to runtime variables, every useful piece of an application is stored somewhere in memory. As such, memory use, management and security are critical concerns for any operating system and its surrounding components. 
</p><p class="">When any application starts up, it goes through a <strong>Setup Phase</strong>, where the OS Kernel assigns a dedicated <a href="https://developer.apple.com/library/archive/documentation/Performance/Conceptual/ManagingMemory/Articles/AboutMemory.html" title="Apple Archives - Virtual Memory System"><span><strong><em>Virtual Memory Address Space</em></strong></span></a> for internal use. This serves as the application’s <em>addressable memory space</em>. During the application’s normal operation (or the <strong>Frequent Access Phase</strong>), its instructions access addresses from the Virtual Memory Address Space. </p><p class="">This system enables true multitasking and the parallel execution of processes. Since every process operates within its own <em>virtual</em> memory address space, two processes could store different data, at the same <em>virtual</em> address. When each of the two processes needs to access the address, the MMU converts the virtual address to a <strong>unique, real memory location</strong>. Since the physical locations are unique, the two processes can run without interfering with each other.</p><p class="">The access to physical memory is orchestrated through a separate hardware component known as the <a href="https://developer.arm.com/documentation/den0013/d/The-Memory-Management-Unit" title="ARM Developer - Memory Management Unit"><span><strong><em>Memory Management Unit</em></strong></span></a>, which translates between virtual memory space and physical memory space, using a <em>page table</em>. By using a separate orchestrator for memory access, a given process cannot access the data of another process, unless the MMU itself becomes compromised. </p><p class="">The diagram below showcases the separation between Exception Levels, as well as the way memory is allocated on setup and used during the frequent access state of applications.</p>


  




&nbsp;
  

    
  
    

      

      
        <figure class="
              sqs-block-image-figure
              intrinsic
            "
        >
          
        
        

        
          
            
              
              
          
            
                
                
                
                
                
                
                
                <img data-stretch="false" data-image="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/7db39cc8-2a3b-40bb-92a5-97ccc1cbb0f5/SetupvsFrequentAccessMemory.webp" data-image-dimensions="1920x781" data-image-focal-point="0.5,0.5" alt="" data-load="false" elementtiming="system-image-block" src="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/7db39cc8-2a3b-40bb-92a5-97ccc1cbb0f5/SetupvsFrequentAccessMemory.webp?format=1000w" width="1920" height="781" sizes="(max-width: 640px) 100vw, (max-width: 767px) 75vw, 75vw" onload="this.classList.add(&quot;loaded&quot;)" srcset="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/7db39cc8-2a3b-40bb-92a5-97ccc1cbb0f5/SetupvsFrequentAccessMemory.webp?format=100w 100w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/7db39cc8-2a3b-40bb-92a5-97ccc1cbb0f5/SetupvsFrequentAccessMemory.webp?format=300w 300w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/7db39cc8-2a3b-40bb-92a5-97ccc1cbb0f5/SetupvsFrequentAccessMemory.webp?format=500w 500w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/7db39cc8-2a3b-40bb-92a5-97ccc1cbb0f5/SetupvsFrequentAccessMemory.webp?format=750w 750w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/7db39cc8-2a3b-40bb-92a5-97ccc1cbb0f5/SetupvsFrequentAccessMemory.webp?format=1000w 1000w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/7db39cc8-2a3b-40bb-92a5-97ccc1cbb0f5/SetupvsFrequentAccessMemory.webp?format=1500w 1500w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/7db39cc8-2a3b-40bb-92a5-97ccc1cbb0f5/SetupvsFrequentAccessMemory.webp?format=2500w 2500w" loading="lazy" decoding="async" data-loader="sqs">

            
          
        
            
          
        

        
          
          <figcaption class="image-caption-wrapper">
            <p data-rte-preserve-empty="true"><em>Separation between Application Memory Addresses</em></p>
          </figcaption>
        
      
        </figure>
      

    
  


  


&nbsp;
  
  <p class="">To further improve security, the Operating System (and the dynamic loader) use a mechanism called <strong>ASLR</strong> (<strong><em>A</em></strong><em>ddress </em><strong><em>S</em></strong><em>pace </em><strong><em>L</em></strong><em>ayout </em><strong><em>R</em></strong><em>andomization</em>), which randomizes the virtual memory addresses where critical program components are loaded when a process starts up. Rather than always using the same address to load executables, libraries or other data, ASLR introduces controlled randomness to the memory layout. For example, the configuration data of a process may be loaded in address <strong>0x01020304</strong> during one program execution, but in address <strong>0x0402301</strong> during the next execution. Similarly, the stack could begin at <strong>0x7ffdb9v3e0000</strong> in one instance and in <strong>0x8ffe123456789</strong> in another one. This makes it exponentially difficult for attackers to predict where a specific piece of data is stored by a certain application. <strong>ASLR</strong> has a <em>direct consequence on the compilers</em>, because the Object files they generate need to be expressed into <strong>Position-Independent Machine Code</strong>. In most cases, addresses are expressed relative to the current position of the stack pointer.</p><p class="">Perhaps equally as important, because virtual memory abstracts away the physical memory layer, it can essentially create the illusion of an extended memory lane. For example, a 32 bit application can use up to 4GB of <em>addressable space</em>. A 64 bit application could use up to 18 ExaBytes (1,000,000 GB) of <em>addressable space</em>. This allows the operating systems to use various types of non-RAM memory (such as the Hard Disk) as extensions to RAM.</p><p class=""><br></p><h3>About Memory and Data Transfer over various mediums</h3><p class="">Computers (and phones) function by processing electrical signals as data. 
Any piece of information is represented as a sequence of <strong>1</strong>s and <strong>0</strong>s, or a <em>Binary Sequence</em>. To illustrate this concept, we can take the letter <strong>F</strong> as a simple example. To represent it in a way that can be expressed as a sequence of <strong>1</strong>s and <strong>0</strong>s, we need to choose an <em>encoding</em>. This represents the <em>protocol</em> that both the <em>sender</em> and the <em>receiver</em> need to use, in order to understand the binary sequence. This protocol is used by the <em>sender</em> to <em>encode</em> the letter <strong>F</strong> into a <em>binary sequence</em>. Then, the sequence is <em>decoded</em> by the <em>receiver</em> using the same protocol. As long as the two parties use the same encoding, they can communicate with each other. There are several encodings for characters (String Runes):</p><ul data-rte-list="default"><li><p class=""><strong>ASCII</strong> is the foundational character set for the English language. It uses 7 bits to represent 128 characters, but the data is almost always stored in a standard 8-bit byte.</p></li><li><p class=""><strong>UTF-8</strong> is the dominant encoding for the web. It’s a variable-length encoding designed to represent every character in the Unicode standard. It is fully backward-compatible with ASCII.</p></li><li><p class=""><strong>UTF-16</strong> is another variable-length encoding for Unicode. Its basic unit is a 16-bit (2-byte) chunk.</p></li><li><p class=""><strong>UTF-32</strong> is a fixed-length encoding for Unicode. Every character occupies exactly one 32-bit (4-byte) chunk.</p></li></ul><p class="">Wherever an encoding uses more than 1 byte, we usually specify whether the sequence is read with the most significant byte first (<strong>Big Endian</strong>) or the least significant byte first (<strong>Little Endian</strong>). Essentially, endianness represents the order in which the sequence is read. 
Big Endian represents the “<em>natural order</em>”, whereas Little Endian represents the “<em>reverse order</em>”. Depending on endianness, the same sequence can mean two different things. For example, the <strong>UTF-16</strong> character represented by the binary sequence <strong>00000000 01000110</strong>, can be read as:</p><ul data-rte-list="default"><li><p class=""><em>The Uppercase English Letter</em> ‘<strong>F</strong>’, when using <strong>Big Endian</strong>, because <strong>00000000 01000110</strong> is read as <strong>0x0046</strong> in Hexadecimal</p></li><li><p class=""><em>The CJK Unified Ideograph</em> ‘<strong>䘀</strong>’, when using <strong>Little Endian</strong>, because <strong>00000000 01000110</strong> is read as <strong>0x4600</strong> in Hexadecimal</p></li></ul><p class="">Because the order in which bytes are read matters, it’s useful to know that, in <strong>network protocols</strong>, <strong>Big Endian</strong> is the default. For <strong>file formats</strong> (especially text), we specify the endianness using a <strong>BOM</strong> (<strong>Byte Order Mark</strong>): <strong>U+FEFF</strong> (<em>ZERO WIDTH NO-BREAK SPACE</em>), an invisible character at the beginning of the file. For example, if a UTF-16 file starts with the byte sequence <strong>FE FF</strong>, the file was saved as <em>Big Endian</em>. If it starts with <strong>FF FE</strong>, it was saved as <em>Little Endian</em>.</p><p class="">Since any type of information can be expressed in binary, as long as both the sender and the receiver agree on the <em>encoding</em> (or <em>format</em>, or <em>protocol</em>, or <em>standard</em>, depending on the context) and <em>endianness</em>, we can save information, for later use, in some type of memory - or we can transfer it to other systems.</p>
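<p class="">The letter <strong>F</strong> example above can be reproduced in a few lines of Swift, using Foundation’s UTF-16 encodings to show both byte orders and the BOM:</p>

```swift
import Foundation

// The uppercase letter "F" is code point U+0046. The same code point
// serializes to different byte sequences depending on the byte order.
let f = "F"
let bigEndianBytes = Array(f.data(using: .utf16BigEndian)!)       // [0x00, 0x46]
let littleEndianBytes = Array(f.data(using: .utf16LittleEndian)!) // [0x46, 0x00]

// Misreading the little-endian bytes as big-endian yields 0x4600 -
// the CJK Unified Ideograph "䘀", a completely different character.
let misread = String(data: Data(littleEndianBytes), encoding: .utf16BigEndian)

// The plain .utf16 encoding prepends the BOM (U+FEFF) so that a reader
// can detect the byte order on its own: FE FF or FF FE, then the bytes of "F".
let withBOM = Array(f.data(using: .utf16)!)
```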


  




&nbsp;
  
  <p class="sqsrte-small"><span class="sqsrte-text-color--white"><em>Although encoding can slightly obfuscate the data being converted, it is not a security mechanism. For example, you can easily take a </em><strong><em>base64</em></strong><em> encoded string and then decode it. To properly secure data, you would need to use </em><strong><em>encryption</em></strong><em>, which requires cryptographic algorithms.</em></span></p>


  




&nbsp;
  
  <p class="">In the context of storing data for later use, one of the simpler examples consists of the common <strong>Dynamic Random Access Memory </strong>(DRAM), which stores data by using billions of small constructs, known as <em>memory cells</em>. Each memory cell stores a single bit and it consists of a <em>capacitor</em> (which can store charge) and a <em>transistor</em> (which acts as a switch). In a nutshell, when a capacitor is charged, it represents the value <strong>1</strong> - and if it’s discharged, it represents the value <strong>0</strong>. The diagram below represents a single DRAM memory cell. DRAM cells are arranged in a matrix. This type of memory is wiped when the system is powered off, since the capacitors are very small and they cannot hold a charged state for too long. For this reason this type of memory is known as <strong>Volatile Memory.</strong></p>


  




&nbsp;
  

    
  
    

      

      
        <figure class="
              sqs-block-image-figure
              intrinsic
            "
        >
          
        
        

        
          
            
              
              
          
            
                
                
                
                
                
                
                
                <img data-stretch="false" data-image="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/d084baf6-5f97-4dd6-abc9-176e140dd5f4/MemoryCells.webp" data-image-dimensions="3601x1379" data-image-focal-point="0.5,0.5" alt="" data-load="false" elementtiming="system-image-block" src="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/d084baf6-5f97-4dd6-abc9-176e140dd5f4/MemoryCells.webp?format=1000w" width="3601" height="1379" sizes="(max-width: 640px) 100vw, (max-width: 767px) 75vw, 75vw" onload="this.classList.add(&quot;loaded&quot;)" srcset="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/d084baf6-5f97-4dd6-abc9-176e140dd5f4/MemoryCells.webp?format=100w 100w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/d084baf6-5f97-4dd6-abc9-176e140dd5f4/MemoryCells.webp?format=300w 300w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/d084baf6-5f97-4dd6-abc9-176e140dd5f4/MemoryCells.webp?format=500w 500w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/d084baf6-5f97-4dd6-abc9-176e140dd5f4/MemoryCells.webp?format=750w 750w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/d084baf6-5f97-4dd6-abc9-176e140dd5f4/MemoryCells.webp?format=1000w 1000w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/d084baf6-5f97-4dd6-abc9-176e140dd5f4/MemoryCells.webp?format=1500w 1500w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/d084baf6-5f97-4dd6-abc9-176e140dd5f4/MemoryCells.webp?format=2500w 2500w" loading="lazy" decoding="async" data-loader="sqs">

            
          
        
            
          
        

        
          
          <figcaption class="image-caption-wrapper">
            <p data-rte-preserve-empty="true"><em>DRAM Memory Cell (Left) and DRAM Memory SubArray (Right)</em></p>
          </figcaption>
        
      
        </figure>
      

    
  


  


&nbsp;
  
  <p class="">To read from DRAM, the system places a small electrical current on a specific <strong>Word Line</strong>, which closes the transistors connected to it. As a result, the circuit between each transistor’s <strong>capacitor</strong> and the associated <strong>Bit Line</strong> is <em>closed</em>, as well. This <em>activates</em> the <em>cell</em>. If the cell’s <strong>capacitor</strong> is <em>full</em>, the <strong>Sense Amplifier</strong> on the <strong>Bit Line</strong> detects a <em>small increase</em> in the Bit Line’s current, which signifies a <strong>1</strong>. On the other hand, If the <strong>capacitor</strong> is <em>empty</em>, the <strong>Sense Amplifier</strong> detects a small decrease in the <strong>Bit Line’s </strong>electrical current, signifying a <strong>0</strong>. When a capacitor is discharged, it needs to be recharged to show the same value on the next read operation. For this reason, read operations on the DRAM are destructive and they usually result in an immediate write operation (essentially, the controller reads the memory, then writes it again, while also sending the information to the requestor). Since DRAM is based on capacitors, which leak current, this type of memory needs to be periodically <em>refreshed</em>, by reading and rewriting the data. Most modern DRAM memory blocks refresh <em>once every 64ms</em>.</p><p class="">There are many other storage mechanisms, such as <strong>SRAM</strong> (<em>Static</em> RAM), based on a transistors feedback loop, which does not require a refresh ( hence, the name <em>static</em>), <strong>Magnetic Discs</strong> ( the normal <strong>HDD</strong>) ,<strong> Flash memory</strong> and so on - each with their own requirements and implementations. At a high enough level, though, all memory storage mechanisms serve the same function: they store binary data, which can be retrieved, or modified.</p>


  




&nbsp;
  
  <p class="sqsrte-small"><span class="sqsrte-text-color--white"><em>All operations (</em><strong><em>reading</em></strong><em>, </em><strong><em>writing</em></strong><em>, </em><strong><em>refreshing</em></strong><em>) are highly dependent on a well-synchronized </em><strong><em>clock signal</em></strong><em>. For example, it’s important to ensure that the controller reads the data exactly when the sense amplifiers detect the discharge. For this reason, everything in computing is, at the lowest possible level, </em><strong><em>completely synchronous</em></strong><em>. Clock speeds are measured in Hertz (Hz). 1 Hz is the equivalent of one tick per second. To put this in perspective, a 2 GHz clock (a typical modern processor) would “tick” 2 billion times every second.</em></span></p>
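<p class="">As a back-of-the-envelope check, the numbers above combine neatly with the 64 ms refresh interval mentioned earlier (the figures are just illustrative arithmetic, not a measurement of any specific chip):</p>

```python
# How long does one tick of a 2 GHz clock last, and how many ticks
# fit inside one 64 ms DRAM refresh interval?

clock_hz = 2_000_000_000          # 2 GHz -> 2 billion ticks per second
period_s = 1 / clock_hz           # duration of a single tick

# Integer math keeps the result exact: 64 ms = 64/1000 of a second.
ticks_per_refresh = clock_hz * 64 // 1000

print(period_s)           # 5e-10 -> each tick lasts half a nanosecond
print(ticks_per_refresh)  # 128000000 -> 128 million ticks per 64 ms
```

<p class="">In other words, between two refreshes of a DRAM row, a 2 GHz processor has time for roughly 128 million clock cycles.</p>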


  




&nbsp;
  
  <p class="">Lastly, data can be <em>transferred</em> from one medium to another. This is accomplished by passing electrical current, either through independent (floating) wires or through traces of conductor, printed on a <strong><em>P</em></strong><em>rinted </em><strong><em>C</em></strong><em>ircuit </em><strong><em>B</em></strong><em>oard</em>. Generally, the individual wires or traces are called <em>lines</em>, and a grouping of related lines is known as a <strong>Bus</strong>. Conceptually, there are 3 main types of buses (though at a very high level, the <em>Bus</em> often refers to the whole group). If the bus is used to transfer the address for which the read or write operation should be performed (<em>Read from Address</em> or <em>Write to Address</em>), it’s an <strong>Address Bus</strong>. If it’s used to send a <em>clock signal</em>, which acts as a <em>synchronization mechanism</em> (metronome) for the system, it’s a <strong>Clock Bus</strong>. Finally, if the bus is used to transfer the actual binary data, it’s a <strong>Data Bus</strong>.</p><p class="">Besides their purpose, buses are characterized by a <em>width</em>, or a number of <em>lanes</em>, which represents the number of <em>wires the bus uses to transfer signals</em>. Generally, <em>Address Buses</em> and <em>Data Buses</em> have a <em>higher width</em>, while the <em>Clock Bus</em> is usually 1 line wide (it only has one wire). </p><p class="">Not all standards separate the address from the data lines. In many cases (<a href="https://www.ti.com/lit/an/sbaa565/sbaa565.pdf?ts=1751627774446&amp;ref_url=https%253A%252F%252Fwww.google.com%252F" title="Texas Instruments - Understanding the I2C Protocol for Microcontroller communication"><span><strong><em>I2C</em></strong></span></a> for example), both data and address information are sent on the same wire, but in different <em>frames</em> (an address frame, followed by one or more data frames). 
</p><p class=""><strong>Data Buses</strong> can be <em>serial</em> (they transmit the value of one bit at a time) or <em>parallel</em> (they transmit the value of several bits at a time). The advantage of parallel buses is that they can send more data faster, because they transmit more bits in a single clock cycle. However, when the traces curve across the board, the lines end up with slightly different lengths, so it’s more difficult to ensure that all bits reach their destination at the same time. </p><p class="">The diagram below shows how the uppercase letter <strong>F</strong> is transmitted, in <strong>UTF-8</strong> <em>binary encoding</em>, over a serial and over a parallel bus. As a reminder, the binary value for this character is <strong>01000110</strong>. </p><p class="">Data can be transferred either in a <em>Little Endian</em> or in a <em>Big Endian</em> order (<strong><em>L</em></strong><em>east </em><strong><em>S</em></strong><em>ignificant </em><strong><em>B</em></strong><em>it</em> is transferred first or <strong><em>M</em></strong><em>ost </em><strong><em>S</em></strong><em>ignificant </em><strong><em>B</em></strong><em>it</em> is transferred first; strictly speaking, endianness describes <em>byte</em> order, but the same terms are commonly borrowed for bit transmission order). In this case, we are looking at a <strong>Little Endian</strong> system, so <em>the least significant bit is transferred first</em>. The example is simplified, as real parallel buses are usually much wider. For example, a parallel bus might use 32 wires, to transmit 32 bits per clock cycle, instead of the 4-bit-wide data bus in the example.</p>
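<p class="">The same transfer can be mimicked in a few lines of Python, purely as an illustration; real buses are electrical signals on wires, not lists, but the bit bookkeeping is identical:</p>

```python
# Transfer the UTF-8 byte for 'F' (0b01000110) over a 1-bit serial bus
# (LSB first) and over a 4-bit-wide parallel bus (two clock cycles).

byte = ord("F")
assert byte == 0b01000110

# Serial: one bit per clock cycle, least significant bit first.
serial_cycles = [(byte >> i) & 1 for i in range(8)]
print(serial_cycles)  # [0, 1, 1, 0, 0, 0, 1, 0] -> 8 clock cycles

# Parallel (4 lines): four bits per clock cycle, low nibble first.
low_nibble = byte & 0b1111          # bits 0-3
high_nibble = (byte >> 4) & 0b1111  # bits 4-7
parallel_cycles = [low_nibble, high_nibble]
print([f"{n:04b}" for n in parallel_cycles])  # ['0110', '0100'] -> 2 cycles

# The receiver reassembles the same byte either way.
assert sum(bit << i for i, bit in enumerate(serial_cycles)) == byte
assert (parallel_cycles[1] << 4) | parallel_cycles[0] == byte
```

<p class="">The serial link needs 8 clock cycles for the byte, the 4-line parallel link only 2 - which is exactly the trade-off between wire count and transfer time described above.</p>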


  




&nbsp;










































  

    
  
    

      

      
        <figure class="
              sqs-block-image-figure
              intrinsic
            "
        >
          
        
        

        
          
            
              
              
          
            
                
                
                
                
                
                
                
                <img data-stretch="false" data-image="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/cfd3cc09-fd6a-4b4e-a8d6-6f88ef46966f/DataTransfer_SerialVsParallel.webp" data-image-dimensions="3205x1266" data-image-focal-point="0.5,0.5" alt="" data-load="false" elementtiming="system-image-block" src="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/cfd3cc09-fd6a-4b4e-a8d6-6f88ef46966f/DataTransfer_SerialVsParallel.webp?format=1000w" width="3205" height="1266" sizes="(max-width: 640px) 100vw, (max-width: 767px) 66.66666666666666vw, 66.66666666666666vw" onload="this.classList.add(&quot;loaded&quot;)" srcset="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/cfd3cc09-fd6a-4b4e-a8d6-6f88ef46966f/DataTransfer_SerialVsParallel.webp?format=100w 100w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/cfd3cc09-fd6a-4b4e-a8d6-6f88ef46966f/DataTransfer_SerialVsParallel.webp?format=300w 300w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/cfd3cc09-fd6a-4b4e-a8d6-6f88ef46966f/DataTransfer_SerialVsParallel.webp?format=500w 500w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/cfd3cc09-fd6a-4b4e-a8d6-6f88ef46966f/DataTransfer_SerialVsParallel.webp?format=750w 750w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/cfd3cc09-fd6a-4b4e-a8d6-6f88ef46966f/DataTransfer_SerialVsParallel.webp?format=1000w 1000w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/cfd3cc09-fd6a-4b4e-a8d6-6f88ef46966f/DataTransfer_SerialVsParallel.webp?format=1500w 1500w, https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/cfd3cc09-fd6a-4b4e-a8d6-6f88ef46966f/DataTransfer_SerialVsParallel.webp?format=2500w 2500w" loading="lazy" decoding="async" data-loader="sqs">

            
          
        
            
          
        

        
          
          <figcaption class="image-caption-wrapper">
            <p data-rte-preserve-empty="true"><em>Serial (left) and Parallel(right) Data Buses with dedicated global clock line</em></p>
          </figcaption>
        
      
        </figure>
      

    
  


  


&nbsp;
  
  <p class="">Over long distances (more than the usual distance between components on a single circuit board), this mechanism with a dedicated clock line becomes less and less feasible. Due to the physical characteristics of the wires, the timing signals can drift slightly between the sender and the receiver. This is known as <em>clock skew</em> or <em>timing skew</em>. To address this, most hardware protocols used to transfer data outside the realm of a single computer board (such as SATA, USB, Ethernet etc.) use various mechanisms to transfer the clock information with the data signal itself. This is colloquially known as <em>Clock Embedding</em> or <em>Self-Clocking</em>. For example, the <strong>Ethernet</strong> <em>protocol</em> specifies that, at the beginning of each transferred <strong>frame</strong> (more specifically, in the <strong>Preamble</strong>, the 8 bytes sent just before each <em>frame</em>) there must be a sequence of alternating <strong>high</strong> and <strong>low</strong> <em>bits</em> (<em>10101010</em>, repeated for 7 bytes and closed by a 1-byte <em>Start Frame Delimiter</em>), which the receiving Network Controller uses to synchronize its own internal clock. 
This mechanism is known as <strong><em>C</em></strong><em>lock and </em><strong><em>D</em></strong><em>ata </em><strong><em>R</em></strong><em>ecovery</em>.</p><p class="sqsrte-large"><em>To be Continued…</em></p>]]></content:encoded><media:content type="image/jpeg" url="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/1759076966835-ZA94SGQIHHJVA75LN51X/unsplash-image-QweeHI91iVY.jpg?format=1500w" medium="image" isDefault="true" width="1500" height="844"><media:title type="plain">Starting with the basics - Part I</media:title></media:content></item><item><title>Taking the first byte</title><dc:creator>Samwise Prudent</dc:creator><pubDate>Sun, 28 Sep 2025 12:42:40 +0000</pubDate><link>https://www.prudentleap.com/prudent-protocol/2025/9/taking-the-first-byte</link><guid isPermaLink="false">6721f9295c6d593f58a1c57b:68d86448421a7f587b06ce19:68d8646f61cffb43282be130</guid><description><![CDATA[All journeys have to start somehow, somewhere. This post and this space are 
as good as any…]]></description><content:encoded><![CDATA[<p class="">I created <strong>The Prudent Protocol</strong> as a place to share ideas and, hopefully, encourage others to explore software development “the old-fashioned way”: by learning and then doing on our own. </p><p class="">I will focus on ideas through the lens of Apple’s ecosystem, but the underlying concepts would generally be transferable to other platforms and technologies. No large ecosystem exists in a vacuum. Ideas, especially the good ones, often cross the boundaries of individual companies and their platforms. </p><p class="">Throughout my career, I have always documented my projects. I recorded what I did and why - and it all came in handy, sooner or later. When I wanted to take on new challenges, I used my documentation to ramp up my replacements, so my managers and teams found it easier to support me. When I needed to explain why I designed a software system the way I did, I used my documents to show my process and gain trust. And when I needed to know why I didn’t apply one pattern over another, I had my documents to tell me why… I still document everything I do, some 18 years later - though now it’s also a more prevalent part of my day job. It is, therefore, only natural for me to continue to write, as I walk the journey of <strong>Prudent Leap Software</strong>, in my free time. </p><p class="">I don’t claim to know how technology will evolve and where it’s headed, but I believe accurate, structured information will always matter. As tools evolve, understanding the concepts, models and patterns behind modern software architecture will remain just as important, if not more so. Although <strong>Large Language Models</strong> may change how we find information and even how we build <em>production, enterprise applications</em>, <strong>context</strong> (pun intended) does and will still matter. 
In a system where the inputs are words, sentences and paragraphs, often framed as questions, knowing what words to use and what questions to ask still feels <strong>foundational</strong>. I am not so sure that AI will fully replace software developers, or even that developers who use AI will replace those who don’t. But I am convinced well-trained, knowledgeable developers will replace those who are not, with or without AI.</p><p class="">Some posts will be full excerpts from the books I am or have been working on. Others will represent ideas I found interesting enough to explore, but that did not really fit in a book (<em>yet</em>). I am not looking to cover every new feature Apple releases for their frameworks, nor every new evolution initiative in the Swift community. There are already many creators who do - and I try to follow all of them, as my time permits. But, if I feel I can add to the conversation in some way, I will.</p><p class=""><strong>As for you,</strong> <strong>my dear reader,</strong> I hope you’ll enjoy your time here - and that you will allow others to enjoy their time here, as well. Be decent. Be humble. If you can, teach others. If you know you’re right, defend your point of view, with arguments and examples. <strong>Don’t be disrespectful</strong>. Become better every day, bit by bit. And in doing so, improve those around you, so we can all grow, together.</p>]]></content:encoded><media:content type="image/jpeg" url="https://images.squarespace-cdn.com/content/v1/6721f9295c6d593f58a1c57b/1759065019012-S5M7GFL0WB2HXSGDU9LE/unsplash-image-QweeHI91iVY.jpg?format=1500w" medium="image" isDefault="true" width="1500" height="844"><media:title type="plain">Taking the first byte</media:title></media:content></item></channel></rss>