
Unable to Import and Use FlexDelegate in TensorFlowLite in iOS #67538

Closed
tanpengshi opened this issue May 14, 2024 · 9 comments
Assignees
Labels
comp:lite TF Lite related issues stat:awaiting response Status - Awaiting response from author TF 2.9 Issues found in the TF 2.9 release (or RCs) type:support Support issues

Comments

@tanpengshi

IDE: Xcode 15
Platform: iOS17
TensorFlow version: r2.9

I am developing iOS and Android apps that run a TensorFlow Lite model. Because my model uses an LSTM, I have to make use of TF Select ops.

In addition, because the TensorFlowLiteSelectTfOps library has a large binary size, I have to do a selective build. After much effort, I succeeded in getting my TensorFlow Lite model to run smoothly in the Android app using the selectively built libraries.
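For comparison, the non-selective Android route is just a Gradle dependency (a sketch; the version shown is an assumption matching the TF 2.9 release, and a selective build replaces this AAR with a locally built one to cut size):

```groovy
dependencies {
    // Core TensorFlow Lite runtime
    implementation 'org.tensorflow:tensorflow-lite:2.9.0'
    // Flex delegate / Select TF ops runtime (large; this is what motivates the selective build)
    implementation 'org.tensorflow:tensorflow-lite-select-tf-ops:2.9.0'
}
```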

On iOS, however, I used a Bazel build:

bash tensorflow/lite/ios/build_frameworks.sh \
  --input_models=model1.tflite,model2.tflite \
  --target_archs=x86_64,armv7,arm64

to generate the:

  1. TensorFlowLiteSelectTfOps.framework
  2. TensorFlowLiteC.framework

After that, I edited TensorFlowLiteSwift.podspec and TensorFlowLiteSelectTfOps.podspec to include these two frameworks, then successfully built the library in Xcode using a Podfile. I have gotten very far: I am able to import TensorFlowLiteSelectTfOps in my code and construct an Interpreter:

class TensorFlowModel {
    var interpreter: Interpreter?

    init(modelPath: String) {
        do {
            // Initialize the Flex Delegate
            let flexDelegate = MetalDelegate()  // Create an instance of FlexDelegate

            // Initialize the Interpreter with the Flex Delegate
            let options = Interpreter.Options()  // You can configure options if needed
            let delegates = [flexDelegate]  // Create an array of delegates
            interpreter = try Interpreter(modelPath: modelPath, options: options, delegates: delegates)
            
            // Allocate tensors
            try interpreter?.allocateTensors()
        } catch {
            print("Failed to create TensorFlow Lite interpreter: \(error)")
        }
    }

    func runModel() {
        do {
            try interpreter?.invoke()
            // Process the output
        } catch {
            print("Failed to invoke TensorFlow Lite interpreter: \(error)")
        }
    }
}

There are no build errors, which is good, but the problem is that I cannot find or import a FlexDelegate library, which I need because my TFLite model relies on Select ops. No FlexDelegate Swift files were built.

Thanks in advance! And I hope to complete my iOS project.

@tanpengshi tanpengshi added the comp:lite TF Lite related issues label May 14, 2024
@sushreebarsa sushreebarsa added type:support Support issues TF 2.9 Issues found in the TF 2.9 release (or RCs) labels May 15, 2024
@sushreebarsa
Contributor

@tanpengshi The FlexDelegate functionality is integrated into the TensorFlow Lite interpreter itself when it is built with Select ops enabled. Could you please upgrade to the latest TF version and let us know, as older TF versions are not actively supported. Thank you!
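To illustrate the point above: with Select ops, the Flex delegate is applied by the runtime itself, so the Swift side needs no delegate object at all. A minimal sketch (the class name is illustrative, and it assumes the TensorFlowLiteSelectTfOps framework is force-loaded into the app target):

```swift
import TensorFlowLite  // from the TensorFlowLiteSwift pod

// Assumed setup: TensorFlowLiteSelectTfOps.framework is linked into the app
// target with -force_load. There is no FlexDelegate class in the Swift API;
// once the framework is force-loaded, the Select TF op kernels register
// themselves and a plain Interpreter picks them up.
final class SelectOpsModel {
    private(set) var interpreter: Interpreter?

    init?(modelPath: String) {
        do {
            // No delegate argument is needed for Select ops.
            interpreter = try Interpreter(modelPath: modelPath)
            try interpreter?.allocateTensors()
        } catch {
            print("Failed to create interpreter: \(error)")
            return nil
        }
    }
}
```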

@sushreebarsa sushreebarsa added the stat:awaiting response Status - Awaiting response from author label May 15, 2024
@tanpengshi
Author

I have tried invoking the interpreter, and I get an error. Here is the code:

        guard let interpreter = tflite else {
            print("TFLite Error: Interpreter is nil.")
            return
        }
        do {
            let inputData = Data(buffer: UnsafeBufferPointer(start: dataBuffer, count: dataBuffer.count))
            try interpreter.copy(inputData, toInputAt: 0)
            try interpreter.invoke()

            let output = getTensorOutput(interpreter: interpreter)
            let detectedActionIndex = output.argmax()
            print("Detected Action Index is: \(detectedActionIndex)")
        } catch {
            print("TFLite Error: \(error.localizedDescription)")
        }

TensorFlow Lite Error: Select TensorFlow op(s), included in the given model, is(are) not supported by this interpreter. Make sure you apply/link the Flex delegate before inference. For the Android, it can be resolved by adding "org.tensorflow:tensorflow-lite-select-tf-ops" dependency. See instructions: https://www.tensorflow.org/lite/guide/ops_select
TensorFlow Lite Error: Node number 76 (FlexTensorListReserve) failed to prepare.
TFLite Error: Must call allocateTensors().

I have already "import TensorFlowLiteSelectTfOps" in my Swift file.
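A likely cause of the errors above, per the TensorFlow Lite "Select TensorFlow operators" iOS guide: a manually built TensorFlowLiteSelectTfOps.framework must be force-loaded into the app target, otherwise the linker strips its static op registrars and invoke() fails exactly like this; importing the module in Swift is not enough. A sketch of the Other Linker Flags entry (the framework path is an assumption, adjust to your project layout):

```
OTHER_LDFLAGS = $(inherited) -force_load "$(SRCROOT)/frameworks/TensorFlowLiteSelectTfOps.framework/TensorFlowLiteSelectTfOps"
```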

[screenshot attached]

I accidentally deleted CoreMLDelegate.swift and MetalDelegate.swift from the library. Could that cause this error? I don't know how to recover the deleted files. When I run:

rm -rf Pods Podfile.lock
pod install

The files are still missing. I really appreciate your help! I have come very close to a solution for my app!
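On recovering the deleted files: `rm -rf Pods` only removes the installed copy, and `pod install` may restore the pod from the CocoaPods cache rather than re-fetching it. Since this setup points the pod at a local TensorFlow checkout (as in the `:path =>` Podfile entries in this thread), the deleted Swift sources live in that checkout and can be restored with git. A sketch, assuming the checkout is a git clone (paths are illustrative):

```shell
# Restore the deleted delegate sources in the local TensorFlow checkout
cd /path/to/tensorflow
git checkout -- tensorflow/lite/swift/Sources/CoreMLDelegate.swift \
                tensorflow/lite/swift/Sources/MetalDelegate.swift

# Then, in the app directory, clear CocoaPods state and reinstall
cd /path/to/app
pod cache clean TensorFlowLiteSwift --all
rm -rf Pods Podfile.lock
pod install
```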

@google-ml-butler google-ml-butler bot removed the stat:awaiting response Status - Awaiting response from author label May 16, 2024
@sushreebarsa
Contributor

@tanpengshi Directly recovering deleted libraries from within your project isn't possible. Could you try the latest TFLite version? Newer versions might include these delegates within the framework itself, eliminating the need for separate libraries. After reinstalling/updating, clean and rebuild your project to ensure the changes take effect.

Thank you!

@sushreebarsa sushreebarsa added the stat:awaiting response Status - Awaiting response from author label May 16, 2024
@tanpengshi
Author

Are you suggesting that I recreate the project from scratch?

For now, I am proceeding with a more conventional method using the Podfile:

# Uncomment the next line to define a global platform for your project
platform :ios, '17.0'

target 'FacialRecognition' do
  # Comment the next line if you don't want to use dynamic frameworks
  use_frameworks!
  #pod 'TensorFlowLiteSwift', :path => '../../local-podspecs/TensorFlowLiteSwift.podspec'
  # pod 'TensorFlowLiteSelectTfOps', :path => '../../local-podspecs/TensorFlowLiteSelectTfOps.podspec'
  # pod 'TensorFlowLiteSwift'   # or 'TensorFlowLiteObjC'
  # pod 'TensorFlowLiteSelectTfOps', '~> 0.0.1-nightly'


  pod 'TensorFlowLiteSwift'
  pod 'TensorFlowLiteSelectTfOps', '~> 0.0.1-nightly'

  # Pods for FacialRecognition

  target 'FacialRecognitionTests' do
    inherit! :search_paths
    # Pods for testing
  end

  target 'FacialRecognitionUITests' do
    # Pods for testing
  end

  # Add these lines to ensure consistent EXCLUDED_ARCHS settings
  post_install do |installer|
    installer.pods_project.targets.each do |target|
      target.build_configurations.each do |config|
        config.build_settings['EXCLUDED_ARCHS[sdk=iphonesimulator*]'] = 'arm64'
      end
    end
  end

end

But when I build my project, I get:

[screenshot of the build error]

When I check the 'Pods' directory, I don't see a 'resources-to-copy-FacialRecognition' file.

@google-ml-butler google-ml-butler bot removed the stat:awaiting response Status - Awaiting response from author label May 16, 2024
@tanpengshi
Author

I am now able to solve the issue above, and the TensorFlow Lite model with Select ops runs successfully, by following the suggestion here:

https://stackoverflow.com/questions/76792138/sandbox-bash72986-deny1-file-write-data-users-xxx-ios-pods-resources-to-co

However, my app size is over 200 MB because of the library! Hence, I need to do a selective build, which effectively returns me to the first approach. For the first approach, my Podfile was:

platform :ios, '17.0'

target 'FacialRecognition' do
  # Comment the next line if you don't want to use dynamic frameworks
  use_frameworks!
  pod 'TensorFlowLiteSwift', :path => '../../local-podspecs/TensorFlowLiteSwift.podspec'

  # Pods for FacialRecognition

  target 'FacialRecognitionTests' do
    inherit! :search_paths
    # Pods for testing
  end

  target 'FacialRecognitionUITests' do
    # Pods for testing
  end

  # Add these lines to ensure consistent EXCLUDED_ARCHS settings
  post_install do |installer|
    installer.pods_project.targets.each do |target|
      target.build_configurations.each do |config|
        config.build_settings['EXCLUDED_ARCHS[sdk=iphonesimulator*]'] = 'arm64'
      end
    end
  end

end

In my TensorFlowLiteSwift.podspec file, I have:

Pod::Spec.new do |s|
  s.name             = 'TensorFlowLiteSwift'
  s.version          = '2.7.0'
  s.authors          = 'Google Inc.'
  s.license          = { :type => 'Apache' }
  s.homepage         = 'https://github.com/tensorflow/tensorflow'
  s.source           = { :git => 'https://github.com/tensorflow/tensorflow.git', :tag => "v#{s.version}" }
  s.summary          = 'TensorFlow Lite for Swift'
  s.description      = <<-DESC

  TensorFlow Lite is TensorFlow's lightweight solution for Swift developers. It
  enables low-latency inference of on-device machine learning models with a
  small binary size and fast performance supporting hardware acceleration.
                       DESC

  s.ios.deployment_target = '17.0'

  s.module_name = 'TensorFlowLite'
  s.static_framework = true

  tfl_dir = 'tensorflow/lite/'
  swift_dir = tfl_dir + 'swift/'

  s.default_subspec = 'Core'

  s.subspec 'Core' do |core|
    # Adjust the path to point to your custom frameworks
    core.vendored_frameworks = [
      'frameworks/TensorFlowLiteC.framework',
      'frameworks/TensorFlowLiteSelectTfOps.framework'
    ]
    
    core.source_files = swift_dir + 'Sources/*.swift'
    core.exclude_files = swift_dir + 'Sources/{CoreML,Metal}Delegate.swift'

    core.test_spec 'Tests' do |ts|
      ts.source_files = swift_dir + 'Tests/*.swift'
      ts.exclude_files = swift_dir + 'Tests/MetalDelegateTests.swift'
      ts.resources = [
        tfl_dir + 'testdata/add.bin',
        tfl_dir + 'testdata/add_quantized.bin',
      ]
    end
  end

  s.subspec 'CoreML' do |coreml|
    coreml.source_files = swift_dir + 'Sources/CoreMLDelegate.swift'
    coreml.dependency 'TensorFlowLiteSwift/Core', "#{s.version}"
  end

  s.subspec 'Metal' do |metal|
    metal.source_files = swift_dir + 'Sources/MetalDelegate.swift'
    metal.dependency 'TensorFlowLiteSwift/Core', "#{s.version}"

    metal.test_spec 'Tests' do |ts|
      ts.source_files = swift_dir + 'Tests/{Interpreter,MetalDelegate}Tests.swift'
      ts.resources = [
        tfl_dir + 'testdata/add.bin',
        tfl_dir + 'testdata/add_quantized.bin',
        tfl_dir + 'testdata/multi_add.bin',
      ]
    end
  end
end

But using this method, with the MetalDelegate and CoreMLDelegate source files deleted, I get a FlexDelegate-not-found error.
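One difference from the upstream pods that may explain this: the official TensorFlowLiteSelectTfOps.podspec injects a force-load linker flag into the consuming app target; without it, the Select op registrars in the static framework are dead-stripped. (Note also that the Swift API has no FlexDelegate class to import in any case.) A sketch of such a podspec fragment (paths are illustrative, not the verbatim upstream file):

```ruby
Pod::Spec.new do |s|
  s.name                = 'TensorFlowLiteSelectTfOps'
  s.version             = '2.7.0'
  s.vendored_frameworks = 'frameworks/TensorFlowLiteSelectTfOps.framework'
  # Force-load the static framework so the Select TF op registrars survive linking
  s.user_target_xcconfig = {
    'OTHER_LDFLAGS' => '-force_load "$(PODS_ROOT)/TensorFlowLiteSelectTfOps/frameworks/TensorFlowLiteSelectTfOps.framework/TensorFlowLiteSelectTfOps"'
  }
end
```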

@sushreebarsa
Contributor

@tanpengshi Thank you for the update. Glad the issue has been resolved.
Could you please let us know if we can close the issue?
Thank you!

@sushreebarsa sushreebarsa added the stat:awaiting response Status - Awaiting response from author label May 17, 2024
@tanpengshi
Author

Sure, we can close this issue, and I will open a new one for the remaining problem.


@sushreebarsa
Contributor

@tanpengshi Thank you!
