
BUILD FAILED .. SwiftEmitModule normal arm64 Emitting module for react_native_llm_mediapipe #17

Open
SpaceTimeEvent2 opened this issue Oct 30, 2024 · 1 comment


SpaceTimeEvent2 commented Oct 30, 2024

With some effort I managed to spin up the project from the example app, but after installing react-native-llm-mediapipe via npm in a freshly generated project, the build fails:

nvm use --lts
# also tried with nvm use 18
# also tried the latest RN
npx @react-native-community/cli@latest init LlmMediapipeExample --version 0.73.7
cd LlmMediapipeExample
npm install react-native-llm-mediapipe

# in Xcode, copy gemma-2b-it-cpu-int4.bin to the root of the project
# in Xcode, copy falcon-rw-1b-gpu.bin to the root of the project

cd ios && pod install && cd ..
npx react-native run-ios

Output:
...
Lots of errors
...
/Users/user/Library/Developer/Xcode/DerivedData/LlmProject-afbekltqetkvwhabyqjxrvrkpevz/Build/Intermediates.noindex/Pods.build/Debug-iphonesimulator/react-native-llm-mediapipe.build/Objects-normal/arm64/react_native_llm_mediapipe.abi.json
/Users/user/Data_local/projects/mntl/llm_research/makingitwork/LlmProject/node_modules/react-native-llm-mediapipe/ios/LlmInferenceModel.swift:32:16: error: value of type 'LlmInference.Options' has no member 'topk'
llmOptions.topk = self.topK
....
warning: Run script build phase 'Bundle React Native code and images' will be run during every build because it does not specify any outputs. To address this issue, either add output dependencies to the script phase, or configure it to run in every build by unchecking "Based on dependency analysis" in the script phase. (in target 'LlmProject' from project 'LlmProject')

--- xcodebuild: WARNING: Using the first of multiple matching destinations:
{ platform:iOS Simulator, id:A0DDE1B1-E28F-45E5-964F-3D8FEA09663D, OS:18.0, name:iPhone SE (3rd generation) }
{ platform:iOS Simulator, id:A0DDE1B1-E28F-45E5-964F-3D8FEA09663D, OS:18.0, name:iPhone SE (3rd generation) }
** BUILD FAILED **

The following build commands failed:
SwiftEmitModule normal arm64 Emitting\ module\ for\ react_native_llm_mediapipe (in target 'react-native-llm-mediapipe' from project 'Pods')
EmitSwiftModule normal arm64 (in target 'react-native-llm-mediapipe' from project 'Pods')
Building workspace LlmProject with scheme LlmProject and configuration Debug
(3 failures)

SpaceTimeEvent2 (Author) commented:

UPDATE:
I commented out these lines in node_modules/react-native-llm-mediapipe/ios/LlmInferenceModel.swift and it builds and runs now, although I'm not sure yet whether it runs as intended: it responds to the first prompt, then stops responding on the second prompt.

final class LlmInferenceModel {

  weak var delegate: LlmInferenceModelDelegate?

  private lazy var inference: LlmInference! = {
    let llmOptions = LlmInference.Options(modelPath: self.modelPath)
    llmOptions.maxTokens = self.maxTokens
    //llmOptions.topK = self.topK
    //llmOptions.temperature = self.temperature
    //llmOptions.randomSeed = self.randomSeed
    return try? LlmInference(options: llmOptions)
  }()
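For what it's worth, the "no member 'topk'" error suggests the pod pulled in a newer MediaPipe GenAI release where the sampling parameters (topK, temperature, randomSeed) were moved off `LlmInference.Options` onto per-session options. Below is a hedged sketch of what the session-based initialization might look like; the `LlmInference.Session` API, its `Options` fields, and the `addQueryChunk`/`generateResponse` calls are assumptions based on the current MediaPipe docs, not verified against the version this library pins:

```swift
import MediaPipeTasksGenAI

// Sketch only: in recent MediaPipe GenAI releases, topk / temperature /
// randomSeed appear to live on LlmInference.Session.Options rather than
// on LlmInference.Options (assumption; check against your pod version).
final class LlmInferenceModelSketch {
  private let modelPath: String
  private let maxTokens: Int
  private let topK: Int
  private let temperature: Float
  private let randomSeed: Int

  init(modelPath: String, maxTokens: Int, topK: Int,
       temperature: Float, randomSeed: Int) {
    self.modelPath = modelPath
    self.maxTokens = maxTokens
    self.topK = topK
    self.temperature = temperature
    self.randomSeed = randomSeed
  }

  func generate(prompt: String) throws -> String {
    // Engine-level options now carry only the model path and token budget.
    let engineOptions = LlmInference.Options(modelPath: modelPath)
    engineOptions.maxTokens = maxTokens
    let inference = try LlmInference(options: engineOptions)

    // Sampling parameters go on the session options (assumed API).
    let sessionOptions = LlmInference.Session.Options()
    sessionOptions.topk = topK
    sessionOptions.temperature = temperature
    sessionOptions.randomSeed = randomSeed

    let session = try LlmInference.Session(llmInference: inference,
                                           options: sessionOptions)
    try session.addQueryChunk(inputText: prompt)
    return try session.generateResponse()
  }
}
```

If this reading of the API is right, it might also bear on the "stops responding on the second prompt" symptom: a session holds conversation state, so reusing one stale session (or never creating a new one per prompt) could plausibly wedge the second request. That is speculation, though, not something I've confirmed.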
