Hacker News
Show HN: CLI for working with Apple Core ML models (github.com/schappim)
45 points by schappim 1 day ago | 5 comments




Looks really nice. I plan to try it out this weekend. I'm not familiar with all the Core ML models. Where can I find model names to try out?

Does this handle conversion and quantization from PyTorch? Or is it strictly for running existing Core ML files?

Nope, but Apple released the Python library "coremltools"[1], which can do the conversion. It supports converting from PyTorch, TF/TF Lite, ONNX, etc.

1. https://pypi.org/project/coremltools/


> It supports the conversion of PyTorch, TF/TF Lite, ONNX

I think it doesn't support TF Lite (only TF SavedModels), and ONNX hasn't been supported for quite a while now, sadly.

As for the repo, I like it. I actually had to convert a few models yesterday and today, so this would be useful. I see you use Swift instead of coremltools, which is great: benchmarking should have less overhead.

Some ideas:

1) Would love to have this also as an agent skill

2) It would be good if we could parse the Xcode performance report file and print it in a human-readable format (to pass to an AI). Gemini Pro struggled for a while to figure out the JSON format.
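A minimal sketch of such a pretty-printer. The JSON shape below is hypothetical; the real Xcode performance report schema differs, so the key names would need adapting to an actual exported file:

```python
import json

# Hypothetical report shape, stood in for a real Xcode performance report.
SAMPLE = """
{
  "modelName": "MobileNetV2",
  "computeUnits": "ALL",
  "operations": [
    {"name": "conv_1", "preferredDevice": "ANE",
     "supportedDevices": ["CPU", "GPU", "ANE"]},
    {"name": "softmax", "preferredDevice": "CPU",
     "supportedDevices": ["CPU"]}
  ]
}
"""

def summarize(report: dict) -> str:
    """Flatten the per-operation device placement into readable lines."""
    lines = [f"Model: {report['modelName']} "
             f"(compute units: {report['computeUnits']})"]
    for op in report["operations"]:
        lines.append(f"  {op['name']:<10} -> {op['preferredDevice']} "
                     f"(supported: {', '.join(op['supportedDevices'])})")
    return "\n".join(lines)

print(summarize(json.loads(SAMPLE)))
```

Output like this is also much easier for an LLM to reason about than the raw nested JSON.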


The JSON output makes it easy to wrap as a tool for frameworks like LangGraph, but I would be worried about the latency. Since it is a CLI, you are likely reloading the whole model for every invocation. That overhead is significant compared to a persistent service where the model stays loaded in memory.
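The reload cost can be sketched with a stub loader, where a sleep stands in for real model-load time (which for a large Core ML model is usually far larger than 50 ms):

```python
import time

def load_model():
    # Stand-in for Core ML model loading; the sleep simulates load cost.
    time.sleep(0.05)
    return lambda x: x * 2

def predict_cold(x):
    # CLI-style: a fresh process reloads the model on every invocation.
    model = load_model()
    return model(x)

class WarmService:
    # Service-style: load once, reuse across requests.
    def __init__(self):
        self.model = load_model()

    def predict(self, x):
        return self.model(x)

start = time.perf_counter()
for i in range(5):
    predict_cold(i)
cold = time.perf_counter() - start

svc = WarmService()  # one-time load cost paid here
start = time.perf_counter()
for i in range(5):
    svc.predict(i)
warm = time.perf_counter() - start

print(f"cold: {cold:.3f}s  warm: {warm:.3f}s")
```

Five cold calls pay the load cost five times; the warm service pays it once up front, so per-request latency stays near zero.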


