Hacker News

I agree, this generally left me feeling skeptical. I know of Luca Delle Monache on the advisory team through colleagues who have done research under him at Scripps, and they spoke highly of him. But yes, there is a lot left to the imagination here.

With regard to the SF Bay specifically: I used to work with a fairly high-resolution wind model for the bay (a more traditional dynamics-based simulation), and it worked pretty well overall, but every time a storm blew through it would crash. That ultimately came down to the relatively steep terrain around the bay (and the physics configurations we were using in the actual model).

Even if they are using DL, they still need initial and boundary conditions. As I said, there are a ton of weather stations around, so I could imagine a DL-type approach that looked at terrain elevation plus recent and historical observations to initialize a forecast, but I still imagine the boundary conditions would have to be provided by nesting this in a larger model somehow. Then again, I'm not a DL expert at all, so there is probably newer work in this field that I'm just out of date on.
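To make the "initialize from station observations + terrain" idea concrete, here's a minimal sketch using inverse-distance weighting to spread scattered observations onto a grid. This is a classical stand-in, not whatever they actually do; a DL approach would replace the interpolation with a learned encoder, but the inputs look the same. All station locations and values here are made up.

```python
import numpy as np

def idw_grid(station_xy, station_vals, grid_x, grid_y, power=2.0):
    """Interpolate scattered observations onto a regular grid
    via inverse-distance weighting (a convex combination of obs)."""
    gx, gy = np.meshgrid(grid_x, grid_y)
    pts = np.stack([gx.ravel(), gy.ravel()], axis=1)          # (N, 2) grid points
    d = np.linalg.norm(pts[:, None, :] - station_xy[None, :, :], axis=2)
    d = np.maximum(d, 1e-9)                                   # avoid divide-by-zero
    w = 1.0 / d**power
    vals = (w @ station_vals) / w.sum(axis=1)
    return vals.reshape(gy.shape)

# Hypothetical stations (normalized lon/lat) and wind-speed obs in m/s
stations = np.array([[0.2, 0.3], [0.8, 0.7], [0.5, 0.9]])
wind = np.array([4.0, 9.0, 6.0])
grid = idw_grid(stations, wind, np.linspace(0, 1, 16), np.linspace(0, 1, 16))
print(grid.shape)  # (16, 16)
```

Because the result is a weighted average of the observations, every grid cell stays within the observed range; a real initialization would also blend in a model background field rather than trust stations alone.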

It's really expensive to run your own dynamical forecast model at this resolution, at a refresh rate acceptable for an actual forecast. That's why I suspect they're taking existing weather models and downscaling them with DL techniques, but I can't really know just by looking.
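For intuition about what "downscaling an existing model with learned techniques" means, here's a toy sketch: upsample a coarse field to the fine grid, then fit a per-cell correction from terrain elevation. I'm using a plain linear least-squares fit on synthetic data as the simplest possible "learned" downscaler; a real DL system would use something like a CNN trained on reanalysis pairs. Everything here (grids, weights, data) is invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def upsample(coarse, factor):
    """Nearest-neighbor upsampling of a 2-D field (stand-in for bilinear)."""
    return np.kron(coarse, np.ones((factor, factor)))

# Synthetic data: 8x8 coarse wind speed, 32x32 fine-grid "truth" that
# depends linearly on the upsampled coarse field plus terrain elevation.
factor = 4
terrain = rng.random((32, 32))               # normalized elevation
coarse = rng.random((8, 8)) * 10             # m/s
fine_truth = 0.8 * upsample(coarse, factor) - 2.0 * terrain + 1.0

# Fit the downscaler: fine ~ w0 * upsample(coarse) + w1 * terrain + b
X = np.column_stack([
    upsample(coarse, factor).ravel(),
    terrain.ravel(),
    np.ones(terrain.size),
])
w, *_ = np.linalg.lstsq(X, fine_truth.ravel(), rcond=None)

fine_pred = (X @ w).reshape(32, 32)
print(np.round(w, 2))  # recovers the generating weights [0.8, -2.0, 1.0]
```

The point is the cost profile: the expensive dynamics run once at coarse resolution (or is someone else's model output), and the fine-grid product is just cheap inference on top of it.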




(For clarity, I was referring to the company leadership proper, not the advisory team.)



