ChanServ changed the topic of #etnaviv to: #etnaviv - the home of the reverse-engineered Vivante GPU driver - Logs https://oftc.irclog.whitequark.org/etnaviv
pcercuei has quit [Quit: dodo]
alarumbe has quit []
SpieringsAE has joined #etnaviv
frieder has joined #etnaviv
pcercuei has joined #etnaviv
lynxeye has joined #etnaviv
szemzoa_ has quit [Remote host closed the connection]
szemzoa has joined #etnaviv
szemzoa has quit [Remote host closed the connection]
szemzoa has joined #etnaviv
szemzoa has quit [Remote host closed the connection]
szemzoa has joined #etnaviv
frieder has quit [Ping timeout: 480 seconds]
frieder has joined #etnaviv
szemzoa has quit [Remote host closed the connection]
szemzoa has joined #etnaviv
<mwalle> anyone seen something like that lately? https://pastebin.com/raw/p3rPepQ9
<mwalle> (this is on an LS1028A)
<lynxeye> mwalle: Not sure if there are any other systems with etnaviv + iommu aside from ls1028a. This may be triggered by the quite unusual setup where we try to configure DMA for the drm device part of the driver when the module is probed. That can happen before the iommu is probed, as the drm driver doesn't have any direct dependency on the iommu; only the individual GPU nodes have that link.
<lynxeye> Maybe it would be a good idea to defer this DMA configuration for the drm device to gpu_load time, when we can be sure that the GPU nodes are probed and thus the iommu must also be present.
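A minimal sketch of the deferral lynxeye describes, assuming the DMA setup is moved out of the platform probe path and into a point where all GPU cores have already bound (gpu_load time). The helper name, its placement, and the mask width are hypothetical illustrations, not the actual etnaviv patch; only of_dma_configure() and dma_set_mask_and_coherent() are standard kernel APIs:

    #include <linux/dma-mapping.h>
    #include <linux/of_device.h>
    #include <linux/platform_device.h>

    /*
     * Hypothetical helper: called from the component-master bind path
     * (gpu_load time) rather than from the platform driver's probe, so
     * every GPU core node -- and therefore the iommu it references --
     * is already probed when the drm device's DMA is configured.
     */
    static int etnaviv_example_dma_configure(struct platform_device *pdev,
                                             struct device_node *core_np)
    {
        struct device *dev = &pdev->dev;
        int ret;

        /* Pick up dma-ranges/iommu settings from a GPU core's DT node. */
        ret = of_dma_configure(dev, core_np, true);
        if (ret)
            return ret;

        /* Example mask only; the real driver may use a different width. */
        return dma_set_mask_and_coherent(dev, DMA_BIT_MASK(32));
    }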
SpieringsAE has quit [Remote host closed the connection]
frieder has quit [Remote host closed the connection]
alarumbe has joined #etnaviv
lynxeye has quit [Quit: Leaving.]
pcercuei has quit [Quit: dodo]