Apr 16 19:27:36.455389 ip-10-0-129-155 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 16 19:27:36.455400 ip-10-0-129-155 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 16 19:27:36.455407 ip-10-0-129-155 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 16 19:27:36.455635 ip-10-0-129-155 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 16 19:27:46.537159 ip-10-0-129-155 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 16 19:27:46.537174 ip-10-0-129-155 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot d6b01b61406940fbacdc4e4e7d73aff4 --
Apr 16 19:30:16.325529 ip-10-0-129-155 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 19:30:16.678022 ip-10-0-129-155 kubenswrapper[2579]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 19:30:16.678022 ip-10-0-129-155 kubenswrapper[2579]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 19:30:16.678022 ip-10-0-129-155 kubenswrapper[2579]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 19:30:16.678022 ip-10-0-129-155 kubenswrapper[2579]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 19:30:16.678022 ip-10-0-129-155 kubenswrapper[2579]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 19:30:16.679565 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.679477 2579 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 19:30:16.682318 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682303 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 19:30:16.682318 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682318 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 19:30:16.682381 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682322 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 19:30:16.682381 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682327 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 19:30:16.682381 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682329 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 19:30:16.682381 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682332 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 19:30:16.682381 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682337 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 19:30:16.682381 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682341 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 19:30:16.682381 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682344 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 19:30:16.682381 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682347 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 19:30:16.682381 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682349 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 19:30:16.682381 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682352 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 19:30:16.682381 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682355 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 19:30:16.682381 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682366 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 19:30:16.682381 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682370 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 19:30:16.682381 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682373 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 19:30:16.682381 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682375 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 19:30:16.682381 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682380 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 19:30:16.682381 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682384 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 19:30:16.682381 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682387 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 19:30:16.682381 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682391 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 19:30:16.682835 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682394 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 19:30:16.682835 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682397 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 19:30:16.682835 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682400 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 19:30:16.682835 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682403 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 19:30:16.682835 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682406 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 19:30:16.682835 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682409 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 19:30:16.682835 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682411 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 19:30:16.682835 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682414 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 19:30:16.682835 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682417 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 19:30:16.682835 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682419 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 19:30:16.682835 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682422 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 19:30:16.682835 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682424 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 19:30:16.682835 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682427 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 19:30:16.682835 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682430 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 19:30:16.682835 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682433 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 19:30:16.682835 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682435 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 19:30:16.682835 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682438 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 19:30:16.682835 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682440 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 19:30:16.682835 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682443 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 19:30:16.682835 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682445 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 19:30:16.682835 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682448 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 19:30:16.683363 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682451 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 19:30:16.683363 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682453 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 19:30:16.683363 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682456 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 19:30:16.683363 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682458 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 19:30:16.683363 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682461 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 19:30:16.683363 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682463 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 19:30:16.683363 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682466 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 19:30:16.683363 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682469 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 19:30:16.683363 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682471 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 19:30:16.683363 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682474 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 19:30:16.683363 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682477 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 19:30:16.683363 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682479 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 19:30:16.683363 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682484 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 19:30:16.683363 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682487 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 19:30:16.683363 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682490 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 19:30:16.683363 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682492 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 19:30:16.683363 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682495 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 19:30:16.683363 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682498 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 19:30:16.683363 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682500 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 19:30:16.683363 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682503 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 19:30:16.683857 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682505 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 19:30:16.683857 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682509 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 19:30:16.683857 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682511 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 19:30:16.683857 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682514 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 19:30:16.683857 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682517 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 19:30:16.683857 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682519 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 19:30:16.683857 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682522 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 19:30:16.683857 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682524 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 19:30:16.683857 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682527 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 19:30:16.683857 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682529 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 19:30:16.683857 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682532 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 19:30:16.683857 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682534 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 19:30:16.683857 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682537 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 19:30:16.683857 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682539 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 19:30:16.683857 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682542 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 19:30:16.683857 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682544 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 19:30:16.683857 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682547 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 19:30:16.683857 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682549 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 19:30:16.683857 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682552 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 19:30:16.683857 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682554 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 19:30:16.684352 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682557 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 19:30:16.684352 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682559 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 19:30:16.684352 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682562 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 19:30:16.684352 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682564 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 19:30:16.684352 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682916 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 19:30:16.684352 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682921 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 19:30:16.684352 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682924 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 19:30:16.684352 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682927 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 19:30:16.684352 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682930 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 19:30:16.684352 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682933 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 19:30:16.684352 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682936 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 19:30:16.684352 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682938 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 19:30:16.684352 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682941 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 19:30:16.684352 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682944 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 19:30:16.684352 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682946 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 19:30:16.684352 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682949 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 19:30:16.684352 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682952 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 19:30:16.684352 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682955 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 19:30:16.684352 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682958 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 19:30:16.684352 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682960 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 19:30:16.684832 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682963 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 19:30:16.684832 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682965 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 19:30:16.684832 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682968 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 19:30:16.684832 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682972 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 19:30:16.684832 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682976 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 19:30:16.684832 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682980 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 19:30:16.684832 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682983 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 19:30:16.684832 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682986 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 19:30:16.684832 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682989 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 19:30:16.684832 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682992 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 19:30:16.684832 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682995 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 19:30:16.684832 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.682998 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 19:30:16.684832 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.683000 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 19:30:16.684832 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.683003 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 19:30:16.684832 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.683005 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 19:30:16.684832 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.683008 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 19:30:16.684832 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.683011 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 19:30:16.684832 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.683013 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 19:30:16.685304 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.683017 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 19:30:16.685304 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.683019 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 19:30:16.685304 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.683022 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 19:30:16.685304 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.683025 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 19:30:16.685304 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.683027 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 19:30:16.685304 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.683030 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 19:30:16.685304 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.683032 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 19:30:16.685304 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.683035 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 19:30:16.685304 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.683037 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 19:30:16.685304 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.683040 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 19:30:16.685304 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.683042 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 19:30:16.685304 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.683045 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 19:30:16.685304 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.683048 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 19:30:16.685304 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.683050 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 19:30:16.685304 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.683053 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 19:30:16.685304 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.683055 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 19:30:16.685304 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.683058 2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 19:30:16.685304 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.683060 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 19:30:16.685304 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.683063 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 19:30:16.685304 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.683065 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 19:30:16.685803 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.683068 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 19:30:16.685803 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.683070 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 19:30:16.685803 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.683073 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 19:30:16.685803 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.683075 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 19:30:16.685803 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.683078 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 19:30:16.685803 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.683080 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 19:30:16.685803 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.683083 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 19:30:16.685803 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.683085 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 19:30:16.685803 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.683088 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 19:30:16.685803 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.683091 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 19:30:16.685803 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.683093 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 19:30:16.685803 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.683096 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 19:30:16.685803 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.683098 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 19:30:16.685803 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.683101 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 19:30:16.685803 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.683103 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 19:30:16.685803 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.683106 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 19:30:16.685803 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.683108 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 19:30:16.685803 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.683112 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 19:30:16.685803 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.683114 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 19:30:16.685803 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.683117 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 19:30:16.685803 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.683119 2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 19:30:16.686333 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.683122 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 19:30:16.686333 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.683124 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 19:30:16.686333 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.683127 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 19:30:16.686333 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.683130 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 19:30:16.686333 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.683132 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 19:30:16.686333 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.683135 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 19:30:16.686333 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.683138 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 19:30:16.686333 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.683140 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 19:30:16.686333 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.683143 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 19:30:16.686333 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.683145 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 19:30:16.686333 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.683148 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 19:30:16.686333 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.683988 2579 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 19:30:16.686333 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.683997 2579 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 19:30:16.686333 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684004 2579 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 19:30:16.686333 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684009 2579 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 19:30:16.686333 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684013 2579 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 19:30:16.686333 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684016 2579 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 19:30:16.686333 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684021 2579 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 19:30:16.686333 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684025 2579 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 19:30:16.686333 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684029 2579 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 19:30:16.686333 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684033 2579 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 19:30:16.686846 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684037 2579 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 19:30:16.686846 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684040 2579 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 19:30:16.686846 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684043 2579 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 19:30:16.686846 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684046 2579 flags.go:64] FLAG: --cgroup-root=""
Apr 16 19:30:16.686846 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684049 2579 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 19:30:16.686846 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684052 2579 flags.go:64] FLAG: --client-ca-file=""
Apr 16 19:30:16.686846 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684055 2579 flags.go:64] FLAG: --cloud-config=""
Apr 16 19:30:16.686846 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684058 2579 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 19:30:16.686846 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684061 2579 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 19:30:16.686846 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684065 2579 flags.go:64] FLAG: --cluster-domain=""
Apr 16 19:30:16.686846 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684068 2579 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 19:30:16.686846 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684071 2579 flags.go:64] FLAG: --config-dir=""
Apr 16 19:30:16.686846 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684073 2579 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 19:30:16.686846 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684077 2579 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 19:30:16.686846 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684081 2579 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 19:30:16.686846 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684087 2579 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Apr 16 19:30:16.686846 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684090 2579 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Apr 16 19:30:16.686846 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684093 2579 flags.go:64] FLAG: --containerd-namespace="k8s.io" Apr 16 19:30:16.686846 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684096 2579 flags.go:64] FLAG: --contention-profiling="false" Apr 16 19:30:16.686846 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684099 2579 flags.go:64] FLAG: --cpu-cfs-quota="true" Apr 16 19:30:16.686846 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684102 2579 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Apr 16 19:30:16.686846 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684105 2579 flags.go:64] FLAG: --cpu-manager-policy="none" Apr 16 19:30:16.686846 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684108 2579 flags.go:64] FLAG: --cpu-manager-policy-options="" Apr 16 19:30:16.686846 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684112 2579 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Apr 16 19:30:16.686846 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684115 2579 flags.go:64] FLAG: --enable-controller-attach-detach="true" Apr 16 19:30:16.687486 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684118 2579 flags.go:64] FLAG: --enable-debugging-handlers="true" Apr 16 19:30:16.687486 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684121 2579 flags.go:64] FLAG: --enable-load-reader="false" Apr 16 19:30:16.687486 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684124 2579 flags.go:64] FLAG: --enable-server="true" Apr 16 19:30:16.687486 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684127 2579 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Apr 16 19:30:16.687486 
ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684132 2579 flags.go:64] FLAG: --event-burst="100" Apr 16 19:30:16.687486 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684135 2579 flags.go:64] FLAG: --event-qps="50" Apr 16 19:30:16.687486 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684139 2579 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 16 19:30:16.687486 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684142 2579 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 16 19:30:16.687486 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684145 2579 flags.go:64] FLAG: --eviction-hard="" Apr 16 19:30:16.687486 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684149 2579 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 16 19:30:16.687486 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684151 2579 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 16 19:30:16.687486 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684154 2579 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 16 19:30:16.687486 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684157 2579 flags.go:64] FLAG: --eviction-soft="" Apr 16 19:30:16.687486 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684160 2579 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 16 19:30:16.687486 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684163 2579 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 16 19:30:16.687486 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684166 2579 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 16 19:30:16.687486 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684169 2579 flags.go:64] FLAG: --experimental-mounter-path="" Apr 16 19:30:16.687486 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684172 2579 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 16 19:30:16.687486 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684175 2579 flags.go:64] 
FLAG: --fail-swap-on="true" Apr 16 19:30:16.687486 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684178 2579 flags.go:64] FLAG: --feature-gates="" Apr 16 19:30:16.687486 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684182 2579 flags.go:64] FLAG: --file-check-frequency="20s" Apr 16 19:30:16.687486 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684185 2579 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 16 19:30:16.687486 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684190 2579 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 19:30:16.687486 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684193 2579 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 19:30:16.687486 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684196 2579 flags.go:64] FLAG: --healthz-port="10248" Apr 16 19:30:16.687486 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684199 2579 flags.go:64] FLAG: --help="false" Apr 16 19:30:16.688156 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684216 2579 flags.go:64] FLAG: --hostname-override="ip-10-0-129-155.ec2.internal" Apr 16 19:30:16.688156 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684219 2579 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 19:30:16.688156 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684223 2579 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 19:30:16.688156 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684226 2579 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 19:30:16.688156 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684229 2579 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 19:30:16.688156 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684232 2579 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 19:30:16.688156 ip-10-0-129-155 kubenswrapper[2579]: 
I0416 19:30:16.684235 2579 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 19:30:16.688156 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684238 2579 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 19:30:16.688156 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684241 2579 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 19:30:16.688156 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684244 2579 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 19:30:16.688156 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684247 2579 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 19:30:16.688156 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684250 2579 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 19:30:16.688156 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684253 2579 flags.go:64] FLAG: --kube-reserved="" Apr 16 19:30:16.688156 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684256 2579 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 19:30:16.688156 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684259 2579 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 19:30:16.688156 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684262 2579 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 19:30:16.688156 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684265 2579 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 19:30:16.688156 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684268 2579 flags.go:64] FLAG: --lock-file="" Apr 16 19:30:16.688156 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684270 2579 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 19:30:16.688156 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684273 2579 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 19:30:16.688156 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684276 2579 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 
19:30:16.688156 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684281 2579 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 19:30:16.688156 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684284 2579 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 19:30:16.688764 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684287 2579 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 19:30:16.688764 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684290 2579 flags.go:64] FLAG: --logging-format="text" Apr 16 19:30:16.688764 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684293 2579 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 19:30:16.688764 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684296 2579 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 19:30:16.688764 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684299 2579 flags.go:64] FLAG: --manifest-url="" Apr 16 19:30:16.688764 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684303 2579 flags.go:64] FLAG: --manifest-url-header="" Apr 16 19:30:16.688764 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684307 2579 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 19:30:16.688764 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684311 2579 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 19:30:16.688764 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684315 2579 flags.go:64] FLAG: --max-pods="110" Apr 16 19:30:16.688764 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684318 2579 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 19:30:16.688764 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684321 2579 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 19:30:16.688764 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684324 2579 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 19:30:16.688764 ip-10-0-129-155 kubenswrapper[2579]: I0416 
19:30:16.684327 2579 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 19:30:16.688764 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684330 2579 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 19:30:16.688764 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684333 2579 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 19:30:16.688764 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684336 2579 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 19:30:16.688764 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684343 2579 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 19:30:16.688764 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684346 2579 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 19:30:16.688764 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684349 2579 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 19:30:16.688764 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684352 2579 flags.go:64] FLAG: --pod-cidr="" Apr 16 19:30:16.688764 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684355 2579 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 16 19:30:16.688764 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684362 2579 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 19:30:16.688764 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684365 2579 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 19:30:16.688764 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684368 2579 flags.go:64] FLAG: --pods-per-core="0" Apr 16 19:30:16.689378 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684371 2579 flags.go:64] FLAG: --port="10250" Apr 16 19:30:16.689378 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684374 2579 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 19:30:16.689378 ip-10-0-129-155 
kubenswrapper[2579]: I0416 19:30:16.684377 2579 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0fc99941c8320decd" Apr 16 19:30:16.689378 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684380 2579 flags.go:64] FLAG: --qos-reserved="" Apr 16 19:30:16.689378 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684383 2579 flags.go:64] FLAG: --read-only-port="10255" Apr 16 19:30:16.689378 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684386 2579 flags.go:64] FLAG: --register-node="true" Apr 16 19:30:16.689378 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684389 2579 flags.go:64] FLAG: --register-schedulable="true" Apr 16 19:30:16.689378 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684391 2579 flags.go:64] FLAG: --register-with-taints="" Apr 16 19:30:16.689378 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684395 2579 flags.go:64] FLAG: --registry-burst="10" Apr 16 19:30:16.689378 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684398 2579 flags.go:64] FLAG: --registry-qps="5" Apr 16 19:30:16.689378 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684401 2579 flags.go:64] FLAG: --reserved-cpus="" Apr 16 19:30:16.689378 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684404 2579 flags.go:64] FLAG: --reserved-memory="" Apr 16 19:30:16.689378 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684407 2579 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 19:30:16.689378 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684410 2579 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 19:30:16.689378 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684415 2579 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 19:30:16.689378 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684418 2579 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 19:30:16.689378 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684421 2579 flags.go:64] FLAG: --runonce="false" Apr 16 19:30:16.689378 ip-10-0-129-155 
kubenswrapper[2579]: I0416 19:30:16.684424 2579 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 19:30:16.689378 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684427 2579 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 19:30:16.689378 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684430 2579 flags.go:64] FLAG: --seccomp-default="false" Apr 16 19:30:16.689378 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684433 2579 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 19:30:16.689378 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684436 2579 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 19:30:16.689378 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684440 2579 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 19:30:16.689378 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684443 2579 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 19:30:16.689378 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684446 2579 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 19:30:16.689378 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684449 2579 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 19:30:16.690024 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684452 2579 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 19:30:16.690024 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684455 2579 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 19:30:16.690024 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684457 2579 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 19:30:16.690024 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684460 2579 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 19:30:16.690024 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684463 2579 flags.go:64] FLAG: --system-cgroups="" Apr 16 19:30:16.690024 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684466 2579 
flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 19:30:16.690024 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684471 2579 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 19:30:16.690024 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684474 2579 flags.go:64] FLAG: --tls-cert-file="" Apr 16 19:30:16.690024 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684477 2579 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 19:30:16.690024 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684480 2579 flags.go:64] FLAG: --tls-min-version="" Apr 16 19:30:16.690024 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684483 2579 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 19:30:16.690024 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684486 2579 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 19:30:16.690024 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684489 2579 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 19:30:16.690024 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684491 2579 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 19:30:16.690024 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684494 2579 flags.go:64] FLAG: --v="2" Apr 16 19:30:16.690024 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684499 2579 flags.go:64] FLAG: --version="false" Apr 16 19:30:16.690024 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684503 2579 flags.go:64] FLAG: --vmodule="" Apr 16 19:30:16.690024 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684507 2579 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 19:30:16.690024 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.684510 2579 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 19:30:16.690024 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.684604 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 
19:30:16.690024 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.684609 2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 19:30:16.690024 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.684612 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 19:30:16.690024 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.684615 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 19:30:16.690024 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.684618 2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 19:30:16.690718 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.684621 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 19:30:16.690718 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.684624 2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 19:30:16.690718 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.684627 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 19:30:16.690718 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.684629 2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 19:30:16.690718 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.684632 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 19:30:16.690718 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.684635 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 19:30:16.690718 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.684638 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 19:30:16.690718 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.684640 2579 feature_gate.go:328] unrecognized feature gate: Example Apr 16 19:30:16.690718 ip-10-0-129-155 kubenswrapper[2579]: W0416 
19:30:16.684643 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 19:30:16.690718 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.684646 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 19:30:16.690718 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.684648 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 19:30:16.690718 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.684651 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 19:30:16.690718 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.684653 2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 19:30:16.690718 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.684656 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 19:30:16.690718 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.684658 2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 19:30:16.690718 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.684661 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 19:30:16.690718 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.684664 2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 19:30:16.690718 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.684666 2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 19:30:16.690718 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.684669 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 19:30:16.690718 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.684671 2579 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 19:30:16.691243 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.684674 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 19:30:16.691243 ip-10-0-129-155 
kubenswrapper[2579]: W0416 19:30:16.684677 2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 19:30:16.691243 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.684679 2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 19:30:16.691243 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.684681 2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 19:30:16.691243 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.684684 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 19:30:16.691243 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.684686 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 19:30:16.691243 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.684689 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 19:30:16.691243 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.684691 2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 19:30:16.691243 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.684695 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 19:30:16.691243 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.684698 2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 19:30:16.691243 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.684701 2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 19:30:16.691243 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.684703 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 19:30:16.691243 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.684707 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 19:30:16.691243 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.684709 2579 feature_gate.go:328] unrecognized feature gate: 
DualReplica Apr 16 19:30:16.691243 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.684712 2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 19:30:16.691243 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.684714 2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 19:30:16.691243 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.684717 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 19:30:16.691243 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.684719 2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 19:30:16.691243 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.684722 2579 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 19:30:16.691243 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.684725 2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 19:30:16.691737 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.684727 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 19:30:16.691737 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.684730 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 19:30:16.691737 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.684733 2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 19:30:16.691737 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.684735 2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 19:30:16.691737 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.684738 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 19:30:16.691737 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.684740 2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 19:30:16.691737 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.684743 2579 feature_gate.go:328] 
unrecognized feature gate: IrreconcilableMachineConfig Apr 16 19:30:16.691737 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.684745 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 19:30:16.691737 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.684748 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 19:30:16.691737 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.684750 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 19:30:16.691737 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.684753 2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 19:30:16.691737 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.684755 2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 19:30:16.691737 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.684758 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 19:30:16.691737 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.684761 2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 19:30:16.691737 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.684763 2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 19:30:16.691737 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.684765 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 19:30:16.691737 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.684768 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 19:30:16.691737 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.684770 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 19:30:16.691737 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.684773 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 19:30:16.691737 
ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.684775 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 19:30:16.692520 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.684779 2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 19:30:16.692520 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.684782 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 19:30:16.692520 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.684784 2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 19:30:16.692520 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.684787 2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 19:30:16.692520 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.684791 2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 19:30:16.692520 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.684795 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 19:30:16.692520 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.684799 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 19:30:16.692520 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.684803 2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 19:30:16.692520 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.684806 2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 19:30:16.692520 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.684809 2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 19:30:16.692520 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.684812 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 19:30:16.692520 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.684814 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 19:30:16.692520 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.684817 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 19:30:16.692520 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.684820 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 19:30:16.692520 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.684823 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 19:30:16.692520 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.684825 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 19:30:16.692520 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.684827 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 19:30:16.692520 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.684830 2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 19:30:16.692520 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.684832 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 19:30:16.693089 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.684835 2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 19:30:16.693089 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.684838 2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 19:30:16.693089 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.685382 2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 19:30:16.693089 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.693077 2579 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 19:30:16.693198 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.693095 2579 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 19:30:16.693198 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693142 2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 19:30:16.693198 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693147 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 19:30:16.693198 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693151 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 19:30:16.693198 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693154 2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 19:30:16.693198 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693157 2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 19:30:16.693198 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693160 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 19:30:16.693198 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693163 2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 19:30:16.693198 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693166 2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 19:30:16.693198 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693168 2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 19:30:16.693198 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693171 2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 19:30:16.693198 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693173 2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 19:30:16.693198 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693176 2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 19:30:16.693198 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693178 2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 19:30:16.693198 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693181 2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 19:30:16.693198 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693183 2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 19:30:16.693198 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693186 2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 19:30:16.693198 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693189 2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 19:30:16.693198 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693192 2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 19:30:16.693198 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693194 2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 19:30:16.693700 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693197 2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 19:30:16.693700 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693199 2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 19:30:16.693700 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693216 2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 19:30:16.693700 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693221 2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 19:30:16.693700 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693226 2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 19:30:16.693700 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693228 2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 19:30:16.693700 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693231 2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 19:30:16.693700 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693234 2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 19:30:16.693700 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693237 2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 19:30:16.693700 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693239 2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 19:30:16.693700 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693242 2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 19:30:16.693700 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693244 2579 feature_gate.go:328] unrecognized feature gate: 
BootImageSkewEnforcement Apr 16 19:30:16.693700 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693247 2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 19:30:16.693700 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693250 2579 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 19:30:16.693700 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693252 2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 19:30:16.693700 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693255 2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 19:30:16.693700 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693257 2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 19:30:16.693700 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693260 2579 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 19:30:16.693700 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693263 2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 19:30:16.693700 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693266 2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 19:30:16.694246 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693270    2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 19:30:16.694246 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693273    2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 19:30:16.694246 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693276    2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 19:30:16.694246 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693278    2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 19:30:16.694246 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693281    2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 19:30:16.694246 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693284    2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 19:30:16.694246 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693286    2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 19:30:16.694246 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693288    2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 19:30:16.694246 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693291    2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 19:30:16.694246 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693294    2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 19:30:16.694246 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693296    2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 19:30:16.694246 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693298    2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 19:30:16.694246 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693301    2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 19:30:16.694246 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693304    2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 19:30:16.694246 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693307    2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 19:30:16.694246 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693311    2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 19:30:16.694246 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693316    2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 19:30:16.694246 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693319    2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 19:30:16.694246 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693322    2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 19:30:16.694730 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693324    2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 19:30:16.694730 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693327    2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 19:30:16.694730 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693329    2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 19:30:16.694730 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693332    2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 19:30:16.694730 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693335    2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 19:30:16.694730 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693337    2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 19:30:16.694730 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693339    2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 19:30:16.694730 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693342    2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 19:30:16.694730 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693344    2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 19:30:16.694730 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693347    2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 19:30:16.694730 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693349    2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 19:30:16.694730 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693352    2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 19:30:16.694730 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693354    2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 19:30:16.694730 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693357    2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 19:30:16.694730 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693359    2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 19:30:16.694730 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693362    2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 19:30:16.694730 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693364    2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 19:30:16.694730 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693367    2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 19:30:16.694730 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693371    2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 19:30:16.695198 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693373    2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 19:30:16.695198 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693376    2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 19:30:16.695198 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693378    2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 19:30:16.695198 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693381    2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 19:30:16.695198 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693383    2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 19:30:16.695198 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693386    2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 19:30:16.695198 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693388    2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 19:30:16.695198 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693392    2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 19:30:16.695198 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693395    2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 19:30:16.695198 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.693400    2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 19:30:16.695198 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693490    2579 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 19:30:16.695198 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693496    2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 19:30:16.695198 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693499    2579 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 19:30:16.695198 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693502    2579 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 19:30:16.695198 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693505    2579 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 19:30:16.695198 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693507    2579 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 19:30:16.695603 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693510    2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 19:30:16.695603 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693513    2579 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 19:30:16.695603 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693516    2579 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 19:30:16.695603 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693518    2579 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 19:30:16.695603 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693521    2579 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 19:30:16.695603 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693524    2579 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 19:30:16.695603 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693527    2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 19:30:16.695603 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693529    2579 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 19:30:16.695603 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693532    2579 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 19:30:16.695603 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693534    2579 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 19:30:16.695603 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693538    2579 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 19:30:16.695603 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693542    2579 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 19:30:16.695603 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693545    2579 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 19:30:16.695603 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693548    2579 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 19:30:16.695603 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693550    2579 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 19:30:16.695603 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693553    2579 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 19:30:16.695603 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693557    2579 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 19:30:16.695603 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693559    2579 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 19:30:16.695603 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693562    2579 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 19:30:16.695603 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693564    2579 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 19:30:16.696092 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693567    2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 19:30:16.696092 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693570    2579 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 19:30:16.696092 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693572    2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 19:30:16.696092 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693576    2579 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 19:30:16.696092 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693579    2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 19:30:16.696092 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693582    2579 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 19:30:16.696092 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693585    2579 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 19:30:16.696092 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693588    2579 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 19:30:16.696092 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693590    2579 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 19:30:16.696092 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693593    2579 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 19:30:16.696092 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693595    2579 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 19:30:16.696092 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693598    2579 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 19:30:16.696092 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693600    2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 19:30:16.696092 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693603    2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 19:30:16.696092 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693605    2579 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 19:30:16.696092 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693608    2579 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 19:30:16.696092 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693610    2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 19:30:16.696092 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693613    2579 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 19:30:16.696092 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693616    2579 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 19:30:16.696569 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693618    2579 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 19:30:16.696569 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693621    2579 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 19:30:16.696569 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693623    2579 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 19:30:16.696569 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693626    2579 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 19:30:16.696569 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693628    2579 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 19:30:16.696569 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693631    2579 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 19:30:16.696569 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693633    2579 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 19:30:16.696569 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693636    2579 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 19:30:16.696569 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693638    2579 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 19:30:16.696569 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693641    2579 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 19:30:16.696569 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693644    2579 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 19:30:16.696569 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693647    2579 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 19:30:16.696569 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693649    2579 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 19:30:16.696569 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693652    2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 19:30:16.696569 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693655    2579 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 19:30:16.696569 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693657    2579 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 19:30:16.696569 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693660    2579 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 19:30:16.696569 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693662    2579 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 19:30:16.696569 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693665    2579 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 19:30:16.696569 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693667    2579 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 19:30:16.697055 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693670    2579 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 19:30:16.697055 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693673    2579 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 19:30:16.697055 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693675    2579 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 19:30:16.697055 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693678    2579 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 19:30:16.697055 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693680    2579 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 19:30:16.697055 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693683    2579 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 19:30:16.697055 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693685    2579 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 19:30:16.697055 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693688    2579 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 19:30:16.697055 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693690    2579 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 19:30:16.697055 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693693    2579 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 19:30:16.697055 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693695    2579 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 19:30:16.697055 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693698    2579 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 19:30:16.697055 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693701    2579 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 19:30:16.697055 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693703    2579 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 19:30:16.697055 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693706    2579 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 19:30:16.697055 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693709    2579 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 19:30:16.697055 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693711    2579 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 19:30:16.697055 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693714    2579 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 19:30:16.697055 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693717    2579 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 19:30:16.697642 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693719    2579 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 19:30:16.697642 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:16.693722    2579 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 19:30:16.697642 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.693727    2579 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 19:30:16.697642 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.694379    2579 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 19:30:16.697642 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.697495    2579 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 19:30:16.698382 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.698371    2579 server.go:1019] "Starting client certificate rotation"
Apr 16 19:30:16.698486 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.698468    2579 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 19:30:16.698521 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.698512    2579 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 19:30:16.719451 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.719434    2579 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 19:30:16.724962 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.724941    2579 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 19:30:16.738308 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.738291    2579 log.go:25] "Validated CRI v1 runtime API"
Apr 16 19:30:16.743142 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.743126    2579 log.go:25] "Validated CRI v1 image API"
Apr 16 19:30:16.745879 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.745867    2579 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 19:30:16.747463 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.747448    2579 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 19:30:16.748667 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.748637    2579 fs.go:135] Filesystem UUIDs: map[0419cd3e-c1bf-4700-b1ab-424294708d21:/dev/nvme0n1p4 409a759d-8686-4207-85fd-fad1cdc9f5e0:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2]
Apr 16 19:30:16.748705 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.748668    2579 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 19:30:16.756197 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.756081    2579 manager.go:217] Machine: {Timestamp:2026-04-16 19:30:16.754420713 +0000 UTC m=+0.331198916 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3112495 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec21a1cd9711d4798cab123c1aa18dc1 SystemUUID:ec21a1cd-9711-d479-8cab-123c1aa18dc1 BootID:d6b01b61-4069-40fb-acdc-4e4e7d73aff4 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:23:9e:9d:b7:0d Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:23:9e:9d:b7:0d Speed:0 Mtu:9001} {Name:ovs-system MacAddress:ca:67:27:65:dc:04 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 19:30:16.756197 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.756190    2579 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 19:30:16.756328 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.756281    2579 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 19:30:16.758354 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.758330    2579 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 19:30:16.758488 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.758358    2579 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-129-155.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Qu
antity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 16 19:30:16.758535 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.758498 2579 topology_manager.go:138] "Creating topology manager with none policy" Apr 16 19:30:16.758535 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.758506 2579 container_manager_linux.go:306] "Creating device plugin manager" Apr 16 19:30:16.758535 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.758520 2579 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 19:30:16.759104 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.759094 2579 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 19:30:16.760342 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.760332 2579 state_mem.go:36] "Initialized new in-memory state store" Apr 16 19:30:16.760444 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.760436 2579 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 16 19:30:16.762536 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.762527 2579 kubelet.go:491] "Attempting to sync node with API server" Apr 16 19:30:16.762579 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.762540 2579 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 16 19:30:16.762579 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.762552 2579 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 16 19:30:16.762579 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.762561 2579 
kubelet.go:397] "Adding apiserver pod source" Apr 16 19:30:16.762579 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.762569 2579 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 16 19:30:16.763490 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.763478 2579 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 19:30:16.763529 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.763498 2579 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 19:30:16.765929 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.765913 2579 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 19:30:16.767252 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.767239 2579 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 19:30:16.768838 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.768828 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 19:30:16.768874 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.768844 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 19:30:16.768874 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.768850 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 19:30:16.768874 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.768856 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 19:30:16.768874 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.768862 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 19:30:16.768874 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.768867 2579 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/secret" Apr 16 19:30:16.768874 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.768872 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 19:30:16.769041 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.768878 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 19:30:16.769041 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.768891 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 19:30:16.769041 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.768898 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 19:30:16.769041 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.768922 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 19:30:16.769041 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.768931 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 19:30:16.770134 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.770122 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 19:30:16.770170 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.770140 2579 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 19:30:16.773840 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.773825 2579 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 19:30:16.773910 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.773860 2579 server.go:1295] "Started kubelet" Apr 16 19:30:16.774010 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.773964 2579 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 19:30:16.774010 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.773974 2579 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 19:30:16.774113 ip-10-0-129-155 kubenswrapper[2579]: I0416 
19:30:16.774024 2579 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 19:30:16.774673 ip-10-0-129-155 systemd[1]: Started Kubernetes Kubelet. Apr 16 19:30:16.775043 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.775018 2579 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 19:30:16.775865 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.775836 2579 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-129-155.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 19:30:16.775865 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:16.775838 2579 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 19:30:16.775998 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:16.775891 2579 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-129-155.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 19:30:16.775998 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.775956 2579 server.go:317] "Adding debug handlers to kubelet server" Apr 16 19:30:16.779938 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.779918 2579 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 19:30:16.780426 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.780409 2579 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 19:30:16.781343 
ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:16.781177 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-155.ec2.internal\" not found" Apr 16 19:30:16.781343 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.781316 2579 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 19:30:16.781343 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.781321 2579 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 19:30:16.781531 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.781352 2579 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 19:30:16.781531 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.781467 2579 factory.go:55] Registering systemd factory Apr 16 19:30:16.781531 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.781484 2579 factory.go:223] Registration of the systemd container factory successfully Apr 16 19:30:16.781531 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.781487 2579 reconstruct.go:97] "Volume reconstruction finished" Apr 16 19:30:16.781531 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.781494 2579 reconciler.go:26] "Reconciler: start to sync state" Apr 16 19:30:16.782431 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:16.782369 2579 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-129-155.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 16 19:30:16.783978 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:16.782353 2579 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-129-155.ec2.internal.18a6ed1e253d9f96 default 0 0001-01-01 00:00:00 +0000 UTC 
map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-129-155.ec2.internal,UID:ip-10-0-129-155.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-129-155.ec2.internal,},FirstTimestamp:2026-04-16 19:30:16.773836694 +0000 UTC m=+0.350614896,LastTimestamp:2026-04-16 19:30:16.773836694 +0000 UTC m=+0.350614896,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-129-155.ec2.internal,}" Apr 16 19:30:16.784689 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.784670 2579 factory.go:153] Registering CRI-O factory Apr 16 19:30:16.784689 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.784690 2579 factory.go:223] Registration of the crio container factory successfully Apr 16 19:30:16.784814 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.784779 2579 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 19:30:16.784814 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.784808 2579 factory.go:103] Registering Raw factory Apr 16 19:30:16.784928 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.784823 2579 manager.go:1196] Started watching for new ooms in manager Apr 16 19:30:16.785283 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.785263 2579 manager.go:319] Starting recovery of all containers Apr 16 19:30:16.785692 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:16.785670 2579 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 19:30:16.791383 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:16.791354 2579 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 16 19:30:16.794402 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.794381 2579 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-4jxfr" Apr 16 19:30:16.797170 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.797151 2579 manager.go:324] Recovery completed Apr 16 19:30:16.798395 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:16.798379 2579 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory Apr 16 19:30:16.799769 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.799754 2579 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-4jxfr" Apr 16 19:30:16.801132 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.801120 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 19:30:16.803494 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.803481 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-155.ec2.internal" event="NodeHasSufficientMemory" Apr 16 19:30:16.803556 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.803507 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-155.ec2.internal" 
event="NodeHasNoDiskPressure" Apr 16 19:30:16.803556 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.803517 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-155.ec2.internal" event="NodeHasSufficientPID" Apr 16 19:30:16.803967 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.803954 2579 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 19:30:16.803967 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.803964 2579 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 19:30:16.804063 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.803978 2579 state_mem.go:36] "Initialized new in-memory state store" Apr 16 19:30:16.807430 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.807418 2579 policy_none.go:49] "None policy: Start" Apr 16 19:30:16.807466 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.807435 2579 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 19:30:16.807466 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.807460 2579 state_mem.go:35] "Initializing new in-memory state store" Apr 16 19:30:16.860520 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.851757 2579 manager.go:341] "Starting Device Plugin manager" Apr 16 19:30:16.860520 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:16.851790 2579 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 19:30:16.860520 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.851800 2579 server.go:85] "Starting device plugin registration server" Apr 16 19:30:16.860520 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.852009 2579 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 19:30:16.860520 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.852022 2579 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 19:30:16.860520 ip-10-0-129-155 kubenswrapper[2579]: 
I0416 19:30:16.852115 2579 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 19:30:16.860520 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.852186 2579 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 19:30:16.860520 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.852194 2579 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 19:30:16.860520 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:16.852647 2579 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 16 19:30:16.860520 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:16.852681 2579 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-129-155.ec2.internal\" not found" Apr 16 19:30:16.937480 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.937406 2579 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 19:30:16.938552 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.938536 2579 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 16 19:30:16.938619 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.938561 2579 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 19:30:16.938619 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.938581 2579 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 16 19:30:16.938619 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.938588 2579 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 19:30:16.938619 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:16.938616 2579 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 19:30:16.941415 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.941395 2579 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 19:30:16.952250 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.952233 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 19:30:16.953035 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.953018 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-155.ec2.internal" event="NodeHasSufficientMemory" Apr 16 19:30:16.953112 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.953049 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-155.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 19:30:16.953112 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.953060 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-155.ec2.internal" event="NodeHasSufficientPID" Apr 16 19:30:16.953112 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.953083 2579 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-129-155.ec2.internal" Apr 16 19:30:16.962051 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:16.962034 2579 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-129-155.ec2.internal" Apr 16 19:30:16.962106 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:16.962053 2579 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-129-155.ec2.internal\": node \"ip-10-0-129-155.ec2.internal\" not found" Apr 16 
19:30:16.975317 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:16.975293 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-155.ec2.internal\" not found" Apr 16 19:30:17.039482 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.039448 2579 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-155.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-129-155.ec2.internal"] Apr 16 19:30:17.039606 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.039540 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 19:30:17.040412 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.040395 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-155.ec2.internal" event="NodeHasSufficientMemory" Apr 16 19:30:17.040472 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.040427 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-155.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 19:30:17.040472 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.040438 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-155.ec2.internal" event="NodeHasSufficientPID" Apr 16 19:30:17.042699 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.042687 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 19:30:17.042833 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.042817 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-155.ec2.internal" Apr 16 19:30:17.042868 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.042849 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 19:30:17.043423 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.043407 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-155.ec2.internal" event="NodeHasSufficientMemory" Apr 16 19:30:17.043525 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.043434 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-155.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 19:30:17.043525 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.043407 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-155.ec2.internal" event="NodeHasSufficientMemory" Apr 16 19:30:17.043525 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.043444 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-155.ec2.internal" event="NodeHasSufficientPID" Apr 16 19:30:17.043525 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.043462 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-155.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 19:30:17.043525 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.043476 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-155.ec2.internal" event="NodeHasSufficientPID" Apr 16 19:30:17.045669 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.045653 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-155.ec2.internal" Apr 16 19:30:17.045719 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.045682 2579 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 19:30:17.046348 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.046332 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-155.ec2.internal" event="NodeHasSufficientMemory" Apr 16 19:30:17.046432 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.046362 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-155.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 19:30:17.046432 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.046376 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-155.ec2.internal" event="NodeHasSufficientPID" Apr 16 19:30:17.060745 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:17.060727 2579 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-155.ec2.internal\" not found" node="ip-10-0-129-155.ec2.internal" Apr 16 19:30:17.064950 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:17.064927 2579 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-155.ec2.internal\" not found" node="ip-10-0-129-155.ec2.internal" Apr 16 19:30:17.075999 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:17.075983 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-155.ec2.internal\" not found" Apr 16 19:30:17.083059 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.083045 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/be5a5919c15e8acb1a91eca10420db96-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-129-155.ec2.internal\" (UID: \"be5a5919c15e8acb1a91eca10420db96\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-155.ec2.internal" Apr 16 19:30:17.083102 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.083074 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/be5a5919c15e8acb1a91eca10420db96-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-155.ec2.internal\" (UID: \"be5a5919c15e8acb1a91eca10420db96\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-155.ec2.internal" Apr 16 19:30:17.083102 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.083094 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ad3ef22c0a8b4cbdc51ac89991654b86-config\") pod \"kube-apiserver-proxy-ip-10-0-129-155.ec2.internal\" (UID: \"ad3ef22c0a8b4cbdc51ac89991654b86\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-155.ec2.internal" Apr 16 19:30:17.176434 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:17.176411 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-155.ec2.internal\" not found" Apr 16 19:30:17.183760 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.183741 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/be5a5919c15e8acb1a91eca10420db96-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-155.ec2.internal\" (UID: \"be5a5919c15e8acb1a91eca10420db96\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-155.ec2.internal" Apr 16 19:30:17.183833 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.183768 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/be5a5919c15e8acb1a91eca10420db96-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-155.ec2.internal\" (UID: \"be5a5919c15e8acb1a91eca10420db96\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-155.ec2.internal" Apr 16 19:30:17.183833 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.183785 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ad3ef22c0a8b4cbdc51ac89991654b86-config\") pod \"kube-apiserver-proxy-ip-10-0-129-155.ec2.internal\" (UID: \"ad3ef22c0a8b4cbdc51ac89991654b86\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-155.ec2.internal" Apr 16 19:30:17.183833 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.183811 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ad3ef22c0a8b4cbdc51ac89991654b86-config\") pod \"kube-apiserver-proxy-ip-10-0-129-155.ec2.internal\" (UID: \"ad3ef22c0a8b4cbdc51ac89991654b86\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-155.ec2.internal" Apr 16 19:30:17.183936 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.183838 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/be5a5919c15e8acb1a91eca10420db96-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-155.ec2.internal\" (UID: \"be5a5919c15e8acb1a91eca10420db96\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-155.ec2.internal" Apr 16 19:30:17.183936 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.183840 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/be5a5919c15e8acb1a91eca10420db96-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-155.ec2.internal\" (UID: \"be5a5919c15e8acb1a91eca10420db96\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-155.ec2.internal" Apr 16 19:30:17.277188 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:17.277129 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-155.ec2.internal\" not found" Apr 16 19:30:17.362706 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.362670 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-155.ec2.internal" Apr 16 19:30:17.367268 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.367254 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-155.ec2.internal" Apr 16 19:30:17.377615 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:17.377594 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-155.ec2.internal\" not found" Apr 16 19:30:17.477719 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:17.477688 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-155.ec2.internal\" not found" Apr 16 19:30:17.578302 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:17.578229 2579 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-155.ec2.internal\" not found" Apr 16 19:30:17.591021 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.591006 2579 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 19:30:17.677907 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.677881 2579 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 19:30:17.680731 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.680714 2579 kubelet.go:3340] "Creating a mirror pod for static pod" 
pod="kube-system/kube-apiserver-proxy-ip-10-0-129-155.ec2.internal" Apr 16 19:30:17.695957 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.695937 2579 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 19:30:17.697330 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.697318 2579 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-155.ec2.internal" Apr 16 19:30:17.698511 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.698497 2579 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 16 19:30:17.698636 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.698620 2579 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 19:30:17.698707 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.698659 2579 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 19:30:17.698707 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:17.698683 2579 kubelet.go:3342] "Failed creating a mirror pod" err="Post \"https://ae2a61e8933b44875b465826cd149d6d-86a1e1a8389b150a.elb.us-east-1.amazonaws.com:6443/api/v1/namespaces/openshift-machine-config-operator/pods\": read tcp 10.0.129.155:34692->34.237.225.80:6443: use of closed network connection" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-155.ec2.internal" Apr 16 19:30:17.763266 ip-10-0-129-155 
kubenswrapper[2579]: I0416 19:30:17.763236 2579 apiserver.go:52] "Watching apiserver" Apr 16 19:30:17.773922 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.773898 2579 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 16 19:30:17.774260 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.774238 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-nh8b7","kube-system/kube-apiserver-proxy-ip-10-0-129-155.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5sjqr","openshift-image-registry/node-ca-l5xrl","openshift-multus/multus-additional-cni-plugins-whspd","openshift-multus/multus-ncrmg","openshift-network-operator/iptables-alerter-j45kn","openshift-ovn-kubernetes/ovnkube-node-mx7dg","openshift-cluster-node-tuning-operator/tuned-bppgj","openshift-multus/network-metrics-daemon-7mh9f","openshift-network-diagnostics/network-check-target-6glh4"] Apr 16 19:30:17.777361 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.777346 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-nh8b7" Apr 16 19:30:17.779885 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.779870 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 19:30:17.780012 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.779992 2579 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 16 19:30:17.780372 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.780354 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 19:30:17.780491 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.780434 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-mpp7w\"" Apr 16 19:30:17.781533 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.781516 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5sjqr" Apr 16 19:30:17.781623 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.781600 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-l5xrl" Apr 16 19:30:17.783794 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.783779 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-whspd" Apr 16 19:30:17.784090 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.784075 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 19:30:17.784150 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.784110 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-67p2p\"" Apr 16 19:30:17.784184 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.784170 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 19:30:17.784322 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.784307 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-4kztg\"" Apr 16 19:30:17.784396 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.784322 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 19:30:17.784396 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.784348 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 19:30:17.784464 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.784404 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 19:30:17.784555 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.784542 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 19:30:17.786003 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.785979 2579 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-ncrmg" Apr 16 19:30:17.786003 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.785995 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 19:30:17.786328 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.786311 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 16 19:30:17.786328 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.786321 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 19:30:17.786447 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.786390 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 16 19:30:17.786577 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.786565 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 19:30:17.787109 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.787080 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6c0704bd-2f0f-4e78-8573-cf9346b4ae16-os-release\") pod \"multus-additional-cni-plugins-whspd\" (UID: \"6c0704bd-2f0f-4e78-8573-cf9346b4ae16\") " pod="openshift-multus/multus-additional-cni-plugins-whspd" Apr 16 19:30:17.787246 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.787117 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/6c0704bd-2f0f-4e78-8573-cf9346b4ae16-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-whspd\" (UID: 
\"6c0704bd-2f0f-4e78-8573-cf9346b4ae16\") " pod="openshift-multus/multus-additional-cni-plugins-whspd" Apr 16 19:30:17.787246 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.787151 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/d478cc2e-78cb-4140-9eaf-2624faf8382b-agent-certs\") pod \"konnectivity-agent-nh8b7\" (UID: \"d478cc2e-78cb-4140-9eaf-2624faf8382b\") " pod="kube-system/konnectivity-agent-nh8b7" Apr 16 19:30:17.787246 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.787171 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-m7qsg\"" Apr 16 19:30:17.787246 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.787199 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6c0704bd-2f0f-4e78-8573-cf9346b4ae16-system-cni-dir\") pod \"multus-additional-cni-plugins-whspd\" (UID: \"6c0704bd-2f0f-4e78-8573-cf9346b4ae16\") " pod="openshift-multus/multus-additional-cni-plugins-whspd" Apr 16 19:30:17.787428 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.787250 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6c0704bd-2f0f-4e78-8573-cf9346b4ae16-tuning-conf-dir\") pod \"multus-additional-cni-plugins-whspd\" (UID: \"6c0704bd-2f0f-4e78-8573-cf9346b4ae16\") " pod="openshift-multus/multus-additional-cni-plugins-whspd" Apr 16 19:30:17.787428 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.787271 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/d478cc2e-78cb-4140-9eaf-2624faf8382b-konnectivity-ca\") pod \"konnectivity-agent-nh8b7\" (UID: 
\"d478cc2e-78cb-4140-9eaf-2624faf8382b\") " pod="kube-system/konnectivity-agent-nh8b7" Apr 16 19:30:17.787428 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.787298 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/09889d6a-515b-4fb3-acfb-41009ecb5107-socket-dir\") pod \"aws-ebs-csi-driver-node-5sjqr\" (UID: \"09889d6a-515b-4fb3-acfb-41009ecb5107\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5sjqr" Apr 16 19:30:17.787428 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.787330 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/09889d6a-515b-4fb3-acfb-41009ecb5107-sys-fs\") pod \"aws-ebs-csi-driver-node-5sjqr\" (UID: \"09889d6a-515b-4fb3-acfb-41009ecb5107\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5sjqr" Apr 16 19:30:17.787428 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.787346 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-482ls\" (UniqueName: \"kubernetes.io/projected/09889d6a-515b-4fb3-acfb-41009ecb5107-kube-api-access-482ls\") pod \"aws-ebs-csi-driver-node-5sjqr\" (UID: \"09889d6a-515b-4fb3-acfb-41009ecb5107\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5sjqr" Apr 16 19:30:17.787428 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.787361 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/09889d6a-515b-4fb3-acfb-41009ecb5107-registration-dir\") pod \"aws-ebs-csi-driver-node-5sjqr\" (UID: \"09889d6a-515b-4fb3-acfb-41009ecb5107\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5sjqr" Apr 16 19:30:17.787428 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.787375 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/09889d6a-515b-4fb3-acfb-41009ecb5107-device-dir\") pod \"aws-ebs-csi-driver-node-5sjqr\" (UID: \"09889d6a-515b-4fb3-acfb-41009ecb5107\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5sjqr" Apr 16 19:30:17.787428 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.787388 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9859a426-6968-44ca-b63e-42baba2b957d-host\") pod \"node-ca-l5xrl\" (UID: \"9859a426-6968-44ca-b63e-42baba2b957d\") " pod="openshift-image-registry/node-ca-l5xrl" Apr 16 19:30:17.787428 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.787404 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9859a426-6968-44ca-b63e-42baba2b957d-serviceca\") pod \"node-ca-l5xrl\" (UID: \"9859a426-6968-44ca-b63e-42baba2b957d\") " pod="openshift-image-registry/node-ca-l5xrl" Apr 16 19:30:17.787758 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.787435 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hn4zw\" (UniqueName: \"kubernetes.io/projected/9859a426-6968-44ca-b63e-42baba2b957d-kube-api-access-hn4zw\") pod \"node-ca-l5xrl\" (UID: \"9859a426-6968-44ca-b63e-42baba2b957d\") " pod="openshift-image-registry/node-ca-l5xrl" Apr 16 19:30:17.787758 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.787464 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6c0704bd-2f0f-4e78-8573-cf9346b4ae16-cni-binary-copy\") pod \"multus-additional-cni-plugins-whspd\" (UID: \"6c0704bd-2f0f-4e78-8573-cf9346b4ae16\") " pod="openshift-multus/multus-additional-cni-plugins-whspd" 
Apr 16 19:30:17.787758 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.787482 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6c0704bd-2f0f-4e78-8573-cf9346b4ae16-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-whspd\" (UID: \"6c0704bd-2f0f-4e78-8573-cf9346b4ae16\") " pod="openshift-multus/multus-additional-cni-plugins-whspd" Apr 16 19:30:17.787758 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.787515 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6k6n\" (UniqueName: \"kubernetes.io/projected/6c0704bd-2f0f-4e78-8573-cf9346b4ae16-kube-api-access-d6k6n\") pod \"multus-additional-cni-plugins-whspd\" (UID: \"6c0704bd-2f0f-4e78-8573-cf9346b4ae16\") " pod="openshift-multus/multus-additional-cni-plugins-whspd" Apr 16 19:30:17.787758 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.787558 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/09889d6a-515b-4fb3-acfb-41009ecb5107-kubelet-dir\") pod \"aws-ebs-csi-driver-node-5sjqr\" (UID: \"09889d6a-515b-4fb3-acfb-41009ecb5107\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5sjqr" Apr 16 19:30:17.787758 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.787584 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/09889d6a-515b-4fb3-acfb-41009ecb5107-etc-selinux\") pod \"aws-ebs-csi-driver-node-5sjqr\" (UID: \"09889d6a-515b-4fb3-acfb-41009ecb5107\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5sjqr" Apr 16 19:30:17.787758 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.787617 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"cnibin\" (UniqueName: \"kubernetes.io/host-path/6c0704bd-2f0f-4e78-8573-cf9346b4ae16-cnibin\") pod \"multus-additional-cni-plugins-whspd\" (UID: \"6c0704bd-2f0f-4e78-8573-cf9346b4ae16\") " pod="openshift-multus/multus-additional-cni-plugins-whspd" Apr 16 19:30:17.788000 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.787967 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 16 19:30:17.788233 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.788221 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-j45kn" Apr 16 19:30:17.788424 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.788405 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-fqv6z\"" Apr 16 19:30:17.790366 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.790347 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 19:30:17.790526 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.790512 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 19:30:17.790579 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.790536 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mx7dg" Apr 16 19:30:17.790579 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.790572 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-8zsmq\"" Apr 16 19:30:17.790983 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.790970 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 19:30:17.792818 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.792801 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-bppgj" Apr 16 19:30:17.793182 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.793162 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 19:30:17.793514 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.793495 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 19:30:17.793592 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.793581 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 19:30:17.793796 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.793781 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-dkf26\"" Apr 16 19:30:17.794745 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.794726 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 19:30:17.794812 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.794756 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 19:30:17.795089 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.795072 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7mh9f" Apr 16 19:30:17.795159 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:17.795123 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7mh9f" podUID="ca23e8db-bb88-449f-8286-27f2978eb0ca" Apr 16 19:30:17.795646 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.795625 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 19:30:17.795727 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.795660 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 19:30:17.795727 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.795694 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-2zhxj\"" Apr 16 19:30:17.797277 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.797264 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6glh4" Apr 16 19:30:17.797326 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:17.797308 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6glh4" podUID="17f1ed1c-fff7-4d09-b029-890217b6c115" Apr 16 19:30:17.799582 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.799563 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 19:30:17.801604 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.801574 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 19:25:16 +0000 UTC" deadline="2027-09-28 21:55:17.837558297 +0000 UTC" Apr 16 19:30:17.801604 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.801602 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12722h25m0.035959124s" Apr 16 19:30:17.804762 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.804744 2579 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 19:30:17.841776 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.841758 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-ls7dc"] Apr 16 19:30:17.843641 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.843623 2579 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-g8wvn" Apr 16 19:30:17.844490 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.844474 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-ls7dc" Apr 16 19:30:17.844573 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:17.844539 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ls7dc" podUID="afd027c2-990e-4d6c-b57c-62c9c66ce5f2" Apr 16 19:30:17.852743 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.852725 2579 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-g8wvn" Apr 16 19:30:17.882182 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.882165 2579 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 19:30:17.888331 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.888311 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/09889d6a-515b-4fb3-acfb-41009ecb5107-device-dir\") pod \"aws-ebs-csi-driver-node-5sjqr\" (UID: \"09889d6a-515b-4fb3-acfb-41009ecb5107\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5sjqr" Apr 16 19:30:17.888411 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.888341 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95wbx\" (UniqueName: \"kubernetes.io/projected/4b53a341-a257-4e51-866a-7aaefe569885-kube-api-access-95wbx\") pod \"multus-ncrmg\" (UID: \"4b53a341-a257-4e51-866a-7aaefe569885\") " pod="openshift-multus/multus-ncrmg" Apr 16 19:30:17.888411 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.888357 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/b04a0377-865b-45eb-b85e-384e518a7c12-var-lib-kubelet\") pod \"tuned-bppgj\" (UID: \"b04a0377-865b-45eb-b85e-384e518a7c12\") " pod="openshift-cluster-node-tuning-operator/tuned-bppgj" Apr 16 19:30:17.888411 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.888393 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/09889d6a-515b-4fb3-acfb-41009ecb5107-kubelet-dir\") pod \"aws-ebs-csi-driver-node-5sjqr\" (UID: \"09889d6a-515b-4fb3-acfb-41009ecb5107\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5sjqr" Apr 16 19:30:17.888501 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.888419 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/4b53a341-a257-4e51-866a-7aaefe569885-multus-socket-dir-parent\") pod \"multus-ncrmg\" (UID: \"4b53a341-a257-4e51-866a-7aaefe569885\") " pod="openshift-multus/multus-ncrmg" Apr 16 19:30:17.888501 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.888429 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/09889d6a-515b-4fb3-acfb-41009ecb5107-device-dir\") pod \"aws-ebs-csi-driver-node-5sjqr\" (UID: \"09889d6a-515b-4fb3-acfb-41009ecb5107\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5sjqr" Apr 16 19:30:17.888501 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.888436 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/4b53a341-a257-4e51-866a-7aaefe569885-host-run-k8s-cni-cncf-io\") pod \"multus-ncrmg\" (UID: \"4b53a341-a257-4e51-866a-7aaefe569885\") " pod="openshift-multus/multus-ncrmg" Apr 16 19:30:17.888501 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.888466 2579 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/09889d6a-515b-4fb3-acfb-41009ecb5107-kubelet-dir\") pod \"aws-ebs-csi-driver-node-5sjqr\" (UID: \"09889d6a-515b-4fb3-acfb-41009ecb5107\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5sjqr" Apr 16 19:30:17.888501 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.888484 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/4b53a341-a257-4e51-866a-7aaefe569885-host-run-multus-certs\") pod \"multus-ncrmg\" (UID: \"4b53a341-a257-4e51-866a-7aaefe569885\") " pod="openshift-multus/multus-ncrmg" Apr 16 19:30:17.888640 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.888504 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b04a0377-865b-45eb-b85e-384e518a7c12-tmp\") pod \"tuned-bppgj\" (UID: \"b04a0377-865b-45eb-b85e-384e518a7c12\") " pod="openshift-cluster-node-tuning-operator/tuned-bppgj" Apr 16 19:30:17.888640 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.888519 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f6b1a716-a116-40d7-bd7e-8947f3cfea04-run-ovn\") pod \"ovnkube-node-mx7dg\" (UID: \"f6b1a716-a116-40d7-bd7e-8947f3cfea04\") " pod="openshift-ovn-kubernetes/ovnkube-node-mx7dg" Apr 16 19:30:17.888640 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.888537 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/d478cc2e-78cb-4140-9eaf-2624faf8382b-agent-certs\") pod \"konnectivity-agent-nh8b7\" (UID: \"d478cc2e-78cb-4140-9eaf-2624faf8382b\") " pod="kube-system/konnectivity-agent-nh8b7" Apr 16 19:30:17.888640 ip-10-0-129-155 
kubenswrapper[2579]: I0416 19:30:17.888553 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6c0704bd-2f0f-4e78-8573-cf9346b4ae16-tuning-conf-dir\") pod \"multus-additional-cni-plugins-whspd\" (UID: \"6c0704bd-2f0f-4e78-8573-cf9346b4ae16\") " pod="openshift-multus/multus-additional-cni-plugins-whspd"
Apr 16 19:30:17.888640 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.888576 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/afd027c2-990e-4d6c-b57c-62c9c66ce5f2-kubelet-config\") pod \"global-pull-secret-syncer-ls7dc\" (UID: \"afd027c2-990e-4d6c-b57c-62c9c66ce5f2\") " pod="kube-system/global-pull-secret-syncer-ls7dc"
Apr 16 19:30:17.888640 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.888591 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f6b1a716-a116-40d7-bd7e-8947f3cfea04-host-kubelet\") pod \"ovnkube-node-mx7dg\" (UID: \"f6b1a716-a116-40d7-bd7e-8947f3cfea04\") " pod="openshift-ovn-kubernetes/ovnkube-node-mx7dg"
Apr 16 19:30:17.888640 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.888606 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f6b1a716-a116-40d7-bd7e-8947f3cfea04-host-cni-bin\") pod \"ovnkube-node-mx7dg\" (UID: \"f6b1a716-a116-40d7-bd7e-8947f3cfea04\") " pod="openshift-ovn-kubernetes/ovnkube-node-mx7dg"
Apr 16 19:30:17.888640 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.888627 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f6b1a716-a116-40d7-bd7e-8947f3cfea04-host-cni-netd\") pod \"ovnkube-node-mx7dg\" (UID: \"f6b1a716-a116-40d7-bd7e-8947f3cfea04\") " pod="openshift-ovn-kubernetes/ovnkube-node-mx7dg"
Apr 16 19:30:17.888916 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.888652 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4b53a341-a257-4e51-866a-7aaefe569885-multus-cni-dir\") pod \"multus-ncrmg\" (UID: \"4b53a341-a257-4e51-866a-7aaefe569885\") " pod="openshift-multus/multus-ncrmg"
Apr 16 19:30:17.888916 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.888667 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/4b53a341-a257-4e51-866a-7aaefe569885-hostroot\") pod \"multus-ncrmg\" (UID: \"4b53a341-a257-4e51-866a-7aaefe569885\") " pod="openshift-multus/multus-ncrmg"
Apr 16 19:30:17.888916 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.888684 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f6b1a716-a116-40d7-bd7e-8947f3cfea04-ovnkube-config\") pod \"ovnkube-node-mx7dg\" (UID: \"f6b1a716-a116-40d7-bd7e-8947f3cfea04\") " pod="openshift-ovn-kubernetes/ovnkube-node-mx7dg"
Apr 16 19:30:17.888916 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.888731 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/09889d6a-515b-4fb3-acfb-41009ecb5107-registration-dir\") pod \"aws-ebs-csi-driver-node-5sjqr\" (UID: \"09889d6a-515b-4fb3-acfb-41009ecb5107\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5sjqr"
Apr 16 19:30:17.888916 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.888752 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hn4zw\" (UniqueName: \"kubernetes.io/projected/9859a426-6968-44ca-b63e-42baba2b957d-kube-api-access-hn4zw\") pod \"node-ca-l5xrl\" (UID: \"9859a426-6968-44ca-b63e-42baba2b957d\") " pod="openshift-image-registry/node-ca-l5xrl"
Apr 16 19:30:17.888916 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.888767 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6c0704bd-2f0f-4e78-8573-cf9346b4ae16-cni-binary-copy\") pod \"multus-additional-cni-plugins-whspd\" (UID: \"6c0704bd-2f0f-4e78-8573-cf9346b4ae16\") " pod="openshift-multus/multus-additional-cni-plugins-whspd"
Apr 16 19:30:17.888916 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.888769 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/09889d6a-515b-4fb3-acfb-41009ecb5107-registration-dir\") pod \"aws-ebs-csi-driver-node-5sjqr\" (UID: \"09889d6a-515b-4fb3-acfb-41009ecb5107\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5sjqr"
Apr 16 19:30:17.888916 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.888788 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4b53a341-a257-4e51-866a-7aaefe569885-multus-daemon-config\") pod \"multus-ncrmg\" (UID: \"4b53a341-a257-4e51-866a-7aaefe569885\") " pod="openshift-multus/multus-ncrmg"
Apr 16 19:30:17.888916 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.888804 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f6b1a716-a116-40d7-bd7e-8947f3cfea04-var-lib-openvswitch\") pod \"ovnkube-node-mx7dg\" (UID: \"f6b1a716-a116-40d7-bd7e-8947f3cfea04\") " pod="openshift-ovn-kubernetes/ovnkube-node-mx7dg"
Apr 16 19:30:17.888916 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.888835 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/09889d6a-515b-4fb3-acfb-41009ecb5107-etc-selinux\") pod \"aws-ebs-csi-driver-node-5sjqr\" (UID: \"09889d6a-515b-4fb3-acfb-41009ecb5107\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5sjqr"
Apr 16 19:30:17.888916 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.888867 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6c0704bd-2f0f-4e78-8573-cf9346b4ae16-os-release\") pod \"multus-additional-cni-plugins-whspd\" (UID: \"6c0704bd-2f0f-4e78-8573-cf9346b4ae16\") " pod="openshift-multus/multus-additional-cni-plugins-whspd"
Apr 16 19:30:17.888916 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.888894 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/b04a0377-865b-45eb-b85e-384e518a7c12-etc-sysctl-d\") pod \"tuned-bppgj\" (UID: \"b04a0377-865b-45eb-b85e-384e518a7c12\") " pod="openshift-cluster-node-tuning-operator/tuned-bppgj"
Apr 16 19:30:17.889277 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.888929 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b04a0377-865b-45eb-b85e-384e518a7c12-lib-modules\") pod \"tuned-bppgj\" (UID: \"b04a0377-865b-45eb-b85e-384e518a7c12\") " pod="openshift-cluster-node-tuning-operator/tuned-bppgj"
Apr 16 19:30:17.889277 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.888932 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/09889d6a-515b-4fb3-acfb-41009ecb5107-etc-selinux\") pod \"aws-ebs-csi-driver-node-5sjqr\" (UID: \"09889d6a-515b-4fb3-acfb-41009ecb5107\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5sjqr"
Apr 16 19:30:17.889277 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.888941 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6c0704bd-2f0f-4e78-8573-cf9346b4ae16-os-release\") pod \"multus-additional-cni-plugins-whspd\" (UID: \"6c0704bd-2f0f-4e78-8573-cf9346b4ae16\") " pod="openshift-multus/multus-additional-cni-plugins-whspd"
Apr 16 19:30:17.889277 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.888955 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b04a0377-865b-45eb-b85e-384e518a7c12-host\") pod \"tuned-bppgj\" (UID: \"b04a0377-865b-45eb-b85e-384e518a7c12\") " pod="openshift-cluster-node-tuning-operator/tuned-bppgj"
Apr 16 19:30:17.889277 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.888983 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4b53a341-a257-4e51-866a-7aaefe569885-os-release\") pod \"multus-ncrmg\" (UID: \"4b53a341-a257-4e51-866a-7aaefe569885\") " pod="openshift-multus/multus-ncrmg"
Apr 16 19:30:17.889277 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.888999 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/b04a0377-865b-45eb-b85e-384e518a7c12-etc-systemd\") pod \"tuned-bppgj\" (UID: \"b04a0377-865b-45eb-b85e-384e518a7c12\") " pod="openshift-cluster-node-tuning-operator/tuned-bppgj"
Apr 16 19:30:17.889277 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.889015 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f6b1a716-a116-40d7-bd7e-8947f3cfea04-etc-openvswitch\") pod \"ovnkube-node-mx7dg\" (UID: \"f6b1a716-a116-40d7-bd7e-8947f3cfea04\") " pod="openshift-ovn-kubernetes/ovnkube-node-mx7dg"
Apr 16 19:30:17.889277 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.889040 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/d478cc2e-78cb-4140-9eaf-2624faf8382b-konnectivity-ca\") pod \"konnectivity-agent-nh8b7\" (UID: \"d478cc2e-78cb-4140-9eaf-2624faf8382b\") " pod="kube-system/konnectivity-agent-nh8b7"
Apr 16 19:30:17.889277 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.889068 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-482ls\" (UniqueName: \"kubernetes.io/projected/09889d6a-515b-4fb3-acfb-41009ecb5107-kube-api-access-482ls\") pod \"aws-ebs-csi-driver-node-5sjqr\" (UID: \"09889d6a-515b-4fb3-acfb-41009ecb5107\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5sjqr"
Apr 16 19:30:17.889277 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.889093 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4b53a341-a257-4e51-866a-7aaefe569885-cnibin\") pod \"multus-ncrmg\" (UID: \"4b53a341-a257-4e51-866a-7aaefe569885\") " pod="openshift-multus/multus-ncrmg"
Apr 16 19:30:17.889277 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.889117 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/b04a0377-865b-45eb-b85e-384e518a7c12-etc-modprobe-d\") pod \"tuned-bppgj\" (UID: \"b04a0377-865b-45eb-b85e-384e518a7c12\") " pod="openshift-cluster-node-tuning-operator/tuned-bppgj"
Apr 16 19:30:17.889277 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.889131 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b04a0377-865b-45eb-b85e-384e518a7c12-run\") pod \"tuned-bppgj\" (UID: \"b04a0377-865b-45eb-b85e-384e518a7c12\") " pod="openshift-cluster-node-tuning-operator/tuned-bppgj"
Apr 16 19:30:17.889277 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.889146 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqhhm\" (UniqueName: \"kubernetes.io/projected/ca23e8db-bb88-449f-8286-27f2978eb0ca-kube-api-access-nqhhm\") pod \"network-metrics-daemon-7mh9f\" (UID: \"ca23e8db-bb88-449f-8286-27f2978eb0ca\") " pod="openshift-multus/network-metrics-daemon-7mh9f"
Apr 16 19:30:17.889277 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.889173 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f6b1a716-a116-40d7-bd7e-8947f3cfea04-host-slash\") pod \"ovnkube-node-mx7dg\" (UID: \"f6b1a716-a116-40d7-bd7e-8947f3cfea04\") " pod="openshift-ovn-kubernetes/ovnkube-node-mx7dg"
Apr 16 19:30:17.889277 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.889193 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f6b1a716-a116-40d7-bd7e-8947f3cfea04-ovnkube-script-lib\") pod \"ovnkube-node-mx7dg\" (UID: \"f6b1a716-a116-40d7-bd7e-8947f3cfea04\") " pod="openshift-ovn-kubernetes/ovnkube-node-mx7dg"
Apr 16 19:30:17.889277 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.889232 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9859a426-6968-44ca-b63e-42baba2b957d-host\") pod \"node-ca-l5xrl\" (UID: \"9859a426-6968-44ca-b63e-42baba2b957d\") " pod="openshift-image-registry/node-ca-l5xrl"
Apr 16 19:30:17.889277 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.889248 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9859a426-6968-44ca-b63e-42baba2b957d-serviceca\") pod \"node-ca-l5xrl\" (UID: \"9859a426-6968-44ca-b63e-42baba2b957d\") " pod="openshift-image-registry/node-ca-l5xrl"
Apr 16 19:30:17.889958 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.889283 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d6k6n\" (UniqueName: \"kubernetes.io/projected/6c0704bd-2f0f-4e78-8573-cf9346b4ae16-kube-api-access-d6k6n\") pod \"multus-additional-cni-plugins-whspd\" (UID: \"6c0704bd-2f0f-4e78-8573-cf9346b4ae16\") " pod="openshift-multus/multus-additional-cni-plugins-whspd"
Apr 16 19:30:17.889958 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.889281 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9859a426-6968-44ca-b63e-42baba2b957d-host\") pod \"node-ca-l5xrl\" (UID: \"9859a426-6968-44ca-b63e-42baba2b957d\") " pod="openshift-image-registry/node-ca-l5xrl"
Apr 16 19:30:17.889958 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.889309 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/b04a0377-865b-45eb-b85e-384e518a7c12-etc-sysconfig\") pod \"tuned-bppgj\" (UID: \"b04a0377-865b-45eb-b85e-384e518a7c12\") " pod="openshift-cluster-node-tuning-operator/tuned-bppgj"
Apr 16 19:30:17.889958 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.889324 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/b04a0377-865b-45eb-b85e-384e518a7c12-etc-tuned\") pod \"tuned-bppgj\" (UID: \"b04a0377-865b-45eb-b85e-384e518a7c12\") " pod="openshift-cluster-node-tuning-operator/tuned-bppgj"
Apr 16 19:30:17.889958 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.889340 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/afd027c2-990e-4d6c-b57c-62c9c66ce5f2-original-pull-secret\") pod \"global-pull-secret-syncer-ls7dc\" (UID: \"afd027c2-990e-4d6c-b57c-62c9c66ce5f2\") " pod="kube-system/global-pull-secret-syncer-ls7dc"
Apr 16 19:30:17.889958 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.889363 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f6b1a716-a116-40d7-bd7e-8947f3cfea04-node-log\") pod \"ovnkube-node-mx7dg\" (UID: \"f6b1a716-a116-40d7-bd7e-8947f3cfea04\") " pod="openshift-ovn-kubernetes/ovnkube-node-mx7dg"
Apr 16 19:30:17.889958 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.889383 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4b53a341-a257-4e51-866a-7aaefe569885-system-cni-dir\") pod \"multus-ncrmg\" (UID: \"4b53a341-a257-4e51-866a-7aaefe569885\") " pod="openshift-multus/multus-ncrmg"
Apr 16 19:30:17.889958 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.889416 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7r9r9\" (UniqueName: \"kubernetes.io/projected/f6b1a716-a116-40d7-bd7e-8947f3cfea04-kube-api-access-7r9r9\") pod \"ovnkube-node-mx7dg\" (UID: \"f6b1a716-a116-40d7-bd7e-8947f3cfea04\") " pod="openshift-ovn-kubernetes/ovnkube-node-mx7dg"
Apr 16 19:30:17.889958 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.889433 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4b53a341-a257-4e51-866a-7aaefe569885-host-run-netns\") pod \"multus-ncrmg\" (UID: \"4b53a341-a257-4e51-866a-7aaefe569885\") " pod="openshift-multus/multus-ncrmg"
Apr 16 19:30:17.889958 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.889453 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4b53a341-a257-4e51-866a-7aaefe569885-host-var-lib-cni-bin\") pod \"multus-ncrmg\" (UID: \"4b53a341-a257-4e51-866a-7aaefe569885\") " pod="openshift-multus/multus-ncrmg"
Apr 16 19:30:17.889958 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.889479 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/14b86022-39d0-4850-a61d-778f535ea12e-iptables-alerter-script\") pod \"iptables-alerter-j45kn\" (UID: \"14b86022-39d0-4850-a61d-778f535ea12e\") " pod="openshift-network-operator/iptables-alerter-j45kn"
Apr 16 19:30:17.889958 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.889507 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f6b1a716-a116-40d7-bd7e-8947f3cfea04-run-openvswitch\") pod \"ovnkube-node-mx7dg\" (UID: \"f6b1a716-a116-40d7-bd7e-8947f3cfea04\") " pod="openshift-ovn-kubernetes/ovnkube-node-mx7dg"
Apr 16 19:30:17.889958 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.889525 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f6b1a716-a116-40d7-bd7e-8947f3cfea04-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mx7dg\" (UID: \"f6b1a716-a116-40d7-bd7e-8947f3cfea04\") " pod="openshift-ovn-kubernetes/ovnkube-node-mx7dg"
Apr 16 19:30:17.889958 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.889545 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f6b1a716-a116-40d7-bd7e-8947f3cfea04-env-overrides\") pod \"ovnkube-node-mx7dg\" (UID: \"f6b1a716-a116-40d7-bd7e-8947f3cfea04\") " pod="openshift-ovn-kubernetes/ovnkube-node-mx7dg"
Apr 16 19:30:17.889958 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.889576 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/09889d6a-515b-4fb3-acfb-41009ecb5107-socket-dir\") pod \"aws-ebs-csi-driver-node-5sjqr\" (UID: \"09889d6a-515b-4fb3-acfb-41009ecb5107\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5sjqr"
Apr 16 19:30:17.889958 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.889611 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/09889d6a-515b-4fb3-acfb-41009ecb5107-sys-fs\") pod \"aws-ebs-csi-driver-node-5sjqr\" (UID: \"09889d6a-515b-4fb3-acfb-41009ecb5107\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5sjqr"
Apr 16 19:30:17.890679 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.889660 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/09889d6a-515b-4fb3-acfb-41009ecb5107-sys-fs\") pod \"aws-ebs-csi-driver-node-5sjqr\" (UID: \"09889d6a-515b-4fb3-acfb-41009ecb5107\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5sjqr"
Apr 16 19:30:17.890679 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.889677 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/09889d6a-515b-4fb3-acfb-41009ecb5107-socket-dir\") pod \"aws-ebs-csi-driver-node-5sjqr\" (UID: \"09889d6a-515b-4fb3-acfb-41009ecb5107\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5sjqr"
Apr 16 19:30:17.890679 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.889676 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/afd027c2-990e-4d6c-b57c-62c9c66ce5f2-dbus\") pod \"global-pull-secret-syncer-ls7dc\" (UID: \"afd027c2-990e-4d6c-b57c-62c9c66ce5f2\") " pod="kube-system/global-pull-secret-syncer-ls7dc"
Apr 16 19:30:17.890679 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.889726 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6c0704bd-2f0f-4e78-8573-cf9346b4ae16-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-whspd\" (UID: \"6c0704bd-2f0f-4e78-8573-cf9346b4ae16\") " pod="openshift-multus/multus-additional-cni-plugins-whspd"
Apr 16 19:30:17.890679 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.889748 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4b53a341-a257-4e51-866a-7aaefe569885-etc-kubernetes\") pod \"multus-ncrmg\" (UID: \"4b53a341-a257-4e51-866a-7aaefe569885\") " pod="openshift-multus/multus-ncrmg"
Apr 16 19:30:17.890679 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.889774 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b04a0377-865b-45eb-b85e-384e518a7c12-etc-kubernetes\") pod \"tuned-bppgj\" (UID: \"b04a0377-865b-45eb-b85e-384e518a7c12\") " pod="openshift-cluster-node-tuning-operator/tuned-bppgj"
Apr 16 19:30:17.890679 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.889799 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x96nz\" (UniqueName: \"kubernetes.io/projected/17f1ed1c-fff7-4d09-b029-890217b6c115-kube-api-access-x96nz\") pod \"network-check-target-6glh4\" (UID: \"17f1ed1c-fff7-4d09-b029-890217b6c115\") " pod="openshift-network-diagnostics/network-check-target-6glh4"
Apr 16 19:30:17.890679 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.889823 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/14b86022-39d0-4850-a61d-778f535ea12e-host-slash\") pod \"iptables-alerter-j45kn\" (UID: \"14b86022-39d0-4850-a61d-778f535ea12e\") " pod="openshift-network-operator/iptables-alerter-j45kn"
Apr 16 19:30:17.890679 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.889880 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f6b1a716-a116-40d7-bd7e-8947f3cfea04-ovn-node-metrics-cert\") pod \"ovnkube-node-mx7dg\" (UID: \"f6b1a716-a116-40d7-bd7e-8947f3cfea04\") " pod="openshift-ovn-kubernetes/ovnkube-node-mx7dg"
Apr 16 19:30:17.890679 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.889917 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ca23e8db-bb88-449f-8286-27f2978eb0ca-metrics-certs\") pod \"network-metrics-daemon-7mh9f\" (UID: \"ca23e8db-bb88-449f-8286-27f2978eb0ca\") " pod="openshift-multus/network-metrics-daemon-7mh9f"
Apr 16 19:30:17.890679 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.889949 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6c0704bd-2f0f-4e78-8573-cf9346b4ae16-cnibin\") pod \"multus-additional-cni-plugins-whspd\" (UID: \"6c0704bd-2f0f-4e78-8573-cf9346b4ae16\") " pod="openshift-multus/multus-additional-cni-plugins-whspd"
Apr 16 19:30:17.890679 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.889978 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/6c0704bd-2f0f-4e78-8573-cf9346b4ae16-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-whspd\" (UID: \"6c0704bd-2f0f-4e78-8573-cf9346b4ae16\") " pod="openshift-multus/multus-additional-cni-plugins-whspd"
Apr 16 19:30:17.890679 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.889979 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6c0704bd-2f0f-4e78-8573-cf9346b4ae16-cnibin\") pod \"multus-additional-cni-plugins-whspd\" (UID: \"6c0704bd-2f0f-4e78-8573-cf9346b4ae16\") " pod="openshift-multus/multus-additional-cni-plugins-whspd"
Apr 16 19:30:17.890679 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.890006 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4b53a341-a257-4e51-866a-7aaefe569885-cni-binary-copy\") pod \"multus-ncrmg\" (UID: \"4b53a341-a257-4e51-866a-7aaefe569885\") " pod="openshift-multus/multus-ncrmg"
Apr 16 19:30:17.890679 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.890035 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/4b53a341-a257-4e51-866a-7aaefe569885-host-var-lib-cni-multus\") pod \"multus-ncrmg\" (UID: \"4b53a341-a257-4e51-866a-7aaefe569885\") " pod="openshift-multus/multus-ncrmg"
Apr 16 19:30:17.890679 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.890059 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4b53a341-a257-4e51-866a-7aaefe569885-multus-conf-dir\") pod \"multus-ncrmg\" (UID: \"4b53a341-a257-4e51-866a-7aaefe569885\") " pod="openshift-multus/multus-ncrmg"
Apr 16 19:30:17.891230 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.890082 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f6b1a716-a116-40d7-bd7e-8947f3cfea04-run-systemd\") pod \"ovnkube-node-mx7dg\" (UID: \"f6b1a716-a116-40d7-bd7e-8947f3cfea04\") " pod="openshift-ovn-kubernetes/ovnkube-node-mx7dg"
Apr 16 19:30:17.891230 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.890107 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6c0704bd-2f0f-4e78-8573-cf9346b4ae16-system-cni-dir\") pod \"multus-additional-cni-plugins-whspd\" (UID: \"6c0704bd-2f0f-4e78-8573-cf9346b4ae16\") " pod="openshift-multus/multus-additional-cni-plugins-whspd"
Apr 16 19:30:17.891230 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.890131 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/b04a0377-865b-45eb-b85e-384e518a7c12-etc-sysctl-conf\") pod \"tuned-bppgj\" (UID: \"b04a0377-865b-45eb-b85e-384e518a7c12\") " pod="openshift-cluster-node-tuning-operator/tuned-bppgj"
Apr 16 19:30:17.891230 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.890134 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6c0704bd-2f0f-4e78-8573-cf9346b4ae16-tuning-conf-dir\") pod \"multus-additional-cni-plugins-whspd\" (UID: \"6c0704bd-2f0f-4e78-8573-cf9346b4ae16\") " pod="openshift-multus/multus-additional-cni-plugins-whspd"
Apr 16 19:30:17.891230 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.890155 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghvp7\" (UniqueName: \"kubernetes.io/projected/14b86022-39d0-4850-a61d-778f535ea12e-kube-api-access-ghvp7\") pod \"iptables-alerter-j45kn\" (UID: \"14b86022-39d0-4850-a61d-778f535ea12e\") " pod="openshift-network-operator/iptables-alerter-j45kn"
Apr 16 19:30:17.891230 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.890179 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f6b1a716-a116-40d7-bd7e-8947f3cfea04-systemd-units\") pod \"ovnkube-node-mx7dg\" (UID: \"f6b1a716-a116-40d7-bd7e-8947f3cfea04\") " pod="openshift-ovn-kubernetes/ovnkube-node-mx7dg"
Apr 16 19:30:17.891230 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.890182 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6c0704bd-2f0f-4e78-8573-cf9346b4ae16-cni-binary-copy\") pod \"multus-additional-cni-plugins-whspd\" (UID: \"6c0704bd-2f0f-4e78-8573-cf9346b4ae16\") " pod="openshift-multus/multus-additional-cni-plugins-whspd"
Apr 16 19:30:17.891230 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.890219 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f6b1a716-a116-40d7-bd7e-8947f3cfea04-log-socket\") pod \"ovnkube-node-mx7dg\" (UID: \"f6b1a716-a116-40d7-bd7e-8947f3cfea04\") " pod="openshift-ovn-kubernetes/ovnkube-node-mx7dg"
Apr 16 19:30:17.891230 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.890221 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/6c0704bd-2f0f-4e78-8573-cf9346b4ae16-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-whspd\" (UID: \"6c0704bd-2f0f-4e78-8573-cf9346b4ae16\") " pod="openshift-multus/multus-additional-cni-plugins-whspd"
Apr 16 19:30:17.891230 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.890186 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6c0704bd-2f0f-4e78-8573-cf9346b4ae16-system-cni-dir\") pod \"multus-additional-cni-plugins-whspd\" (UID: \"6c0704bd-2f0f-4e78-8573-cf9346b4ae16\") " pod="openshift-multus/multus-additional-cni-plugins-whspd"
Apr 16 19:30:17.891230 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.890248 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4b53a341-a257-4e51-866a-7aaefe569885-host-var-lib-kubelet\") pod \"multus-ncrmg\" (UID: \"4b53a341-a257-4e51-866a-7aaefe569885\") " pod="openshift-multus/multus-ncrmg"
Apr 16 19:30:17.891230 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.890245 2579 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 16 19:30:17.891230 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.890288 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b04a0377-865b-45eb-b85e-384e518a7c12-sys\") pod \"tuned-bppgj\" (UID: \"b04a0377-865b-45eb-b85e-384e518a7c12\") " pod="openshift-cluster-node-tuning-operator/tuned-bppgj"
Apr 16 19:30:17.891230 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.890305 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f6b1a716-a116-40d7-bd7e-8947f3cfea04-host-run-netns\") pod \"ovnkube-node-mx7dg\" (UID: \"f6b1a716-a116-40d7-bd7e-8947f3cfea04\") " pod="openshift-ovn-kubernetes/ovnkube-node-mx7dg"
Apr 16 19:30:17.891230 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.890323 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f6b1a716-a116-40d7-bd7e-8947f3cfea04-host-run-ovn-kubernetes\") pod \"ovnkube-node-mx7dg\" (UID: \"f6b1a716-a116-40d7-bd7e-8947f3cfea04\") " pod="openshift-ovn-kubernetes/ovnkube-node-mx7dg"
Apr 16 19:30:17.891230 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.890368 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tj8nl\" (UniqueName: \"kubernetes.io/projected/b04a0377-865b-45eb-b85e-384e518a7c12-kube-api-access-tj8nl\") pod \"tuned-bppgj\" (UID: \"b04a0377-865b-45eb-b85e-384e518a7c12\") " pod="openshift-cluster-node-tuning-operator/tuned-bppgj"
Apr 16 19:30:17.891230 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.890874 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/6c0704bd-2f0f-4e78-8573-cf9346b4ae16-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-whspd\" (UID: \"6c0704bd-2f0f-4e78-8573-cf9346b4ae16\") " pod="openshift-multus/multus-additional-cni-plugins-whspd"
Apr 16 19:30:17.891729 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.891030 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9859a426-6968-44ca-b63e-42baba2b957d-serviceca\") pod \"node-ca-l5xrl\" (UID: \"9859a426-6968-44ca-b63e-42baba2b957d\") " pod="openshift-image-registry/node-ca-l5xrl"
Apr 16 19:30:17.891729 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.891093 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/d478cc2e-78cb-4140-9eaf-2624faf8382b-konnectivity-ca\") pod \"konnectivity-agent-nh8b7\" (UID: \"d478cc2e-78cb-4140-9eaf-2624faf8382b\") " pod="kube-system/konnectivity-agent-nh8b7"
Apr 16 19:30:17.892754 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.892739 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/d478cc2e-78cb-4140-9eaf-2624faf8382b-agent-certs\") pod \"konnectivity-agent-nh8b7\" (UID: \"d478cc2e-78cb-4140-9eaf-2624faf8382b\") " pod="kube-system/konnectivity-agent-nh8b7"
Apr 16 19:30:17.896015 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.895994 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hn4zw\" (UniqueName: \"kubernetes.io/projected/9859a426-6968-44ca-b63e-42baba2b957d-kube-api-access-hn4zw\") pod \"node-ca-l5xrl\" (UID: \"9859a426-6968-44ca-b63e-42baba2b957d\") " pod="openshift-image-registry/node-ca-l5xrl"
Apr 16 19:30:17.896363 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.896342 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-482ls\" (UniqueName: \"kubernetes.io/projected/09889d6a-515b-4fb3-acfb-41009ecb5107-kube-api-access-482ls\") pod \"aws-ebs-csi-driver-node-5sjqr\" (UID: \"09889d6a-515b-4fb3-acfb-41009ecb5107\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5sjqr"
Apr 16 19:30:17.896793 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.896775 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6k6n\" (UniqueName: \"kubernetes.io/projected/6c0704bd-2f0f-4e78-8573-cf9346b4ae16-kube-api-access-d6k6n\") pod \"multus-additional-cni-plugins-whspd\" (UID: \"6c0704bd-2f0f-4e78-8573-cf9346b4ae16\") " pod="openshift-multus/multus-additional-cni-plugins-whspd"
Apr 16 19:30:17.959622 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.959597 2579 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 19:30:17.991053 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.991029 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4b53a341-a257-4e51-866a-7aaefe569885-host-var-lib-kubelet\") pod \"multus-ncrmg\" (UID: \"4b53a341-a257-4e51-866a-7aaefe569885\") " pod="openshift-multus/multus-ncrmg"
Apr 16 19:30:17.991240 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.991059 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b04a0377-865b-45eb-b85e-384e518a7c12-sys\") pod \"tuned-bppgj\" (UID: \"b04a0377-865b-45eb-b85e-384e518a7c12\") " pod="openshift-cluster-node-tuning-operator/tuned-bppgj"
Apr 16 19:30:17.991240 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.991078 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f6b1a716-a116-40d7-bd7e-8947f3cfea04-host-run-netns\") pod \"ovnkube-node-mx7dg\" (UID: \"f6b1a716-a116-40d7-bd7e-8947f3cfea04\") " pod="openshift-ovn-kubernetes/ovnkube-node-mx7dg"
Apr 16 19:30:17.991240 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.991098 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f6b1a716-a116-40d7-bd7e-8947f3cfea04-host-run-ovn-kubernetes\") pod \"ovnkube-node-mx7dg\" (UID: \"f6b1a716-a116-40d7-bd7e-8947f3cfea04\") " pod="openshift-ovn-kubernetes/ovnkube-node-mx7dg"
Apr 16 19:30:17.991240 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.991121 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tj8nl\" (UniqueName: \"kubernetes.io/projected/b04a0377-865b-45eb-b85e-384e518a7c12-kube-api-access-tj8nl\") pod \"tuned-bppgj\" (UID: \"b04a0377-865b-45eb-b85e-384e518a7c12\") " pod="openshift-cluster-node-tuning-operator/tuned-bppgj"
Apr 16 19:30:17.991240 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.991138 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4b53a341-a257-4e51-866a-7aaefe569885-host-var-lib-kubelet\") pod \"multus-ncrmg\" (UID: \"4b53a341-a257-4e51-866a-7aaefe569885\") " pod="openshift-multus/multus-ncrmg"
Apr 16 19:30:17.991240 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.991147 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b04a0377-865b-45eb-b85e-384e518a7c12-sys\") pod \"tuned-bppgj\" (UID: \"b04a0377-865b-45eb-b85e-384e518a7c12\") " pod="openshift-cluster-node-tuning-operator/tuned-bppgj"
Apr 16 19:30:17.991240 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.991160 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f6b1a716-a116-40d7-bd7e-8947f3cfea04-host-run-netns\") pod \"ovnkube-node-mx7dg\" (UID: \"f6b1a716-a116-40d7-bd7e-8947f3cfea04\") " pod="openshift-ovn-kubernetes/ovnkube-node-mx7dg"
Apr 16 19:30:17.991240 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.991188 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f6b1a716-a116-40d7-bd7e-8947f3cfea04-host-run-ovn-kubernetes\") pod \"ovnkube-node-mx7dg\" (UID: \"f6b1a716-a116-40d7-bd7e-8947f3cfea04\") " pod="openshift-ovn-kubernetes/ovnkube-node-mx7dg"
Apr 16 19:30:17.991240 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.991237 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-95wbx\" (UniqueName: \"kubernetes.io/projected/4b53a341-a257-4e51-866a-7aaefe569885-kube-api-access-95wbx\") pod \"multus-ncrmg\" (UID: \"4b53a341-a257-4e51-866a-7aaefe569885\") " pod="openshift-multus/multus-ncrmg"
Apr 16 19:30:17.991638 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.991269 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName:
\"kubernetes.io/host-path/b04a0377-865b-45eb-b85e-384e518a7c12-var-lib-kubelet\") pod \"tuned-bppgj\" (UID: \"b04a0377-865b-45eb-b85e-384e518a7c12\") " pod="openshift-cluster-node-tuning-operator/tuned-bppgj" Apr 16 19:30:17.991638 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.991294 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/4b53a341-a257-4e51-866a-7aaefe569885-multus-socket-dir-parent\") pod \"multus-ncrmg\" (UID: \"4b53a341-a257-4e51-866a-7aaefe569885\") " pod="openshift-multus/multus-ncrmg" Apr 16 19:30:17.991638 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.991346 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/4b53a341-a257-4e51-866a-7aaefe569885-multus-socket-dir-parent\") pod \"multus-ncrmg\" (UID: \"4b53a341-a257-4e51-866a-7aaefe569885\") " pod="openshift-multus/multus-ncrmg" Apr 16 19:30:17.991638 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.991351 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b04a0377-865b-45eb-b85e-384e518a7c12-var-lib-kubelet\") pod \"tuned-bppgj\" (UID: \"b04a0377-865b-45eb-b85e-384e518a7c12\") " pod="openshift-cluster-node-tuning-operator/tuned-bppgj" Apr 16 19:30:17.991638 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.991378 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/4b53a341-a257-4e51-866a-7aaefe569885-host-run-k8s-cni-cncf-io\") pod \"multus-ncrmg\" (UID: \"4b53a341-a257-4e51-866a-7aaefe569885\") " pod="openshift-multus/multus-ncrmg" Apr 16 19:30:17.991638 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.991407 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" 
(UniqueName: \"kubernetes.io/host-path/4b53a341-a257-4e51-866a-7aaefe569885-host-run-multus-certs\") pod \"multus-ncrmg\" (UID: \"4b53a341-a257-4e51-866a-7aaefe569885\") " pod="openshift-multus/multus-ncrmg" Apr 16 19:30:17.991638 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.991432 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b04a0377-865b-45eb-b85e-384e518a7c12-tmp\") pod \"tuned-bppgj\" (UID: \"b04a0377-865b-45eb-b85e-384e518a7c12\") " pod="openshift-cluster-node-tuning-operator/tuned-bppgj" Apr 16 19:30:17.991638 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.991466 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/4b53a341-a257-4e51-866a-7aaefe569885-host-run-multus-certs\") pod \"multus-ncrmg\" (UID: \"4b53a341-a257-4e51-866a-7aaefe569885\") " pod="openshift-multus/multus-ncrmg" Apr 16 19:30:17.991638 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.991470 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/4b53a341-a257-4e51-866a-7aaefe569885-host-run-k8s-cni-cncf-io\") pod \"multus-ncrmg\" (UID: \"4b53a341-a257-4e51-866a-7aaefe569885\") " pod="openshift-multus/multus-ncrmg" Apr 16 19:30:17.991638 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.991491 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f6b1a716-a116-40d7-bd7e-8947f3cfea04-run-ovn\") pod \"ovnkube-node-mx7dg\" (UID: \"f6b1a716-a116-40d7-bd7e-8947f3cfea04\") " pod="openshift-ovn-kubernetes/ovnkube-node-mx7dg" Apr 16 19:30:17.991638 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.991514 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: 
\"kubernetes.io/host-path/afd027c2-990e-4d6c-b57c-62c9c66ce5f2-kubelet-config\") pod \"global-pull-secret-syncer-ls7dc\" (UID: \"afd027c2-990e-4d6c-b57c-62c9c66ce5f2\") " pod="kube-system/global-pull-secret-syncer-ls7dc" Apr 16 19:30:17.991638 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.991538 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f6b1a716-a116-40d7-bd7e-8947f3cfea04-host-kubelet\") pod \"ovnkube-node-mx7dg\" (UID: \"f6b1a716-a116-40d7-bd7e-8947f3cfea04\") " pod="openshift-ovn-kubernetes/ovnkube-node-mx7dg" Apr 16 19:30:17.991638 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.991541 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f6b1a716-a116-40d7-bd7e-8947f3cfea04-run-ovn\") pod \"ovnkube-node-mx7dg\" (UID: \"f6b1a716-a116-40d7-bd7e-8947f3cfea04\") " pod="openshift-ovn-kubernetes/ovnkube-node-mx7dg" Apr 16 19:30:17.991638 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.991566 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f6b1a716-a116-40d7-bd7e-8947f3cfea04-host-cni-bin\") pod \"ovnkube-node-mx7dg\" (UID: \"f6b1a716-a116-40d7-bd7e-8947f3cfea04\") " pod="openshift-ovn-kubernetes/ovnkube-node-mx7dg" Apr 16 19:30:17.991638 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.991580 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f6b1a716-a116-40d7-bd7e-8947f3cfea04-host-cni-netd\") pod \"ovnkube-node-mx7dg\" (UID: \"f6b1a716-a116-40d7-bd7e-8947f3cfea04\") " pod="openshift-ovn-kubernetes/ovnkube-node-mx7dg" Apr 16 19:30:17.991638 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.991595 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/4b53a341-a257-4e51-866a-7aaefe569885-multus-cni-dir\") pod \"multus-ncrmg\" (UID: \"4b53a341-a257-4e51-866a-7aaefe569885\") " pod="openshift-multus/multus-ncrmg" Apr 16 19:30:17.991638 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.991610 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/4b53a341-a257-4e51-866a-7aaefe569885-hostroot\") pod \"multus-ncrmg\" (UID: \"4b53a341-a257-4e51-866a-7aaefe569885\") " pod="openshift-multus/multus-ncrmg" Apr 16 19:30:17.991638 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.991633 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f6b1a716-a116-40d7-bd7e-8947f3cfea04-ovnkube-config\") pod \"ovnkube-node-mx7dg\" (UID: \"f6b1a716-a116-40d7-bd7e-8947f3cfea04\") " pod="openshift-ovn-kubernetes/ovnkube-node-mx7dg" Apr 16 19:30:17.992487 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.991653 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4b53a341-a257-4e51-866a-7aaefe569885-multus-daemon-config\") pod \"multus-ncrmg\" (UID: \"4b53a341-a257-4e51-866a-7aaefe569885\") " pod="openshift-multus/multus-ncrmg" Apr 16 19:30:17.992487 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.991652 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f6b1a716-a116-40d7-bd7e-8947f3cfea04-host-cni-bin\") pod \"ovnkube-node-mx7dg\" (UID: \"f6b1a716-a116-40d7-bd7e-8947f3cfea04\") " pod="openshift-ovn-kubernetes/ovnkube-node-mx7dg" Apr 16 19:30:17.992487 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.991657 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/f6b1a716-a116-40d7-bd7e-8947f3cfea04-host-cni-netd\") pod \"ovnkube-node-mx7dg\" (UID: \"f6b1a716-a116-40d7-bd7e-8947f3cfea04\") " pod="openshift-ovn-kubernetes/ovnkube-node-mx7dg" Apr 16 19:30:17.992487 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.991668 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f6b1a716-a116-40d7-bd7e-8947f3cfea04-var-lib-openvswitch\") pod \"ovnkube-node-mx7dg\" (UID: \"f6b1a716-a116-40d7-bd7e-8947f3cfea04\") " pod="openshift-ovn-kubernetes/ovnkube-node-mx7dg" Apr 16 19:30:17.992487 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.991658 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f6b1a716-a116-40d7-bd7e-8947f3cfea04-host-kubelet\") pod \"ovnkube-node-mx7dg\" (UID: \"f6b1a716-a116-40d7-bd7e-8947f3cfea04\") " pod="openshift-ovn-kubernetes/ovnkube-node-mx7dg" Apr 16 19:30:17.992487 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.991699 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/4b53a341-a257-4e51-866a-7aaefe569885-hostroot\") pod \"multus-ncrmg\" (UID: \"4b53a341-a257-4e51-866a-7aaefe569885\") " pod="openshift-multus/multus-ncrmg" Apr 16 19:30:17.992487 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.991714 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4b53a341-a257-4e51-866a-7aaefe569885-multus-cni-dir\") pod \"multus-ncrmg\" (UID: \"4b53a341-a257-4e51-866a-7aaefe569885\") " pod="openshift-multus/multus-ncrmg" Apr 16 19:30:17.992487 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.991740 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/f6b1a716-a116-40d7-bd7e-8947f3cfea04-var-lib-openvswitch\") pod \"ovnkube-node-mx7dg\" (UID: \"f6b1a716-a116-40d7-bd7e-8947f3cfea04\") " pod="openshift-ovn-kubernetes/ovnkube-node-mx7dg" Apr 16 19:30:17.992487 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.991753 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/b04a0377-865b-45eb-b85e-384e518a7c12-etc-sysctl-d\") pod \"tuned-bppgj\" (UID: \"b04a0377-865b-45eb-b85e-384e518a7c12\") " pod="openshift-cluster-node-tuning-operator/tuned-bppgj" Apr 16 19:30:17.992487 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.991801 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/afd027c2-990e-4d6c-b57c-62c9c66ce5f2-kubelet-config\") pod \"global-pull-secret-syncer-ls7dc\" (UID: \"afd027c2-990e-4d6c-b57c-62c9c66ce5f2\") " pod="kube-system/global-pull-secret-syncer-ls7dc" Apr 16 19:30:17.992487 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.991845 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b04a0377-865b-45eb-b85e-384e518a7c12-lib-modules\") pod \"tuned-bppgj\" (UID: \"b04a0377-865b-45eb-b85e-384e518a7c12\") " pod="openshift-cluster-node-tuning-operator/tuned-bppgj" Apr 16 19:30:17.992487 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.991884 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b04a0377-865b-45eb-b85e-384e518a7c12-host\") pod \"tuned-bppgj\" (UID: \"b04a0377-865b-45eb-b85e-384e518a7c12\") " pod="openshift-cluster-node-tuning-operator/tuned-bppgj" Apr 16 19:30:17.992487 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.991910 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/4b53a341-a257-4e51-866a-7aaefe569885-os-release\") pod \"multus-ncrmg\" (UID: \"4b53a341-a257-4e51-866a-7aaefe569885\") " pod="openshift-multus/multus-ncrmg" Apr 16 19:30:17.992487 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.991915 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/b04a0377-865b-45eb-b85e-384e518a7c12-etc-sysctl-d\") pod \"tuned-bppgj\" (UID: \"b04a0377-865b-45eb-b85e-384e518a7c12\") " pod="openshift-cluster-node-tuning-operator/tuned-bppgj" Apr 16 19:30:17.992487 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.991936 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/b04a0377-865b-45eb-b85e-384e518a7c12-etc-systemd\") pod \"tuned-bppgj\" (UID: \"b04a0377-865b-45eb-b85e-384e518a7c12\") " pod="openshift-cluster-node-tuning-operator/tuned-bppgj" Apr 16 19:30:17.992487 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.991964 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b04a0377-865b-45eb-b85e-384e518a7c12-host\") pod \"tuned-bppgj\" (UID: \"b04a0377-865b-45eb-b85e-384e518a7c12\") " pod="openshift-cluster-node-tuning-operator/tuned-bppgj" Apr 16 19:30:17.992487 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.991976 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f6b1a716-a116-40d7-bd7e-8947f3cfea04-etc-openvswitch\") pod \"ovnkube-node-mx7dg\" (UID: \"f6b1a716-a116-40d7-bd7e-8947f3cfea04\") " pod="openshift-ovn-kubernetes/ovnkube-node-mx7dg" Apr 16 19:30:17.992487 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.992003 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/b04a0377-865b-45eb-b85e-384e518a7c12-lib-modules\") pod \"tuned-bppgj\" (UID: \"b04a0377-865b-45eb-b85e-384e518a7c12\") " pod="openshift-cluster-node-tuning-operator/tuned-bppgj" Apr 16 19:30:17.993401 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.992004 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4b53a341-a257-4e51-866a-7aaefe569885-cnibin\") pod \"multus-ncrmg\" (UID: \"4b53a341-a257-4e51-866a-7aaefe569885\") " pod="openshift-multus/multus-ncrmg" Apr 16 19:30:17.993401 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.992023 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4b53a341-a257-4e51-866a-7aaefe569885-os-release\") pod \"multus-ncrmg\" (UID: \"4b53a341-a257-4e51-866a-7aaefe569885\") " pod="openshift-multus/multus-ncrmg" Apr 16 19:30:17.993401 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.992042 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/b04a0377-865b-45eb-b85e-384e518a7c12-etc-modprobe-d\") pod \"tuned-bppgj\" (UID: \"b04a0377-865b-45eb-b85e-384e518a7c12\") " pod="openshift-cluster-node-tuning-operator/tuned-bppgj" Apr 16 19:30:17.993401 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.992067 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f6b1a716-a116-40d7-bd7e-8947f3cfea04-etc-openvswitch\") pod \"ovnkube-node-mx7dg\" (UID: \"f6b1a716-a116-40d7-bd7e-8947f3cfea04\") " pod="openshift-ovn-kubernetes/ovnkube-node-mx7dg" Apr 16 19:30:17.993401 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.992085 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b04a0377-865b-45eb-b85e-384e518a7c12-run\") pod 
\"tuned-bppgj\" (UID: \"b04a0377-865b-45eb-b85e-384e518a7c12\") " pod="openshift-cluster-node-tuning-operator/tuned-bppgj" Apr 16 19:30:17.993401 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.992097 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4b53a341-a257-4e51-866a-7aaefe569885-cnibin\") pod \"multus-ncrmg\" (UID: \"4b53a341-a257-4e51-866a-7aaefe569885\") " pod="openshift-multus/multus-ncrmg" Apr 16 19:30:17.993401 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.992060 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/b04a0377-865b-45eb-b85e-384e518a7c12-etc-systemd\") pod \"tuned-bppgj\" (UID: \"b04a0377-865b-45eb-b85e-384e518a7c12\") " pod="openshift-cluster-node-tuning-operator/tuned-bppgj" Apr 16 19:30:17.993401 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.992110 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nqhhm\" (UniqueName: \"kubernetes.io/projected/ca23e8db-bb88-449f-8286-27f2978eb0ca-kube-api-access-nqhhm\") pod \"network-metrics-daemon-7mh9f\" (UID: \"ca23e8db-bb88-449f-8286-27f2978eb0ca\") " pod="openshift-multus/network-metrics-daemon-7mh9f" Apr 16 19:30:17.993401 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.992138 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/b04a0377-865b-45eb-b85e-384e518a7c12-etc-modprobe-d\") pod \"tuned-bppgj\" (UID: \"b04a0377-865b-45eb-b85e-384e518a7c12\") " pod="openshift-cluster-node-tuning-operator/tuned-bppgj" Apr 16 19:30:17.993401 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.992152 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f6b1a716-a116-40d7-bd7e-8947f3cfea04-host-slash\") pod \"ovnkube-node-mx7dg\" (UID: 
\"f6b1a716-a116-40d7-bd7e-8947f3cfea04\") " pod="openshift-ovn-kubernetes/ovnkube-node-mx7dg" Apr 16 19:30:17.993401 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.992178 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f6b1a716-a116-40d7-bd7e-8947f3cfea04-ovnkube-script-lib\") pod \"ovnkube-node-mx7dg\" (UID: \"f6b1a716-a116-40d7-bd7e-8947f3cfea04\") " pod="openshift-ovn-kubernetes/ovnkube-node-mx7dg" Apr 16 19:30:17.993401 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.992188 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f6b1a716-a116-40d7-bd7e-8947f3cfea04-host-slash\") pod \"ovnkube-node-mx7dg\" (UID: \"f6b1a716-a116-40d7-bd7e-8947f3cfea04\") " pod="openshift-ovn-kubernetes/ovnkube-node-mx7dg" Apr 16 19:30:17.993401 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.992153 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b04a0377-865b-45eb-b85e-384e518a7c12-run\") pod \"tuned-bppgj\" (UID: \"b04a0377-865b-45eb-b85e-384e518a7c12\") " pod="openshift-cluster-node-tuning-operator/tuned-bppgj" Apr 16 19:30:17.993401 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.992226 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/b04a0377-865b-45eb-b85e-384e518a7c12-etc-sysconfig\") pod \"tuned-bppgj\" (UID: \"b04a0377-865b-45eb-b85e-384e518a7c12\") " pod="openshift-cluster-node-tuning-operator/tuned-bppgj" Apr 16 19:30:17.993401 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.992275 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/b04a0377-865b-45eb-b85e-384e518a7c12-etc-tuned\") pod \"tuned-bppgj\" (UID: \"b04a0377-865b-45eb-b85e-384e518a7c12\") 
" pod="openshift-cluster-node-tuning-operator/tuned-bppgj" Apr 16 19:30:17.993401 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.992276 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f6b1a716-a116-40d7-bd7e-8947f3cfea04-ovnkube-config\") pod \"ovnkube-node-mx7dg\" (UID: \"f6b1a716-a116-40d7-bd7e-8947f3cfea04\") " pod="openshift-ovn-kubernetes/ovnkube-node-mx7dg" Apr 16 19:30:17.993401 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.992278 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4b53a341-a257-4e51-866a-7aaefe569885-multus-daemon-config\") pod \"multus-ncrmg\" (UID: \"4b53a341-a257-4e51-866a-7aaefe569885\") " pod="openshift-multus/multus-ncrmg" Apr 16 19:30:17.993401 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.992302 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/afd027c2-990e-4d6c-b57c-62c9c66ce5f2-original-pull-secret\") pod \"global-pull-secret-syncer-ls7dc\" (UID: \"afd027c2-990e-4d6c-b57c-62c9c66ce5f2\") " pod="kube-system/global-pull-secret-syncer-ls7dc" Apr 16 19:30:17.994245 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.992326 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/b04a0377-865b-45eb-b85e-384e518a7c12-etc-sysconfig\") pod \"tuned-bppgj\" (UID: \"b04a0377-865b-45eb-b85e-384e518a7c12\") " pod="openshift-cluster-node-tuning-operator/tuned-bppgj" Apr 16 19:30:17.994245 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.992347 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f6b1a716-a116-40d7-bd7e-8947f3cfea04-node-log\") pod \"ovnkube-node-mx7dg\" (UID: \"f6b1a716-a116-40d7-bd7e-8947f3cfea04\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-mx7dg" Apr 16 19:30:17.994245 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.992374 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4b53a341-a257-4e51-866a-7aaefe569885-system-cni-dir\") pod \"multus-ncrmg\" (UID: \"4b53a341-a257-4e51-866a-7aaefe569885\") " pod="openshift-multus/multus-ncrmg" Apr 16 19:30:17.994245 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:17.992395 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 19:30:17.994245 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.992401 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f6b1a716-a116-40d7-bd7e-8947f3cfea04-node-log\") pod \"ovnkube-node-mx7dg\" (UID: \"f6b1a716-a116-40d7-bd7e-8947f3cfea04\") " pod="openshift-ovn-kubernetes/ovnkube-node-mx7dg" Apr 16 19:30:17.994245 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.992402 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7r9r9\" (UniqueName: \"kubernetes.io/projected/f6b1a716-a116-40d7-bd7e-8947f3cfea04-kube-api-access-7r9r9\") pod \"ovnkube-node-mx7dg\" (UID: \"f6b1a716-a116-40d7-bd7e-8947f3cfea04\") " pod="openshift-ovn-kubernetes/ovnkube-node-mx7dg" Apr 16 19:30:17.994245 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.992437 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4b53a341-a257-4e51-866a-7aaefe569885-host-run-netns\") pod \"multus-ncrmg\" (UID: \"4b53a341-a257-4e51-866a-7aaefe569885\") " pod="openshift-multus/multus-ncrmg" Apr 16 19:30:17.994245 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.992469 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4b53a341-a257-4e51-866a-7aaefe569885-host-var-lib-cni-bin\") pod \"multus-ncrmg\" (UID: \"4b53a341-a257-4e51-866a-7aaefe569885\") " pod="openshift-multus/multus-ncrmg" Apr 16 19:30:17.994245 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.992500 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/14b86022-39d0-4850-a61d-778f535ea12e-iptables-alerter-script\") pod \"iptables-alerter-j45kn\" (UID: \"14b86022-39d0-4850-a61d-778f535ea12e\") " pod="openshift-network-operator/iptables-alerter-j45kn" Apr 16 19:30:17.994245 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.992530 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f6b1a716-a116-40d7-bd7e-8947f3cfea04-run-openvswitch\") pod \"ovnkube-node-mx7dg\" (UID: \"f6b1a716-a116-40d7-bd7e-8947f3cfea04\") " pod="openshift-ovn-kubernetes/ovnkube-node-mx7dg" Apr 16 19:30:17.994245 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.992560 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f6b1a716-a116-40d7-bd7e-8947f3cfea04-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mx7dg\" (UID: \"f6b1a716-a116-40d7-bd7e-8947f3cfea04\") " pod="openshift-ovn-kubernetes/ovnkube-node-mx7dg" Apr 16 19:30:17.994245 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.992597 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f6b1a716-a116-40d7-bd7e-8947f3cfea04-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mx7dg\" (UID: \"f6b1a716-a116-40d7-bd7e-8947f3cfea04\") " pod="openshift-ovn-kubernetes/ovnkube-node-mx7dg" Apr 16 19:30:17.994245 
ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:17.992594 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/afd027c2-990e-4d6c-b57c-62c9c66ce5f2-original-pull-secret podName:afd027c2-990e-4d6c-b57c-62c9c66ce5f2 nodeName:}" failed. No retries permitted until 2026-04-16 19:30:18.492564149 +0000 UTC m=+2.069342366 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/afd027c2-990e-4d6c-b57c-62c9c66ce5f2-original-pull-secret") pod "global-pull-secret-syncer-ls7dc" (UID: "afd027c2-990e-4d6c-b57c-62c9c66ce5f2") : object "kube-system"/"original-pull-secret" not registered
Apr 16 19:30:17.994245 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.992647 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/4b53a341-a257-4e51-866a-7aaefe569885-host-run-netns\") pod \"multus-ncrmg\" (UID: \"4b53a341-a257-4e51-866a-7aaefe569885\") " pod="openshift-multus/multus-ncrmg"
Apr 16 19:30:17.994245 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.992675 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4b53a341-a257-4e51-866a-7aaefe569885-system-cni-dir\") pod \"multus-ncrmg\" (UID: \"4b53a341-a257-4e51-866a-7aaefe569885\") " pod="openshift-multus/multus-ncrmg"
Apr 16 19:30:17.994245 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.992684 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/4b53a341-a257-4e51-866a-7aaefe569885-host-var-lib-cni-bin\") pod \"multus-ncrmg\" (UID: \"4b53a341-a257-4e51-866a-7aaefe569885\") " pod="openshift-multus/multus-ncrmg"
Apr 16 19:30:17.994245 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.992687 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f6b1a716-a116-40d7-bd7e-8947f3cfea04-ovnkube-script-lib\") pod \"ovnkube-node-mx7dg\" (UID: \"f6b1a716-a116-40d7-bd7e-8947f3cfea04\") " pod="openshift-ovn-kubernetes/ovnkube-node-mx7dg"
Apr 16 19:30:17.994873 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.992704 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f6b1a716-a116-40d7-bd7e-8947f3cfea04-run-openvswitch\") pod \"ovnkube-node-mx7dg\" (UID: \"f6b1a716-a116-40d7-bd7e-8947f3cfea04\") " pod="openshift-ovn-kubernetes/ovnkube-node-mx7dg"
Apr 16 19:30:17.994873 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.992803 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f6b1a716-a116-40d7-bd7e-8947f3cfea04-env-overrides\") pod \"ovnkube-node-mx7dg\" (UID: \"f6b1a716-a116-40d7-bd7e-8947f3cfea04\") " pod="openshift-ovn-kubernetes/ovnkube-node-mx7dg"
Apr 16 19:30:17.994873 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.992838 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/afd027c2-990e-4d6c-b57c-62c9c66ce5f2-dbus\") pod \"global-pull-secret-syncer-ls7dc\" (UID: \"afd027c2-990e-4d6c-b57c-62c9c66ce5f2\") " pod="kube-system/global-pull-secret-syncer-ls7dc"
Apr 16 19:30:17.994873 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.992865 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4b53a341-a257-4e51-866a-7aaefe569885-etc-kubernetes\") pod \"multus-ncrmg\" (UID: \"4b53a341-a257-4e51-866a-7aaefe569885\") " pod="openshift-multus/multus-ncrmg"
Apr 16 19:30:17.994873 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.992890 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b04a0377-865b-45eb-b85e-384e518a7c12-etc-kubernetes\") pod \"tuned-bppgj\" (UID: \"b04a0377-865b-45eb-b85e-384e518a7c12\") " pod="openshift-cluster-node-tuning-operator/tuned-bppgj"
Apr 16 19:30:17.994873 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.992924 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x96nz\" (UniqueName: \"kubernetes.io/projected/17f1ed1c-fff7-4d09-b029-890217b6c115-kube-api-access-x96nz\") pod \"network-check-target-6glh4\" (UID: \"17f1ed1c-fff7-4d09-b029-890217b6c115\") " pod="openshift-network-diagnostics/network-check-target-6glh4"
Apr 16 19:30:17.994873 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.992951 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/14b86022-39d0-4850-a61d-778f535ea12e-host-slash\") pod \"iptables-alerter-j45kn\" (UID: \"14b86022-39d0-4850-a61d-778f535ea12e\") " pod="openshift-network-operator/iptables-alerter-j45kn"
Apr 16 19:30:17.994873 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.992975 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f6b1a716-a116-40d7-bd7e-8947f3cfea04-ovn-node-metrics-cert\") pod \"ovnkube-node-mx7dg\" (UID: \"f6b1a716-a116-40d7-bd7e-8947f3cfea04\") " pod="openshift-ovn-kubernetes/ovnkube-node-mx7dg"
Apr 16 19:30:17.994873 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.993001 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ca23e8db-bb88-449f-8286-27f2978eb0ca-metrics-certs\") pod \"network-metrics-daemon-7mh9f\" (UID: \"ca23e8db-bb88-449f-8286-27f2978eb0ca\") " pod="openshift-multus/network-metrics-daemon-7mh9f"
Apr 16 19:30:17.994873 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.993027 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4b53a341-a257-4e51-866a-7aaefe569885-cni-binary-copy\") pod \"multus-ncrmg\" (UID: \"4b53a341-a257-4e51-866a-7aaefe569885\") " pod="openshift-multus/multus-ncrmg"
Apr 16 19:30:17.994873 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.993055 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/4b53a341-a257-4e51-866a-7aaefe569885-host-var-lib-cni-multus\") pod \"multus-ncrmg\" (UID: \"4b53a341-a257-4e51-866a-7aaefe569885\") " pod="openshift-multus/multus-ncrmg"
Apr 16 19:30:17.994873 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.993081 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4b53a341-a257-4e51-866a-7aaefe569885-multus-conf-dir\") pod \"multus-ncrmg\" (UID: \"4b53a341-a257-4e51-866a-7aaefe569885\") " pod="openshift-multus/multus-ncrmg"
Apr 16 19:30:17.994873 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.993111 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f6b1a716-a116-40d7-bd7e-8947f3cfea04-run-systemd\") pod \"ovnkube-node-mx7dg\" (UID: \"f6b1a716-a116-40d7-bd7e-8947f3cfea04\") " pod="openshift-ovn-kubernetes/ovnkube-node-mx7dg"
Apr 16 19:30:17.994873 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.993154 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/b04a0377-865b-45eb-b85e-384e518a7c12-etc-sysctl-conf\") pod \"tuned-bppgj\" (UID: \"b04a0377-865b-45eb-b85e-384e518a7c12\") " pod="openshift-cluster-node-tuning-operator/tuned-bppgj"
Apr 16 19:30:17.994873 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.993183 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ghvp7\" (UniqueName: \"kubernetes.io/projected/14b86022-39d0-4850-a61d-778f535ea12e-kube-api-access-ghvp7\") pod \"iptables-alerter-j45kn\" (UID: \"14b86022-39d0-4850-a61d-778f535ea12e\") " pod="openshift-network-operator/iptables-alerter-j45kn"
Apr 16 19:30:17.994873 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.993192 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/14b86022-39d0-4850-a61d-778f535ea12e-iptables-alerter-script\") pod \"iptables-alerter-j45kn\" (UID: \"14b86022-39d0-4850-a61d-778f535ea12e\") " pod="openshift-network-operator/iptables-alerter-j45kn"
Apr 16 19:30:17.994873 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.993225 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f6b1a716-a116-40d7-bd7e-8947f3cfea04-systemd-units\") pod \"ovnkube-node-mx7dg\" (UID: \"f6b1a716-a116-40d7-bd7e-8947f3cfea04\") " pod="openshift-ovn-kubernetes/ovnkube-node-mx7dg"
Apr 16 19:30:17.995647 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.993278 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f6b1a716-a116-40d7-bd7e-8947f3cfea04-systemd-units\") pod \"ovnkube-node-mx7dg\" (UID: \"f6b1a716-a116-40d7-bd7e-8947f3cfea04\") " pod="openshift-ovn-kubernetes/ovnkube-node-mx7dg"
Apr 16 19:30:17.995647 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.993290 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f6b1a716-a116-40d7-bd7e-8947f3cfea04-env-overrides\") pod \"ovnkube-node-mx7dg\" (UID: \"f6b1a716-a116-40d7-bd7e-8947f3cfea04\") " pod="openshift-ovn-kubernetes/ovnkube-node-mx7dg"
Apr 16 19:30:17.995647 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.993268 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4b53a341-a257-4e51-866a-7aaefe569885-etc-kubernetes\") pod \"multus-ncrmg\" (UID: \"4b53a341-a257-4e51-866a-7aaefe569885\") " pod="openshift-multus/multus-ncrmg"
Apr 16 19:30:17.995647 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.993325 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b04a0377-865b-45eb-b85e-384e518a7c12-etc-kubernetes\") pod \"tuned-bppgj\" (UID: \"b04a0377-865b-45eb-b85e-384e518a7c12\") " pod="openshift-cluster-node-tuning-operator/tuned-bppgj"
Apr 16 19:30:17.995647 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.993331 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f6b1a716-a116-40d7-bd7e-8947f3cfea04-log-socket\") pod \"ovnkube-node-mx7dg\" (UID: \"f6b1a716-a116-40d7-bd7e-8947f3cfea04\") " pod="openshift-ovn-kubernetes/ovnkube-node-mx7dg"
Apr 16 19:30:17.995647 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.993327 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/14b86022-39d0-4850-a61d-778f535ea12e-host-slash\") pod \"iptables-alerter-j45kn\" (UID: \"14b86022-39d0-4850-a61d-778f535ea12e\") " pod="openshift-network-operator/iptables-alerter-j45kn"
Apr 16 19:30:17.995647 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.993347 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/4b53a341-a257-4e51-866a-7aaefe569885-host-var-lib-cni-multus\") pod \"multus-ncrmg\" (UID: \"4b53a341-a257-4e51-866a-7aaefe569885\") " pod="openshift-multus/multus-ncrmg"
Apr 16 19:30:17.995647 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.993359 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4b53a341-a257-4e51-866a-7aaefe569885-multus-conf-dir\") pod \"multus-ncrmg\" (UID: \"4b53a341-a257-4e51-866a-7aaefe569885\") " pod="openshift-multus/multus-ncrmg"
Apr 16 19:30:17.995647 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.993399 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f6b1a716-a116-40d7-bd7e-8947f3cfea04-run-systemd\") pod \"ovnkube-node-mx7dg\" (UID: \"f6b1a716-a116-40d7-bd7e-8947f3cfea04\") " pod="openshift-ovn-kubernetes/ovnkube-node-mx7dg"
Apr 16 19:30:17.995647 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.993405 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f6b1a716-a116-40d7-bd7e-8947f3cfea04-log-socket\") pod \"ovnkube-node-mx7dg\" (UID: \"f6b1a716-a116-40d7-bd7e-8947f3cfea04\") " pod="openshift-ovn-kubernetes/ovnkube-node-mx7dg"
Apr 16 19:30:17.995647 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.993528 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/b04a0377-865b-45eb-b85e-384e518a7c12-etc-sysctl-conf\") pod \"tuned-bppgj\" (UID: \"b04a0377-865b-45eb-b85e-384e518a7c12\") " pod="openshift-cluster-node-tuning-operator/tuned-bppgj"
Apr 16 19:30:17.995647 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:17.993598 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:30:17.995647 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.993758 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/afd027c2-990e-4d6c-b57c-62c9c66ce5f2-dbus\") pod \"global-pull-secret-syncer-ls7dc\" (UID: \"afd027c2-990e-4d6c-b57c-62c9c66ce5f2\") " pod="kube-system/global-pull-secret-syncer-ls7dc"
Apr 16 19:30:17.995647 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:17.993788 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ca23e8db-bb88-449f-8286-27f2978eb0ca-metrics-certs podName:ca23e8db-bb88-449f-8286-27f2978eb0ca nodeName:}" failed. No retries permitted until 2026-04-16 19:30:18.493768062 +0000 UTC m=+2.070546265 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ca23e8db-bb88-449f-8286-27f2978eb0ca-metrics-certs") pod "network-metrics-daemon-7mh9f" (UID: "ca23e8db-bb88-449f-8286-27f2978eb0ca") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:30:17.995647 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.993850 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4b53a341-a257-4e51-866a-7aaefe569885-cni-binary-copy\") pod \"multus-ncrmg\" (UID: \"4b53a341-a257-4e51-866a-7aaefe569885\") " pod="openshift-multus/multus-ncrmg"
Apr 16 19:30:17.995647 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.993925 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b04a0377-865b-45eb-b85e-384e518a7c12-tmp\") pod \"tuned-bppgj\" (UID: \"b04a0377-865b-45eb-b85e-384e518a7c12\") " pod="openshift-cluster-node-tuning-operator/tuned-bppgj"
Apr 16 19:30:17.995647 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.995618 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/b04a0377-865b-45eb-b85e-384e518a7c12-etc-tuned\") pod \"tuned-bppgj\" (UID: \"b04a0377-865b-45eb-b85e-384e518a7c12\") " pod="openshift-cluster-node-tuning-operator/tuned-bppgj"
Apr 16 19:30:17.996236 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:17.995967 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f6b1a716-a116-40d7-bd7e-8947f3cfea04-ovn-node-metrics-cert\") pod \"ovnkube-node-mx7dg\" (UID: \"f6b1a716-a116-40d7-bd7e-8947f3cfea04\") " pod="openshift-ovn-kubernetes/ovnkube-node-mx7dg"
Apr 16 19:30:17.999417 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:17.999396 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 19:30:17.999417 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:17.999419 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 19:30:17.999619 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:17.999433 2579 projected.go:194] Error preparing data for projected volume kube-api-access-x96nz for pod openshift-network-diagnostics/network-check-target-6glh4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 19:30:17.999619 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:17.999493 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/17f1ed1c-fff7-4d09-b029-890217b6c115-kube-api-access-x96nz podName:17f1ed1c-fff7-4d09-b029-890217b6c115 nodeName:}" failed. No retries permitted until 2026-04-16 19:30:18.499477689 +0000 UTC m=+2.076255901 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-x96nz" (UniqueName: "kubernetes.io/projected/17f1ed1c-fff7-4d09-b029-890217b6c115-kube-api-access-x96nz") pod "network-check-target-6glh4" (UID: "17f1ed1c-fff7-4d09-b029-890217b6c115") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 19:30:18.000429 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:18.000408 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tj8nl\" (UniqueName: \"kubernetes.io/projected/b04a0377-865b-45eb-b85e-384e518a7c12-kube-api-access-tj8nl\") pod \"tuned-bppgj\" (UID: \"b04a0377-865b-45eb-b85e-384e518a7c12\") " pod="openshift-cluster-node-tuning-operator/tuned-bppgj"
Apr 16 19:30:18.000529 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:18.000484 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-95wbx\" (UniqueName: \"kubernetes.io/projected/4b53a341-a257-4e51-866a-7aaefe569885-kube-api-access-95wbx\") pod \"multus-ncrmg\" (UID: \"4b53a341-a257-4e51-866a-7aaefe569885\") " pod="openshift-multus/multus-ncrmg"
Apr 16 19:30:18.001091 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:18.001068 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqhhm\" (UniqueName: \"kubernetes.io/projected/ca23e8db-bb88-449f-8286-27f2978eb0ca-kube-api-access-nqhhm\") pod \"network-metrics-daemon-7mh9f\" (UID: \"ca23e8db-bb88-449f-8286-27f2978eb0ca\") " pod="openshift-multus/network-metrics-daemon-7mh9f"
Apr 16 19:30:18.002161 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:18.002145 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghvp7\" (UniqueName: \"kubernetes.io/projected/14b86022-39d0-4850-a61d-778f535ea12e-kube-api-access-ghvp7\") pod \"iptables-alerter-j45kn\" (UID: \"14b86022-39d0-4850-a61d-778f535ea12e\") " pod="openshift-network-operator/iptables-alerter-j45kn"
Apr 16 19:30:18.002338 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:18.002322 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7r9r9\" (UniqueName: \"kubernetes.io/projected/f6b1a716-a116-40d7-bd7e-8947f3cfea04-kube-api-access-7r9r9\") pod \"ovnkube-node-mx7dg\" (UID: \"f6b1a716-a116-40d7-bd7e-8947f3cfea04\") " pod="openshift-ovn-kubernetes/ovnkube-node-mx7dg"
Apr 16 19:30:18.084051 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:18.084010 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad3ef22c0a8b4cbdc51ac89991654b86.slice/crio-887f540eb45690731bd26e99529ad2e93c24bd446c5aaf89ffd166e1d3b32e18 WatchSource:0}: Error finding container 887f540eb45690731bd26e99529ad2e93c24bd446c5aaf89ffd166e1d3b32e18: Status 404 returned error can't find the container with id 887f540eb45690731bd26e99529ad2e93c24bd446c5aaf89ffd166e1d3b32e18
Apr 16 19:30:18.085412 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:18.085363 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe5a5919c15e8acb1a91eca10420db96.slice/crio-481a9eec580fe0cf1fb7dde34ea868f6170c17e68e7fd0bc9da3d66fce767812 WatchSource:0}: Error finding container 481a9eec580fe0cf1fb7dde34ea868f6170c17e68e7fd0bc9da3d66fce767812: Status 404 returned error can't find the container with id 481a9eec580fe0cf1fb7dde34ea868f6170c17e68e7fd0bc9da3d66fce767812
Apr 16 19:30:18.088411 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:18.088391 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 19:30:18.102146 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:18.102127 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-nh8b7"
Apr 16 19:30:18.107859 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:18.107840 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd478cc2e_78cb_4140_9eaf_2624faf8382b.slice/crio-0d3f9dc271124068ad128bae29158416afafb41e408aa1fe9c538cc635da62fc WatchSource:0}: Error finding container 0d3f9dc271124068ad128bae29158416afafb41e408aa1fe9c538cc635da62fc: Status 404 returned error can't find the container with id 0d3f9dc271124068ad128bae29158416afafb41e408aa1fe9c538cc635da62fc
Apr 16 19:30:18.112757 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:18.112741 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5sjqr"
Apr 16 19:30:18.118111 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:18.118093 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09889d6a_515b_4fb3_acfb_41009ecb5107.slice/crio-268674c010eaf2fbd4446b5bca336536083a1e0519721b4b7881a49d7d725d72 WatchSource:0}: Error finding container 268674c010eaf2fbd4446b5bca336536083a1e0519721b4b7881a49d7d725d72: Status 404 returned error can't find the container with id 268674c010eaf2fbd4446b5bca336536083a1e0519721b4b7881a49d7d725d72
Apr 16 19:30:18.133687 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:18.133671 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-l5xrl"
Apr 16 19:30:18.137164 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:18.137148 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-whspd"
Apr 16 19:30:18.139088 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:18.139065 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9859a426_6968_44ca_b63e_42baba2b957d.slice/crio-70a07fb7f31e5b1158704b0e07657ebcc9bc083637f46fd258bf5d4acd6c25c2 WatchSource:0}: Error finding container 70a07fb7f31e5b1158704b0e07657ebcc9bc083637f46fd258bf5d4acd6c25c2: Status 404 returned error can't find the container with id 70a07fb7f31e5b1158704b0e07657ebcc9bc083637f46fd258bf5d4acd6c25c2
Apr 16 19:30:18.144028 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:18.144008 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c0704bd_2f0f_4e78_8573_cf9346b4ae16.slice/crio-0f406542538519b2bca407a5a8cfb98e570f1663d43084a3d6d4d2d9e637f1a9 WatchSource:0}: Error finding container 0f406542538519b2bca407a5a8cfb98e570f1663d43084a3d6d4d2d9e637f1a9: Status 404 returned error can't find the container with id 0f406542538519b2bca407a5a8cfb98e570f1663d43084a3d6d4d2d9e637f1a9
Apr 16 19:30:18.175709 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:18.175681 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-ncrmg"
Apr 16 19:30:18.181771 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:18.181754 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b53a341_a257_4e51_866a_7aaefe569885.slice/crio-de2f0921c3064313949cc64efa39105b81a42e4ffe1f99f005491ea22e6377f0 WatchSource:0}: Error finding container de2f0921c3064313949cc64efa39105b81a42e4ffe1f99f005491ea22e6377f0: Status 404 returned error can't find the container with id de2f0921c3064313949cc64efa39105b81a42e4ffe1f99f005491ea22e6377f0
Apr 16 19:30:18.205316 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:18.205298 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-j45kn"
Apr 16 19:30:18.209943 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:18.209925 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14b86022_39d0_4850_a61d_778f535ea12e.slice/crio-9d96274b4b894ea999a7b6f51e75aae677a30f2d954865de2105cb72d39388a6 WatchSource:0}: Error finding container 9d96274b4b894ea999a7b6f51e75aae677a30f2d954865de2105cb72d39388a6: Status 404 returned error can't find the container with id 9d96274b4b894ea999a7b6f51e75aae677a30f2d954865de2105cb72d39388a6
Apr 16 19:30:18.216677 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:18.216659 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mx7dg"
Apr 16 19:30:18.221243 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:18.221221 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-bppgj"
Apr 16 19:30:18.221938 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:18.221921 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6b1a716_a116_40d7_bd7e_8947f3cfea04.slice/crio-1961cc11660f425b30bc4eba492f10c6c04d90677953d7530ae6a37741531c92 WatchSource:0}: Error finding container 1961cc11660f425b30bc4eba492f10c6c04d90677953d7530ae6a37741531c92: Status 404 returned error can't find the container with id 1961cc11660f425b30bc4eba492f10c6c04d90677953d7530ae6a37741531c92
Apr 16 19:30:18.226524 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:18.226501 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb04a0377_865b_45eb_b85e_384e518a7c12.slice/crio-f28c0ffab1f8d67a446487258fc0f7c5316e7d4191b00a07739e3ee893c92187 WatchSource:0}: Error finding container f28c0ffab1f8d67a446487258fc0f7c5316e7d4191b00a07739e3ee893c92187: Status 404 returned error can't find the container with id f28c0ffab1f8d67a446487258fc0f7c5316e7d4191b00a07739e3ee893c92187
Apr 16 19:30:18.497758 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:18.497706 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/afd027c2-990e-4d6c-b57c-62c9c66ce5f2-original-pull-secret\") pod \"global-pull-secret-syncer-ls7dc\" (UID: \"afd027c2-990e-4d6c-b57c-62c9c66ce5f2\") " pod="kube-system/global-pull-secret-syncer-ls7dc"
Apr 16 19:30:18.497758 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:18.497753 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ca23e8db-bb88-449f-8286-27f2978eb0ca-metrics-certs\") pod \"network-metrics-daemon-7mh9f\" (UID: \"ca23e8db-bb88-449f-8286-27f2978eb0ca\") " pod="openshift-multus/network-metrics-daemon-7mh9f"
Apr 16 19:30:18.497891 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:18.497837 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 19:30:18.497922 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:18.497890 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/afd027c2-990e-4d6c-b57c-62c9c66ce5f2-original-pull-secret podName:afd027c2-990e-4d6c-b57c-62c9c66ce5f2 nodeName:}" failed. No retries permitted until 2026-04-16 19:30:19.497875188 +0000 UTC m=+3.074653380 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/afd027c2-990e-4d6c-b57c-62c9c66ce5f2-original-pull-secret") pod "global-pull-secret-syncer-ls7dc" (UID: "afd027c2-990e-4d6c-b57c-62c9c66ce5f2") : object "kube-system"/"original-pull-secret" not registered
Apr 16 19:30:18.498024 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:18.497840 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:30:18.498024 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:18.497965 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ca23e8db-bb88-449f-8286-27f2978eb0ca-metrics-certs podName:ca23e8db-bb88-449f-8286-27f2978eb0ca nodeName:}" failed. No retries permitted until 2026-04-16 19:30:19.497954578 +0000 UTC m=+3.074732767 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ca23e8db-bb88-449f-8286-27f2978eb0ca-metrics-certs") pod "network-metrics-daemon-7mh9f" (UID: "ca23e8db-bb88-449f-8286-27f2978eb0ca") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:30:18.598992 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:18.598973 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x96nz\" (UniqueName: \"kubernetes.io/projected/17f1ed1c-fff7-4d09-b029-890217b6c115-kube-api-access-x96nz\") pod \"network-check-target-6glh4\" (UID: \"17f1ed1c-fff7-4d09-b029-890217b6c115\") " pod="openshift-network-diagnostics/network-check-target-6glh4"
Apr 16 19:30:18.599085 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:18.599075 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 19:30:18.599120 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:18.599087 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 19:30:18.599120 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:18.599096 2579 projected.go:194] Error preparing data for projected volume kube-api-access-x96nz for pod openshift-network-diagnostics/network-check-target-6glh4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 19:30:18.599185 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:18.599135 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/17f1ed1c-fff7-4d09-b029-890217b6c115-kube-api-access-x96nz podName:17f1ed1c-fff7-4d09-b029-890217b6c115 nodeName:}" failed. No retries permitted until 2026-04-16 19:30:19.599125091 +0000 UTC m=+3.175903280 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-x96nz" (UniqueName: "kubernetes.io/projected/17f1ed1c-fff7-4d09-b029-890217b6c115-kube-api-access-x96nz") pod "network-check-target-6glh4" (UID: "17f1ed1c-fff7-4d09-b029-890217b6c115") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 19:30:18.854736 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:18.854649 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 19:25:17 +0000 UTC" deadline="2028-01-25 02:03:02.81605291 +0000 UTC"
Apr 16 19:30:18.854736 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:18.854695 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15558h32m43.961362754s"
Apr 16 19:30:18.873465 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:18.873392 2579 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 19:30:18.967130 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:18.967019 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-bppgj" event={"ID":"b04a0377-865b-45eb-b85e-384e518a7c12","Type":"ContainerStarted","Data":"f28c0ffab1f8d67a446487258fc0f7c5316e7d4191b00a07739e3ee893c92187"}
Apr 16 19:30:18.976793 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:18.976278 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-whspd" event={"ID":"6c0704bd-2f0f-4e78-8573-cf9346b4ae16","Type":"ContainerStarted","Data":"0f406542538519b2bca407a5a8cfb98e570f1663d43084a3d6d4d2d9e637f1a9"}
Apr 16 19:30:18.994009 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:18.993938 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-l5xrl" event={"ID":"9859a426-6968-44ca-b63e-42baba2b957d","Type":"ContainerStarted","Data":"70a07fb7f31e5b1158704b0e07657ebcc9bc083637f46fd258bf5d4acd6c25c2"}
Apr 16 19:30:19.003355 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:19.003288 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-155.ec2.internal" event={"ID":"be5a5919c15e8acb1a91eca10420db96","Type":"ContainerStarted","Data":"481a9eec580fe0cf1fb7dde34ea868f6170c17e68e7fd0bc9da3d66fce767812"}
Apr 16 19:30:19.012139 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:19.012110 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-155.ec2.internal" event={"ID":"ad3ef22c0a8b4cbdc51ac89991654b86","Type":"ContainerStarted","Data":"887f540eb45690731bd26e99529ad2e93c24bd446c5aaf89ffd166e1d3b32e18"}
Apr 16 19:30:19.016961 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:19.016939 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mx7dg" event={"ID":"f6b1a716-a116-40d7-bd7e-8947f3cfea04","Type":"ContainerStarted","Data":"1961cc11660f425b30bc4eba492f10c6c04d90677953d7530ae6a37741531c92"}
Apr 16 19:30:19.037328 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:19.037253 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-j45kn" event={"ID":"14b86022-39d0-4850-a61d-778f535ea12e","Type":"ContainerStarted","Data":"9d96274b4b894ea999a7b6f51e75aae677a30f2d954865de2105cb72d39388a6"}
Apr 16 19:30:19.044326 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:19.044301 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ncrmg" event={"ID":"4b53a341-a257-4e51-866a-7aaefe569885","Type":"ContainerStarted","Data":"de2f0921c3064313949cc64efa39105b81a42e4ffe1f99f005491ea22e6377f0"}
Apr 16 19:30:19.047156 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:19.047112 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5sjqr" event={"ID":"09889d6a-515b-4fb3-acfb-41009ecb5107","Type":"ContainerStarted","Data":"268674c010eaf2fbd4446b5bca336536083a1e0519721b4b7881a49d7d725d72"}
Apr 16 19:30:19.055411 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:19.055358 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-nh8b7" event={"ID":"d478cc2e-78cb-4140-9eaf-2624faf8382b","Type":"ContainerStarted","Data":"0d3f9dc271124068ad128bae29158416afafb41e408aa1fe9c538cc635da62fc"}
Apr 16 19:30:19.408617 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:19.408416 2579 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 19:30:19.506182 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:19.505249 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/afd027c2-990e-4d6c-b57c-62c9c66ce5f2-original-pull-secret\") pod \"global-pull-secret-syncer-ls7dc\" (UID: \"afd027c2-990e-4d6c-b57c-62c9c66ce5f2\") " pod="kube-system/global-pull-secret-syncer-ls7dc"
Apr 16 19:30:19.506182 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:19.505423 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ca23e8db-bb88-449f-8286-27f2978eb0ca-metrics-certs\") pod \"network-metrics-daemon-7mh9f\" (UID: \"ca23e8db-bb88-449f-8286-27f2978eb0ca\") " pod="openshift-multus/network-metrics-daemon-7mh9f"
Apr 16 19:30:19.506182 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:19.505579 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:30:19.506182 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:19.505637 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ca23e8db-bb88-449f-8286-27f2978eb0ca-metrics-certs podName:ca23e8db-bb88-449f-8286-27f2978eb0ca nodeName:}" failed. No retries permitted until 2026-04-16 19:30:21.505619148 +0000 UTC m=+5.082397340 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ca23e8db-bb88-449f-8286-27f2978eb0ca-metrics-certs") pod "network-metrics-daemon-7mh9f" (UID: "ca23e8db-bb88-449f-8286-27f2978eb0ca") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:30:19.506182 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:19.506040 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 19:30:19.506182 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:19.506090 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/afd027c2-990e-4d6c-b57c-62c9c66ce5f2-original-pull-secret podName:afd027c2-990e-4d6c-b57c-62c9c66ce5f2 nodeName:}" failed. No retries permitted until 2026-04-16 19:30:21.506074735 +0000 UTC m=+5.082852945 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/afd027c2-990e-4d6c-b57c-62c9c66ce5f2-original-pull-secret") pod "global-pull-secret-syncer-ls7dc" (UID: "afd027c2-990e-4d6c-b57c-62c9c66ce5f2") : object "kube-system"/"original-pull-secret" not registered
Apr 16 19:30:19.606513 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:19.606480 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x96nz\" (UniqueName: \"kubernetes.io/projected/17f1ed1c-fff7-4d09-b029-890217b6c115-kube-api-access-x96nz\") pod \"network-check-target-6glh4\" (UID: \"17f1ed1c-fff7-4d09-b029-890217b6c115\") " pod="openshift-network-diagnostics/network-check-target-6glh4"
Apr 16 19:30:19.606703 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:19.606643 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 19:30:19.606703 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:19.606660 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 19:30:19.606703 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:19.606673 2579 projected.go:194] Error preparing data for projected volume kube-api-access-x96nz for pod openshift-network-diagnostics/network-check-target-6glh4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 19:30:19.606859 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:19.606725 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/17f1ed1c-fff7-4d09-b029-890217b6c115-kube-api-access-x96nz podName:17f1ed1c-fff7-4d09-b029-890217b6c115 nodeName:}" failed.
No retries permitted until 2026-04-16 19:30:21.60670782 +0000 UTC m=+5.183486023 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-x96nz" (UniqueName: "kubernetes.io/projected/17f1ed1c-fff7-4d09-b029-890217b6c115-kube-api-access-x96nz") pod "network-check-target-6glh4" (UID: "17f1ed1c-fff7-4d09-b029-890217b6c115") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:30:19.855798 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:19.855672 2579 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 19:25:17 +0000 UTC" deadline="2028-01-09 01:25:21.155811156 +0000 UTC" Apr 16 19:30:19.855798 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:19.855708 2579 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15173h55m1.300106872s" Apr 16 19:30:19.939707 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:19.939677 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ls7dc" Apr 16 19:30:19.939878 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:19.939799 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ls7dc" podUID="afd027c2-990e-4d6c-b57c-62c9c66ce5f2" Apr 16 19:30:19.940247 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:19.940223 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7mh9f" Apr 16 19:30:19.940370 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:19.940338 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7mh9f" podUID="ca23e8db-bb88-449f-8286-27f2978eb0ca" Apr 16 19:30:19.940433 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:19.940417 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6glh4" Apr 16 19:30:19.940500 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:19.940482 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-6glh4" podUID="17f1ed1c-fff7-4d09-b029-890217b6c115" Apr 16 19:30:21.525226 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:21.524970 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/afd027c2-990e-4d6c-b57c-62c9c66ce5f2-original-pull-secret\") pod \"global-pull-secret-syncer-ls7dc\" (UID: \"afd027c2-990e-4d6c-b57c-62c9c66ce5f2\") " pod="kube-system/global-pull-secret-syncer-ls7dc" Apr 16 19:30:21.525646 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:21.525276 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ca23e8db-bb88-449f-8286-27f2978eb0ca-metrics-certs\") pod \"network-metrics-daemon-7mh9f\" (UID: \"ca23e8db-bb88-449f-8286-27f2978eb0ca\") " pod="openshift-multus/network-metrics-daemon-7mh9f" Apr 16 19:30:21.525646 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:21.525443 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:30:21.525646 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:21.525511 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ca23e8db-bb88-449f-8286-27f2978eb0ca-metrics-certs podName:ca23e8db-bb88-449f-8286-27f2978eb0ca nodeName:}" failed. No retries permitted until 2026-04-16 19:30:25.525491414 +0000 UTC m=+9.102269605 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ca23e8db-bb88-449f-8286-27f2978eb0ca-metrics-certs") pod "network-metrics-daemon-7mh9f" (UID: "ca23e8db-bb88-449f-8286-27f2978eb0ca") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:30:21.525829 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:21.525762 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 19:30:21.525829 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:21.525811 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/afd027c2-990e-4d6c-b57c-62c9c66ce5f2-original-pull-secret podName:afd027c2-990e-4d6c-b57c-62c9c66ce5f2 nodeName:}" failed. No retries permitted until 2026-04-16 19:30:25.525796981 +0000 UTC m=+9.102575184 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/afd027c2-990e-4d6c-b57c-62c9c66ce5f2-original-pull-secret") pod "global-pull-secret-syncer-ls7dc" (UID: "afd027c2-990e-4d6c-b57c-62c9c66ce5f2") : object "kube-system"/"original-pull-secret" not registered Apr 16 19:30:21.625811 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:21.625778 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x96nz\" (UniqueName: \"kubernetes.io/projected/17f1ed1c-fff7-4d09-b029-890217b6c115-kube-api-access-x96nz\") pod \"network-check-target-6glh4\" (UID: \"17f1ed1c-fff7-4d09-b029-890217b6c115\") " pod="openshift-network-diagnostics/network-check-target-6glh4" Apr 16 19:30:21.626006 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:21.625984 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 19:30:21.626006 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:21.626006 2579 
projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 19:30:21.626155 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:21.626017 2579 projected.go:194] Error preparing data for projected volume kube-api-access-x96nz for pod openshift-network-diagnostics/network-check-target-6glh4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:30:21.626155 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:21.626071 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/17f1ed1c-fff7-4d09-b029-890217b6c115-kube-api-access-x96nz podName:17f1ed1c-fff7-4d09-b029-890217b6c115 nodeName:}" failed. No retries permitted until 2026-04-16 19:30:25.62605279 +0000 UTC m=+9.202830994 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-x96nz" (UniqueName: "kubernetes.io/projected/17f1ed1c-fff7-4d09-b029-890217b6c115-kube-api-access-x96nz") pod "network-check-target-6glh4" (UID: "17f1ed1c-fff7-4d09-b029-890217b6c115") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:30:21.939337 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:21.939252 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ls7dc" Apr 16 19:30:21.939491 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:21.939258 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6glh4" Apr 16 19:30:21.939556 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:21.939511 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6glh4" podUID="17f1ed1c-fff7-4d09-b029-890217b6c115" Apr 16 19:30:21.939556 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:21.939391 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ls7dc" podUID="afd027c2-990e-4d6c-b57c-62c9c66ce5f2" Apr 16 19:30:21.939556 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:21.939258 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7mh9f" Apr 16 19:30:21.939689 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:21.939646 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7mh9f" podUID="ca23e8db-bb88-449f-8286-27f2978eb0ca" Apr 16 19:30:23.939392 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:23.939358 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6glh4" Apr 16 19:30:23.939839 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:23.939491 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6glh4" podUID="17f1ed1c-fff7-4d09-b029-890217b6c115" Apr 16 19:30:23.939913 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:23.939884 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ls7dc" Apr 16 19:30:23.940184 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:23.939985 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ls7dc" podUID="afd027c2-990e-4d6c-b57c-62c9c66ce5f2" Apr 16 19:30:23.940184 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:23.940054 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7mh9f" Apr 16 19:30:23.940184 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:23.940139 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7mh9f" podUID="ca23e8db-bb88-449f-8286-27f2978eb0ca" Apr 16 19:30:25.557672 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:25.557532 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/afd027c2-990e-4d6c-b57c-62c9c66ce5f2-original-pull-secret\") pod \"global-pull-secret-syncer-ls7dc\" (UID: \"afd027c2-990e-4d6c-b57c-62c9c66ce5f2\") " pod="kube-system/global-pull-secret-syncer-ls7dc" Apr 16 19:30:25.557672 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:25.557602 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ca23e8db-bb88-449f-8286-27f2978eb0ca-metrics-certs\") pod \"network-metrics-daemon-7mh9f\" (UID: \"ca23e8db-bb88-449f-8286-27f2978eb0ca\") " pod="openshift-multus/network-metrics-daemon-7mh9f" Apr 16 19:30:25.558163 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:25.557697 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 19:30:25.558163 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:25.557749 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:30:25.558163 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:25.557774 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/afd027c2-990e-4d6c-b57c-62c9c66ce5f2-original-pull-secret podName:afd027c2-990e-4d6c-b57c-62c9c66ce5f2 nodeName:}" failed. No retries permitted until 2026-04-16 19:30:33.557752524 +0000 UTC m=+17.134530726 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/afd027c2-990e-4d6c-b57c-62c9c66ce5f2-original-pull-secret") pod "global-pull-secret-syncer-ls7dc" (UID: "afd027c2-990e-4d6c-b57c-62c9c66ce5f2") : object "kube-system"/"original-pull-secret" not registered Apr 16 19:30:25.558163 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:25.557805 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ca23e8db-bb88-449f-8286-27f2978eb0ca-metrics-certs podName:ca23e8db-bb88-449f-8286-27f2978eb0ca nodeName:}" failed. No retries permitted until 2026-04-16 19:30:33.557787762 +0000 UTC m=+17.134565956 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ca23e8db-bb88-449f-8286-27f2978eb0ca-metrics-certs") pod "network-metrics-daemon-7mh9f" (UID: "ca23e8db-bb88-449f-8286-27f2978eb0ca") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:30:25.657986 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:25.657953 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x96nz\" (UniqueName: \"kubernetes.io/projected/17f1ed1c-fff7-4d09-b029-890217b6c115-kube-api-access-x96nz\") pod \"network-check-target-6glh4\" (UID: \"17f1ed1c-fff7-4d09-b029-890217b6c115\") " pod="openshift-network-diagnostics/network-check-target-6glh4" Apr 16 19:30:25.658168 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:25.658154 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 19:30:25.658281 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:25.658173 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 19:30:25.658281 ip-10-0-129-155 
kubenswrapper[2579]: E0416 19:30:25.658185 2579 projected.go:194] Error preparing data for projected volume kube-api-access-x96nz for pod openshift-network-diagnostics/network-check-target-6glh4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:30:25.658281 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:25.658264 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/17f1ed1c-fff7-4d09-b029-890217b6c115-kube-api-access-x96nz podName:17f1ed1c-fff7-4d09-b029-890217b6c115 nodeName:}" failed. No retries permitted until 2026-04-16 19:30:33.658245833 +0000 UTC m=+17.235024024 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-x96nz" (UniqueName: "kubernetes.io/projected/17f1ed1c-fff7-4d09-b029-890217b6c115-kube-api-access-x96nz") pod "network-check-target-6glh4" (UID: "17f1ed1c-fff7-4d09-b029-890217b6c115") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:30:25.939469 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:25.939438 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ls7dc" Apr 16 19:30:25.939652 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:25.939564 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ls7dc" podUID="afd027c2-990e-4d6c-b57c-62c9c66ce5f2" Apr 16 19:30:25.939887 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:25.939863 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7mh9f" Apr 16 19:30:25.939994 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:25.939978 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7mh9f" podUID="ca23e8db-bb88-449f-8286-27f2978eb0ca" Apr 16 19:30:25.940054 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:25.940005 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6glh4" Apr 16 19:30:25.940102 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:25.940061 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6glh4" podUID="17f1ed1c-fff7-4d09-b029-890217b6c115" Apr 16 19:30:27.939342 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:27.939317 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6glh4" Apr 16 19:30:27.939751 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:27.939341 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-ls7dc" Apr 16 19:30:27.939751 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:27.939427 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6glh4" podUID="17f1ed1c-fff7-4d09-b029-890217b6c115" Apr 16 19:30:27.939751 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:27.939317 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7mh9f" Apr 16 19:30:27.939751 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:27.939581 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ls7dc" podUID="afd027c2-990e-4d6c-b57c-62c9c66ce5f2" Apr 16 19:30:27.939751 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:27.939686 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7mh9f" podUID="ca23e8db-bb88-449f-8286-27f2978eb0ca" Apr 16 19:30:28.080123 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:28.080011 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-bppgj" event={"ID":"b04a0377-865b-45eb-b85e-384e518a7c12","Type":"ContainerStarted","Data":"a33eceb96ee840818c5d0437cc1bb7643308d80acee719cc1edbb3413a150a67"} Apr 16 19:30:28.082089 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:28.082058 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-155.ec2.internal" event={"ID":"ad3ef22c0a8b4cbdc51ac89991654b86","Type":"ContainerStarted","Data":"4327a3ba47435b75359a44420d419be6f7b8ca9fa0197f947c55fcd648b71459"} Apr 16 19:30:28.107096 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:28.107045 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-bppgj" podStartSLOduration=1.501988598 podStartE2EDuration="11.107029642s" podCreationTimestamp="2026-04-16 19:30:17 +0000 UTC" firstStartedPulling="2026-04-16 19:30:18.227881639 +0000 UTC m=+1.804659828" lastFinishedPulling="2026-04-16 19:30:27.832922679 +0000 UTC m=+11.409700872" observedRunningTime="2026-04-16 19:30:28.095268511 +0000 UTC m=+11.672046723" watchObservedRunningTime="2026-04-16 19:30:28.107029642 +0000 UTC m=+11.683807853" Apr 16 19:30:28.107444 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:28.107403 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-155.ec2.internal" podStartSLOduration=11.107389674 podStartE2EDuration="11.107389674s" podCreationTimestamp="2026-04-16 19:30:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:30:28.107252995 +0000 UTC m=+11.684031209" 
watchObservedRunningTime="2026-04-16 19:30:28.107389674 +0000 UTC m=+11.684167886" Apr 16 19:30:29.086041 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:29.085799 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5sjqr" event={"ID":"09889d6a-515b-4fb3-acfb-41009ecb5107","Type":"ContainerStarted","Data":"bc940ba817cd7e3322760f77a4789790ed6ea137e0be745765b2f7a094f508dc"} Apr 16 19:30:29.087303 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:29.087272 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-nh8b7" event={"ID":"d478cc2e-78cb-4140-9eaf-2624faf8382b","Type":"ContainerStarted","Data":"81f80e3695af5d0fd050f6cf9c063600ca913c85f8aee17dadaf665f2b605ebb"} Apr 16 19:30:29.088699 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:29.088671 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-whspd" event={"ID":"6c0704bd-2f0f-4e78-8573-cf9346b4ae16","Type":"ContainerStarted","Data":"dfcaf39e5bfd94e3018b4bdfc8383cb7d024d4aa0bdbf6a1be10b51ec34b7902"} Apr 16 19:30:29.090045 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:29.090012 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-l5xrl" event={"ID":"9859a426-6968-44ca-b63e-42baba2b957d","Type":"ContainerStarted","Data":"ddf1fe337085ad0de21a0df9b4fd2f88c224a7f3cff79bf3e996726a7c555fb4"} Apr 16 19:30:29.091658 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:29.091631 2579 generic.go:358] "Generic (PLEG): container finished" podID="be5a5919c15e8acb1a91eca10420db96" containerID="3298bd765be6e4ea24a84093f34a8a30294929d728450837f4742e2221a16971" exitCode=0 Apr 16 19:30:29.091763 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:29.091726 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-155.ec2.internal" 
event={"ID":"be5a5919c15e8acb1a91eca10420db96","Type":"ContainerDied","Data":"3298bd765be6e4ea24a84093f34a8a30294929d728450837f4742e2221a16971"} Apr 16 19:30:29.091845 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:29.091817 2579 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-155.ec2.internal" Apr 16 19:30:29.101767 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:29.101734 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-nh8b7" podStartSLOduration=3.404336043 podStartE2EDuration="13.101722331s" podCreationTimestamp="2026-04-16 19:30:16 +0000 UTC" firstStartedPulling="2026-04-16 19:30:18.109344624 +0000 UTC m=+1.686122814" lastFinishedPulling="2026-04-16 19:30:27.806730905 +0000 UTC m=+11.383509102" observedRunningTime="2026-04-16 19:30:29.100092106 +0000 UTC m=+12.676870317" watchObservedRunningTime="2026-04-16 19:30:29.101722331 +0000 UTC m=+12.678500542" Apr 16 19:30:29.102900 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:29.102874 2579 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 19:30:29.103695 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:29.103676 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-155.ec2.internal"] Apr 16 19:30:29.111759 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:29.111722 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-l5xrl" podStartSLOduration=2.468718661 podStartE2EDuration="12.111707084s" podCreationTimestamp="2026-04-16 19:30:17 +0000 UTC" firstStartedPulling="2026-04-16 19:30:18.140751543 +0000 UTC m=+1.717529733" lastFinishedPulling="2026-04-16 19:30:27.783739958 +0000 UTC m=+11.360518156" 
observedRunningTime="2026-04-16 19:30:29.111536991 +0000 UTC m=+12.688315203" watchObservedRunningTime="2026-04-16 19:30:29.111707084 +0000 UTC m=+12.688485296" Apr 16 19:30:29.933175 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:29.933097 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-nh8b7" Apr 16 19:30:29.933832 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:29.933810 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-nh8b7" Apr 16 19:30:29.938819 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:29.938795 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6glh4" Apr 16 19:30:29.938914 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:29.938829 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7mh9f" Apr 16 19:30:29.938914 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:29.938834 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ls7dc" Apr 16 19:30:29.939012 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:29.938919 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-6glh4" podUID="17f1ed1c-fff7-4d09-b029-890217b6c115" Apr 16 19:30:29.939084 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:29.939045 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7mh9f" podUID="ca23e8db-bb88-449f-8286-27f2978eb0ca" Apr 16 19:30:29.939163 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:29.939138 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ls7dc" podUID="afd027c2-990e-4d6c-b57c-62c9c66ce5f2" Apr 16 19:30:30.094506 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:30.094454 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-j45kn" event={"ID":"14b86022-39d0-4850-a61d-778f535ea12e","Type":"ContainerStarted","Data":"a9e39939f787475f7f6fa17879d66ecb23ec6835a52866b702cc5ad9977ae901"} Apr 16 19:30:31.095804 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:31.095777 2579 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 19:30:31.939327 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:31.939293 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ls7dc" Apr 16 19:30:31.939490 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:31.939299 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6glh4" Apr 16 19:30:31.939490 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:31.939393 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ls7dc" podUID="afd027c2-990e-4d6c-b57c-62c9c66ce5f2" Apr 16 19:30:31.939490 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:31.939304 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7mh9f" Apr 16 19:30:31.939630 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:31.939480 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6glh4" podUID="17f1ed1c-fff7-4d09-b029-890217b6c115" Apr 16 19:30:31.939630 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:31.939601 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7mh9f" podUID="ca23e8db-bb88-449f-8286-27f2978eb0ca" Apr 16 19:30:33.101441 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:33.101401 2579 generic.go:358] "Generic (PLEG): container finished" podID="6c0704bd-2f0f-4e78-8573-cf9346b4ae16" containerID="dfcaf39e5bfd94e3018b4bdfc8383cb7d024d4aa0bdbf6a1be10b51ec34b7902" exitCode=0 Apr 16 19:30:33.102228 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:33.101460 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-whspd" event={"ID":"6c0704bd-2f0f-4e78-8573-cf9346b4ae16","Type":"ContainerDied","Data":"dfcaf39e5bfd94e3018b4bdfc8383cb7d024d4aa0bdbf6a1be10b51ec34b7902"} Apr 16 19:30:33.120788 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:33.120729 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-j45kn" podStartSLOduration=6.518365285 podStartE2EDuration="16.120712775s" podCreationTimestamp="2026-04-16 19:30:17 +0000 UTC" firstStartedPulling="2026-04-16 19:30:18.211277391 +0000 UTC m=+1.788055580" lastFinishedPulling="2026-04-16 19:30:27.813624864 +0000 UTC m=+11.390403070" observedRunningTime="2026-04-16 19:30:30.108384176 +0000 UTC m=+13.685162388" watchObservedRunningTime="2026-04-16 19:30:33.120712775 +0000 UTC m=+16.697490987" Apr 16 19:30:33.620085 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:33.620053 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ca23e8db-bb88-449f-8286-27f2978eb0ca-metrics-certs\") pod \"network-metrics-daemon-7mh9f\" (UID: \"ca23e8db-bb88-449f-8286-27f2978eb0ca\") " pod="openshift-multus/network-metrics-daemon-7mh9f" Apr 16 19:30:33.620314 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:33.620119 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: 
\"kubernetes.io/secret/afd027c2-990e-4d6c-b57c-62c9c66ce5f2-original-pull-secret\") pod \"global-pull-secret-syncer-ls7dc\" (UID: \"afd027c2-990e-4d6c-b57c-62c9c66ce5f2\") " pod="kube-system/global-pull-secret-syncer-ls7dc" Apr 16 19:30:33.620314 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:33.620198 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 19:30:33.620314 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:33.620280 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/afd027c2-990e-4d6c-b57c-62c9c66ce5f2-original-pull-secret podName:afd027c2-990e-4d6c-b57c-62c9c66ce5f2 nodeName:}" failed. No retries permitted until 2026-04-16 19:30:49.620263213 +0000 UTC m=+33.197041403 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/afd027c2-990e-4d6c-b57c-62c9c66ce5f2-original-pull-secret") pod "global-pull-secret-syncer-ls7dc" (UID: "afd027c2-990e-4d6c-b57c-62c9c66ce5f2") : object "kube-system"/"original-pull-secret" not registered Apr 16 19:30:33.620314 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:33.620198 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:30:33.620532 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:33.620357 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ca23e8db-bb88-449f-8286-27f2978eb0ca-metrics-certs podName:ca23e8db-bb88-449f-8286-27f2978eb0ca nodeName:}" failed. No retries permitted until 2026-04-16 19:30:49.620342838 +0000 UTC m=+33.197121041 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ca23e8db-bb88-449f-8286-27f2978eb0ca-metrics-certs") pod "network-metrics-daemon-7mh9f" (UID: "ca23e8db-bb88-449f-8286-27f2978eb0ca") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 19:30:33.720817 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:33.720786 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x96nz\" (UniqueName: \"kubernetes.io/projected/17f1ed1c-fff7-4d09-b029-890217b6c115-kube-api-access-x96nz\") pod \"network-check-target-6glh4\" (UID: \"17f1ed1c-fff7-4d09-b029-890217b6c115\") " pod="openshift-network-diagnostics/network-check-target-6glh4" Apr 16 19:30:33.720968 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:33.720951 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 19:30:33.720968 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:33.720969 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 19:30:33.721068 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:33.720978 2579 projected.go:194] Error preparing data for projected volume kube-api-access-x96nz for pod openshift-network-diagnostics/network-check-target-6glh4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:30:33.721068 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:33.721026 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/17f1ed1c-fff7-4d09-b029-890217b6c115-kube-api-access-x96nz podName:17f1ed1c-fff7-4d09-b029-890217b6c115 nodeName:}" failed. 
No retries permitted until 2026-04-16 19:30:49.721010197 +0000 UTC m=+33.297788406 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-x96nz" (UniqueName: "kubernetes.io/projected/17f1ed1c-fff7-4d09-b029-890217b6c115-kube-api-access-x96nz") pod "network-check-target-6glh4" (UID: "17f1ed1c-fff7-4d09-b029-890217b6c115") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 19:30:33.939337 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:33.939310 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6glh4" Apr 16 19:30:33.939495 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:33.939347 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ls7dc" Apr 16 19:30:33.939495 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:33.939313 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7mh9f" Apr 16 19:30:33.939495 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:33.939434 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6glh4" podUID="17f1ed1c-fff7-4d09-b029-890217b6c115" Apr 16 19:30:33.939626 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:33.939526 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7mh9f" podUID="ca23e8db-bb88-449f-8286-27f2978eb0ca" Apr 16 19:30:33.939626 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:33.939614 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ls7dc" podUID="afd027c2-990e-4d6c-b57c-62c9c66ce5f2" Apr 16 19:30:35.939038 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:35.939002 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6glh4" Apr 16 19:30:35.939492 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:35.939043 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ls7dc" Apr 16 19:30:35.939492 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:35.939104 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6glh4" podUID="17f1ed1c-fff7-4d09-b029-890217b6c115" Apr 16 19:30:35.939492 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:35.939160 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-ls7dc" podUID="afd027c2-990e-4d6c-b57c-62c9c66ce5f2" Apr 16 19:30:35.939492 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:35.939169 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7mh9f" Apr 16 19:30:35.939492 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:35.939296 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7mh9f" podUID="ca23e8db-bb88-449f-8286-27f2978eb0ca" Apr 16 19:30:37.117616 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:37.115949 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-155.ec2.internal" event={"ID":"be5a5919c15e8acb1a91eca10420db96","Type":"ContainerStarted","Data":"4631c3905282d5ce3d93fbf064b34cbecbc865c80874c9e46c2b98eae620182f"} Apr 16 19:30:37.120968 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:37.119178 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mx7dg" event={"ID":"f6b1a716-a116-40d7-bd7e-8947f3cfea04","Type":"ContainerStarted","Data":"9d565221f73c5ce50b43fff3aae9e02c0305b959758cc4be2dbe00588fe7e169"} Apr 16 19:30:37.124234 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:37.124180 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-ncrmg" event={"ID":"4b53a341-a257-4e51-866a-7aaefe569885","Type":"ContainerStarted","Data":"16da1b74d245026fa009b22dcec95924dfd5b1ea578a4f1ed5c5fb4879888477"} Apr 16 19:30:37.152994 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:37.152779 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/multus-ncrmg" podStartSLOduration=1.314513238 podStartE2EDuration="20.152761267s" podCreationTimestamp="2026-04-16 19:30:17 +0000 UTC" firstStartedPulling="2026-04-16 19:30:18.183065317 +0000 UTC m=+1.759843506" lastFinishedPulling="2026-04-16 19:30:37.021313332 +0000 UTC m=+20.598091535" observedRunningTime="2026-04-16 19:30:37.15249496 +0000 UTC m=+20.729273172" watchObservedRunningTime="2026-04-16 19:30:37.152761267 +0000 UTC m=+20.729539479" Apr 16 19:30:37.154772 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:37.153590 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-155.ec2.internal" podStartSLOduration=8.153575919 podStartE2EDuration="8.153575919s" podCreationTimestamp="2026-04-16 19:30:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:30:37.13328977 +0000 UTC m=+20.710067994" watchObservedRunningTime="2026-04-16 19:30:37.153575919 +0000 UTC m=+20.730354131" Apr 16 19:30:37.186373 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:37.186329 2579 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 19:30:37.864668 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:37.864562 2579 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T19:30:37.186349433Z","UUID":"27d3c06f-98ab-486c-a2b6-066696bbd75f","Handler":null,"Name":"","Endpoint":""} Apr 16 19:30:37.866639 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:37.866283 2579 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 19:30:37.866639 
ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:37.866309 2579 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 19:30:37.939552 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:37.939404 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6glh4" Apr 16 19:30:37.939552 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:37.939442 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ls7dc" Apr 16 19:30:37.939552 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:37.939515 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6glh4" podUID="17f1ed1c-fff7-4d09-b029-890217b6c115" Apr 16 19:30:37.939552 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:37.939407 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7mh9f" Apr 16 19:30:37.939852 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:37.939636 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-ls7dc" podUID="afd027c2-990e-4d6c-b57c-62c9c66ce5f2" Apr 16 19:30:37.939852 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:37.939740 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7mh9f" podUID="ca23e8db-bb88-449f-8286-27f2978eb0ca" Apr 16 19:30:38.128341 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:38.128246 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5sjqr" event={"ID":"09889d6a-515b-4fb3-acfb-41009ecb5107","Type":"ContainerStarted","Data":"da0f86155a4b2fec42c8b2c5a371c475a664e80930bd252c655d5812e8dcb857"} Apr 16 19:30:38.131256 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:38.131230 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mx7dg" event={"ID":"f6b1a716-a116-40d7-bd7e-8947f3cfea04","Type":"ContainerStarted","Data":"a0e8e0f162366c7b199545521cda124e77b11fc7ac3173492023ba46014b61ca"} Apr 16 19:30:38.131377 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:38.131262 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mx7dg" event={"ID":"f6b1a716-a116-40d7-bd7e-8947f3cfea04","Type":"ContainerStarted","Data":"774ad5613c690ff16fb9e4aa0093fb05b7854e758f8631ad527ea862265c5370"} Apr 16 19:30:38.131377 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:38.131275 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mx7dg" event={"ID":"f6b1a716-a116-40d7-bd7e-8947f3cfea04","Type":"ContainerStarted","Data":"cb24cec65c22d96042760ee31c4fdb420cf389e901008bfdb4655ab5e5e0c691"} Apr 16 19:30:38.131377 ip-10-0-129-155 
kubenswrapper[2579]: I0416 19:30:38.131288 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mx7dg" event={"ID":"f6b1a716-a116-40d7-bd7e-8947f3cfea04","Type":"ContainerStarted","Data":"24a90c15fa940ed704c51e6c130df7496ed61835bc21d345f36ff3474931e1e4"} Apr 16 19:30:38.131377 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:38.131300 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mx7dg" event={"ID":"f6b1a716-a116-40d7-bd7e-8947f3cfea04","Type":"ContainerStarted","Data":"918352fad3295f8de82a7301204daad2bbe3b85ba8003d5d5441d1aabbdc6686"} Apr 16 19:30:38.574391 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:38.574363 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-stc5r"] Apr 16 19:30:38.577056 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:38.577040 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-stc5r" Apr 16 19:30:38.579633 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:38.579613 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 19:30:38.579735 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:38.579697 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-fr6fn\"" Apr 16 19:30:38.579956 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:38.579943 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 19:30:38.663155 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:38.663129 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b153866d-121b-4ac1-a27e-c2aea8f9de02-tmp-dir\") pod \"node-resolver-stc5r\" (UID: \"b153866d-121b-4ac1-a27e-c2aea8f9de02\") " 
pod="openshift-dns/node-resolver-stc5r" Apr 16 19:30:38.663454 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:38.663163 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b153866d-121b-4ac1-a27e-c2aea8f9de02-hosts-file\") pod \"node-resolver-stc5r\" (UID: \"b153866d-121b-4ac1-a27e-c2aea8f9de02\") " pod="openshift-dns/node-resolver-stc5r" Apr 16 19:30:38.663454 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:38.663218 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqrdq\" (UniqueName: \"kubernetes.io/projected/b153866d-121b-4ac1-a27e-c2aea8f9de02-kube-api-access-vqrdq\") pod \"node-resolver-stc5r\" (UID: \"b153866d-121b-4ac1-a27e-c2aea8f9de02\") " pod="openshift-dns/node-resolver-stc5r" Apr 16 19:30:38.763706 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:38.763676 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b153866d-121b-4ac1-a27e-c2aea8f9de02-tmp-dir\") pod \"node-resolver-stc5r\" (UID: \"b153866d-121b-4ac1-a27e-c2aea8f9de02\") " pod="openshift-dns/node-resolver-stc5r" Apr 16 19:30:38.763706 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:38.763708 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b153866d-121b-4ac1-a27e-c2aea8f9de02-hosts-file\") pod \"node-resolver-stc5r\" (UID: \"b153866d-121b-4ac1-a27e-c2aea8f9de02\") " pod="openshift-dns/node-resolver-stc5r" Apr 16 19:30:38.763911 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:38.763783 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b153866d-121b-4ac1-a27e-c2aea8f9de02-hosts-file\") pod \"node-resolver-stc5r\" (UID: \"b153866d-121b-4ac1-a27e-c2aea8f9de02\") " 
pod="openshift-dns/node-resolver-stc5r" Apr 16 19:30:38.763911 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:38.763811 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vqrdq\" (UniqueName: \"kubernetes.io/projected/b153866d-121b-4ac1-a27e-c2aea8f9de02-kube-api-access-vqrdq\") pod \"node-resolver-stc5r\" (UID: \"b153866d-121b-4ac1-a27e-c2aea8f9de02\") " pod="openshift-dns/node-resolver-stc5r" Apr 16 19:30:38.764011 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:38.763988 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b153866d-121b-4ac1-a27e-c2aea8f9de02-tmp-dir\") pod \"node-resolver-stc5r\" (UID: \"b153866d-121b-4ac1-a27e-c2aea8f9de02\") " pod="openshift-dns/node-resolver-stc5r" Apr 16 19:30:38.775275 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:38.775250 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqrdq\" (UniqueName: \"kubernetes.io/projected/b153866d-121b-4ac1-a27e-c2aea8f9de02-kube-api-access-vqrdq\") pod \"node-resolver-stc5r\" (UID: \"b153866d-121b-4ac1-a27e-c2aea8f9de02\") " pod="openshift-dns/node-resolver-stc5r" Apr 16 19:30:38.885495 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:38.885429 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-stc5r" Apr 16 19:30:38.893525 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:38.893494 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb153866d_121b_4ac1_a27e_c2aea8f9de02.slice/crio-8f6cb0fcdb0deba8836c5e588cb2c2d00ac94d3bc684f1b62c44e3028f6ba985 WatchSource:0}: Error finding container 8f6cb0fcdb0deba8836c5e588cb2c2d00ac94d3bc684f1b62c44e3028f6ba985: Status 404 returned error can't find the container with id 8f6cb0fcdb0deba8836c5e588cb2c2d00ac94d3bc684f1b62c44e3028f6ba985 Apr 16 19:30:39.134875 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:39.134843 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5sjqr" event={"ID":"09889d6a-515b-4fb3-acfb-41009ecb5107","Type":"ContainerStarted","Data":"642946063dfe5446c8bb76a05d31fe4b0947cb4427a80e1d020e9ead1f9a8c31"} Apr 16 19:30:39.136198 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:39.136154 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-stc5r" event={"ID":"b153866d-121b-4ac1-a27e-c2aea8f9de02","Type":"ContainerStarted","Data":"b4b650422b415c8fccba59c02cc5d41470438a0163544e3105bc632ee5447e59"} Apr 16 19:30:39.136198 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:39.136179 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-stc5r" event={"ID":"b153866d-121b-4ac1-a27e-c2aea8f9de02","Type":"ContainerStarted","Data":"8f6cb0fcdb0deba8836c5e588cb2c2d00ac94d3bc684f1b62c44e3028f6ba985"} Apr 16 19:30:39.154311 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:39.154261 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5sjqr" podStartSLOduration=3.055010664 podStartE2EDuration="23.154245967s" podCreationTimestamp="2026-04-16 19:30:16 +0000 UTC" 
firstStartedPulling="2026-04-16 19:30:18.119553206 +0000 UTC m=+1.696331395" lastFinishedPulling="2026-04-16 19:30:38.218788505 +0000 UTC m=+21.795566698" observedRunningTime="2026-04-16 19:30:39.151091446 +0000 UTC m=+22.727869661" watchObservedRunningTime="2026-04-16 19:30:39.154245967 +0000 UTC m=+22.731024177" Apr 16 19:30:39.163106 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:39.163063 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-stc5r" podStartSLOduration=1.163052042 podStartE2EDuration="1.163052042s" podCreationTimestamp="2026-04-16 19:30:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:30:39.162637019 +0000 UTC m=+22.739415231" watchObservedRunningTime="2026-04-16 19:30:39.163052042 +0000 UTC m=+22.739830252" Apr 16 19:30:39.939626 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:39.939594 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6glh4" Apr 16 19:30:39.939626 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:39.939622 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ls7dc" Apr 16 19:30:39.939831 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:39.939697 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6glh4" podUID="17f1ed1c-fff7-4d09-b029-890217b6c115" Apr 16 19:30:39.939831 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:39.939736 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7mh9f"
Apr 16 19:30:39.939831 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:39.939813 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7mh9f" podUID="ca23e8db-bb88-449f-8286-27f2978eb0ca"
Apr 16 19:30:39.939952 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:39.939867 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ls7dc" podUID="afd027c2-990e-4d6c-b57c-62c9c66ce5f2"
Apr 16 19:30:40.140811 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:40.140775 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mx7dg" event={"ID":"f6b1a716-a116-40d7-bd7e-8947f3cfea04","Type":"ContainerStarted","Data":"1f2d68b9be3498651ece7ceca5b418cee15521dcbe824ae6764aea4a5f87e1fe"}
Apr 16 19:30:40.942253 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:40.942227 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-nh8b7"
Apr 16 19:30:40.942437 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:40.942350 2579 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 16 19:30:40.942570 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:40.942553 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-nh8b7"
Apr 16 19:30:41.938892 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:41.938768 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ls7dc"
Apr 16 19:30:41.939170 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:41.938768 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6glh4"
Apr 16 19:30:41.939170 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:41.938971 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ls7dc" podUID="afd027c2-990e-4d6c-b57c-62c9c66ce5f2"
Apr 16 19:30:41.939170 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:41.939069 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6glh4" podUID="17f1ed1c-fff7-4d09-b029-890217b6c115"
Apr 16 19:30:41.939170 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:41.938786 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7mh9f"
Apr 16 19:30:41.939170 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:41.939144 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7mh9f" podUID="ca23e8db-bb88-449f-8286-27f2978eb0ca"
Apr 16 19:30:42.147613 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:42.147584 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mx7dg" event={"ID":"f6b1a716-a116-40d7-bd7e-8947f3cfea04","Type":"ContainerStarted","Data":"2bd54bd3087f3f3cbd1ead3676fba91bb787e1ec66c82e867bcd47bc080a41ec"}
Apr 16 19:30:42.147909 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:42.147886 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-mx7dg"
Apr 16 19:30:42.148024 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:42.147915 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-mx7dg"
Apr 16 19:30:42.162755 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:42.162687 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mx7dg"
Apr 16 19:30:42.183691 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:42.183639 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-mx7dg" podStartSLOduration=6.459954122 podStartE2EDuration="25.183625468s" podCreationTimestamp="2026-04-16 19:30:17 +0000 UTC" firstStartedPulling="2026-04-16 19:30:18.223746348 +0000 UTC m=+1.800524537" lastFinishedPulling="2026-04-16 19:30:36.947417671 +0000 UTC m=+20.524195883" observedRunningTime="2026-04-16 19:30:42.183375517 +0000 UTC m=+25.760153741" watchObservedRunningTime="2026-04-16 19:30:42.183625468 +0000 UTC m=+25.760403695"
Apr 16 19:30:43.150647 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:43.150613 2579 generic.go:358] "Generic (PLEG): container finished" podID="6c0704bd-2f0f-4e78-8573-cf9346b4ae16" containerID="ab018cc3e9098271a3d4ef81fb75f57de55e0d27c8d0edf1977dfc515eba82e0" exitCode=0
Apr 16 19:30:43.151049 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:43.150682 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-whspd" event={"ID":"6c0704bd-2f0f-4e78-8573-cf9346b4ae16","Type":"ContainerDied","Data":"ab018cc3e9098271a3d4ef81fb75f57de55e0d27c8d0edf1977dfc515eba82e0"}
Apr 16 19:30:43.151049 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:43.150981 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-mx7dg"
Apr 16 19:30:43.165456 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:43.165438 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mx7dg"
Apr 16 19:30:43.938898 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:43.938873 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ls7dc"
Apr 16 19:30:43.939005 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:43.938874 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6glh4"
Apr 16 19:30:43.939005 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:43.938982 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ls7dc" podUID="afd027c2-990e-4d6c-b57c-62c9c66ce5f2"
Apr 16 19:30:43.939121 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:43.938874 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7mh9f"
Apr 16 19:30:43.939121 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:43.939041 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6glh4" podUID="17f1ed1c-fff7-4d09-b029-890217b6c115"
Apr 16 19:30:43.939239 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:43.939117 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7mh9f" podUID="ca23e8db-bb88-449f-8286-27f2978eb0ca"
Apr 16 19:30:43.967327 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:43.967295 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-ls7dc"]
Apr 16 19:30:43.968022 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:43.967996 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-6glh4"]
Apr 16 19:30:43.969931 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:43.969912 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-7mh9f"]
Apr 16 19:30:44.154408 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:44.154335 2579 generic.go:358] "Generic (PLEG): container finished" podID="6c0704bd-2f0f-4e78-8573-cf9346b4ae16" containerID="7aff0a0a60b348af7be254b24d5b7ad85b38082470e59a4a9e0e917a566c128c" exitCode=0
Apr 16 19:30:44.154751 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:44.154416 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-whspd" event={"ID":"6c0704bd-2f0f-4e78-8573-cf9346b4ae16","Type":"ContainerDied","Data":"7aff0a0a60b348af7be254b24d5b7ad85b38082470e59a4a9e0e917a566c128c"}
Apr 16 19:30:44.154751 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:44.154434 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6glh4"
Apr 16 19:30:44.154751 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:44.154440 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7mh9f"
Apr 16 19:30:44.154751 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:44.154527 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ls7dc"
Apr 16 19:30:44.154751 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:44.154536 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6glh4" podUID="17f1ed1c-fff7-4d09-b029-890217b6c115"
Apr 16 19:30:44.154751 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:44.154621 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7mh9f" podUID="ca23e8db-bb88-449f-8286-27f2978eb0ca"
Apr 16 19:30:44.154751 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:44.154659 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ls7dc" podUID="afd027c2-990e-4d6c-b57c-62c9c66ce5f2"
Apr 16 19:30:45.157934 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:45.157848 2579 generic.go:358] "Generic (PLEG): container finished" podID="6c0704bd-2f0f-4e78-8573-cf9346b4ae16" containerID="96b37b9537757044a637758446df301037cff06ba2048bfc472158a583358a9c" exitCode=0
Apr 16 19:30:45.158421 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:45.157928 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-whspd" event={"ID":"6c0704bd-2f0f-4e78-8573-cf9346b4ae16","Type":"ContainerDied","Data":"96b37b9537757044a637758446df301037cff06ba2048bfc472158a583358a9c"}
Apr 16 19:30:45.939690 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:45.939661 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6glh4"
Apr 16 19:30:45.939831 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:45.939666 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ls7dc"
Apr 16 19:30:45.939831 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:45.939779 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6glh4" podUID="17f1ed1c-fff7-4d09-b029-890217b6c115"
Apr 16 19:30:45.939951 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:45.939865 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ls7dc" podUID="afd027c2-990e-4d6c-b57c-62c9c66ce5f2"
Apr 16 19:30:45.939951 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:45.939672 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7mh9f"
Apr 16 19:30:45.940029 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:45.939979 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7mh9f" podUID="ca23e8db-bb88-449f-8286-27f2978eb0ca"
Apr 16 19:30:47.939130 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:47.939100 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ls7dc"
Apr 16 19:30:47.939836 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:47.939100 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7mh9f"
Apr 16 19:30:47.939836 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:47.939248 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-ls7dc" podUID="afd027c2-990e-4d6c-b57c-62c9c66ce5f2"
Apr 16 19:30:47.939836 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:47.939338 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7mh9f" podUID="ca23e8db-bb88-449f-8286-27f2978eb0ca"
Apr 16 19:30:47.939836 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:47.939394 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6glh4"
Apr 16 19:30:47.939836 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:47.939470 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6glh4" podUID="17f1ed1c-fff7-4d09-b029-890217b6c115"
Apr 16 19:30:49.648325 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.648286 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/afd027c2-990e-4d6c-b57c-62c9c66ce5f2-original-pull-secret\") pod \"global-pull-secret-syncer-ls7dc\" (UID: \"afd027c2-990e-4d6c-b57c-62c9c66ce5f2\") " pod="kube-system/global-pull-secret-syncer-ls7dc"
Apr 16 19:30:49.648812 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.648353 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ca23e8db-bb88-449f-8286-27f2978eb0ca-metrics-certs\") pod \"network-metrics-daemon-7mh9f\" (UID: \"ca23e8db-bb88-449f-8286-27f2978eb0ca\") " pod="openshift-multus/network-metrics-daemon-7mh9f"
Apr 16 19:30:49.648812 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:49.648467 2579 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 19:30:49.648812 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:49.648547 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/afd027c2-990e-4d6c-b57c-62c9c66ce5f2-original-pull-secret podName:afd027c2-990e-4d6c-b57c-62c9c66ce5f2 nodeName:}" failed. No retries permitted until 2026-04-16 19:31:21.64852374 +0000 UTC m=+65.225301949 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/afd027c2-990e-4d6c-b57c-62c9c66ce5f2-original-pull-secret") pod "global-pull-secret-syncer-ls7dc" (UID: "afd027c2-990e-4d6c-b57c-62c9c66ce5f2") : object "kube-system"/"original-pull-secret" not registered
Apr 16 19:30:49.648812 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:49.648565 2579 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:30:49.648812 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:49.648632 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ca23e8db-bb88-449f-8286-27f2978eb0ca-metrics-certs podName:ca23e8db-bb88-449f-8286-27f2978eb0ca nodeName:}" failed. No retries permitted until 2026-04-16 19:31:21.648615276 +0000 UTC m=+65.225393479 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ca23e8db-bb88-449f-8286-27f2978eb0ca-metrics-certs") pod "network-metrics-daemon-7mh9f" (UID: "ca23e8db-bb88-449f-8286-27f2978eb0ca") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 19:30:49.749158 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.749117 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x96nz\" (UniqueName: \"kubernetes.io/projected/17f1ed1c-fff7-4d09-b029-890217b6c115-kube-api-access-x96nz\") pod \"network-check-target-6glh4\" (UID: \"17f1ed1c-fff7-4d09-b029-890217b6c115\") " pod="openshift-network-diagnostics/network-check-target-6glh4"
Apr 16 19:30:49.749347 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:49.749326 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 19:30:49.749399 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:49.749355 2579 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 19:30:49.749399 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:49.749371 2579 projected.go:194] Error preparing data for projected volume kube-api-access-x96nz for pod openshift-network-diagnostics/network-check-target-6glh4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 19:30:49.749471 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:49.749436 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/17f1ed1c-fff7-4d09-b029-890217b6c115-kube-api-access-x96nz podName:17f1ed1c-fff7-4d09-b029-890217b6c115 nodeName:}" failed. No retries permitted until 2026-04-16 19:31:21.74941642 +0000 UTC m=+65.326194634 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-x96nz" (UniqueName: "kubernetes.io/projected/17f1ed1c-fff7-4d09-b029-890217b6c115-kube-api-access-x96nz") pod "network-check-target-6glh4" (UID: "17f1ed1c-fff7-4d09-b029-890217b6c115") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 19:30:49.793474 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.793441 2579 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-155.ec2.internal" event="NodeReady"
Apr 16 19:30:49.793640 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.793603 2579 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 16 19:30:49.826158 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.826121 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2dz6q"]
Apr 16 19:30:49.846561 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.846107 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-ph52k"]
Apr 16 19:30:49.847348 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.846398 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2dz6q"
Apr 16 19:30:49.850385 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.850104 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-x5k7q\""
Apr 16 19:30:49.850385 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.850259 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Apr 16 19:30:49.850385 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.850269 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Apr 16 19:30:49.850876 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.850857 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Apr 16 19:30:49.851093 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.851069 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Apr 16 19:30:49.857946 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.857926 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dm5bc"]
Apr 16 19:30:49.857946 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.858150 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-ph52k"
Apr 16 19:30:49.861127 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.860949 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\""
Apr 16 19:30:49.861127 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.860952 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\""
Apr 16 19:30:49.861127 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.860967 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-gsjlj\""
Apr 16 19:30:49.861358 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.861182 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\""
Apr 16 19:30:49.861522 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.861460 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\""
Apr 16 19:30:49.876683 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.876654 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-6p4l7"]
Apr 16 19:30:49.876783 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.876703 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dm5bc"
Apr 16 19:30:49.879525 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.879503 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\""
Apr 16 19:30:49.879637 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.879574 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\""
Apr 16 19:30:49.879696 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.879632 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-tlj4w\""
Apr 16 19:30:49.888795 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.888774 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-7dcf7c4679-t62l6"]
Apr 16 19:30:49.888957 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.888937 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-6p4l7"
Apr 16 19:30:49.891579 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.891557 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
Apr 16 19:30:49.891685 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.891583 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\""
Apr 16 19:30:49.891685 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.891559 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
Apr 16 19:30:49.891685 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.891643 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-hkfmh\""
Apr 16 19:30:49.906649 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.906632 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-ckt7c"]
Apr 16 19:30:49.906788 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.906769 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7dcf7c4679-t62l6"
Apr 16 19:30:49.910419 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.910395 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-25xm9\""
Apr 16 19:30:49.910507 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.910432 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 16 19:30:49.910507 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.910473 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 16 19:30:49.910881 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.910863 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 16 19:30:49.915918 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.915899 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 16 19:30:49.924842 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.924819 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2dz6q"]
Apr 16 19:30:49.925186 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.924847 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dm5bc"]
Apr 16 19:30:49.925186 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.924859 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-ph52k"]
Apr 16 19:30:49.925186 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.924872 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-ckt7c"]
Apr 16 19:30:49.925186 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.924885 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-x5hqp"]
Apr 16 19:30:49.925186 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.924977 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-ckt7c"
Apr 16 19:30:49.930055 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.930035 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 16 19:30:49.930369 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.930352 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-882jq\""
Apr 16 19:30:49.930526 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.930369 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 16 19:30:49.930744 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.930724 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 16 19:30:49.931559 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.931394 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 16 19:30:49.937658 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.937640 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Apr 16 19:30:49.938586 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.938566 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-96nm2"]
Apr 16 19:30:49.938744 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.938723 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-x5hqp"
Apr 16 19:30:49.943545 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.943523 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-fkm8b\""
Apr 16 19:30:49.943834 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.943816 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 16 19:30:49.943922 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.943820 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 16 19:30:49.950708 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.950688 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcb2c\" (UniqueName: \"kubernetes.io/projected/fba37944-a4ef-4148-aae0-6082d55a03b2-kube-api-access-hcb2c\") pod \"service-ca-operator-d6fc45fc5-2dz6q\" (UID: \"fba37944-a4ef-4148-aae0-6082d55a03b2\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2dz6q"
Apr 16 19:30:49.950806 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.950729 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fba37944-a4ef-4148-aae0-6082d55a03b2-config\") pod \"service-ca-operator-d6fc45fc5-2dz6q\" (UID: \"fba37944-a4ef-4148-aae0-6082d55a03b2\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2dz6q"
Apr 16 19:30:49.950806 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.950765 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33c6084e-53d2-4e83-85bf-1e53dca8d967-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-ph52k\" (UID: \"33c6084e-53d2-4e83-85bf-1e53dca8d967\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-ph52k"
Apr 16 19:30:49.950806 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.950798 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fba37944-a4ef-4148-aae0-6082d55a03b2-serving-cert\") pod \"service-ca-operator-d6fc45fc5-2dz6q\" (UID: \"fba37944-a4ef-4148-aae0-6082d55a03b2\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2dz6q"
Apr 16 19:30:49.950956 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.950825 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsv4h\" (UniqueName: \"kubernetes.io/projected/33c6084e-53d2-4e83-85bf-1e53dca8d967-kube-api-access-fsv4h\") pod \"kube-storage-version-migrator-operator-6769c5d45-ph52k\" (UID: \"33c6084e-53d2-4e83-85bf-1e53dca8d967\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-ph52k"
Apr 16 19:30:49.950956 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.950864 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33c6084e-53d2-4e83-85bf-1e53dca8d967-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-ph52k\" (UID: \"33c6084e-53d2-4e83-85bf-1e53dca8d967\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-ph52k"
Apr 16 19:30:49.953356 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.953335 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-z9djn"]
Apr 16 19:30:49.953496 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.953477 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7mh9f"
Apr 16 19:30:49.953585 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.953501 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6glh4"
Apr 16 19:30:49.953585 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.953501 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-96nm2"
Apr 16 19:30:49.953916 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.953744 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ls7dc"
Apr 16 19:30:49.956370 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.956229 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 19:30:49.956370 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.956273 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\""
Apr 16 19:30:49.956370 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.956298 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-p422q\""
Apr 16 19:30:49.956665 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.956648 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 16 19:30:49.956782 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.956763 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-b4j4x\""
Apr 16 19:30:49.956862 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.956846 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\""
Apr 16 19:30:49.956931 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.956921 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 19:30:49.956982 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.956929 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-52hfs\""
Apr 16 19:30:49.957033 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.956981 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 19:30:49.970864 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.970841 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-4dv2w"]
Apr 16 19:30:49.971087 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.971060 2579 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-z9djn" Apr 16 19:30:49.973628 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.973612 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-wp8sl\"" Apr 16 19:30:49.988183 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.988163 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-x5hqp"] Apr 16 19:30:49.988300 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.988190 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-96nm2"] Apr 16 19:30:49.988300 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.988220 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-6p4l7"] Apr 16 19:30:49.988300 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.988233 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-z9djn"] Apr 16 19:30:49.988300 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.988245 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-4dv2w"] Apr 16 19:30:49.988300 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.988255 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7dcf7c4679-t62l6"] Apr 16 19:30:49.988555 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.988317 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-4dv2w" Apr 16 19:30:49.990953 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.990930 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-fqt5t\"" Apr 16 19:30:49.991036 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.990983 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 19:30:49.991036 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.990997 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 19:30:49.991160 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:49.991098 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 19:30:50.051916 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.051844 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33c6084e-53d2-4e83-85bf-1e53dca8d967-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-ph52k\" (UID: \"33c6084e-53d2-4e83-85bf-1e53dca8d967\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-ph52k" Apr 16 19:30:50.051916 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.051889 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zm69w\" (UniqueName: \"kubernetes.io/projected/e038a6a1-56fa-476b-8faf-dc54fd9afdfa-kube-api-access-zm69w\") pod \"dns-default-x5hqp\" (UID: \"e038a6a1-56fa-476b-8faf-dc54fd9afdfa\") " pod="openshift-dns/dns-default-x5hqp" Apr 16 19:30:50.052126 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.051919 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e038a6a1-56fa-476b-8faf-dc54fd9afdfa-tmp-dir\") pod \"dns-default-x5hqp\" (UID: \"e038a6a1-56fa-476b-8faf-dc54fd9afdfa\") " pod="openshift-dns/dns-default-x5hqp" Apr 16 19:30:50.052126 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.051946 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d5d951a1-3e60-4517-ae8b-75bba19200c9-serving-cert\") pod \"console-operator-9d4b6777b-ckt7c\" (UID: \"d5d951a1-3e60-4517-ae8b-75bba19200c9\") " pod="openshift-console-operator/console-operator-9d4b6777b-ckt7c" Apr 16 19:30:50.052126 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.051969 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c51aa13f-e270-4caa-b4b8-0133dfaf5f84-registry-tls\") pod \"image-registry-7dcf7c4679-t62l6\" (UID: \"c51aa13f-e270-4caa-b4b8-0133dfaf5f84\") " pod="openshift-image-registry/image-registry-7dcf7c4679-t62l6" Apr 16 19:30:50.052126 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.052000 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fba37944-a4ef-4148-aae0-6082d55a03b2-serving-cert\") pod \"service-ca-operator-d6fc45fc5-2dz6q\" (UID: \"fba37944-a4ef-4148-aae0-6082d55a03b2\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2dz6q" Apr 16 19:30:50.052126 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.052027 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fsv4h\" (UniqueName: \"kubernetes.io/projected/33c6084e-53d2-4e83-85bf-1e53dca8d967-kube-api-access-fsv4h\") pod \"kube-storage-version-migrator-operator-6769c5d45-ph52k\" (UID: 
\"33c6084e-53d2-4e83-85bf-1e53dca8d967\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-ph52k" Apr 16 19:30:50.052422 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.052249 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33c6084e-53d2-4e83-85bf-1e53dca8d967-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-ph52k\" (UID: \"33c6084e-53d2-4e83-85bf-1e53dca8d967\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-ph52k" Apr 16 19:30:50.052422 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.052286 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/599e545f-2894-45d3-ad19-ef2025af0502-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-6p4l7\" (UID: \"599e545f-2894-45d3-ad19-ef2025af0502\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-6p4l7" Apr 16 19:30:50.052422 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.052340 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c51aa13f-e270-4caa-b4b8-0133dfaf5f84-registry-certificates\") pod \"image-registry-7dcf7c4679-t62l6\" (UID: \"c51aa13f-e270-4caa-b4b8-0133dfaf5f84\") " pod="openshift-image-registry/image-registry-7dcf7c4679-t62l6" Apr 16 19:30:50.052422 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.052368 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/91f7dc43-a855-4712-8639-caad7b7a8458-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-96nm2\" (UID: 
\"91f7dc43-a855-4712-8639-caad7b7a8458\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-96nm2" Apr 16 19:30:50.052422 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.052396 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e038a6a1-56fa-476b-8faf-dc54fd9afdfa-config-volume\") pod \"dns-default-x5hqp\" (UID: \"e038a6a1-56fa-476b-8faf-dc54fd9afdfa\") " pod="openshift-dns/dns-default-x5hqp" Apr 16 19:30:50.052650 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.052438 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrq4g\" (UniqueName: \"kubernetes.io/projected/d5d951a1-3e60-4517-ae8b-75bba19200c9-kube-api-access-mrq4g\") pod \"console-operator-9d4b6777b-ckt7c\" (UID: \"d5d951a1-3e60-4517-ae8b-75bba19200c9\") " pod="openshift-console-operator/console-operator-9d4b6777b-ckt7c" Apr 16 19:30:50.052650 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.052494 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c51aa13f-e270-4caa-b4b8-0133dfaf5f84-bound-sa-token\") pod \"image-registry-7dcf7c4679-t62l6\" (UID: \"c51aa13f-e270-4caa-b4b8-0133dfaf5f84\") " pod="openshift-image-registry/image-registry-7dcf7c4679-t62l6" Apr 16 19:30:50.052650 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.052527 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fba37944-a4ef-4148-aae0-6082d55a03b2-config\") pod \"service-ca-operator-d6fc45fc5-2dz6q\" (UID: \"fba37944-a4ef-4148-aae0-6082d55a03b2\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2dz6q" Apr 16 19:30:50.052650 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.052554 2579 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e038a6a1-56fa-476b-8faf-dc54fd9afdfa-metrics-tls\") pod \"dns-default-x5hqp\" (UID: \"e038a6a1-56fa-476b-8faf-dc54fd9afdfa\") " pod="openshift-dns/dns-default-x5hqp" Apr 16 19:30:50.052650 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.052557 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33c6084e-53d2-4e83-85bf-1e53dca8d967-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-ph52k\" (UID: \"33c6084e-53d2-4e83-85bf-1e53dca8d967\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-ph52k" Apr 16 19:30:50.052650 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.052602 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d5d951a1-3e60-4517-ae8b-75bba19200c9-trusted-ca\") pod \"console-operator-9d4b6777b-ckt7c\" (UID: \"d5d951a1-3e60-4517-ae8b-75bba19200c9\") " pod="openshift-console-operator/console-operator-9d4b6777b-ckt7c" Apr 16 19:30:50.052650 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.052633 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c51aa13f-e270-4caa-b4b8-0133dfaf5f84-trusted-ca\") pod \"image-registry-7dcf7c4679-t62l6\" (UID: \"c51aa13f-e270-4caa-b4b8-0133dfaf5f84\") " pod="openshift-image-registry/image-registry-7dcf7c4679-t62l6" Apr 16 19:30:50.052967 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.052683 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9q4z8\" (UniqueName: \"kubernetes.io/projected/c51aa13f-e270-4caa-b4b8-0133dfaf5f84-kube-api-access-9q4z8\") pod \"image-registry-7dcf7c4679-t62l6\" 
(UID: \"c51aa13f-e270-4caa-b4b8-0133dfaf5f84\") " pod="openshift-image-registry/image-registry-7dcf7c4679-t62l6" Apr 16 19:30:50.052967 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.052731 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/91f7dc43-a855-4712-8639-caad7b7a8458-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-96nm2\" (UID: \"91f7dc43-a855-4712-8639-caad7b7a8458\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-96nm2" Apr 16 19:30:50.052967 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.052766 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c51aa13f-e270-4caa-b4b8-0133dfaf5f84-installation-pull-secrets\") pod \"image-registry-7dcf7c4679-t62l6\" (UID: \"c51aa13f-e270-4caa-b4b8-0133dfaf5f84\") " pod="openshift-image-registry/image-registry-7dcf7c4679-t62l6" Apr 16 19:30:50.052967 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.052863 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdfz7\" (UniqueName: \"kubernetes.io/projected/599e545f-2894-45d3-ad19-ef2025af0502-kube-api-access-rdfz7\") pod \"cluster-samples-operator-6dc5bdb6b4-6p4l7\" (UID: \"599e545f-2894-45d3-ad19-ef2025af0502\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-6p4l7" Apr 16 19:30:50.052967 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.052918 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5d951a1-3e60-4517-ae8b-75bba19200c9-config\") pod \"console-operator-9d4b6777b-ckt7c\" (UID: \"d5d951a1-3e60-4517-ae8b-75bba19200c9\") " pod="openshift-console-operator/console-operator-9d4b6777b-ckt7c" Apr 16 19:30:50.052967 
ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.052948 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kzjds\" (UniqueName: \"kubernetes.io/projected/eb8666d2-cbd8-4ed1-8781-eb4176da58ab-kube-api-access-kzjds\") pod \"volume-data-source-validator-7c6cbb6c87-dm5bc\" (UID: \"eb8666d2-cbd8-4ed1-8781-eb4176da58ab\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dm5bc" Apr 16 19:30:50.053267 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.052977 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c51aa13f-e270-4caa-b4b8-0133dfaf5f84-image-registry-private-configuration\") pod \"image-registry-7dcf7c4679-t62l6\" (UID: \"c51aa13f-e270-4caa-b4b8-0133dfaf5f84\") " pod="openshift-image-registry/image-registry-7dcf7c4679-t62l6" Apr 16 19:30:50.053267 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.053048 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hcb2c\" (UniqueName: \"kubernetes.io/projected/fba37944-a4ef-4148-aae0-6082d55a03b2-kube-api-access-hcb2c\") pod \"service-ca-operator-d6fc45fc5-2dz6q\" (UID: \"fba37944-a4ef-4148-aae0-6082d55a03b2\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2dz6q" Apr 16 19:30:50.053267 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.053091 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c51aa13f-e270-4caa-b4b8-0133dfaf5f84-ca-trust-extracted\") pod \"image-registry-7dcf7c4679-t62l6\" (UID: \"c51aa13f-e270-4caa-b4b8-0133dfaf5f84\") " pod="openshift-image-registry/image-registry-7dcf7c4679-t62l6" Apr 16 19:30:50.056080 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.056053 2579 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33c6084e-53d2-4e83-85bf-1e53dca8d967-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-ph52k\" (UID: \"33c6084e-53d2-4e83-85bf-1e53dca8d967\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-ph52k" Apr 16 19:30:50.056194 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.056152 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fba37944-a4ef-4148-aae0-6082d55a03b2-serving-cert\") pod \"service-ca-operator-d6fc45fc5-2dz6q\" (UID: \"fba37944-a4ef-4148-aae0-6082d55a03b2\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2dz6q" Apr 16 19:30:50.060739 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.060715 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsv4h\" (UniqueName: \"kubernetes.io/projected/33c6084e-53d2-4e83-85bf-1e53dca8d967-kube-api-access-fsv4h\") pod \"kube-storage-version-migrator-operator-6769c5d45-ph52k\" (UID: \"33c6084e-53d2-4e83-85bf-1e53dca8d967\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-ph52k" Apr 16 19:30:50.061022 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.060993 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcb2c\" (UniqueName: \"kubernetes.io/projected/fba37944-a4ef-4148-aae0-6082d55a03b2-kube-api-access-hcb2c\") pod \"service-ca-operator-d6fc45fc5-2dz6q\" (UID: \"fba37944-a4ef-4148-aae0-6082d55a03b2\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2dz6q" Apr 16 19:30:50.065440 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.065416 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/fba37944-a4ef-4148-aae0-6082d55a03b2-config\") pod \"service-ca-operator-d6fc45fc5-2dz6q\" (UID: \"fba37944-a4ef-4148-aae0-6082d55a03b2\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2dz6q" Apr 16 19:30:50.154168 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.154130 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dccwg\" (UniqueName: \"kubernetes.io/projected/4063c915-74be-464e-845c-caccf6e297c5-kube-api-access-dccwg\") pod \"ingress-canary-4dv2w\" (UID: \"4063c915-74be-464e-845c-caccf6e297c5\") " pod="openshift-ingress-canary/ingress-canary-4dv2w" Apr 16 19:30:50.154168 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.154167 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c51aa13f-e270-4caa-b4b8-0133dfaf5f84-ca-trust-extracted\") pod \"image-registry-7dcf7c4679-t62l6\" (UID: \"c51aa13f-e270-4caa-b4b8-0133dfaf5f84\") " pod="openshift-image-registry/image-registry-7dcf7c4679-t62l6" Apr 16 19:30:50.154361 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.154185 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zm69w\" (UniqueName: \"kubernetes.io/projected/e038a6a1-56fa-476b-8faf-dc54fd9afdfa-kube-api-access-zm69w\") pod \"dns-default-x5hqp\" (UID: \"e038a6a1-56fa-476b-8faf-dc54fd9afdfa\") " pod="openshift-dns/dns-default-x5hqp" Apr 16 19:30:50.154361 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.154254 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e038a6a1-56fa-476b-8faf-dc54fd9afdfa-tmp-dir\") pod \"dns-default-x5hqp\" (UID: \"e038a6a1-56fa-476b-8faf-dc54fd9afdfa\") " pod="openshift-dns/dns-default-x5hqp" Apr 16 19:30:50.154361 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.154298 2579 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d5d951a1-3e60-4517-ae8b-75bba19200c9-serving-cert\") pod \"console-operator-9d4b6777b-ckt7c\" (UID: \"d5d951a1-3e60-4517-ae8b-75bba19200c9\") " pod="openshift-console-operator/console-operator-9d4b6777b-ckt7c" Apr 16 19:30:50.154361 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.154323 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c51aa13f-e270-4caa-b4b8-0133dfaf5f84-registry-tls\") pod \"image-registry-7dcf7c4679-t62l6\" (UID: \"c51aa13f-e270-4caa-b4b8-0133dfaf5f84\") " pod="openshift-image-registry/image-registry-7dcf7c4679-t62l6" Apr 16 19:30:50.154565 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.154361 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/599e545f-2894-45d3-ad19-ef2025af0502-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-6p4l7\" (UID: \"599e545f-2894-45d3-ad19-ef2025af0502\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-6p4l7" Apr 16 19:30:50.154565 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.154410 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c51aa13f-e270-4caa-b4b8-0133dfaf5f84-registry-certificates\") pod \"image-registry-7dcf7c4679-t62l6\" (UID: \"c51aa13f-e270-4caa-b4b8-0133dfaf5f84\") " pod="openshift-image-registry/image-registry-7dcf7c4679-t62l6" Apr 16 19:30:50.154565 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:50.154509 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 19:30:50.154565 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.154519 2579 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c51aa13f-e270-4caa-b4b8-0133dfaf5f84-ca-trust-extracted\") pod \"image-registry-7dcf7c4679-t62l6\" (UID: \"c51aa13f-e270-4caa-b4b8-0133dfaf5f84\") " pod="openshift-image-registry/image-registry-7dcf7c4679-t62l6" Apr 16 19:30:50.154565 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:50.154528 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7dcf7c4679-t62l6: secret "image-registry-tls" not found Apr 16 19:30:50.154835 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:50.154610 2579 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 19:30:50.154835 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:50.154621 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c51aa13f-e270-4caa-b4b8-0133dfaf5f84-registry-tls podName:c51aa13f-e270-4caa-b4b8-0133dfaf5f84 nodeName:}" failed. No retries permitted until 2026-04-16 19:30:50.654567761 +0000 UTC m=+34.231345970 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c51aa13f-e270-4caa-b4b8-0133dfaf5f84-registry-tls") pod "image-registry-7dcf7c4679-t62l6" (UID: "c51aa13f-e270-4caa-b4b8-0133dfaf5f84") : secret "image-registry-tls" not found Apr 16 19:30:50.154835 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.154621 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e038a6a1-56fa-476b-8faf-dc54fd9afdfa-tmp-dir\") pod \"dns-default-x5hqp\" (UID: \"e038a6a1-56fa-476b-8faf-dc54fd9afdfa\") " pod="openshift-dns/dns-default-x5hqp" Apr 16 19:30:50.154835 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.154646 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/91f7dc43-a855-4712-8639-caad7b7a8458-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-96nm2\" (UID: \"91f7dc43-a855-4712-8639-caad7b7a8458\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-96nm2" Apr 16 19:30:50.154835 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:50.154667 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/599e545f-2894-45d3-ad19-ef2025af0502-samples-operator-tls podName:599e545f-2894-45d3-ad19-ef2025af0502 nodeName:}" failed. No retries permitted until 2026-04-16 19:30:50.654652586 +0000 UTC m=+34.231430788 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/599e545f-2894-45d3-ad19-ef2025af0502-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-6p4l7" (UID: "599e545f-2894-45d3-ad19-ef2025af0502") : secret "samples-operator-tls" not found Apr 16 19:30:50.154835 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:50.154708 2579 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 19:30:50.154835 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.154703 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e038a6a1-56fa-476b-8faf-dc54fd9afdfa-config-volume\") pod \"dns-default-x5hqp\" (UID: \"e038a6a1-56fa-476b-8faf-dc54fd9afdfa\") " pod="openshift-dns/dns-default-x5hqp" Apr 16 19:30:50.154835 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:50.154750 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/91f7dc43-a855-4712-8639-caad7b7a8458-networking-console-plugin-cert podName:91f7dc43-a855-4712-8639-caad7b7a8458 nodeName:}" failed. No retries permitted until 2026-04-16 19:30:50.65473273 +0000 UTC m=+34.231510924 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/91f7dc43-a855-4712-8639-caad7b7a8458-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-96nm2" (UID: "91f7dc43-a855-4712-8639-caad7b7a8458") : secret "networking-console-plugin-cert" not found
Apr 16 19:30:50.154835 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.154790 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mrq4g\" (UniqueName: \"kubernetes.io/projected/d5d951a1-3e60-4517-ae8b-75bba19200c9-kube-api-access-mrq4g\") pod \"console-operator-9d4b6777b-ckt7c\" (UID: \"d5d951a1-3e60-4517-ae8b-75bba19200c9\") " pod="openshift-console-operator/console-operator-9d4b6777b-ckt7c"
Apr 16 19:30:50.154835 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.154811 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4063c915-74be-464e-845c-caccf6e297c5-cert\") pod \"ingress-canary-4dv2w\" (UID: \"4063c915-74be-464e-845c-caccf6e297c5\") " pod="openshift-ingress-canary/ingress-canary-4dv2w"
Apr 16 19:30:50.154835 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.154833 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c51aa13f-e270-4caa-b4b8-0133dfaf5f84-bound-sa-token\") pod \"image-registry-7dcf7c4679-t62l6\" (UID: \"c51aa13f-e270-4caa-b4b8-0133dfaf5f84\") " pod="openshift-image-registry/image-registry-7dcf7c4679-t62l6"
Apr 16 19:30:50.155445 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.154861 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e038a6a1-56fa-476b-8faf-dc54fd9afdfa-metrics-tls\") pod \"dns-default-x5hqp\" (UID: \"e038a6a1-56fa-476b-8faf-dc54fd9afdfa\") " pod="openshift-dns/dns-default-x5hqp"
Apr 16 19:30:50.155445 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.154880 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d5d951a1-3e60-4517-ae8b-75bba19200c9-trusted-ca\") pod \"console-operator-9d4b6777b-ckt7c\" (UID: \"d5d951a1-3e60-4517-ae8b-75bba19200c9\") " pod="openshift-console-operator/console-operator-9d4b6777b-ckt7c"
Apr 16 19:30:50.155445 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.154896 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c51aa13f-e270-4caa-b4b8-0133dfaf5f84-trusted-ca\") pod \"image-registry-7dcf7c4679-t62l6\" (UID: \"c51aa13f-e270-4caa-b4b8-0133dfaf5f84\") " pod="openshift-image-registry/image-registry-7dcf7c4679-t62l6"
Apr 16 19:30:50.155445 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.154918 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9q4z8\" (UniqueName: \"kubernetes.io/projected/c51aa13f-e270-4caa-b4b8-0133dfaf5f84-kube-api-access-9q4z8\") pod \"image-registry-7dcf7c4679-t62l6\" (UID: \"c51aa13f-e270-4caa-b4b8-0133dfaf5f84\") " pod="openshift-image-registry/image-registry-7dcf7c4679-t62l6"
Apr 16 19:30:50.155445 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.154943 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/91f7dc43-a855-4712-8639-caad7b7a8458-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-96nm2\" (UID: \"91f7dc43-a855-4712-8639-caad7b7a8458\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-96nm2"
Apr 16 19:30:50.155445 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.154963 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c51aa13f-e270-4caa-b4b8-0133dfaf5f84-registry-certificates\") pod \"image-registry-7dcf7c4679-t62l6\" (UID: \"c51aa13f-e270-4caa-b4b8-0133dfaf5f84\") " pod="openshift-image-registry/image-registry-7dcf7c4679-t62l6"
Apr 16 19:30:50.155445 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:50.155017 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 19:30:50.155445 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:50.155083 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e038a6a1-56fa-476b-8faf-dc54fd9afdfa-metrics-tls podName:e038a6a1-56fa-476b-8faf-dc54fd9afdfa nodeName:}" failed. No retries permitted until 2026-04-16 19:30:50.655066848 +0000 UTC m=+34.231845047 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e038a6a1-56fa-476b-8faf-dc54fd9afdfa-metrics-tls") pod "dns-default-x5hqp" (UID: "e038a6a1-56fa-476b-8faf-dc54fd9afdfa") : secret "dns-default-metrics-tls" not found
Apr 16 19:30:50.155445 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.155106 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c51aa13f-e270-4caa-b4b8-0133dfaf5f84-installation-pull-secrets\") pod \"image-registry-7dcf7c4679-t62l6\" (UID: \"c51aa13f-e270-4caa-b4b8-0133dfaf5f84\") " pod="openshift-image-registry/image-registry-7dcf7c4679-t62l6"
Apr 16 19:30:50.155445 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.155142 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbgt6\" (UniqueName: \"kubernetes.io/projected/81c37a9f-fd72-4cbf-af17-3b94a3a81d69-kube-api-access-xbgt6\") pod \"network-check-source-8894fc9bd-z9djn\" (UID: \"81c37a9f-fd72-4cbf-af17-3b94a3a81d69\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-z9djn"
Apr 16 19:30:50.155445 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.155254 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rdfz7\" (UniqueName: \"kubernetes.io/projected/599e545f-2894-45d3-ad19-ef2025af0502-kube-api-access-rdfz7\") pod \"cluster-samples-operator-6dc5bdb6b4-6p4l7\" (UID: \"599e545f-2894-45d3-ad19-ef2025af0502\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-6p4l7"
Apr 16 19:30:50.155445 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.155261 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e038a6a1-56fa-476b-8faf-dc54fd9afdfa-config-volume\") pod \"dns-default-x5hqp\" (UID: \"e038a6a1-56fa-476b-8faf-dc54fd9afdfa\") " pod="openshift-dns/dns-default-x5hqp"
Apr 16 19:30:50.155445 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.155285 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5d951a1-3e60-4517-ae8b-75bba19200c9-config\") pod \"console-operator-9d4b6777b-ckt7c\" (UID: \"d5d951a1-3e60-4517-ae8b-75bba19200c9\") " pod="openshift-console-operator/console-operator-9d4b6777b-ckt7c"
Apr 16 19:30:50.155445 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.155314 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kzjds\" (UniqueName: \"kubernetes.io/projected/eb8666d2-cbd8-4ed1-8781-eb4176da58ab-kube-api-access-kzjds\") pod \"volume-data-source-validator-7c6cbb6c87-dm5bc\" (UID: \"eb8666d2-cbd8-4ed1-8781-eb4176da58ab\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dm5bc"
Apr 16 19:30:50.155445 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.155342 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c51aa13f-e270-4caa-b4b8-0133dfaf5f84-image-registry-private-configuration\") pod \"image-registry-7dcf7c4679-t62l6\" (UID: \"c51aa13f-e270-4caa-b4b8-0133dfaf5f84\") " pod="openshift-image-registry/image-registry-7dcf7c4679-t62l6"
Apr 16 19:30:50.156327 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.155803 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/91f7dc43-a855-4712-8639-caad7b7a8458-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-96nm2\" (UID: \"91f7dc43-a855-4712-8639-caad7b7a8458\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-96nm2"
Apr 16 19:30:50.156327 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.155829 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d5d951a1-3e60-4517-ae8b-75bba19200c9-trusted-ca\") pod \"console-operator-9d4b6777b-ckt7c\" (UID: \"d5d951a1-3e60-4517-ae8b-75bba19200c9\") " pod="openshift-console-operator/console-operator-9d4b6777b-ckt7c"
Apr 16 19:30:50.156327 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.155902 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c51aa13f-e270-4caa-b4b8-0133dfaf5f84-trusted-ca\") pod \"image-registry-7dcf7c4679-t62l6\" (UID: \"c51aa13f-e270-4caa-b4b8-0133dfaf5f84\") " pod="openshift-image-registry/image-registry-7dcf7c4679-t62l6"
Apr 16 19:30:50.156327 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.155983 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d5d951a1-3e60-4517-ae8b-75bba19200c9-config\") pod \"console-operator-9d4b6777b-ckt7c\" (UID: \"d5d951a1-3e60-4517-ae8b-75bba19200c9\") " pod="openshift-console-operator/console-operator-9d4b6777b-ckt7c"
Apr 16 19:30:50.157003 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.156966 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d5d951a1-3e60-4517-ae8b-75bba19200c9-serving-cert\") pod \"console-operator-9d4b6777b-ckt7c\" (UID: \"d5d951a1-3e60-4517-ae8b-75bba19200c9\") " pod="openshift-console-operator/console-operator-9d4b6777b-ckt7c"
Apr 16 19:30:50.158049 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.158026 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c51aa13f-e270-4caa-b4b8-0133dfaf5f84-image-registry-private-configuration\") pod \"image-registry-7dcf7c4679-t62l6\" (UID: \"c51aa13f-e270-4caa-b4b8-0133dfaf5f84\") " pod="openshift-image-registry/image-registry-7dcf7c4679-t62l6"
Apr 16 19:30:50.158049 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.158043 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c51aa13f-e270-4caa-b4b8-0133dfaf5f84-installation-pull-secrets\") pod \"image-registry-7dcf7c4679-t62l6\" (UID: \"c51aa13f-e270-4caa-b4b8-0133dfaf5f84\") " pod="openshift-image-registry/image-registry-7dcf7c4679-t62l6"
Apr 16 19:30:50.161901 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.161089 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2dz6q"
Apr 16 19:30:50.163500 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.163302 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrq4g\" (UniqueName: \"kubernetes.io/projected/d5d951a1-3e60-4517-ae8b-75bba19200c9-kube-api-access-mrq4g\") pod \"console-operator-9d4b6777b-ckt7c\" (UID: \"d5d951a1-3e60-4517-ae8b-75bba19200c9\") " pod="openshift-console-operator/console-operator-9d4b6777b-ckt7c"
Apr 16 19:30:50.163581 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.163501 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zm69w\" (UniqueName: \"kubernetes.io/projected/e038a6a1-56fa-476b-8faf-dc54fd9afdfa-kube-api-access-zm69w\") pod \"dns-default-x5hqp\" (UID: \"e038a6a1-56fa-476b-8faf-dc54fd9afdfa\") " pod="openshift-dns/dns-default-x5hqp"
Apr 16 19:30:50.164320 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.164297 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kzjds\" (UniqueName: \"kubernetes.io/projected/eb8666d2-cbd8-4ed1-8781-eb4176da58ab-kube-api-access-kzjds\") pod \"volume-data-source-validator-7c6cbb6c87-dm5bc\" (UID: \"eb8666d2-cbd8-4ed1-8781-eb4176da58ab\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dm5bc"
Apr 16 19:30:50.165119 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.165096 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdfz7\" (UniqueName: \"kubernetes.io/projected/599e545f-2894-45d3-ad19-ef2025af0502-kube-api-access-rdfz7\") pod \"cluster-samples-operator-6dc5bdb6b4-6p4l7\" (UID: \"599e545f-2894-45d3-ad19-ef2025af0502\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-6p4l7"
Apr 16 19:30:50.165435 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.165412 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9q4z8\" (UniqueName: \"kubernetes.io/projected/c51aa13f-e270-4caa-b4b8-0133dfaf5f84-kube-api-access-9q4z8\") pod \"image-registry-7dcf7c4679-t62l6\" (UID: \"c51aa13f-e270-4caa-b4b8-0133dfaf5f84\") " pod="openshift-image-registry/image-registry-7dcf7c4679-t62l6"
Apr 16 19:30:50.165628 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.165576 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c51aa13f-e270-4caa-b4b8-0133dfaf5f84-bound-sa-token\") pod \"image-registry-7dcf7c4679-t62l6\" (UID: \"c51aa13f-e270-4caa-b4b8-0133dfaf5f84\") " pod="openshift-image-registry/image-registry-7dcf7c4679-t62l6"
Apr 16 19:30:50.171883 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.171606 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-ph52k"
Apr 16 19:30:50.189016 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.188994 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dm5bc"
Apr 16 19:30:50.234096 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.234056 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-ckt7c"
Apr 16 19:30:50.255987 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.255962 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4063c915-74be-464e-845c-caccf6e297c5-cert\") pod \"ingress-canary-4dv2w\" (UID: \"4063c915-74be-464e-845c-caccf6e297c5\") " pod="openshift-ingress-canary/ingress-canary-4dv2w"
Apr 16 19:30:50.256101 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.256025 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xbgt6\" (UniqueName: \"kubernetes.io/projected/81c37a9f-fd72-4cbf-af17-3b94a3a81d69-kube-api-access-xbgt6\") pod \"network-check-source-8894fc9bd-z9djn\" (UID: \"81c37a9f-fd72-4cbf-af17-3b94a3a81d69\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-z9djn"
Apr 16 19:30:50.256101 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.256087 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dccwg\" (UniqueName: \"kubernetes.io/projected/4063c915-74be-464e-845c-caccf6e297c5-kube-api-access-dccwg\") pod \"ingress-canary-4dv2w\" (UID: \"4063c915-74be-464e-845c-caccf6e297c5\") " pod="openshift-ingress-canary/ingress-canary-4dv2w"
Apr 16 19:30:50.256331 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:50.256307 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 19:30:50.256431 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:50.256371 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4063c915-74be-464e-845c-caccf6e297c5-cert podName:4063c915-74be-464e-845c-caccf6e297c5 nodeName:}" failed. No retries permitted until 2026-04-16 19:30:50.75635148 +0000 UTC m=+34.333129684 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4063c915-74be-464e-845c-caccf6e297c5-cert") pod "ingress-canary-4dv2w" (UID: "4063c915-74be-464e-845c-caccf6e297c5") : secret "canary-serving-cert" not found
Apr 16 19:30:50.264791 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.264766 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dccwg\" (UniqueName: \"kubernetes.io/projected/4063c915-74be-464e-845c-caccf6e297c5-kube-api-access-dccwg\") pod \"ingress-canary-4dv2w\" (UID: \"4063c915-74be-464e-845c-caccf6e297c5\") " pod="openshift-ingress-canary/ingress-canary-4dv2w"
Apr 16 19:30:50.264791 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.264779 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbgt6\" (UniqueName: \"kubernetes.io/projected/81c37a9f-fd72-4cbf-af17-3b94a3a81d69-kube-api-access-xbgt6\") pod \"network-check-source-8894fc9bd-z9djn\" (UID: \"81c37a9f-fd72-4cbf-af17-3b94a3a81d69\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-z9djn"
Apr 16 19:30:50.295085 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.295063 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-z9djn"
Apr 16 19:30:50.660144 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.660108 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c51aa13f-e270-4caa-b4b8-0133dfaf5f84-registry-tls\") pod \"image-registry-7dcf7c4679-t62l6\" (UID: \"c51aa13f-e270-4caa-b4b8-0133dfaf5f84\") " pod="openshift-image-registry/image-registry-7dcf7c4679-t62l6"
Apr 16 19:30:50.660144 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.660149 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/599e545f-2894-45d3-ad19-ef2025af0502-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-6p4l7\" (UID: \"599e545f-2894-45d3-ad19-ef2025af0502\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-6p4l7"
Apr 16 19:30:50.660689 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.660262 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/91f7dc43-a855-4712-8639-caad7b7a8458-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-96nm2\" (UID: \"91f7dc43-a855-4712-8639-caad7b7a8458\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-96nm2"
Apr 16 19:30:50.660689 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.660342 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e038a6a1-56fa-476b-8faf-dc54fd9afdfa-metrics-tls\") pod \"dns-default-x5hqp\" (UID: \"e038a6a1-56fa-476b-8faf-dc54fd9afdfa\") " pod="openshift-dns/dns-default-x5hqp"
Apr 16 19:30:50.660689 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:50.660278 2579 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 19:30:50.660689 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:50.660389 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 19:30:50.660689 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:50.660292 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 19:30:50.660689 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:50.660446 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/599e545f-2894-45d3-ad19-ef2025af0502-samples-operator-tls podName:599e545f-2894-45d3-ad19-ef2025af0502 nodeName:}" failed. No retries permitted until 2026-04-16 19:30:51.660426201 +0000 UTC m=+35.237204411 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/599e545f-2894-45d3-ad19-ef2025af0502-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-6p4l7" (UID: "599e545f-2894-45d3-ad19-ef2025af0502") : secret "samples-operator-tls" not found
Apr 16 19:30:50.660689 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:50.660447 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7dcf7c4679-t62l6: secret "image-registry-tls" not found
Apr 16 19:30:50.660689 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:50.660469 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e038a6a1-56fa-476b-8faf-dc54fd9afdfa-metrics-tls podName:e038a6a1-56fa-476b-8faf-dc54fd9afdfa nodeName:}" failed. No retries permitted until 2026-04-16 19:30:51.660453604 +0000 UTC m=+35.237231794 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e038a6a1-56fa-476b-8faf-dc54fd9afdfa-metrics-tls") pod "dns-default-x5hqp" (UID: "e038a6a1-56fa-476b-8faf-dc54fd9afdfa") : secret "dns-default-metrics-tls" not found
Apr 16 19:30:50.660689 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:50.660316 2579 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 19:30:50.660689 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:50.660488 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c51aa13f-e270-4caa-b4b8-0133dfaf5f84-registry-tls podName:c51aa13f-e270-4caa-b4b8-0133dfaf5f84 nodeName:}" failed. No retries permitted until 2026-04-16 19:30:51.660477889 +0000 UTC m=+35.237256082 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c51aa13f-e270-4caa-b4b8-0133dfaf5f84-registry-tls") pod "image-registry-7dcf7c4679-t62l6" (UID: "c51aa13f-e270-4caa-b4b8-0133dfaf5f84") : secret "image-registry-tls" not found
Apr 16 19:30:50.660689 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:50.660506 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/91f7dc43-a855-4712-8639-caad7b7a8458-networking-console-plugin-cert podName:91f7dc43-a855-4712-8639-caad7b7a8458 nodeName:}" failed. No retries permitted until 2026-04-16 19:30:51.660495505 +0000 UTC m=+35.237273697 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/91f7dc43-a855-4712-8639-caad7b7a8458-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-96nm2" (UID: "91f7dc43-a855-4712-8639-caad7b7a8458") : secret "networking-console-plugin-cert" not found
Apr 16 19:30:50.761185 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:50.761110 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4063c915-74be-464e-845c-caccf6e297c5-cert\") pod \"ingress-canary-4dv2w\" (UID: \"4063c915-74be-464e-845c-caccf6e297c5\") " pod="openshift-ingress-canary/ingress-canary-4dv2w"
Apr 16 19:30:50.761390 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:50.761257 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 19:30:50.761390 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:50.761331 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4063c915-74be-464e-845c-caccf6e297c5-cert podName:4063c915-74be-464e-845c-caccf6e297c5 nodeName:}" failed. No retries permitted until 2026-04-16 19:30:51.761311056 +0000 UTC m=+35.338089245 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4063c915-74be-464e-845c-caccf6e297c5-cert") pod "ingress-canary-4dv2w" (UID: "4063c915-74be-464e-845c-caccf6e297c5") : secret "canary-serving-cert" not found
Apr 16 19:30:51.285395 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:51.285366 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-ckt7c"]
Apr 16 19:30:51.297012 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:51.296956 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dm5bc"]
Apr 16 19:30:51.300173 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:51.300143 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2dz6q"]
Apr 16 19:30:51.300869 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:51.300813 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-ph52k"]
Apr 16 19:30:51.307478 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:51.307458 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-z9djn"]
Apr 16 19:30:51.330311 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:51.330285 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5d951a1_3e60_4517_ae8b_75bba19200c9.slice/crio-427b093cde8e49f4f2947fe59280b7975e58f6fb78cebaa0408fd0421ff793d3 WatchSource:0}: Error finding container 427b093cde8e49f4f2947fe59280b7975e58f6fb78cebaa0408fd0421ff793d3: Status 404 returned error can't find the container with id 427b093cde8e49f4f2947fe59280b7975e58f6fb78cebaa0408fd0421ff793d3
Apr 16 19:30:51.331129 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:51.331106 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb8666d2_cbd8_4ed1_8781_eb4176da58ab.slice/crio-19998a867c844b14c1db0f88462348cfb2739560ee8d6f4491e06589f0e2194e WatchSource:0}: Error finding container 19998a867c844b14c1db0f88462348cfb2739560ee8d6f4491e06589f0e2194e: Status 404 returned error can't find the container with id 19998a867c844b14c1db0f88462348cfb2739560ee8d6f4491e06589f0e2194e
Apr 16 19:30:51.332105 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:51.332053 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfba37944_a4ef_4148_aae0_6082d55a03b2.slice/crio-11b9e33d200c124a9819ca4805421ffddda789fcaf51409abf02f29d4a8b75cf WatchSource:0}: Error finding container 11b9e33d200c124a9819ca4805421ffddda789fcaf51409abf02f29d4a8b75cf: Status 404 returned error can't find the container with id 11b9e33d200c124a9819ca4805421ffddda789fcaf51409abf02f29d4a8b75cf
Apr 16 19:30:51.333538 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:51.333516 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33c6084e_53d2_4e83_85bf_1e53dca8d967.slice/crio-6b51992e0fa4ee9fa843e5a6161cfca697fbda363f64d9ba523b9c2c3b9a7c85 WatchSource:0}: Error finding container 6b51992e0fa4ee9fa843e5a6161cfca697fbda363f64d9ba523b9c2c3b9a7c85: Status 404 returned error can't find the container with id 6b51992e0fa4ee9fa843e5a6161cfca697fbda363f64d9ba523b9c2c3b9a7c85
Apr 16 19:30:51.337853 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:30:51.337834 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81c37a9f_fd72_4cbf_af17_3b94a3a81d69.slice/crio-38e8f8cd815f59382bf337adcdaf1febd5e4d1da0da29fab1107939a27ab95be WatchSource:0}: Error finding container 38e8f8cd815f59382bf337adcdaf1febd5e4d1da0da29fab1107939a27ab95be: Status 404 returned error can't find the container with id 38e8f8cd815f59382bf337adcdaf1febd5e4d1da0da29fab1107939a27ab95be
Apr 16 19:30:51.670004 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:51.669977 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c51aa13f-e270-4caa-b4b8-0133dfaf5f84-registry-tls\") pod \"image-registry-7dcf7c4679-t62l6\" (UID: \"c51aa13f-e270-4caa-b4b8-0133dfaf5f84\") " pod="openshift-image-registry/image-registry-7dcf7c4679-t62l6"
Apr 16 19:30:51.670377 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:51.670016 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/599e545f-2894-45d3-ad19-ef2025af0502-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-6p4l7\" (UID: \"599e545f-2894-45d3-ad19-ef2025af0502\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-6p4l7"
Apr 16 19:30:51.670377 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:51.670135 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/91f7dc43-a855-4712-8639-caad7b7a8458-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-96nm2\" (UID: \"91f7dc43-a855-4712-8639-caad7b7a8458\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-96nm2"
Apr 16 19:30:51.670377 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:51.670163 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 16 19:30:51.670377 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:51.670188 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7dcf7c4679-t62l6: secret "image-registry-tls" not found
Apr 16 19:30:51.670377 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:51.670253 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 19:30:51.670377 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:51.670275 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c51aa13f-e270-4caa-b4b8-0133dfaf5f84-registry-tls podName:c51aa13f-e270-4caa-b4b8-0133dfaf5f84 nodeName:}" failed. No retries permitted until 2026-04-16 19:30:53.670252876 +0000 UTC m=+37.247031085 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c51aa13f-e270-4caa-b4b8-0133dfaf5f84-registry-tls") pod "image-registry-7dcf7c4679-t62l6" (UID: "c51aa13f-e270-4caa-b4b8-0133dfaf5f84") : secret "image-registry-tls" not found
Apr 16 19:30:51.670377 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:51.670171 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e038a6a1-56fa-476b-8faf-dc54fd9afdfa-metrics-tls\") pod \"dns-default-x5hqp\" (UID: \"e038a6a1-56fa-476b-8faf-dc54fd9afdfa\") " pod="openshift-dns/dns-default-x5hqp"
Apr 16 19:30:51.670377 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:51.670287 2579 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 16 19:30:51.670377 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:51.670297 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e038a6a1-56fa-476b-8faf-dc54fd9afdfa-metrics-tls podName:e038a6a1-56fa-476b-8faf-dc54fd9afdfa nodeName:}" failed. No retries permitted until 2026-04-16 19:30:53.670282206 +0000 UTC m=+37.247060398 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e038a6a1-56fa-476b-8faf-dc54fd9afdfa-metrics-tls") pod "dns-default-x5hqp" (UID: "e038a6a1-56fa-476b-8faf-dc54fd9afdfa") : secret "dns-default-metrics-tls" not found
Apr 16 19:30:51.670377 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:51.670331 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/91f7dc43-a855-4712-8639-caad7b7a8458-networking-console-plugin-cert podName:91f7dc43-a855-4712-8639-caad7b7a8458 nodeName:}" failed. No retries permitted until 2026-04-16 19:30:53.670320273 +0000 UTC m=+37.247098468 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/91f7dc43-a855-4712-8639-caad7b7a8458-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-96nm2" (UID: "91f7dc43-a855-4712-8639-caad7b7a8458") : secret "networking-console-plugin-cert" not found
Apr 16 19:30:51.670377 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:51.670348 2579 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 16 19:30:51.670377 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:51.670388 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/599e545f-2894-45d3-ad19-ef2025af0502-samples-operator-tls podName:599e545f-2894-45d3-ad19-ef2025af0502 nodeName:}" failed. No retries permitted until 2026-04-16 19:30:53.670374504 +0000 UTC m=+37.247152693 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/599e545f-2894-45d3-ad19-ef2025af0502-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-6p4l7" (UID: "599e545f-2894-45d3-ad19-ef2025af0502") : secret "samples-operator-tls" not found
Apr 16 19:30:51.770819 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:51.770786 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4063c915-74be-464e-845c-caccf6e297c5-cert\") pod \"ingress-canary-4dv2w\" (UID: \"4063c915-74be-464e-845c-caccf6e297c5\") " pod="openshift-ingress-canary/ingress-canary-4dv2w"
Apr 16 19:30:51.770972 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:51.770896 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 19:30:51.770972 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:51.770944 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4063c915-74be-464e-845c-caccf6e297c5-cert podName:4063c915-74be-464e-845c-caccf6e297c5 nodeName:}" failed. No retries permitted until 2026-04-16 19:30:53.770931564 +0000 UTC m=+37.347709752 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4063c915-74be-464e-845c-caccf6e297c5-cert") pod "ingress-canary-4dv2w" (UID: "4063c915-74be-464e-845c-caccf6e297c5") : secret "canary-serving-cert" not found
Apr 16 19:30:52.176698 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:52.176430 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-z9djn" event={"ID":"81c37a9f-fd72-4cbf-af17-3b94a3a81d69","Type":"ContainerStarted","Data":"38e8f8cd815f59382bf337adcdaf1febd5e4d1da0da29fab1107939a27ab95be"}
Apr 16 19:30:52.178657 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:52.178576 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dm5bc" event={"ID":"eb8666d2-cbd8-4ed1-8781-eb4176da58ab","Type":"ContainerStarted","Data":"19998a867c844b14c1db0f88462348cfb2739560ee8d6f4491e06589f0e2194e"}
Apr 16 19:30:52.180587 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:52.180508 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2dz6q" event={"ID":"fba37944-a4ef-4148-aae0-6082d55a03b2","Type":"ContainerStarted","Data":"11b9e33d200c124a9819ca4805421ffddda789fcaf51409abf02f29d4a8b75cf"}
Apr 16 19:30:52.182915 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:52.182859 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-ckt7c" event={"ID":"d5d951a1-3e60-4517-ae8b-75bba19200c9","Type":"ContainerStarted","Data":"427b093cde8e49f4f2947fe59280b7975e58f6fb78cebaa0408fd0421ff793d3"}
Apr 16 19:30:52.184110 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:52.184065 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-ph52k" event={"ID":"33c6084e-53d2-4e83-85bf-1e53dca8d967","Type":"ContainerStarted","Data":"6b51992e0fa4ee9fa843e5a6161cfca697fbda363f64d9ba523b9c2c3b9a7c85"}
Apr 16 19:30:52.187254 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:52.187190 2579 generic.go:358] "Generic (PLEG): container finished" podID="6c0704bd-2f0f-4e78-8573-cf9346b4ae16" containerID="da30f8bd9b302617cfaa9426498887eac6f36612ee0b5b49391573227e4366dd" exitCode=0
Apr 16 19:30:52.187254 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:52.187236 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-whspd" event={"ID":"6c0704bd-2f0f-4e78-8573-cf9346b4ae16","Type":"ContainerDied","Data":"da30f8bd9b302617cfaa9426498887eac6f36612ee0b5b49391573227e4366dd"}
Apr 16 19:30:53.196723 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:53.196671 2579 generic.go:358] "Generic (PLEG): container finished" podID="6c0704bd-2f0f-4e78-8573-cf9346b4ae16" containerID="34d19f16a2cbddbf96ff21df2beddc77e14833caff87a04901919eb10ba54bd8" exitCode=0
Apr 16 19:30:53.197289 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:53.196729 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-whspd" event={"ID":"6c0704bd-2f0f-4e78-8573-cf9346b4ae16","Type":"ContainerDied","Data":"34d19f16a2cbddbf96ff21df2beddc77e14833caff87a04901919eb10ba54bd8"}
Apr 16 19:30:53.692416 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:53.692390 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e038a6a1-56fa-476b-8faf-dc54fd9afdfa-metrics-tls\") pod \"dns-default-x5hqp\" (UID: \"e038a6a1-56fa-476b-8faf-dc54fd9afdfa\") " pod="openshift-dns/dns-default-x5hqp"
Apr 16 19:30:53.692533 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:53.692492 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c51aa13f-e270-4caa-b4b8-0133dfaf5f84-registry-tls\") pod \"image-registry-7dcf7c4679-t62l6\" (UID: \"c51aa13f-e270-4caa-b4b8-0133dfaf5f84\") " pod="openshift-image-registry/image-registry-7dcf7c4679-t62l6"
Apr 16 19:30:53.692591 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:53.692531 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 19:30:53.692591 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:53.692551 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/599e545f-2894-45d3-ad19-ef2025af0502-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-6p4l7\" (UID: \"599e545f-2894-45d3-ad19-ef2025af0502\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-6p4l7"
Apr 16 19:30:53.692688 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:53.692600 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e038a6a1-56fa-476b-8faf-dc54fd9afdfa-metrics-tls podName:e038a6a1-56fa-476b-8faf-dc54fd9afdfa nodeName:}" failed. No retries permitted until 2026-04-16 19:30:57.692583072 +0000 UTC m=+41.269361262 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e038a6a1-56fa-476b-8faf-dc54fd9afdfa-metrics-tls") pod "dns-default-x5hqp" (UID: "e038a6a1-56fa-476b-8faf-dc54fd9afdfa") : secret "dns-default-metrics-tls" not found Apr 16 19:30:53.692688 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:53.692624 2579 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 19:30:53.692688 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:53.692653 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 19:30:53.692688 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:53.692661 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/91f7dc43-a855-4712-8639-caad7b7a8458-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-96nm2\" (UID: \"91f7dc43-a855-4712-8639-caad7b7a8458\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-96nm2" Apr 16 19:30:53.692688 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:53.692673 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7dcf7c4679-t62l6: secret "image-registry-tls" not found Apr 16 19:30:53.692885 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:53.692712 2579 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 19:30:53.692885 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:53.692728 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c51aa13f-e270-4caa-b4b8-0133dfaf5f84-registry-tls podName:c51aa13f-e270-4caa-b4b8-0133dfaf5f84 nodeName:}" failed. 
No retries permitted until 2026-04-16 19:30:57.692708786 +0000 UTC m=+41.269486992 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c51aa13f-e270-4caa-b4b8-0133dfaf5f84-registry-tls") pod "image-registry-7dcf7c4679-t62l6" (UID: "c51aa13f-e270-4caa-b4b8-0133dfaf5f84") : secret "image-registry-tls" not found Apr 16 19:30:53.692885 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:53.692753 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/91f7dc43-a855-4712-8639-caad7b7a8458-networking-console-plugin-cert podName:91f7dc43-a855-4712-8639-caad7b7a8458 nodeName:}" failed. No retries permitted until 2026-04-16 19:30:57.69273658 +0000 UTC m=+41.269514769 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/91f7dc43-a855-4712-8639-caad7b7a8458-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-96nm2" (UID: "91f7dc43-a855-4712-8639-caad7b7a8458") : secret "networking-console-plugin-cert" not found Apr 16 19:30:53.692885 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:53.692770 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/599e545f-2894-45d3-ad19-ef2025af0502-samples-operator-tls podName:599e545f-2894-45d3-ad19-ef2025af0502 nodeName:}" failed. No retries permitted until 2026-04-16 19:30:57.692760809 +0000 UTC m=+41.269539001 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/599e545f-2894-45d3-ad19-ef2025af0502-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-6p4l7" (UID: "599e545f-2894-45d3-ad19-ef2025af0502") : secret "samples-operator-tls" not found Apr 16 19:30:53.793915 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:53.793830 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4063c915-74be-464e-845c-caccf6e297c5-cert\") pod \"ingress-canary-4dv2w\" (UID: \"4063c915-74be-464e-845c-caccf6e297c5\") " pod="openshift-ingress-canary/ingress-canary-4dv2w" Apr 16 19:30:53.794146 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:53.794097 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 19:30:53.794248 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:53.794180 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4063c915-74be-464e-845c-caccf6e297c5-cert podName:4063c915-74be-464e-845c-caccf6e297c5 nodeName:}" failed. No retries permitted until 2026-04-16 19:30:57.794157979 +0000 UTC m=+41.370936195 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4063c915-74be-464e-845c-caccf6e297c5-cert") pod "ingress-canary-4dv2w" (UID: "4063c915-74be-464e-845c-caccf6e297c5") : secret "canary-serving-cert" not found Apr 16 19:30:57.726521 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:57.726358 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c51aa13f-e270-4caa-b4b8-0133dfaf5f84-registry-tls\") pod \"image-registry-7dcf7c4679-t62l6\" (UID: \"c51aa13f-e270-4caa-b4b8-0133dfaf5f84\") " pod="openshift-image-registry/image-registry-7dcf7c4679-t62l6" Apr 16 19:30:57.726521 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:57.726503 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 19:30:57.726521 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:57.726527 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7dcf7c4679-t62l6: secret "image-registry-tls" not found Apr 16 19:30:57.727314 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:57.726544 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/599e545f-2894-45d3-ad19-ef2025af0502-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-6p4l7\" (UID: \"599e545f-2894-45d3-ad19-ef2025af0502\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-6p4l7" Apr 16 19:30:57.727314 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:57.726578 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c51aa13f-e270-4caa-b4b8-0133dfaf5f84-registry-tls podName:c51aa13f-e270-4caa-b4b8-0133dfaf5f84 nodeName:}" failed. 
No retries permitted until 2026-04-16 19:31:05.726563679 +0000 UTC m=+49.303341868 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c51aa13f-e270-4caa-b4b8-0133dfaf5f84-registry-tls") pod "image-registry-7dcf7c4679-t62l6" (UID: "c51aa13f-e270-4caa-b4b8-0133dfaf5f84") : secret "image-registry-tls" not found Apr 16 19:30:57.727314 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:57.726638 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/91f7dc43-a855-4712-8639-caad7b7a8458-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-96nm2\" (UID: \"91f7dc43-a855-4712-8639-caad7b7a8458\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-96nm2" Apr 16 19:30:57.727314 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:57.726656 2579 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 19:30:57.727314 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:57.726700 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/599e545f-2894-45d3-ad19-ef2025af0502-samples-operator-tls podName:599e545f-2894-45d3-ad19-ef2025af0502 nodeName:}" failed. No retries permitted until 2026-04-16 19:31:05.726686594 +0000 UTC m=+49.303464795 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/599e545f-2894-45d3-ad19-ef2025af0502-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-6p4l7" (UID: "599e545f-2894-45d3-ad19-ef2025af0502") : secret "samples-operator-tls" not found Apr 16 19:30:57.727314 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:57.726718 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e038a6a1-56fa-476b-8faf-dc54fd9afdfa-metrics-tls\") pod \"dns-default-x5hqp\" (UID: \"e038a6a1-56fa-476b-8faf-dc54fd9afdfa\") " pod="openshift-dns/dns-default-x5hqp" Apr 16 19:30:57.727314 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:57.726724 2579 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 19:30:57.727314 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:57.726771 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/91f7dc43-a855-4712-8639-caad7b7a8458-networking-console-plugin-cert podName:91f7dc43-a855-4712-8639-caad7b7a8458 nodeName:}" failed. No retries permitted until 2026-04-16 19:31:05.726756685 +0000 UTC m=+49.303534886 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/91f7dc43-a855-4712-8639-caad7b7a8458-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-96nm2" (UID: "91f7dc43-a855-4712-8639-caad7b7a8458") : secret "networking-console-plugin-cert" not found Apr 16 19:30:57.727314 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:57.726789 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 19:30:57.727314 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:57.726820 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e038a6a1-56fa-476b-8faf-dc54fd9afdfa-metrics-tls podName:e038a6a1-56fa-476b-8faf-dc54fd9afdfa nodeName:}" failed. No retries permitted until 2026-04-16 19:31:05.726812737 +0000 UTC m=+49.303590925 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e038a6a1-56fa-476b-8faf-dc54fd9afdfa-metrics-tls") pod "dns-default-x5hqp" (UID: "e038a6a1-56fa-476b-8faf-dc54fd9afdfa") : secret "dns-default-metrics-tls" not found Apr 16 19:30:57.827634 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:57.827596 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4063c915-74be-464e-845c-caccf6e297c5-cert\") pod \"ingress-canary-4dv2w\" (UID: \"4063c915-74be-464e-845c-caccf6e297c5\") " pod="openshift-ingress-canary/ingress-canary-4dv2w" Apr 16 19:30:57.827764 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:57.827709 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 19:30:57.827834 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:57.827768 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4063c915-74be-464e-845c-caccf6e297c5-cert 
podName:4063c915-74be-464e-845c-caccf6e297c5 nodeName:}" failed. No retries permitted until 2026-04-16 19:31:05.827753275 +0000 UTC m=+49.404531470 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4063c915-74be-464e-845c-caccf6e297c5-cert") pod "ingress-canary-4dv2w" (UID: "4063c915-74be-464e-845c-caccf6e297c5") : secret "canary-serving-cert" not found Apr 16 19:30:58.213565 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:58.213530 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-whspd" event={"ID":"6c0704bd-2f0f-4e78-8573-cf9346b4ae16","Type":"ContainerStarted","Data":"1e2775fde56d3d73f210c0b5ab4f45a7bc630298f940aa134733b80ba5f95d1a"} Apr 16 19:30:58.214912 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:58.214885 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-z9djn" event={"ID":"81c37a9f-fd72-4cbf-af17-3b94a3a81d69","Type":"ContainerStarted","Data":"11a42d6423aaed459b8af52e5d3a5612ed6ac28f9d8a589a2e1be496cb3d1ea9"} Apr 16 19:30:58.216405 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:58.216376 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dm5bc" event={"ID":"eb8666d2-cbd8-4ed1-8781-eb4176da58ab","Type":"ContainerStarted","Data":"4112e2e2ba8a7bd5bb3b4f491ea98edb6d912aee30f8ef7b41dcf50048f1858f"} Apr 16 19:30:58.217762 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:58.217732 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2dz6q" event={"ID":"fba37944-a4ef-4148-aae0-6082d55a03b2","Type":"ContainerStarted","Data":"6c54cf990955cea0e0a9141fe338f9cbb39936a3768f6391308228cf92e157f1"} Apr 16 19:30:58.219304 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:58.219284 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ckt7c_d5d951a1-3e60-4517-ae8b-75bba19200c9/console-operator/0.log" Apr 16 19:30:58.219444 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:58.219325 2579 generic.go:358] "Generic (PLEG): container finished" podID="d5d951a1-3e60-4517-ae8b-75bba19200c9" containerID="b312bc74722b38d4243365ec94ee1c93ea8ab8b3752290e3b6195b0166f2ae06" exitCode=255 Apr 16 19:30:58.219444 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:58.219394 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-ckt7c" event={"ID":"d5d951a1-3e60-4517-ae8b-75bba19200c9","Type":"ContainerDied","Data":"b312bc74722b38d4243365ec94ee1c93ea8ab8b3752290e3b6195b0166f2ae06"} Apr 16 19:30:58.219668 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:58.219649 2579 scope.go:117] "RemoveContainer" containerID="b312bc74722b38d4243365ec94ee1c93ea8ab8b3752290e3b6195b0166f2ae06" Apr 16 19:30:58.220808 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:58.220786 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-ph52k" event={"ID":"33c6084e-53d2-4e83-85bf-1e53dca8d967","Type":"ContainerStarted","Data":"39213adc7905f8ae4cac4276dd79d47b3dcde732f8264b7e4a8d70d173d7b6ba"} Apr 16 19:30:58.238752 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:58.238707 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-whspd" podStartSLOduration=8.005825414 podStartE2EDuration="41.238694286s" podCreationTimestamp="2026-04-16 19:30:17 +0000 UTC" firstStartedPulling="2026-04-16 19:30:18.145391619 +0000 UTC m=+1.722169811" lastFinishedPulling="2026-04-16 19:30:51.378260493 +0000 UTC m=+34.955038683" observedRunningTime="2026-04-16 19:30:58.236828097 +0000 UTC m=+41.813606309" watchObservedRunningTime="2026-04-16 19:30:58.238694286 +0000 
UTC m=+41.815472499" Apr 16 19:30:58.252530 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:58.252483 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-z9djn" podStartSLOduration=17.366359545999998 podStartE2EDuration="23.252469136s" podCreationTimestamp="2026-04-16 19:30:35 +0000 UTC" firstStartedPulling="2026-04-16 19:30:51.356326252 +0000 UTC m=+34.933104452" lastFinishedPulling="2026-04-16 19:30:57.242435839 +0000 UTC m=+40.819214042" observedRunningTime="2026-04-16 19:30:58.251271458 +0000 UTC m=+41.828049672" watchObservedRunningTime="2026-04-16 19:30:58.252469136 +0000 UTC m=+41.829247350" Apr 16 19:30:58.268767 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:58.268440 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-ph52k" podStartSLOduration=30.371584798 podStartE2EDuration="36.268426928s" podCreationTimestamp="2026-04-16 19:30:22 +0000 UTC" firstStartedPulling="2026-04-16 19:30:51.335732469 +0000 UTC m=+34.912510659" lastFinishedPulling="2026-04-16 19:30:57.232574583 +0000 UTC m=+40.809352789" observedRunningTime="2026-04-16 19:30:58.26796545 +0000 UTC m=+41.844743677" watchObservedRunningTime="2026-04-16 19:30:58.268426928 +0000 UTC m=+41.845205140" Apr 16 19:30:58.284134 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:58.284100 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dm5bc" podStartSLOduration=30.385179631 podStartE2EDuration="36.284090589s" podCreationTimestamp="2026-04-16 19:30:22 +0000 UTC" firstStartedPulling="2026-04-16 19:30:51.333561411 +0000 UTC m=+34.910339603" lastFinishedPulling="2026-04-16 19:30:57.232472356 +0000 UTC m=+40.809250561" observedRunningTime="2026-04-16 19:30:58.283533168 +0000 UTC m=+41.860311379" 
watchObservedRunningTime="2026-04-16 19:30:58.284090589 +0000 UTC m=+41.860868799" Apr 16 19:30:58.310801 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:58.310756 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2dz6q" podStartSLOduration=30.412850429 podStartE2EDuration="36.310743281s" podCreationTimestamp="2026-04-16 19:30:22 +0000 UTC" firstStartedPulling="2026-04-16 19:30:51.33501535 +0000 UTC m=+34.911793547" lastFinishedPulling="2026-04-16 19:30:57.232908211 +0000 UTC m=+40.809686399" observedRunningTime="2026-04-16 19:30:58.309757148 +0000 UTC m=+41.886535369" watchObservedRunningTime="2026-04-16 19:30:58.310743281 +0000 UTC m=+41.887521493" Apr 16 19:30:59.225422 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:59.225396 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ckt7c_d5d951a1-3e60-4517-ae8b-75bba19200c9/console-operator/1.log" Apr 16 19:30:59.225830 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:59.225740 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ckt7c_d5d951a1-3e60-4517-ae8b-75bba19200c9/console-operator/0.log" Apr 16 19:30:59.225830 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:59.225773 2579 generic.go:358] "Generic (PLEG): container finished" podID="d5d951a1-3e60-4517-ae8b-75bba19200c9" containerID="fda028a3966a33e00a043822bf78432b96a7e98cdeca38e15cba5d7ac93bf58b" exitCode=255 Apr 16 19:30:59.225988 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:59.225852 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-ckt7c" event={"ID":"d5d951a1-3e60-4517-ae8b-75bba19200c9","Type":"ContainerDied","Data":"fda028a3966a33e00a043822bf78432b96a7e98cdeca38e15cba5d7ac93bf58b"} Apr 16 19:30:59.225988 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:59.225881 2579 
scope.go:117] "RemoveContainer" containerID="b312bc74722b38d4243365ec94ee1c93ea8ab8b3752290e3b6195b0166f2ae06" Apr 16 19:30:59.226094 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:30:59.226062 2579 scope.go:117] "RemoveContainer" containerID="fda028a3966a33e00a043822bf78432b96a7e98cdeca38e15cba5d7ac93bf58b" Apr 16 19:30:59.226325 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:30:59.226293 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-ckt7c_openshift-console-operator(d5d951a1-3e60-4517-ae8b-75bba19200c9)\"" pod="openshift-console-operator/console-operator-9d4b6777b-ckt7c" podUID="d5d951a1-3e60-4517-ae8b-75bba19200c9" Apr 16 19:31:00.233082 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:00.233054 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ckt7c_d5d951a1-3e60-4517-ae8b-75bba19200c9/console-operator/1.log" Apr 16 19:31:00.233552 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:00.233416 2579 scope.go:117] "RemoveContainer" containerID="fda028a3966a33e00a043822bf78432b96a7e98cdeca38e15cba5d7ac93bf58b" Apr 16 19:31:00.233611 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:31:00.233581 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-ckt7c_openshift-console-operator(d5d951a1-3e60-4517-ae8b-75bba19200c9)\"" pod="openshift-console-operator/console-operator-9d4b6777b-ckt7c" podUID="d5d951a1-3e60-4517-ae8b-75bba19200c9" Apr 16 19:31:00.234467 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:00.234450 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-ckt7c" Apr 
16 19:31:00.234522 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:00.234484 2579 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-ckt7c" Apr 16 19:31:01.235029 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:01.234996 2579 scope.go:117] "RemoveContainer" containerID="fda028a3966a33e00a043822bf78432b96a7e98cdeca38e15cba5d7ac93bf58b" Apr 16 19:31:01.235536 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:31:01.235253 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-ckt7c_openshift-console-operator(d5d951a1-3e60-4517-ae8b-75bba19200c9)\"" pod="openshift-console-operator/console-operator-9d4b6777b-ckt7c" podUID="d5d951a1-3e60-4517-ae8b-75bba19200c9" Apr 16 19:31:02.236624 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:02.236601 2579 scope.go:117] "RemoveContainer" containerID="fda028a3966a33e00a043822bf78432b96a7e98cdeca38e15cba5d7ac93bf58b" Apr 16 19:31:02.236993 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:31:02.236767 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-ckt7c_openshift-console-operator(d5d951a1-3e60-4517-ae8b-75bba19200c9)\"" pod="openshift-console-operator/console-operator-9d4b6777b-ckt7c" podUID="d5d951a1-3e60-4517-ae8b-75bba19200c9" Apr 16 19:31:05.793469 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:05.793433 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e038a6a1-56fa-476b-8faf-dc54fd9afdfa-metrics-tls\") pod \"dns-default-x5hqp\" (UID: \"e038a6a1-56fa-476b-8faf-dc54fd9afdfa\") " pod="openshift-dns/dns-default-x5hqp" Apr 16 
19:31:05.793872 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:05.793497 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c51aa13f-e270-4caa-b4b8-0133dfaf5f84-registry-tls\") pod \"image-registry-7dcf7c4679-t62l6\" (UID: \"c51aa13f-e270-4caa-b4b8-0133dfaf5f84\") " pod="openshift-image-registry/image-registry-7dcf7c4679-t62l6" Apr 16 19:31:05.793872 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:05.793527 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/599e545f-2894-45d3-ad19-ef2025af0502-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-6p4l7\" (UID: \"599e545f-2894-45d3-ad19-ef2025af0502\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-6p4l7" Apr 16 19:31:05.793872 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:05.793563 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/91f7dc43-a855-4712-8639-caad7b7a8458-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-96nm2\" (UID: \"91f7dc43-a855-4712-8639-caad7b7a8458\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-96nm2" Apr 16 19:31:05.793872 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:31:05.793580 2579 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 19:31:05.793872 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:31:05.793649 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e038a6a1-56fa-476b-8faf-dc54fd9afdfa-metrics-tls podName:e038a6a1-56fa-476b-8faf-dc54fd9afdfa nodeName:}" failed. No retries permitted until 2026-04-16 19:31:21.793633018 +0000 UTC m=+65.370411210 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e038a6a1-56fa-476b-8faf-dc54fd9afdfa-metrics-tls") pod "dns-default-x5hqp" (UID: "e038a6a1-56fa-476b-8faf-dc54fd9afdfa") : secret "dns-default-metrics-tls" not found Apr 16 19:31:05.793872 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:31:05.793582 2579 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 19:31:05.793872 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:31:05.793661 2579 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 19:31:05.793872 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:31:05.793668 2579 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7dcf7c4679-t62l6: secret "image-registry-tls" not found Apr 16 19:31:05.793872 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:31:05.793684 2579 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 16 19:31:05.793872 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:31:05.793720 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/599e545f-2894-45d3-ad19-ef2025af0502-samples-operator-tls podName:599e545f-2894-45d3-ad19-ef2025af0502 nodeName:}" failed. No retries permitted until 2026-04-16 19:31:21.793702523 +0000 UTC m=+65.370480729 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/599e545f-2894-45d3-ad19-ef2025af0502-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-6p4l7" (UID: "599e545f-2894-45d3-ad19-ef2025af0502") : secret "samples-operator-tls" not found
Apr 16 19:31:05.793872 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:31:05.793736 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/91f7dc43-a855-4712-8639-caad7b7a8458-networking-console-plugin-cert podName:91f7dc43-a855-4712-8639-caad7b7a8458 nodeName:}" failed. No retries permitted until 2026-04-16 19:31:21.793729385 +0000 UTC m=+65.370507575 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/91f7dc43-a855-4712-8639-caad7b7a8458-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-96nm2" (UID: "91f7dc43-a855-4712-8639-caad7b7a8458") : secret "networking-console-plugin-cert" not found
Apr 16 19:31:05.793872 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:31:05.793746 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c51aa13f-e270-4caa-b4b8-0133dfaf5f84-registry-tls podName:c51aa13f-e270-4caa-b4b8-0133dfaf5f84 nodeName:}" failed. No retries permitted until 2026-04-16 19:31:21.793740856 +0000 UTC m=+65.370519045 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c51aa13f-e270-4caa-b4b8-0133dfaf5f84-registry-tls") pod "image-registry-7dcf7c4679-t62l6" (UID: "c51aa13f-e270-4caa-b4b8-0133dfaf5f84") : secret "image-registry-tls" not found
Apr 16 19:31:05.894727 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:05.894699 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4063c915-74be-464e-845c-caccf6e297c5-cert\") pod \"ingress-canary-4dv2w\" (UID: \"4063c915-74be-464e-845c-caccf6e297c5\") " pod="openshift-ingress-canary/ingress-canary-4dv2w"
Apr 16 19:31:05.894850 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:31:05.894791 2579 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 19:31:05.894850 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:31:05.894835 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4063c915-74be-464e-845c-caccf6e297c5-cert podName:4063c915-74be-464e-845c-caccf6e297c5 nodeName:}" failed. No retries permitted until 2026-04-16 19:31:21.894824052 +0000 UTC m=+65.471602241 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4063c915-74be-464e-845c-caccf6e297c5-cert") pod "ingress-canary-4dv2w" (UID: "4063c915-74be-464e-845c-caccf6e297c5") : secret "canary-serving-cert" not found
Apr 16 19:31:15.173490 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:15.173465 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mx7dg"
Apr 16 19:31:16.944862 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:16.944831 2579 scope.go:117] "RemoveContainer" containerID="fda028a3966a33e00a043822bf78432b96a7e98cdeca38e15cba5d7ac93bf58b"
Apr 16 19:31:17.266472 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:17.266397 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ckt7c_d5d951a1-3e60-4517-ae8b-75bba19200c9/console-operator/2.log"
Apr 16 19:31:17.266783 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:17.266768 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ckt7c_d5d951a1-3e60-4517-ae8b-75bba19200c9/console-operator/1.log"
Apr 16 19:31:17.266869 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:17.266804 2579 generic.go:358] "Generic (PLEG): container finished" podID="d5d951a1-3e60-4517-ae8b-75bba19200c9" containerID="0a9293e8d101828bef0b0651a8b8139db4b3b36e2df1927c48eb5e0b327ae514" exitCode=255
Apr 16 19:31:17.266869 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:17.266842 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-ckt7c" event={"ID":"d5d951a1-3e60-4517-ae8b-75bba19200c9","Type":"ContainerDied","Data":"0a9293e8d101828bef0b0651a8b8139db4b3b36e2df1927c48eb5e0b327ae514"}
Apr 16 19:31:17.266974 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:17.266883 2579 scope.go:117] "RemoveContainer" containerID="fda028a3966a33e00a043822bf78432b96a7e98cdeca38e15cba5d7ac93bf58b"
Apr 16 19:31:17.267218 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:17.267184 2579 scope.go:117] "RemoveContainer" containerID="0a9293e8d101828bef0b0651a8b8139db4b3b36e2df1927c48eb5e0b327ae514"
Apr 16 19:31:17.267409 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:31:17.267382 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-ckt7c_openshift-console-operator(d5d951a1-3e60-4517-ae8b-75bba19200c9)\"" pod="openshift-console-operator/console-operator-9d4b6777b-ckt7c" podUID="d5d951a1-3e60-4517-ae8b-75bba19200c9"
Apr 16 19:31:18.270164 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:18.270139 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ckt7c_d5d951a1-3e60-4517-ae8b-75bba19200c9/console-operator/2.log"
Apr 16 19:31:19.619989 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:19.619954 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-85f49fc854-txwk9"]
Apr 16 19:31:19.621822 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:19.621806 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-85f49fc854-txwk9"
Apr 16 19:31:19.624363 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:19.624343 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Apr 16 19:31:19.624761 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:19.624742 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Apr 16 19:31:19.626766 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:19.626743 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 16 19:31:19.626893 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:19.626780 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Apr 16 19:31:19.626893 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:19.626822 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 16 19:31:19.627028 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:19.627020 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-8knbr\""
Apr 16 19:31:19.627111 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:19.627092 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Apr 16 19:31:19.646221 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:19.646184 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-85f49fc854-txwk9"]
Apr 16 19:31:19.696553 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:19.696532 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/43c093f8-2338-4e6e-8349-b4e1575f5161-metrics-certs\") pod \"router-default-85f49fc854-txwk9\" (UID: \"43c093f8-2338-4e6e-8349-b4e1575f5161\") " pod="openshift-ingress/router-default-85f49fc854-txwk9"
Apr 16 19:31:19.696670 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:19.696574 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43c093f8-2338-4e6e-8349-b4e1575f5161-service-ca-bundle\") pod \"router-default-85f49fc854-txwk9\" (UID: \"43c093f8-2338-4e6e-8349-b4e1575f5161\") " pod="openshift-ingress/router-default-85f49fc854-txwk9"
Apr 16 19:31:19.696720 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:19.696667 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzg7z\" (UniqueName: \"kubernetes.io/projected/43c093f8-2338-4e6e-8349-b4e1575f5161-kube-api-access-lzg7z\") pod \"router-default-85f49fc854-txwk9\" (UID: \"43c093f8-2338-4e6e-8349-b4e1575f5161\") " pod="openshift-ingress/router-default-85f49fc854-txwk9"
Apr 16 19:31:19.696720 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:19.696702 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/43c093f8-2338-4e6e-8349-b4e1575f5161-default-certificate\") pod \"router-default-85f49fc854-txwk9\" (UID: \"43c093f8-2338-4e6e-8349-b4e1575f5161\") " pod="openshift-ingress/router-default-85f49fc854-txwk9"
Apr 16 19:31:19.696720 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:19.696717 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/43c093f8-2338-4e6e-8349-b4e1575f5161-stats-auth\") pod \"router-default-85f49fc854-txwk9\" (UID: \"43c093f8-2338-4e6e-8349-b4e1575f5161\") " pod="openshift-ingress/router-default-85f49fc854-txwk9"
Apr 16 19:31:19.797965 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:19.797939 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/43c093f8-2338-4e6e-8349-b4e1575f5161-metrics-certs\") pod \"router-default-85f49fc854-txwk9\" (UID: \"43c093f8-2338-4e6e-8349-b4e1575f5161\") " pod="openshift-ingress/router-default-85f49fc854-txwk9"
Apr 16 19:31:19.798099 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:19.798000 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43c093f8-2338-4e6e-8349-b4e1575f5161-service-ca-bundle\") pod \"router-default-85f49fc854-txwk9\" (UID: \"43c093f8-2338-4e6e-8349-b4e1575f5161\") " pod="openshift-ingress/router-default-85f49fc854-txwk9"
Apr 16 19:31:19.798099 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:19.798069 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lzg7z\" (UniqueName: \"kubernetes.io/projected/43c093f8-2338-4e6e-8349-b4e1575f5161-kube-api-access-lzg7z\") pod \"router-default-85f49fc854-txwk9\" (UID: \"43c093f8-2338-4e6e-8349-b4e1575f5161\") " pod="openshift-ingress/router-default-85f49fc854-txwk9"
Apr 16 19:31:19.798235 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:19.798117 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/43c093f8-2338-4e6e-8349-b4e1575f5161-default-certificate\") pod \"router-default-85f49fc854-txwk9\" (UID: \"43c093f8-2338-4e6e-8349-b4e1575f5161\") " pod="openshift-ingress/router-default-85f49fc854-txwk9"
Apr 16 19:31:19.798235 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:19.798145 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/43c093f8-2338-4e6e-8349-b4e1575f5161-stats-auth\") pod \"router-default-85f49fc854-txwk9\" (UID: \"43c093f8-2338-4e6e-8349-b4e1575f5161\") " pod="openshift-ingress/router-default-85f49fc854-txwk9"
Apr 16 19:31:19.798659 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:19.798640 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43c093f8-2338-4e6e-8349-b4e1575f5161-service-ca-bundle\") pod \"router-default-85f49fc854-txwk9\" (UID: \"43c093f8-2338-4e6e-8349-b4e1575f5161\") " pod="openshift-ingress/router-default-85f49fc854-txwk9"
Apr 16 19:31:19.801880 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:19.801849 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/43c093f8-2338-4e6e-8349-b4e1575f5161-metrics-certs\") pod \"router-default-85f49fc854-txwk9\" (UID: \"43c093f8-2338-4e6e-8349-b4e1575f5161\") " pod="openshift-ingress/router-default-85f49fc854-txwk9"
Apr 16 19:31:19.801965 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:19.801944 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/43c093f8-2338-4e6e-8349-b4e1575f5161-stats-auth\") pod \"router-default-85f49fc854-txwk9\" (UID: \"43c093f8-2338-4e6e-8349-b4e1575f5161\") " pod="openshift-ingress/router-default-85f49fc854-txwk9"
Apr 16 19:31:19.801965 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:19.801960 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/43c093f8-2338-4e6e-8349-b4e1575f5161-default-certificate\") pod \"router-default-85f49fc854-txwk9\" (UID: \"43c093f8-2338-4e6e-8349-b4e1575f5161\") " pod="openshift-ingress/router-default-85f49fc854-txwk9"
Apr 16 19:31:19.811436 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:19.811416 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzg7z\" (UniqueName: \"kubernetes.io/projected/43c093f8-2338-4e6e-8349-b4e1575f5161-kube-api-access-lzg7z\") pod \"router-default-85f49fc854-txwk9\" (UID: \"43c093f8-2338-4e6e-8349-b4e1575f5161\") " pod="openshift-ingress/router-default-85f49fc854-txwk9"
Apr 16 19:31:19.930834 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:19.930769 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-85f49fc854-txwk9"
Apr 16 19:31:20.051980 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:20.051953 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-85f49fc854-txwk9"]
Apr 16 19:31:20.054529 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:31:20.054495 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43c093f8_2338_4e6e_8349_b4e1575f5161.slice/crio-8ea1819f6e01b11aeaa4721ef5975117f5c94e454646b90fdf02bf3b593af9f3 WatchSource:0}: Error finding container 8ea1819f6e01b11aeaa4721ef5975117f5c94e454646b90fdf02bf3b593af9f3: Status 404 returned error can't find the container with id 8ea1819f6e01b11aeaa4721ef5975117f5c94e454646b90fdf02bf3b593af9f3
Apr 16 19:31:20.234626 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:20.234595 2579 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-ckt7c"
Apr 16 19:31:20.234770 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:20.234634 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-ckt7c"
Apr 16 19:31:20.234970 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:20.234953 2579 scope.go:117] "RemoveContainer" containerID="0a9293e8d101828bef0b0651a8b8139db4b3b36e2df1927c48eb5e0b327ae514"
Apr 16 19:31:20.235168 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:31:20.235151 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-ckt7c_openshift-console-operator(d5d951a1-3e60-4517-ae8b-75bba19200c9)\"" pod="openshift-console-operator/console-operator-9d4b6777b-ckt7c" podUID="d5d951a1-3e60-4517-ae8b-75bba19200c9"
Apr 16 19:31:20.277026 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:20.276993 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-85f49fc854-txwk9" event={"ID":"43c093f8-2338-4e6e-8349-b4e1575f5161","Type":"ContainerStarted","Data":"2567c6a7833c9f611200db7d6e2bbc79f3148e3c27a48435f48276d3dc6d2a4a"}
Apr 16 19:31:20.277026 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:20.277029 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-85f49fc854-txwk9" event={"ID":"43c093f8-2338-4e6e-8349-b4e1575f5161","Type":"ContainerStarted","Data":"8ea1819f6e01b11aeaa4721ef5975117f5c94e454646b90fdf02bf3b593af9f3"}
Apr 16 19:31:20.297439 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:20.297393 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-85f49fc854-txwk9" podStartSLOduration=1.297378536 podStartE2EDuration="1.297378536s" podCreationTimestamp="2026-04-16 19:31:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:31:20.296871793 +0000 UTC m=+63.873650006" watchObservedRunningTime="2026-04-16 19:31:20.297378536 +0000 UTC m=+63.874156748"
Apr 16 19:31:20.931953 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:20.931919 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-85f49fc854-txwk9"
Apr 16 19:31:20.934498 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:20.934471 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-85f49fc854-txwk9"
Apr 16 19:31:21.283123 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:21.283082 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-85f49fc854-txwk9"
Apr 16 19:31:21.284928 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:21.284908 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-85f49fc854-txwk9"
Apr 16 19:31:21.713438 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:21.713413 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/afd027c2-990e-4d6c-b57c-62c9c66ce5f2-original-pull-secret\") pod \"global-pull-secret-syncer-ls7dc\" (UID: \"afd027c2-990e-4d6c-b57c-62c9c66ce5f2\") " pod="kube-system/global-pull-secret-syncer-ls7dc"
Apr 16 19:31:21.713595 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:21.713457 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ca23e8db-bb88-449f-8286-27f2978eb0ca-metrics-certs\") pod \"network-metrics-daemon-7mh9f\" (UID: \"ca23e8db-bb88-449f-8286-27f2978eb0ca\") " pod="openshift-multus/network-metrics-daemon-7mh9f"
Apr 16 19:31:21.715807 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:21.715788 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 16 19:31:21.716863 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:21.716846 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 19:31:21.726026 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:21.726003 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/afd027c2-990e-4d6c-b57c-62c9c66ce5f2-original-pull-secret\") pod \"global-pull-secret-syncer-ls7dc\" (UID: \"afd027c2-990e-4d6c-b57c-62c9c66ce5f2\") " pod="kube-system/global-pull-secret-syncer-ls7dc"
Apr 16 19:31:21.726108 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:21.726048 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ca23e8db-bb88-449f-8286-27f2978eb0ca-metrics-certs\") pod \"network-metrics-daemon-7mh9f\" (UID: \"ca23e8db-bb88-449f-8286-27f2978eb0ca\") " pod="openshift-multus/network-metrics-daemon-7mh9f"
Apr 16 19:31:21.766989 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:21.766970 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-p422q\""
Apr 16 19:31:21.774993 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:21.774976 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7mh9f"
Apr 16 19:31:21.787897 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:21.787874 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-ls7dc"
Apr 16 19:31:21.814586 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:21.813933 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x96nz\" (UniqueName: \"kubernetes.io/projected/17f1ed1c-fff7-4d09-b029-890217b6c115-kube-api-access-x96nz\") pod \"network-check-target-6glh4\" (UID: \"17f1ed1c-fff7-4d09-b029-890217b6c115\") " pod="openshift-network-diagnostics/network-check-target-6glh4"
Apr 16 19:31:21.814586 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:21.813998 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c51aa13f-e270-4caa-b4b8-0133dfaf5f84-registry-tls\") pod \"image-registry-7dcf7c4679-t62l6\" (UID: \"c51aa13f-e270-4caa-b4b8-0133dfaf5f84\") " pod="openshift-image-registry/image-registry-7dcf7c4679-t62l6"
Apr 16 19:31:21.814586 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:21.814035 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/599e545f-2894-45d3-ad19-ef2025af0502-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-6p4l7\" (UID: \"599e545f-2894-45d3-ad19-ef2025af0502\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-6p4l7"
Apr 16 19:31:21.814586 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:21.814073 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/91f7dc43-a855-4712-8639-caad7b7a8458-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-96nm2\" (UID: \"91f7dc43-a855-4712-8639-caad7b7a8458\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-96nm2"
Apr 16 19:31:21.814586 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:21.814121 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e038a6a1-56fa-476b-8faf-dc54fd9afdfa-metrics-tls\") pod \"dns-default-x5hqp\" (UID: \"e038a6a1-56fa-476b-8faf-dc54fd9afdfa\") " pod="openshift-dns/dns-default-x5hqp"
Apr 16 19:31:21.817266 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:21.817194 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e038a6a1-56fa-476b-8faf-dc54fd9afdfa-metrics-tls\") pod \"dns-default-x5hqp\" (UID: \"e038a6a1-56fa-476b-8faf-dc54fd9afdfa\") " pod="openshift-dns/dns-default-x5hqp"
Apr 16 19:31:21.817705 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:21.817683 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/91f7dc43-a855-4712-8639-caad7b7a8458-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-96nm2\" (UID: \"91f7dc43-a855-4712-8639-caad7b7a8458\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-96nm2"
Apr 16 19:31:21.817871 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:21.817850 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c51aa13f-e270-4caa-b4b8-0133dfaf5f84-registry-tls\") pod \"image-registry-7dcf7c4679-t62l6\" (UID: \"c51aa13f-e270-4caa-b4b8-0133dfaf5f84\") " pod="openshift-image-registry/image-registry-7dcf7c4679-t62l6"
Apr 16 19:31:21.817940 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:21.817924 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/599e545f-2894-45d3-ad19-ef2025af0502-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-6p4l7\" (UID: \"599e545f-2894-45d3-ad19-ef2025af0502\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-6p4l7"
Apr 16 19:31:21.818182 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:21.818160 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x96nz\" (UniqueName: \"kubernetes.io/projected/17f1ed1c-fff7-4d09-b029-890217b6c115-kube-api-access-x96nz\") pod \"network-check-target-6glh4\" (UID: \"17f1ed1c-fff7-4d09-b029-890217b6c115\") " pod="openshift-network-diagnostics/network-check-target-6glh4"
Apr 16 19:31:21.897982 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:21.897908 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-7mh9f"]
Apr 16 19:31:21.901021 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:31:21.900948 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca23e8db_bb88_449f_8286_27f2978eb0ca.slice/crio-e467c8a19c53c32b4ac93640cae2b4da07e0465f3101831872626974410805b6 WatchSource:0}: Error finding container e467c8a19c53c32b4ac93640cae2b4da07e0465f3101831872626974410805b6: Status 404 returned error can't find the container with id e467c8a19c53c32b4ac93640cae2b4da07e0465f3101831872626974410805b6
Apr 16 19:31:21.914807 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:21.914768 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4063c915-74be-464e-845c-caccf6e297c5-cert\") pod \"ingress-canary-4dv2w\" (UID: \"4063c915-74be-464e-845c-caccf6e297c5\") " pod="openshift-ingress-canary/ingress-canary-4dv2w"
Apr 16 19:31:21.915801 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:21.915777 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-ls7dc"]
Apr 16 19:31:21.917228 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:21.917185 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4063c915-74be-464e-845c-caccf6e297c5-cert\") pod \"ingress-canary-4dv2w\" (UID: \"4063c915-74be-464e-845c-caccf6e297c5\") " pod="openshift-ingress-canary/ingress-canary-4dv2w"
Apr 16 19:31:21.918646 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:31:21.918621 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podafd027c2_990e_4d6c_b57c_62c9c66ce5f2.slice/crio-ef9defab740b20369a7a476c1f3d982a4dab410aeb8aab7931ae0e8ce0ae7481 WatchSource:0}: Error finding container ef9defab740b20369a7a476c1f3d982a4dab410aeb8aab7931ae0e8ce0ae7481: Status 404 returned error can't find the container with id ef9defab740b20369a7a476c1f3d982a4dab410aeb8aab7931ae0e8ce0ae7481
Apr 16 19:31:22.000072 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:22.000012 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-hkfmh\""
Apr 16 19:31:22.008319 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:22.008303 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-6p4l7"
Apr 16 19:31:22.019463 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:22.019443 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-25xm9\""
Apr 16 19:31:22.027529 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:22.027508 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7dcf7c4679-t62l6"
Apr 16 19:31:22.050875 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:22.050848 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-fkm8b\""
Apr 16 19:31:22.058711 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:22.058277 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-x5hqp"
Apr 16 19:31:22.077363 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:22.077330 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-52hfs\""
Apr 16 19:31:22.083654 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:22.083490 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-b4j4x\""
Apr 16 19:31:22.085751 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:22.085717 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6glh4"
Apr 16 19:31:22.091941 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:22.091738 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-96nm2"
Apr 16 19:31:22.102291 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:22.101915 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-fqt5t\""
Apr 16 19:31:22.110547 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:22.110507 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-4dv2w"
Apr 16 19:31:22.145096 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:22.144905 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-6p4l7"]
Apr 16 19:31:22.225083 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:22.225034 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7dcf7c4679-t62l6"]
Apr 16 19:31:22.229435 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:31:22.229402 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc51aa13f_e270_4caa_b4b8_0133dfaf5f84.slice/crio-f82e667551bcf2e835db6600d0d6eb952d28fc6a8c776a1fd6ebc14df304d3a9 WatchSource:0}: Error finding container f82e667551bcf2e835db6600d0d6eb952d28fc6a8c776a1fd6ebc14df304d3a9: Status 404 returned error can't find the container with id f82e667551bcf2e835db6600d0d6eb952d28fc6a8c776a1fd6ebc14df304d3a9
Apr 16 19:31:22.248942 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:22.245897 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-x5hqp"]
Apr 16 19:31:22.250773 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:31:22.250716 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode038a6a1_56fa_476b_8faf_dc54fd9afdfa.slice/crio-9d088a9faa8048e2f4acac2666b366f25604817c407a07d4aa67f26253ceac27 WatchSource:0}: Error finding container 9d088a9faa8048e2f4acac2666b366f25604817c407a07d4aa67f26253ceac27: Status 404 returned error can't find the container with id 9d088a9faa8048e2f4acac2666b366f25604817c407a07d4aa67f26253ceac27
Apr 16 19:31:22.276914 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:22.276468 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-6glh4"]
Apr 16 19:31:22.279375 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:31:22.279352 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17f1ed1c_fff7_4d09_b029_890217b6c115.slice/crio-cb7a63bb72885c9796d9ffbb5d3b946a6b57b7da3cc82cb007f3e0f84600d069 WatchSource:0}: Error finding container cb7a63bb72885c9796d9ffbb5d3b946a6b57b7da3cc82cb007f3e0f84600d069: Status 404 returned error can't find the container with id cb7a63bb72885c9796d9ffbb5d3b946a6b57b7da3cc82cb007f3e0f84600d069
Apr 16 19:31:22.287019 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:22.286983 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-6glh4" event={"ID":"17f1ed1c-fff7-4d09-b029-890217b6c115","Type":"ContainerStarted","Data":"cb7a63bb72885c9796d9ffbb5d3b946a6b57b7da3cc82cb007f3e0f84600d069"}
Apr 16 19:31:22.287982 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:22.287957 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-x5hqp" event={"ID":"e038a6a1-56fa-476b-8faf-dc54fd9afdfa","Type":"ContainerStarted","Data":"9d088a9faa8048e2f4acac2666b366f25604817c407a07d4aa67f26253ceac27"}
Apr 16 19:31:22.289025 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:22.289001 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7mh9f" event={"ID":"ca23e8db-bb88-449f-8286-27f2978eb0ca","Type":"ContainerStarted","Data":"e467c8a19c53c32b4ac93640cae2b4da07e0465f3101831872626974410805b6"}
Apr 16 19:31:22.290120 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:22.290079 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7dcf7c4679-t62l6" event={"ID":"c51aa13f-e270-4caa-b4b8-0133dfaf5f84","Type":"ContainerStarted","Data":"f82e667551bcf2e835db6600d0d6eb952d28fc6a8c776a1fd6ebc14df304d3a9"}
Apr 16 19:31:22.291090 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:22.291071 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-96nm2"]
Apr 16 19:31:22.291260 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:22.291240 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-6p4l7" event={"ID":"599e545f-2894-45d3-ad19-ef2025af0502","Type":"ContainerStarted","Data":"dcabe931499c3290afa46b1363ba9e548494c62b917bc9fab76312056f7e4aab"}
Apr 16 19:31:22.292321 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:22.292293 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-ls7dc" event={"ID":"afd027c2-990e-4d6c-b57c-62c9c66ce5f2","Type":"ContainerStarted","Data":"ef9defab740b20369a7a476c1f3d982a4dab410aeb8aab7931ae0e8ce0ae7481"}
Apr 16 19:31:22.294945 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:31:22.294926 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91f7dc43_a855_4712_8639_caad7b7a8458.slice/crio-39f4cdcbf7cfa2a857b6a683600f9bf74e207018c20830109c4e25ed714f7981 WatchSource:0}: Error finding container 39f4cdcbf7cfa2a857b6a683600f9bf74e207018c20830109c4e25ed714f7981: Status 404 returned error can't find the container with id 39f4cdcbf7cfa2a857b6a683600f9bf74e207018c20830109c4e25ed714f7981
Apr 16 19:31:22.312256 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:22.312231 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-4dv2w"]
Apr 16 19:31:22.315174 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:31:22.315147 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4063c915_74be_464e_845c_caccf6e297c5.slice/crio-3802d3340051feebc77fd874168e4313a9893dbebd46a3503f90b946f53491f1 WatchSource:0}: Error finding container 3802d3340051feebc77fd874168e4313a9893dbebd46a3503f90b946f53491f1: Status 404 returned error can't find the container with id 3802d3340051feebc77fd874168e4313a9893dbebd46a3503f90b946f53491f1
Apr 16 19:31:23.299637 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:23.298819 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-6glh4" event={"ID":"17f1ed1c-fff7-4d09-b029-890217b6c115","Type":"ContainerStarted","Data":"1cd2297e6abd95ad87cc09b7ac74baa6f82c0a35859aa05017e9b99e2075f66f"}
Apr 16 19:31:23.299637 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:23.299575 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-6glh4"
Apr 16 19:31:23.303115 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:23.303033 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-4dv2w" event={"ID":"4063c915-74be-464e-845c-caccf6e297c5","Type":"ContainerStarted","Data":"3802d3340051feebc77fd874168e4313a9893dbebd46a3503f90b946f53491f1"}
Apr 16 19:31:23.313624 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:23.313554 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-96nm2" event={"ID":"91f7dc43-a855-4712-8639-caad7b7a8458","Type":"ContainerStarted","Data":"39f4cdcbf7cfa2a857b6a683600f9bf74e207018c20830109c4e25ed714f7981"}
Apr 16 19:31:23.320095 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:23.319058 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-6glh4" podStartSLOduration=66.319042493 podStartE2EDuration="1m6.319042493s" podCreationTimestamp="2026-04-16 19:30:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:31:23.318391216 +0000 UTC m=+66.895169428" watchObservedRunningTime="2026-04-16 19:31:23.319042493 +0000 UTC m=+66.895820706"
Apr 16 19:31:23.321645 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:23.321604 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7dcf7c4679-t62l6" event={"ID":"c51aa13f-e270-4caa-b4b8-0133dfaf5f84","Type":"ContainerStarted","Data":"636df52d620bf07b264bbd987d66908431828430696c41c72b49cf88518ff907"}
Apr 16 19:31:23.347278 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:23.346275 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-7dcf7c4679-t62l6" podStartSLOduration=66.346259518 podStartE2EDuration="1m6.346259518s" podCreationTimestamp="2026-04-16 19:30:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:31:23.345461817 +0000 UTC m=+66.922240032" watchObservedRunningTime="2026-04-16 19:31:23.346259518 +0000 UTC m=+66.923037729"
Apr 16 19:31:24.328154 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:24.328120 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-7dcf7c4679-t62l6"
Apr 16 19:31:26.918085 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:26.918053 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-stc5r_b153866d-121b-4ac1-a27e-c2aea8f9de02/dns-node-resolver/0.log"
Apr 16 19:31:27.106768 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:27.106695 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-7dcf7c4679-t62l6_c51aa13f-e270-4caa-b4b8-0133dfaf5f84/registry/0.log"
Apr 16 19:31:27.709836 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:27.709806 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-l5xrl_9859a426-6968-44ca-b63e-42baba2b957d/node-ca/0.log"
Apr 16 19:31:27.976708 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:27.976639 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-xzqwj"] Apr 16 19:31:28.019760 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:28.019726 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-xzqwj"] Apr 16 19:31:28.019943 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:28.019875 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-xzqwj" Apr 16 19:31:28.022635 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:28.022604 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 19:31:28.022635 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:28.022609 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 19:31:28.022779 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:28.022615 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 19:31:28.023128 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:28.023103 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 19:31:28.023307 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:28.023201 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-jq9hf\"" Apr 16 19:31:28.064524 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:28.064496 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/e7dc6e11-9200-454f-9b2e-e9d2081a2b29-data-volume\") pod 
\"insights-runtime-extractor-xzqwj\" (UID: \"e7dc6e11-9200-454f-9b2e-e9d2081a2b29\") " pod="openshift-insights/insights-runtime-extractor-xzqwj" Apr 16 19:31:28.064655 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:28.064554 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/e7dc6e11-9200-454f-9b2e-e9d2081a2b29-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-xzqwj\" (UID: \"e7dc6e11-9200-454f-9b2e-e9d2081a2b29\") " pod="openshift-insights/insights-runtime-extractor-xzqwj" Apr 16 19:31:28.064655 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:28.064633 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/e7dc6e11-9200-454f-9b2e-e9d2081a2b29-crio-socket\") pod \"insights-runtime-extractor-xzqwj\" (UID: \"e7dc6e11-9200-454f-9b2e-e9d2081a2b29\") " pod="openshift-insights/insights-runtime-extractor-xzqwj" Apr 16 19:31:28.064766 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:28.064744 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xgcz\" (UniqueName: \"kubernetes.io/projected/e7dc6e11-9200-454f-9b2e-e9d2081a2b29-kube-api-access-2xgcz\") pod \"insights-runtime-extractor-xzqwj\" (UID: \"e7dc6e11-9200-454f-9b2e-e9d2081a2b29\") " pod="openshift-insights/insights-runtime-extractor-xzqwj" Apr 16 19:31:28.064812 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:28.064791 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e7dc6e11-9200-454f-9b2e-e9d2081a2b29-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-xzqwj\" (UID: \"e7dc6e11-9200-454f-9b2e-e9d2081a2b29\") " pod="openshift-insights/insights-runtime-extractor-xzqwj" Apr 16 19:31:28.106514 
ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:28.106483 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-85f49fc854-txwk9_43c093f8-2338-4e6e-8349-b4e1575f5161/router/0.log" Apr 16 19:31:28.165579 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:28.165555 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e7dc6e11-9200-454f-9b2e-e9d2081a2b29-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-xzqwj\" (UID: \"e7dc6e11-9200-454f-9b2e-e9d2081a2b29\") " pod="openshift-insights/insights-runtime-extractor-xzqwj" Apr 16 19:31:28.165714 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:28.165598 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/e7dc6e11-9200-454f-9b2e-e9d2081a2b29-data-volume\") pod \"insights-runtime-extractor-xzqwj\" (UID: \"e7dc6e11-9200-454f-9b2e-e9d2081a2b29\") " pod="openshift-insights/insights-runtime-extractor-xzqwj" Apr 16 19:31:28.165714 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:28.165654 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/e7dc6e11-9200-454f-9b2e-e9d2081a2b29-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-xzqwj\" (UID: \"e7dc6e11-9200-454f-9b2e-e9d2081a2b29\") " pod="openshift-insights/insights-runtime-extractor-xzqwj" Apr 16 19:31:28.165714 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:28.165688 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/e7dc6e11-9200-454f-9b2e-e9d2081a2b29-crio-socket\") pod \"insights-runtime-extractor-xzqwj\" (UID: \"e7dc6e11-9200-454f-9b2e-e9d2081a2b29\") " pod="openshift-insights/insights-runtime-extractor-xzqwj" Apr 16 19:31:28.165870 ip-10-0-129-155 
kubenswrapper[2579]: E0416 19:31:28.165725 2579 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 16 19:31:28.165870 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:28.165741 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2xgcz\" (UniqueName: \"kubernetes.io/projected/e7dc6e11-9200-454f-9b2e-e9d2081a2b29-kube-api-access-2xgcz\") pod \"insights-runtime-extractor-xzqwj\" (UID: \"e7dc6e11-9200-454f-9b2e-e9d2081a2b29\") " pod="openshift-insights/insights-runtime-extractor-xzqwj" Apr 16 19:31:28.165870 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:31:28.165800 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7dc6e11-9200-454f-9b2e-e9d2081a2b29-insights-runtime-extractor-tls podName:e7dc6e11-9200-454f-9b2e-e9d2081a2b29 nodeName:}" failed. No retries permitted until 2026-04-16 19:31:28.665779325 +0000 UTC m=+72.242557518 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/e7dc6e11-9200-454f-9b2e-e9d2081a2b29-insights-runtime-extractor-tls") pod "insights-runtime-extractor-xzqwj" (UID: "e7dc6e11-9200-454f-9b2e-e9d2081a2b29") : secret "insights-runtime-extractor-tls" not found Apr 16 19:31:28.166029 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:28.165973 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/e7dc6e11-9200-454f-9b2e-e9d2081a2b29-data-volume\") pod \"insights-runtime-extractor-xzqwj\" (UID: \"e7dc6e11-9200-454f-9b2e-e9d2081a2b29\") " pod="openshift-insights/insights-runtime-extractor-xzqwj" Apr 16 19:31:28.166029 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:28.165974 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/e7dc6e11-9200-454f-9b2e-e9d2081a2b29-crio-socket\") pod \"insights-runtime-extractor-xzqwj\" (UID: \"e7dc6e11-9200-454f-9b2e-e9d2081a2b29\") " pod="openshift-insights/insights-runtime-extractor-xzqwj" Apr 16 19:31:28.166250 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:28.166231 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/e7dc6e11-9200-454f-9b2e-e9d2081a2b29-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-xzqwj\" (UID: \"e7dc6e11-9200-454f-9b2e-e9d2081a2b29\") " pod="openshift-insights/insights-runtime-extractor-xzqwj" Apr 16 19:31:28.174856 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:28.174838 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xgcz\" (UniqueName: \"kubernetes.io/projected/e7dc6e11-9200-454f-9b2e-e9d2081a2b29-kube-api-access-2xgcz\") pod \"insights-runtime-extractor-xzqwj\" (UID: \"e7dc6e11-9200-454f-9b2e-e9d2081a2b29\") " pod="openshift-insights/insights-runtime-extractor-xzqwj" Apr 
16 19:31:28.670293 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:28.669886 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e7dc6e11-9200-454f-9b2e-e9d2081a2b29-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-xzqwj\" (UID: \"e7dc6e11-9200-454f-9b2e-e9d2081a2b29\") " pod="openshift-insights/insights-runtime-extractor-xzqwj" Apr 16 19:31:28.673938 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:28.673883 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e7dc6e11-9200-454f-9b2e-e9d2081a2b29-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-xzqwj\" (UID: \"e7dc6e11-9200-454f-9b2e-e9d2081a2b29\") " pod="openshift-insights/insights-runtime-extractor-xzqwj" Apr 16 19:31:28.931608 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:28.931525 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-xzqwj" Apr 16 19:31:29.061674 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:29.061651 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-xzqwj"] Apr 16 19:31:29.063906 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:31:29.063879 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7dc6e11_9200_454f_9b2e_e9d2081a2b29.slice/crio-82c5026488fc8db84096a511563595bd37d8761d4f978c19143160101d59d049 WatchSource:0}: Error finding container 82c5026488fc8db84096a511563595bd37d8761d4f978c19143160101d59d049: Status 404 returned error can't find the container with id 82c5026488fc8db84096a511563595bd37d8761d4f978c19143160101d59d049 Apr 16 19:31:29.342185 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:29.342074 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-x5hqp" event={"ID":"e038a6a1-56fa-476b-8faf-dc54fd9afdfa","Type":"ContainerStarted","Data":"dc1ec1c277f741b172c95e9c86d34fc58e53e824efca7b5e118c7aa50dd7a8f5"} Apr 16 19:31:29.342185 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:29.342111 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-x5hqp" event={"ID":"e038a6a1-56fa-476b-8faf-dc54fd9afdfa","Type":"ContainerStarted","Data":"2c6c81277a67d8dbe063bcc8aaf3d7bf79c342d41d514413903260400890f97e"} Apr 16 19:31:29.342185 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:29.342179 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-x5hqp" Apr 16 19:31:29.343554 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:29.343524 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-96nm2" 
event={"ID":"91f7dc43-a855-4712-8639-caad7b7a8458","Type":"ContainerStarted","Data":"0fa5e8c320323a822170de9765dcb98623b3431a2a46389ffd8baf14514de72b"} Apr 16 19:31:29.344749 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:29.344723 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-xzqwj" event={"ID":"e7dc6e11-9200-454f-9b2e-e9d2081a2b29","Type":"ContainerStarted","Data":"12ff5106b5d20b7ff9d7a7a1e0f2370964c43aa7afc1612f62a82f3a982d30a6"} Apr 16 19:31:29.344859 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:29.344753 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-xzqwj" event={"ID":"e7dc6e11-9200-454f-9b2e-e9d2081a2b29","Type":"ContainerStarted","Data":"82c5026488fc8db84096a511563595bd37d8761d4f978c19143160101d59d049"} Apr 16 19:31:29.346294 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:29.346264 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7mh9f" event={"ID":"ca23e8db-bb88-449f-8286-27f2978eb0ca","Type":"ContainerStarted","Data":"b6593c1d0fc96a0c7efe9b982e98125736778f4ca6cf331994cdf3afe96757ea"} Apr 16 19:31:29.346374 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:29.346295 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7mh9f" event={"ID":"ca23e8db-bb88-449f-8286-27f2978eb0ca","Type":"ContainerStarted","Data":"f1c9a521ad31bdc9cba037c192a6caa247cbca470ff08ab392620cd014dd56f9"} Apr 16 19:31:29.347761 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:29.347728 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-6p4l7" event={"ID":"599e545f-2894-45d3-ad19-ef2025af0502","Type":"ContainerStarted","Data":"ee7d1d79588008268e67fd186153aabf4bc432ba23d9d25daddff9ab8bc63548"} Apr 16 19:31:29.347761 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:29.347757 2579 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-6p4l7" event={"ID":"599e545f-2894-45d3-ad19-ef2025af0502","Type":"ContainerStarted","Data":"725365b07111ba2154cb82df21d1130b26245705cabf5dbe5f169cbf89c34b85"} Apr 16 19:31:29.348962 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:29.348931 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-ls7dc" event={"ID":"afd027c2-990e-4d6c-b57c-62c9c66ce5f2","Type":"ContainerStarted","Data":"af5636e1fb7a5093cde4883bc9d3f7aa7bff1d4c745d903551ebd1b8eec9b083"} Apr 16 19:31:29.350134 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:29.350113 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-4dv2w" event={"ID":"4063c915-74be-464e-845c-caccf6e297c5","Type":"ContainerStarted","Data":"29d8ce0bde0b9e31e5f8543040847a009286c1771aa99f077f10b6fe3c255643"} Apr 16 19:31:29.360551 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:29.360511 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-x5hqp" podStartSLOduration=34.178001164 podStartE2EDuration="40.360499568s" podCreationTimestamp="2026-04-16 19:30:49 +0000 UTC" firstStartedPulling="2026-04-16 19:31:22.252844836 +0000 UTC m=+65.829623025" lastFinishedPulling="2026-04-16 19:31:28.435343236 +0000 UTC m=+72.012121429" observedRunningTime="2026-04-16 19:31:29.359004976 +0000 UTC m=+72.935783188" watchObservedRunningTime="2026-04-16 19:31:29.360499568 +0000 UTC m=+72.937277780" Apr 16 19:31:29.379090 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:29.379050 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-96nm2" podStartSLOduration=46.240685033 podStartE2EDuration="52.379039658s" podCreationTimestamp="2026-04-16 19:30:37 +0000 UTC" firstStartedPulling="2026-04-16 19:31:22.296877578 
+0000 UTC m=+65.873655767" lastFinishedPulling="2026-04-16 19:31:28.435232189 +0000 UTC m=+72.012010392" observedRunningTime="2026-04-16 19:31:29.377749496 +0000 UTC m=+72.954527720" watchObservedRunningTime="2026-04-16 19:31:29.379039658 +0000 UTC m=+72.955817869" Apr 16 19:31:29.395271 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:29.395231 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-7mh9f" podStartSLOduration=65.889677312 podStartE2EDuration="1m12.395219961s" podCreationTimestamp="2026-04-16 19:30:17 +0000 UTC" firstStartedPulling="2026-04-16 19:31:21.903161666 +0000 UTC m=+65.479939854" lastFinishedPulling="2026-04-16 19:31:28.408704314 +0000 UTC m=+71.985482503" observedRunningTime="2026-04-16 19:31:29.394174191 +0000 UTC m=+72.970952431" watchObservedRunningTime="2026-04-16 19:31:29.395219961 +0000 UTC m=+72.971998163" Apr 16 19:31:29.411724 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:29.411685 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-4dv2w" podStartSLOduration=34.289870764 podStartE2EDuration="40.411672002s" podCreationTimestamp="2026-04-16 19:30:49 +0000 UTC" firstStartedPulling="2026-04-16 19:31:22.318123054 +0000 UTC m=+65.894901243" lastFinishedPulling="2026-04-16 19:31:28.439924275 +0000 UTC m=+72.016702481" observedRunningTime="2026-04-16 19:31:29.410558453 +0000 UTC m=+72.987336684" watchObservedRunningTime="2026-04-16 19:31:29.411672002 +0000 UTC m=+72.988450214" Apr 16 19:31:29.434200 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:29.434159 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-ls7dc" podStartSLOduration=65.731903012 podStartE2EDuration="1m12.434145597s" podCreationTimestamp="2026-04-16 19:30:17 +0000 UTC" firstStartedPulling="2026-04-16 19:31:21.920199905 +0000 UTC m=+65.496978108" lastFinishedPulling="2026-04-16 
19:31:28.622442489 +0000 UTC m=+72.199220693" observedRunningTime="2026-04-16 19:31:29.433847771 +0000 UTC m=+73.010625982" watchObservedRunningTime="2026-04-16 19:31:29.434145597 +0000 UTC m=+73.010923809" Apr 16 19:31:29.451587 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:29.451550 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-6p4l7" podStartSLOduration=61.256425747 podStartE2EDuration="1m7.451539946s" podCreationTimestamp="2026-04-16 19:30:22 +0000 UTC" firstStartedPulling="2026-04-16 19:31:22.240156312 +0000 UTC m=+65.816934501" lastFinishedPulling="2026-04-16 19:31:28.435270498 +0000 UTC m=+72.012048700" observedRunningTime="2026-04-16 19:31:29.45056104 +0000 UTC m=+73.027339277" watchObservedRunningTime="2026-04-16 19:31:29.451539946 +0000 UTC m=+73.028318156" Apr 16 19:31:30.354454 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:30.354359 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-xzqwj" event={"ID":"e7dc6e11-9200-454f-9b2e-e9d2081a2b29","Type":"ContainerStarted","Data":"93a30c89f91305932085e12dd2ac8e12c12e3824ed17ce0697eca6c73d2692aa"} Apr 16 19:31:30.600112 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:30.600081 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-zmnp2"] Apr 16 19:31:30.603296 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:30.603280 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-zmnp2" Apr 16 19:31:30.606286 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:30.606231 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 19:31:30.606376 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:30.606306 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 19:31:30.606376 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:30.606326 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 16 19:31:30.607326 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:30.607311 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 19:31:30.607528 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:30.607515 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-nkvcm\"" Apr 16 19:31:30.607641 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:30.607626 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 16 19:31:30.612097 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:30.612072 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-zmnp2"] Apr 16 19:31:30.687537 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:30.687502 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/1ffbb2c0-6d4f-4842-8198-cebed3110c5d-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-zmnp2\" (UID: 
\"1ffbb2c0-6d4f-4842-8198-cebed3110c5d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-zmnp2" Apr 16 19:31:30.687671 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:30.687582 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1ffbb2c0-6d4f-4842-8198-cebed3110c5d-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-zmnp2\" (UID: \"1ffbb2c0-6d4f-4842-8198-cebed3110c5d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-zmnp2" Apr 16 19:31:30.687671 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:30.687613 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1ffbb2c0-6d4f-4842-8198-cebed3110c5d-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-zmnp2\" (UID: \"1ffbb2c0-6d4f-4842-8198-cebed3110c5d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-zmnp2" Apr 16 19:31:30.687795 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:30.687681 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4l7ms\" (UniqueName: \"kubernetes.io/projected/1ffbb2c0-6d4f-4842-8198-cebed3110c5d-kube-api-access-4l7ms\") pod \"prometheus-operator-5676c8c784-zmnp2\" (UID: \"1ffbb2c0-6d4f-4842-8198-cebed3110c5d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-zmnp2" Apr 16 19:31:30.788567 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:30.788530 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/1ffbb2c0-6d4f-4842-8198-cebed3110c5d-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-zmnp2\" (UID: \"1ffbb2c0-6d4f-4842-8198-cebed3110c5d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-zmnp2" Apr 16 
19:31:30.788683 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:30.788586 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1ffbb2c0-6d4f-4842-8198-cebed3110c5d-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-zmnp2\" (UID: \"1ffbb2c0-6d4f-4842-8198-cebed3110c5d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-zmnp2" Apr 16 19:31:30.788683 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:30.788619 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1ffbb2c0-6d4f-4842-8198-cebed3110c5d-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-zmnp2\" (UID: \"1ffbb2c0-6d4f-4842-8198-cebed3110c5d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-zmnp2" Apr 16 19:31:30.788683 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:30.788641 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4l7ms\" (UniqueName: \"kubernetes.io/projected/1ffbb2c0-6d4f-4842-8198-cebed3110c5d-kube-api-access-4l7ms\") pod \"prometheus-operator-5676c8c784-zmnp2\" (UID: \"1ffbb2c0-6d4f-4842-8198-cebed3110c5d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-zmnp2" Apr 16 19:31:30.789339 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:30.789321 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1ffbb2c0-6d4f-4842-8198-cebed3110c5d-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-zmnp2\" (UID: \"1ffbb2c0-6d4f-4842-8198-cebed3110c5d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-zmnp2" Apr 16 19:31:30.790771 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:30.790753 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1ffbb2c0-6d4f-4842-8198-cebed3110c5d-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-zmnp2\" (UID: \"1ffbb2c0-6d4f-4842-8198-cebed3110c5d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-zmnp2" Apr 16 19:31:30.790947 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:30.790929 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/1ffbb2c0-6d4f-4842-8198-cebed3110c5d-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-zmnp2\" (UID: \"1ffbb2c0-6d4f-4842-8198-cebed3110c5d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-zmnp2" Apr 16 19:31:30.796816 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:30.796798 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4l7ms\" (UniqueName: \"kubernetes.io/projected/1ffbb2c0-6d4f-4842-8198-cebed3110c5d-kube-api-access-4l7ms\") pod \"prometheus-operator-5676c8c784-zmnp2\" (UID: \"1ffbb2c0-6d4f-4842-8198-cebed3110c5d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-zmnp2" Apr 16 19:31:30.913096 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:30.913046 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-zmnp2" Apr 16 19:31:31.035436 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:31.035403 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-zmnp2"] Apr 16 19:31:31.039019 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:31:31.038989 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ffbb2c0_6d4f_4842_8198_cebed3110c5d.slice/crio-92944682ecd68753b5b4cd5b043d6e963ff63c24e8b0327788372cfa59aa37cd WatchSource:0}: Error finding container 92944682ecd68753b5b4cd5b043d6e963ff63c24e8b0327788372cfa59aa37cd: Status 404 returned error can't find the container with id 92944682ecd68753b5b4cd5b043d6e963ff63c24e8b0327788372cfa59aa37cd Apr 16 19:31:31.359301 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:31.359261 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-zmnp2" event={"ID":"1ffbb2c0-6d4f-4842-8198-cebed3110c5d","Type":"ContainerStarted","Data":"92944682ecd68753b5b4cd5b043d6e963ff63c24e8b0327788372cfa59aa37cd"} Apr 16 19:31:32.364510 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:32.364435 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-xzqwj" event={"ID":"e7dc6e11-9200-454f-9b2e-e9d2081a2b29","Type":"ContainerStarted","Data":"0bd495694d62e40409075b26d3272801911db5cd5b7fd0ec721f2f09bb0fefae"} Apr 16 19:31:32.384114 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:32.384009 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-xzqwj" podStartSLOduration=2.541432676 podStartE2EDuration="5.383992227s" podCreationTimestamp="2026-04-16 19:31:27 +0000 UTC" firstStartedPulling="2026-04-16 19:31:29.156564542 +0000 UTC m=+72.733342732" lastFinishedPulling="2026-04-16 
19:31:31.999124082 +0000 UTC m=+75.575902283" observedRunningTime="2026-04-16 19:31:32.383499567 +0000 UTC m=+75.960277783" watchObservedRunningTime="2026-04-16 19:31:32.383992227 +0000 UTC m=+75.960770441" Apr 16 19:31:32.940027 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:32.939993 2579 scope.go:117] "RemoveContainer" containerID="0a9293e8d101828bef0b0651a8b8139db4b3b36e2df1927c48eb5e0b327ae514" Apr 16 19:31:32.940230 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:31:32.940186 2579 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-ckt7c_openshift-console-operator(d5d951a1-3e60-4517-ae8b-75bba19200c9)\"" pod="openshift-console-operator/console-operator-9d4b6777b-ckt7c" podUID="d5d951a1-3e60-4517-ae8b-75bba19200c9" Apr 16 19:31:33.368688 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:33.368602 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-zmnp2" event={"ID":"1ffbb2c0-6d4f-4842-8198-cebed3110c5d","Type":"ContainerStarted","Data":"2f7c493757e49237876973b16d5e433b4ffe8b21540b19d07b798c62a6bd17d8"} Apr 16 19:31:33.368688 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:33.368646 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-zmnp2" event={"ID":"1ffbb2c0-6d4f-4842-8198-cebed3110c5d","Type":"ContainerStarted","Data":"376f60efcaa41e61fd942e79b1027953ba536ef5ceafcca3cc03ab048fe49367"} Apr 16 19:31:33.388897 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:33.388850 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-zmnp2" podStartSLOduration=1.354990925 podStartE2EDuration="3.388836287s" podCreationTimestamp="2026-04-16 19:31:30 +0000 UTC" firstStartedPulling="2026-04-16 19:31:31.041252131 
+0000 UTC m=+74.618030321" lastFinishedPulling="2026-04-16 19:31:33.075097491 +0000 UTC m=+76.651875683" observedRunningTime="2026-04-16 19:31:33.387250922 +0000 UTC m=+76.964029130" watchObservedRunningTime="2026-04-16 19:31:33.388836287 +0000 UTC m=+76.965614498" Apr 16 19:31:34.950749 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:34.950720 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-6chqw"] Apr 16 19:31:34.956775 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:34.956748 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-6chqw" Apr 16 19:31:34.959284 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:34.959256 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 16 19:31:34.959434 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:34.959324 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 16 19:31:34.959616 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:34.959602 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-p97xg\"" Apr 16 19:31:34.962143 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:34.962120 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-6chqw"] Apr 16 19:31:34.972701 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:34.972657 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-gjg22"] Apr 16 19:31:34.976509 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:34.976477 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-gjg22" Apr 16 19:31:34.979388 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:34.979355 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 19:31:34.979519 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:34.979398 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 19:31:34.979519 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:34.979462 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 19:31:34.979519 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:34.979477 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-qtsms\"" Apr 16 19:31:35.025015 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:35.024976 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0b6ba5cf-ff81-4de6-bf41-1d82a9913e97-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-6chqw\" (UID: \"0b6ba5cf-ff81-4de6-bf41-1d82a9913e97\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-6chqw" Apr 16 19:31:35.025166 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:35.025030 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fcr4\" (UniqueName: \"kubernetes.io/projected/0b6ba5cf-ff81-4de6-bf41-1d82a9913e97-kube-api-access-2fcr4\") pod \"openshift-state-metrics-9d44df66c-6chqw\" (UID: \"0b6ba5cf-ff81-4de6-bf41-1d82a9913e97\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-6chqw" Apr 16 19:31:35.025166 ip-10-0-129-155 
kubenswrapper[2579]: I0416 19:31:35.025078 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/0b6ba5cf-ff81-4de6-bf41-1d82a9913e97-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-6chqw\" (UID: \"0b6ba5cf-ff81-4de6-bf41-1d82a9913e97\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-6chqw" Apr 16 19:31:35.025166 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:35.025113 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0b6ba5cf-ff81-4de6-bf41-1d82a9913e97-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-6chqw\" (UID: \"0b6ba5cf-ff81-4de6-bf41-1d82a9913e97\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-6chqw" Apr 16 19:31:35.125758 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:35.125724 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0b6ba5cf-ff81-4de6-bf41-1d82a9913e97-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-6chqw\" (UID: \"0b6ba5cf-ff81-4de6-bf41-1d82a9913e97\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-6chqw" Apr 16 19:31:35.125957 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:35.125778 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f1d1fd5a-5106-4eca-85d4-dec132a69811-root\") pod \"node-exporter-gjg22\" (UID: \"f1d1fd5a-5106-4eca-85d4-dec132a69811\") " pod="openshift-monitoring/node-exporter-gjg22" Apr 16 19:31:35.125957 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:35.125810 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-2fcr4\" (UniqueName: \"kubernetes.io/projected/0b6ba5cf-ff81-4de6-bf41-1d82a9913e97-kube-api-access-2fcr4\") pod \"openshift-state-metrics-9d44df66c-6chqw\" (UID: \"0b6ba5cf-ff81-4de6-bf41-1d82a9913e97\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-6chqw" Apr 16 19:31:35.125957 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:35.125899 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f1d1fd5a-5106-4eca-85d4-dec132a69811-node-exporter-wtmp\") pod \"node-exporter-gjg22\" (UID: \"f1d1fd5a-5106-4eca-85d4-dec132a69811\") " pod="openshift-monitoring/node-exporter-gjg22" Apr 16 19:31:35.125957 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:35.125931 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/0b6ba5cf-ff81-4de6-bf41-1d82a9913e97-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-6chqw\" (UID: \"0b6ba5cf-ff81-4de6-bf41-1d82a9913e97\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-6chqw" Apr 16 19:31:35.126170 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:35.125968 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f1d1fd5a-5106-4eca-85d4-dec132a69811-metrics-client-ca\") pod \"node-exporter-gjg22\" (UID: \"f1d1fd5a-5106-4eca-85d4-dec132a69811\") " pod="openshift-monitoring/node-exporter-gjg22" Apr 16 19:31:35.126170 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:35.126016 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0b6ba5cf-ff81-4de6-bf41-1d82a9913e97-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-6chqw\" (UID: \"0b6ba5cf-ff81-4de6-bf41-1d82a9913e97\") " 
pod="openshift-monitoring/openshift-state-metrics-9d44df66c-6chqw" Apr 16 19:31:35.126170 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:35.126052 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f1d1fd5a-5106-4eca-85d4-dec132a69811-node-exporter-tls\") pod \"node-exporter-gjg22\" (UID: \"f1d1fd5a-5106-4eca-85d4-dec132a69811\") " pod="openshift-monitoring/node-exporter-gjg22" Apr 16 19:31:35.126170 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:35.126142 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4x47\" (UniqueName: \"kubernetes.io/projected/f1d1fd5a-5106-4eca-85d4-dec132a69811-kube-api-access-d4x47\") pod \"node-exporter-gjg22\" (UID: \"f1d1fd5a-5106-4eca-85d4-dec132a69811\") " pod="openshift-monitoring/node-exporter-gjg22" Apr 16 19:31:35.126399 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:35.126292 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f1d1fd5a-5106-4eca-85d4-dec132a69811-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-gjg22\" (UID: \"f1d1fd5a-5106-4eca-85d4-dec132a69811\") " pod="openshift-monitoring/node-exporter-gjg22" Apr 16 19:31:35.126399 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:35.126351 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f1d1fd5a-5106-4eca-85d4-dec132a69811-node-exporter-textfile\") pod \"node-exporter-gjg22\" (UID: \"f1d1fd5a-5106-4eca-85d4-dec132a69811\") " pod="openshift-monitoring/node-exporter-gjg22" Apr 16 19:31:35.126494 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:35.126402 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f1d1fd5a-5106-4eca-85d4-dec132a69811-sys\") pod \"node-exporter-gjg22\" (UID: \"f1d1fd5a-5106-4eca-85d4-dec132a69811\") " pod="openshift-monitoring/node-exporter-gjg22" Apr 16 19:31:35.126494 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:35.126435 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f1d1fd5a-5106-4eca-85d4-dec132a69811-node-exporter-accelerators-collector-config\") pod \"node-exporter-gjg22\" (UID: \"f1d1fd5a-5106-4eca-85d4-dec132a69811\") " pod="openshift-monitoring/node-exporter-gjg22" Apr 16 19:31:35.126860 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:35.126838 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0b6ba5cf-ff81-4de6-bf41-1d82a9913e97-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-6chqw\" (UID: \"0b6ba5cf-ff81-4de6-bf41-1d82a9913e97\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-6chqw" Apr 16 19:31:35.128814 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:35.128786 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0b6ba5cf-ff81-4de6-bf41-1d82a9913e97-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-6chqw\" (UID: \"0b6ba5cf-ff81-4de6-bf41-1d82a9913e97\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-6chqw" Apr 16 19:31:35.128947 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:35.128901 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/0b6ba5cf-ff81-4de6-bf41-1d82a9913e97-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-6chqw\" (UID: 
\"0b6ba5cf-ff81-4de6-bf41-1d82a9913e97\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-6chqw" Apr 16 19:31:35.135878 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:35.135845 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fcr4\" (UniqueName: \"kubernetes.io/projected/0b6ba5cf-ff81-4de6-bf41-1d82a9913e97-kube-api-access-2fcr4\") pod \"openshift-state-metrics-9d44df66c-6chqw\" (UID: \"0b6ba5cf-ff81-4de6-bf41-1d82a9913e97\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-6chqw" Apr 16 19:31:35.227693 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:35.227605 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f1d1fd5a-5106-4eca-85d4-dec132a69811-node-exporter-wtmp\") pod \"node-exporter-gjg22\" (UID: \"f1d1fd5a-5106-4eca-85d4-dec132a69811\") " pod="openshift-monitoring/node-exporter-gjg22" Apr 16 19:31:35.227693 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:35.227651 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f1d1fd5a-5106-4eca-85d4-dec132a69811-metrics-client-ca\") pod \"node-exporter-gjg22\" (UID: \"f1d1fd5a-5106-4eca-85d4-dec132a69811\") " pod="openshift-monitoring/node-exporter-gjg22" Apr 16 19:31:35.227907 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:35.227696 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f1d1fd5a-5106-4eca-85d4-dec132a69811-node-exporter-tls\") pod \"node-exporter-gjg22\" (UID: \"f1d1fd5a-5106-4eca-85d4-dec132a69811\") " pod="openshift-monitoring/node-exporter-gjg22" Apr 16 19:31:35.227907 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:35.227720 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d4x47\" (UniqueName: 
\"kubernetes.io/projected/f1d1fd5a-5106-4eca-85d4-dec132a69811-kube-api-access-d4x47\") pod \"node-exporter-gjg22\" (UID: \"f1d1fd5a-5106-4eca-85d4-dec132a69811\") " pod="openshift-monitoring/node-exporter-gjg22" Apr 16 19:31:35.227907 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:35.227745 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f1d1fd5a-5106-4eca-85d4-dec132a69811-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-gjg22\" (UID: \"f1d1fd5a-5106-4eca-85d4-dec132a69811\") " pod="openshift-monitoring/node-exporter-gjg22" Apr 16 19:31:35.227907 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:35.227779 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f1d1fd5a-5106-4eca-85d4-dec132a69811-node-exporter-wtmp\") pod \"node-exporter-gjg22\" (UID: \"f1d1fd5a-5106-4eca-85d4-dec132a69811\") " pod="openshift-monitoring/node-exporter-gjg22" Apr 16 19:31:35.228090 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:35.227911 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f1d1fd5a-5106-4eca-85d4-dec132a69811-node-exporter-textfile\") pod \"node-exporter-gjg22\" (UID: \"f1d1fd5a-5106-4eca-85d4-dec132a69811\") " pod="openshift-monitoring/node-exporter-gjg22" Apr 16 19:31:35.228090 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:35.227942 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f1d1fd5a-5106-4eca-85d4-dec132a69811-sys\") pod \"node-exporter-gjg22\" (UID: \"f1d1fd5a-5106-4eca-85d4-dec132a69811\") " pod="openshift-monitoring/node-exporter-gjg22" Apr 16 19:31:35.228090 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:35.227968 2579 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f1d1fd5a-5106-4eca-85d4-dec132a69811-node-exporter-accelerators-collector-config\") pod \"node-exporter-gjg22\" (UID: \"f1d1fd5a-5106-4eca-85d4-dec132a69811\") " pod="openshift-monitoring/node-exporter-gjg22" Apr 16 19:31:35.228090 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:35.228020 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f1d1fd5a-5106-4eca-85d4-dec132a69811-root\") pod \"node-exporter-gjg22\" (UID: \"f1d1fd5a-5106-4eca-85d4-dec132a69811\") " pod="openshift-monitoring/node-exporter-gjg22" Apr 16 19:31:35.228317 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:35.228097 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f1d1fd5a-5106-4eca-85d4-dec132a69811-sys\") pod \"node-exporter-gjg22\" (UID: \"f1d1fd5a-5106-4eca-85d4-dec132a69811\") " pod="openshift-monitoring/node-exporter-gjg22" Apr 16 19:31:35.228317 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:35.228097 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f1d1fd5a-5106-4eca-85d4-dec132a69811-root\") pod \"node-exporter-gjg22\" (UID: \"f1d1fd5a-5106-4eca-85d4-dec132a69811\") " pod="openshift-monitoring/node-exporter-gjg22" Apr 16 19:31:35.228395 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:35.228319 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f1d1fd5a-5106-4eca-85d4-dec132a69811-node-exporter-textfile\") pod \"node-exporter-gjg22\" (UID: \"f1d1fd5a-5106-4eca-85d4-dec132a69811\") " pod="openshift-monitoring/node-exporter-gjg22" Apr 16 19:31:35.228487 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:35.228463 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f1d1fd5a-5106-4eca-85d4-dec132a69811-metrics-client-ca\") pod \"node-exporter-gjg22\" (UID: \"f1d1fd5a-5106-4eca-85d4-dec132a69811\") " pod="openshift-monitoring/node-exporter-gjg22" Apr 16 19:31:35.228638 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:35.228618 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f1d1fd5a-5106-4eca-85d4-dec132a69811-node-exporter-accelerators-collector-config\") pod \"node-exporter-gjg22\" (UID: \"f1d1fd5a-5106-4eca-85d4-dec132a69811\") " pod="openshift-monitoring/node-exporter-gjg22" Apr 16 19:31:35.230494 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:35.230471 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f1d1fd5a-5106-4eca-85d4-dec132a69811-node-exporter-tls\") pod \"node-exporter-gjg22\" (UID: \"f1d1fd5a-5106-4eca-85d4-dec132a69811\") " pod="openshift-monitoring/node-exporter-gjg22" Apr 16 19:31:35.230626 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:35.230609 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f1d1fd5a-5106-4eca-85d4-dec132a69811-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-gjg22\" (UID: \"f1d1fd5a-5106-4eca-85d4-dec132a69811\") " pod="openshift-monitoring/node-exporter-gjg22" Apr 16 19:31:35.235819 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:35.235790 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4x47\" (UniqueName: \"kubernetes.io/projected/f1d1fd5a-5106-4eca-85d4-dec132a69811-kube-api-access-d4x47\") pod \"node-exporter-gjg22\" (UID: \"f1d1fd5a-5106-4eca-85d4-dec132a69811\") " pod="openshift-monitoring/node-exporter-gjg22" Apr 16 19:31:35.268718 ip-10-0-129-155 kubenswrapper[2579]: 
I0416 19:31:35.268676 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-6chqw" Apr 16 19:31:35.287733 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:35.287701 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-gjg22" Apr 16 19:31:35.297910 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:31:35.297855 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1d1fd5a_5106_4eca_85d4_dec132a69811.slice/crio-25b4114e0f820dd3be84ff7f3eceb2624949e5e7b5e67ad6468932c4961a14c1 WatchSource:0}: Error finding container 25b4114e0f820dd3be84ff7f3eceb2624949e5e7b5e67ad6468932c4961a14c1: Status 404 returned error can't find the container with id 25b4114e0f820dd3be84ff7f3eceb2624949e5e7b5e67ad6468932c4961a14c1 Apr 16 19:31:35.376861 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:35.376829 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-gjg22" event={"ID":"f1d1fd5a-5106-4eca-85d4-dec132a69811","Type":"ContainerStarted","Data":"25b4114e0f820dd3be84ff7f3eceb2624949e5e7b5e67ad6468932c4961a14c1"} Apr 16 19:31:35.410459 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:35.410433 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-6chqw"] Apr 16 19:31:35.413599 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:31:35.413570 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b6ba5cf_ff81_4de6_bf41_1d82a9913e97.slice/crio-d3c3e4dfd15c55eb383450a58464e1b82956d831241c1dcf05c213cf2a8a4ee2 WatchSource:0}: Error finding container d3c3e4dfd15c55eb383450a58464e1b82956d831241c1dcf05c213cf2a8a4ee2: Status 404 returned error can't find the container with id 
d3c3e4dfd15c55eb383450a58464e1b82956d831241c1dcf05c213cf2a8a4ee2 Apr 16 19:31:36.393422 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:36.382700 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-6chqw" event={"ID":"0b6ba5cf-ff81-4de6-bf41-1d82a9913e97","Type":"ContainerStarted","Data":"6ec87b2ec2602c0c72f35b7f21bd022ce98a8f3c0259af3f26635c2b7e5aa580"} Apr 16 19:31:36.393422 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:36.382750 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-6chqw" event={"ID":"0b6ba5cf-ff81-4de6-bf41-1d82a9913e97","Type":"ContainerStarted","Data":"5c3cf70dda2511288a00e483f1b0a173dbbd0c06ddafcffadba8d58b66e35315"} Apr 16 19:31:36.393422 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:36.382767 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-6chqw" event={"ID":"0b6ba5cf-ff81-4de6-bf41-1d82a9913e97","Type":"ContainerStarted","Data":"d3c3e4dfd15c55eb383450a58464e1b82956d831241c1dcf05c213cf2a8a4ee2"} Apr 16 19:31:37.388616 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:37.388573 2579 generic.go:358] "Generic (PLEG): container finished" podID="f1d1fd5a-5106-4eca-85d4-dec132a69811" containerID="e9e4da0cd31c685c974f262d9c31c19ca89083ea1f4d21bd509bc233739e8e6c" exitCode=0 Apr 16 19:31:37.388773 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:37.388699 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-gjg22" event={"ID":"f1d1fd5a-5106-4eca-85d4-dec132a69811","Type":"ContainerDied","Data":"e9e4da0cd31c685c974f262d9c31c19ca89083ea1f4d21bd509bc233739e8e6c"} Apr 16 19:31:37.390738 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:37.390707 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-6chqw" 
event={"ID":"0b6ba5cf-ff81-4de6-bf41-1d82a9913e97","Type":"ContainerStarted","Data":"1ef65b1cda5f6dcdc2b3648d4d10978d31fe491687b9edfe789bfb47f2cc6af4"} Apr 16 19:31:37.433065 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:37.433005 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-6chqw" podStartSLOduration=2.16767149 podStartE2EDuration="3.432985942s" podCreationTimestamp="2026-04-16 19:31:34 +0000 UTC" firstStartedPulling="2026-04-16 19:31:35.56097281 +0000 UTC m=+79.137750999" lastFinishedPulling="2026-04-16 19:31:36.826287262 +0000 UTC m=+80.403065451" observedRunningTime="2026-04-16 19:31:37.431766568 +0000 UTC m=+81.008544780" watchObservedRunningTime="2026-04-16 19:31:37.432985942 +0000 UTC m=+81.009764154" Apr 16 19:31:38.395483 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:38.395446 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-gjg22" event={"ID":"f1d1fd5a-5106-4eca-85d4-dec132a69811","Type":"ContainerStarted","Data":"d5161542795c9d51525c435786090b2d70ad1e85d97f24c73b0ad1bd26081e21"} Apr 16 19:31:38.395483 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:38.395487 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-gjg22" event={"ID":"f1d1fd5a-5106-4eca-85d4-dec132a69811","Type":"ContainerStarted","Data":"57efb10c834c7c14718d47d7f9f7321289ad64772ac32bee6217349d9fe26653"} Apr 16 19:31:38.417594 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:38.417549 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-gjg22" podStartSLOduration=3.284375136 podStartE2EDuration="4.417536789s" podCreationTimestamp="2026-04-16 19:31:34 +0000 UTC" firstStartedPulling="2026-04-16 19:31:35.299867445 +0000 UTC m=+78.876645634" lastFinishedPulling="2026-04-16 19:31:36.433029087 +0000 UTC m=+80.009807287" observedRunningTime="2026-04-16 
19:31:38.416438764 +0000 UTC m=+81.993216985" watchObservedRunningTime="2026-04-16 19:31:38.417536789 +0000 UTC m=+81.994315001" Apr 16 19:31:39.357701 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:39.357666 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-x5hqp" Apr 16 19:31:42.960493 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:42.960454 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7dcf7c4679-t62l6"] Apr 16 19:31:42.964830 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:42.964806 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-7dcf7c4679-t62l6" Apr 16 19:31:46.940791 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:46.940765 2579 scope.go:117] "RemoveContainer" containerID="0a9293e8d101828bef0b0651a8b8139db4b3b36e2df1927c48eb5e0b327ae514" Apr 16 19:31:47.425795 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:47.425728 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ckt7c_d5d951a1-3e60-4517-ae8b-75bba19200c9/console-operator/2.log" Apr 16 19:31:47.425925 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:47.425817 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-ckt7c" event={"ID":"d5d951a1-3e60-4517-ae8b-75bba19200c9","Type":"ContainerStarted","Data":"77bf820d766bdc0bea3806a13eb6583f4d3945669df99ea00213da37e875d0b0"} Apr 16 19:31:47.426250 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:47.426226 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-ckt7c" Apr 16 19:31:47.444182 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:47.444124 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-console-operator/console-operator-9d4b6777b-ckt7c" podStartSLOduration=78.543954461 podStartE2EDuration="1m24.444111379s" podCreationTimestamp="2026-04-16 19:30:23 +0000 UTC" firstStartedPulling="2026-04-16 19:30:51.332402593 +0000 UTC m=+34.909180798" lastFinishedPulling="2026-04-16 19:30:57.232559516 +0000 UTC m=+40.809337716" observedRunningTime="2026-04-16 19:31:47.443574508 +0000 UTC m=+91.020352724" watchObservedRunningTime="2026-04-16 19:31:47.444111379 +0000 UTC m=+91.020889591" Apr 16 19:31:48.028096 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:48.028068 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-ckt7c" Apr 16 19:31:48.208521 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:48.208490 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-bwjxs"] Apr 16 19:31:48.212326 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:48.212304 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-bwjxs" Apr 16 19:31:48.215351 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:48.215329 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-n92kz\"" Apr 16 19:31:48.215784 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:48.215764 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 16 19:31:48.215886 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:48.215869 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 16 19:31:48.223323 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:48.223299 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-bwjxs"] Apr 16 19:31:48.346443 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:48.346354 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddlj8\" (UniqueName: \"kubernetes.io/projected/8685294b-3f49-44ea-a7b8-0b967dd8ddbe-kube-api-access-ddlj8\") pod \"downloads-6bcc868b7-bwjxs\" (UID: \"8685294b-3f49-44ea-a7b8-0b967dd8ddbe\") " pod="openshift-console/downloads-6bcc868b7-bwjxs" Apr 16 19:31:48.447324 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:48.447290 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ddlj8\" (UniqueName: \"kubernetes.io/projected/8685294b-3f49-44ea-a7b8-0b967dd8ddbe-kube-api-access-ddlj8\") pod \"downloads-6bcc868b7-bwjxs\" (UID: \"8685294b-3f49-44ea-a7b8-0b967dd8ddbe\") " pod="openshift-console/downloads-6bcc868b7-bwjxs" Apr 16 19:31:48.455110 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:48.455082 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddlj8\" (UniqueName: 
\"kubernetes.io/projected/8685294b-3f49-44ea-a7b8-0b967dd8ddbe-kube-api-access-ddlj8\") pod \"downloads-6bcc868b7-bwjxs\" (UID: \"8685294b-3f49-44ea-a7b8-0b967dd8ddbe\") " pod="openshift-console/downloads-6bcc868b7-bwjxs" Apr 16 19:31:48.521923 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:48.521895 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-bwjxs" Apr 16 19:31:48.641061 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:48.641037 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-bwjxs"] Apr 16 19:31:48.643357 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:31:48.643332 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8685294b_3f49_44ea_a7b8_0b967dd8ddbe.slice/crio-26e69de60f139eee0fc68c0f2451d540ef677512f381f51767d3c701195fcf82 WatchSource:0}: Error finding container 26e69de60f139eee0fc68c0f2451d540ef677512f381f51767d3c701195fcf82: Status 404 returned error can't find the container with id 26e69de60f139eee0fc68c0f2451d540ef677512f381f51767d3c701195fcf82 Apr 16 19:31:49.434034 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:49.433978 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-bwjxs" event={"ID":"8685294b-3f49-44ea-a7b8-0b967dd8ddbe","Type":"ContainerStarted","Data":"26e69de60f139eee0fc68c0f2451d540ef677512f381f51767d3c701195fcf82"} Apr 16 19:31:53.722378 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:53.722342 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-b455f755b-qhr8r"] Apr 16 19:31:53.725973 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:53.725951 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-b455f755b-qhr8r" Apr 16 19:31:53.728830 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:53.728807 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 16 19:31:53.728830 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:53.728816 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 16 19:31:53.728993 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:53.728815 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 16 19:31:53.729911 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:53.729879 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 16 19:31:53.729997 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:53.729924 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 16 19:31:53.729997 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:53.729927 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-jkxxn\"" Apr 16 19:31:53.735222 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:53.735187 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-b455f755b-qhr8r"] Apr 16 19:31:53.897149 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:53.897070 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b4a5f6c8-877e-4610-a987-0e45f179396a-console-serving-cert\") pod \"console-b455f755b-qhr8r\" (UID: \"b4a5f6c8-877e-4610-a987-0e45f179396a\") " pod="openshift-console/console-b455f755b-qhr8r" Apr 16 19:31:53.897149 ip-10-0-129-155 kubenswrapper[2579]: 
I0416 19:31:53.897111 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b4a5f6c8-877e-4610-a987-0e45f179396a-console-oauth-config\") pod \"console-b455f755b-qhr8r\" (UID: \"b4a5f6c8-877e-4610-a987-0e45f179396a\") " pod="openshift-console/console-b455f755b-qhr8r" Apr 16 19:31:53.897149 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:53.897142 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b4a5f6c8-877e-4610-a987-0e45f179396a-service-ca\") pod \"console-b455f755b-qhr8r\" (UID: \"b4a5f6c8-877e-4610-a987-0e45f179396a\") " pod="openshift-console/console-b455f755b-qhr8r" Apr 16 19:31:53.897373 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:53.897271 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b4a5f6c8-877e-4610-a987-0e45f179396a-console-config\") pod \"console-b455f755b-qhr8r\" (UID: \"b4a5f6c8-877e-4610-a987-0e45f179396a\") " pod="openshift-console/console-b455f755b-qhr8r" Apr 16 19:31:53.897373 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:53.897306 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9g44\" (UniqueName: \"kubernetes.io/projected/b4a5f6c8-877e-4610-a987-0e45f179396a-kube-api-access-h9g44\") pod \"console-b455f755b-qhr8r\" (UID: \"b4a5f6c8-877e-4610-a987-0e45f179396a\") " pod="openshift-console/console-b455f755b-qhr8r" Apr 16 19:31:53.897373 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:53.897338 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b4a5f6c8-877e-4610-a987-0e45f179396a-oauth-serving-cert\") pod \"console-b455f755b-qhr8r\" (UID: 
\"b4a5f6c8-877e-4610-a987-0e45f179396a\") " pod="openshift-console/console-b455f755b-qhr8r" Apr 16 19:31:53.997884 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:53.997842 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b4a5f6c8-877e-4610-a987-0e45f179396a-console-serving-cert\") pod \"console-b455f755b-qhr8r\" (UID: \"b4a5f6c8-877e-4610-a987-0e45f179396a\") " pod="openshift-console/console-b455f755b-qhr8r" Apr 16 19:31:53.998091 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:53.997913 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b4a5f6c8-877e-4610-a987-0e45f179396a-console-oauth-config\") pod \"console-b455f755b-qhr8r\" (UID: \"b4a5f6c8-877e-4610-a987-0e45f179396a\") " pod="openshift-console/console-b455f755b-qhr8r" Apr 16 19:31:53.998091 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:53.997960 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b4a5f6c8-877e-4610-a987-0e45f179396a-service-ca\") pod \"console-b455f755b-qhr8r\" (UID: \"b4a5f6c8-877e-4610-a987-0e45f179396a\") " pod="openshift-console/console-b455f755b-qhr8r" Apr 16 19:31:53.998091 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:53.998011 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b4a5f6c8-877e-4610-a987-0e45f179396a-console-config\") pod \"console-b455f755b-qhr8r\" (UID: \"b4a5f6c8-877e-4610-a987-0e45f179396a\") " pod="openshift-console/console-b455f755b-qhr8r" Apr 16 19:31:53.998091 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:53.998043 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h9g44\" (UniqueName: 
\"kubernetes.io/projected/b4a5f6c8-877e-4610-a987-0e45f179396a-kube-api-access-h9g44\") pod \"console-b455f755b-qhr8r\" (UID: \"b4a5f6c8-877e-4610-a987-0e45f179396a\") " pod="openshift-console/console-b455f755b-qhr8r" Apr 16 19:31:53.998091 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:53.998079 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b4a5f6c8-877e-4610-a987-0e45f179396a-oauth-serving-cert\") pod \"console-b455f755b-qhr8r\" (UID: \"b4a5f6c8-877e-4610-a987-0e45f179396a\") " pod="openshift-console/console-b455f755b-qhr8r" Apr 16 19:31:53.999082 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:53.999055 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b4a5f6c8-877e-4610-a987-0e45f179396a-console-config\") pod \"console-b455f755b-qhr8r\" (UID: \"b4a5f6c8-877e-4610-a987-0e45f179396a\") " pod="openshift-console/console-b455f755b-qhr8r" Apr 16 19:31:53.999233 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:53.999130 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b4a5f6c8-877e-4610-a987-0e45f179396a-oauth-serving-cert\") pod \"console-b455f755b-qhr8r\" (UID: \"b4a5f6c8-877e-4610-a987-0e45f179396a\") " pod="openshift-console/console-b455f755b-qhr8r" Apr 16 19:31:53.999233 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:53.999146 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b4a5f6c8-877e-4610-a987-0e45f179396a-service-ca\") pod \"console-b455f755b-qhr8r\" (UID: \"b4a5f6c8-877e-4610-a987-0e45f179396a\") " pod="openshift-console/console-b455f755b-qhr8r" Apr 16 19:31:54.000891 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:54.000864 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b4a5f6c8-877e-4610-a987-0e45f179396a-console-oauth-config\") pod \"console-b455f755b-qhr8r\" (UID: \"b4a5f6c8-877e-4610-a987-0e45f179396a\") " pod="openshift-console/console-b455f755b-qhr8r" Apr 16 19:31:54.001161 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:54.001143 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b4a5f6c8-877e-4610-a987-0e45f179396a-console-serving-cert\") pod \"console-b455f755b-qhr8r\" (UID: \"b4a5f6c8-877e-4610-a987-0e45f179396a\") " pod="openshift-console/console-b455f755b-qhr8r" Apr 16 19:31:54.006853 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:54.006832 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9g44\" (UniqueName: \"kubernetes.io/projected/b4a5f6c8-877e-4610-a987-0e45f179396a-kube-api-access-h9g44\") pod \"console-b455f755b-qhr8r\" (UID: \"b4a5f6c8-877e-4610-a987-0e45f179396a\") " pod="openshift-console/console-b455f755b-qhr8r" Apr 16 19:31:54.035652 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:54.035617 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-b455f755b-qhr8r" Apr 16 19:31:54.168988 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:54.168895 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-b455f755b-qhr8r"] Apr 16 19:31:54.173100 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:31:54.173068 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4a5f6c8_877e_4610_a987_0e45f179396a.slice/crio-780d5819edb2bdfec487cc5eb1d849ad90d56c54d66185d72d47e304d69a3289 WatchSource:0}: Error finding container 780d5819edb2bdfec487cc5eb1d849ad90d56c54d66185d72d47e304d69a3289: Status 404 returned error can't find the container with id 780d5819edb2bdfec487cc5eb1d849ad90d56c54d66185d72d47e304d69a3289 Apr 16 19:31:54.451107 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:54.451074 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-b455f755b-qhr8r" event={"ID":"b4a5f6c8-877e-4610-a987-0e45f179396a","Type":"ContainerStarted","Data":"780d5819edb2bdfec487cc5eb1d849ad90d56c54d66185d72d47e304d69a3289"} Apr 16 19:31:55.332569 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:55.332532 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-6glh4" Apr 16 19:31:58.466324 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:58.466282 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-b455f755b-qhr8r" event={"ID":"b4a5f6c8-877e-4610-a987-0e45f179396a","Type":"ContainerStarted","Data":"a3a156a0fd2703b2c5e281dc3b248595877682f71821ea12c5801136b7240940"} Apr 16 19:31:58.483880 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:31:58.483825 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-b455f755b-qhr8r" podStartSLOduration=1.696999148 podStartE2EDuration="5.483810454s" 
podCreationTimestamp="2026-04-16 19:31:53 +0000 UTC" firstStartedPulling="2026-04-16 19:31:54.175173397 +0000 UTC m=+97.751951601" lastFinishedPulling="2026-04-16 19:31:57.961984691 +0000 UTC m=+101.538762907" observedRunningTime="2026-04-16 19:31:58.483473643 +0000 UTC m=+102.060251870" watchObservedRunningTime="2026-04-16 19:31:58.483810454 +0000 UTC m=+102.060588665" Apr 16 19:32:03.911695 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:03.911650 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-66ffc66448-mg8fh"] Apr 16 19:32:03.916793 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:03.916769 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-66ffc66448-mg8fh" Apr 16 19:32:03.923281 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:03.923259 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-66ffc66448-mg8fh"] Apr 16 19:32:03.926164 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:03.925968 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 16 19:32:03.992569 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:03.992537 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b0ebd625-b7ea-4f5d-aae2-b364ad10a29d-oauth-serving-cert\") pod \"console-66ffc66448-mg8fh\" (UID: \"b0ebd625-b7ea-4f5d-aae2-b364ad10a29d\") " pod="openshift-console/console-66ffc66448-mg8fh" Apr 16 19:32:03.992683 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:03.992592 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b0ebd625-b7ea-4f5d-aae2-b364ad10a29d-console-oauth-config\") pod \"console-66ffc66448-mg8fh\" (UID: \"b0ebd625-b7ea-4f5d-aae2-b364ad10a29d\") " 
pod="openshift-console/console-66ffc66448-mg8fh" Apr 16 19:32:03.992683 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:03.992667 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b0ebd625-b7ea-4f5d-aae2-b364ad10a29d-console-serving-cert\") pod \"console-66ffc66448-mg8fh\" (UID: \"b0ebd625-b7ea-4f5d-aae2-b364ad10a29d\") " pod="openshift-console/console-66ffc66448-mg8fh" Apr 16 19:32:03.992793 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:03.992701 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b0ebd625-b7ea-4f5d-aae2-b364ad10a29d-trusted-ca-bundle\") pod \"console-66ffc66448-mg8fh\" (UID: \"b0ebd625-b7ea-4f5d-aae2-b364ad10a29d\") " pod="openshift-console/console-66ffc66448-mg8fh" Apr 16 19:32:03.992793 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:03.992748 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjzvv\" (UniqueName: \"kubernetes.io/projected/b0ebd625-b7ea-4f5d-aae2-b364ad10a29d-kube-api-access-pjzvv\") pod \"console-66ffc66448-mg8fh\" (UID: \"b0ebd625-b7ea-4f5d-aae2-b364ad10a29d\") " pod="openshift-console/console-66ffc66448-mg8fh" Apr 16 19:32:03.992793 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:03.992782 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b0ebd625-b7ea-4f5d-aae2-b364ad10a29d-service-ca\") pod \"console-66ffc66448-mg8fh\" (UID: \"b0ebd625-b7ea-4f5d-aae2-b364ad10a29d\") " pod="openshift-console/console-66ffc66448-mg8fh" Apr 16 19:32:03.992926 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:03.992834 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/b0ebd625-b7ea-4f5d-aae2-b364ad10a29d-console-config\") pod \"console-66ffc66448-mg8fh\" (UID: \"b0ebd625-b7ea-4f5d-aae2-b364ad10a29d\") " pod="openshift-console/console-66ffc66448-mg8fh" Apr 16 19:32:04.036544 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:04.036508 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-b455f755b-qhr8r" Apr 16 19:32:04.036701 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:04.036561 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-b455f755b-qhr8r" Apr 16 19:32:04.042017 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:04.041994 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-b455f755b-qhr8r" Apr 16 19:32:04.093588 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:04.093554 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b0ebd625-b7ea-4f5d-aae2-b364ad10a29d-trusted-ca-bundle\") pod \"console-66ffc66448-mg8fh\" (UID: \"b0ebd625-b7ea-4f5d-aae2-b364ad10a29d\") " pod="openshift-console/console-66ffc66448-mg8fh" Apr 16 19:32:04.093746 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:04.093707 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pjzvv\" (UniqueName: \"kubernetes.io/projected/b0ebd625-b7ea-4f5d-aae2-b364ad10a29d-kube-api-access-pjzvv\") pod \"console-66ffc66448-mg8fh\" (UID: \"b0ebd625-b7ea-4f5d-aae2-b364ad10a29d\") " pod="openshift-console/console-66ffc66448-mg8fh" Apr 16 19:32:04.093746 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:04.093742 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b0ebd625-b7ea-4f5d-aae2-b364ad10a29d-service-ca\") pod \"console-66ffc66448-mg8fh\" (UID: 
\"b0ebd625-b7ea-4f5d-aae2-b364ad10a29d\") " pod="openshift-console/console-66ffc66448-mg8fh" Apr 16 19:32:04.093866 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:04.093807 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b0ebd625-b7ea-4f5d-aae2-b364ad10a29d-console-config\") pod \"console-66ffc66448-mg8fh\" (UID: \"b0ebd625-b7ea-4f5d-aae2-b364ad10a29d\") " pod="openshift-console/console-66ffc66448-mg8fh" Apr 16 19:32:04.094196 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:04.094000 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b0ebd625-b7ea-4f5d-aae2-b364ad10a29d-oauth-serving-cert\") pod \"console-66ffc66448-mg8fh\" (UID: \"b0ebd625-b7ea-4f5d-aae2-b364ad10a29d\") " pod="openshift-console/console-66ffc66448-mg8fh" Apr 16 19:32:04.094196 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:04.094063 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b0ebd625-b7ea-4f5d-aae2-b364ad10a29d-console-oauth-config\") pod \"console-66ffc66448-mg8fh\" (UID: \"b0ebd625-b7ea-4f5d-aae2-b364ad10a29d\") " pod="openshift-console/console-66ffc66448-mg8fh" Apr 16 19:32:04.094196 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:04.094131 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b0ebd625-b7ea-4f5d-aae2-b364ad10a29d-console-serving-cert\") pod \"console-66ffc66448-mg8fh\" (UID: \"b0ebd625-b7ea-4f5d-aae2-b364ad10a29d\") " pod="openshift-console/console-66ffc66448-mg8fh" Apr 16 19:32:04.094627 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:04.094598 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/b0ebd625-b7ea-4f5d-aae2-b364ad10a29d-service-ca\") pod \"console-66ffc66448-mg8fh\" (UID: \"b0ebd625-b7ea-4f5d-aae2-b364ad10a29d\") " pod="openshift-console/console-66ffc66448-mg8fh" Apr 16 19:32:04.094721 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:04.094674 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b0ebd625-b7ea-4f5d-aae2-b364ad10a29d-trusted-ca-bundle\") pod \"console-66ffc66448-mg8fh\" (UID: \"b0ebd625-b7ea-4f5d-aae2-b364ad10a29d\") " pod="openshift-console/console-66ffc66448-mg8fh" Apr 16 19:32:04.094782 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:04.094726 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b0ebd625-b7ea-4f5d-aae2-b364ad10a29d-console-config\") pod \"console-66ffc66448-mg8fh\" (UID: \"b0ebd625-b7ea-4f5d-aae2-b364ad10a29d\") " pod="openshift-console/console-66ffc66448-mg8fh" Apr 16 19:32:04.094834 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:04.094777 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b0ebd625-b7ea-4f5d-aae2-b364ad10a29d-oauth-serving-cert\") pod \"console-66ffc66448-mg8fh\" (UID: \"b0ebd625-b7ea-4f5d-aae2-b364ad10a29d\") " pod="openshift-console/console-66ffc66448-mg8fh" Apr 16 19:32:04.096717 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:04.096692 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b0ebd625-b7ea-4f5d-aae2-b364ad10a29d-console-oauth-config\") pod \"console-66ffc66448-mg8fh\" (UID: \"b0ebd625-b7ea-4f5d-aae2-b364ad10a29d\") " pod="openshift-console/console-66ffc66448-mg8fh" Apr 16 19:32:04.097007 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:04.096974 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b0ebd625-b7ea-4f5d-aae2-b364ad10a29d-console-serving-cert\") pod \"console-66ffc66448-mg8fh\" (UID: \"b0ebd625-b7ea-4f5d-aae2-b364ad10a29d\") " pod="openshift-console/console-66ffc66448-mg8fh" Apr 16 19:32:04.101595 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:04.101574 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjzvv\" (UniqueName: \"kubernetes.io/projected/b0ebd625-b7ea-4f5d-aae2-b364ad10a29d-kube-api-access-pjzvv\") pod \"console-66ffc66448-mg8fh\" (UID: \"b0ebd625-b7ea-4f5d-aae2-b364ad10a29d\") " pod="openshift-console/console-66ffc66448-mg8fh" Apr 16 19:32:04.229277 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:04.229241 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-66ffc66448-mg8fh" Apr 16 19:32:04.490239 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:04.490141 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-b455f755b-qhr8r" Apr 16 19:32:07.979772 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:07.979705 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-7dcf7c4679-t62l6" podUID="c51aa13f-e270-4caa-b4b8-0133dfaf5f84" containerName="registry" containerID="cri-o://636df52d620bf07b264bbd987d66908431828430696c41c72b49cf88518ff907" gracePeriod=30 Apr 16 19:32:08.729237 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:08.729196 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-7dcf7c4679-t62l6" Apr 16 19:32:08.836632 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:08.836562 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c51aa13f-e270-4caa-b4b8-0133dfaf5f84-image-registry-private-configuration\") pod \"c51aa13f-e270-4caa-b4b8-0133dfaf5f84\" (UID: \"c51aa13f-e270-4caa-b4b8-0133dfaf5f84\") " Apr 16 19:32:08.836632 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:08.836608 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9q4z8\" (UniqueName: \"kubernetes.io/projected/c51aa13f-e270-4caa-b4b8-0133dfaf5f84-kube-api-access-9q4z8\") pod \"c51aa13f-e270-4caa-b4b8-0133dfaf5f84\" (UID: \"c51aa13f-e270-4caa-b4b8-0133dfaf5f84\") " Apr 16 19:32:08.836842 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:08.836639 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c51aa13f-e270-4caa-b4b8-0133dfaf5f84-registry-tls\") pod \"c51aa13f-e270-4caa-b4b8-0133dfaf5f84\" (UID: \"c51aa13f-e270-4caa-b4b8-0133dfaf5f84\") " Apr 16 19:32:08.836842 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:08.836678 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c51aa13f-e270-4caa-b4b8-0133dfaf5f84-registry-certificates\") pod \"c51aa13f-e270-4caa-b4b8-0133dfaf5f84\" (UID: \"c51aa13f-e270-4caa-b4b8-0133dfaf5f84\") " Apr 16 19:32:08.836842 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:08.836693 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c51aa13f-e270-4caa-b4b8-0133dfaf5f84-trusted-ca\") pod \"c51aa13f-e270-4caa-b4b8-0133dfaf5f84\" (UID: 
\"c51aa13f-e270-4caa-b4b8-0133dfaf5f84\") " Apr 16 19:32:08.836842 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:08.836731 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c51aa13f-e270-4caa-b4b8-0133dfaf5f84-ca-trust-extracted\") pod \"c51aa13f-e270-4caa-b4b8-0133dfaf5f84\" (UID: \"c51aa13f-e270-4caa-b4b8-0133dfaf5f84\") " Apr 16 19:32:08.836842 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:08.836761 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c51aa13f-e270-4caa-b4b8-0133dfaf5f84-bound-sa-token\") pod \"c51aa13f-e270-4caa-b4b8-0133dfaf5f84\" (UID: \"c51aa13f-e270-4caa-b4b8-0133dfaf5f84\") " Apr 16 19:32:08.836842 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:08.836793 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c51aa13f-e270-4caa-b4b8-0133dfaf5f84-installation-pull-secrets\") pod \"c51aa13f-e270-4caa-b4b8-0133dfaf5f84\" (UID: \"c51aa13f-e270-4caa-b4b8-0133dfaf5f84\") " Apr 16 19:32:08.837239 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:08.837190 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c51aa13f-e270-4caa-b4b8-0133dfaf5f84-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "c51aa13f-e270-4caa-b4b8-0133dfaf5f84" (UID: "c51aa13f-e270-4caa-b4b8-0133dfaf5f84"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:32:08.837323 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:08.837260 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c51aa13f-e270-4caa-b4b8-0133dfaf5f84-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "c51aa13f-e270-4caa-b4b8-0133dfaf5f84" (UID: "c51aa13f-e270-4caa-b4b8-0133dfaf5f84"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:32:08.839605 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:08.839565 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c51aa13f-e270-4caa-b4b8-0133dfaf5f84-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "c51aa13f-e270-4caa-b4b8-0133dfaf5f84" (UID: "c51aa13f-e270-4caa-b4b8-0133dfaf5f84"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:32:08.839605 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:08.839616 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c51aa13f-e270-4caa-b4b8-0133dfaf5f84-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "c51aa13f-e270-4caa-b4b8-0133dfaf5f84" (UID: "c51aa13f-e270-4caa-b4b8-0133dfaf5f84"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 19:32:08.839860 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:08.839750 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c51aa13f-e270-4caa-b4b8-0133dfaf5f84-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "c51aa13f-e270-4caa-b4b8-0133dfaf5f84" (UID: "c51aa13f-e270-4caa-b4b8-0133dfaf5f84"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 19:32:08.839860 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:08.839817 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c51aa13f-e270-4caa-b4b8-0133dfaf5f84-kube-api-access-9q4z8" (OuterVolumeSpecName: "kube-api-access-9q4z8") pod "c51aa13f-e270-4caa-b4b8-0133dfaf5f84" (UID: "c51aa13f-e270-4caa-b4b8-0133dfaf5f84"). InnerVolumeSpecName "kube-api-access-9q4z8". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 19:32:08.840307 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:08.840272 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c51aa13f-e270-4caa-b4b8-0133dfaf5f84-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "c51aa13f-e270-4caa-b4b8-0133dfaf5f84" (UID: "c51aa13f-e270-4caa-b4b8-0133dfaf5f84"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 19:32:08.846573 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:08.846523 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c51aa13f-e270-4caa-b4b8-0133dfaf5f84-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "c51aa13f-e270-4caa-b4b8-0133dfaf5f84" (UID: "c51aa13f-e270-4caa-b4b8-0133dfaf5f84"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 19:32:08.916669 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:08.916635 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-66ffc66448-mg8fh"]
Apr 16 19:32:08.920173 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:32:08.920136 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0ebd625_b7ea_4f5d_aae2_b364ad10a29d.slice/crio-f3f33fae019585f127cb06bf4bf6d69cfe2c513c2636d5a1fed3ad0f8b8838ea WatchSource:0}: Error finding container f3f33fae019585f127cb06bf4bf6d69cfe2c513c2636d5a1fed3ad0f8b8838ea: Status 404 returned error can't find the container with id f3f33fae019585f127cb06bf4bf6d69cfe2c513c2636d5a1fed3ad0f8b8838ea
Apr 16 19:32:08.937778 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:08.937746 2579 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c51aa13f-e270-4caa-b4b8-0133dfaf5f84-image-registry-private-configuration\") on node \"ip-10-0-129-155.ec2.internal\" DevicePath \"\""
Apr 16 19:32:08.937873 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:08.937778 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9q4z8\" (UniqueName: \"kubernetes.io/projected/c51aa13f-e270-4caa-b4b8-0133dfaf5f84-kube-api-access-9q4z8\") on node \"ip-10-0-129-155.ec2.internal\" DevicePath \"\""
Apr 16 19:32:08.937873 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:08.937797 2579 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c51aa13f-e270-4caa-b4b8-0133dfaf5f84-registry-tls\") on node \"ip-10-0-129-155.ec2.internal\" DevicePath \"\""
Apr 16 19:32:08.937873 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:08.937812 2579 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c51aa13f-e270-4caa-b4b8-0133dfaf5f84-registry-certificates\") on node \"ip-10-0-129-155.ec2.internal\" DevicePath \"\""
Apr 16 19:32:08.937873 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:08.937827 2579 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c51aa13f-e270-4caa-b4b8-0133dfaf5f84-trusted-ca\") on node \"ip-10-0-129-155.ec2.internal\" DevicePath \"\""
Apr 16 19:32:08.937873 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:08.937842 2579 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c51aa13f-e270-4caa-b4b8-0133dfaf5f84-ca-trust-extracted\") on node \"ip-10-0-129-155.ec2.internal\" DevicePath \"\""
Apr 16 19:32:08.937873 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:08.937856 2579 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c51aa13f-e270-4caa-b4b8-0133dfaf5f84-bound-sa-token\") on node \"ip-10-0-129-155.ec2.internal\" DevicePath \"\""
Apr 16 19:32:08.937873 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:08.937870 2579 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c51aa13f-e270-4caa-b4b8-0133dfaf5f84-installation-pull-secrets\") on node \"ip-10-0-129-155.ec2.internal\" DevicePath \"\""
Apr 16 19:32:09.501725 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:09.501681 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66ffc66448-mg8fh" event={"ID":"b0ebd625-b7ea-4f5d-aae2-b364ad10a29d","Type":"ContainerStarted","Data":"c1662eebedd70787dd0d0f464079c7e193b4ee4bed86261ed2bd2a7c2b7c093e"}
Apr 16 19:32:09.501725 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:09.501730 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66ffc66448-mg8fh" event={"ID":"b0ebd625-b7ea-4f5d-aae2-b364ad10a29d","Type":"ContainerStarted","Data":"f3f33fae019585f127cb06bf4bf6d69cfe2c513c2636d5a1fed3ad0f8b8838ea"}
Apr 16 19:32:09.503062 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:09.503032 2579 generic.go:358] "Generic (PLEG): container finished" podID="c51aa13f-e270-4caa-b4b8-0133dfaf5f84" containerID="636df52d620bf07b264bbd987d66908431828430696c41c72b49cf88518ff907" exitCode=0
Apr 16 19:32:09.503182 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:09.503087 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7dcf7c4679-t62l6"
Apr 16 19:32:09.503182 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:09.503098 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7dcf7c4679-t62l6" event={"ID":"c51aa13f-e270-4caa-b4b8-0133dfaf5f84","Type":"ContainerDied","Data":"636df52d620bf07b264bbd987d66908431828430696c41c72b49cf88518ff907"}
Apr 16 19:32:09.503182 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:09.503141 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-7dcf7c4679-t62l6" event={"ID":"c51aa13f-e270-4caa-b4b8-0133dfaf5f84","Type":"ContainerDied","Data":"f82e667551bcf2e835db6600d0d6eb952d28fc6a8c776a1fd6ebc14df304d3a9"}
Apr 16 19:32:09.503182 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:09.503164 2579 scope.go:117] "RemoveContainer" containerID="636df52d620bf07b264bbd987d66908431828430696c41c72b49cf88518ff907"
Apr 16 19:32:09.504828 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:09.504802 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-bwjxs" event={"ID":"8685294b-3f49-44ea-a7b8-0b967dd8ddbe","Type":"ContainerStarted","Data":"1a49903ef8cc579ea0a5ce108b1a40463e5350320ee21e76ff7bf01ea61befca"}
Apr 16 19:32:09.505082 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:09.505043 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-bwjxs"
Apr 16 19:32:09.512064 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:09.512013 2579 scope.go:117] "RemoveContainer" containerID="636df52d620bf07b264bbd987d66908431828430696c41c72b49cf88518ff907"
Apr 16 19:32:09.512362 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:32:09.512339 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"636df52d620bf07b264bbd987d66908431828430696c41c72b49cf88518ff907\": container with ID starting with 636df52d620bf07b264bbd987d66908431828430696c41c72b49cf88518ff907 not found: ID does not exist" containerID="636df52d620bf07b264bbd987d66908431828430696c41c72b49cf88518ff907"
Apr 16 19:32:09.512450 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:09.512369 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"636df52d620bf07b264bbd987d66908431828430696c41c72b49cf88518ff907"} err="failed to get container status \"636df52d620bf07b264bbd987d66908431828430696c41c72b49cf88518ff907\": rpc error: code = NotFound desc = could not find container \"636df52d620bf07b264bbd987d66908431828430696c41c72b49cf88518ff907\": container with ID starting with 636df52d620bf07b264bbd987d66908431828430696c41c72b49cf88518ff907 not found: ID does not exist"
Apr 16 19:32:09.518477 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:09.518458 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-bwjxs"
Apr 16 19:32:09.519807 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:09.519502 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-66ffc66448-mg8fh" podStartSLOduration=6.519486137 podStartE2EDuration="6.519486137s" podCreationTimestamp="2026-04-16 19:32:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:32:09.518877204 +0000 UTC m=+113.095655416" watchObservedRunningTime="2026-04-16 19:32:09.519486137 +0000 UTC m=+113.096264348"
Apr 16 19:32:09.531929 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:09.531901 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7dcf7c4679-t62l6"]
Apr 16 19:32:09.534994 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:09.534971 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-7dcf7c4679-t62l6"]
Apr 16 19:32:09.549503 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:09.549457 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-bwjxs" podStartSLOduration=1.5322726740000001 podStartE2EDuration="21.549442946s" podCreationTimestamp="2026-04-16 19:31:48 +0000 UTC" firstStartedPulling="2026-04-16 19:31:48.645436564 +0000 UTC m=+92.222214752" lastFinishedPulling="2026-04-16 19:32:08.662606819 +0000 UTC m=+112.239385024" observedRunningTime="2026-04-16 19:32:09.548665597 +0000 UTC m=+113.125443807" watchObservedRunningTime="2026-04-16 19:32:09.549442946 +0000 UTC m=+113.126221158"
Apr 16 19:32:10.944382 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:10.944343 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c51aa13f-e270-4caa-b4b8-0133dfaf5f84" path="/var/lib/kubelet/pods/c51aa13f-e270-4caa-b4b8-0133dfaf5f84/volumes"
Apr 16 19:32:13.519456 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:13.519421 2579 generic.go:358] "Generic (PLEG): container finished" podID="fba37944-a4ef-4148-aae0-6082d55a03b2" containerID="6c54cf990955cea0e0a9141fe338f9cbb39936a3768f6391308228cf92e157f1" exitCode=0
Apr 16 19:32:13.519863 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:13.519497 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2dz6q" event={"ID":"fba37944-a4ef-4148-aae0-6082d55a03b2","Type":"ContainerDied","Data":"6c54cf990955cea0e0a9141fe338f9cbb39936a3768f6391308228cf92e157f1"}
Apr 16 19:32:13.519935 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:13.519864 2579 scope.go:117] "RemoveContainer" containerID="6c54cf990955cea0e0a9141fe338f9cbb39936a3768f6391308228cf92e157f1"
Apr 16 19:32:14.230222 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:14.230172 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-66ffc66448-mg8fh"
Apr 16 19:32:14.230401 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:14.230246 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-66ffc66448-mg8fh"
Apr 16 19:32:14.235782 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:14.235758 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-66ffc66448-mg8fh"
Apr 16 19:32:14.524845 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:14.524757 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-2dz6q" event={"ID":"fba37944-a4ef-4148-aae0-6082d55a03b2","Type":"ContainerStarted","Data":"3e0c3d67fc8825caa1089caa22a0b64e1009a311b0540dbae69d846b991e80fa"}
Apr 16 19:32:14.529954 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:14.529928 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-66ffc66448-mg8fh"
Apr 16 19:32:14.594729 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:14.594700 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-b455f755b-qhr8r"]
Apr 16 19:32:28.565404 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:28.565374 2579 generic.go:358] "Generic (PLEG): container finished" podID="33c6084e-53d2-4e83-85bf-1e53dca8d967" containerID="39213adc7905f8ae4cac4276dd79d47b3dcde732f8264b7e4a8d70d173d7b6ba" exitCode=0
Apr 16 19:32:28.565806 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:28.565450 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-ph52k" event={"ID":"33c6084e-53d2-4e83-85bf-1e53dca8d967","Type":"ContainerDied","Data":"39213adc7905f8ae4cac4276dd79d47b3dcde732f8264b7e4a8d70d173d7b6ba"}
Apr 16 19:32:28.565806 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:28.565767 2579 scope.go:117] "RemoveContainer" containerID="39213adc7905f8ae4cac4276dd79d47b3dcde732f8264b7e4a8d70d173d7b6ba"
Apr 16 19:32:29.569825 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:29.569790 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-ph52k" event={"ID":"33c6084e-53d2-4e83-85bf-1e53dca8d967","Type":"ContainerStarted","Data":"5f6f73fc019ff72aac0452f267933f148206944092839a555d025919c5e9c406"}
Apr 16 19:32:39.623228 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:39.623173 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-b455f755b-qhr8r" podUID="b4a5f6c8-877e-4610-a987-0e45f179396a" containerName="console" containerID="cri-o://a3a156a0fd2703b2c5e281dc3b248595877682f71821ea12c5801136b7240940" gracePeriod=15
Apr 16 19:32:39.896767 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:39.896746 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-b455f755b-qhr8r_b4a5f6c8-877e-4610-a987-0e45f179396a/console/0.log"
Apr 16 19:32:39.896872 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:39.896826 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-b455f755b-qhr8r"
Apr 16 19:32:40.070798 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:40.070772 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b4a5f6c8-877e-4610-a987-0e45f179396a-service-ca\") pod \"b4a5f6c8-877e-4610-a987-0e45f179396a\" (UID: \"b4a5f6c8-877e-4610-a987-0e45f179396a\") "
Apr 16 19:32:40.070933 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:40.070806 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b4a5f6c8-877e-4610-a987-0e45f179396a-console-config\") pod \"b4a5f6c8-877e-4610-a987-0e45f179396a\" (UID: \"b4a5f6c8-877e-4610-a987-0e45f179396a\") "
Apr 16 19:32:40.070933 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:40.070829 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b4a5f6c8-877e-4610-a987-0e45f179396a-console-serving-cert\") pod \"b4a5f6c8-877e-4610-a987-0e45f179396a\" (UID: \"b4a5f6c8-877e-4610-a987-0e45f179396a\") "
Apr 16 19:32:40.070933 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:40.070853 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9g44\" (UniqueName: \"kubernetes.io/projected/b4a5f6c8-877e-4610-a987-0e45f179396a-kube-api-access-h9g44\") pod \"b4a5f6c8-877e-4610-a987-0e45f179396a\" (UID: \"b4a5f6c8-877e-4610-a987-0e45f179396a\") "
Apr 16 19:32:40.071060 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:40.070990 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b4a5f6c8-877e-4610-a987-0e45f179396a-oauth-serving-cert\") pod \"b4a5f6c8-877e-4610-a987-0e45f179396a\" (UID: \"b4a5f6c8-877e-4610-a987-0e45f179396a\") "
Apr 16 19:32:40.071112 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:40.071070 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b4a5f6c8-877e-4610-a987-0e45f179396a-console-oauth-config\") pod \"b4a5f6c8-877e-4610-a987-0e45f179396a\" (UID: \"b4a5f6c8-877e-4610-a987-0e45f179396a\") "
Apr 16 19:32:40.071278 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:40.071247 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4a5f6c8-877e-4610-a987-0e45f179396a-console-config" (OuterVolumeSpecName: "console-config") pod "b4a5f6c8-877e-4610-a987-0e45f179396a" (UID: "b4a5f6c8-877e-4610-a987-0e45f179396a"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 19:32:40.071278 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:40.071253 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4a5f6c8-877e-4610-a987-0e45f179396a-service-ca" (OuterVolumeSpecName: "service-ca") pod "b4a5f6c8-877e-4610-a987-0e45f179396a" (UID: "b4a5f6c8-877e-4610-a987-0e45f179396a"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 19:32:40.071431 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:40.071350 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4a5f6c8-877e-4610-a987-0e45f179396a-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "b4a5f6c8-877e-4610-a987-0e45f179396a" (UID: "b4a5f6c8-877e-4610-a987-0e45f179396a"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 19:32:40.073110 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:40.073080 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4a5f6c8-877e-4610-a987-0e45f179396a-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "b4a5f6c8-877e-4610-a987-0e45f179396a" (UID: "b4a5f6c8-877e-4610-a987-0e45f179396a"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 19:32:40.073190 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:40.073121 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4a5f6c8-877e-4610-a987-0e45f179396a-kube-api-access-h9g44" (OuterVolumeSpecName: "kube-api-access-h9g44") pod "b4a5f6c8-877e-4610-a987-0e45f179396a" (UID: "b4a5f6c8-877e-4610-a987-0e45f179396a"). InnerVolumeSpecName "kube-api-access-h9g44". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 19:32:40.073190 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:40.073130 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4a5f6c8-877e-4610-a987-0e45f179396a-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "b4a5f6c8-877e-4610-a987-0e45f179396a" (UID: "b4a5f6c8-877e-4610-a987-0e45f179396a"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 19:32:40.172521 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:40.172451 2579 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b4a5f6c8-877e-4610-a987-0e45f179396a-console-config\") on node \"ip-10-0-129-155.ec2.internal\" DevicePath \"\""
Apr 16 19:32:40.172521 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:40.172477 2579 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b4a5f6c8-877e-4610-a987-0e45f179396a-console-serving-cert\") on node \"ip-10-0-129-155.ec2.internal\" DevicePath \"\""
Apr 16 19:32:40.172521 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:40.172487 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-h9g44\" (UniqueName: \"kubernetes.io/projected/b4a5f6c8-877e-4610-a987-0e45f179396a-kube-api-access-h9g44\") on node \"ip-10-0-129-155.ec2.internal\" DevicePath \"\""
Apr 16 19:32:40.172521 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:40.172496 2579 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b4a5f6c8-877e-4610-a987-0e45f179396a-oauth-serving-cert\") on node \"ip-10-0-129-155.ec2.internal\" DevicePath \"\""
Apr 16 19:32:40.172521 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:40.172506 2579 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b4a5f6c8-877e-4610-a987-0e45f179396a-console-oauth-config\") on node \"ip-10-0-129-155.ec2.internal\" DevicePath \"\""
Apr 16 19:32:40.172521 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:40.172516 2579 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b4a5f6c8-877e-4610-a987-0e45f179396a-service-ca\") on node \"ip-10-0-129-155.ec2.internal\" DevicePath \"\""
Apr 16 19:32:40.600478 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:40.600454 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-b455f755b-qhr8r_b4a5f6c8-877e-4610-a987-0e45f179396a/console/0.log"
Apr 16 19:32:40.600638 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:40.600493 2579 generic.go:358] "Generic (PLEG): container finished" podID="b4a5f6c8-877e-4610-a987-0e45f179396a" containerID="a3a156a0fd2703b2c5e281dc3b248595877682f71821ea12c5801136b7240940" exitCode=2
Apr 16 19:32:40.600638 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:40.600566 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-b455f755b-qhr8r"
Apr 16 19:32:40.600638 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:40.600593 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-b455f755b-qhr8r" event={"ID":"b4a5f6c8-877e-4610-a987-0e45f179396a","Type":"ContainerDied","Data":"a3a156a0fd2703b2c5e281dc3b248595877682f71821ea12c5801136b7240940"}
Apr 16 19:32:40.600830 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:40.600642 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-b455f755b-qhr8r" event={"ID":"b4a5f6c8-877e-4610-a987-0e45f179396a","Type":"ContainerDied","Data":"780d5819edb2bdfec487cc5eb1d849ad90d56c54d66185d72d47e304d69a3289"}
Apr 16 19:32:40.600830 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:40.600663 2579 scope.go:117] "RemoveContainer" containerID="a3a156a0fd2703b2c5e281dc3b248595877682f71821ea12c5801136b7240940"
Apr 16 19:32:40.608468 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:40.608448 2579 scope.go:117] "RemoveContainer" containerID="a3a156a0fd2703b2c5e281dc3b248595877682f71821ea12c5801136b7240940"
Apr 16 19:32:40.608740 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:32:40.608720 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3a156a0fd2703b2c5e281dc3b248595877682f71821ea12c5801136b7240940\": container with ID starting with a3a156a0fd2703b2c5e281dc3b248595877682f71821ea12c5801136b7240940 not found: ID does not exist" containerID="a3a156a0fd2703b2c5e281dc3b248595877682f71821ea12c5801136b7240940"
Apr 16 19:32:40.608801 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:40.608748 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3a156a0fd2703b2c5e281dc3b248595877682f71821ea12c5801136b7240940"} err="failed to get container status \"a3a156a0fd2703b2c5e281dc3b248595877682f71821ea12c5801136b7240940\": rpc error: code = NotFound desc = could not find container \"a3a156a0fd2703b2c5e281dc3b248595877682f71821ea12c5801136b7240940\": container with ID starting with a3a156a0fd2703b2c5e281dc3b248595877682f71821ea12c5801136b7240940 not found: ID does not exist"
Apr 16 19:32:40.620031 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:40.620009 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-b455f755b-qhr8r"]
Apr 16 19:32:40.625051 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:40.625029 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-b455f755b-qhr8r"]
Apr 16 19:32:40.946392 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:32:40.944863 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4a5f6c8-877e-4610-a987-0e45f179396a" path="/var/lib/kubelet/pods/b4a5f6c8-877e-4610-a987-0e45f179396a/volumes"
Apr 16 19:33:03.264594 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:33:03.264497 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-59c6968f64-dcfl2"]
Apr 16 19:33:03.265125 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:33:03.265035 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b4a5f6c8-877e-4610-a987-0e45f179396a" containerName="console"
Apr 16 19:33:03.265125 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:33:03.265057 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4a5f6c8-877e-4610-a987-0e45f179396a" containerName="console"
Apr 16 19:33:03.265125 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:33:03.265077 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c51aa13f-e270-4caa-b4b8-0133dfaf5f84" containerName="registry"
Apr 16 19:33:03.265125 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:33:03.265087 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="c51aa13f-e270-4caa-b4b8-0133dfaf5f84" containerName="registry"
Apr 16 19:33:03.265851 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:33:03.265389 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="c51aa13f-e270-4caa-b4b8-0133dfaf5f84" containerName="registry"
Apr 16 19:33:03.265851 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:33:03.265407 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="b4a5f6c8-877e-4610-a987-0e45f179396a" containerName="console"
Apr 16 19:33:03.272353 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:33:03.271888 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-59c6968f64-dcfl2"
Apr 16 19:33:03.279949 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:33:03.279925 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-59c6968f64-dcfl2"]
Apr 16 19:33:03.338691 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:33:03.338650 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/48bead9d-9575-4710-b8ae-b777dd5759c9-console-config\") pod \"console-59c6968f64-dcfl2\" (UID: \"48bead9d-9575-4710-b8ae-b777dd5759c9\") " pod="openshift-console/console-59c6968f64-dcfl2"
Apr 16 19:33:03.338691 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:33:03.338694 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/48bead9d-9575-4710-b8ae-b777dd5759c9-console-serving-cert\") pod \"console-59c6968f64-dcfl2\" (UID: \"48bead9d-9575-4710-b8ae-b777dd5759c9\") " pod="openshift-console/console-59c6968f64-dcfl2"
Apr 16 19:33:03.338864 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:33:03.338735 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48bead9d-9575-4710-b8ae-b777dd5759c9-trusted-ca-bundle\") pod \"console-59c6968f64-dcfl2\" (UID: \"48bead9d-9575-4710-b8ae-b777dd5759c9\") " pod="openshift-console/console-59c6968f64-dcfl2"
Apr 16 19:33:03.338864 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:33:03.338753 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6s9bh\" (UniqueName: \"kubernetes.io/projected/48bead9d-9575-4710-b8ae-b777dd5759c9-kube-api-access-6s9bh\") pod \"console-59c6968f64-dcfl2\" (UID: \"48bead9d-9575-4710-b8ae-b777dd5759c9\") " pod="openshift-console/console-59c6968f64-dcfl2"
Apr 16 19:33:03.338864 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:33:03.338777 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/48bead9d-9575-4710-b8ae-b777dd5759c9-service-ca\") pod \"console-59c6968f64-dcfl2\" (UID: \"48bead9d-9575-4710-b8ae-b777dd5759c9\") " pod="openshift-console/console-59c6968f64-dcfl2"
Apr 16 19:33:03.338864 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:33:03.338795 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/48bead9d-9575-4710-b8ae-b777dd5759c9-oauth-serving-cert\") pod \"console-59c6968f64-dcfl2\" (UID: \"48bead9d-9575-4710-b8ae-b777dd5759c9\") " pod="openshift-console/console-59c6968f64-dcfl2"
Apr 16 19:33:03.339004 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:33:03.338866 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/48bead9d-9575-4710-b8ae-b777dd5759c9-console-oauth-config\") pod \"console-59c6968f64-dcfl2\" (UID: \"48bead9d-9575-4710-b8ae-b777dd5759c9\") " pod="openshift-console/console-59c6968f64-dcfl2"
Apr 16 19:33:03.439707 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:33:03.439674 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/48bead9d-9575-4710-b8ae-b777dd5759c9-oauth-serving-cert\") pod \"console-59c6968f64-dcfl2\" (UID: \"48bead9d-9575-4710-b8ae-b777dd5759c9\") " pod="openshift-console/console-59c6968f64-dcfl2"
Apr 16 19:33:03.439869 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:33:03.439732 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/48bead9d-9575-4710-b8ae-b777dd5759c9-console-oauth-config\") pod \"console-59c6968f64-dcfl2\" (UID: \"48bead9d-9575-4710-b8ae-b777dd5759c9\") " pod="openshift-console/console-59c6968f64-dcfl2"
Apr 16 19:33:03.439869 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:33:03.439770 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/48bead9d-9575-4710-b8ae-b777dd5759c9-console-config\") pod \"console-59c6968f64-dcfl2\" (UID: \"48bead9d-9575-4710-b8ae-b777dd5759c9\") " pod="openshift-console/console-59c6968f64-dcfl2"
Apr 16 19:33:03.439869 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:33:03.439797 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/48bead9d-9575-4710-b8ae-b777dd5759c9-console-serving-cert\") pod \"console-59c6968f64-dcfl2\" (UID: \"48bead9d-9575-4710-b8ae-b777dd5759c9\") " pod="openshift-console/console-59c6968f64-dcfl2"
Apr 16 19:33:03.439869 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:33:03.439834 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48bead9d-9575-4710-b8ae-b777dd5759c9-trusted-ca-bundle\") pod \"console-59c6968f64-dcfl2\" (UID: \"48bead9d-9575-4710-b8ae-b777dd5759c9\") " pod="openshift-console/console-59c6968f64-dcfl2"
Apr 16 19:33:03.439869 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:33:03.439860 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6s9bh\" (UniqueName: \"kubernetes.io/projected/48bead9d-9575-4710-b8ae-b777dd5759c9-kube-api-access-6s9bh\") pod \"console-59c6968f64-dcfl2\" (UID: \"48bead9d-9575-4710-b8ae-b777dd5759c9\") " pod="openshift-console/console-59c6968f64-dcfl2"
Apr 16 19:33:03.440113 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:33:03.439901 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/48bead9d-9575-4710-b8ae-b777dd5759c9-service-ca\") pod \"console-59c6968f64-dcfl2\" (UID: \"48bead9d-9575-4710-b8ae-b777dd5759c9\") " pod="openshift-console/console-59c6968f64-dcfl2"
Apr 16 19:33:03.440547 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:33:03.440521 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/48bead9d-9575-4710-b8ae-b777dd5759c9-oauth-serving-cert\") pod \"console-59c6968f64-dcfl2\" (UID: \"48bead9d-9575-4710-b8ae-b777dd5759c9\") " pod="openshift-console/console-59c6968f64-dcfl2"
Apr 16 19:33:03.440671 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:33:03.440521 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/48bead9d-9575-4710-b8ae-b777dd5759c9-console-config\") pod \"console-59c6968f64-dcfl2\" (UID: \"48bead9d-9575-4710-b8ae-b777dd5759c9\") " pod="openshift-console/console-59c6968f64-dcfl2"
Apr 16 19:33:03.440764 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:33:03.440739 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48bead9d-9575-4710-b8ae-b777dd5759c9-trusted-ca-bundle\") pod \"console-59c6968f64-dcfl2\" (UID: \"48bead9d-9575-4710-b8ae-b777dd5759c9\") " pod="openshift-console/console-59c6968f64-dcfl2"
Apr 16 19:33:03.440968 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:33:03.440944 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/48bead9d-9575-4710-b8ae-b777dd5759c9-service-ca\") pod \"console-59c6968f64-dcfl2\" (UID: \"48bead9d-9575-4710-b8ae-b777dd5759c9\") " pod="openshift-console/console-59c6968f64-dcfl2"
Apr 16 19:33:03.442782 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:33:03.442758 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/48bead9d-9575-4710-b8ae-b777dd5759c9-console-serving-cert\") pod \"console-59c6968f64-dcfl2\" (UID: \"48bead9d-9575-4710-b8ae-b777dd5759c9\") " pod="openshift-console/console-59c6968f64-dcfl2"
Apr 16 19:33:03.443360 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:33:03.443331 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/48bead9d-9575-4710-b8ae-b777dd5759c9-console-oauth-config\") pod \"console-59c6968f64-dcfl2\" (UID: \"48bead9d-9575-4710-b8ae-b777dd5759c9\") " pod="openshift-console/console-59c6968f64-dcfl2"
Apr 16 19:33:03.448114 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:33:03.448097 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6s9bh\" (UniqueName: \"kubernetes.io/projected/48bead9d-9575-4710-b8ae-b777dd5759c9-kube-api-access-6s9bh\") pod \"console-59c6968f64-dcfl2\" (UID: \"48bead9d-9575-4710-b8ae-b777dd5759c9\") " pod="openshift-console/console-59c6968f64-dcfl2"
Apr 16 19:33:03.583733 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:33:03.583667 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-59c6968f64-dcfl2"
Apr 16 19:33:03.711177 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:33:03.711149 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-59c6968f64-dcfl2"]
Apr 16 19:33:03.713307 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:33:03.713281 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48bead9d_9575_4710_b8ae_b777dd5759c9.slice/crio-6f5ff0faae3a2f254b58be1cd606036bb1c20672dcaf01d79523e3f552fceb32 WatchSource:0}: Error finding container 6f5ff0faae3a2f254b58be1cd606036bb1c20672dcaf01d79523e3f552fceb32: Status 404 returned error can't find the container with id 6f5ff0faae3a2f254b58be1cd606036bb1c20672dcaf01d79523e3f552fceb32
Apr 16 19:33:04.673025 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:33:04.672979 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-59c6968f64-dcfl2" event={"ID":"48bead9d-9575-4710-b8ae-b777dd5759c9","Type":"ContainerStarted","Data":"585bf0826b52da6651cb98729191c80c859a53ef3e9c82362786cd14a52ff752"}
Apr 16 19:33:04.673025 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:33:04.673031 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-59c6968f64-dcfl2" event={"ID":"48bead9d-9575-4710-b8ae-b777dd5759c9","Type":"ContainerStarted","Data":"6f5ff0faae3a2f254b58be1cd606036bb1c20672dcaf01d79523e3f552fceb32"}
Apr 16 19:33:04.690974 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:33:04.690916 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-59c6968f64-dcfl2" podStartSLOduration=1.6908989220000001 podStartE2EDuration="1.690898922s" podCreationTimestamp="2026-04-16 19:33:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:33:04.688976532 +0000 UTC m=+168.265754743" watchObservedRunningTime="2026-04-16 19:33:04.690898922 +0000 UTC m=+168.267677131"
Apr 16 19:33:13.584531 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:33:13.584499 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-59c6968f64-dcfl2"
Apr 16 19:33:13.584990 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:33:13.584589 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-59c6968f64-dcfl2"
Apr 16 19:33:13.589079 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:33:13.589059 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-59c6968f64-dcfl2"
Apr 16 19:33:13.706004 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:33:13.705980 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-59c6968f64-dcfl2"
Apr 16 19:33:13.750418 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:33:13.750389 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-66ffc66448-mg8fh"]
Apr 16 19:33:38.776291 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:33:38.776225 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-66ffc66448-mg8fh" podUID="b0ebd625-b7ea-4f5d-aae2-b364ad10a29d" containerName="console" containerID="cri-o://c1662eebedd70787dd0d0f464079c7e193b4ee4bed86261ed2bd2a7c2b7c093e" gracePeriod=15
Apr 16 19:33:39.006186 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:33:39.006165 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-66ffc66448-mg8fh_b0ebd625-b7ea-4f5d-aae2-b364ad10a29d/console/0.log"
Apr 16 19:33:39.006303 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:33:39.006239 2579 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-console/console-66ffc66448-mg8fh" Apr 16 19:33:39.081438 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:33:39.081381 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b0ebd625-b7ea-4f5d-aae2-b364ad10a29d-console-oauth-config\") pod \"b0ebd625-b7ea-4f5d-aae2-b364ad10a29d\" (UID: \"b0ebd625-b7ea-4f5d-aae2-b364ad10a29d\") " Apr 16 19:33:39.081438 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:33:39.081420 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b0ebd625-b7ea-4f5d-aae2-b364ad10a29d-console-config\") pod \"b0ebd625-b7ea-4f5d-aae2-b364ad10a29d\" (UID: \"b0ebd625-b7ea-4f5d-aae2-b364ad10a29d\") " Apr 16 19:33:39.081601 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:33:39.081448 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b0ebd625-b7ea-4f5d-aae2-b364ad10a29d-trusted-ca-bundle\") pod \"b0ebd625-b7ea-4f5d-aae2-b364ad10a29d\" (UID: \"b0ebd625-b7ea-4f5d-aae2-b364ad10a29d\") " Apr 16 19:33:39.081601 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:33:39.081467 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b0ebd625-b7ea-4f5d-aae2-b364ad10a29d-service-ca\") pod \"b0ebd625-b7ea-4f5d-aae2-b364ad10a29d\" (UID: \"b0ebd625-b7ea-4f5d-aae2-b364ad10a29d\") " Apr 16 19:33:39.081707 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:33:39.081609 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b0ebd625-b7ea-4f5d-aae2-b364ad10a29d-oauth-serving-cert\") pod \"b0ebd625-b7ea-4f5d-aae2-b364ad10a29d\" (UID: \"b0ebd625-b7ea-4f5d-aae2-b364ad10a29d\") " Apr 16 19:33:39.081707 ip-10-0-129-155 
kubenswrapper[2579]: I0416 19:33:39.081659 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjzvv\" (UniqueName: \"kubernetes.io/projected/b0ebd625-b7ea-4f5d-aae2-b364ad10a29d-kube-api-access-pjzvv\") pod \"b0ebd625-b7ea-4f5d-aae2-b364ad10a29d\" (UID: \"b0ebd625-b7ea-4f5d-aae2-b364ad10a29d\") " Apr 16 19:33:39.081800 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:33:39.081703 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b0ebd625-b7ea-4f5d-aae2-b364ad10a29d-console-serving-cert\") pod \"b0ebd625-b7ea-4f5d-aae2-b364ad10a29d\" (UID: \"b0ebd625-b7ea-4f5d-aae2-b364ad10a29d\") " Apr 16 19:33:39.081800 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:33:39.081780 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0ebd625-b7ea-4f5d-aae2-b364ad10a29d-console-config" (OuterVolumeSpecName: "console-config") pod "b0ebd625-b7ea-4f5d-aae2-b364ad10a29d" (UID: "b0ebd625-b7ea-4f5d-aae2-b364ad10a29d"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:33:39.081916 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:33:39.081891 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0ebd625-b7ea-4f5d-aae2-b364ad10a29d-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "b0ebd625-b7ea-4f5d-aae2-b364ad10a29d" (UID: "b0ebd625-b7ea-4f5d-aae2-b364ad10a29d"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:33:39.081916 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:33:39.081902 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0ebd625-b7ea-4f5d-aae2-b364ad10a29d-service-ca" (OuterVolumeSpecName: "service-ca") pod "b0ebd625-b7ea-4f5d-aae2-b364ad10a29d" (UID: "b0ebd625-b7ea-4f5d-aae2-b364ad10a29d"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:33:39.082012 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:33:39.081971 2579 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/b0ebd625-b7ea-4f5d-aae2-b364ad10a29d-console-config\") on node \"ip-10-0-129-155.ec2.internal\" DevicePath \"\"" Apr 16 19:33:39.082012 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:33:39.081993 2579 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b0ebd625-b7ea-4f5d-aae2-b364ad10a29d-trusted-ca-bundle\") on node \"ip-10-0-129-155.ec2.internal\" DevicePath \"\"" Apr 16 19:33:39.082012 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:33:39.081996 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b0ebd625-b7ea-4f5d-aae2-b364ad10a29d-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "b0ebd625-b7ea-4f5d-aae2-b364ad10a29d" (UID: "b0ebd625-b7ea-4f5d-aae2-b364ad10a29d"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 19:33:39.082129 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:33:39.082010 2579 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b0ebd625-b7ea-4f5d-aae2-b364ad10a29d-service-ca\") on node \"ip-10-0-129-155.ec2.internal\" DevicePath \"\"" Apr 16 19:33:39.083675 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:33:39.083648 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0ebd625-b7ea-4f5d-aae2-b364ad10a29d-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "b0ebd625-b7ea-4f5d-aae2-b364ad10a29d" (UID: "b0ebd625-b7ea-4f5d-aae2-b364ad10a29d"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:33:39.083757 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:33:39.083696 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0ebd625-b7ea-4f5d-aae2-b364ad10a29d-kube-api-access-pjzvv" (OuterVolumeSpecName: "kube-api-access-pjzvv") pod "b0ebd625-b7ea-4f5d-aae2-b364ad10a29d" (UID: "b0ebd625-b7ea-4f5d-aae2-b364ad10a29d"). InnerVolumeSpecName "kube-api-access-pjzvv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 19:33:39.083757 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:33:39.083747 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0ebd625-b7ea-4f5d-aae2-b364ad10a29d-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "b0ebd625-b7ea-4f5d-aae2-b364ad10a29d" (UID: "b0ebd625-b7ea-4f5d-aae2-b364ad10a29d"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 19:33:39.183182 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:33:39.183161 2579 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/b0ebd625-b7ea-4f5d-aae2-b364ad10a29d-console-oauth-config\") on node \"ip-10-0-129-155.ec2.internal\" DevicePath \"\"" Apr 16 19:33:39.183182 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:33:39.183180 2579 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/b0ebd625-b7ea-4f5d-aae2-b364ad10a29d-oauth-serving-cert\") on node \"ip-10-0-129-155.ec2.internal\" DevicePath \"\"" Apr 16 19:33:39.183327 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:33:39.183189 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pjzvv\" (UniqueName: \"kubernetes.io/projected/b0ebd625-b7ea-4f5d-aae2-b364ad10a29d-kube-api-access-pjzvv\") on node \"ip-10-0-129-155.ec2.internal\" DevicePath \"\"" Apr 16 19:33:39.183327 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:33:39.183199 2579 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/b0ebd625-b7ea-4f5d-aae2-b364ad10a29d-console-serving-cert\") on node \"ip-10-0-129-155.ec2.internal\" DevicePath \"\"" Apr 16 19:33:39.781294 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:33:39.781269 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-66ffc66448-mg8fh_b0ebd625-b7ea-4f5d-aae2-b364ad10a29d/console/0.log" Apr 16 19:33:39.781670 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:33:39.781309 2579 generic.go:358] "Generic (PLEG): container finished" podID="b0ebd625-b7ea-4f5d-aae2-b364ad10a29d" containerID="c1662eebedd70787dd0d0f464079c7e193b4ee4bed86261ed2bd2a7c2b7c093e" exitCode=2 Apr 16 19:33:39.781670 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:33:39.781375 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/console-66ffc66448-mg8fh" event={"ID":"b0ebd625-b7ea-4f5d-aae2-b364ad10a29d","Type":"ContainerDied","Data":"c1662eebedd70787dd0d0f464079c7e193b4ee4bed86261ed2bd2a7c2b7c093e"} Apr 16 19:33:39.781670 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:33:39.781390 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-66ffc66448-mg8fh" Apr 16 19:33:39.781670 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:33:39.781404 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-66ffc66448-mg8fh" event={"ID":"b0ebd625-b7ea-4f5d-aae2-b364ad10a29d","Type":"ContainerDied","Data":"f3f33fae019585f127cb06bf4bf6d69cfe2c513c2636d5a1fed3ad0f8b8838ea"} Apr 16 19:33:39.781670 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:33:39.781423 2579 scope.go:117] "RemoveContainer" containerID="c1662eebedd70787dd0d0f464079c7e193b4ee4bed86261ed2bd2a7c2b7c093e" Apr 16 19:33:39.790578 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:33:39.790553 2579 scope.go:117] "RemoveContainer" containerID="c1662eebedd70787dd0d0f464079c7e193b4ee4bed86261ed2bd2a7c2b7c093e" Apr 16 19:33:39.790888 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:33:39.790865 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1662eebedd70787dd0d0f464079c7e193b4ee4bed86261ed2bd2a7c2b7c093e\": container with ID starting with c1662eebedd70787dd0d0f464079c7e193b4ee4bed86261ed2bd2a7c2b7c093e not found: ID does not exist" containerID="c1662eebedd70787dd0d0f464079c7e193b4ee4bed86261ed2bd2a7c2b7c093e" Apr 16 19:33:39.790934 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:33:39.790896 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1662eebedd70787dd0d0f464079c7e193b4ee4bed86261ed2bd2a7c2b7c093e"} err="failed to get container status \"c1662eebedd70787dd0d0f464079c7e193b4ee4bed86261ed2bd2a7c2b7c093e\": rpc error: code = 
NotFound desc = could not find container \"c1662eebedd70787dd0d0f464079c7e193b4ee4bed86261ed2bd2a7c2b7c093e\": container with ID starting with c1662eebedd70787dd0d0f464079c7e193b4ee4bed86261ed2bd2a7c2b7c093e not found: ID does not exist" Apr 16 19:33:39.803083 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:33:39.803058 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-66ffc66448-mg8fh"] Apr 16 19:33:39.808269 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:33:39.808245 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-66ffc66448-mg8fh"] Apr 16 19:33:40.942303 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:33:40.942271 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0ebd625-b7ea-4f5d-aae2-b364ad10a29d" path="/var/lib/kubelet/pods/b0ebd625-b7ea-4f5d-aae2-b364ad10a29d/volumes" Apr 16 19:34:17.077419 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:17.077355 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-5d6bbd894d-d9m49"] Apr 16 19:34:17.077769 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:17.077753 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b0ebd625-b7ea-4f5d-aae2-b364ad10a29d" containerName="console" Apr 16 19:34:17.077810 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:17.077773 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0ebd625-b7ea-4f5d-aae2-b364ad10a29d" containerName="console" Apr 16 19:34:17.077867 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:17.077856 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="b0ebd625-b7ea-4f5d-aae2-b364ad10a29d" containerName="console" Apr 16 19:34:17.081862 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:17.081842 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5d6bbd894d-d9m49" Apr 16 19:34:17.092591 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:17.092566 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5d6bbd894d-d9m49"] Apr 16 19:34:17.129063 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:17.129044 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6tcf\" (UniqueName: \"kubernetes.io/projected/e785553f-df01-4e45-b876-2c6f1cdee6ef-kube-api-access-w6tcf\") pod \"console-5d6bbd894d-d9m49\" (UID: \"e785553f-df01-4e45-b876-2c6f1cdee6ef\") " pod="openshift-console/console-5d6bbd894d-d9m49" Apr 16 19:34:17.129171 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:17.129090 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e785553f-df01-4e45-b876-2c6f1cdee6ef-console-serving-cert\") pod \"console-5d6bbd894d-d9m49\" (UID: \"e785553f-df01-4e45-b876-2c6f1cdee6ef\") " pod="openshift-console/console-5d6bbd894d-d9m49" Apr 16 19:34:17.129171 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:17.129110 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e785553f-df01-4e45-b876-2c6f1cdee6ef-oauth-serving-cert\") pod \"console-5d6bbd894d-d9m49\" (UID: \"e785553f-df01-4e45-b876-2c6f1cdee6ef\") " pod="openshift-console/console-5d6bbd894d-d9m49" Apr 16 19:34:17.129272 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:17.129165 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e785553f-df01-4e45-b876-2c6f1cdee6ef-console-oauth-config\") pod \"console-5d6bbd894d-d9m49\" (UID: \"e785553f-df01-4e45-b876-2c6f1cdee6ef\") " 
pod="openshift-console/console-5d6bbd894d-d9m49" Apr 16 19:34:17.129272 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:17.129243 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e785553f-df01-4e45-b876-2c6f1cdee6ef-trusted-ca-bundle\") pod \"console-5d6bbd894d-d9m49\" (UID: \"e785553f-df01-4e45-b876-2c6f1cdee6ef\") " pod="openshift-console/console-5d6bbd894d-d9m49" Apr 16 19:34:17.129272 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:17.129263 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e785553f-df01-4e45-b876-2c6f1cdee6ef-service-ca\") pod \"console-5d6bbd894d-d9m49\" (UID: \"e785553f-df01-4e45-b876-2c6f1cdee6ef\") " pod="openshift-console/console-5d6bbd894d-d9m49" Apr 16 19:34:17.129363 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:17.129285 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e785553f-df01-4e45-b876-2c6f1cdee6ef-console-config\") pod \"console-5d6bbd894d-d9m49\" (UID: \"e785553f-df01-4e45-b876-2c6f1cdee6ef\") " pod="openshift-console/console-5d6bbd894d-d9m49" Apr 16 19:34:17.230400 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:17.230377 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e785553f-df01-4e45-b876-2c6f1cdee6ef-console-oauth-config\") pod \"console-5d6bbd894d-d9m49\" (UID: \"e785553f-df01-4e45-b876-2c6f1cdee6ef\") " pod="openshift-console/console-5d6bbd894d-d9m49" Apr 16 19:34:17.230503 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:17.230412 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/e785553f-df01-4e45-b876-2c6f1cdee6ef-trusted-ca-bundle\") pod \"console-5d6bbd894d-d9m49\" (UID: \"e785553f-df01-4e45-b876-2c6f1cdee6ef\") " pod="openshift-console/console-5d6bbd894d-d9m49" Apr 16 19:34:17.230503 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:17.230429 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e785553f-df01-4e45-b876-2c6f1cdee6ef-service-ca\") pod \"console-5d6bbd894d-d9m49\" (UID: \"e785553f-df01-4e45-b876-2c6f1cdee6ef\") " pod="openshift-console/console-5d6bbd894d-d9m49" Apr 16 19:34:17.230503 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:17.230449 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e785553f-df01-4e45-b876-2c6f1cdee6ef-console-config\") pod \"console-5d6bbd894d-d9m49\" (UID: \"e785553f-df01-4e45-b876-2c6f1cdee6ef\") " pod="openshift-console/console-5d6bbd894d-d9m49" Apr 16 19:34:17.230599 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:17.230562 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w6tcf\" (UniqueName: \"kubernetes.io/projected/e785553f-df01-4e45-b876-2c6f1cdee6ef-kube-api-access-w6tcf\") pod \"console-5d6bbd894d-d9m49\" (UID: \"e785553f-df01-4e45-b876-2c6f1cdee6ef\") " pod="openshift-console/console-5d6bbd894d-d9m49" Apr 16 19:34:17.230648 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:17.230638 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e785553f-df01-4e45-b876-2c6f1cdee6ef-console-serving-cert\") pod \"console-5d6bbd894d-d9m49\" (UID: \"e785553f-df01-4e45-b876-2c6f1cdee6ef\") " pod="openshift-console/console-5d6bbd894d-d9m49" Apr 16 19:34:17.230685 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:17.230663 2579 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e785553f-df01-4e45-b876-2c6f1cdee6ef-oauth-serving-cert\") pod \"console-5d6bbd894d-d9m49\" (UID: \"e785553f-df01-4e45-b876-2c6f1cdee6ef\") " pod="openshift-console/console-5d6bbd894d-d9m49" Apr 16 19:34:17.231119 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:17.231092 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e785553f-df01-4e45-b876-2c6f1cdee6ef-console-config\") pod \"console-5d6bbd894d-d9m49\" (UID: \"e785553f-df01-4e45-b876-2c6f1cdee6ef\") " pod="openshift-console/console-5d6bbd894d-d9m49" Apr 16 19:34:17.231227 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:17.231130 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e785553f-df01-4e45-b876-2c6f1cdee6ef-service-ca\") pod \"console-5d6bbd894d-d9m49\" (UID: \"e785553f-df01-4e45-b876-2c6f1cdee6ef\") " pod="openshift-console/console-5d6bbd894d-d9m49" Apr 16 19:34:17.231293 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:17.231248 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e785553f-df01-4e45-b876-2c6f1cdee6ef-oauth-serving-cert\") pod \"console-5d6bbd894d-d9m49\" (UID: \"e785553f-df01-4e45-b876-2c6f1cdee6ef\") " pod="openshift-console/console-5d6bbd894d-d9m49" Apr 16 19:34:17.231353 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:17.231328 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e785553f-df01-4e45-b876-2c6f1cdee6ef-trusted-ca-bundle\") pod \"console-5d6bbd894d-d9m49\" (UID: \"e785553f-df01-4e45-b876-2c6f1cdee6ef\") " pod="openshift-console/console-5d6bbd894d-d9m49" Apr 16 19:34:17.232864 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:17.232831 2579 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e785553f-df01-4e45-b876-2c6f1cdee6ef-console-oauth-config\") pod \"console-5d6bbd894d-d9m49\" (UID: \"e785553f-df01-4e45-b876-2c6f1cdee6ef\") " pod="openshift-console/console-5d6bbd894d-d9m49" Apr 16 19:34:17.233062 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:17.233043 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e785553f-df01-4e45-b876-2c6f1cdee6ef-console-serving-cert\") pod \"console-5d6bbd894d-d9m49\" (UID: \"e785553f-df01-4e45-b876-2c6f1cdee6ef\") " pod="openshift-console/console-5d6bbd894d-d9m49" Apr 16 19:34:17.242658 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:17.242631 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6tcf\" (UniqueName: \"kubernetes.io/projected/e785553f-df01-4e45-b876-2c6f1cdee6ef-kube-api-access-w6tcf\") pod \"console-5d6bbd894d-d9m49\" (UID: \"e785553f-df01-4e45-b876-2c6f1cdee6ef\") " pod="openshift-console/console-5d6bbd894d-d9m49" Apr 16 19:34:17.392836 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:17.392789 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5d6bbd894d-d9m49" Apr 16 19:34:17.517660 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:17.517632 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5d6bbd894d-d9m49"] Apr 16 19:34:17.519936 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:34:17.519908 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode785553f_df01_4e45_b876_2c6f1cdee6ef.slice/crio-18e50c95aece30ca18b8befabc896ae371a89e521a816a7b5170819835b98ba0 WatchSource:0}: Error finding container 18e50c95aece30ca18b8befabc896ae371a89e521a816a7b5170819835b98ba0: Status 404 returned error can't find the container with id 18e50c95aece30ca18b8befabc896ae371a89e521a816a7b5170819835b98ba0 Apr 16 19:34:17.888697 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:17.888667 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d6bbd894d-d9m49" event={"ID":"e785553f-df01-4e45-b876-2c6f1cdee6ef","Type":"ContainerStarted","Data":"738bca409eedfcec8ade37a86158798820b0a845ee18226924b6b92a23a44097"} Apr 16 19:34:17.888877 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:17.888707 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d6bbd894d-d9m49" event={"ID":"e785553f-df01-4e45-b876-2c6f1cdee6ef","Type":"ContainerStarted","Data":"18e50c95aece30ca18b8befabc896ae371a89e521a816a7b5170819835b98ba0"} Apr 16 19:34:17.909999 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:17.909952 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5d6bbd894d-d9m49" podStartSLOduration=0.909937599 podStartE2EDuration="909.937599ms" podCreationTimestamp="2026-04-16 19:34:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:34:17.907898272 +0000 UTC 
m=+241.484676509" watchObservedRunningTime="2026-04-16 19:34:17.909937599 +0000 UTC m=+241.486715808" Apr 16 19:34:27.393359 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:27.393325 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5d6bbd894d-d9m49" Apr 16 19:34:27.393359 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:27.393361 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-5d6bbd894d-d9m49" Apr 16 19:34:27.398893 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:27.398869 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5d6bbd894d-d9m49" Apr 16 19:34:27.919286 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:27.919261 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5d6bbd894d-d9m49" Apr 16 19:34:27.966937 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:27.966910 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-59c6968f64-dcfl2"] Apr 16 19:34:52.987679 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:52.987577 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-59c6968f64-dcfl2" podUID="48bead9d-9575-4710-b8ae-b777dd5759c9" containerName="console" containerID="cri-o://585bf0826b52da6651cb98729191c80c859a53ef3e9c82362786cd14a52ff752" gracePeriod=15 Apr 16 19:34:53.219639 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:53.219614 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-59c6968f64-dcfl2_48bead9d-9575-4710-b8ae-b777dd5759c9/console/0.log" Apr 16 19:34:53.219779 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:53.219687 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-59c6968f64-dcfl2"
Apr 16 19:34:53.388645 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:53.388560 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6s9bh\" (UniqueName: \"kubernetes.io/projected/48bead9d-9575-4710-b8ae-b777dd5759c9-kube-api-access-6s9bh\") pod \"48bead9d-9575-4710-b8ae-b777dd5759c9\" (UID: \"48bead9d-9575-4710-b8ae-b777dd5759c9\") "
Apr 16 19:34:53.388645 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:53.388611 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/48bead9d-9575-4710-b8ae-b777dd5759c9-console-config\") pod \"48bead9d-9575-4710-b8ae-b777dd5759c9\" (UID: \"48bead9d-9575-4710-b8ae-b777dd5759c9\") "
Apr 16 19:34:53.388645 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:53.388640 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/48bead9d-9575-4710-b8ae-b777dd5759c9-console-serving-cert\") pod \"48bead9d-9575-4710-b8ae-b777dd5759c9\" (UID: \"48bead9d-9575-4710-b8ae-b777dd5759c9\") "
Apr 16 19:34:53.388911 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:53.388662 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48bead9d-9575-4710-b8ae-b777dd5759c9-trusted-ca-bundle\") pod \"48bead9d-9575-4710-b8ae-b777dd5759c9\" (UID: \"48bead9d-9575-4710-b8ae-b777dd5759c9\") "
Apr 16 19:34:53.388911 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:53.388691 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/48bead9d-9575-4710-b8ae-b777dd5759c9-oauth-serving-cert\") pod \"48bead9d-9575-4710-b8ae-b777dd5759c9\" (UID: \"48bead9d-9575-4710-b8ae-b777dd5759c9\") "
Apr 16 19:34:53.388911 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:53.388711 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/48bead9d-9575-4710-b8ae-b777dd5759c9-console-oauth-config\") pod \"48bead9d-9575-4710-b8ae-b777dd5759c9\" (UID: \"48bead9d-9575-4710-b8ae-b777dd5759c9\") "
Apr 16 19:34:53.388911 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:53.388746 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/48bead9d-9575-4710-b8ae-b777dd5759c9-service-ca\") pod \"48bead9d-9575-4710-b8ae-b777dd5759c9\" (UID: \"48bead9d-9575-4710-b8ae-b777dd5759c9\") "
Apr 16 19:34:53.389348 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:53.389171 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48bead9d-9575-4710-b8ae-b777dd5759c9-console-config" (OuterVolumeSpecName: "console-config") pod "48bead9d-9575-4710-b8ae-b777dd5759c9" (UID: "48bead9d-9575-4710-b8ae-b777dd5759c9"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 19:34:53.389348 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:53.389200 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48bead9d-9575-4710-b8ae-b777dd5759c9-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "48bead9d-9575-4710-b8ae-b777dd5759c9" (UID: "48bead9d-9575-4710-b8ae-b777dd5759c9"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 19:34:53.389348 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:53.389318 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48bead9d-9575-4710-b8ae-b777dd5759c9-service-ca" (OuterVolumeSpecName: "service-ca") pod "48bead9d-9575-4710-b8ae-b777dd5759c9" (UID: "48bead9d-9575-4710-b8ae-b777dd5759c9"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 19:34:53.389348 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:53.389314 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48bead9d-9575-4710-b8ae-b777dd5759c9-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "48bead9d-9575-4710-b8ae-b777dd5759c9" (UID: "48bead9d-9575-4710-b8ae-b777dd5759c9"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 19:34:53.390714 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:53.390692 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48bead9d-9575-4710-b8ae-b777dd5759c9-kube-api-access-6s9bh" (OuterVolumeSpecName: "kube-api-access-6s9bh") pod "48bead9d-9575-4710-b8ae-b777dd5759c9" (UID: "48bead9d-9575-4710-b8ae-b777dd5759c9"). InnerVolumeSpecName "kube-api-access-6s9bh". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 19:34:53.390813 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:53.390800 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48bead9d-9575-4710-b8ae-b777dd5759c9-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "48bead9d-9575-4710-b8ae-b777dd5759c9" (UID: "48bead9d-9575-4710-b8ae-b777dd5759c9"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 19:34:53.390873 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:53.390822 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48bead9d-9575-4710-b8ae-b777dd5759c9-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "48bead9d-9575-4710-b8ae-b777dd5759c9" (UID: "48bead9d-9575-4710-b8ae-b777dd5759c9"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 19:34:53.489762 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:53.489731 2579 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/48bead9d-9575-4710-b8ae-b777dd5759c9-service-ca\") on node \"ip-10-0-129-155.ec2.internal\" DevicePath \"\""
Apr 16 19:34:53.489762 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:53.489756 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6s9bh\" (UniqueName: \"kubernetes.io/projected/48bead9d-9575-4710-b8ae-b777dd5759c9-kube-api-access-6s9bh\") on node \"ip-10-0-129-155.ec2.internal\" DevicePath \"\""
Apr 16 19:34:53.489762 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:53.489768 2579 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/48bead9d-9575-4710-b8ae-b777dd5759c9-console-config\") on node \"ip-10-0-129-155.ec2.internal\" DevicePath \"\""
Apr 16 19:34:53.489952 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:53.489777 2579 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/48bead9d-9575-4710-b8ae-b777dd5759c9-console-serving-cert\") on node \"ip-10-0-129-155.ec2.internal\" DevicePath \"\""
Apr 16 19:34:53.489952 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:53.489786 2579 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/48bead9d-9575-4710-b8ae-b777dd5759c9-trusted-ca-bundle\") on node \"ip-10-0-129-155.ec2.internal\" DevicePath \"\""
Apr 16 19:34:53.489952 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:53.489794 2579 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/48bead9d-9575-4710-b8ae-b777dd5759c9-oauth-serving-cert\") on node \"ip-10-0-129-155.ec2.internal\" DevicePath \"\""
Apr 16 19:34:53.489952 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:53.489804 2579 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/48bead9d-9575-4710-b8ae-b777dd5759c9-console-oauth-config\") on node \"ip-10-0-129-155.ec2.internal\" DevicePath \"\""
Apr 16 19:34:53.985356 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:53.985331 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-59c6968f64-dcfl2_48bead9d-9575-4710-b8ae-b777dd5759c9/console/0.log"
Apr 16 19:34:53.985528 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:53.985370 2579 generic.go:358] "Generic (PLEG): container finished" podID="48bead9d-9575-4710-b8ae-b777dd5759c9" containerID="585bf0826b52da6651cb98729191c80c859a53ef3e9c82362786cd14a52ff752" exitCode=2
Apr 16 19:34:53.985528 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:53.985404 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-59c6968f64-dcfl2" event={"ID":"48bead9d-9575-4710-b8ae-b777dd5759c9","Type":"ContainerDied","Data":"585bf0826b52da6651cb98729191c80c859a53ef3e9c82362786cd14a52ff752"}
Apr 16 19:34:53.985528 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:53.985425 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-59c6968f64-dcfl2" event={"ID":"48bead9d-9575-4710-b8ae-b777dd5759c9","Type":"ContainerDied","Data":"6f5ff0faae3a2f254b58be1cd606036bb1c20672dcaf01d79523e3f552fceb32"}
Apr 16 19:34:53.985528 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:53.985431 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-59c6968f64-dcfl2"
Apr 16 19:34:53.985528 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:53.985439 2579 scope.go:117] "RemoveContainer" containerID="585bf0826b52da6651cb98729191c80c859a53ef3e9c82362786cd14a52ff752"
Apr 16 19:34:53.993704 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:53.993536 2579 scope.go:117] "RemoveContainer" containerID="585bf0826b52da6651cb98729191c80c859a53ef3e9c82362786cd14a52ff752"
Apr 16 19:34:53.993942 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:34:53.993805 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"585bf0826b52da6651cb98729191c80c859a53ef3e9c82362786cd14a52ff752\": container with ID starting with 585bf0826b52da6651cb98729191c80c859a53ef3e9c82362786cd14a52ff752 not found: ID does not exist" containerID="585bf0826b52da6651cb98729191c80c859a53ef3e9c82362786cd14a52ff752"
Apr 16 19:34:53.993942 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:53.993830 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"585bf0826b52da6651cb98729191c80c859a53ef3e9c82362786cd14a52ff752"} err="failed to get container status \"585bf0826b52da6651cb98729191c80c859a53ef3e9c82362786cd14a52ff752\": rpc error: code = NotFound desc = could not find container \"585bf0826b52da6651cb98729191c80c859a53ef3e9c82362786cd14a52ff752\": container with ID starting with 585bf0826b52da6651cb98729191c80c859a53ef3e9c82362786cd14a52ff752 not found: ID does not exist"
Apr 16 19:34:54.006629 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:54.006604 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-59c6968f64-dcfl2"]
Apr 16 19:34:54.016120 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:54.016100 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-59c6968f64-dcfl2"]
Apr 16 19:34:54.945394 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:54.945358 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48bead9d-9575-4710-b8ae-b777dd5759c9" path="/var/lib/kubelet/pods/48bead9d-9575-4710-b8ae-b777dd5759c9/volumes"
Apr 16 19:34:59.581926 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:59.581891 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5gphg2"]
Apr 16 19:34:59.582400 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:59.582173 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="48bead9d-9575-4710-b8ae-b777dd5759c9" containerName="console"
Apr 16 19:34:59.582400 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:59.582184 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="48bead9d-9575-4710-b8ae-b777dd5759c9" containerName="console"
Apr 16 19:34:59.582400 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:59.582247 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="48bead9d-9575-4710-b8ae-b777dd5759c9" containerName="console"
Apr 16 19:34:59.585040 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:59.585022 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5gphg2"
Apr 16 19:34:59.587685 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:59.587665 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-97mkl\""
Apr 16 19:34:59.587786 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:59.587710 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 16 19:34:59.588797 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:59.588771 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 16 19:34:59.595252 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:59.595230 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5gphg2"]
Apr 16 19:34:59.640841 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:59.640807 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bee4208a-8b56-4a8d-8aed-e3c06b434dcc-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5gphg2\" (UID: \"bee4208a-8b56-4a8d-8aed-e3c06b434dcc\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5gphg2"
Apr 16 19:34:59.640950 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:59.640850 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bee4208a-8b56-4a8d-8aed-e3c06b434dcc-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5gphg2\" (UID: \"bee4208a-8b56-4a8d-8aed-e3c06b434dcc\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5gphg2"
Apr 16 19:34:59.640950 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:59.640899 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmrbg\" (UniqueName: \"kubernetes.io/projected/bee4208a-8b56-4a8d-8aed-e3c06b434dcc-kube-api-access-mmrbg\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5gphg2\" (UID: \"bee4208a-8b56-4a8d-8aed-e3c06b434dcc\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5gphg2"
Apr 16 19:34:59.742009 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:59.741981 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mmrbg\" (UniqueName: \"kubernetes.io/projected/bee4208a-8b56-4a8d-8aed-e3c06b434dcc-kube-api-access-mmrbg\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5gphg2\" (UID: \"bee4208a-8b56-4a8d-8aed-e3c06b434dcc\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5gphg2"
Apr 16 19:34:59.742112 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:59.742043 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bee4208a-8b56-4a8d-8aed-e3c06b434dcc-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5gphg2\" (UID: \"bee4208a-8b56-4a8d-8aed-e3c06b434dcc\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5gphg2"
Apr 16 19:34:59.742112 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:59.742062 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bee4208a-8b56-4a8d-8aed-e3c06b434dcc-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5gphg2\" (UID: \"bee4208a-8b56-4a8d-8aed-e3c06b434dcc\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5gphg2"
Apr 16 19:34:59.742385 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:59.742369 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bee4208a-8b56-4a8d-8aed-e3c06b434dcc-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5gphg2\" (UID: \"bee4208a-8b56-4a8d-8aed-e3c06b434dcc\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5gphg2"
Apr 16 19:34:59.742464 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:59.742446 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bee4208a-8b56-4a8d-8aed-e3c06b434dcc-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5gphg2\" (UID: \"bee4208a-8b56-4a8d-8aed-e3c06b434dcc\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5gphg2"
Apr 16 19:34:59.750956 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:59.750932 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmrbg\" (UniqueName: \"kubernetes.io/projected/bee4208a-8b56-4a8d-8aed-e3c06b434dcc-kube-api-access-mmrbg\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5gphg2\" (UID: \"bee4208a-8b56-4a8d-8aed-e3c06b434dcc\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5gphg2"
Apr 16 19:34:59.894151 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:34:59.894076 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5gphg2"
Apr 16 19:35:00.009504 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:00.009474 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5gphg2"]
Apr 16 19:35:00.012375 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:35:00.012348 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbee4208a_8b56_4a8d_8aed_e3c06b434dcc.slice/crio-d4aaba8e43e4cb047c201efd8357ab1e9399ff30cf048461f5eef4a5caa63fc5 WatchSource:0}: Error finding container d4aaba8e43e4cb047c201efd8357ab1e9399ff30cf048461f5eef4a5caa63fc5: Status 404 returned error can't find the container with id d4aaba8e43e4cb047c201efd8357ab1e9399ff30cf048461f5eef4a5caa63fc5
Apr 16 19:35:01.005245 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:01.005191 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5gphg2" event={"ID":"bee4208a-8b56-4a8d-8aed-e3c06b434dcc","Type":"ContainerStarted","Data":"d4aaba8e43e4cb047c201efd8357ab1e9399ff30cf048461f5eef4a5caa63fc5"}
Apr 16 19:35:06.023021 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:06.022992 2579 generic.go:358] "Generic (PLEG): container finished" podID="bee4208a-8b56-4a8d-8aed-e3c06b434dcc" containerID="4bcd2789d52cf33aee86e57a84883065eca670dd0f5961eb8297de4743baf434" exitCode=0
Apr 16 19:35:06.023358 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:06.023038 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5gphg2" event={"ID":"bee4208a-8b56-4a8d-8aed-e3c06b434dcc","Type":"ContainerDied","Data":"4bcd2789d52cf33aee86e57a84883065eca670dd0f5961eb8297de4743baf434"}
Apr 16 19:35:09.033142 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:09.033110 2579 generic.go:358] "Generic (PLEG): container finished" podID="bee4208a-8b56-4a8d-8aed-e3c06b434dcc" containerID="155a39f1618f07db1a302cf598475e4c668fb6df94e201edfb592870b4f3cb32" exitCode=0
Apr 16 19:35:09.033510 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:09.033171 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5gphg2" event={"ID":"bee4208a-8b56-4a8d-8aed-e3c06b434dcc","Type":"ContainerDied","Data":"155a39f1618f07db1a302cf598475e4c668fb6df94e201edfb592870b4f3cb32"}
Apr 16 19:35:16.802779 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:16.802719 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ckt7c_d5d951a1-3e60-4517-ae8b-75bba19200c9/console-operator/2.log"
Apr 16 19:35:16.803174 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:16.802849 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ckt7c_d5d951a1-3e60-4517-ae8b-75bba19200c9/console-operator/2.log"
Apr 16 19:35:16.811880 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:16.811855 2579 kubelet.go:1628] "Image garbage collection succeeded"
Apr 16 19:35:17.060997 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:17.060920 2579 generic.go:358] "Generic (PLEG): container finished" podID="bee4208a-8b56-4a8d-8aed-e3c06b434dcc" containerID="f4f76c90501e5953ecc565b8bf77442f4dd7506aca758a1f098a41e7486c8189" exitCode=0
Apr 16 19:35:17.061150 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:17.061009 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5gphg2" event={"ID":"bee4208a-8b56-4a8d-8aed-e3c06b434dcc","Type":"ContainerDied","Data":"f4f76c90501e5953ecc565b8bf77442f4dd7506aca758a1f098a41e7486c8189"}
Apr 16 19:35:18.189027 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:18.189004 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5gphg2"
Apr 16 19:35:18.287776 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:18.287726 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mmrbg\" (UniqueName: \"kubernetes.io/projected/bee4208a-8b56-4a8d-8aed-e3c06b434dcc-kube-api-access-mmrbg\") pod \"bee4208a-8b56-4a8d-8aed-e3c06b434dcc\" (UID: \"bee4208a-8b56-4a8d-8aed-e3c06b434dcc\") "
Apr 16 19:35:18.287966 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:18.287788 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bee4208a-8b56-4a8d-8aed-e3c06b434dcc-util\") pod \"bee4208a-8b56-4a8d-8aed-e3c06b434dcc\" (UID: \"bee4208a-8b56-4a8d-8aed-e3c06b434dcc\") "
Apr 16 19:35:18.287966 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:18.287855 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bee4208a-8b56-4a8d-8aed-e3c06b434dcc-bundle\") pod \"bee4208a-8b56-4a8d-8aed-e3c06b434dcc\" (UID: \"bee4208a-8b56-4a8d-8aed-e3c06b434dcc\") "
Apr 16 19:35:18.288530 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:18.288500 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bee4208a-8b56-4a8d-8aed-e3c06b434dcc-bundle" (OuterVolumeSpecName: "bundle") pod "bee4208a-8b56-4a8d-8aed-e3c06b434dcc" (UID: "bee4208a-8b56-4a8d-8aed-e3c06b434dcc"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 19:35:18.289907 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:18.289880 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bee4208a-8b56-4a8d-8aed-e3c06b434dcc-kube-api-access-mmrbg" (OuterVolumeSpecName: "kube-api-access-mmrbg") pod "bee4208a-8b56-4a8d-8aed-e3c06b434dcc" (UID: "bee4208a-8b56-4a8d-8aed-e3c06b434dcc"). InnerVolumeSpecName "kube-api-access-mmrbg". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 19:35:18.292173 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:18.292152 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bee4208a-8b56-4a8d-8aed-e3c06b434dcc-util" (OuterVolumeSpecName: "util") pod "bee4208a-8b56-4a8d-8aed-e3c06b434dcc" (UID: "bee4208a-8b56-4a8d-8aed-e3c06b434dcc"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 19:35:18.388669 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:18.388576 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mmrbg\" (UniqueName: \"kubernetes.io/projected/bee4208a-8b56-4a8d-8aed-e3c06b434dcc-kube-api-access-mmrbg\") on node \"ip-10-0-129-155.ec2.internal\" DevicePath \"\""
Apr 16 19:35:18.388669 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:18.388611 2579 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/bee4208a-8b56-4a8d-8aed-e3c06b434dcc-util\") on node \"ip-10-0-129-155.ec2.internal\" DevicePath \"\""
Apr 16 19:35:18.388669 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:18.388622 2579 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/bee4208a-8b56-4a8d-8aed-e3c06b434dcc-bundle\") on node \"ip-10-0-129-155.ec2.internal\" DevicePath \"\""
Apr 16 19:35:19.068521 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:19.068489 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5gphg2" event={"ID":"bee4208a-8b56-4a8d-8aed-e3c06b434dcc","Type":"ContainerDied","Data":"d4aaba8e43e4cb047c201efd8357ab1e9399ff30cf048461f5eef4a5caa63fc5"}
Apr 16 19:35:19.068521 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:19.068515 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5gphg2"
Apr 16 19:35:19.068732 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:19.068520 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4aaba8e43e4cb047c201efd8357ab1e9399ff30cf048461f5eef4a5caa63fc5"
Apr 16 19:35:21.884073 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:21.884039 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-nr8d6"]
Apr 16 19:35:21.884457 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:21.884324 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bee4208a-8b56-4a8d-8aed-e3c06b434dcc" containerName="extract"
Apr 16 19:35:21.884457 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:21.884335 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="bee4208a-8b56-4a8d-8aed-e3c06b434dcc" containerName="extract"
Apr 16 19:35:21.884457 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:21.884344 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bee4208a-8b56-4a8d-8aed-e3c06b434dcc" containerName="pull"
Apr 16 19:35:21.884457 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:21.884350 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="bee4208a-8b56-4a8d-8aed-e3c06b434dcc" containerName="pull"
Apr 16 19:35:21.884457 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:21.884368 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="bee4208a-8b56-4a8d-8aed-e3c06b434dcc" containerName="util"
Apr 16 19:35:21.884457 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:21.884378 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="bee4208a-8b56-4a8d-8aed-e3c06b434dcc" containerName="util"
Apr 16 19:35:21.884457 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:21.884439 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="bee4208a-8b56-4a8d-8aed-e3c06b434dcc" containerName="extract"
Apr 16 19:35:21.914104 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:21.914064 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-nr8d6"]
Apr 16 19:35:21.914266 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:21.914142 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-nr8d6"
Apr 16 19:35:21.917036 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:21.917017 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-g28cz\""
Apr 16 19:35:21.917152 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:21.917092 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\""
Apr 16 19:35:21.917152 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:21.917127 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\""
Apr 16 19:35:22.018528 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:22.018497 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qf29f\" (UniqueName: \"kubernetes.io/projected/22545554-2069-4726-887a-11bae159f406-kube-api-access-qf29f\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-nr8d6\" (UID: \"22545554-2069-4726-887a-11bae159f406\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-nr8d6"
Apr 16 19:35:22.018656 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:22.018560 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/22545554-2069-4726-887a-11bae159f406-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-nr8d6\" (UID: \"22545554-2069-4726-887a-11bae159f406\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-nr8d6"
Apr 16 19:35:22.119757 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:22.119723 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/22545554-2069-4726-887a-11bae159f406-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-nr8d6\" (UID: \"22545554-2069-4726-887a-11bae159f406\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-nr8d6"
Apr 16 19:35:22.119909 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:22.119782 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qf29f\" (UniqueName: \"kubernetes.io/projected/22545554-2069-4726-887a-11bae159f406-kube-api-access-qf29f\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-nr8d6\" (UID: \"22545554-2069-4726-887a-11bae159f406\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-nr8d6"
Apr 16 19:35:22.120153 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:22.120135 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/22545554-2069-4726-887a-11bae159f406-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-nr8d6\" (UID: \"22545554-2069-4726-887a-11bae159f406\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-nr8d6"
Apr 16 19:35:22.131897 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:22.131864 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qf29f\" (UniqueName: \"kubernetes.io/projected/22545554-2069-4726-887a-11bae159f406-kube-api-access-qf29f\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-nr8d6\" (UID: \"22545554-2069-4726-887a-11bae159f406\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-nr8d6"
Apr 16 19:35:22.223278 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:22.223252 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-nr8d6"
Apr 16 19:35:22.342495 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:22.342414 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-nr8d6"]
Apr 16 19:35:22.344853 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:35:22.344813 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22545554_2069_4726_887a_11bae159f406.slice/crio-d5d400c9965e9ed08dbf145acd4522233e5979eb9d4ddce7d229430f05d46a47 WatchSource:0}: Error finding container d5d400c9965e9ed08dbf145acd4522233e5979eb9d4ddce7d229430f05d46a47: Status 404 returned error can't find the container with id d5d400c9965e9ed08dbf145acd4522233e5979eb9d4ddce7d229430f05d46a47
Apr 16 19:35:22.347297 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:22.347274 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 19:35:23.081009 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:23.080977 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-nr8d6" event={"ID":"22545554-2069-4726-887a-11bae159f406","Type":"ContainerStarted","Data":"d5d400c9965e9ed08dbf145acd4522233e5979eb9d4ddce7d229430f05d46a47"}
Apr 16 19:35:26.093267 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:26.093229 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-nr8d6" event={"ID":"22545554-2069-4726-887a-11bae159f406","Type":"ContainerStarted","Data":"c7fef3ab1da25ccddc39d4ca3e67040c217daccf9946ee6d231542fd69ae639d"}
Apr 16 19:35:26.114308 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:26.114256 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-nr8d6" podStartSLOduration=2.2943816630000002 podStartE2EDuration="5.114236795s" podCreationTimestamp="2026-04-16 19:35:21 +0000 UTC" firstStartedPulling="2026-04-16 19:35:22.347410789 +0000 UTC m=+305.924188978" lastFinishedPulling="2026-04-16 19:35:25.167265907 +0000 UTC m=+308.744044110" observedRunningTime="2026-04-16 19:35:26.11213851 +0000 UTC m=+309.688916723" watchObservedRunningTime="2026-04-16 19:35:26.114236795 +0000 UTC m=+309.691015007"
Apr 16 19:35:28.339803 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:28.339768 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fzz68c"]
Apr 16 19:35:28.343299 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:28.343282 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fzz68c"
Apr 16 19:35:28.345705 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:28.345682 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-97mkl\""
Apr 16 19:35:28.345911 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:28.345896 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 16 19:35:28.346697 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:28.346677 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 16 19:35:28.350666 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:28.350643 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fzz68c"]
Apr 16 19:35:28.472379 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:28.472354 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8e663b29-048c-487e-a01f-4611d525920e-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fzz68c\" (UID: \"8e663b29-048c-487e-a01f-4611d525920e\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fzz68c"
Apr 16 19:35:28.472493 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:28.472392 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8e663b29-048c-487e-a01f-4611d525920e-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fzz68c\" (UID: \"8e663b29-048c-487e-a01f-4611d525920e\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fzz68c"
Apr 16 19:35:28.472493 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:28.472415 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gbmk\" (UniqueName: \"kubernetes.io/projected/8e663b29-048c-487e-a01f-4611d525920e-kube-api-access-6gbmk\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fzz68c\" (UID: \"8e663b29-048c-487e-a01f-4611d525920e\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fzz68c"
Apr 16 19:35:28.573121 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:28.573096 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8e663b29-048c-487e-a01f-4611d525920e-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fzz68c\" (UID: \"8e663b29-048c-487e-a01f-4611d525920e\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fzz68c"
Apr 16 19:35:28.573254 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:28.573129 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8e663b29-048c-487e-a01f-4611d525920e-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fzz68c\" (UID: \"8e663b29-048c-487e-a01f-4611d525920e\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fzz68c"
Apr 16 19:35:28.573254 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:28.573153 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6gbmk\" (UniqueName: \"kubernetes.io/projected/8e663b29-048c-487e-a01f-4611d525920e-kube-api-access-6gbmk\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fzz68c\" (UID: \"8e663b29-048c-487e-a01f-4611d525920e\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fzz68c"
Apr 16 19:35:28.573471 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:28.573449 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8e663b29-048c-487e-a01f-4611d525920e-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fzz68c\" (UID: \"8e663b29-048c-487e-a01f-4611d525920e\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fzz68c"
Apr 16 19:35:28.573561 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:28.573538 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8e663b29-048c-487e-a01f-4611d525920e-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fzz68c\" (UID: \"8e663b29-048c-487e-a01f-4611d525920e\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fzz68c"
Apr 16 19:35:28.586970 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:28.586935 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gbmk\" (UniqueName: \"kubernetes.io/projected/8e663b29-048c-487e-a01f-4611d525920e-kube-api-access-6gbmk\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fzz68c\" (UID: \"8e663b29-048c-487e-a01f-4611d525920e\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fzz68c"
Apr 16 19:35:28.653061 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:28.653012 2579 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fzz68c" Apr 16 19:35:28.772780 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:28.772753 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fzz68c"] Apr 16 19:35:28.775628 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:35:28.775602 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e663b29_048c_487e_a01f_4611d525920e.slice/crio-10da0d1c124c9b3a85c4e94d27461fe4c39e4a9253b2cc0bc4e27465935171e1 WatchSource:0}: Error finding container 10da0d1c124c9b3a85c4e94d27461fe4c39e4a9253b2cc0bc4e27465935171e1: Status 404 returned error can't find the container with id 10da0d1c124c9b3a85c4e94d27461fe4c39e4a9253b2cc0bc4e27465935171e1 Apr 16 19:35:28.859912 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:28.859879 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-f6g7k"] Apr 16 19:35:28.862968 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:28.862953 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-f6g7k" Apr 16 19:35:28.866453 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:28.866431 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-qtkhw\"" Apr 16 19:35:28.866572 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:28.866549 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 16 19:35:28.874241 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:28.874220 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 16 19:35:28.879125 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:28.879106 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-f6g7k"] Apr 16 19:35:28.975992 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:28.975967 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/800ecb87-4f31-4335-8579-a0e49c3ac031-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-f6g7k\" (UID: \"800ecb87-4f31-4335-8579-a0e49c3ac031\") " pod="cert-manager/cert-manager-webhook-597b96b99b-f6g7k" Apr 16 19:35:28.976109 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:28.975996 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvrkx\" (UniqueName: \"kubernetes.io/projected/800ecb87-4f31-4335-8579-a0e49c3ac031-kube-api-access-cvrkx\") pod \"cert-manager-webhook-597b96b99b-f6g7k\" (UID: \"800ecb87-4f31-4335-8579-a0e49c3ac031\") " pod="cert-manager/cert-manager-webhook-597b96b99b-f6g7k" Apr 16 19:35:29.076836 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:29.076798 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" 
(UniqueName: \"kubernetes.io/projected/800ecb87-4f31-4335-8579-a0e49c3ac031-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-f6g7k\" (UID: \"800ecb87-4f31-4335-8579-a0e49c3ac031\") " pod="cert-manager/cert-manager-webhook-597b96b99b-f6g7k" Apr 16 19:35:29.076954 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:29.076839 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cvrkx\" (UniqueName: \"kubernetes.io/projected/800ecb87-4f31-4335-8579-a0e49c3ac031-kube-api-access-cvrkx\") pod \"cert-manager-webhook-597b96b99b-f6g7k\" (UID: \"800ecb87-4f31-4335-8579-a0e49c3ac031\") " pod="cert-manager/cert-manager-webhook-597b96b99b-f6g7k" Apr 16 19:35:29.084552 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:29.084529 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/800ecb87-4f31-4335-8579-a0e49c3ac031-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-f6g7k\" (UID: \"800ecb87-4f31-4335-8579-a0e49c3ac031\") " pod="cert-manager/cert-manager-webhook-597b96b99b-f6g7k" Apr 16 19:35:29.085148 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:29.085128 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvrkx\" (UniqueName: \"kubernetes.io/projected/800ecb87-4f31-4335-8579-a0e49c3ac031-kube-api-access-cvrkx\") pod \"cert-manager-webhook-597b96b99b-f6g7k\" (UID: \"800ecb87-4f31-4335-8579-a0e49c3ac031\") " pod="cert-manager/cert-manager-webhook-597b96b99b-f6g7k" Apr 16 19:35:29.104451 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:29.104428 2579 generic.go:358] "Generic (PLEG): container finished" podID="8e663b29-048c-487e-a01f-4611d525920e" containerID="a87cea87e8ed8d3fbfb1d0eaabb4f0db28c64fee36214b643ad45c6b2cb38269" exitCode=0 Apr 16 19:35:29.104534 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:29.104473 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fzz68c" event={"ID":"8e663b29-048c-487e-a01f-4611d525920e","Type":"ContainerDied","Data":"a87cea87e8ed8d3fbfb1d0eaabb4f0db28c64fee36214b643ad45c6b2cb38269"} Apr 16 19:35:29.104534 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:29.104492 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fzz68c" event={"ID":"8e663b29-048c-487e-a01f-4611d525920e","Type":"ContainerStarted","Data":"10da0d1c124c9b3a85c4e94d27461fe4c39e4a9253b2cc0bc4e27465935171e1"} Apr 16 19:35:29.179740 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:29.179719 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-f6g7k" Apr 16 19:35:29.294559 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:29.294470 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-f6g7k"] Apr 16 19:35:29.297126 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:35:29.297090 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod800ecb87_4f31_4335_8579_a0e49c3ac031.slice/crio-5e513544b58abe41dd05d597bb3fe7c439d5d16f31286a93c22fb6dd5d960bcd WatchSource:0}: Error finding container 5e513544b58abe41dd05d597bb3fe7c439d5d16f31286a93c22fb6dd5d960bcd: Status 404 returned error can't find the container with id 5e513544b58abe41dd05d597bb3fe7c439d5d16f31286a93c22fb6dd5d960bcd Apr 16 19:35:30.109000 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:30.108955 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-f6g7k" event={"ID":"800ecb87-4f31-4335-8579-a0e49c3ac031","Type":"ContainerStarted","Data":"5e513544b58abe41dd05d597bb3fe7c439d5d16f31286a93c22fb6dd5d960bcd"} Apr 16 19:35:32.116930 ip-10-0-129-155 kubenswrapper[2579]: 
I0416 19:35:32.116894 2579 generic.go:358] "Generic (PLEG): container finished" podID="8e663b29-048c-487e-a01f-4611d525920e" containerID="49d168499ce917f1b142eb1e1d20fbf7402ed5df4f298db51210f2fb49d4a0ac" exitCode=0 Apr 16 19:35:32.117291 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:32.116938 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fzz68c" event={"ID":"8e663b29-048c-487e-a01f-4611d525920e","Type":"ContainerDied","Data":"49d168499ce917f1b142eb1e1d20fbf7402ed5df4f298db51210f2fb49d4a0ac"} Apr 16 19:35:33.122389 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:33.122355 2579 generic.go:358] "Generic (PLEG): container finished" podID="8e663b29-048c-487e-a01f-4611d525920e" containerID="d23f6e44d0876701e0802bc4ebd04afba295e4701d471340c30125ed43a09163" exitCode=0 Apr 16 19:35:33.122760 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:33.122421 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fzz68c" event={"ID":"8e663b29-048c-487e-a01f-4611d525920e","Type":"ContainerDied","Data":"d23f6e44d0876701e0802bc4ebd04afba295e4701d471340c30125ed43a09163"} Apr 16 19:35:34.127469 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:34.127380 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-f6g7k" event={"ID":"800ecb87-4f31-4335-8579-a0e49c3ac031","Type":"ContainerStarted","Data":"cb0e54b49603b533790468baadb1f2d4dfb9faf4c4287faefb615cf4110204c9"} Apr 16 19:35:34.127819 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:34.127545 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-597b96b99b-f6g7k" Apr 16 19:35:34.146581 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:34.146531 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="cert-manager/cert-manager-webhook-597b96b99b-f6g7k" podStartSLOduration=1.7389451 podStartE2EDuration="6.146516685s" podCreationTimestamp="2026-04-16 19:35:28 +0000 UTC" firstStartedPulling="2026-04-16 19:35:29.29895964 +0000 UTC m=+312.875737829" lastFinishedPulling="2026-04-16 19:35:33.706531212 +0000 UTC m=+317.283309414" observedRunningTime="2026-04-16 19:35:34.145626798 +0000 UTC m=+317.722405022" watchObservedRunningTime="2026-04-16 19:35:34.146516685 +0000 UTC m=+317.723294895" Apr 16 19:35:34.246584 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:34.246564 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fzz68c" Apr 16 19:35:34.320050 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:34.320024 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8e663b29-048c-487e-a01f-4611d525920e-util\") pod \"8e663b29-048c-487e-a01f-4611d525920e\" (UID: \"8e663b29-048c-487e-a01f-4611d525920e\") " Apr 16 19:35:34.320050 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:34.320052 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8e663b29-048c-487e-a01f-4611d525920e-bundle\") pod \"8e663b29-048c-487e-a01f-4611d525920e\" (UID: \"8e663b29-048c-487e-a01f-4611d525920e\") " Apr 16 19:35:34.320186 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:34.320074 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6gbmk\" (UniqueName: \"kubernetes.io/projected/8e663b29-048c-487e-a01f-4611d525920e-kube-api-access-6gbmk\") pod \"8e663b29-048c-487e-a01f-4611d525920e\" (UID: \"8e663b29-048c-487e-a01f-4611d525920e\") " Apr 16 19:35:34.320441 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:34.320419 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/empty-dir/8e663b29-048c-487e-a01f-4611d525920e-bundle" (OuterVolumeSpecName: "bundle") pod "8e663b29-048c-487e-a01f-4611d525920e" (UID: "8e663b29-048c-487e-a01f-4611d525920e"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:35:34.322050 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:34.322026 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e663b29-048c-487e-a01f-4611d525920e-kube-api-access-6gbmk" (OuterVolumeSpecName: "kube-api-access-6gbmk") pod "8e663b29-048c-487e-a01f-4611d525920e" (UID: "8e663b29-048c-487e-a01f-4611d525920e"). InnerVolumeSpecName "kube-api-access-6gbmk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 19:35:34.324196 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:34.324176 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e663b29-048c-487e-a01f-4611d525920e-util" (OuterVolumeSpecName: "util") pod "8e663b29-048c-487e-a01f-4611d525920e" (UID: "8e663b29-048c-487e-a01f-4611d525920e"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:35:34.421508 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:34.421447 2579 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8e663b29-048c-487e-a01f-4611d525920e-util\") on node \"ip-10-0-129-155.ec2.internal\" DevicePath \"\"" Apr 16 19:35:34.421508 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:34.421471 2579 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8e663b29-048c-487e-a01f-4611d525920e-bundle\") on node \"ip-10-0-129-155.ec2.internal\" DevicePath \"\"" Apr 16 19:35:34.421508 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:34.421485 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6gbmk\" (UniqueName: \"kubernetes.io/projected/8e663b29-048c-487e-a01f-4611d525920e-kube-api-access-6gbmk\") on node \"ip-10-0-129-155.ec2.internal\" DevicePath \"\"" Apr 16 19:35:35.132164 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:35.132137 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fzz68c" Apr 16 19:35:35.132164 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:35.132143 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87fzz68c" event={"ID":"8e663b29-048c-487e-a01f-4611d525920e","Type":"ContainerDied","Data":"10da0d1c124c9b3a85c4e94d27461fe4c39e4a9253b2cc0bc4e27465935171e1"} Apr 16 19:35:35.132639 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:35.132173 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="10da0d1c124c9b3a85c4e94d27461fe4c39e4a9253b2cc0bc4e27465935171e1" Apr 16 19:35:40.134881 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:40.134844 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-597b96b99b-f6g7k" Apr 16 19:35:40.644578 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:40.644542 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-h4drj"] Apr 16 19:35:40.644872 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:40.644855 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8e663b29-048c-487e-a01f-4611d525920e" containerName="extract" Apr 16 19:35:40.644930 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:40.644874 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e663b29-048c-487e-a01f-4611d525920e" containerName="extract" Apr 16 19:35:40.644930 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:40.644891 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8e663b29-048c-487e-a01f-4611d525920e" containerName="pull" Apr 16 19:35:40.644930 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:40.644896 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e663b29-048c-487e-a01f-4611d525920e" 
containerName="pull" Apr 16 19:35:40.644930 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:40.644909 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8e663b29-048c-487e-a01f-4611d525920e" containerName="util" Apr 16 19:35:40.644930 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:40.644914 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e663b29-048c-487e-a01f-4611d525920e" containerName="util" Apr 16 19:35:40.645079 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:40.644968 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="8e663b29-048c-487e-a01f-4611d525920e" containerName="extract" Apr 16 19:35:40.650343 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:40.650326 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-h4drj" Apr 16 19:35:40.652870 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:40.652834 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 16 19:35:40.652979 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:40.652834 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-zvqhh\"" Apr 16 19:35:40.653937 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:40.653914 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 16 19:35:40.655866 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:40.655842 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-h4drj"] Apr 16 19:35:40.766969 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:40.766943 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhmgd\" (UniqueName: 
\"kubernetes.io/projected/419999fc-bda6-4250-8aa0-5f0797260078-kube-api-access-zhmgd\") pod \"openshift-lws-operator-bfc7f696d-h4drj\" (UID: \"419999fc-bda6-4250-8aa0-5f0797260078\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-h4drj" Apr 16 19:35:40.767070 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:40.766977 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/419999fc-bda6-4250-8aa0-5f0797260078-tmp\") pod \"openshift-lws-operator-bfc7f696d-h4drj\" (UID: \"419999fc-bda6-4250-8aa0-5f0797260078\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-h4drj" Apr 16 19:35:40.867818 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:40.867793 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zhmgd\" (UniqueName: \"kubernetes.io/projected/419999fc-bda6-4250-8aa0-5f0797260078-kube-api-access-zhmgd\") pod \"openshift-lws-operator-bfc7f696d-h4drj\" (UID: \"419999fc-bda6-4250-8aa0-5f0797260078\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-h4drj" Apr 16 19:35:40.867920 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:40.867827 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/419999fc-bda6-4250-8aa0-5f0797260078-tmp\") pod \"openshift-lws-operator-bfc7f696d-h4drj\" (UID: \"419999fc-bda6-4250-8aa0-5f0797260078\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-h4drj" Apr 16 19:35:40.868112 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:40.868096 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/419999fc-bda6-4250-8aa0-5f0797260078-tmp\") pod \"openshift-lws-operator-bfc7f696d-h4drj\" (UID: \"419999fc-bda6-4250-8aa0-5f0797260078\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-h4drj" Apr 16 19:35:40.878862 
ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:40.878838 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhmgd\" (UniqueName: \"kubernetes.io/projected/419999fc-bda6-4250-8aa0-5f0797260078-kube-api-access-zhmgd\") pod \"openshift-lws-operator-bfc7f696d-h4drj\" (UID: \"419999fc-bda6-4250-8aa0-5f0797260078\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-h4drj" Apr 16 19:35:40.960442 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:40.960426 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-h4drj" Apr 16 19:35:41.079278 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:41.079252 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-h4drj"] Apr 16 19:35:41.081760 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:35:41.081734 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod419999fc_bda6_4250_8aa0_5f0797260078.slice/crio-36a9fd2fd7c27a0211761b0d7440bbdce50f41adcabaa53b32c198ed055029bc WatchSource:0}: Error finding container 36a9fd2fd7c27a0211761b0d7440bbdce50f41adcabaa53b32c198ed055029bc: Status 404 returned error can't find the container with id 36a9fd2fd7c27a0211761b0d7440bbdce50f41adcabaa53b32c198ed055029bc Apr 16 19:35:41.151403 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:41.151361 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-h4drj" event={"ID":"419999fc-bda6-4250-8aa0-5f0797260078","Type":"ContainerStarted","Data":"36a9fd2fd7c27a0211761b0d7440bbdce50f41adcabaa53b32c198ed055029bc"} Apr 16 19:35:44.163378 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:44.163292 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-h4drj" 
event={"ID":"419999fc-bda6-4250-8aa0-5f0797260078","Type":"ContainerStarted","Data":"9c11a8c6106087d8391ee53df08949a950f46e8adffd5781747f44b283b14fd9"} Apr 16 19:35:44.182910 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:44.182850 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-h4drj" podStartSLOduration=1.426731119 podStartE2EDuration="4.182834153s" podCreationTimestamp="2026-04-16 19:35:40 +0000 UTC" firstStartedPulling="2026-04-16 19:35:41.083084985 +0000 UTC m=+324.659863174" lastFinishedPulling="2026-04-16 19:35:43.839188016 +0000 UTC m=+327.415966208" observedRunningTime="2026-04-16 19:35:44.179667082 +0000 UTC m=+327.756445295" watchObservedRunningTime="2026-04-16 19:35:44.182834153 +0000 UTC m=+327.759612365" Apr 16 19:35:46.587468 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:46.587434 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5jb8cs"] Apr 16 19:35:46.611265 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:46.611237 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5jb8cs"] Apr 16 19:35:46.611428 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:46.611357 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5jb8cs" Apr 16 19:35:46.614864 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:46.614836 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 19:35:46.616926 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:46.616896 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 19:35:46.616926 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:46.616919 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-97mkl\"" Apr 16 19:35:46.711277 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:46.711242 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jrc4\" (UniqueName: \"kubernetes.io/projected/d537cdb1-76b6-482e-babe-5afe642f4362-kube-api-access-6jrc4\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5jb8cs\" (UID: \"d537cdb1-76b6-482e-babe-5afe642f4362\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5jb8cs" Apr 16 19:35:46.711441 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:46.711330 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d537cdb1-76b6-482e-babe-5afe642f4362-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5jb8cs\" (UID: \"d537cdb1-76b6-482e-babe-5afe642f4362\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5jb8cs" Apr 16 19:35:46.711441 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:46.711422 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/d537cdb1-76b6-482e-babe-5afe642f4362-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5jb8cs\" (UID: \"d537cdb1-76b6-482e-babe-5afe642f4362\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5jb8cs" Apr 16 19:35:46.811802 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:46.811771 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6jrc4\" (UniqueName: \"kubernetes.io/projected/d537cdb1-76b6-482e-babe-5afe642f4362-kube-api-access-6jrc4\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5jb8cs\" (UID: \"d537cdb1-76b6-482e-babe-5afe642f4362\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5jb8cs" Apr 16 19:35:46.811956 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:46.811816 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d537cdb1-76b6-482e-babe-5afe642f4362-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5jb8cs\" (UID: \"d537cdb1-76b6-482e-babe-5afe642f4362\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5jb8cs" Apr 16 19:35:46.811956 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:46.811878 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d537cdb1-76b6-482e-babe-5afe642f4362-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5jb8cs\" (UID: \"d537cdb1-76b6-482e-babe-5afe642f4362\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5jb8cs" Apr 16 19:35:46.812341 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:46.812320 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d537cdb1-76b6-482e-babe-5afe642f4362-util\") pod 
\"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5jb8cs\" (UID: \"d537cdb1-76b6-482e-babe-5afe642f4362\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5jb8cs" Apr 16 19:35:46.812341 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:46.812333 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d537cdb1-76b6-482e-babe-5afe642f4362-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5jb8cs\" (UID: \"d537cdb1-76b6-482e-babe-5afe642f4362\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5jb8cs" Apr 16 19:35:46.820779 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:46.820753 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jrc4\" (UniqueName: \"kubernetes.io/projected/d537cdb1-76b6-482e-babe-5afe642f4362-kube-api-access-6jrc4\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5jb8cs\" (UID: \"d537cdb1-76b6-482e-babe-5afe642f4362\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5jb8cs" Apr 16 19:35:46.921366 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:46.921293 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5jb8cs"
Apr 16 19:35:47.250039 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:47.250011 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5jb8cs"]
Apr 16 19:35:47.252312 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:35:47.252281 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd537cdb1_76b6_482e_babe_5afe642f4362.slice/crio-73d13273da1f98d15d8705706fd43213627004e63ea0a1f17742bb862dd760d0 WatchSource:0}: Error finding container 73d13273da1f98d15d8705706fd43213627004e63ea0a1f17742bb862dd760d0: Status 404 returned error can't find the container with id 73d13273da1f98d15d8705706fd43213627004e63ea0a1f17742bb862dd760d0
Apr 16 19:35:48.176038 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:48.175953 2579 generic.go:358] "Generic (PLEG): container finished" podID="d537cdb1-76b6-482e-babe-5afe642f4362" containerID="f26a56d6ccaca8ab87e0701b51602deddeba4f61342dd94d31c75cf2b9ffadd5" exitCode=0
Apr 16 19:35:48.176485 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:48.176030 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5jb8cs" event={"ID":"d537cdb1-76b6-482e-babe-5afe642f4362","Type":"ContainerDied","Data":"f26a56d6ccaca8ab87e0701b51602deddeba4f61342dd94d31c75cf2b9ffadd5"}
Apr 16 19:35:48.176485 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:48.176062 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5jb8cs" event={"ID":"d537cdb1-76b6-482e-babe-5afe642f4362","Type":"ContainerStarted","Data":"73d13273da1f98d15d8705706fd43213627004e63ea0a1f17742bb862dd760d0"}
Apr 16 19:35:50.183529 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:50.183493 2579 generic.go:358] "Generic (PLEG): container finished" podID="d537cdb1-76b6-482e-babe-5afe642f4362" containerID="c67c8802cc2395ba8f56a68c0e77861397fc77e69fcdc26bc734fc4573e99e25" exitCode=0
Apr 16 19:35:50.183910 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:50.183578 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5jb8cs" event={"ID":"d537cdb1-76b6-482e-babe-5afe642f4362","Type":"ContainerDied","Data":"c67c8802cc2395ba8f56a68c0e77861397fc77e69fcdc26bc734fc4573e99e25"}
Apr 16 19:35:51.188345 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:51.188314 2579 generic.go:358] "Generic (PLEG): container finished" podID="d537cdb1-76b6-482e-babe-5afe642f4362" containerID="81cab996c88808c2089f66786eb55e1c6854854b7bf1ea9f1665d5c7bdde9bd2" exitCode=0
Apr 16 19:35:51.188782 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:51.188352 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5jb8cs" event={"ID":"d537cdb1-76b6-482e-babe-5afe642f4362","Type":"ContainerDied","Data":"81cab996c88808c2089f66786eb55e1c6854854b7bf1ea9f1665d5c7bdde9bd2"}
Apr 16 19:35:52.308306 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:52.308278 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5jb8cs"
Apr 16 19:35:52.452498 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:52.452420 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jrc4\" (UniqueName: \"kubernetes.io/projected/d537cdb1-76b6-482e-babe-5afe642f4362-kube-api-access-6jrc4\") pod \"d537cdb1-76b6-482e-babe-5afe642f4362\" (UID: \"d537cdb1-76b6-482e-babe-5afe642f4362\") "
Apr 16 19:35:52.452498 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:52.452464 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d537cdb1-76b6-482e-babe-5afe642f4362-util\") pod \"d537cdb1-76b6-482e-babe-5afe642f4362\" (UID: \"d537cdb1-76b6-482e-babe-5afe642f4362\") "
Apr 16 19:35:52.452728 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:52.452574 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d537cdb1-76b6-482e-babe-5afe642f4362-bundle\") pod \"d537cdb1-76b6-482e-babe-5afe642f4362\" (UID: \"d537cdb1-76b6-482e-babe-5afe642f4362\") "
Apr 16 19:35:52.453260 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:52.453234 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d537cdb1-76b6-482e-babe-5afe642f4362-bundle" (OuterVolumeSpecName: "bundle") pod "d537cdb1-76b6-482e-babe-5afe642f4362" (UID: "d537cdb1-76b6-482e-babe-5afe642f4362"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 19:35:52.454650 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:52.454623 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d537cdb1-76b6-482e-babe-5afe642f4362-kube-api-access-6jrc4" (OuterVolumeSpecName: "kube-api-access-6jrc4") pod "d537cdb1-76b6-482e-babe-5afe642f4362" (UID: "d537cdb1-76b6-482e-babe-5afe642f4362"). InnerVolumeSpecName "kube-api-access-6jrc4". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 19:35:52.553383 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:52.553340 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6jrc4\" (UniqueName: \"kubernetes.io/projected/d537cdb1-76b6-482e-babe-5afe642f4362-kube-api-access-6jrc4\") on node \"ip-10-0-129-155.ec2.internal\" DevicePath \"\""
Apr 16 19:35:52.553383 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:52.553381 2579 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/d537cdb1-76b6-482e-babe-5afe642f4362-bundle\") on node \"ip-10-0-129-155.ec2.internal\" DevicePath \"\""
Apr 16 19:35:52.553914 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:52.553871 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d537cdb1-76b6-482e-babe-5afe642f4362-util" (OuterVolumeSpecName: "util") pod "d537cdb1-76b6-482e-babe-5afe642f4362" (UID: "d537cdb1-76b6-482e-babe-5afe642f4362"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 19:35:52.654651 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:52.654612 2579 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/d537cdb1-76b6-482e-babe-5afe642f4362-util\") on node \"ip-10-0-129-155.ec2.internal\" DevicePath \"\""
Apr 16 19:35:53.196388 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:53.196352 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5jb8cs" event={"ID":"d537cdb1-76b6-482e-babe-5afe642f4362","Type":"ContainerDied","Data":"73d13273da1f98d15d8705706fd43213627004e63ea0a1f17742bb862dd760d0"}
Apr 16 19:35:53.196388 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:53.196388 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="73d13273da1f98d15d8705706fd43213627004e63ea0a1f17742bb862dd760d0"
Apr 16 19:35:53.196586 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:35:53.196404 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5jb8cs"
Apr 16 19:36:03.655469 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:03.655376 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-57586b9555-2zhnq"]
Apr 16 19:36:03.655827 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:03.655818 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d537cdb1-76b6-482e-babe-5afe642f4362" containerName="pull"
Apr 16 19:36:03.655867 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:03.655833 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="d537cdb1-76b6-482e-babe-5afe642f4362" containerName="pull"
Apr 16 19:36:03.655867 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:03.655844 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d537cdb1-76b6-482e-babe-5afe642f4362" containerName="extract"
Apr 16 19:36:03.655867 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:03.655853 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="d537cdb1-76b6-482e-babe-5afe642f4362" containerName="extract"
Apr 16 19:36:03.655980 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:03.655869 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d537cdb1-76b6-482e-babe-5afe642f4362" containerName="util"
Apr 16 19:36:03.655980 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:03.655881 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="d537cdb1-76b6-482e-babe-5afe642f4362" containerName="util"
Apr 16 19:36:03.655980 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:03.655964 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="d537cdb1-76b6-482e-babe-5afe642f4362" containerName="extract"
Apr 16 19:36:03.658994 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:03.658975 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9chjs6"]
Apr 16 19:36:03.659141 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:03.659124 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-57586b9555-2zhnq"
Apr 16 19:36:03.662391 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:03.662370 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9chjs6"
Apr 16 19:36:03.662806 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:03.662783 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\""
Apr 16 19:36:03.662806 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:03.662798 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\""
Apr 16 19:36:03.663271 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:03.663254 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-vrbr4\""
Apr 16 19:36:03.663539 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:03.663526 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\""
Apr 16 19:36:03.663969 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:03.663952 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\""
Apr 16 19:36:03.665444 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:03.665427 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 16 19:36:03.666679 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:03.666425 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-97mkl\""
Apr 16 19:36:03.667762 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:03.667742 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 16 19:36:03.679505 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:03.679478 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-57586b9555-2zhnq"]
Apr 16 19:36:03.682695 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:03.682669 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9chjs6"]
Apr 16 19:36:03.750582 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:03.750538 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/94f59c4d-5893-458e-8317-1318bb0a758f-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9chjs6\" (UID: \"94f59c4d-5893-458e-8317-1318bb0a758f\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9chjs6"
Apr 16 19:36:03.750774 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:03.750600 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6289dbfc-b014-4bf2-926d-ea8c13ae04b0-webhook-cert\") pod \"opendatahub-operator-controller-manager-57586b9555-2zhnq\" (UID: \"6289dbfc-b014-4bf2-926d-ea8c13ae04b0\") " pod="opendatahub/opendatahub-operator-controller-manager-57586b9555-2zhnq"
Apr 16 19:36:03.750774 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:03.750640 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/94f59c4d-5893-458e-8317-1318bb0a758f-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9chjs6\" (UID: \"94f59c4d-5893-458e-8317-1318bb0a758f\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9chjs6"
Apr 16 19:36:03.750774 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:03.750675 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrtz4\" (UniqueName: \"kubernetes.io/projected/6289dbfc-b014-4bf2-926d-ea8c13ae04b0-kube-api-access-rrtz4\") pod \"opendatahub-operator-controller-manager-57586b9555-2zhnq\" (UID: \"6289dbfc-b014-4bf2-926d-ea8c13ae04b0\") " pod="opendatahub/opendatahub-operator-controller-manager-57586b9555-2zhnq"
Apr 16 19:36:03.750774 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:03.750722 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6289dbfc-b014-4bf2-926d-ea8c13ae04b0-apiservice-cert\") pod \"opendatahub-operator-controller-manager-57586b9555-2zhnq\" (UID: \"6289dbfc-b014-4bf2-926d-ea8c13ae04b0\") " pod="opendatahub/opendatahub-operator-controller-manager-57586b9555-2zhnq"
Apr 16 19:36:03.750774 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:03.750769 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4q6b\" (UniqueName: \"kubernetes.io/projected/94f59c4d-5893-458e-8317-1318bb0a758f-kube-api-access-c4q6b\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9chjs6\" (UID: \"94f59c4d-5893-458e-8317-1318bb0a758f\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9chjs6"
Apr 16 19:36:03.851960 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:03.851921 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c4q6b\" (UniqueName: \"kubernetes.io/projected/94f59c4d-5893-458e-8317-1318bb0a758f-kube-api-access-c4q6b\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9chjs6\" (UID: \"94f59c4d-5893-458e-8317-1318bb0a758f\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9chjs6"
Apr 16 19:36:03.852112 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:03.851965 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/94f59c4d-5893-458e-8317-1318bb0a758f-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9chjs6\" (UID: \"94f59c4d-5893-458e-8317-1318bb0a758f\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9chjs6"
Apr 16 19:36:03.852112 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:03.851993 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6289dbfc-b014-4bf2-926d-ea8c13ae04b0-webhook-cert\") pod \"opendatahub-operator-controller-manager-57586b9555-2zhnq\" (UID: \"6289dbfc-b014-4bf2-926d-ea8c13ae04b0\") " pod="opendatahub/opendatahub-operator-controller-manager-57586b9555-2zhnq"
Apr 16 19:36:03.852112 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:03.852026 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/94f59c4d-5893-458e-8317-1318bb0a758f-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9chjs6\" (UID: \"94f59c4d-5893-458e-8317-1318bb0a758f\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9chjs6"
Apr 16 19:36:03.852112 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:03.852055 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rrtz4\" (UniqueName: \"kubernetes.io/projected/6289dbfc-b014-4bf2-926d-ea8c13ae04b0-kube-api-access-rrtz4\") pod \"opendatahub-operator-controller-manager-57586b9555-2zhnq\" (UID: \"6289dbfc-b014-4bf2-926d-ea8c13ae04b0\") " pod="opendatahub/opendatahub-operator-controller-manager-57586b9555-2zhnq"
Apr 16 19:36:03.852112 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:03.852073 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6289dbfc-b014-4bf2-926d-ea8c13ae04b0-apiservice-cert\") pod \"opendatahub-operator-controller-manager-57586b9555-2zhnq\" (UID: \"6289dbfc-b014-4bf2-926d-ea8c13ae04b0\") " pod="opendatahub/opendatahub-operator-controller-manager-57586b9555-2zhnq"
Apr 16 19:36:03.852503 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:03.852476 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/94f59c4d-5893-458e-8317-1318bb0a758f-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9chjs6\" (UID: \"94f59c4d-5893-458e-8317-1318bb0a758f\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9chjs6"
Apr 16 19:36:03.852581 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:03.852508 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/94f59c4d-5893-458e-8317-1318bb0a758f-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9chjs6\" (UID: \"94f59c4d-5893-458e-8317-1318bb0a758f\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9chjs6"
Apr 16 19:36:03.854564 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:03.854545 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6289dbfc-b014-4bf2-926d-ea8c13ae04b0-apiservice-cert\") pod \"opendatahub-operator-controller-manager-57586b9555-2zhnq\" (UID: \"6289dbfc-b014-4bf2-926d-ea8c13ae04b0\") " pod="opendatahub/opendatahub-operator-controller-manager-57586b9555-2zhnq"
Apr 16 19:36:03.854627 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:03.854609 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6289dbfc-b014-4bf2-926d-ea8c13ae04b0-webhook-cert\") pod \"opendatahub-operator-controller-manager-57586b9555-2zhnq\" (UID: \"6289dbfc-b014-4bf2-926d-ea8c13ae04b0\") " pod="opendatahub/opendatahub-operator-controller-manager-57586b9555-2zhnq"
Apr 16 19:36:03.863854 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:03.863825 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrtz4\" (UniqueName: \"kubernetes.io/projected/6289dbfc-b014-4bf2-926d-ea8c13ae04b0-kube-api-access-rrtz4\") pod \"opendatahub-operator-controller-manager-57586b9555-2zhnq\" (UID: \"6289dbfc-b014-4bf2-926d-ea8c13ae04b0\") " pod="opendatahub/opendatahub-operator-controller-manager-57586b9555-2zhnq"
Apr 16 19:36:03.864049 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:03.864014 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4q6b\" (UniqueName: \"kubernetes.io/projected/94f59c4d-5893-458e-8317-1318bb0a758f-kube-api-access-c4q6b\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9chjs6\" (UID: \"94f59c4d-5893-458e-8317-1318bb0a758f\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9chjs6"
Apr 16 19:36:03.970448 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:03.970404 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-57586b9555-2zhnq"
Apr 16 19:36:03.975322 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:03.975300 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9chjs6"
Apr 16 19:36:04.112223 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:04.112181 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-57586b9555-2zhnq"]
Apr 16 19:36:04.114609 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:36:04.114577 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6289dbfc_b014_4bf2_926d_ea8c13ae04b0.slice/crio-bb240f30ee9e3e47704c7bdf1463a4991ca1884c1d83a98b010e9ac07a234c78 WatchSource:0}: Error finding container bb240f30ee9e3e47704c7bdf1463a4991ca1884c1d83a98b010e9ac07a234c78: Status 404 returned error can't find the container with id bb240f30ee9e3e47704c7bdf1463a4991ca1884c1d83a98b010e9ac07a234c78
Apr 16 19:36:04.141310 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:04.141248 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9chjs6"]
Apr 16 19:36:04.143897 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:36:04.143866 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94f59c4d_5893_458e_8317_1318bb0a758f.slice/crio-066e8bd6e35a2e78936b2cc5229ceeb0b8c5a2f167ccbf9cfb8cb3af67dad58a WatchSource:0}: Error finding container 066e8bd6e35a2e78936b2cc5229ceeb0b8c5a2f167ccbf9cfb8cb3af67dad58a: Status 404 returned error can't find the container with id 066e8bd6e35a2e78936b2cc5229ceeb0b8c5a2f167ccbf9cfb8cb3af67dad58a
Apr 16 19:36:04.238783 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:04.238680 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-57586b9555-2zhnq" event={"ID":"6289dbfc-b014-4bf2-926d-ea8c13ae04b0","Type":"ContainerStarted","Data":"bb240f30ee9e3e47704c7bdf1463a4991ca1884c1d83a98b010e9ac07a234c78"}
Apr 16 19:36:04.240221 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:04.240028 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9chjs6" event={"ID":"94f59c4d-5893-458e-8317-1318bb0a758f","Type":"ContainerStarted","Data":"b783dbe7974efb3990a5083abffbcc18b21d0f386a090bed3dac4369d6b64a0b"}
Apr 16 19:36:04.240221 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:04.240059 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9chjs6" event={"ID":"94f59c4d-5893-458e-8317-1318bb0a758f","Type":"ContainerStarted","Data":"066e8bd6e35a2e78936b2cc5229ceeb0b8c5a2f167ccbf9cfb8cb3af67dad58a"}
Apr 16 19:36:05.246507 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:05.246435 2579 generic.go:358] "Generic (PLEG): container finished" podID="94f59c4d-5893-458e-8317-1318bb0a758f" containerID="b783dbe7974efb3990a5083abffbcc18b21d0f386a090bed3dac4369d6b64a0b" exitCode=0
Apr 16 19:36:05.247080 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:05.246516 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9chjs6" event={"ID":"94f59c4d-5893-458e-8317-1318bb0a758f","Type":"ContainerDied","Data":"b783dbe7974efb3990a5083abffbcc18b21d0f386a090bed3dac4369d6b64a0b"}
Apr 16 19:36:07.256129 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:07.256085 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-57586b9555-2zhnq" event={"ID":"6289dbfc-b014-4bf2-926d-ea8c13ae04b0","Type":"ContainerStarted","Data":"76fb2f8555f6caea4194a30d6a2777c23d04cfb6576837bce63aff50ba78ac2c"}
Apr 16 19:36:07.256607 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:07.256196 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-57586b9555-2zhnq"
Apr 16 19:36:07.257754 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:07.257726 2579 generic.go:358] "Generic (PLEG): container finished" podID="94f59c4d-5893-458e-8317-1318bb0a758f" containerID="81714164c81cc57ec9145ab7b9fc60f22f1062e7f67b44e7d3dabaa9de6660e6" exitCode=0
Apr 16 19:36:07.257866 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:07.257814 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9chjs6" event={"ID":"94f59c4d-5893-458e-8317-1318bb0a758f","Type":"ContainerDied","Data":"81714164c81cc57ec9145ab7b9fc60f22f1062e7f67b44e7d3dabaa9de6660e6"}
Apr 16 19:36:07.277550 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:07.277490 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-57586b9555-2zhnq" podStartSLOduration=1.797110361 podStartE2EDuration="4.277468536s" podCreationTimestamp="2026-04-16 19:36:03 +0000 UTC" firstStartedPulling="2026-04-16 19:36:04.116446749 +0000 UTC m=+347.693224937" lastFinishedPulling="2026-04-16 19:36:06.596804922 +0000 UTC m=+350.173583112" observedRunningTime="2026-04-16 19:36:07.275508019 +0000 UTC m=+350.852286231" watchObservedRunningTime="2026-04-16 19:36:07.277468536 +0000 UTC m=+350.854246748"
Apr 16 19:36:08.263469 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:08.263427 2579 generic.go:358] "Generic (PLEG): container finished" podID="94f59c4d-5893-458e-8317-1318bb0a758f" containerID="f65bdce05f6388f9d047050d66c1d0b8d6e0a37768d3b43a4a0988de714f8047" exitCode=0
Apr 16 19:36:08.263925 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:08.263511 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9chjs6" event={"ID":"94f59c4d-5893-458e-8317-1318bb0a758f","Type":"ContainerDied","Data":"f65bdce05f6388f9d047050d66c1d0b8d6e0a37768d3b43a4a0988de714f8047"}
Apr 16 19:36:09.395065 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:09.395039 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9chjs6"
Apr 16 19:36:09.403885 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:09.403860 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4q6b\" (UniqueName: \"kubernetes.io/projected/94f59c4d-5893-458e-8317-1318bb0a758f-kube-api-access-c4q6b\") pod \"94f59c4d-5893-458e-8317-1318bb0a758f\" (UID: \"94f59c4d-5893-458e-8317-1318bb0a758f\") "
Apr 16 19:36:09.404010 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:09.403902 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/94f59c4d-5893-458e-8317-1318bb0a758f-bundle\") pod \"94f59c4d-5893-458e-8317-1318bb0a758f\" (UID: \"94f59c4d-5893-458e-8317-1318bb0a758f\") "
Apr 16 19:36:09.404010 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:09.403958 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/94f59c4d-5893-458e-8317-1318bb0a758f-util\") pod \"94f59c4d-5893-458e-8317-1318bb0a758f\" (UID: \"94f59c4d-5893-458e-8317-1318bb0a758f\") "
Apr 16 19:36:09.404860 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:09.404832 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94f59c4d-5893-458e-8317-1318bb0a758f-bundle" (OuterVolumeSpecName: "bundle") pod "94f59c4d-5893-458e-8317-1318bb0a758f" (UID: "94f59c4d-5893-458e-8317-1318bb0a758f"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 19:36:09.405946 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:09.405920 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94f59c4d-5893-458e-8317-1318bb0a758f-kube-api-access-c4q6b" (OuterVolumeSpecName: "kube-api-access-c4q6b") pod "94f59c4d-5893-458e-8317-1318bb0a758f" (UID: "94f59c4d-5893-458e-8317-1318bb0a758f"). InnerVolumeSpecName "kube-api-access-c4q6b". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 19:36:09.505504 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:09.505459 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-c4q6b\" (UniqueName: \"kubernetes.io/projected/94f59c4d-5893-458e-8317-1318bb0a758f-kube-api-access-c4q6b\") on node \"ip-10-0-129-155.ec2.internal\" DevicePath \"\""
Apr 16 19:36:09.505504 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:09.505496 2579 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/94f59c4d-5893-458e-8317-1318bb0a758f-bundle\") on node \"ip-10-0-129-155.ec2.internal\" DevicePath \"\""
Apr 16 19:36:09.561720 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:09.561630 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/94f59c4d-5893-458e-8317-1318bb0a758f-util" (OuterVolumeSpecName: "util") pod "94f59c4d-5893-458e-8317-1318bb0a758f" (UID: "94f59c4d-5893-458e-8317-1318bb0a758f"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 19:36:09.606518 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:09.606479 2579 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/94f59c4d-5893-458e-8317-1318bb0a758f-util\") on node \"ip-10-0-129-155.ec2.internal\" DevicePath \"\""
Apr 16 19:36:10.272330 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:10.272290 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9chjs6" event={"ID":"94f59c4d-5893-458e-8317-1318bb0a758f","Type":"ContainerDied","Data":"066e8bd6e35a2e78936b2cc5229ceeb0b8c5a2f167ccbf9cfb8cb3af67dad58a"}
Apr 16 19:36:10.272330 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:10.272323 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9chjs6"
Apr 16 19:36:10.272330 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:10.272330 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="066e8bd6e35a2e78936b2cc5229ceeb0b8c5a2f167ccbf9cfb8cb3af67dad58a"
Apr 16 19:36:18.266113 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:18.266076 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-57586b9555-2zhnq"
Apr 16 19:36:21.160855 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:21.160807 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-6b5579666b-ws9zx"]
Apr 16 19:36:21.161283 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:21.161140 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="94f59c4d-5893-458e-8317-1318bb0a758f" containerName="extract"
Apr 16 19:36:21.161283 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:21.161152 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="94f59c4d-5893-458e-8317-1318bb0a758f" containerName="extract"
Apr 16 19:36:21.161283 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:21.161170 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="94f59c4d-5893-458e-8317-1318bb0a758f" containerName="pull"
Apr 16 19:36:21.161283 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:21.161175 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="94f59c4d-5893-458e-8317-1318bb0a758f" containerName="pull"
Apr 16 19:36:21.161283 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:21.161188 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="94f59c4d-5893-458e-8317-1318bb0a758f" containerName="util"
Apr 16 19:36:21.161283 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:21.161193 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="94f59c4d-5893-458e-8317-1318bb0a758f" containerName="util"
Apr 16 19:36:21.161283 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:21.161259 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="94f59c4d-5893-458e-8317-1318bb0a758f" containerName="extract"
Apr 16 19:36:21.169480 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:21.169457 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-6b5579666b-ws9zx"
Apr 16 19:36:21.173164 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:21.173128 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-lfkhf\""
Apr 16 19:36:21.173164 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:21.173155 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\""
Apr 16 19:36:21.173368 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:21.173178 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\""
Apr 16 19:36:21.176942 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:21.176924 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-6b5579666b-ws9zx"]
Apr 16 19:36:21.203251 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:21.203217 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/aea05870-5faf-4066-b9b2-c179cf5245f4-tls-certs\") pod \"kube-auth-proxy-6b5579666b-ws9zx\" (UID: \"aea05870-5faf-4066-b9b2-c179cf5245f4\") " pod="openshift-ingress/kube-auth-proxy-6b5579666b-ws9zx"
Apr 16 19:36:21.203369 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:21.203266 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hc4kz\" (UniqueName: \"kubernetes.io/projected/aea05870-5faf-4066-b9b2-c179cf5245f4-kube-api-access-hc4kz\") pod \"kube-auth-proxy-6b5579666b-ws9zx\" (UID: \"aea05870-5faf-4066-b9b2-c179cf5245f4\") " pod="openshift-ingress/kube-auth-proxy-6b5579666b-ws9zx"
Apr 16 19:36:21.203369 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:21.203311 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/aea05870-5faf-4066-b9b2-c179cf5245f4-tmp\") pod \"kube-auth-proxy-6b5579666b-ws9zx\" (UID: \"aea05870-5faf-4066-b9b2-c179cf5245f4\") " pod="openshift-ingress/kube-auth-proxy-6b5579666b-ws9zx"
Apr 16 19:36:21.226503 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:21.226478 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hq86p"]
Apr 16 19:36:21.230083 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:21.230067 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hq86p"
Apr 16 19:36:21.232569 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:21.232539 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-97mkl\""
Apr 16 19:36:21.232569 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:21.232559 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 16 19:36:21.232717 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:21.232614 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 16 19:36:21.237430 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:21.237405 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hq86p"]
Apr 16 19:36:21.304226 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:21.304175 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6bb0e4f6-4056-47a4-9b06-cf814a5aa023-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hq86p\" (UID: \"6bb0e4f6-4056-47a4-9b06-cf814a5aa023\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hq86p"
Apr 16 19:36:21.304386 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:21.304238 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/aea05870-5faf-4066-b9b2-c179cf5245f4-tls-certs\") pod \"kube-auth-proxy-6b5579666b-ws9zx\" (UID: \"aea05870-5faf-4066-b9b2-c179cf5245f4\") " pod="openshift-ingress/kube-auth-proxy-6b5579666b-ws9zx"
Apr 16 19:36:21.304386 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:21.304267 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6bb0e4f6-4056-47a4-9b06-cf814a5aa023-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hq86p\" (UID: \"6bb0e4f6-4056-47a4-9b06-cf814a5aa023\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hq86p"
Apr 16 19:36:21.304386 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:21.304289 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hc4kz\" (UniqueName: \"kubernetes.io/projected/aea05870-5faf-4066-b9b2-c179cf5245f4-kube-api-access-hc4kz\") pod \"kube-auth-proxy-6b5579666b-ws9zx\" (UID: \"aea05870-5faf-4066-b9b2-c179cf5245f4\") " pod="openshift-ingress/kube-auth-proxy-6b5579666b-ws9zx"
Apr 16 19:36:21.304386 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:21.304325 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bhwz\" (UniqueName: \"kubernetes.io/projected/6bb0e4f6-4056-47a4-9b06-cf814a5aa023-kube-api-access-8bhwz\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hq86p\" (UID: \"6bb0e4f6-4056-47a4-9b06-cf814a5aa023\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hq86p"
Apr 16 19:36:21.304524 ip-10-0-129-155
kubenswrapper[2579]: I0416 19:36:21.304459 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/aea05870-5faf-4066-b9b2-c179cf5245f4-tmp\") pod \"kube-auth-proxy-6b5579666b-ws9zx\" (UID: \"aea05870-5faf-4066-b9b2-c179cf5245f4\") " pod="openshift-ingress/kube-auth-proxy-6b5579666b-ws9zx" Apr 16 19:36:21.306659 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:21.306626 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/aea05870-5faf-4066-b9b2-c179cf5245f4-tmp\") pod \"kube-auth-proxy-6b5579666b-ws9zx\" (UID: \"aea05870-5faf-4066-b9b2-c179cf5245f4\") " pod="openshift-ingress/kube-auth-proxy-6b5579666b-ws9zx" Apr 16 19:36:21.306843 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:21.306825 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/aea05870-5faf-4066-b9b2-c179cf5245f4-tls-certs\") pod \"kube-auth-proxy-6b5579666b-ws9zx\" (UID: \"aea05870-5faf-4066-b9b2-c179cf5245f4\") " pod="openshift-ingress/kube-auth-proxy-6b5579666b-ws9zx" Apr 16 19:36:21.331364 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:21.331333 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hc4kz\" (UniqueName: \"kubernetes.io/projected/aea05870-5faf-4066-b9b2-c179cf5245f4-kube-api-access-hc4kz\") pod \"kube-auth-proxy-6b5579666b-ws9zx\" (UID: \"aea05870-5faf-4066-b9b2-c179cf5245f4\") " pod="openshift-ingress/kube-auth-proxy-6b5579666b-ws9zx" Apr 16 19:36:21.405395 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:21.405357 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6bb0e4f6-4056-47a4-9b06-cf814a5aa023-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hq86p\" (UID: \"6bb0e4f6-4056-47a4-9b06-cf814a5aa023\") " 
pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hq86p" Apr 16 19:36:21.405395 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:21.405402 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6bb0e4f6-4056-47a4-9b06-cf814a5aa023-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hq86p\" (UID: \"6bb0e4f6-4056-47a4-9b06-cf814a5aa023\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hq86p" Apr 16 19:36:21.405605 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:21.405528 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8bhwz\" (UniqueName: \"kubernetes.io/projected/6bb0e4f6-4056-47a4-9b06-cf814a5aa023-kube-api-access-8bhwz\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hq86p\" (UID: \"6bb0e4f6-4056-47a4-9b06-cf814a5aa023\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hq86p" Apr 16 19:36:21.405790 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:21.405770 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6bb0e4f6-4056-47a4-9b06-cf814a5aa023-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hq86p\" (UID: \"6bb0e4f6-4056-47a4-9b06-cf814a5aa023\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hq86p" Apr 16 19:36:21.405825 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:21.405782 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6bb0e4f6-4056-47a4-9b06-cf814a5aa023-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hq86p\" (UID: \"6bb0e4f6-4056-47a4-9b06-cf814a5aa023\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hq86p" 
Apr 16 19:36:21.416741 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:21.416695 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bhwz\" (UniqueName: \"kubernetes.io/projected/6bb0e4f6-4056-47a4-9b06-cf814a5aa023-kube-api-access-8bhwz\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hq86p\" (UID: \"6bb0e4f6-4056-47a4-9b06-cf814a5aa023\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hq86p" Apr 16 19:36:21.480252 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:21.480197 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-6b5579666b-ws9zx" Apr 16 19:36:21.539913 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:21.539876 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hq86p" Apr 16 19:36:21.634222 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:21.634173 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-6b5579666b-ws9zx"] Apr 16 19:36:21.637085 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:36:21.637048 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaea05870_5faf_4066_b9b2_c179cf5245f4.slice/crio-0b546aa3d0f4c5209f46083d51d9f0d807073f1ed29713475a292ff4a5bf41ab WatchSource:0}: Error finding container 0b546aa3d0f4c5209f46083d51d9f0d807073f1ed29713475a292ff4a5bf41ab: Status 404 returned error can't find the container with id 0b546aa3d0f4c5209f46083d51d9f0d807073f1ed29713475a292ff4a5bf41ab Apr 16 19:36:21.667767 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:21.667740 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hq86p"] Apr 16 19:36:21.669355 ip-10-0-129-155 
kubenswrapper[2579]: W0416 19:36:21.669322 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6bb0e4f6_4056_47a4_9b06_cf814a5aa023.slice/crio-756f392379b3f5444f9c6d045c0a5c783685423f7d3a1db5bb31a9b5a493be2e WatchSource:0}: Error finding container 756f392379b3f5444f9c6d045c0a5c783685423f7d3a1db5bb31a9b5a493be2e: Status 404 returned error can't find the container with id 756f392379b3f5444f9c6d045c0a5c783685423f7d3a1db5bb31a9b5a493be2e Apr 16 19:36:22.318195 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:22.318153 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-6b5579666b-ws9zx" event={"ID":"aea05870-5faf-4066-b9b2-c179cf5245f4","Type":"ContainerStarted","Data":"0b546aa3d0f4c5209f46083d51d9f0d807073f1ed29713475a292ff4a5bf41ab"} Apr 16 19:36:22.320807 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:22.320774 2579 generic.go:358] "Generic (PLEG): container finished" podID="6bb0e4f6-4056-47a4-9b06-cf814a5aa023" containerID="46ac3f6851ffa4d909164da8f4f6a2eac5d799049073affaf90a9b725d081a5d" exitCode=0 Apr 16 19:36:22.320928 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:22.320840 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hq86p" event={"ID":"6bb0e4f6-4056-47a4-9b06-cf814a5aa023","Type":"ContainerDied","Data":"46ac3f6851ffa4d909164da8f4f6a2eac5d799049073affaf90a9b725d081a5d"} Apr 16 19:36:22.320928 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:22.320870 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hq86p" event={"ID":"6bb0e4f6-4056-47a4-9b06-cf814a5aa023","Type":"ContainerStarted","Data":"756f392379b3f5444f9c6d045c0a5c783685423f7d3a1db5bb31a9b5a493be2e"} Apr 16 19:36:24.180445 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:24.180345 2579 
kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-7jbgv"] Apr 16 19:36:24.185309 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:24.185287 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-7jbgv" Apr 16 19:36:24.187823 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:24.187799 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-webhook-cert\"" Apr 16 19:36:24.188102 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:24.188078 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-dockercfg-4vwxp\"" Apr 16 19:36:24.191808 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:24.191780 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-7jbgv"] Apr 16 19:36:24.232898 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:24.232847 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2f1f4763-90e1-44e4-8429-6c99794ae398-cert\") pod \"odh-model-controller-858dbf95b8-7jbgv\" (UID: \"2f1f4763-90e1-44e4-8429-6c99794ae398\") " pod="opendatahub/odh-model-controller-858dbf95b8-7jbgv" Apr 16 19:36:24.233070 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:24.232950 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ss7dc\" (UniqueName: \"kubernetes.io/projected/2f1f4763-90e1-44e4-8429-6c99794ae398-kube-api-access-ss7dc\") pod \"odh-model-controller-858dbf95b8-7jbgv\" (UID: \"2f1f4763-90e1-44e4-8429-6c99794ae398\") " pod="opendatahub/odh-model-controller-858dbf95b8-7jbgv" Apr 16 19:36:24.333333 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:24.333297 2579 generic.go:358] "Generic (PLEG): container finished" 
podID="6bb0e4f6-4056-47a4-9b06-cf814a5aa023" containerID="9cdfeb7f158fdeab5b9905509475c41719896f615d34ec926acb6de8cbf1be24" exitCode=0 Apr 16 19:36:24.333510 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:24.333357 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hq86p" event={"ID":"6bb0e4f6-4056-47a4-9b06-cf814a5aa023","Type":"ContainerDied","Data":"9cdfeb7f158fdeab5b9905509475c41719896f615d34ec926acb6de8cbf1be24"} Apr 16 19:36:24.333510 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:24.333405 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2f1f4763-90e1-44e4-8429-6c99794ae398-cert\") pod \"odh-model-controller-858dbf95b8-7jbgv\" (UID: \"2f1f4763-90e1-44e4-8429-6c99794ae398\") " pod="opendatahub/odh-model-controller-858dbf95b8-7jbgv" Apr 16 19:36:24.333510 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:24.333479 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ss7dc\" (UniqueName: \"kubernetes.io/projected/2f1f4763-90e1-44e4-8429-6c99794ae398-kube-api-access-ss7dc\") pod \"odh-model-controller-858dbf95b8-7jbgv\" (UID: \"2f1f4763-90e1-44e4-8429-6c99794ae398\") " pod="opendatahub/odh-model-controller-858dbf95b8-7jbgv" Apr 16 19:36:24.333674 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:36:24.333560 2579 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 16 19:36:24.333674 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:36:24.333623 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f1f4763-90e1-44e4-8429-6c99794ae398-cert podName:2f1f4763-90e1-44e4-8429-6c99794ae398 nodeName:}" failed. No retries permitted until 2026-04-16 19:36:24.833601013 +0000 UTC m=+368.410379205 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2f1f4763-90e1-44e4-8429-6c99794ae398-cert") pod "odh-model-controller-858dbf95b8-7jbgv" (UID: "2f1f4763-90e1-44e4-8429-6c99794ae398") : secret "odh-model-controller-webhook-cert" not found Apr 16 19:36:24.343089 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:24.343056 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ss7dc\" (UniqueName: \"kubernetes.io/projected/2f1f4763-90e1-44e4-8429-6c99794ae398-kube-api-access-ss7dc\") pod \"odh-model-controller-858dbf95b8-7jbgv\" (UID: \"2f1f4763-90e1-44e4-8429-6c99794ae398\") " pod="opendatahub/odh-model-controller-858dbf95b8-7jbgv" Apr 16 19:36:24.839119 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:24.839097 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2f1f4763-90e1-44e4-8429-6c99794ae398-cert\") pod \"odh-model-controller-858dbf95b8-7jbgv\" (UID: \"2f1f4763-90e1-44e4-8429-6c99794ae398\") " pod="opendatahub/odh-model-controller-858dbf95b8-7jbgv" Apr 16 19:36:24.839287 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:36:24.839268 2579 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 16 19:36:24.839359 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:36:24.839344 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2f1f4763-90e1-44e4-8429-6c99794ae398-cert podName:2f1f4763-90e1-44e4-8429-6c99794ae398 nodeName:}" failed. No retries permitted until 2026-04-16 19:36:25.839320659 +0000 UTC m=+369.416098866 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2f1f4763-90e1-44e4-8429-6c99794ae398-cert") pod "odh-model-controller-858dbf95b8-7jbgv" (UID: "2f1f4763-90e1-44e4-8429-6c99794ae398") : secret "odh-model-controller-webhook-cert" not found Apr 16 19:36:25.337970 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:25.337931 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-6b5579666b-ws9zx" event={"ID":"aea05870-5faf-4066-b9b2-c179cf5245f4","Type":"ContainerStarted","Data":"433048a7d0c3520612ba5911bb5053c6cc000456fb4f38d648d139eb053a4909"} Apr 16 19:36:25.339798 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:25.339772 2579 generic.go:358] "Generic (PLEG): container finished" podID="6bb0e4f6-4056-47a4-9b06-cf814a5aa023" containerID="096e77e1645adc478d1d710b4d507039cd584f5260033e2aed10d22aa2980e1f" exitCode=0 Apr 16 19:36:25.339926 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:25.339823 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hq86p" event={"ID":"6bb0e4f6-4056-47a4-9b06-cf814a5aa023","Type":"ContainerDied","Data":"096e77e1645adc478d1d710b4d507039cd584f5260033e2aed10d22aa2980e1f"} Apr 16 19:36:25.361676 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:25.358867 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-6b5579666b-ws9zx" podStartSLOduration=1.1544998180000001 podStartE2EDuration="4.358849133s" podCreationTimestamp="2026-04-16 19:36:21 +0000 UTC" firstStartedPulling="2026-04-16 19:36:21.639094863 +0000 UTC m=+365.215873052" lastFinishedPulling="2026-04-16 19:36:24.843444174 +0000 UTC m=+368.420222367" observedRunningTime="2026-04-16 19:36:25.354686537 +0000 UTC m=+368.931464760" watchObservedRunningTime="2026-04-16 19:36:25.358849133 +0000 UTC m=+368.935627345" Apr 16 19:36:25.847269 ip-10-0-129-155 kubenswrapper[2579]: I0416 
19:36:25.847193 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2f1f4763-90e1-44e4-8429-6c99794ae398-cert\") pod \"odh-model-controller-858dbf95b8-7jbgv\" (UID: \"2f1f4763-90e1-44e4-8429-6c99794ae398\") " pod="opendatahub/odh-model-controller-858dbf95b8-7jbgv" Apr 16 19:36:25.849689 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:25.849661 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2f1f4763-90e1-44e4-8429-6c99794ae398-cert\") pod \"odh-model-controller-858dbf95b8-7jbgv\" (UID: \"2f1f4763-90e1-44e4-8429-6c99794ae398\") " pod="opendatahub/odh-model-controller-858dbf95b8-7jbgv" Apr 16 19:36:25.998679 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:25.998633 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-7jbgv" Apr 16 19:36:26.325470 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:26.325442 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-7jbgv"] Apr 16 19:36:26.327384 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:36:26.327354 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f1f4763_90e1_44e4_8429_6c99794ae398.slice/crio-f365a576e3d8c6a17b46cd4b9fea70adfdaae7892dc6dab6154017c52fa6d091 WatchSource:0}: Error finding container f365a576e3d8c6a17b46cd4b9fea70adfdaae7892dc6dab6154017c52fa6d091: Status 404 returned error can't find the container with id f365a576e3d8c6a17b46cd4b9fea70adfdaae7892dc6dab6154017c52fa6d091 Apr 16 19:36:26.345081 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:26.345046 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-7jbgv" 
event={"ID":"2f1f4763-90e1-44e4-8429-6c99794ae398","Type":"ContainerStarted","Data":"f365a576e3d8c6a17b46cd4b9fea70adfdaae7892dc6dab6154017c52fa6d091"} Apr 16 19:36:26.476959 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:26.476936 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hq86p" Apr 16 19:36:26.553014 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:26.552975 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6bb0e4f6-4056-47a4-9b06-cf814a5aa023-util\") pod \"6bb0e4f6-4056-47a4-9b06-cf814a5aa023\" (UID: \"6bb0e4f6-4056-47a4-9b06-cf814a5aa023\") " Apr 16 19:36:26.553186 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:26.553073 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6bb0e4f6-4056-47a4-9b06-cf814a5aa023-bundle\") pod \"6bb0e4f6-4056-47a4-9b06-cf814a5aa023\" (UID: \"6bb0e4f6-4056-47a4-9b06-cf814a5aa023\") " Apr 16 19:36:26.553186 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:26.553120 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bhwz\" (UniqueName: \"kubernetes.io/projected/6bb0e4f6-4056-47a4-9b06-cf814a5aa023-kube-api-access-8bhwz\") pod \"6bb0e4f6-4056-47a4-9b06-cf814a5aa023\" (UID: \"6bb0e4f6-4056-47a4-9b06-cf814a5aa023\") " Apr 16 19:36:26.554045 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:26.554015 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6bb0e4f6-4056-47a4-9b06-cf814a5aa023-bundle" (OuterVolumeSpecName: "bundle") pod "6bb0e4f6-4056-47a4-9b06-cf814a5aa023" (UID: "6bb0e4f6-4056-47a4-9b06-cf814a5aa023"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:36:26.555273 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:26.555242 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bb0e4f6-4056-47a4-9b06-cf814a5aa023-kube-api-access-8bhwz" (OuterVolumeSpecName: "kube-api-access-8bhwz") pod "6bb0e4f6-4056-47a4-9b06-cf814a5aa023" (UID: "6bb0e4f6-4056-47a4-9b06-cf814a5aa023"). InnerVolumeSpecName "kube-api-access-8bhwz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 19:36:26.558556 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:26.558530 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6bb0e4f6-4056-47a4-9b06-cf814a5aa023-util" (OuterVolumeSpecName: "util") pod "6bb0e4f6-4056-47a4-9b06-cf814a5aa023" (UID: "6bb0e4f6-4056-47a4-9b06-cf814a5aa023"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:36:26.654460 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:26.654358 2579 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6bb0e4f6-4056-47a4-9b06-cf814a5aa023-bundle\") on node \"ip-10-0-129-155.ec2.internal\" DevicePath \"\"" Apr 16 19:36:26.654460 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:26.654393 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8bhwz\" (UniqueName: \"kubernetes.io/projected/6bb0e4f6-4056-47a4-9b06-cf814a5aa023-kube-api-access-8bhwz\") on node \"ip-10-0-129-155.ec2.internal\" DevicePath \"\"" Apr 16 19:36:26.654460 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:26.654404 2579 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6bb0e4f6-4056-47a4-9b06-cf814a5aa023-util\") on node \"ip-10-0-129-155.ec2.internal\" DevicePath \"\"" Apr 16 19:36:27.351302 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:27.351176 2579 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hq86p" event={"ID":"6bb0e4f6-4056-47a4-9b06-cf814a5aa023","Type":"ContainerDied","Data":"756f392379b3f5444f9c6d045c0a5c783685423f7d3a1db5bb31a9b5a493be2e"} Apr 16 19:36:27.351302 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:27.351224 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835hq86p" Apr 16 19:36:27.351302 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:27.351242 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="756f392379b3f5444f9c6d045c0a5c783685423f7d3a1db5bb31a9b5a493be2e" Apr 16 19:36:29.946676 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:29.946636 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-44kfz"] Apr 16 19:36:29.947070 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:29.946969 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6bb0e4f6-4056-47a4-9b06-cf814a5aa023" containerName="pull" Apr 16 19:36:29.947070 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:29.946979 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bb0e4f6-4056-47a4-9b06-cf814a5aa023" containerName="pull" Apr 16 19:36:29.947070 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:29.946987 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6bb0e4f6-4056-47a4-9b06-cf814a5aa023" containerName="extract" Apr 16 19:36:29.947070 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:29.946993 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bb0e4f6-4056-47a4-9b06-cf814a5aa023" containerName="extract" Apr 16 19:36:29.947070 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:29.947006 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="6bb0e4f6-4056-47a4-9b06-cf814a5aa023" containerName="util" Apr 16 19:36:29.947070 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:29.947012 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bb0e4f6-4056-47a4-9b06-cf814a5aa023" containerName="util" Apr 16 19:36:29.947070 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:29.947064 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="6bb0e4f6-4056-47a4-9b06-cf814a5aa023" containerName="extract" Apr 16 19:36:29.950039 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:29.950009 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-44kfz" Apr 16 19:36:29.952682 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:29.952655 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-controller-manager-dockercfg-c7wr9\"" Apr 16 19:36:29.952849 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:29.952698 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-webhook-server-cert\"" Apr 16 19:36:29.966427 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:29.966400 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-44kfz"] Apr 16 19:36:30.081583 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:30.081546 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kkjw\" (UniqueName: \"kubernetes.io/projected/cfc99283-cf9c-4242-a5ac-b07e345d04f7-kube-api-access-9kkjw\") pod \"kserve-controller-manager-856948b99f-44kfz\" (UID: \"cfc99283-cf9c-4242-a5ac-b07e345d04f7\") " pod="opendatahub/kserve-controller-manager-856948b99f-44kfz" Apr 16 19:36:30.081735 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:30.081611 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" 
(UniqueName: \"kubernetes.io/secret/cfc99283-cf9c-4242-a5ac-b07e345d04f7-cert\") pod \"kserve-controller-manager-856948b99f-44kfz\" (UID: \"cfc99283-cf9c-4242-a5ac-b07e345d04f7\") " pod="opendatahub/kserve-controller-manager-856948b99f-44kfz" Apr 16 19:36:30.182731 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:30.182687 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cfc99283-cf9c-4242-a5ac-b07e345d04f7-cert\") pod \"kserve-controller-manager-856948b99f-44kfz\" (UID: \"cfc99283-cf9c-4242-a5ac-b07e345d04f7\") " pod="opendatahub/kserve-controller-manager-856948b99f-44kfz" Apr 16 19:36:30.182931 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:30.182768 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9kkjw\" (UniqueName: \"kubernetes.io/projected/cfc99283-cf9c-4242-a5ac-b07e345d04f7-kube-api-access-9kkjw\") pod \"kserve-controller-manager-856948b99f-44kfz\" (UID: \"cfc99283-cf9c-4242-a5ac-b07e345d04f7\") " pod="opendatahub/kserve-controller-manager-856948b99f-44kfz" Apr 16 19:36:30.182931 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:36:30.182841 2579 secret.go:189] Couldn't get secret opendatahub/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found Apr 16 19:36:30.182931 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:36:30.182922 2579 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cfc99283-cf9c-4242-a5ac-b07e345d04f7-cert podName:cfc99283-cf9c-4242-a5ac-b07e345d04f7 nodeName:}" failed. No retries permitted until 2026-04-16 19:36:30.682904641 +0000 UTC m=+374.259682833 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cfc99283-cf9c-4242-a5ac-b07e345d04f7-cert") pod "kserve-controller-manager-856948b99f-44kfz" (UID: "cfc99283-cf9c-4242-a5ac-b07e345d04f7") : secret "kserve-webhook-server-cert" not found Apr 16 19:36:30.204029 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:30.203959 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kkjw\" (UniqueName: \"kubernetes.io/projected/cfc99283-cf9c-4242-a5ac-b07e345d04f7-kube-api-access-9kkjw\") pod \"kserve-controller-manager-856948b99f-44kfz\" (UID: \"cfc99283-cf9c-4242-a5ac-b07e345d04f7\") " pod="opendatahub/kserve-controller-manager-856948b99f-44kfz" Apr 16 19:36:30.363064 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:30.363024 2579 generic.go:358] "Generic (PLEG): container finished" podID="2f1f4763-90e1-44e4-8429-6c99794ae398" containerID="7508f27803239b6940d14afba565fd011e6610aad925efab756f77d45b536fa9" exitCode=1 Apr 16 19:36:30.363064 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:30.363060 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-7jbgv" event={"ID":"2f1f4763-90e1-44e4-8429-6c99794ae398","Type":"ContainerDied","Data":"7508f27803239b6940d14afba565fd011e6610aad925efab756f77d45b536fa9"} Apr 16 19:36:30.363409 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:30.363392 2579 scope.go:117] "RemoveContainer" containerID="7508f27803239b6940d14afba565fd011e6610aad925efab756f77d45b536fa9" Apr 16 19:36:30.686199 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:30.686115 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cfc99283-cf9c-4242-a5ac-b07e345d04f7-cert\") pod \"kserve-controller-manager-856948b99f-44kfz\" (UID: \"cfc99283-cf9c-4242-a5ac-b07e345d04f7\") " pod="opendatahub/kserve-controller-manager-856948b99f-44kfz" Apr 16 19:36:30.688511 ip-10-0-129-155 
kubenswrapper[2579]: I0416 19:36:30.688484 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cfc99283-cf9c-4242-a5ac-b07e345d04f7-cert\") pod \"kserve-controller-manager-856948b99f-44kfz\" (UID: \"cfc99283-cf9c-4242-a5ac-b07e345d04f7\") " pod="opendatahub/kserve-controller-manager-856948b99f-44kfz" Apr 16 19:36:30.861964 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:30.861930 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-44kfz" Apr 16 19:36:30.989170 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:30.989134 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-44kfz"] Apr 16 19:36:31.368150 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:31.368047 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-44kfz" event={"ID":"cfc99283-cf9c-4242-a5ac-b07e345d04f7","Type":"ContainerStarted","Data":"d164787b214eefafd24afc5f5d35296ee1a207c8387d14b1e63611f08d55f9fc"} Apr 16 19:36:31.370426 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:31.370394 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-7jbgv" event={"ID":"2f1f4763-90e1-44e4-8429-6c99794ae398","Type":"ContainerStarted","Data":"f969d17c12e0c088439176a2aa2a21d97e494fd65247e702aa69d2d0f0dd0ad7"} Apr 16 19:36:31.370576 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:31.370455 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-7jbgv" Apr 16 19:36:31.390510 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:31.390451 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/odh-model-controller-858dbf95b8-7jbgv" podStartSLOduration=3.007824408 podStartE2EDuration="7.390436187s" 
podCreationTimestamp="2026-04-16 19:36:24 +0000 UTC" firstStartedPulling="2026-04-16 19:36:26.32907491 +0000 UTC m=+369.905853099" lastFinishedPulling="2026-04-16 19:36:30.711686685 +0000 UTC m=+374.288464878" observedRunningTime="2026-04-16 19:36:31.388594929 +0000 UTC m=+374.965373143" watchObservedRunningTime="2026-04-16 19:36:31.390436187 +0000 UTC m=+374.967214430" Apr 16 19:36:34.384419 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:34.384316 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-44kfz" event={"ID":"cfc99283-cf9c-4242-a5ac-b07e345d04f7","Type":"ContainerStarted","Data":"822a2021ca9bb1831d1af3f25729779e3b6471535144da6f799e358d7528d6f5"} Apr 16 19:36:34.384884 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:34.384462 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/kserve-controller-manager-856948b99f-44kfz" Apr 16 19:36:34.409114 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:34.409061 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/kserve-controller-manager-856948b99f-44kfz" podStartSLOduration=2.303247393 podStartE2EDuration="5.40904416s" podCreationTimestamp="2026-04-16 19:36:29 +0000 UTC" firstStartedPulling="2026-04-16 19:36:30.996442048 +0000 UTC m=+374.573220236" lastFinishedPulling="2026-04-16 19:36:34.102238812 +0000 UTC m=+377.679017003" observedRunningTime="2026-04-16 19:36:34.407114441 +0000 UTC m=+377.983892655" watchObservedRunningTime="2026-04-16 19:36:34.40904416 +0000 UTC m=+377.985822371" Apr 16 19:36:35.660574 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:35.660537 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2w8kzk"] Apr 16 19:36:35.664179 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:35.664162 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2w8kzk" Apr 16 19:36:35.670116 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:35.670089 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 19:36:35.670296 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:35.670278 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-97mkl\"" Apr 16 19:36:35.670611 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:35.670596 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 19:36:35.701592 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:35.701554 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2w8kzk"] Apr 16 19:36:35.733529 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:35.733493 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/67fe265a-488f-45f8-9576-b0f22fb42c54-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2w8kzk\" (UID: \"67fe265a-488f-45f8-9576-b0f22fb42c54\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2w8kzk" Apr 16 19:36:35.733683 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:35.733541 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/67fe265a-488f-45f8-9576-b0f22fb42c54-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2w8kzk\" (UID: \"67fe265a-488f-45f8-9576-b0f22fb42c54\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2w8kzk" Apr 16 19:36:35.733683 
ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:35.733574 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtgrc\" (UniqueName: \"kubernetes.io/projected/67fe265a-488f-45f8-9576-b0f22fb42c54-kube-api-access-rtgrc\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2w8kzk\" (UID: \"67fe265a-488f-45f8-9576-b0f22fb42c54\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2w8kzk" Apr 16 19:36:35.834956 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:35.834916 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/67fe265a-488f-45f8-9576-b0f22fb42c54-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2w8kzk\" (UID: \"67fe265a-488f-45f8-9576-b0f22fb42c54\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2w8kzk" Apr 16 19:36:35.835121 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:35.834963 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rtgrc\" (UniqueName: \"kubernetes.io/projected/67fe265a-488f-45f8-9576-b0f22fb42c54-kube-api-access-rtgrc\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2w8kzk\" (UID: \"67fe265a-488f-45f8-9576-b0f22fb42c54\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2w8kzk" Apr 16 19:36:35.835121 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:35.835038 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/67fe265a-488f-45f8-9576-b0f22fb42c54-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2w8kzk\" (UID: \"67fe265a-488f-45f8-9576-b0f22fb42c54\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2w8kzk" Apr 16 19:36:35.835371 ip-10-0-129-155 
kubenswrapper[2579]: I0416 19:36:35.835347 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/67fe265a-488f-45f8-9576-b0f22fb42c54-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2w8kzk\" (UID: \"67fe265a-488f-45f8-9576-b0f22fb42c54\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2w8kzk" Apr 16 19:36:35.835432 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:35.835413 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/67fe265a-488f-45f8-9576-b0f22fb42c54-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2w8kzk\" (UID: \"67fe265a-488f-45f8-9576-b0f22fb42c54\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2w8kzk" Apr 16 19:36:35.844713 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:35.844684 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtgrc\" (UniqueName: \"kubernetes.io/projected/67fe265a-488f-45f8-9576-b0f22fb42c54-kube-api-access-rtgrc\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2w8kzk\" (UID: \"67fe265a-488f-45f8-9576-b0f22fb42c54\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2w8kzk" Apr 16 19:36:35.973599 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:35.973563 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2w8kzk" Apr 16 19:36:36.106157 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:36.106129 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2w8kzk"] Apr 16 19:36:36.107546 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:36:36.107516 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67fe265a_488f_45f8_9576_b0f22fb42c54.slice/crio-af72d0bfdd804415088ee44940c4f316f2211c69ace3ce0efcc4f4651b35b435 WatchSource:0}: Error finding container af72d0bfdd804415088ee44940c4f316f2211c69ace3ce0efcc4f4651b35b435: Status 404 returned error can't find the container with id af72d0bfdd804415088ee44940c4f316f2211c69ace3ce0efcc4f4651b35b435 Apr 16 19:36:36.393802 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:36.393705 2579 generic.go:358] "Generic (PLEG): container finished" podID="67fe265a-488f-45f8-9576-b0f22fb42c54" containerID="d6ae687064dbc4d1347addd2eeb437f35727773ec901eaec877066f613e61527" exitCode=0 Apr 16 19:36:36.393954 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:36.393790 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2w8kzk" event={"ID":"67fe265a-488f-45f8-9576-b0f22fb42c54","Type":"ContainerDied","Data":"d6ae687064dbc4d1347addd2eeb437f35727773ec901eaec877066f613e61527"} Apr 16 19:36:36.393954 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:36.393836 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2w8kzk" event={"ID":"67fe265a-488f-45f8-9576-b0f22fb42c54","Type":"ContainerStarted","Data":"af72d0bfdd804415088ee44940c4f316f2211c69ace3ce0efcc4f4651b35b435"} Apr 16 19:36:37.036417 ip-10-0-129-155 kubenswrapper[2579]: 
I0416 19:36:37.036382 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-jwhfm"] Apr 16 19:36:37.039854 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:37.039832 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-jwhfm" Apr 16 19:36:37.042743 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:37.042716 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-5hffh\"" Apr 16 19:36:37.042743 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:37.042723 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\"" Apr 16 19:36:37.043285 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:37.043269 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\"" Apr 16 19:36:37.050400 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:37.050369 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-jwhfm"] Apr 16 19:36:37.147562 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:37.147464 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/67e630e0-3c0e-4afb-a811-65d2cbcf7a4a-operator-config\") pod \"servicemesh-operator3-55f49c5f94-jwhfm\" (UID: \"67e630e0-3c0e-4afb-a811-65d2cbcf7a4a\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-jwhfm" Apr 16 19:36:37.147562 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:37.147517 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbvq5\" (UniqueName: \"kubernetes.io/projected/67e630e0-3c0e-4afb-a811-65d2cbcf7a4a-kube-api-access-mbvq5\") pod 
\"servicemesh-operator3-55f49c5f94-jwhfm\" (UID: \"67e630e0-3c0e-4afb-a811-65d2cbcf7a4a\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-jwhfm" Apr 16 19:36:37.248279 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:37.248230 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/67e630e0-3c0e-4afb-a811-65d2cbcf7a4a-operator-config\") pod \"servicemesh-operator3-55f49c5f94-jwhfm\" (UID: \"67e630e0-3c0e-4afb-a811-65d2cbcf7a4a\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-jwhfm" Apr 16 19:36:37.248468 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:37.248335 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mbvq5\" (UniqueName: \"kubernetes.io/projected/67e630e0-3c0e-4afb-a811-65d2cbcf7a4a-kube-api-access-mbvq5\") pod \"servicemesh-operator3-55f49c5f94-jwhfm\" (UID: \"67e630e0-3c0e-4afb-a811-65d2cbcf7a4a\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-jwhfm" Apr 16 19:36:37.250816 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:37.250789 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/67e630e0-3c0e-4afb-a811-65d2cbcf7a4a-operator-config\") pod \"servicemesh-operator3-55f49c5f94-jwhfm\" (UID: \"67e630e0-3c0e-4afb-a811-65d2cbcf7a4a\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-jwhfm" Apr 16 19:36:37.258035 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:37.257992 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbvq5\" (UniqueName: \"kubernetes.io/projected/67e630e0-3c0e-4afb-a811-65d2cbcf7a4a-kube-api-access-mbvq5\") pod \"servicemesh-operator3-55f49c5f94-jwhfm\" (UID: \"67e630e0-3c0e-4afb-a811-65d2cbcf7a4a\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-jwhfm" Apr 16 19:36:37.350634 ip-10-0-129-155 kubenswrapper[2579]: I0416 
19:36:37.350605 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-jwhfm" Apr 16 19:36:37.493557 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:37.493390 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-jwhfm"] Apr 16 19:36:37.499151 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:36:37.499115 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod67e630e0_3c0e_4afb_a811_65d2cbcf7a4a.slice/crio-b806a099bad2a8f8678eff422789d7b888941d92ac5e6c0514ace4fa6c798a05 WatchSource:0}: Error finding container b806a099bad2a8f8678eff422789d7b888941d92ac5e6c0514ace4fa6c798a05: Status 404 returned error can't find the container with id b806a099bad2a8f8678eff422789d7b888941d92ac5e6c0514ace4fa6c798a05 Apr 16 19:36:38.408230 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:38.408178 2579 generic.go:358] "Generic (PLEG): container finished" podID="67fe265a-488f-45f8-9576-b0f22fb42c54" containerID="0202dabf5e5a624b4c4ca4e3829fec8c5d6b0bfcbe2835d1df2e9b1484ad57b3" exitCode=0 Apr 16 19:36:38.408693 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:38.408256 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2w8kzk" event={"ID":"67fe265a-488f-45f8-9576-b0f22fb42c54","Type":"ContainerDied","Data":"0202dabf5e5a624b4c4ca4e3829fec8c5d6b0bfcbe2835d1df2e9b1484ad57b3"} Apr 16 19:36:38.410270 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:38.410239 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-jwhfm" event={"ID":"67e630e0-3c0e-4afb-a811-65d2cbcf7a4a","Type":"ContainerStarted","Data":"b806a099bad2a8f8678eff422789d7b888941d92ac5e6c0514ace4fa6c798a05"} Apr 16 19:36:39.416467 ip-10-0-129-155 kubenswrapper[2579]: I0416 
19:36:39.416423 2579 generic.go:358] "Generic (PLEG): container finished" podID="67fe265a-488f-45f8-9576-b0f22fb42c54" containerID="274c6e54ac1bd41716687da7ce6d205453071611e7dedf4d2b13252d17407b06" exitCode=0 Apr 16 19:36:39.416955 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:39.416493 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2w8kzk" event={"ID":"67fe265a-488f-45f8-9576-b0f22fb42c54","Type":"ContainerDied","Data":"274c6e54ac1bd41716687da7ce6d205453071611e7dedf4d2b13252d17407b06"} Apr 16 19:36:40.424165 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:40.423806 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-jwhfm" event={"ID":"67e630e0-3c0e-4afb-a811-65d2cbcf7a4a","Type":"ContainerStarted","Data":"8c8307b43ba812807314055b8769f5ebc50fd00244812ebf3825540b687c322f"} Apr 16 19:36:40.424165 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:40.423914 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-jwhfm" Apr 16 19:36:40.446357 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:40.445611 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-jwhfm" podStartSLOduration=0.710267469 podStartE2EDuration="3.44558893s" podCreationTimestamp="2026-04-16 19:36:37 +0000 UTC" firstStartedPulling="2026-04-16 19:36:37.501720065 +0000 UTC m=+381.078498253" lastFinishedPulling="2026-04-16 19:36:40.237041512 +0000 UTC m=+383.813819714" observedRunningTime="2026-04-16 19:36:40.444663779 +0000 UTC m=+384.021441991" watchObservedRunningTime="2026-04-16 19:36:40.44558893 +0000 UTC m=+384.022367141" Apr 16 19:36:40.597039 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:40.597009 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2w8kzk" Apr 16 19:36:40.683199 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:40.683161 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/67fe265a-488f-45f8-9576-b0f22fb42c54-util\") pod \"67fe265a-488f-45f8-9576-b0f22fb42c54\" (UID: \"67fe265a-488f-45f8-9576-b0f22fb42c54\") " Apr 16 19:36:40.683395 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:40.683259 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/67fe265a-488f-45f8-9576-b0f22fb42c54-bundle\") pod \"67fe265a-488f-45f8-9576-b0f22fb42c54\" (UID: \"67fe265a-488f-45f8-9576-b0f22fb42c54\") " Apr 16 19:36:40.683395 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:40.683279 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rtgrc\" (UniqueName: \"kubernetes.io/projected/67fe265a-488f-45f8-9576-b0f22fb42c54-kube-api-access-rtgrc\") pod \"67fe265a-488f-45f8-9576-b0f22fb42c54\" (UID: \"67fe265a-488f-45f8-9576-b0f22fb42c54\") " Apr 16 19:36:40.684413 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:40.684384 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67fe265a-488f-45f8-9576-b0f22fb42c54-bundle" (OuterVolumeSpecName: "bundle") pod "67fe265a-488f-45f8-9576-b0f22fb42c54" (UID: "67fe265a-488f-45f8-9576-b0f22fb42c54"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:36:40.685639 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:40.685611 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67fe265a-488f-45f8-9576-b0f22fb42c54-kube-api-access-rtgrc" (OuterVolumeSpecName: "kube-api-access-rtgrc") pod "67fe265a-488f-45f8-9576-b0f22fb42c54" (UID: "67fe265a-488f-45f8-9576-b0f22fb42c54"). InnerVolumeSpecName "kube-api-access-rtgrc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 19:36:40.688818 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:40.688784 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67fe265a-488f-45f8-9576-b0f22fb42c54-util" (OuterVolumeSpecName: "util") pod "67fe265a-488f-45f8-9576-b0f22fb42c54" (UID: "67fe265a-488f-45f8-9576-b0f22fb42c54"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:36:40.784269 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:40.784160 2579 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/67fe265a-488f-45f8-9576-b0f22fb42c54-bundle\") on node \"ip-10-0-129-155.ec2.internal\" DevicePath \"\"" Apr 16 19:36:40.784269 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:40.784199 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rtgrc\" (UniqueName: \"kubernetes.io/projected/67fe265a-488f-45f8-9576-b0f22fb42c54-kube-api-access-rtgrc\") on node \"ip-10-0-129-155.ec2.internal\" DevicePath \"\"" Apr 16 19:36:40.784269 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:40.784223 2579 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/67fe265a-488f-45f8-9576-b0f22fb42c54-util\") on node \"ip-10-0-129-155.ec2.internal\" DevicePath \"\"" Apr 16 19:36:41.429105 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:41.429068 2579 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2w8kzk" event={"ID":"67fe265a-488f-45f8-9576-b0f22fb42c54","Type":"ContainerDied","Data":"af72d0bfdd804415088ee44940c4f316f2211c69ace3ce0efcc4f4651b35b435"} Apr 16 19:36:41.429105 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:41.429097 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2w8kzk" Apr 16 19:36:41.429105 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:41.429111 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af72d0bfdd804415088ee44940c4f316f2211c69ace3ce0efcc4f4651b35b435" Apr 16 19:36:42.377749 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:42.377719 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/odh-model-controller-858dbf95b8-7jbgv" Apr 16 19:36:49.621413 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:49.621372 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-njdt9"] Apr 16 19:36:49.621876 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:49.621703 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="67fe265a-488f-45f8-9576-b0f22fb42c54" containerName="util" Apr 16 19:36:49.621876 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:49.621714 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="67fe265a-488f-45f8-9576-b0f22fb42c54" containerName="util" Apr 16 19:36:49.621876 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:49.621727 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="67fe265a-488f-45f8-9576-b0f22fb42c54" containerName="extract" Apr 16 19:36:49.621876 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:49.621732 2579 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="67fe265a-488f-45f8-9576-b0f22fb42c54" containerName="extract" Apr 16 19:36:49.621876 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:49.621748 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="67fe265a-488f-45f8-9576-b0f22fb42c54" containerName="pull" Apr 16 19:36:49.621876 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:49.621754 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="67fe265a-488f-45f8-9576-b0f22fb42c54" containerName="pull" Apr 16 19:36:49.621876 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:49.621812 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="67fe265a-488f-45f8-9576-b0f22fb42c54" containerName="extract" Apr 16 19:36:49.626283 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:49.626263 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-njdt9" Apr 16 19:36:49.628924 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:49.628899 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\"" Apr 16 19:36:49.628924 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:49.628913 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\"" Apr 16 19:36:49.629112 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:49.629011 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-975hc\"" Apr 16 19:36:49.629112 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:49.628911 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\"" Apr 16 19:36:49.629318 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:49.629302 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 16 19:36:49.637464 
ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:49.637433 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-njdt9"] Apr 16 19:36:49.763000 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:49.762961 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/184964d2-5d55-4f19-b607-e94fa7d12038-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-njdt9\" (UID: \"184964d2-5d55-4f19-b607-e94fa7d12038\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-njdt9" Apr 16 19:36:49.763000 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:49.763002 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/184964d2-5d55-4f19-b607-e94fa7d12038-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-njdt9\" (UID: \"184964d2-5d55-4f19-b607-e94fa7d12038\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-njdt9" Apr 16 19:36:49.763253 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:49.763029 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/184964d2-5d55-4f19-b607-e94fa7d12038-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-njdt9\" (UID: \"184964d2-5d55-4f19-b607-e94fa7d12038\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-njdt9" Apr 16 19:36:49.763253 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:49.763083 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/184964d2-5d55-4f19-b607-e94fa7d12038-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-njdt9\" (UID: \"184964d2-5d55-4f19-b607-e94fa7d12038\") " 
pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-njdt9" Apr 16 19:36:49.763253 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:49.763099 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/184964d2-5d55-4f19-b607-e94fa7d12038-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-njdt9\" (UID: \"184964d2-5d55-4f19-b607-e94fa7d12038\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-njdt9" Apr 16 19:36:49.763253 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:49.763183 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/184964d2-5d55-4f19-b607-e94fa7d12038-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-njdt9\" (UID: \"184964d2-5d55-4f19-b607-e94fa7d12038\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-njdt9" Apr 16 19:36:49.763253 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:49.763236 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dm8lp\" (UniqueName: \"kubernetes.io/projected/184964d2-5d55-4f19-b607-e94fa7d12038-kube-api-access-dm8lp\") pod \"istiod-openshift-gateway-55ff986f96-njdt9\" (UID: \"184964d2-5d55-4f19-b607-e94fa7d12038\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-njdt9" Apr 16 19:36:49.864975 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:49.864936 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/184964d2-5d55-4f19-b607-e94fa7d12038-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-njdt9\" (UID: \"184964d2-5d55-4f19-b607-e94fa7d12038\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-njdt9" Apr 16 19:36:49.865439 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:49.865326 2579 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/184964d2-5d55-4f19-b607-e94fa7d12038-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-njdt9\" (UID: \"184964d2-5d55-4f19-b607-e94fa7d12038\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-njdt9" Apr 16 19:36:49.865613 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:49.865597 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/184964d2-5d55-4f19-b607-e94fa7d12038-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-njdt9\" (UID: \"184964d2-5d55-4f19-b607-e94fa7d12038\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-njdt9" Apr 16 19:36:49.866572 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:49.866526 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/184964d2-5d55-4f19-b607-e94fa7d12038-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-njdt9\" (UID: \"184964d2-5d55-4f19-b607-e94fa7d12038\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-njdt9" Apr 16 19:36:49.866686 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:49.866592 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dm8lp\" (UniqueName: \"kubernetes.io/projected/184964d2-5d55-4f19-b607-e94fa7d12038-kube-api-access-dm8lp\") pod \"istiod-openshift-gateway-55ff986f96-njdt9\" (UID: \"184964d2-5d55-4f19-b607-e94fa7d12038\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-njdt9" Apr 16 19:36:49.866744 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:49.866714 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/184964d2-5d55-4f19-b607-e94fa7d12038-istio-csr-ca-configmap\") pod 
\"istiod-openshift-gateway-55ff986f96-njdt9\" (UID: \"184964d2-5d55-4f19-b607-e94fa7d12038\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-njdt9" Apr 16 19:36:49.866793 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:49.866768 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/184964d2-5d55-4f19-b607-e94fa7d12038-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-njdt9\" (UID: \"184964d2-5d55-4f19-b607-e94fa7d12038\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-njdt9" Apr 16 19:36:49.867309 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:49.867287 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/184964d2-5d55-4f19-b607-e94fa7d12038-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-njdt9\" (UID: \"184964d2-5d55-4f19-b607-e94fa7d12038\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-njdt9" Apr 16 19:36:49.868117 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:49.868074 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/184964d2-5d55-4f19-b607-e94fa7d12038-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-njdt9\" (UID: \"184964d2-5d55-4f19-b607-e94fa7d12038\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-njdt9" Apr 16 19:36:49.868609 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:49.868586 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/184964d2-5d55-4f19-b607-e94fa7d12038-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-njdt9\" (UID: \"184964d2-5d55-4f19-b607-e94fa7d12038\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-njdt9" Apr 16 19:36:49.868890 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:49.868867 2579 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/184964d2-5d55-4f19-b607-e94fa7d12038-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-njdt9\" (UID: \"184964d2-5d55-4f19-b607-e94fa7d12038\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-njdt9" Apr 16 19:36:49.869312 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:49.869290 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/184964d2-5d55-4f19-b607-e94fa7d12038-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-njdt9\" (UID: \"184964d2-5d55-4f19-b607-e94fa7d12038\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-njdt9" Apr 16 19:36:49.873501 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:49.873450 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dm8lp\" (UniqueName: \"kubernetes.io/projected/184964d2-5d55-4f19-b607-e94fa7d12038-kube-api-access-dm8lp\") pod \"istiod-openshift-gateway-55ff986f96-njdt9\" (UID: \"184964d2-5d55-4f19-b607-e94fa7d12038\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-njdt9" Apr 16 19:36:49.873720 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:49.873701 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/184964d2-5d55-4f19-b607-e94fa7d12038-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-njdt9\" (UID: \"184964d2-5d55-4f19-b607-e94fa7d12038\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-njdt9" Apr 16 19:36:49.937690 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:49.937656 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-njdt9" Apr 16 19:36:50.082264 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:50.082239 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-njdt9"] Apr 16 19:36:50.085075 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:36:50.085034 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod184964d2_5d55_4f19_b607_e94fa7d12038.slice/crio-8b2f72b270f4f961123537726682bec7cfcea543e102f863bd2bbc23cc163855 WatchSource:0}: Error finding container 8b2f72b270f4f961123537726682bec7cfcea543e102f863bd2bbc23cc163855: Status 404 returned error can't find the container with id 8b2f72b270f4f961123537726682bec7cfcea543e102f863bd2bbc23cc163855 Apr 16 19:36:50.472288 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:50.472250 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-njdt9" event={"ID":"184964d2-5d55-4f19-b607-e94fa7d12038","Type":"ContainerStarted","Data":"8b2f72b270f4f961123537726682bec7cfcea543e102f863bd2bbc23cc163855"} Apr 16 19:36:51.432043 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:51.431996 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-jwhfm" Apr 16 19:36:52.759660 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:52.759614 2579 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 16 19:36:52.759985 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:52.759691 2579 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 16 19:36:53.487891 ip-10-0-129-155 
kubenswrapper[2579]: I0416 19:36:53.487853 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-njdt9" event={"ID":"184964d2-5d55-4f19-b607-e94fa7d12038","Type":"ContainerStarted","Data":"87eee24072f2ec1fc86c66ec09209d46140301e48c0582219236956d87064159"} Apr 16 19:36:53.488101 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:53.488053 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-njdt9" Apr 16 19:36:53.489919 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:53.489888 2579 patch_prober.go:28] interesting pod/istiod-openshift-gateway-55ff986f96-njdt9 container/discovery namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body= Apr 16 19:36:53.490187 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:53.490127 2579 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-njdt9" podUID="184964d2-5d55-4f19-b607-e94fa7d12038" containerName="discovery" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 19:36:53.510492 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:53.510439 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-njdt9" podStartSLOduration=1.8383116849999999 podStartE2EDuration="4.510421262s" podCreationTimestamp="2026-04-16 19:36:49 +0000 UTC" firstStartedPulling="2026-04-16 19:36:50.087255779 +0000 UTC m=+393.664033968" lastFinishedPulling="2026-04-16 19:36:52.759365353 +0000 UTC m=+396.336143545" observedRunningTime="2026-04-16 19:36:53.508156479 +0000 UTC m=+397.084934695" watchObservedRunningTime="2026-04-16 19:36:53.510421262 +0000 UTC m=+397.087199475" Apr 16 19:36:54.492495 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:36:54.492465 2579 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-njdt9" Apr 16 19:37:05.393138 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:05.393105 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/kserve-controller-manager-856948b99f-44kfz" Apr 16 19:37:36.448392 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:36.448307 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef179xt7"] Apr 16 19:37:36.451873 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:36.451852 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef179xt7" Apr 16 19:37:36.454628 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:36.454601 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 16 19:37:36.455669 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:36.455645 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-5m27b\"" Apr 16 19:37:36.455780 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:36.455646 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 16 19:37:36.459240 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:36.459189 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef179xt7"] Apr 16 19:37:36.531489 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:36.531459 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/88506021-904e-4fc4-96fa-64e8c06a4b94-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef179xt7\" (UID: 
\"88506021-904e-4fc4-96fa-64e8c06a4b94\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef179xt7" Apr 16 19:37:36.531605 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:36.531513 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/88506021-904e-4fc4-96fa-64e8c06a4b94-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef179xt7\" (UID: \"88506021-904e-4fc4-96fa-64e8c06a4b94\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef179xt7" Apr 16 19:37:36.531605 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:36.531577 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wchjx\" (UniqueName: \"kubernetes.io/projected/88506021-904e-4fc4-96fa-64e8c06a4b94-kube-api-access-wchjx\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef179xt7\" (UID: \"88506021-904e-4fc4-96fa-64e8c06a4b94\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef179xt7" Apr 16 19:37:36.632070 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:36.632045 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/88506021-904e-4fc4-96fa-64e8c06a4b94-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef179xt7\" (UID: \"88506021-904e-4fc4-96fa-64e8c06a4b94\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef179xt7" Apr 16 19:37:36.632184 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:36.632078 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wchjx\" (UniqueName: \"kubernetes.io/projected/88506021-904e-4fc4-96fa-64e8c06a4b94-kube-api-access-wchjx\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef179xt7\" (UID: 
\"88506021-904e-4fc4-96fa-64e8c06a4b94\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef179xt7" Apr 16 19:37:36.632184 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:36.632124 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/88506021-904e-4fc4-96fa-64e8c06a4b94-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef179xt7\" (UID: \"88506021-904e-4fc4-96fa-64e8c06a4b94\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef179xt7" Apr 16 19:37:36.632463 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:36.632442 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/88506021-904e-4fc4-96fa-64e8c06a4b94-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef179xt7\" (UID: \"88506021-904e-4fc4-96fa-64e8c06a4b94\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef179xt7" Apr 16 19:37:36.632522 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:36.632467 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/88506021-904e-4fc4-96fa-64e8c06a4b94-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef179xt7\" (UID: \"88506021-904e-4fc4-96fa-64e8c06a4b94\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef179xt7" Apr 16 19:37:36.641178 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:36.641156 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wchjx\" (UniqueName: \"kubernetes.io/projected/88506021-904e-4fc4-96fa-64e8c06a4b94-kube-api-access-wchjx\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef179xt7\" (UID: \"88506021-904e-4fc4-96fa-64e8c06a4b94\") " 
pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef179xt7" Apr 16 19:37:36.763401 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:36.763341 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef179xt7" Apr 16 19:37:36.887918 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:36.887890 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef179xt7"] Apr 16 19:37:36.890005 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:37:36.889980 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88506021_904e_4fc4_96fa_64e8c06a4b94.slice/crio-7f185276775cc1197c5cdf24bb98f88747eaac4b73d3782c9cf4af7dc0450a74 WatchSource:0}: Error finding container 7f185276775cc1197c5cdf24bb98f88747eaac4b73d3782c9cf4af7dc0450a74: Status 404 returned error can't find the container with id 7f185276775cc1197c5cdf24bb98f88747eaac4b73d3782c9cf4af7dc0450a74 Apr 16 19:37:37.221041 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:37.221004 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759j5lst"] Apr 16 19:37:37.224482 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:37.224463 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759j5lst" Apr 16 19:37:37.232532 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:37.232510 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759j5lst"] Apr 16 19:37:37.336996 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:37.336969 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9714675c-b0ed-4114-bb4b-562f793075e3-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759j5lst\" (UID: \"9714675c-b0ed-4114-bb4b-562f793075e3\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759j5lst" Apr 16 19:37:37.337134 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:37.337017 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9714675c-b0ed-4114-bb4b-562f793075e3-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759j5lst\" (UID: \"9714675c-b0ed-4114-bb4b-562f793075e3\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759j5lst" Apr 16 19:37:37.337134 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:37.337090 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xk2n\" (UniqueName: \"kubernetes.io/projected/9714675c-b0ed-4114-bb4b-562f793075e3-kube-api-access-2xk2n\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759j5lst\" (UID: \"9714675c-b0ed-4114-bb4b-562f793075e3\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759j5lst" Apr 16 19:37:37.438301 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:37.438274 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" 
(UniqueName: \"kubernetes.io/empty-dir/9714675c-b0ed-4114-bb4b-562f793075e3-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759j5lst\" (UID: \"9714675c-b0ed-4114-bb4b-562f793075e3\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759j5lst" Apr 16 19:37:37.438390 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:37.438327 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2xk2n\" (UniqueName: \"kubernetes.io/projected/9714675c-b0ed-4114-bb4b-562f793075e3-kube-api-access-2xk2n\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759j5lst\" (UID: \"9714675c-b0ed-4114-bb4b-562f793075e3\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759j5lst" Apr 16 19:37:37.438390 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:37.438369 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9714675c-b0ed-4114-bb4b-562f793075e3-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759j5lst\" (UID: \"9714675c-b0ed-4114-bb4b-562f793075e3\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759j5lst" Apr 16 19:37:37.438613 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:37.438596 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9714675c-b0ed-4114-bb4b-562f793075e3-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759j5lst\" (UID: \"9714675c-b0ed-4114-bb4b-562f793075e3\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759j5lst" Apr 16 19:37:37.438719 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:37.438703 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9714675c-b0ed-4114-bb4b-562f793075e3-util\") pod 
\"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759j5lst\" (UID: \"9714675c-b0ed-4114-bb4b-562f793075e3\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759j5lst" Apr 16 19:37:37.446946 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:37.446925 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xk2n\" (UniqueName: \"kubernetes.io/projected/9714675c-b0ed-4114-bb4b-562f793075e3-kube-api-access-2xk2n\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759j5lst\" (UID: \"9714675c-b0ed-4114-bb4b-562f793075e3\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759j5lst" Apr 16 19:37:37.534076 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:37.534025 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759j5lst" Apr 16 19:37:37.657409 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:37.657384 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759j5lst"] Apr 16 19:37:37.658913 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:37:37.658885 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9714675c_b0ed_4114_bb4b_562f793075e3.slice/crio-cb80057ab78d9f42b95d48d2e8fe75db310d632751bf72330337928a329e454e WatchSource:0}: Error finding container cb80057ab78d9f42b95d48d2e8fe75db310d632751bf72330337928a329e454e: Status 404 returned error can't find the container with id cb80057ab78d9f42b95d48d2e8fe75db310d632751bf72330337928a329e454e Apr 16 19:37:37.669279 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:37.669253 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759j5lst" 
event={"ID":"9714675c-b0ed-4114-bb4b-562f793075e3","Type":"ContainerStarted","Data":"cb80057ab78d9f42b95d48d2e8fe75db310d632751bf72330337928a329e454e"} Apr 16 19:37:37.670587 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:37.670559 2579 generic.go:358] "Generic (PLEG): container finished" podID="88506021-904e-4fc4-96fa-64e8c06a4b94" containerID="0e4b850b1f982e049d8deb0f958e97cbd2c1e5c01d7afad05c382d039116e5db" exitCode=0 Apr 16 19:37:37.670682 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:37.670628 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef179xt7" event={"ID":"88506021-904e-4fc4-96fa-64e8c06a4b94","Type":"ContainerDied","Data":"0e4b850b1f982e049d8deb0f958e97cbd2c1e5c01d7afad05c382d039116e5db"} Apr 16 19:37:37.670682 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:37.670651 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef179xt7" event={"ID":"88506021-904e-4fc4-96fa-64e8c06a4b94","Type":"ContainerStarted","Data":"7f185276775cc1197c5cdf24bb98f88747eaac4b73d3782c9cf4af7dc0450a74"} Apr 16 19:37:38.027790 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:38.027760 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qnwdd"] Apr 16 19:37:38.031131 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:38.031115 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qnwdd" Apr 16 19:37:38.039277 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:38.039255 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qnwdd"] Apr 16 19:37:38.142924 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:38.142901 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fb5dbc0b-c8e2-4c35-b878-e4ceef3332cd-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qnwdd\" (UID: \"fb5dbc0b-c8e2-4c35-b878-e4ceef3332cd\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qnwdd" Apr 16 19:37:38.143043 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:38.142989 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fb5dbc0b-c8e2-4c35-b878-e4ceef3332cd-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qnwdd\" (UID: \"fb5dbc0b-c8e2-4c35-b878-e4ceef3332cd\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qnwdd" Apr 16 19:37:38.143043 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:38.143015 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5brx\" (UniqueName: \"kubernetes.io/projected/fb5dbc0b-c8e2-4c35-b878-e4ceef3332cd-kube-api-access-d5brx\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qnwdd\" (UID: \"fb5dbc0b-c8e2-4c35-b878-e4ceef3332cd\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qnwdd" Apr 16 19:37:38.244153 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:38.244126 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" 
(UniqueName: \"kubernetes.io/empty-dir/fb5dbc0b-c8e2-4c35-b878-e4ceef3332cd-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qnwdd\" (UID: \"fb5dbc0b-c8e2-4c35-b878-e4ceef3332cd\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qnwdd" Apr 16 19:37:38.244280 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:38.244160 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d5brx\" (UniqueName: \"kubernetes.io/projected/fb5dbc0b-c8e2-4c35-b878-e4ceef3332cd-kube-api-access-d5brx\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qnwdd\" (UID: \"fb5dbc0b-c8e2-4c35-b878-e4ceef3332cd\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qnwdd" Apr 16 19:37:38.244280 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:38.244198 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fb5dbc0b-c8e2-4c35-b878-e4ceef3332cd-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qnwdd\" (UID: \"fb5dbc0b-c8e2-4c35-b878-e4ceef3332cd\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qnwdd" Apr 16 19:37:38.244561 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:38.244538 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fb5dbc0b-c8e2-4c35-b878-e4ceef3332cd-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qnwdd\" (UID: \"fb5dbc0b-c8e2-4c35-b878-e4ceef3332cd\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qnwdd" Apr 16 19:37:38.244628 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:38.244575 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fb5dbc0b-c8e2-4c35-b878-e4ceef3332cd-bundle\") pod 
\"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qnwdd\" (UID: \"fb5dbc0b-c8e2-4c35-b878-e4ceef3332cd\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qnwdd" Apr 16 19:37:38.252634 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:38.252611 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5brx\" (UniqueName: \"kubernetes.io/projected/fb5dbc0b-c8e2-4c35-b878-e4ceef3332cd-kube-api-access-d5brx\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qnwdd\" (UID: \"fb5dbc0b-c8e2-4c35-b878-e4ceef3332cd\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qnwdd" Apr 16 19:37:38.340327 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:38.340274 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qnwdd" Apr 16 19:37:38.485715 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:38.485692 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qnwdd"] Apr 16 19:37:38.521776 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:37:38.521744 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb5dbc0b_c8e2_4c35_b878_e4ceef3332cd.slice/crio-dafefe3f6417bb8f706eb5ec92222e8c382212c40fc481f57378118fd23ab50e WatchSource:0}: Error finding container dafefe3f6417bb8f706eb5ec92222e8c382212c40fc481f57378118fd23ab50e: Status 404 returned error can't find the container with id dafefe3f6417bb8f706eb5ec92222e8c382212c40fc481f57378118fd23ab50e Apr 16 19:37:38.634777 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:38.634750 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73l4t9j"] Apr 16 19:37:38.638421 ip-10-0-129-155 
kubenswrapper[2579]: I0416 19:37:38.638402 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73l4t9j" Apr 16 19:37:38.648253 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:38.648194 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73l4t9j"] Apr 16 19:37:38.675547 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:38.675520 2579 generic.go:358] "Generic (PLEG): container finished" podID="88506021-904e-4fc4-96fa-64e8c06a4b94" containerID="f033175e86e9c6787952729911fe0c47fbe6c7fcc5b2e9364b0077657f8b5802" exitCode=0 Apr 16 19:37:38.675632 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:38.675597 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef179xt7" event={"ID":"88506021-904e-4fc4-96fa-64e8c06a4b94","Type":"ContainerDied","Data":"f033175e86e9c6787952729911fe0c47fbe6c7fcc5b2e9364b0077657f8b5802"} Apr 16 19:37:38.676871 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:38.676840 2579 generic.go:358] "Generic (PLEG): container finished" podID="fb5dbc0b-c8e2-4c35-b878-e4ceef3332cd" containerID="9be6ca19e57237ddf69fe427831512a74a3680e081400a3b24f283daed378e48" exitCode=0 Apr 16 19:37:38.676966 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:38.676916 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qnwdd" event={"ID":"fb5dbc0b-c8e2-4c35-b878-e4ceef3332cd","Type":"ContainerDied","Data":"9be6ca19e57237ddf69fe427831512a74a3680e081400a3b24f283daed378e48"} Apr 16 19:37:38.676966 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:38.676948 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qnwdd" 
event={"ID":"fb5dbc0b-c8e2-4c35-b878-e4ceef3332cd","Type":"ContainerStarted","Data":"dafefe3f6417bb8f706eb5ec92222e8c382212c40fc481f57378118fd23ab50e"} Apr 16 19:37:38.678394 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:38.678373 2579 generic.go:358] "Generic (PLEG): container finished" podID="9714675c-b0ed-4114-bb4b-562f793075e3" containerID="5155c4c5c8ece76074051e7ee4b7109c068ad2a777b303bbe98e6a17fe0cd23e" exitCode=0 Apr 16 19:37:38.678538 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:38.678408 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759j5lst" event={"ID":"9714675c-b0ed-4114-bb4b-562f793075e3","Type":"ContainerDied","Data":"5155c4c5c8ece76074051e7ee4b7109c068ad2a777b303bbe98e6a17fe0cd23e"} Apr 16 19:37:38.758096 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:38.758073 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8303a68e-9421-4752-a46a-4c94b985677c-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73l4t9j\" (UID: \"8303a68e-9421-4752-a46a-4c94b985677c\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73l4t9j" Apr 16 19:37:38.758243 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:38.758220 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cpjt\" (UniqueName: \"kubernetes.io/projected/8303a68e-9421-4752-a46a-4c94b985677c-kube-api-access-5cpjt\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73l4t9j\" (UID: \"8303a68e-9421-4752-a46a-4c94b985677c\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73l4t9j" Apr 16 19:37:38.758321 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:38.758287 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" 
(UniqueName: \"kubernetes.io/empty-dir/8303a68e-9421-4752-a46a-4c94b985677c-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73l4t9j\" (UID: \"8303a68e-9421-4752-a46a-4c94b985677c\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73l4t9j" Apr 16 19:37:38.859570 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:38.859516 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5cpjt\" (UniqueName: \"kubernetes.io/projected/8303a68e-9421-4752-a46a-4c94b985677c-kube-api-access-5cpjt\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73l4t9j\" (UID: \"8303a68e-9421-4752-a46a-4c94b985677c\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73l4t9j" Apr 16 19:37:38.859570 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:38.859549 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8303a68e-9421-4752-a46a-4c94b985677c-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73l4t9j\" (UID: \"8303a68e-9421-4752-a46a-4c94b985677c\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73l4t9j" Apr 16 19:37:38.859703 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:38.859671 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8303a68e-9421-4752-a46a-4c94b985677c-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73l4t9j\" (UID: \"8303a68e-9421-4752-a46a-4c94b985677c\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73l4t9j" Apr 16 19:37:38.859891 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:38.859871 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8303a68e-9421-4752-a46a-4c94b985677c-bundle\") pod 
\"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73l4t9j\" (UID: \"8303a68e-9421-4752-a46a-4c94b985677c\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73l4t9j" Apr 16 19:37:38.859966 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:38.859947 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8303a68e-9421-4752-a46a-4c94b985677c-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73l4t9j\" (UID: \"8303a68e-9421-4752-a46a-4c94b985677c\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73l4t9j" Apr 16 19:37:38.867952 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:38.867932 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cpjt\" (UniqueName: \"kubernetes.io/projected/8303a68e-9421-4752-a46a-4c94b985677c-kube-api-access-5cpjt\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73l4t9j\" (UID: \"8303a68e-9421-4752-a46a-4c94b985677c\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73l4t9j" Apr 16 19:37:39.046082 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:39.046053 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73l4t9j" Apr 16 19:37:39.168026 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:39.167941 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73l4t9j"] Apr 16 19:37:39.169908 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:37:39.169879 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8303a68e_9421_4752_a46a_4c94b985677c.slice/crio-e76a65f2201019c1f09c99782b36efe54d8d193b3f6318d3d20efc25e936cebd WatchSource:0}: Error finding container e76a65f2201019c1f09c99782b36efe54d8d193b3f6318d3d20efc25e936cebd: Status 404 returned error can't find the container with id e76a65f2201019c1f09c99782b36efe54d8d193b3f6318d3d20efc25e936cebd Apr 16 19:37:39.683486 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:39.683456 2579 generic.go:358] "Generic (PLEG): container finished" podID="9714675c-b0ed-4114-bb4b-562f793075e3" containerID="c440f034529de2b8822bba8f3ba343d46dabae3cf9891bb71c677236fdb8bdd2" exitCode=0 Apr 16 19:37:39.683821 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:39.683521 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759j5lst" event={"ID":"9714675c-b0ed-4114-bb4b-562f793075e3","Type":"ContainerDied","Data":"c440f034529de2b8822bba8f3ba343d46dabae3cf9891bb71c677236fdb8bdd2"} Apr 16 19:37:39.684895 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:39.684871 2579 generic.go:358] "Generic (PLEG): container finished" podID="8303a68e-9421-4752-a46a-4c94b985677c" containerID="363f67e252a29328050846dd742912733c70388f0dc298937c595b85aa1c813b" exitCode=0 Apr 16 19:37:39.685007 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:39.684950 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73l4t9j" event={"ID":"8303a68e-9421-4752-a46a-4c94b985677c","Type":"ContainerDied","Data":"363f67e252a29328050846dd742912733c70388f0dc298937c595b85aa1c813b"} Apr 16 19:37:39.685007 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:39.684985 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73l4t9j" event={"ID":"8303a68e-9421-4752-a46a-4c94b985677c","Type":"ContainerStarted","Data":"e76a65f2201019c1f09c99782b36efe54d8d193b3f6318d3d20efc25e936cebd"} Apr 16 19:37:39.686942 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:39.686921 2579 generic.go:358] "Generic (PLEG): container finished" podID="88506021-904e-4fc4-96fa-64e8c06a4b94" containerID="cc650a22de94095ca5349758ae73fcecb80f091adc38add581d925d898024bdf" exitCode=0 Apr 16 19:37:39.687041 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:39.686950 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef179xt7" event={"ID":"88506021-904e-4fc4-96fa-64e8c06a4b94","Type":"ContainerDied","Data":"cc650a22de94095ca5349758ae73fcecb80f091adc38add581d925d898024bdf"} Apr 16 19:37:39.688829 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:39.688786 2579 generic.go:358] "Generic (PLEG): container finished" podID="fb5dbc0b-c8e2-4c35-b878-e4ceef3332cd" containerID="69609f52d11ad28e2a129919b29fb72d21fbd805ee84679eb8c4779ab3ca0f63" exitCode=0 Apr 16 19:37:39.688829 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:39.688820 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qnwdd" event={"ID":"fb5dbc0b-c8e2-4c35-b878-e4ceef3332cd","Type":"ContainerDied","Data":"69609f52d11ad28e2a129919b29fb72d21fbd805ee84679eb8c4779ab3ca0f63"} Apr 16 19:37:40.694304 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:40.694266 2579 
generic.go:358] "Generic (PLEG): container finished" podID="9714675c-b0ed-4114-bb4b-562f793075e3" containerID="31965fe1dd7fb8f2a871655d8a37a27fb3284eb3497333118f9a386162258359" exitCode=0 Apr 16 19:37:40.694703 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:40.694344 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759j5lst" event={"ID":"9714675c-b0ed-4114-bb4b-562f793075e3","Type":"ContainerDied","Data":"31965fe1dd7fb8f2a871655d8a37a27fb3284eb3497333118f9a386162258359"} Apr 16 19:37:40.695923 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:40.695901 2579 generic.go:358] "Generic (PLEG): container finished" podID="8303a68e-9421-4752-a46a-4c94b985677c" containerID="477a28a347fde7141c65efe82c4ca699e307aaa425c48bce627341741bfa6095" exitCode=0 Apr 16 19:37:40.696037 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:40.695991 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73l4t9j" event={"ID":"8303a68e-9421-4752-a46a-4c94b985677c","Type":"ContainerDied","Data":"477a28a347fde7141c65efe82c4ca699e307aaa425c48bce627341741bfa6095"} Apr 16 19:37:40.698258 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:40.698236 2579 generic.go:358] "Generic (PLEG): container finished" podID="fb5dbc0b-c8e2-4c35-b878-e4ceef3332cd" containerID="5810ba206721904b0298b14b37e6b539954355f8b27c8a20c11ec785cd54168f" exitCode=0 Apr 16 19:37:40.698359 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:40.698311 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qnwdd" event={"ID":"fb5dbc0b-c8e2-4c35-b878-e4ceef3332cd","Type":"ContainerDied","Data":"5810ba206721904b0298b14b37e6b539954355f8b27c8a20c11ec785cd54168f"} Apr 16 19:37:40.838689 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:40.838661 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef179xt7" Apr 16 19:37:40.978326 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:40.978289 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/88506021-904e-4fc4-96fa-64e8c06a4b94-bundle\") pod \"88506021-904e-4fc4-96fa-64e8c06a4b94\" (UID: \"88506021-904e-4fc4-96fa-64e8c06a4b94\") " Apr 16 19:37:40.978489 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:40.978334 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/88506021-904e-4fc4-96fa-64e8c06a4b94-util\") pod \"88506021-904e-4fc4-96fa-64e8c06a4b94\" (UID: \"88506021-904e-4fc4-96fa-64e8c06a4b94\") " Apr 16 19:37:40.978489 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:40.978357 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wchjx\" (UniqueName: \"kubernetes.io/projected/88506021-904e-4fc4-96fa-64e8c06a4b94-kube-api-access-wchjx\") pod \"88506021-904e-4fc4-96fa-64e8c06a4b94\" (UID: \"88506021-904e-4fc4-96fa-64e8c06a4b94\") " Apr 16 19:37:40.978816 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:40.978784 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88506021-904e-4fc4-96fa-64e8c06a4b94-bundle" (OuterVolumeSpecName: "bundle") pod "88506021-904e-4fc4-96fa-64e8c06a4b94" (UID: "88506021-904e-4fc4-96fa-64e8c06a4b94"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:37:40.980637 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:40.980612 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88506021-904e-4fc4-96fa-64e8c06a4b94-kube-api-access-wchjx" (OuterVolumeSpecName: "kube-api-access-wchjx") pod "88506021-904e-4fc4-96fa-64e8c06a4b94" (UID: "88506021-904e-4fc4-96fa-64e8c06a4b94"). InnerVolumeSpecName "kube-api-access-wchjx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 19:37:40.983766 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:40.983743 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88506021-904e-4fc4-96fa-64e8c06a4b94-util" (OuterVolumeSpecName: "util") pod "88506021-904e-4fc4-96fa-64e8c06a4b94" (UID: "88506021-904e-4fc4-96fa-64e8c06a4b94"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:37:41.079676 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:41.079636 2579 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/88506021-904e-4fc4-96fa-64e8c06a4b94-bundle\") on node \"ip-10-0-129-155.ec2.internal\" DevicePath \"\"" Apr 16 19:37:41.079676 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:41.079669 2579 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/88506021-904e-4fc4-96fa-64e8c06a4b94-util\") on node \"ip-10-0-129-155.ec2.internal\" DevicePath \"\"" Apr 16 19:37:41.079676 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:41.079679 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wchjx\" (UniqueName: \"kubernetes.io/projected/88506021-904e-4fc4-96fa-64e8c06a4b94-kube-api-access-wchjx\") on node \"ip-10-0-129-155.ec2.internal\" DevicePath \"\"" Apr 16 19:37:41.703778 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:41.703743 2579 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef179xt7" event={"ID":"88506021-904e-4fc4-96fa-64e8c06a4b94","Type":"ContainerDied","Data":"7f185276775cc1197c5cdf24bb98f88747eaac4b73d3782c9cf4af7dc0450a74"} Apr 16 19:37:41.703778 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:41.703772 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef179xt7" Apr 16 19:37:41.704360 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:41.703779 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f185276775cc1197c5cdf24bb98f88747eaac4b73d3782c9cf4af7dc0450a74" Apr 16 19:37:41.705783 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:41.705758 2579 generic.go:358] "Generic (PLEG): container finished" podID="8303a68e-9421-4752-a46a-4c94b985677c" containerID="b57b17f045a7e4e45edae196d89349a12f4ca564a996d9433bbdd3bc9ab33ddd" exitCode=0 Apr 16 19:37:41.705917 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:41.705790 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73l4t9j" event={"ID":"8303a68e-9421-4752-a46a-4c94b985677c","Type":"ContainerDied","Data":"b57b17f045a7e4e45edae196d89349a12f4ca564a996d9433bbdd3bc9ab33ddd"} Apr 16 19:37:41.865850 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:41.865824 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759j5lst" Apr 16 19:37:41.871856 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:41.871833 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qnwdd" Apr 16 19:37:41.986883 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:41.986851 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fb5dbc0b-c8e2-4c35-b878-e4ceef3332cd-util\") pod \"fb5dbc0b-c8e2-4c35-b878-e4ceef3332cd\" (UID: \"fb5dbc0b-c8e2-4c35-b878-e4ceef3332cd\") " Apr 16 19:37:41.987099 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:41.986891 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9714675c-b0ed-4114-bb4b-562f793075e3-util\") pod \"9714675c-b0ed-4114-bb4b-562f793075e3\" (UID: \"9714675c-b0ed-4114-bb4b-562f793075e3\") " Apr 16 19:37:41.987099 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:41.986931 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fb5dbc0b-c8e2-4c35-b878-e4ceef3332cd-bundle\") pod \"fb5dbc0b-c8e2-4c35-b878-e4ceef3332cd\" (UID: \"fb5dbc0b-c8e2-4c35-b878-e4ceef3332cd\") " Apr 16 19:37:41.987099 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:41.986974 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5brx\" (UniqueName: \"kubernetes.io/projected/fb5dbc0b-c8e2-4c35-b878-e4ceef3332cd-kube-api-access-d5brx\") pod \"fb5dbc0b-c8e2-4c35-b878-e4ceef3332cd\" (UID: \"fb5dbc0b-c8e2-4c35-b878-e4ceef3332cd\") " Apr 16 19:37:41.987099 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:41.987014 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9714675c-b0ed-4114-bb4b-562f793075e3-bundle\") pod \"9714675c-b0ed-4114-bb4b-562f793075e3\" (UID: \"9714675c-b0ed-4114-bb4b-562f793075e3\") " Apr 16 19:37:41.987099 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:41.987044 
2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2xk2n\" (UniqueName: \"kubernetes.io/projected/9714675c-b0ed-4114-bb4b-562f793075e3-kube-api-access-2xk2n\") pod \"9714675c-b0ed-4114-bb4b-562f793075e3\" (UID: \"9714675c-b0ed-4114-bb4b-562f793075e3\") " Apr 16 19:37:41.987556 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:41.987523 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9714675c-b0ed-4114-bb4b-562f793075e3-bundle" (OuterVolumeSpecName: "bundle") pod "9714675c-b0ed-4114-bb4b-562f793075e3" (UID: "9714675c-b0ed-4114-bb4b-562f793075e3"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:37:41.987691 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:41.987580 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb5dbc0b-c8e2-4c35-b878-e4ceef3332cd-bundle" (OuterVolumeSpecName: "bundle") pod "fb5dbc0b-c8e2-4c35-b878-e4ceef3332cd" (UID: "fb5dbc0b-c8e2-4c35-b878-e4ceef3332cd"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:37:41.989436 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:41.989409 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb5dbc0b-c8e2-4c35-b878-e4ceef3332cd-kube-api-access-d5brx" (OuterVolumeSpecName: "kube-api-access-d5brx") pod "fb5dbc0b-c8e2-4c35-b878-e4ceef3332cd" (UID: "fb5dbc0b-c8e2-4c35-b878-e4ceef3332cd"). InnerVolumeSpecName "kube-api-access-d5brx". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 19:37:41.989585 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:41.989569 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9714675c-b0ed-4114-bb4b-562f793075e3-kube-api-access-2xk2n" (OuterVolumeSpecName: "kube-api-access-2xk2n") pod "9714675c-b0ed-4114-bb4b-562f793075e3" (UID: "9714675c-b0ed-4114-bb4b-562f793075e3"). InnerVolumeSpecName "kube-api-access-2xk2n". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 19:37:41.993189 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:41.993155 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9714675c-b0ed-4114-bb4b-562f793075e3-util" (OuterVolumeSpecName: "util") pod "9714675c-b0ed-4114-bb4b-562f793075e3" (UID: "9714675c-b0ed-4114-bb4b-562f793075e3"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:37:41.994596 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:41.994574 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb5dbc0b-c8e2-4c35-b878-e4ceef3332cd-util" (OuterVolumeSpecName: "util") pod "fb5dbc0b-c8e2-4c35-b878-e4ceef3332cd" (UID: "fb5dbc0b-c8e2-4c35-b878-e4ceef3332cd"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:37:42.088599 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:42.088549 2579 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9714675c-b0ed-4114-bb4b-562f793075e3-bundle\") on node \"ip-10-0-129-155.ec2.internal\" DevicePath \"\"" Apr 16 19:37:42.088599 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:42.088594 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2xk2n\" (UniqueName: \"kubernetes.io/projected/9714675c-b0ed-4114-bb4b-562f793075e3-kube-api-access-2xk2n\") on node \"ip-10-0-129-155.ec2.internal\" DevicePath \"\"" Apr 16 19:37:42.088808 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:42.088609 2579 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/fb5dbc0b-c8e2-4c35-b878-e4ceef3332cd-util\") on node \"ip-10-0-129-155.ec2.internal\" DevicePath \"\"" Apr 16 19:37:42.088808 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:42.088626 2579 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9714675c-b0ed-4114-bb4b-562f793075e3-util\") on node \"ip-10-0-129-155.ec2.internal\" DevicePath \"\"" Apr 16 19:37:42.088808 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:42.088638 2579 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/fb5dbc0b-c8e2-4c35-b878-e4ceef3332cd-bundle\") on node \"ip-10-0-129-155.ec2.internal\" DevicePath \"\"" Apr 16 19:37:42.088808 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:42.088650 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-d5brx\" (UniqueName: \"kubernetes.io/projected/fb5dbc0b-c8e2-4c35-b878-e4ceef3332cd-kube-api-access-d5brx\") on node \"ip-10-0-129-155.ec2.internal\" DevicePath \"\"" Apr 16 19:37:42.711690 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:42.711661 2579 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759j5lst" Apr 16 19:37:42.711690 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:42.711670 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759j5lst" event={"ID":"9714675c-b0ed-4114-bb4b-562f793075e3","Type":"ContainerDied","Data":"cb80057ab78d9f42b95d48d2e8fe75db310d632751bf72330337928a329e454e"} Apr 16 19:37:42.712109 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:42.711707 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb80057ab78d9f42b95d48d2e8fe75db310d632751bf72330337928a329e454e" Apr 16 19:37:42.713543 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:42.713518 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qnwdd" event={"ID":"fb5dbc0b-c8e2-4c35-b878-e4ceef3332cd","Type":"ContainerDied","Data":"dafefe3f6417bb8f706eb5ec92222e8c382212c40fc481f57378118fd23ab50e"} Apr 16 19:37:42.713654 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:42.713548 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dafefe3f6417bb8f706eb5ec92222e8c382212c40fc481f57378118fd23ab50e" Apr 16 19:37:42.713728 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:42.713711 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qnwdd" Apr 16 19:37:42.843767 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:42.843742 2579 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73l4t9j" Apr 16 19:37:42.996317 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:42.996196 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8303a68e-9421-4752-a46a-4c94b985677c-util\") pod \"8303a68e-9421-4752-a46a-4c94b985677c\" (UID: \"8303a68e-9421-4752-a46a-4c94b985677c\") " Apr 16 19:37:42.996317 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:42.996282 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5cpjt\" (UniqueName: \"kubernetes.io/projected/8303a68e-9421-4752-a46a-4c94b985677c-kube-api-access-5cpjt\") pod \"8303a68e-9421-4752-a46a-4c94b985677c\" (UID: \"8303a68e-9421-4752-a46a-4c94b985677c\") " Apr 16 19:37:42.996563 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:42.996338 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8303a68e-9421-4752-a46a-4c94b985677c-bundle\") pod \"8303a68e-9421-4752-a46a-4c94b985677c\" (UID: \"8303a68e-9421-4752-a46a-4c94b985677c\") " Apr 16 19:37:42.996943 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:42.996918 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8303a68e-9421-4752-a46a-4c94b985677c-bundle" (OuterVolumeSpecName: "bundle") pod "8303a68e-9421-4752-a46a-4c94b985677c" (UID: "8303a68e-9421-4752-a46a-4c94b985677c"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:37:42.998584 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:42.998558 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8303a68e-9421-4752-a46a-4c94b985677c-kube-api-access-5cpjt" (OuterVolumeSpecName: "kube-api-access-5cpjt") pod "8303a68e-9421-4752-a46a-4c94b985677c" (UID: "8303a68e-9421-4752-a46a-4c94b985677c"). InnerVolumeSpecName "kube-api-access-5cpjt". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 19:37:43.001934 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:43.001912 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8303a68e-9421-4752-a46a-4c94b985677c-util" (OuterVolumeSpecName: "util") pod "8303a68e-9421-4752-a46a-4c94b985677c" (UID: "8303a68e-9421-4752-a46a-4c94b985677c"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 19:37:43.098043 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:43.097998 2579 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8303a68e-9421-4752-a46a-4c94b985677c-util\") on node \"ip-10-0-129-155.ec2.internal\" DevicePath \"\"" Apr 16 19:37:43.098043 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:43.098041 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5cpjt\" (UniqueName: \"kubernetes.io/projected/8303a68e-9421-4752-a46a-4c94b985677c-kube-api-access-5cpjt\") on node \"ip-10-0-129-155.ec2.internal\" DevicePath \"\"" Apr 16 19:37:43.098043 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:43.098057 2579 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8303a68e-9421-4752-a46a-4c94b985677c-bundle\") on node \"ip-10-0-129-155.ec2.internal\" DevicePath \"\"" Apr 16 19:37:43.719237 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:43.719179 2579 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73l4t9j" event={"ID":"8303a68e-9421-4752-a46a-4c94b985677c","Type":"ContainerDied","Data":"e76a65f2201019c1f09c99782b36efe54d8d193b3f6318d3d20efc25e936cebd"} Apr 16 19:37:43.719237 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:43.719239 2579 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e76a65f2201019c1f09c99782b36efe54d8d193b3f6318d3d20efc25e936cebd" Apr 16 19:37:43.719660 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:43.719261 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73l4t9j" Apr 16 19:37:54.022544 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:54.022498 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-4qcfw"] Apr 16 19:37:54.022941 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:54.022862 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="88506021-904e-4fc4-96fa-64e8c06a4b94" containerName="extract" Apr 16 19:37:54.022941 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:54.022874 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="88506021-904e-4fc4-96fa-64e8c06a4b94" containerName="extract" Apr 16 19:37:54.022941 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:54.022886 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fb5dbc0b-c8e2-4c35-b878-e4ceef3332cd" containerName="util" Apr 16 19:37:54.022941 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:54.022892 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb5dbc0b-c8e2-4c35-b878-e4ceef3332cd" containerName="util" Apr 16 19:37:54.022941 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:54.022899 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="8303a68e-9421-4752-a46a-4c94b985677c" containerName="extract" Apr 16 19:37:54.022941 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:54.022904 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="8303a68e-9421-4752-a46a-4c94b985677c" containerName="extract" Apr 16 19:37:54.022941 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:54.022912 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9714675c-b0ed-4114-bb4b-562f793075e3" containerName="pull" Apr 16 19:37:54.022941 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:54.022917 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="9714675c-b0ed-4114-bb4b-562f793075e3" containerName="pull" Apr 16 19:37:54.022941 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:54.022923 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8303a68e-9421-4752-a46a-4c94b985677c" containerName="pull" Apr 16 19:37:54.022941 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:54.022928 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="8303a68e-9421-4752-a46a-4c94b985677c" containerName="pull" Apr 16 19:37:54.022941 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:54.022942 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="88506021-904e-4fc4-96fa-64e8c06a4b94" containerName="pull" Apr 16 19:37:54.022941 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:54.022947 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="88506021-904e-4fc4-96fa-64e8c06a4b94" containerName="pull" Apr 16 19:37:54.023369 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:54.022955 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="88506021-904e-4fc4-96fa-64e8c06a4b94" containerName="util" Apr 16 19:37:54.023369 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:54.022960 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="88506021-904e-4fc4-96fa-64e8c06a4b94" containerName="util" Apr 16 
19:37:54.023369 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:54.022967 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fb5dbc0b-c8e2-4c35-b878-e4ceef3332cd" containerName="pull"
Apr 16 19:37:54.023369 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:54.022972 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb5dbc0b-c8e2-4c35-b878-e4ceef3332cd" containerName="pull"
Apr 16 19:37:54.023369 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:54.022980 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9714675c-b0ed-4114-bb4b-562f793075e3" containerName="util"
Apr 16 19:37:54.023369 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:54.022985 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="9714675c-b0ed-4114-bb4b-562f793075e3" containerName="util"
Apr 16 19:37:54.023369 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:54.022991 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9714675c-b0ed-4114-bb4b-562f793075e3" containerName="extract"
Apr 16 19:37:54.023369 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:54.022996 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="9714675c-b0ed-4114-bb4b-562f793075e3" containerName="extract"
Apr 16 19:37:54.023369 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:54.023001 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fb5dbc0b-c8e2-4c35-b878-e4ceef3332cd" containerName="extract"
Apr 16 19:37:54.023369 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:54.023006 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb5dbc0b-c8e2-4c35-b878-e4ceef3332cd" containerName="extract"
Apr 16 19:37:54.023369 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:54.023012 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8303a68e-9421-4752-a46a-4c94b985677c" containerName="util"
Apr 16 19:37:54.023369 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:54.023017 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="8303a68e-9421-4752-a46a-4c94b985677c" containerName="util"
Apr 16 19:37:54.023369 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:54.023071 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="88506021-904e-4fc4-96fa-64e8c06a4b94" containerName="extract"
Apr 16 19:37:54.023369 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:54.023079 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="8303a68e-9421-4752-a46a-4c94b985677c" containerName="extract"
Apr 16 19:37:54.023369 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:54.023086 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="fb5dbc0b-c8e2-4c35-b878-e4ceef3332cd" containerName="extract"
Apr 16 19:37:54.023369 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:54.023094 2579 memory_manager.go:356] "RemoveStaleState removing state" podUID="9714675c-b0ed-4114-bb4b-562f793075e3" containerName="extract"
Apr 16 19:37:54.027539 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:54.027519 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-4qcfw"
Apr 16 19:37:54.031461 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:54.031434 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 16 19:37:54.032154 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:54.032135 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 16 19:37:54.032301 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:54.032135 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-6c885\""
Apr 16 19:37:54.044298 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:54.044271 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-4qcfw"]
Apr 16 19:37:54.197468 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:54.197432 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkjxn\" (UniqueName: \"kubernetes.io/projected/4660f614-c3c6-4e34-9e97-98210a4ccd50-kube-api-access-vkjxn\") pod \"authorino-operator-657f44b778-4qcfw\" (UID: \"4660f614-c3c6-4e34-9e97-98210a4ccd50\") " pod="kuadrant-system/authorino-operator-657f44b778-4qcfw"
Apr 16 19:37:54.298384 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:54.298280 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vkjxn\" (UniqueName: \"kubernetes.io/projected/4660f614-c3c6-4e34-9e97-98210a4ccd50-kube-api-access-vkjxn\") pod \"authorino-operator-657f44b778-4qcfw\" (UID: \"4660f614-c3c6-4e34-9e97-98210a4ccd50\") " pod="kuadrant-system/authorino-operator-657f44b778-4qcfw"
Apr 16 19:37:54.307426 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:54.307390 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkjxn\" (UniqueName: \"kubernetes.io/projected/4660f614-c3c6-4e34-9e97-98210a4ccd50-kube-api-access-vkjxn\") pod \"authorino-operator-657f44b778-4qcfw\" (UID: \"4660f614-c3c6-4e34-9e97-98210a4ccd50\") " pod="kuadrant-system/authorino-operator-657f44b778-4qcfw"
Apr 16 19:37:54.338062 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:54.338024 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-4qcfw"
Apr 16 19:37:54.484687 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:54.484650 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-4qcfw"]
Apr 16 19:37:54.485924 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:37:54.485886 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4660f614_c3c6_4e34_9e97_98210a4ccd50.slice/crio-536c25b74d7d295f45f4c6c8d52f724d08c96e5aa0dc31290a277171b8c393f6 WatchSource:0}: Error finding container 536c25b74d7d295f45f4c6c8d52f724d08c96e5aa0dc31290a277171b8c393f6: Status 404 returned error can't find the container with id 536c25b74d7d295f45f4c6c8d52f724d08c96e5aa0dc31290a277171b8c393f6
Apr 16 19:37:54.763948 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:54.763907 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-4qcfw" event={"ID":"4660f614-c3c6-4e34-9e97-98210a4ccd50","Type":"ContainerStarted","Data":"536c25b74d7d295f45f4c6c8d52f724d08c96e5aa0dc31290a277171b8c393f6"}
Apr 16 19:37:56.773403 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:56.773364 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-4qcfw" event={"ID":"4660f614-c3c6-4e34-9e97-98210a4ccd50","Type":"ContainerStarted","Data":"b6cd473931ca3249fa4850d89e791e7460931006ccbb93a80cc1049f58950b8e"}
Apr 16 19:37:56.773768 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:56.773484 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-657f44b778-4qcfw"
Apr 16 19:37:56.791596 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:37:56.791543 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-657f44b778-4qcfw" podStartSLOduration=1.092762859 podStartE2EDuration="2.791531598s" podCreationTimestamp="2026-04-16 19:37:54 +0000 UTC" firstStartedPulling="2026-04-16 19:37:54.487925247 +0000 UTC m=+458.064703437" lastFinishedPulling="2026-04-16 19:37:56.186693975 +0000 UTC m=+459.763472176" observedRunningTime="2026-04-16 19:37:56.790493317 +0000 UTC m=+460.367271540" watchObservedRunningTime="2026-04-16 19:37:56.791531598 +0000 UTC m=+460.368309808"
Apr 16 19:38:07.779756 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:38:07.779718 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-657f44b778-4qcfw"
Apr 16 19:38:14.620766 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:38:14.620724 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-86d56b47d5-ttzdg"]
Apr 16 19:38:14.626442 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:38:14.626417 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-86d56b47d5-ttzdg"
Apr 16 19:38:14.635471 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:38:14.635442 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-86d56b47d5-ttzdg"]
Apr 16 19:38:14.671842 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:38:14.671793 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/eaafc64a-e440-46b0-bce8-8fef4b455035-console-serving-cert\") pod \"console-86d56b47d5-ttzdg\" (UID: \"eaafc64a-e440-46b0-bce8-8fef4b455035\") " pod="openshift-console/console-86d56b47d5-ttzdg"
Apr 16 19:38:14.672020 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:38:14.671866 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/eaafc64a-e440-46b0-bce8-8fef4b455035-console-oauth-config\") pod \"console-86d56b47d5-ttzdg\" (UID: \"eaafc64a-e440-46b0-bce8-8fef4b455035\") " pod="openshift-console/console-86d56b47d5-ttzdg"
Apr 16 19:38:14.672020 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:38:14.671901 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/eaafc64a-e440-46b0-bce8-8fef4b455035-oauth-serving-cert\") pod \"console-86d56b47d5-ttzdg\" (UID: \"eaafc64a-e440-46b0-bce8-8fef4b455035\") " pod="openshift-console/console-86d56b47d5-ttzdg"
Apr 16 19:38:14.672020 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:38:14.671955 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5v7wd\" (UniqueName: \"kubernetes.io/projected/eaafc64a-e440-46b0-bce8-8fef4b455035-kube-api-access-5v7wd\") pod \"console-86d56b47d5-ttzdg\" (UID: \"eaafc64a-e440-46b0-bce8-8fef4b455035\") " pod="openshift-console/console-86d56b47d5-ttzdg"
Apr 16 19:38:14.672020 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:38:14.671987 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/eaafc64a-e440-46b0-bce8-8fef4b455035-console-config\") pod \"console-86d56b47d5-ttzdg\" (UID: \"eaafc64a-e440-46b0-bce8-8fef4b455035\") " pod="openshift-console/console-86d56b47d5-ttzdg"
Apr 16 19:38:14.672180 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:38:14.672027 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eaafc64a-e440-46b0-bce8-8fef4b455035-trusted-ca-bundle\") pod \"console-86d56b47d5-ttzdg\" (UID: \"eaafc64a-e440-46b0-bce8-8fef4b455035\") " pod="openshift-console/console-86d56b47d5-ttzdg"
Apr 16 19:38:14.672180 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:38:14.672056 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/eaafc64a-e440-46b0-bce8-8fef4b455035-service-ca\") pod \"console-86d56b47d5-ttzdg\" (UID: \"eaafc64a-e440-46b0-bce8-8fef4b455035\") " pod="openshift-console/console-86d56b47d5-ttzdg"
Apr 16 19:38:14.773199 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:38:14.773159 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/eaafc64a-e440-46b0-bce8-8fef4b455035-console-serving-cert\") pod \"console-86d56b47d5-ttzdg\" (UID: \"eaafc64a-e440-46b0-bce8-8fef4b455035\") " pod="openshift-console/console-86d56b47d5-ttzdg"
Apr 16 19:38:14.773398 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:38:14.773238 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/eaafc64a-e440-46b0-bce8-8fef4b455035-console-oauth-config\") pod \"console-86d56b47d5-ttzdg\" (UID: \"eaafc64a-e440-46b0-bce8-8fef4b455035\") " pod="openshift-console/console-86d56b47d5-ttzdg"
Apr 16 19:38:14.773398 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:38:14.773265 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/eaafc64a-e440-46b0-bce8-8fef4b455035-oauth-serving-cert\") pod \"console-86d56b47d5-ttzdg\" (UID: \"eaafc64a-e440-46b0-bce8-8fef4b455035\") " pod="openshift-console/console-86d56b47d5-ttzdg"
Apr 16 19:38:14.773398 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:38:14.773308 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5v7wd\" (UniqueName: \"kubernetes.io/projected/eaafc64a-e440-46b0-bce8-8fef4b455035-kube-api-access-5v7wd\") pod \"console-86d56b47d5-ttzdg\" (UID: \"eaafc64a-e440-46b0-bce8-8fef4b455035\") " pod="openshift-console/console-86d56b47d5-ttzdg"
Apr 16 19:38:14.773398 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:38:14.773339 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/eaafc64a-e440-46b0-bce8-8fef4b455035-console-config\") pod \"console-86d56b47d5-ttzdg\" (UID: \"eaafc64a-e440-46b0-bce8-8fef4b455035\") " pod="openshift-console/console-86d56b47d5-ttzdg"
Apr 16 19:38:14.773398 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:38:14.773380 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eaafc64a-e440-46b0-bce8-8fef4b455035-trusted-ca-bundle\") pod \"console-86d56b47d5-ttzdg\" (UID: \"eaafc64a-e440-46b0-bce8-8fef4b455035\") " pod="openshift-console/console-86d56b47d5-ttzdg"
Apr 16 19:38:14.773644 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:38:14.773410 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/eaafc64a-e440-46b0-bce8-8fef4b455035-service-ca\") pod \"console-86d56b47d5-ttzdg\" (UID: \"eaafc64a-e440-46b0-bce8-8fef4b455035\") " pod="openshift-console/console-86d56b47d5-ttzdg"
Apr 16 19:38:14.774159 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:38:14.774130 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/eaafc64a-e440-46b0-bce8-8fef4b455035-service-ca\") pod \"console-86d56b47d5-ttzdg\" (UID: \"eaafc64a-e440-46b0-bce8-8fef4b455035\") " pod="openshift-console/console-86d56b47d5-ttzdg"
Apr 16 19:38:14.774286 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:38:14.774130 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/eaafc64a-e440-46b0-bce8-8fef4b455035-oauth-serving-cert\") pod \"console-86d56b47d5-ttzdg\" (UID: \"eaafc64a-e440-46b0-bce8-8fef4b455035\") " pod="openshift-console/console-86d56b47d5-ttzdg"
Apr 16 19:38:14.774286 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:38:14.774238 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/eaafc64a-e440-46b0-bce8-8fef4b455035-console-config\") pod \"console-86d56b47d5-ttzdg\" (UID: \"eaafc64a-e440-46b0-bce8-8fef4b455035\") " pod="openshift-console/console-86d56b47d5-ttzdg"
Apr 16 19:38:14.774369 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:38:14.774349 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eaafc64a-e440-46b0-bce8-8fef4b455035-trusted-ca-bundle\") pod \"console-86d56b47d5-ttzdg\" (UID: \"eaafc64a-e440-46b0-bce8-8fef4b455035\") " pod="openshift-console/console-86d56b47d5-ttzdg"
Apr 16 19:38:14.775843 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:38:14.775822 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/eaafc64a-e440-46b0-bce8-8fef4b455035-console-oauth-config\") pod \"console-86d56b47d5-ttzdg\" (UID: \"eaafc64a-e440-46b0-bce8-8fef4b455035\") " pod="openshift-console/console-86d56b47d5-ttzdg"
Apr 16 19:38:14.775946 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:38:14.775928 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/eaafc64a-e440-46b0-bce8-8fef4b455035-console-serving-cert\") pod \"console-86d56b47d5-ttzdg\" (UID: \"eaafc64a-e440-46b0-bce8-8fef4b455035\") " pod="openshift-console/console-86d56b47d5-ttzdg"
Apr 16 19:38:14.781575 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:38:14.781549 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5v7wd\" (UniqueName: \"kubernetes.io/projected/eaafc64a-e440-46b0-bce8-8fef4b455035-kube-api-access-5v7wd\") pod \"console-86d56b47d5-ttzdg\" (UID: \"eaafc64a-e440-46b0-bce8-8fef4b455035\") " pod="openshift-console/console-86d56b47d5-ttzdg"
Apr 16 19:38:14.938071 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:38:14.938032 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-86d56b47d5-ttzdg"
Apr 16 19:38:15.084113 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:38:15.084081 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-86d56b47d5-ttzdg"]
Apr 16 19:38:15.085609 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:38:15.085576 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeaafc64a_e440_46b0_bce8_8fef4b455035.slice/crio-426e0ce7eb9ca1da0f2170dc8b80b26a853bd52c30945a12bcfbd79bfa6d9992 WatchSource:0}: Error finding container 426e0ce7eb9ca1da0f2170dc8b80b26a853bd52c30945a12bcfbd79bfa6d9992: Status 404 returned error can't find the container with id 426e0ce7eb9ca1da0f2170dc8b80b26a853bd52c30945a12bcfbd79bfa6d9992
Apr 16 19:38:15.846531 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:38:15.846490 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-86d56b47d5-ttzdg" event={"ID":"eaafc64a-e440-46b0-bce8-8fef4b455035","Type":"ContainerStarted","Data":"c8a12ad1fb2100ad25bacea1b312f842d5c5094aac9d0705d033c2935aecb8e9"}
Apr 16 19:38:15.846531 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:38:15.846533 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-86d56b47d5-ttzdg" event={"ID":"eaafc64a-e440-46b0-bce8-8fef4b455035","Type":"ContainerStarted","Data":"426e0ce7eb9ca1da0f2170dc8b80b26a853bd52c30945a12bcfbd79bfa6d9992"}
Apr 16 19:38:15.865837 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:38:15.865785 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-86d56b47d5-ttzdg" podStartSLOduration=1.8657699939999999 podStartE2EDuration="1.865769994s" podCreationTimestamp="2026-04-16 19:38:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:38:15.864202147 +0000 UTC m=+479.440980355" watchObservedRunningTime="2026-04-16 19:38:15.865769994 +0000 UTC m=+479.442548205"
Apr 16 19:38:23.137880 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:38:23.137840 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-cn7b6"]
Apr 16 19:38:23.141649 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:38:23.141624 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-cn7b6"
Apr 16 19:38:23.147168 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:38:23.147143 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-v5mgg\""
Apr 16 19:38:23.161300 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:38:23.161259 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-cn7b6"]
Apr 16 19:38:23.247947 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:38:23.247906 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwgmv\" (UniqueName: \"kubernetes.io/projected/646e966d-575d-44aa-b0fc-beaa68723ed6-kube-api-access-pwgmv\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-cn7b6\" (UID: \"646e966d-575d-44aa-b0fc-beaa68723ed6\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-cn7b6"
Apr 16 19:38:23.248142 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:38:23.248015 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/646e966d-575d-44aa-b0fc-beaa68723ed6-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-cn7b6\" (UID: \"646e966d-575d-44aa-b0fc-beaa68723ed6\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-cn7b6"
Apr 16 19:38:23.349453 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:38:23.349398 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/646e966d-575d-44aa-b0fc-beaa68723ed6-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-cn7b6\" (UID: \"646e966d-575d-44aa-b0fc-beaa68723ed6\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-cn7b6"
Apr 16 19:38:23.349625 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:38:23.349506 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pwgmv\" (UniqueName: \"kubernetes.io/projected/646e966d-575d-44aa-b0fc-beaa68723ed6-kube-api-access-pwgmv\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-cn7b6\" (UID: \"646e966d-575d-44aa-b0fc-beaa68723ed6\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-cn7b6"
Apr 16 19:38:23.349816 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:38:23.349792 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/646e966d-575d-44aa-b0fc-beaa68723ed6-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-cn7b6\" (UID: \"646e966d-575d-44aa-b0fc-beaa68723ed6\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-cn7b6"
Apr 16 19:38:23.358953 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:38:23.358927 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwgmv\" (UniqueName: \"kubernetes.io/projected/646e966d-575d-44aa-b0fc-beaa68723ed6-kube-api-access-pwgmv\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-cn7b6\" (UID: \"646e966d-575d-44aa-b0fc-beaa68723ed6\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-cn7b6"
Apr 16 19:38:23.453590 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:38:23.453531 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-cn7b6"
Apr 16 19:38:23.589456 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:38:23.589421 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-cn7b6"]
Apr 16 19:38:23.591277 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:38:23.591237 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod646e966d_575d_44aa_b0fc_beaa68723ed6.slice/crio-d59d1d03f99a4aba3be5bcd8f920372001dfb087119cd6fcdd7ca0887cdba950 WatchSource:0}: Error finding container d59d1d03f99a4aba3be5bcd8f920372001dfb087119cd6fcdd7ca0887cdba950: Status 404 returned error can't find the container with id d59d1d03f99a4aba3be5bcd8f920372001dfb087119cd6fcdd7ca0887cdba950
Apr 16 19:38:23.883615 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:38:23.883524 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-cn7b6" event={"ID":"646e966d-575d-44aa-b0fc-beaa68723ed6","Type":"ContainerStarted","Data":"d59d1d03f99a4aba3be5bcd8f920372001dfb087119cd6fcdd7ca0887cdba950"}
Apr 16 19:38:24.946757 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:38:24.946714 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-86d56b47d5-ttzdg"
Apr 16 19:38:24.947199 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:38:24.946817 2579 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-86d56b47d5-ttzdg"
Apr 16 19:38:24.947199 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:38:24.946833 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-86d56b47d5-ttzdg"
Apr 16 19:38:24.951657 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:38:24.951627 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-86d56b47d5-ttzdg"
Apr 16 19:38:25.043237 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:38:25.043188 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5d6bbd894d-d9m49"]
Apr 16 19:38:28.912519 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:38:28.912481 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-cn7b6" event={"ID":"646e966d-575d-44aa-b0fc-beaa68723ed6","Type":"ContainerStarted","Data":"e4cdb135049545f4589fb5d7f10bac5d7e53ac2ca2e87bb102eb8c80121c6cca"}
Apr 16 19:38:28.912943 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:38:28.912591 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-cn7b6"
Apr 16 19:38:28.933592 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:38:28.933529 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-cn7b6" podStartSLOduration=1.31861588 podStartE2EDuration="5.933514271s" podCreationTimestamp="2026-04-16 19:38:23 +0000 UTC" firstStartedPulling="2026-04-16 19:38:23.593750308 +0000 UTC m=+487.170528497" lastFinishedPulling="2026-04-16 19:38:28.208648684 +0000 UTC m=+491.785426888" observedRunningTime="2026-04-16 19:38:28.931003597 +0000 UTC m=+492.507781808" watchObservedRunningTime="2026-04-16 19:38:28.933514271 +0000 UTC m=+492.510292544"
Apr 16 19:38:39.919384 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:38:39.919349 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-cn7b6"
Apr 16 19:38:50.912395 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:38:50.912345 2579 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-5d6bbd894d-d9m49" podUID="e785553f-df01-4e45-b876-2c6f1cdee6ef" containerName="console" containerID="cri-o://738bca409eedfcec8ade37a86158798820b0a845ee18226924b6b92a23a44097" gracePeriod=15
Apr 16 19:38:51.153672 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:38:51.153646 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5d6bbd894d-d9m49_e785553f-df01-4e45-b876-2c6f1cdee6ef/console/0.log"
Apr 16 19:38:51.153815 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:38:51.153723 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5d6bbd894d-d9m49"
Apr 16 19:38:51.286310 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:38:51.286281 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w6tcf\" (UniqueName: \"kubernetes.io/projected/e785553f-df01-4e45-b876-2c6f1cdee6ef-kube-api-access-w6tcf\") pod \"e785553f-df01-4e45-b876-2c6f1cdee6ef\" (UID: \"e785553f-df01-4e45-b876-2c6f1cdee6ef\") "
Apr 16 19:38:51.286310 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:38:51.286324 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e785553f-df01-4e45-b876-2c6f1cdee6ef-oauth-serving-cert\") pod \"e785553f-df01-4e45-b876-2c6f1cdee6ef\" (UID: \"e785553f-df01-4e45-b876-2c6f1cdee6ef\") "
Apr 16 19:38:51.286537 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:38:51.286507 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e785553f-df01-4e45-b876-2c6f1cdee6ef-console-serving-cert\") pod \"e785553f-df01-4e45-b876-2c6f1cdee6ef\" (UID: \"e785553f-df01-4e45-b876-2c6f1cdee6ef\") "
Apr 16 19:38:51.286593 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:38:51.286581 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e785553f-df01-4e45-b876-2c6f1cdee6ef-console-config\") pod \"e785553f-df01-4e45-b876-2c6f1cdee6ef\" (UID: \"e785553f-df01-4e45-b876-2c6f1cdee6ef\") "
Apr 16 19:38:51.286645 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:38:51.286614 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e785553f-df01-4e45-b876-2c6f1cdee6ef-trusted-ca-bundle\") pod \"e785553f-df01-4e45-b876-2c6f1cdee6ef\" (UID: \"e785553f-df01-4e45-b876-2c6f1cdee6ef\") "
Apr 16 19:38:51.286708 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:38:51.286680 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e785553f-df01-4e45-b876-2c6f1cdee6ef-console-oauth-config\") pod \"e785553f-df01-4e45-b876-2c6f1cdee6ef\" (UID: \"e785553f-df01-4e45-b876-2c6f1cdee6ef\") "
Apr 16 19:38:51.286761 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:38:51.286717 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e785553f-df01-4e45-b876-2c6f1cdee6ef-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "e785553f-df01-4e45-b876-2c6f1cdee6ef" (UID: "e785553f-df01-4e45-b876-2c6f1cdee6ef"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 19:38:51.286761 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:38:51.286736 2579 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e785553f-df01-4e45-b876-2c6f1cdee6ef-service-ca\") pod \"e785553f-df01-4e45-b876-2c6f1cdee6ef\" (UID: \"e785553f-df01-4e45-b876-2c6f1cdee6ef\") "
Apr 16 19:38:51.286967 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:38:51.286943 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e785553f-df01-4e45-b876-2c6f1cdee6ef-console-config" (OuterVolumeSpecName: "console-config") pod "e785553f-df01-4e45-b876-2c6f1cdee6ef" (UID: "e785553f-df01-4e45-b876-2c6f1cdee6ef"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 19:38:51.287031 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:38:51.287017 2579 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e785553f-df01-4e45-b876-2c6f1cdee6ef-oauth-serving-cert\") on node \"ip-10-0-129-155.ec2.internal\" DevicePath \"\""
Apr 16 19:38:51.287090 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:38:51.287036 2579 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e785553f-df01-4e45-b876-2c6f1cdee6ef-console-config\") on node \"ip-10-0-129-155.ec2.internal\" DevicePath \"\""
Apr 16 19:38:51.287090 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:38:51.287032 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e785553f-df01-4e45-b876-2c6f1cdee6ef-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "e785553f-df01-4e45-b876-2c6f1cdee6ef" (UID: "e785553f-df01-4e45-b876-2c6f1cdee6ef"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 19:38:51.287308 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:38:51.287281 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e785553f-df01-4e45-b876-2c6f1cdee6ef-service-ca" (OuterVolumeSpecName: "service-ca") pod "e785553f-df01-4e45-b876-2c6f1cdee6ef" (UID: "e785553f-df01-4e45-b876-2c6f1cdee6ef"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 19:38:51.288647 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:38:51.288616 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e785553f-df01-4e45-b876-2c6f1cdee6ef-kube-api-access-w6tcf" (OuterVolumeSpecName: "kube-api-access-w6tcf") pod "e785553f-df01-4e45-b876-2c6f1cdee6ef" (UID: "e785553f-df01-4e45-b876-2c6f1cdee6ef"). InnerVolumeSpecName "kube-api-access-w6tcf". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 19:38:51.288778 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:38:51.288693 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e785553f-df01-4e45-b876-2c6f1cdee6ef-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "e785553f-df01-4e45-b876-2c6f1cdee6ef" (UID: "e785553f-df01-4e45-b876-2c6f1cdee6ef"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 19:38:51.288778 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:38:51.288765 2579 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e785553f-df01-4e45-b876-2c6f1cdee6ef-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "e785553f-df01-4e45-b876-2c6f1cdee6ef" (UID: "e785553f-df01-4e45-b876-2c6f1cdee6ef"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 19:38:51.387522 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:38:51.387484 2579 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e785553f-df01-4e45-b876-2c6f1cdee6ef-console-serving-cert\") on node \"ip-10-0-129-155.ec2.internal\" DevicePath \"\""
Apr 16 19:38:51.387522 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:38:51.387514 2579 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e785553f-df01-4e45-b876-2c6f1cdee6ef-trusted-ca-bundle\") on node \"ip-10-0-129-155.ec2.internal\" DevicePath \"\""
Apr 16 19:38:51.387522 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:38:51.387524 2579 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e785553f-df01-4e45-b876-2c6f1cdee6ef-console-oauth-config\") on node \"ip-10-0-129-155.ec2.internal\" DevicePath \"\""
Apr 16 19:38:51.387755 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:38:51.387533 2579 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e785553f-df01-4e45-b876-2c6f1cdee6ef-service-ca\") on node \"ip-10-0-129-155.ec2.internal\" DevicePath \"\""
Apr 16 19:38:51.387755 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:38:51.387544 2579 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w6tcf\" (UniqueName: \"kubernetes.io/projected/e785553f-df01-4e45-b876-2c6f1cdee6ef-kube-api-access-w6tcf\") on node \"ip-10-0-129-155.ec2.internal\" DevicePath \"\""
Apr 16 19:38:52.008392 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:38:52.008362 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5d6bbd894d-d9m49_e785553f-df01-4e45-b876-2c6f1cdee6ef/console/0.log"
Apr 16 19:38:52.008796 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:38:52.008405 2579 generic.go:358] "Generic (PLEG): container finished" podID="e785553f-df01-4e45-b876-2c6f1cdee6ef" containerID="738bca409eedfcec8ade37a86158798820b0a845ee18226924b6b92a23a44097" exitCode=2
Apr 16 19:38:52.008796 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:38:52.008472 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d6bbd894d-d9m49" event={"ID":"e785553f-df01-4e45-b876-2c6f1cdee6ef","Type":"ContainerDied","Data":"738bca409eedfcec8ade37a86158798820b0a845ee18226924b6b92a23a44097"}
Apr 16 19:38:52.008796 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:38:52.008513 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d6bbd894d-d9m49" event={"ID":"e785553f-df01-4e45-b876-2c6f1cdee6ef","Type":"ContainerDied","Data":"18e50c95aece30ca18b8befabc896ae371a89e521a816a7b5170819835b98ba0"}
Apr 16 19:38:52.008796 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:38:52.008510 2579 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5d6bbd894d-d9m49"
Apr 16 19:38:52.008796 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:38:52.008584 2579 scope.go:117] "RemoveContainer" containerID="738bca409eedfcec8ade37a86158798820b0a845ee18226924b6b92a23a44097"
Apr 16 19:38:52.017618 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:38:52.017602 2579 scope.go:117] "RemoveContainer" containerID="738bca409eedfcec8ade37a86158798820b0a845ee18226924b6b92a23a44097"
Apr 16 19:38:52.017853 ip-10-0-129-155 kubenswrapper[2579]: E0416 19:38:52.017832 2579 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"738bca409eedfcec8ade37a86158798820b0a845ee18226924b6b92a23a44097\": container with ID starting with 738bca409eedfcec8ade37a86158798820b0a845ee18226924b6b92a23a44097 not found: ID does not exist" containerID="738bca409eedfcec8ade37a86158798820b0a845ee18226924b6b92a23a44097"
Apr 16 19:38:52.017891 ip-10-0-129-155 kubenswrapper[2579]: I0416
19:38:52.017862 2579 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"738bca409eedfcec8ade37a86158798820b0a845ee18226924b6b92a23a44097"} err="failed to get container status \"738bca409eedfcec8ade37a86158798820b0a845ee18226924b6b92a23a44097\": rpc error: code = NotFound desc = could not find container \"738bca409eedfcec8ade37a86158798820b0a845ee18226924b6b92a23a44097\": container with ID starting with 738bca409eedfcec8ade37a86158798820b0a845ee18226924b6b92a23a44097 not found: ID does not exist" Apr 16 19:38:52.032983 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:38:52.032956 2579 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5d6bbd894d-d9m49"] Apr 16 19:38:52.037683 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:38:52.037661 2579 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5d6bbd894d-d9m49"] Apr 16 19:38:52.943911 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:38:52.943862 2579 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e785553f-df01-4e45-b876-2c6f1cdee6ef" path="/var/lib/kubelet/pods/e785553f-df01-4e45-b876-2c6f1cdee6ef/volumes" Apr 16 19:39:00.911884 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:39:00.911841 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sr7g6"] Apr 16 19:39:00.912480 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:39:00.912458 2579 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e785553f-df01-4e45-b876-2c6f1cdee6ef" containerName="console" Apr 16 19:39:00.912594 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:39:00.912483 2579 state_mem.go:107] "Deleted CPUSet assignment" podUID="e785553f-df01-4e45-b876-2c6f1cdee6ef" containerName="console" Apr 16 19:39:00.912651 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:39:00.912600 2579 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="e785553f-df01-4e45-b876-2c6f1cdee6ef" containerName="console" Apr 16 19:39:00.916176 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:39:00.916154 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-sr7g6" Apr 16 19:39:00.919133 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:39:00.919106 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 16 19:39:00.919322 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:39:00.919176 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-5m27b\"" Apr 16 19:39:00.923728 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:39:00.923659 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sr7g6"] Apr 16 19:39:00.952703 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:39:00.952670 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sr7g6"] Apr 16 19:39:00.967387 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:39:00.967361 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mzzb\" (UniqueName: \"kubernetes.io/projected/1b0a4a07-2305-4d29-b57f-fe12a45f9407-kube-api-access-7mzzb\") pod \"limitador-limitador-78c99df468-sr7g6\" (UID: \"1b0a4a07-2305-4d29-b57f-fe12a45f9407\") " pod="kuadrant-system/limitador-limitador-78c99df468-sr7g6" Apr 16 19:39:00.967509 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:39:00.967432 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/1b0a4a07-2305-4d29-b57f-fe12a45f9407-config-file\") pod \"limitador-limitador-78c99df468-sr7g6\" (UID: \"1b0a4a07-2305-4d29-b57f-fe12a45f9407\") " 
pod="kuadrant-system/limitador-limitador-78c99df468-sr7g6" Apr 16 19:39:01.068102 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:39:01.068063 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/1b0a4a07-2305-4d29-b57f-fe12a45f9407-config-file\") pod \"limitador-limitador-78c99df468-sr7g6\" (UID: \"1b0a4a07-2305-4d29-b57f-fe12a45f9407\") " pod="kuadrant-system/limitador-limitador-78c99df468-sr7g6" Apr 16 19:39:01.068295 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:39:01.068128 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7mzzb\" (UniqueName: \"kubernetes.io/projected/1b0a4a07-2305-4d29-b57f-fe12a45f9407-kube-api-access-7mzzb\") pod \"limitador-limitador-78c99df468-sr7g6\" (UID: \"1b0a4a07-2305-4d29-b57f-fe12a45f9407\") " pod="kuadrant-system/limitador-limitador-78c99df468-sr7g6" Apr 16 19:39:01.068757 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:39:01.068735 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/1b0a4a07-2305-4d29-b57f-fe12a45f9407-config-file\") pod \"limitador-limitador-78c99df468-sr7g6\" (UID: \"1b0a4a07-2305-4d29-b57f-fe12a45f9407\") " pod="kuadrant-system/limitador-limitador-78c99df468-sr7g6" Apr 16 19:39:01.076067 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:39:01.076042 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mzzb\" (UniqueName: \"kubernetes.io/projected/1b0a4a07-2305-4d29-b57f-fe12a45f9407-kube-api-access-7mzzb\") pod \"limitador-limitador-78c99df468-sr7g6\" (UID: \"1b0a4a07-2305-4d29-b57f-fe12a45f9407\") " pod="kuadrant-system/limitador-limitador-78c99df468-sr7g6" Apr 16 19:39:01.228538 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:39:01.228491 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-sr7g6" Apr 16 19:39:01.360680 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:39:01.360647 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sr7g6"] Apr 16 19:39:01.362576 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:39:01.362548 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b0a4a07_2305_4d29_b57f_fe12a45f9407.slice/crio-2729aa9d9332f56ba332ebad233533754b41f585cb9b585d27c2eeaa1e3f6ca2 WatchSource:0}: Error finding container 2729aa9d9332f56ba332ebad233533754b41f585cb9b585d27c2eeaa1e3f6ca2: Status 404 returned error can't find the container with id 2729aa9d9332f56ba332ebad233533754b41f585cb9b585d27c2eeaa1e3f6ca2 Apr 16 19:39:02.052021 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:39:02.051947 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-sr7g6" event={"ID":"1b0a4a07-2305-4d29-b57f-fe12a45f9407","Type":"ContainerStarted","Data":"2729aa9d9332f56ba332ebad233533754b41f585cb9b585d27c2eeaa1e3f6ca2"} Apr 16 19:39:04.061504 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:39:04.061462 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-sr7g6" event={"ID":"1b0a4a07-2305-4d29-b57f-fe12a45f9407","Type":"ContainerStarted","Data":"453fb00e2da176a910f46db8f35e9073e239ba94659c0b4d9f4e34c513fa0a8b"} Apr 16 19:39:04.061878 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:39:04.061597 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-78c99df468-sr7g6" Apr 16 19:39:04.077772 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:39:04.077716 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-78c99df468-sr7g6" podStartSLOduration=1.565476694 
podStartE2EDuration="4.077700168s" podCreationTimestamp="2026-04-16 19:39:00 +0000 UTC" firstStartedPulling="2026-04-16 19:39:01.364991933 +0000 UTC m=+524.941770125" lastFinishedPulling="2026-04-16 19:39:03.877215397 +0000 UTC m=+527.453993599" observedRunningTime="2026-04-16 19:39:04.076901026 +0000 UTC m=+527.653679235" watchObservedRunningTime="2026-04-16 19:39:04.077700168 +0000 UTC m=+527.654478379" Apr 16 19:39:15.066859 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:39:15.066825 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-78c99df468-sr7g6" Apr 16 19:39:38.834781 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:39:38.834744 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sr7g6"] Apr 16 19:40:09.063529 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:40:09.063492 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sr7g6"] Apr 16 19:40:16.835712 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:40:16.835686 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ckt7c_d5d951a1-3e60-4517-ae8b-75bba19200c9/console-operator/2.log" Apr 16 19:40:16.835712 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:40:16.835706 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ckt7c_d5d951a1-3e60-4517-ae8b-75bba19200c9/console-operator/2.log" Apr 16 19:40:34.637721 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:40:34.637645 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sr7g6"] Apr 16 19:40:53.862402 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:40:53.862321 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sr7g6"] Apr 16 19:40:56.926830 ip-10-0-129-155 
kubenswrapper[2579]: I0416 19:40:56.926796 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sr7g6"] Apr 16 19:41:01.333748 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:41:01.333706 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-sr7g6"] Apr 16 19:42:41.520579 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:42:41.520547 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-44kfz_cfc99283-cf9c-4242-a5ac-b07e345d04f7/manager/0.log" Apr 16 19:42:41.889230 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:42:41.889109 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-7jbgv_2f1f4763-90e1-44e4-8429-6c99794ae398/manager/1.log" Apr 16 19:42:42.000783 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:42:42.000756 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-57586b9555-2zhnq_6289dbfc-b014-4bf2-926d-ea8c13ae04b0/manager/0.log" Apr 16 19:42:43.123897 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:42:43.123868 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759j5lst_9714675c-b0ed-4114-bb4b-562f793075e3/util/0.log" Apr 16 19:42:43.130066 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:42:43.130040 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759j5lst_9714675c-b0ed-4114-bb4b-562f793075e3/pull/0.log" Apr 16 19:42:43.136813 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:42:43.136790 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759j5lst_9714675c-b0ed-4114-bb4b-562f793075e3/extract/0.log" Apr 16 19:42:43.248386 ip-10-0-129-155 
kubenswrapper[2579]: I0416 19:42:43.248357 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qnwdd_fb5dbc0b-c8e2-4c35-b878-e4ceef3332cd/pull/0.log" Apr 16 19:42:43.260024 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:42:43.260003 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qnwdd_fb5dbc0b-c8e2-4c35-b878-e4ceef3332cd/extract/0.log" Apr 16 19:42:43.267884 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:42:43.267862 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qnwdd_fb5dbc0b-c8e2-4c35-b878-e4ceef3332cd/util/0.log" Apr 16 19:42:43.378066 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:42:43.378014 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73l4t9j_8303a68e-9421-4752-a46a-4c94b985677c/util/0.log" Apr 16 19:42:43.385488 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:42:43.385456 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73l4t9j_8303a68e-9421-4752-a46a-4c94b985677c/pull/0.log" Apr 16 19:42:43.391569 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:42:43.391550 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73l4t9j_8303a68e-9421-4752-a46a-4c94b985677c/extract/0.log" Apr 16 19:42:43.498476 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:42:43.498453 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef179xt7_88506021-904e-4fc4-96fa-64e8c06a4b94/util/0.log" Apr 16 19:42:43.505218 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:42:43.505189 2579 log.go:25] "Finished 
parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef179xt7_88506021-904e-4fc4-96fa-64e8c06a4b94/pull/0.log" Apr 16 19:42:43.511312 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:42:43.511284 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef179xt7_88506021-904e-4fc4-96fa-64e8c06a4b94/extract/0.log" Apr 16 19:42:43.747751 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:42:43.747727 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-4qcfw_4660f614-c3c6-4e34-9e97-98210a4ccd50/manager/0.log" Apr 16 19:42:44.230939 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:42:44.230908 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6bc9f4c76f-cn7b6_646e966d-575d-44aa-b0fc-beaa68723ed6/manager/0.log" Apr 16 19:42:44.338876 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:42:44.338856 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-sr7g6_1b0a4a07-2305-4d29-b57f-fe12a45f9407/limitador/0.log" Apr 16 19:42:44.915549 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:42:44.915519 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-njdt9_184964d2-5d55-4f19-b607-e94fa7d12038/discovery/0.log" Apr 16 19:42:45.140025 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:42:45.140001 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-6b5579666b-ws9zx_aea05870-5faf-4066-b9b2-c179cf5245f4/kube-auth-proxy/0.log" Apr 16 19:42:45.375133 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:42:45.375089 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-85f49fc854-txwk9_43c093f8-2338-4e6e-8349-b4e1575f5161/router/0.log" Apr 16 
19:42:53.488028 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:42:53.487999 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-ls7dc_afd027c2-990e-4d6c-b57c-62c9c66ce5f2/global-pull-secret-syncer/0.log" Apr 16 19:42:53.562818 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:42:53.562787 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-nh8b7_d478cc2e-78cb-4140-9eaf-2624faf8382b/konnectivity-agent/0.log" Apr 16 19:42:53.606549 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:42:53.606526 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-129-155.ec2.internal_ad3ef22c0a8b4cbdc51ac89991654b86/haproxy/0.log" Apr 16 19:42:57.243791 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:42:57.243744 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759j5lst_9714675c-b0ed-4114-bb4b-562f793075e3/extract/0.log" Apr 16 19:42:57.272771 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:42:57.272750 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759j5lst_9714675c-b0ed-4114-bb4b-562f793075e3/util/0.log" Apr 16 19:42:57.295421 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:42:57.295401 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759j5lst_9714675c-b0ed-4114-bb4b-562f793075e3/pull/0.log" Apr 16 19:42:57.322275 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:42:57.322255 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qnwdd_fb5dbc0b-c8e2-4c35-b878-e4ceef3332cd/extract/0.log" Apr 16 19:42:57.345086 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:42:57.345065 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qnwdd_fb5dbc0b-c8e2-4c35-b878-e4ceef3332cd/util/0.log" Apr 16 19:42:57.365876 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:42:57.365857 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e0qnwdd_fb5dbc0b-c8e2-4c35-b878-e4ceef3332cd/pull/0.log" Apr 16 19:42:57.389775 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:42:57.389757 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73l4t9j_8303a68e-9421-4752-a46a-4c94b985677c/extract/0.log" Apr 16 19:42:57.412927 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:42:57.412906 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73l4t9j_8303a68e-9421-4752-a46a-4c94b985677c/util/0.log" Apr 16 19:42:57.434148 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:42:57.434125 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed73l4t9j_8303a68e-9421-4752-a46a-4c94b985677c/pull/0.log" Apr 16 19:42:57.458903 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:42:57.458886 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef179xt7_88506021-904e-4fc4-96fa-64e8c06a4b94/extract/0.log" Apr 16 19:42:57.478804 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:42:57.478776 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef179xt7_88506021-904e-4fc4-96fa-64e8c06a4b94/util/0.log" Apr 16 19:42:57.498883 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:42:57.498813 2579 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef179xt7_88506021-904e-4fc4-96fa-64e8c06a4b94/pull/0.log" Apr 16 19:42:57.680253 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:42:57.680228 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-4qcfw_4660f614-c3c6-4e34-9e97-98210a4ccd50/manager/0.log" Apr 16 19:42:57.810392 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:42:57.810292 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6bc9f4c76f-cn7b6_646e966d-575d-44aa-b0fc-beaa68723ed6/manager/0.log" Apr 16 19:42:57.827347 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:42:57.827326 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-sr7g6_1b0a4a07-2305-4d29-b57f-fe12a45f9407/limitador/0.log" Apr 16 19:42:59.761376 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:42:59.761341 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-gjg22_f1d1fd5a-5106-4eca-85d4-dec132a69811/node-exporter/0.log" Apr 16 19:42:59.787328 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:42:59.787301 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-gjg22_f1d1fd5a-5106-4eca-85d4-dec132a69811/kube-rbac-proxy/0.log" Apr 16 19:42:59.814889 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:42:59.814870 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-gjg22_f1d1fd5a-5106-4eca-85d4-dec132a69811/init-textfile/0.log" Apr 16 19:42:59.842046 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:42:59.842028 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-6chqw_0b6ba5cf-ff81-4de6-bf41-1d82a9913e97/kube-rbac-proxy-main/0.log" Apr 16 19:42:59.861955 ip-10-0-129-155 kubenswrapper[2579]: I0416 
19:42:59.861933 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-6chqw_0b6ba5cf-ff81-4de6-bf41-1d82a9913e97/kube-rbac-proxy-self/0.log" Apr 16 19:42:59.881543 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:42:59.881526 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-6chqw_0b6ba5cf-ff81-4de6-bf41-1d82a9913e97/openshift-state-metrics/0.log" Apr 16 19:43:00.092493 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:43:00.092415 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-zmnp2_1ffbb2c0-6d4f-4842-8198-cebed3110c5d/prometheus-operator/0.log" Apr 16 19:43:00.108477 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:43:00.108456 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-zmnp2_1ffbb2c0-6d4f-4842-8198-cebed3110c5d/kube-rbac-proxy/0.log" Apr 16 19:43:01.431565 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:43:01.431531 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-96nm2_91f7dc43-a855-4712-8639-caad7b7a8458/networking-console-plugin/0.log" Apr 16 19:43:01.915430 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:43:01.915397 2579 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-r5h99/perf-node-gather-daemonset-pbbdv"] Apr 16 19:43:01.917732 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:43:01.917715 2579 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-r5h99/perf-node-gather-daemonset-pbbdv" Apr 16 19:43:01.920326 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:43:01.920303 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-r5h99\"/\"kube-root-ca.crt\"" Apr 16 19:43:01.920326 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:43:01.920322 2579 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-r5h99\"/\"openshift-service-ca.crt\"" Apr 16 19:43:01.921338 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:43:01.921324 2579 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-r5h99\"/\"default-dockercfg-5clhq\"" Apr 16 19:43:01.928540 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:43:01.928517 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-r5h99/perf-node-gather-daemonset-pbbdv"] Apr 16 19:43:01.952645 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:43:01.952623 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ckt7c_d5d951a1-3e60-4517-ae8b-75bba19200c9/console-operator/2.log" Apr 16 19:43:01.960307 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:43:01.960291 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-ckt7c_d5d951a1-3e60-4517-ae8b-75bba19200c9/console-operator/3.log" Apr 16 19:43:02.008716 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:43:02.008692 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/bd70d6fd-bf12-4994-92f4-fba395a151d8-podres\") pod \"perf-node-gather-daemonset-pbbdv\" (UID: \"bd70d6fd-bf12-4994-92f4-fba395a151d8\") " pod="openshift-must-gather-r5h99/perf-node-gather-daemonset-pbbdv" Apr 16 19:43:02.008808 ip-10-0-129-155 kubenswrapper[2579]: I0416 
19:43:02.008723 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bd70d6fd-bf12-4994-92f4-fba395a151d8-sys\") pod \"perf-node-gather-daemonset-pbbdv\" (UID: \"bd70d6fd-bf12-4994-92f4-fba395a151d8\") " pod="openshift-must-gather-r5h99/perf-node-gather-daemonset-pbbdv"
Apr 16 19:43:02.008808 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:43:02.008750 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/bd70d6fd-bf12-4994-92f4-fba395a151d8-proc\") pod \"perf-node-gather-daemonset-pbbdv\" (UID: \"bd70d6fd-bf12-4994-92f4-fba395a151d8\") " pod="openshift-must-gather-r5h99/perf-node-gather-daemonset-pbbdv"
Apr 16 19:43:02.008908 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:43:02.008821 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bd70d6fd-bf12-4994-92f4-fba395a151d8-lib-modules\") pod \"perf-node-gather-daemonset-pbbdv\" (UID: \"bd70d6fd-bf12-4994-92f4-fba395a151d8\") " pod="openshift-must-gather-r5h99/perf-node-gather-daemonset-pbbdv"
Apr 16 19:43:02.008908 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:43:02.008852 2579 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jmmm\" (UniqueName: \"kubernetes.io/projected/bd70d6fd-bf12-4994-92f4-fba395a151d8-kube-api-access-2jmmm\") pod \"perf-node-gather-daemonset-pbbdv\" (UID: \"bd70d6fd-bf12-4994-92f4-fba395a151d8\") " pod="openshift-must-gather-r5h99/perf-node-gather-daemonset-pbbdv"
Apr 16 19:43:02.110041 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:43:02.110016 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bd70d6fd-bf12-4994-92f4-fba395a151d8-sys\") pod \"perf-node-gather-daemonset-pbbdv\" (UID: \"bd70d6fd-bf12-4994-92f4-fba395a151d8\") " pod="openshift-must-gather-r5h99/perf-node-gather-daemonset-pbbdv"
Apr 16 19:43:02.110126 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:43:02.110061 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/bd70d6fd-bf12-4994-92f4-fba395a151d8-proc\") pod \"perf-node-gather-daemonset-pbbdv\" (UID: \"bd70d6fd-bf12-4994-92f4-fba395a151d8\") " pod="openshift-must-gather-r5h99/perf-node-gather-daemonset-pbbdv"
Apr 16 19:43:02.110166 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:43:02.110125 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/bd70d6fd-bf12-4994-92f4-fba395a151d8-proc\") pod \"perf-node-gather-daemonset-pbbdv\" (UID: \"bd70d6fd-bf12-4994-92f4-fba395a151d8\") " pod="openshift-must-gather-r5h99/perf-node-gather-daemonset-pbbdv"
Apr 16 19:43:02.110233 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:43:02.110125 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bd70d6fd-bf12-4994-92f4-fba395a151d8-sys\") pod \"perf-node-gather-daemonset-pbbdv\" (UID: \"bd70d6fd-bf12-4994-92f4-fba395a151d8\") " pod="openshift-must-gather-r5h99/perf-node-gather-daemonset-pbbdv"
Apr 16 19:43:02.110287 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:43:02.110237 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bd70d6fd-bf12-4994-92f4-fba395a151d8-lib-modules\") pod \"perf-node-gather-daemonset-pbbdv\" (UID: \"bd70d6fd-bf12-4994-92f4-fba395a151d8\") " pod="openshift-must-gather-r5h99/perf-node-gather-daemonset-pbbdv"
Apr 16 19:43:02.110287 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:43:02.110268 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2jmmm\" (UniqueName: \"kubernetes.io/projected/bd70d6fd-bf12-4994-92f4-fba395a151d8-kube-api-access-2jmmm\") pod \"perf-node-gather-daemonset-pbbdv\" (UID: \"bd70d6fd-bf12-4994-92f4-fba395a151d8\") " pod="openshift-must-gather-r5h99/perf-node-gather-daemonset-pbbdv"
Apr 16 19:43:02.110385 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:43:02.110364 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bd70d6fd-bf12-4994-92f4-fba395a151d8-lib-modules\") pod \"perf-node-gather-daemonset-pbbdv\" (UID: \"bd70d6fd-bf12-4994-92f4-fba395a151d8\") " pod="openshift-must-gather-r5h99/perf-node-gather-daemonset-pbbdv"
Apr 16 19:43:02.110474 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:43:02.110457 2579 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/bd70d6fd-bf12-4994-92f4-fba395a151d8-podres\") pod \"perf-node-gather-daemonset-pbbdv\" (UID: \"bd70d6fd-bf12-4994-92f4-fba395a151d8\") " pod="openshift-must-gather-r5h99/perf-node-gather-daemonset-pbbdv"
Apr 16 19:43:02.110642 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:43:02.110620 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/bd70d6fd-bf12-4994-92f4-fba395a151d8-podres\") pod \"perf-node-gather-daemonset-pbbdv\" (UID: \"bd70d6fd-bf12-4994-92f4-fba395a151d8\") " pod="openshift-must-gather-r5h99/perf-node-gather-daemonset-pbbdv"
Apr 16 19:43:02.118998 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:43:02.118977 2579 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jmmm\" (UniqueName: \"kubernetes.io/projected/bd70d6fd-bf12-4994-92f4-fba395a151d8-kube-api-access-2jmmm\") pod \"perf-node-gather-daemonset-pbbdv\" (UID: \"bd70d6fd-bf12-4994-92f4-fba395a151d8\") " pod="openshift-must-gather-r5h99/perf-node-gather-daemonset-pbbdv"
Apr 16 19:43:02.227548 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:43:02.227527 2579 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-r5h99/perf-node-gather-daemonset-pbbdv"
Apr 16 19:43:02.347347 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:43:02.347322 2579 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-r5h99/perf-node-gather-daemonset-pbbdv"]
Apr 16 19:43:02.348728 ip-10-0-129-155 kubenswrapper[2579]: W0416 19:43:02.348703 2579 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podbd70d6fd_bf12_4994_92f4_fba395a151d8.slice/crio-6959d8e8a0f289eaebc99bbfd87efaa7c7e7f40d0f0a0b57d920fd0c12d5e909 WatchSource:0}: Error finding container 6959d8e8a0f289eaebc99bbfd87efaa7c7e7f40d0f0a0b57d920fd0c12d5e909: Status 404 returned error can't find the container with id 6959d8e8a0f289eaebc99bbfd87efaa7c7e7f40d0f0a0b57d920fd0c12d5e909
Apr 16 19:43:02.350312 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:43:02.350297 2579 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 19:43:02.498717 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:43:02.498660 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-86d56b47d5-ttzdg_eaafc64a-e440-46b0-bce8-8fef4b455035/console/0.log"
Apr 16 19:43:02.527000 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:43:02.526978 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-bwjxs_8685294b-3f49-44ea-a7b8-0b967dd8ddbe/download-server/0.log"
Apr 16 19:43:02.987115 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:43:02.987084 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-dm5bc_eb8666d2-cbd8-4ed1-8781-eb4176da58ab/volume-data-source-validator/0.log"
Apr 16 19:43:02.990069 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:43:02.990044 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r5h99/perf-node-gather-daemonset-pbbdv" event={"ID":"bd70d6fd-bf12-4994-92f4-fba395a151d8","Type":"ContainerStarted","Data":"d581b13a484b96aecf726f8e485eb7c1e6a9705eef49a51d65e979af94f1be9f"}
Apr 16 19:43:02.990069 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:43:02.990072 2579 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r5h99/perf-node-gather-daemonset-pbbdv" event={"ID":"bd70d6fd-bf12-4994-92f4-fba395a151d8","Type":"ContainerStarted","Data":"6959d8e8a0f289eaebc99bbfd87efaa7c7e7f40d0f0a0b57d920fd0c12d5e909"}
Apr 16 19:43:02.990235 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:43:02.990157 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-r5h99/perf-node-gather-daemonset-pbbdv"
Apr 16 19:43:03.005562 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:43:03.005525 2579 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-r5h99/perf-node-gather-daemonset-pbbdv" podStartSLOduration=2.005514845 podStartE2EDuration="2.005514845s" podCreationTimestamp="2026-04-16 19:43:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 19:43:03.003955564 +0000 UTC m=+766.580733773" watchObservedRunningTime="2026-04-16 19:43:03.005514845 +0000 UTC m=+766.582293051"
Apr 16 19:43:03.855534 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:43:03.855505 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-x5hqp_e038a6a1-56fa-476b-8faf-dc54fd9afdfa/dns/0.log"
Apr 16 19:43:03.874153 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:43:03.874134 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-x5hqp_e038a6a1-56fa-476b-8faf-dc54fd9afdfa/kube-rbac-proxy/0.log"
Apr 16 19:43:03.937837 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:43:03.937807 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-stc5r_b153866d-121b-4ac1-a27e-c2aea8f9de02/dns-node-resolver/0.log"
Apr 16 19:43:04.441159 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:43:04.441137 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-l5xrl_9859a426-6968-44ca-b63e-42baba2b957d/node-ca/0.log"
Apr 16 19:43:05.310361 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:43:05.310320 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-njdt9_184964d2-5d55-4f19-b607-e94fa7d12038/discovery/0.log"
Apr 16 19:43:05.351500 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:43:05.351472 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-6b5579666b-ws9zx_aea05870-5faf-4066-b9b2-c179cf5245f4/kube-auth-proxy/0.log"
Apr 16 19:43:05.404449 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:43:05.404420 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-85f49fc854-txwk9_43c093f8-2338-4e6e-8349-b4e1575f5161/router/0.log"
Apr 16 19:43:05.868190 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:43:05.868163 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-4dv2w_4063c915-74be-464e-845c-caccf6e297c5/serve-healthcheck-canary/0.log"
Apr 16 19:43:06.541143 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:43:06.541117 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-xzqwj_e7dc6e11-9200-454f-9b2e-e9d2081a2b29/kube-rbac-proxy/0.log"
Apr 16 19:43:06.560489 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:43:06.560460 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-xzqwj_e7dc6e11-9200-454f-9b2e-e9d2081a2b29/exporter/0.log"
Apr 16 19:43:06.580257 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:43:06.580238 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-xzqwj_e7dc6e11-9200-454f-9b2e-e9d2081a2b29/extractor/0.log"
Apr 16 19:43:08.328582 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:43:08.328556 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-44kfz_cfc99283-cf9c-4242-a5ac-b07e345d04f7/manager/0.log"
Apr 16 19:43:08.395266 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:43:08.395244 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-7jbgv_2f1f4763-90e1-44e4-8429-6c99794ae398/manager/0.log"
Apr 16 19:43:08.410961 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:43:08.410937 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-7jbgv_2f1f4763-90e1-44e4-8429-6c99794ae398/manager/1.log"
Apr 16 19:43:08.431743 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:43:08.431720 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-57586b9555-2zhnq_6289dbfc-b014-4bf2-926d-ea8c13ae04b0/manager/0.log"
Apr 16 19:43:09.003105 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:43:09.003083 2579 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-r5h99/perf-node-gather-daemonset-pbbdv"
Apr 16 19:43:09.626715 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:43:09.626686 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-h4drj_419999fc-bda6-4250-8aa0-5f0797260078/openshift-lws-operator/0.log"
Apr 16 19:43:14.256941 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:43:14.256888 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-ph52k_33c6084e-53d2-4e83-85bf-1e53dca8d967/kube-storage-version-migrator-operator/1.log"
Apr 16 19:43:14.258455 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:43:14.258419 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-ph52k_33c6084e-53d2-4e83-85bf-1e53dca8d967/kube-storage-version-migrator-operator/0.log"
Apr 16 19:43:15.545969 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:43:15.545944 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-whspd_6c0704bd-2f0f-4e78-8573-cf9346b4ae16/kube-multus-additional-cni-plugins/0.log"
Apr 16 19:43:15.568393 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:43:15.568361 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-whspd_6c0704bd-2f0f-4e78-8573-cf9346b4ae16/egress-router-binary-copy/0.log"
Apr 16 19:43:15.588382 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:43:15.588355 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-whspd_6c0704bd-2f0f-4e78-8573-cf9346b4ae16/cni-plugins/0.log"
Apr 16 19:43:15.606426 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:43:15.606405 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-whspd_6c0704bd-2f0f-4e78-8573-cf9346b4ae16/bond-cni-plugin/0.log"
Apr 16 19:43:15.626253 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:43:15.626231 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-whspd_6c0704bd-2f0f-4e78-8573-cf9346b4ae16/routeoverride-cni/0.log"
Apr 16 19:43:15.645502 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:43:15.645485 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-whspd_6c0704bd-2f0f-4e78-8573-cf9346b4ae16/whereabouts-cni-bincopy/0.log"
Apr 16 19:43:15.665317 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:43:15.665295 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-whspd_6c0704bd-2f0f-4e78-8573-cf9346b4ae16/whereabouts-cni/0.log"
Apr 16 19:43:15.726488 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:43:15.726463 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-ncrmg_4b53a341-a257-4e51-866a-7aaefe569885/kube-multus/0.log"
Apr 16 19:43:15.749622 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:43:15.749602 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-7mh9f_ca23e8db-bb88-449f-8286-27f2978eb0ca/network-metrics-daemon/0.log"
Apr 16 19:43:15.766747 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:43:15.766729 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-7mh9f_ca23e8db-bb88-449f-8286-27f2978eb0ca/kube-rbac-proxy/0.log"
Apr 16 19:43:16.909981 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:43:16.909956 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mx7dg_f6b1a716-a116-40d7-bd7e-8947f3cfea04/ovn-controller/0.log"
Apr 16 19:43:16.933690 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:43:16.933663 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mx7dg_f6b1a716-a116-40d7-bd7e-8947f3cfea04/ovn-acl-logging/0.log"
Apr 16 19:43:16.953600 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:43:16.953579 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mx7dg_f6b1a716-a116-40d7-bd7e-8947f3cfea04/kube-rbac-proxy-node/0.log"
Apr 16 19:43:16.977629 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:43:16.977610 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mx7dg_f6b1a716-a116-40d7-bd7e-8947f3cfea04/kube-rbac-proxy-ovn-metrics/0.log"
Apr 16 19:43:16.995973 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:43:16.995955 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mx7dg_f6b1a716-a116-40d7-bd7e-8947f3cfea04/northd/0.log"
Apr 16 19:43:17.015893 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:43:17.015874 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mx7dg_f6b1a716-a116-40d7-bd7e-8947f3cfea04/nbdb/0.log"
Apr 16 19:43:17.038545 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:43:17.038528 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mx7dg_f6b1a716-a116-40d7-bd7e-8947f3cfea04/sbdb/0.log"
Apr 16 19:43:17.207897 ip-10-0-129-155 kubenswrapper[2579]: I0416 19:43:17.207854 2579 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mx7dg_f6b1a716-a116-40d7-bd7e-8947f3cfea04/ovnkube-controller/0.log"