Apr 17 21:36:43.129867 ip-10-0-132-27 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 21:36:43.510920 ip-10-0-132-27 kubenswrapper[2564]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 21:36:43.510920 ip-10-0-132-27 kubenswrapper[2564]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 21:36:43.510920 ip-10-0-132-27 kubenswrapper[2564]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 21:36:43.510920 ip-10-0-132-27 kubenswrapper[2564]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 21:36:43.510920 ip-10-0-132-27 kubenswrapper[2564]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 21:36:43.511738 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.511651    2564 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 21:36:43.513946 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.513931    2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 21:36:43.513946 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.513946    2564 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 21:36:43.514004 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.513950    2564 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 21:36:43.514004 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.513954    2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 21:36:43.514004 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.513957    2564 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 21:36:43.514004 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.513960    2564 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 21:36:43.514004 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.513963    2564 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 21:36:43.514004 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.513966    2564 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 21:36:43.514004 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.513968    2564 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 21:36:43.514004 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.513971    2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 21:36:43.514004 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.513974    2564 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 21:36:43.514004 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.513977    2564 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 21:36:43.514004 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.513980    2564 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 21:36:43.514004 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.513983    2564 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 21:36:43.514004 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.513985    2564 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 21:36:43.514004 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.513988    2564 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 21:36:43.514004 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.513991    2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 21:36:43.514004 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.513994    2564 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 21:36:43.514004 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.513996    2564 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 21:36:43.514004 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.513999    2564 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 21:36:43.514004 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514002    2564 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 21:36:43.514457 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514004    2564 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 21:36:43.514457 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514007    2564 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 21:36:43.514457 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514010    2564 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 21:36:43.514457 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514013    2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 21:36:43.514457 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514016    2564 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 21:36:43.514457 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514019    2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 21:36:43.514457 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514021    2564 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 21:36:43.514457 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514024    2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 21:36:43.514457 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514027    2564 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 21:36:43.514457 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514029    2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 21:36:43.514457 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514032    2564 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 21:36:43.514457 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514037    2564 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 21:36:43.514457 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514040    2564 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 21:36:43.514457 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514044    2564 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 21:36:43.514457 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514047    2564 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 21:36:43.514457 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514050    2564 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 21:36:43.514457 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514053    2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 21:36:43.514457 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514055    2564 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 21:36:43.514457 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514058    2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 21:36:43.514457 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514060    2564 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 21:36:43.514964 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514063    2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 21:36:43.514964 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514065    2564 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 21:36:43.514964 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514068    2564 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 21:36:43.514964 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514070    2564 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 21:36:43.514964 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514073    2564 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 21:36:43.514964 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514076    2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 21:36:43.514964 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514078    2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 21:36:43.514964 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514081    2564 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 21:36:43.514964 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514083    2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 21:36:43.514964 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514086    2564 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 21:36:43.514964 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514089    2564 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 21:36:43.514964 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514091    2564 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 21:36:43.514964 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514094    2564 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 21:36:43.514964 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514098    2564 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 21:36:43.514964 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514100    2564 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 21:36:43.514964 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514103    2564 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 21:36:43.514964 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514105    2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 21:36:43.514964 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514108    2564 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 21:36:43.514964 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514111    2564 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 21:36:43.514964 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514114    2564 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 21:36:43.515442 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514117    2564 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 21:36:43.515442 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514119    2564 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 21:36:43.515442 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514122    2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 21:36:43.515442 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514124    2564 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 21:36:43.515442 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514127    2564 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 21:36:43.515442 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514130    2564 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 21:36:43.515442 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514132    2564 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 21:36:43.515442 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514135    2564 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 21:36:43.515442 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514138    2564 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 21:36:43.515442 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514140    2564 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 21:36:43.515442 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514142    2564 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 21:36:43.515442 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514146    2564 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 21:36:43.515442 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514148    2564 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 21:36:43.515442 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514151    2564 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 21:36:43.515442 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514154    2564 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 21:36:43.515442 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514156    2564 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 21:36:43.515442 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514161    2564 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 21:36:43.515442 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514164    2564 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 21:36:43.515442 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514167    2564 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 21:36:43.515920 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514170    2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 21:36:43.515920 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514173    2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 21:36:43.515920 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514175    2564 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 21:36:43.515920 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514178    2564 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 21:36:43.515920 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514181    2564 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 21:36:43.515920 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514184    2564 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 21:36:43.515920 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514587    2564 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 21:36:43.515920 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514606    2564 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 21:36:43.515920 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514609    2564 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 21:36:43.515920 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514612    2564 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 21:36:43.515920 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514615    2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 21:36:43.515920 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514618    2564 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 21:36:43.515920 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514621    2564 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 21:36:43.515920 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514625    2564 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 21:36:43.515920 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514629    2564 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 21:36:43.515920 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514632    2564 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 21:36:43.515920 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514636    2564 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 21:36:43.515920 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514638    2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 21:36:43.515920 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514642    2564 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 21:36:43.516410 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514645    2564 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 21:36:43.516410 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514648    2564 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 21:36:43.516410 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514650    2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 21:36:43.516410 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514653    2564 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 21:36:43.516410 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514655    2564 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 21:36:43.516410 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514658    2564 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 21:36:43.516410 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514661    2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 21:36:43.516410 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514663    2564 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 21:36:43.516410 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514666    2564 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 21:36:43.516410 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514668    2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 21:36:43.516410 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514671    2564 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 21:36:43.516410 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514674    2564 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 21:36:43.516410 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514677    2564 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 21:36:43.516410 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514679    2564 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 21:36:43.516410 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514682    2564 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 21:36:43.516410 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514685    2564 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 21:36:43.516410 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514687    2564 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 21:36:43.516410 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514690    2564 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 21:36:43.516410 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514692    2564 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 21:36:43.516410 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514695    2564 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 21:36:43.516925 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514698    2564 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 21:36:43.516925 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514701    2564 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 21:36:43.516925 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514703    2564 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 21:36:43.516925 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514706    2564 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 21:36:43.516925 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514708    2564 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 21:36:43.516925 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514711    2564 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 21:36:43.516925 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514714    2564 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 21:36:43.516925 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514716    2564 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 21:36:43.516925 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514719    2564 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 21:36:43.516925 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514721    2564 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 21:36:43.516925 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514724    2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 21:36:43.516925 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514726    2564 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 21:36:43.516925 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514729    2564 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 21:36:43.516925 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514732    2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 21:36:43.516925 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514735    2564 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 21:36:43.516925 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514737    2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 21:36:43.516925 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514740    2564 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 21:36:43.516925 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514743    2564 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 21:36:43.516925 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514745    2564 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 21:36:43.516925 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514747    2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 21:36:43.517414 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514751    2564 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 21:36:43.517414 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514753    2564 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 21:36:43.517414 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514756    2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 21:36:43.517414 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514759    2564 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 21:36:43.517414 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514762    2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 21:36:43.517414 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514764    2564 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 21:36:43.517414 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514767    2564 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 21:36:43.517414 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514769    2564 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 21:36:43.517414 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514772    2564 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 21:36:43.517414 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514774    2564 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 21:36:43.517414 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514777    2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 21:36:43.517414 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514780    2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 21:36:43.517414 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514783    2564 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 21:36:43.517414 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514786    2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 21:36:43.517414 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514788    2564 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 21:36:43.517414 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514791    2564 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 21:36:43.517414 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514794    2564 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 21:36:43.517414 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514799    2564 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 21:36:43.517414 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514802    2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 21:36:43.517414 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514805    2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 21:36:43.517928 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514808    2564 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 21:36:43.517928 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514810    2564 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 21:36:43.517928 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514813    2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 21:36:43.517928 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514815    2564 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 21:36:43.517928 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514818    2564 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 21:36:43.517928 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514821    2564 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 21:36:43.517928 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514823    2564 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 21:36:43.517928 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514826    2564 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 21:36:43.517928 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514830    2564 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 21:36:43.517928 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514832    2564 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 21:36:43.517928 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514835    2564 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 21:36:43.517928 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514838    2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 21:36:43.517928 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.514841    2564 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 21:36:43.517928 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.515936    2564 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 21:36:43.517928 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.515945    2564 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 21:36:43.517928 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.515952    2564 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 21:36:43.517928 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.515956    2564 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 21:36:43.517928 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.515961    2564 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 21:36:43.517928 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.515964    2564 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 21:36:43.517928 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.515968    2564 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 21:36:43.517928 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.515973    2564 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 21:36:43.518605 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.515976    2564 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 21:36:43.518605 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.515979    2564 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 21:36:43.518605 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.515983    2564 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 21:36:43.518605 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.515986    2564 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 21:36:43.518605 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.515990    2564 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 21:36:43.518605 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.515993    2564 flags.go:64] FLAG: --cgroup-root=""
Apr 17 21:36:43.518605 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.515996    2564 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 21:36:43.518605 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.515999    2564 flags.go:64] FLAG: --client-ca-file=""
Apr 17 21:36:43.518605 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516002    2564 flags.go:64] FLAG: --cloud-config=""
Apr 17 21:36:43.518605 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516005    2564 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 21:36:43.518605 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516008    2564 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 21:36:43.518605 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516013    2564 flags.go:64] FLAG: --cluster-domain=""
Apr 17 21:36:43.518605 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516016    2564 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 21:36:43.518605 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516019    2564 flags.go:64] FLAG: --config-dir=""
Apr 17 21:36:43.518605 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516022    2564 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 21:36:43.518605 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516025    2564 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 21:36:43.518605 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516029    2564 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 21:36:43.518605 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516033    2564 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 21:36:43.518605 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516036    2564 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 21:36:43.518605 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516040    2564 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 21:36:43.518605 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516043    2564 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 21:36:43.518605 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516046    2564 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 21:36:43.518605 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516049    2564 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 21:36:43.518605 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516055    2564 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 21:36:43.518605 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516058    2564 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 21:36:43.519303 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516062    2564 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 17 21:36:43.519303 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516065    2564 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 17 21:36:43.519303 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516068    2564 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 17 21:36:43.519303 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516071    2564 flags.go:64] FLAG: --enable-load-reader="false"
Apr 17 21:36:43.519303 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516074    2564 flags.go:64] FLAG: --enable-server="true"
Apr 17 21:36:43.519303 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516077    2564 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 17 21:36:43.519303 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516081    2564 flags.go:64] FLAG: --event-burst="100"
Apr 17 21:36:43.519303 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516084    2564 flags.go:64] FLAG: --event-qps="50"
Apr 17 21:36:43.519303 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516087    2564 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 17 21:36:43.519303 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516092    2564 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 17 21:36:43.519303 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516095    2564 flags.go:64] FLAG: --eviction-hard=""
Apr 17 21:36:43.519303 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516099    2564 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 17 21:36:43.519303 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516101    2564 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 17 21:36:43.519303 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516104    2564 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 17 21:36:43.519303 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516108    2564 flags.go:64] FLAG: --eviction-soft=""
Apr 17 21:36:43.519303 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516111    2564 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 17 21:36:43.519303 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516113    2564 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 17 21:36:43.519303 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516116    2564 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 17 21:36:43.519303 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516126    2564 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 17 21:36:43.519303 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516129    2564 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 17 21:36:43.519303 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516132    2564 flags.go:64] FLAG: --fail-swap-on="true"
Apr 17 21:36:43.519303 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516134    2564 flags.go:64] FLAG: --feature-gates=""
Apr 17 21:36:43.519303 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516138    2564 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 17 21:36:43.519303 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516141    2564 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 17 21:36:43.519303 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516144    2564 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 17 21:36:43.520006 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516153    2564 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 17 21:36:43.520006 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516156    2564 flags.go:64] FLAG: --healthz-port="10248"
Apr 17 21:36:43.520006 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516159    2564 flags.go:64] FLAG: --help="false"
Apr 17 21:36:43.520006 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516162    2564 flags.go:64] FLAG: --hostname-override="ip-10-0-132-27.ec2.internal"
Apr 17 21:36:43.520006 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516165    2564 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 17 21:36:43.520006 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516170    2564 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 17 21:36:43.520006 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516173    2564 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 17 21:36:43.520006 ip-10-0-132-27 kubenswrapper[2564]:
I0417 21:36:43.516176 2564 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 17 21:36:43.520006 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516180 2564 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 17 21:36:43.520006 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516182 2564 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 17 21:36:43.520006 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516185 2564 flags.go:64] FLAG: --image-service-endpoint="" Apr 17 21:36:43.520006 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516188 2564 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 17 21:36:43.520006 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516191 2564 flags.go:64] FLAG: --kube-api-burst="100" Apr 17 21:36:43.520006 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516194 2564 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 17 21:36:43.520006 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516197 2564 flags.go:64] FLAG: --kube-api-qps="50" Apr 17 21:36:43.520006 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516200 2564 flags.go:64] FLAG: --kube-reserved="" Apr 17 21:36:43.520006 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516204 2564 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 17 21:36:43.520006 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516207 2564 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 17 21:36:43.520006 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516210 2564 flags.go:64] FLAG: --kubelet-cgroups="" Apr 17 21:36:43.520006 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516213 2564 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 17 21:36:43.520006 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516215 2564 flags.go:64] FLAG: --lock-file="" Apr 17 21:36:43.520006 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516218 2564 
flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 17 21:36:43.520006 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516221 2564 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 17 21:36:43.520006 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516224 2564 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 17 21:36:43.520684 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516230 2564 flags.go:64] FLAG: --log-json-split-stream="false" Apr 17 21:36:43.520684 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516232 2564 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 17 21:36:43.520684 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516235 2564 flags.go:64] FLAG: --log-text-split-stream="false" Apr 17 21:36:43.520684 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516238 2564 flags.go:64] FLAG: --logging-format="text" Apr 17 21:36:43.520684 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516241 2564 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 17 21:36:43.520684 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516244 2564 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 17 21:36:43.520684 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516247 2564 flags.go:64] FLAG: --manifest-url="" Apr 17 21:36:43.520684 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516250 2564 flags.go:64] FLAG: --manifest-url-header="" Apr 17 21:36:43.520684 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516254 2564 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 17 21:36:43.520684 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516264 2564 flags.go:64] FLAG: --max-open-files="1000000" Apr 17 21:36:43.520684 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516268 2564 flags.go:64] FLAG: --max-pods="110" Apr 17 21:36:43.520684 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516271 2564 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 17 21:36:43.520684 ip-10-0-132-27 
kubenswrapper[2564]: I0417 21:36:43.516274 2564 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 17 21:36:43.520684 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516278 2564 flags.go:64] FLAG: --memory-manager-policy="None" Apr 17 21:36:43.520684 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516282 2564 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 17 21:36:43.520684 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516285 2564 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 17 21:36:43.520684 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516288 2564 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 17 21:36:43.520684 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516291 2564 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 17 21:36:43.520684 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516302 2564 flags.go:64] FLAG: --node-status-max-images="50" Apr 17 21:36:43.520684 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516305 2564 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 17 21:36:43.520684 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516308 2564 flags.go:64] FLAG: --oom-score-adj="-999" Apr 17 21:36:43.520684 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516311 2564 flags.go:64] FLAG: --pod-cidr="" Apr 17 21:36:43.520684 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516314 2564 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 17 21:36:43.521320 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516320 2564 flags.go:64] FLAG: --pod-manifest-path="" Apr 17 21:36:43.521320 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516323 2564 flags.go:64] FLAG: --pod-max-pids="-1" Apr 17 21:36:43.521320 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516327 2564 flags.go:64] FLAG: --pods-per-core="0" Apr 
17 21:36:43.521320 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516330 2564 flags.go:64] FLAG: --port="10250" Apr 17 21:36:43.521320 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516333 2564 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 17 21:36:43.521320 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516336 2564 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0156b55d350ddfbf3" Apr 17 21:36:43.521320 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516339 2564 flags.go:64] FLAG: --qos-reserved="" Apr 17 21:36:43.521320 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516342 2564 flags.go:64] FLAG: --read-only-port="10255" Apr 17 21:36:43.521320 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516345 2564 flags.go:64] FLAG: --register-node="true" Apr 17 21:36:43.521320 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516347 2564 flags.go:64] FLAG: --register-schedulable="true" Apr 17 21:36:43.521320 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516350 2564 flags.go:64] FLAG: --register-with-taints="" Apr 17 21:36:43.521320 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516354 2564 flags.go:64] FLAG: --registry-burst="10" Apr 17 21:36:43.521320 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516357 2564 flags.go:64] FLAG: --registry-qps="5" Apr 17 21:36:43.521320 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516359 2564 flags.go:64] FLAG: --reserved-cpus="" Apr 17 21:36:43.521320 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516362 2564 flags.go:64] FLAG: --reserved-memory="" Apr 17 21:36:43.521320 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516366 2564 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 17 21:36:43.521320 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516369 2564 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 17 21:36:43.521320 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516372 2564 flags.go:64] FLAG: --rotate-certificates="false" Apr 17 21:36:43.521320 
ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516374 2564 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 17 21:36:43.521320 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516383 2564 flags.go:64] FLAG: --runonce="false" Apr 17 21:36:43.521320 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516386 2564 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 17 21:36:43.521320 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516389 2564 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 17 21:36:43.521320 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516394 2564 flags.go:64] FLAG: --seccomp-default="false" Apr 17 21:36:43.521320 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516396 2564 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 17 21:36:43.521320 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516399 2564 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 17 21:36:43.521320 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516402 2564 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 17 21:36:43.522082 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516405 2564 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 17 21:36:43.522082 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516408 2564 flags.go:64] FLAG: --storage-driver-password="root" Apr 17 21:36:43.522082 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516411 2564 flags.go:64] FLAG: --storage-driver-secure="false" Apr 17 21:36:43.522082 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516414 2564 flags.go:64] FLAG: --storage-driver-table="stats" Apr 17 21:36:43.522082 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516417 2564 flags.go:64] FLAG: --storage-driver-user="root" Apr 17 21:36:43.522082 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516420 2564 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 17 21:36:43.522082 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516423 
2564 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 17 21:36:43.522082 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516426 2564 flags.go:64] FLAG: --system-cgroups="" Apr 17 21:36:43.522082 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516429 2564 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 17 21:36:43.522082 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516435 2564 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 17 21:36:43.522082 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516438 2564 flags.go:64] FLAG: --tls-cert-file="" Apr 17 21:36:43.522082 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516441 2564 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 17 21:36:43.522082 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516448 2564 flags.go:64] FLAG: --tls-min-version="" Apr 17 21:36:43.522082 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516450 2564 flags.go:64] FLAG: --tls-private-key-file="" Apr 17 21:36:43.522082 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516453 2564 flags.go:64] FLAG: --topology-manager-policy="none" Apr 17 21:36:43.522082 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516456 2564 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 17 21:36:43.522082 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516459 2564 flags.go:64] FLAG: --topology-manager-scope="container" Apr 17 21:36:43.522082 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516462 2564 flags.go:64] FLAG: --v="2" Apr 17 21:36:43.522082 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516466 2564 flags.go:64] FLAG: --version="false" Apr 17 21:36:43.522082 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516470 2564 flags.go:64] FLAG: --vmodule="" Apr 17 21:36:43.522082 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.516474 2564 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 17 21:36:43.522082 ip-10-0-132-27 kubenswrapper[2564]: I0417 
21:36:43.516477 2564 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 17 21:36:43.522082 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.516617 2564 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 21:36:43.522082 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.516622 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 21:36:43.522750 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.516625 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 21:36:43.522750 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.516628 2564 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 21:36:43.522750 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.516631 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 21:36:43.522750 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.516634 2564 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 21:36:43.522750 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.516639 2564 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 21:36:43.522750 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.516641 2564 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 21:36:43.522750 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.516645 2564 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 17 21:36:43.522750 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.516649 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 21:36:43.522750 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.516652 2564 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 21:36:43.522750 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.516655 2564 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 21:36:43.522750 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.516657 2564 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 21:36:43.522750 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.516660 2564 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 21:36:43.522750 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.516663 2564 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 21:36:43.522750 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.516665 2564 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 21:36:43.522750 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.516668 2564 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 21:36:43.522750 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.516670 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 21:36:43.522750 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.516673 2564 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 21:36:43.522750 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.516676 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 21:36:43.522750 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.516678 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 21:36:43.523234 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.516681 2564 feature_gate.go:328] unrecognized feature 
gate: NoRegistryClusterOperations Apr 17 21:36:43.523234 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.516683 2564 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 21:36:43.523234 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.516686 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 21:36:43.523234 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.516689 2564 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 21:36:43.523234 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.516691 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 21:36:43.523234 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.516694 2564 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 21:36:43.523234 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.516697 2564 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 21:36:43.523234 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.516699 2564 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 21:36:43.523234 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.516702 2564 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 21:36:43.523234 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.516705 2564 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 21:36:43.523234 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.516708 2564 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 21:36:43.523234 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.516710 2564 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 21:36:43.523234 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.516713 2564 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 21:36:43.523234 ip-10-0-132-27 kubenswrapper[2564]: W0417 
21:36:43.516715 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 21:36:43.523234 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.516718 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 21:36:43.523234 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.516721 2564 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 21:36:43.523234 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.516729 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 21:36:43.523234 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.516736 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 21:36:43.523234 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.516738 2564 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 21:36:43.523234 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.516741 2564 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 21:36:43.523767 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.516744 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 21:36:43.523767 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.516746 2564 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 21:36:43.523767 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.516749 2564 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 21:36:43.523767 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.516751 2564 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 21:36:43.523767 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.516754 2564 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 21:36:43.523767 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.516756 2564 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 21:36:43.523767 ip-10-0-132-27 
kubenswrapper[2564]: W0417 21:36:43.516759 2564 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 21:36:43.523767 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.516761 2564 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 21:36:43.523767 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.516763 2564 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 21:36:43.523767 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.516766 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 21:36:43.523767 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.516769 2564 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 21:36:43.523767 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.516771 2564 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 21:36:43.523767 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.516774 2564 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 21:36:43.523767 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.516776 2564 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 21:36:43.523767 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.516779 2564 feature_gate.go:328] unrecognized feature gate: Example Apr 17 21:36:43.523767 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.516781 2564 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 21:36:43.523767 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.516784 2564 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 21:36:43.523767 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.516786 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 21:36:43.523767 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.516789 2564 
feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 21:36:43.523767 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.516792 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 21:36:43.524264 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.516794 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 21:36:43.524264 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.516797 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 21:36:43.524264 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.516800 2564 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 21:36:43.524264 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.516802 2564 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 21:36:43.524264 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.516805 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 21:36:43.524264 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.516807 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 21:36:43.524264 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.516809 2564 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 21:36:43.524264 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.516812 2564 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 21:36:43.524264 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.516814 2564 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 21:36:43.524264 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.516824 2564 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 21:36:43.524264 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.516827 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 21:36:43.524264 ip-10-0-132-27 kubenswrapper[2564]: 
W0417 21:36:43.516830 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 21:36:43.524264 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.516832 2564 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 21:36:43.524264 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.516835 2564 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 21:36:43.524264 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.516837 2564 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 21:36:43.524264 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.516840 2564 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 21:36:43.524264 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.516842 2564 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 21:36:43.524264 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.516845 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 21:36:43.524264 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.516847 2564 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 21:36:43.524264 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.516850 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 21:36:43.524785 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.516852 2564 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 21:36:43.524785 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.516855 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 21:36:43.524785 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.516858 2564 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 21:36:43.524785 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.516861 2564 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. 
It will be removed in a future release.
Apr 17 21:36:43.524785 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.516864 2564 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 21:36:43.524785 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.517472 2564 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 21:36:43.524785 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.523936 2564 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 17 21:36:43.524785 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.523952 2564 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 17 21:36:43.524785 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524000 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 21:36:43.524785 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524005 2564 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 21:36:43.524785 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524008 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 21:36:43.524785 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524011 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 21:36:43.524785 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524014 2564 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 21:36:43.524785 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524016 2564 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 21:36:43.524785 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524019 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 21:36:43.525161 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524022 2564 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 21:36:43.525161 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524024 2564 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 21:36:43.525161 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524027 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 21:36:43.525161 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524030 2564 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 21:36:43.525161 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524032 2564 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 21:36:43.525161 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524035 2564 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 21:36:43.525161 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524038 2564 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 21:36:43.525161 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524040 2564 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 21:36:43.525161 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524043 2564 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 21:36:43.525161 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524045 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 21:36:43.525161 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524048 2564 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 21:36:43.525161 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524050 2564 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 21:36:43.525161 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524053 2564 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 21:36:43.525161 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524055 2564 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 21:36:43.525161 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524058 2564 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 21:36:43.525161 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524060 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 21:36:43.525161 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524064 2564 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 21:36:43.525161 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524068 2564 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 21:36:43.525161 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524071 2564 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 21:36:43.525161 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524074 2564 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 21:36:43.525838 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524076 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 21:36:43.525838 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524079 2564 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 21:36:43.525838 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524082 2564 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 21:36:43.525838 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524086 2564 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 21:36:43.525838 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524099 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 21:36:43.525838 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524103 2564 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 21:36:43.525838 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524106 2564 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 21:36:43.525838 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524109 2564 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 21:36:43.525838 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524112 2564 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 21:36:43.525838 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524115 2564 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 21:36:43.525838 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524117 2564 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 21:36:43.525838 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524120 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 21:36:43.525838 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524123 2564 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 21:36:43.525838 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524125 2564 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 21:36:43.525838 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524128 2564 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 21:36:43.525838 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524130 2564 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 21:36:43.525838 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524133 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 21:36:43.525838 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524135 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 21:36:43.525838 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524138 2564 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 21:36:43.526608 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524141 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 21:36:43.526608 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524143 2564 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 21:36:43.526608 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524146 2564 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 21:36:43.526608 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524148 2564 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 21:36:43.526608 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524151 2564 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 21:36:43.526608 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524153 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 21:36:43.526608 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524156 2564 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 21:36:43.526608 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524158 2564 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 21:36:43.526608 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524161 2564 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 21:36:43.526608 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524163 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 21:36:43.526608 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524166 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 21:36:43.526608 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524169 2564 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 21:36:43.526608 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524172 2564 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 21:36:43.526608 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524174 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 21:36:43.526608 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524177 2564 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 21:36:43.526608 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524180 2564 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 21:36:43.526608 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524182 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 21:36:43.526608 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524185 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 21:36:43.526608 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524194 2564 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 21:36:43.526608 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524197 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 21:36:43.527127 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524199 2564 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 21:36:43.527127 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524202 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 21:36:43.527127 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524204 2564 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 21:36:43.527127 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524207 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 21:36:43.527127 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524209 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 21:36:43.527127 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524212 2564 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 21:36:43.527127 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524215 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 21:36:43.527127 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524217 2564 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 21:36:43.527127 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524220 2564 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 21:36:43.527127 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524222 2564 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 21:36:43.527127 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524225 2564 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 21:36:43.527127 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524228 2564 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 21:36:43.527127 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524230 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 21:36:43.527127 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524233 2564 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 21:36:43.527127 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524235 2564 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 21:36:43.527127 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524238 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 21:36:43.527127 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524241 2564 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 21:36:43.527127 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524243 2564 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 21:36:43.527127 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524245 2564 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 21:36:43.527127 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524248 2564 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 21:36:43.527631 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.524253 2564 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 21:36:43.527631 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524380 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 21:36:43.527631 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524384 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 21:36:43.527631 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524388 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 21:36:43.527631 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524391 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 21:36:43.527631 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524394 2564 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 21:36:43.527631 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524397 2564 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 21:36:43.527631 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524400 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 21:36:43.527631 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524402 2564 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 21:36:43.527631 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524405 2564 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 21:36:43.527631 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524408 2564 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 21:36:43.527631 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524417 2564 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 21:36:43.527631 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524420 2564 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 21:36:43.527631 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524422 2564 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 21:36:43.527631 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524425 2564 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 21:36:43.528020 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524427 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 21:36:43.528020 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524430 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 21:36:43.528020 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524432 2564 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 21:36:43.528020 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524435 2564 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 21:36:43.528020 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524437 2564 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 21:36:43.528020 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524440 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 21:36:43.528020 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524442 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 21:36:43.528020 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524445 2564 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 21:36:43.528020 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524447 2564 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 21:36:43.528020 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524450 2564 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 21:36:43.528020 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524452 2564 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 21:36:43.528020 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524454 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 21:36:43.528020 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524457 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 21:36:43.528020 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524460 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 21:36:43.528020 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524462 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 21:36:43.528020 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524465 2564 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 21:36:43.528020 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524467 2564 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 21:36:43.528020 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524470 2564 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 21:36:43.528020 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524473 2564 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 21:36:43.528543 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524475 2564 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 21:36:43.528543 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524477 2564 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 21:36:43.528543 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524480 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 21:36:43.528543 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524483 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 21:36:43.528543 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524486 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 21:36:43.528543 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524489 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 21:36:43.528543 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524491 2564 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 21:36:43.528543 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524494 2564 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 21:36:43.528543 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524496 2564 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 21:36:43.528543 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524499 2564 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 21:36:43.528543 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524509 2564 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 21:36:43.528543 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524512 2564 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 21:36:43.528543 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524514 2564 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 21:36:43.528543 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524516 2564 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 21:36:43.528543 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524519 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 21:36:43.528543 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524522 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 21:36:43.528543 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524524 2564 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 21:36:43.528543 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524527 2564 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 21:36:43.528543 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524530 2564 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 21:36:43.528543 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524534 2564 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 21:36:43.529154 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524537 2564 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 21:36:43.529154 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524539 2564 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 21:36:43.529154 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524542 2564 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 21:36:43.529154 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524545 2564 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 21:36:43.529154 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524548 2564 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 21:36:43.529154 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524550 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 21:36:43.529154 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524553 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 21:36:43.529154 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524555 2564 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 21:36:43.529154 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524558 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 21:36:43.529154 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524560 2564 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 21:36:43.529154 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524563 2564 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 21:36:43.529154 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524565 2564 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 21:36:43.529154 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524568 2564 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 21:36:43.529154 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524571 2564 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 21:36:43.529154 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524573 2564 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 21:36:43.529154 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524576 2564 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 21:36:43.529154 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524584 2564 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 21:36:43.529154 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524587 2564 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 21:36:43.529154 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524606 2564 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 21:36:43.529154 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524609 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 21:36:43.529688 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524612 2564 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 21:36:43.529688 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524615 2564 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 21:36:43.529688 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524618 2564 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 21:36:43.529688 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524626 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 21:36:43.529688 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524629 2564 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 21:36:43.529688 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524632 2564 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 21:36:43.529688 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524635 2564 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 21:36:43.529688 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524637 2564 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 21:36:43.529688 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524640 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 21:36:43.529688 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524642 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 21:36:43.529688 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524645 2564 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 21:36:43.529688 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524648 2564 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 21:36:43.529688 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:43.524651 2564 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 21:36:43.529688 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.524656 2564 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 21:36:43.529688 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.524778 2564 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 17 21:36:43.530127 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.528031 2564 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 17 21:36:43.530127 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.529070 2564 server.go:1019] "Starting client certificate rotation"
Apr 17 21:36:43.530127 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.529172 2564 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 21:36:43.530127 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.529851 2564 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 21:36:43.551000 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.550973 2564 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 21:36:43.556052 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.556034 2564 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 21:36:43.570776 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.570749 2564 log.go:25] "Validated CRI v1 runtime API"
Apr 17 21:36:43.576148 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.576132 2564 log.go:25] "Validated CRI v1 image API"
Apr 17 21:36:43.579647 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.579631 2564 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 17 21:36:43.582509 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.582486 2564 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 8beff777-a2b3-414b-ab5b-60fb5abe6542:/dev/nvme0n1p3 be050b25-dfd2-451d-a017-f5e30c455e71:/dev/nvme0n1p4]
Apr 17 21:36:43.582585 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.582507 2564 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 17 21:36:43.584065 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.584048 2564 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 21:36:43.589469 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.589351 2564 manager.go:217] Machine: {Timestamp:2026-04-17 21:36:43.588106827 +0000 UTC m=+0.350866295 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3101497 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2fc97f2edcf7e9f996884ae5525a77 SystemUUID:ec2fc97f-2edc-f7e9-f996-884ae5525a77 BootID:4e4f783f-2676-46f7-b4e0-1ab796ce9134 Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:7f:46:7a:08:51 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:7f:46:7a:08:51 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:6a:dc:42:b4:70:43 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 17 21:36:43.589469 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.589450 2564 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 17 21:36:43.589627 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.589528 2564 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 17 21:36:43.589888 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.589869 2564 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 17 21:36:43.590018 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.589889 2564 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-132-27.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 17 21:36:43.590063 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.590027 2564 topology_manager.go:138] "Creating topology manager with none policy"
Apr 17 21:36:43.590063 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.590037 2564 container_manager_linux.go:306] "Creating device plugin manager"
Apr 17 21:36:43.590063 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.590050 2564 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 17 21:36:43.590841 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.590830 2564 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 17 21:36:43.592003 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.591993 2564 state_mem.go:36] "Initialized new in-memory state store"
Apr 17 21:36:43.592117 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.592108 2564 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 17 21:36:43.594722 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.594712 2564 kubelet.go:491] "Attempting to sync node with API server"
Apr 17 21:36:43.594766 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.594726 2564 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 17 21:36:43.594766 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.594737 2564 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 17 21:36:43.594766 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.594747 2564 kubelet.go:397] "Adding apiserver pod source"
Apr 17 21:36:43.594766 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.594756 2564 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 17 21:36:43.595744 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.595733 2564 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 17 21:36:43.595788 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.595750 2564 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 17 21:36:43.598299 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.598280 2564 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 17 21:36:43.599669 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.599655 2564 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 17 21:36:43.601386 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.601370 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 17 21:36:43.601452 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.601392 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 17 21:36:43.601452 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.601399 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 17 21:36:43.601452 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.601408 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 17 21:36:43.601452 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.601415 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 17 21:36:43.601452 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.601421 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 17 21:36:43.601452 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.601429 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 17 21:36:43.601452 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.601438 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 17 21:36:43.601452 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.601448 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 17 21:36:43.601452 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.601456 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 17 21:36:43.601741 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.601468 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 17 21:36:43.601741 
ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.601481 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 17 21:36:43.602247 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.602233 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 17 21:36:43.602296 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.602252 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 17 21:36:43.605508 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.605490 2564 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-rlxlf" Apr 17 21:36:43.605819 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.605809 2564 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 17 21:36:43.605856 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.605844 2564 server.go:1295] "Started kubelet" Apr 17 21:36:43.605916 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.605890 2564 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 17 21:36:43.606078 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.606010 2564 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 17 21:36:43.606129 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.606093 2564 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 17 21:36:43.606800 ip-10-0-132-27 systemd[1]: Started Kubernetes Kubelet. 
Apr 17 21:36:43.607115 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.607097 2564 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-132-27.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 17 21:36:43.607187 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:36:43.607129 2564 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-132-27.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 17 21:36:43.607335 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:36:43.607207 2564 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 17 21:36:43.607460 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.607444 2564 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 17 21:36:43.608991 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.608976 2564 server.go:317] "Adding debug handlers to kubelet server"
Apr 17 21:36:43.610435 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.610414 2564 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-rlxlf"
Apr 17 21:36:43.611267 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:36:43.610507 2564 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-132-27.ec2.internal.18a7429928f97e4d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-132-27.ec2.internal,UID:ip-10-0-132-27.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-132-27.ec2.internal,},FirstTimestamp:2026-04-17 21:36:43.605818957 +0000 UTC m=+0.368578426,LastTimestamp:2026-04-17 21:36:43.605818957 +0000 UTC m=+0.368578426,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-132-27.ec2.internal,}"
Apr 17 21:36:43.613695 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:36:43.613679 2564 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 17 21:36:43.615631 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.615617 2564 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 17 21:36:43.616657 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.616643 2564 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 17 21:36:43.617283 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.617263 2564 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 17 21:36:43.617283 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.617263 2564 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 17 21:36:43.617415 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.617289 2564 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 17 21:36:43.617415 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:36:43.617375 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-27.ec2.internal\" not found"
Apr 17 21:36:43.617496 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.617421 2564 reconstruct.go:97] "Volume reconstruction finished"
Apr 17 21:36:43.617496 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.617430 2564 reconciler.go:26] "Reconciler: start to sync state"
Apr 17 21:36:43.617644 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.617628 2564 factory.go:153] Registering CRI-O factory
Apr 17 21:36:43.617711 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.617690 2564 factory.go:223] Registration of the crio container factory successfully
Apr 17 21:36:43.617810 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.617786 2564 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 17 21:36:43.617844 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.617812 2564 factory.go:55] Registering systemd factory
Apr 17 21:36:43.617844 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.617820 2564 factory.go:223] Registration of the systemd container factory successfully
Apr 17 21:36:43.617900 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.617846 2564 factory.go:103] Registering Raw factory
Apr 17 21:36:43.617900 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.617860 2564 manager.go:1196] Started watching for new ooms in manager
Apr 17 21:36:43.618577 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.618564 2564 manager.go:319] Starting recovery of all containers
Apr 17 21:36:43.619038 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:36:43.619010 2564 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 17 21:36:43.619156 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:36:43.619134 2564 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-132-27.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 17 21:36:43.630283 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.630092 2564 manager.go:324] Recovery completed
Apr 17 21:36:43.634247 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.634234 2564 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 21:36:43.636702 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.636687 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-27.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 21:36:43.636763 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.636716 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-27.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 21:36:43.636763 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.636731 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-27.ec2.internal" event="NodeHasSufficientPID"
Apr 17 21:36:43.637441 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.637424 2564 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 17 21:36:43.637441 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.637437 2564 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 17 21:36:43.637545 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.637452 2564 state_mem.go:36] "Initialized new in-memory state store"
Apr 17 21:36:43.639495 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.639483 2564 policy_none.go:49] "None policy: Start"
Apr 17 21:36:43.639530 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.639499 2564 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 17 21:36:43.639999 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.639990 2564 state_mem.go:35] "Initializing new in-memory state store"
Apr 17 21:36:43.678178 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.678164 2564 manager.go:341] "Starting Device Plugin manager"
Apr 17 21:36:43.685696 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:36:43.678194 2564 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 17 21:36:43.685696 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.678203 2564 server.go:85] "Starting device plugin registration server"
Apr 17 21:36:43.685696 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.678439 2564 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 17 21:36:43.685696 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.678452 2564 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 17 21:36:43.685696 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.678557 2564 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 17 21:36:43.685696 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.678692 2564 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 17 21:36:43.685696 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.678701 2564 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 17 21:36:43.685696 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:36:43.679204 2564 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 17 21:36:43.685696 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:36:43.679234 2564 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-132-27.ec2.internal\" not found"
Apr 17 21:36:43.687748 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.687723 2564 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 17 21:36:43.688825 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.688805 2564 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 17 21:36:43.688904 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.688835 2564 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 17 21:36:43.688904 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.688852 2564 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 17 21:36:43.688904 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.688859 2564 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 17 21:36:43.688904 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:36:43.688887 2564 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 17 21:36:43.690454 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.690437 2564 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 21:36:43.778806 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.778722 2564 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 21:36:43.779854 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.779840 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-27.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 21:36:43.779922 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.779893 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-27.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 21:36:43.779922 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.779903 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-27.ec2.internal" event="NodeHasSufficientPID"
Apr 17 21:36:43.779986 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.779928 2564 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-132-27.ec2.internal"
Apr 17 21:36:43.785178 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.785163 2564 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-132-27.ec2.internal"
Apr 17 21:36:43.785235 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:36:43.785184 2564 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-132-27.ec2.internal\": node \"ip-10-0-132-27.ec2.internal\" not found"
Apr 17 21:36:43.789008 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.788991 2564 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-27.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-132-27.ec2.internal"]
Apr 17 21:36:43.789078 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.789052 2564 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 21:36:43.789782 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.789767 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-27.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 21:36:43.789838 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.789793 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-27.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 21:36:43.789838 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.789803 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-27.ec2.internal" event="NodeHasSufficientPID"
Apr 17 21:36:43.791216 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.791204 2564 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 21:36:43.791340 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.791326 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-27.ec2.internal"
Apr 17 21:36:43.791377 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.791356 2564 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 21:36:43.791916 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.791897 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-27.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 21:36:43.792013 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.791903 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-27.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 21:36:43.792013 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.791952 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-27.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 21:36:43.792013 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.791964 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-27.ec2.internal" event="NodeHasSufficientPID"
Apr 17 21:36:43.792013 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.791928 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-27.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 21:36:43.792013 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.792008 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-27.ec2.internal" event="NodeHasSufficientPID"
Apr 17 21:36:43.792953 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.792935 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-27.ec2.internal"
Apr 17 21:36:43.793021 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.792964 2564 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 17 21:36:43.793585 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.793572 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-27.ec2.internal" event="NodeHasSufficientMemory"
Apr 17 21:36:43.793650 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.793613 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-27.ec2.internal" event="NodeHasNoDiskPressure"
Apr 17 21:36:43.793650 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.793623 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-27.ec2.internal" event="NodeHasSufficientPID"
Apr 17 21:36:43.797170 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:36:43.797149 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-27.ec2.internal\" not found"
Apr 17 21:36:43.820145 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:36:43.820122 2564 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-132-27.ec2.internal\" not found" node="ip-10-0-132-27.ec2.internal"
Apr 17 21:36:43.824400 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:36:43.824384 2564 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-132-27.ec2.internal\" not found" node="ip-10-0-132-27.ec2.internal"
Apr 17 21:36:43.898233 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:36:43.898207 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-27.ec2.internal\" not found"
Apr 17 21:36:43.918435 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.918412 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e6054250cf2e363eb96823aa69ef44c1-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-27.ec2.internal\" (UID: \"e6054250cf2e363eb96823aa69ef44c1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-27.ec2.internal"
Apr 17 21:36:43.918523 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.918438 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e6054250cf2e363eb96823aa69ef44c1-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-27.ec2.internal\" (UID: \"e6054250cf2e363eb96823aa69ef44c1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-27.ec2.internal"
Apr 17 21:36:43.918523 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:43.918457 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/2064d33e84866d45f0ed11cc547caffc-config\") pod \"kube-apiserver-proxy-ip-10-0-132-27.ec2.internal\" (UID: \"2064d33e84866d45f0ed11cc547caffc\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-27.ec2.internal"
Apr 17 21:36:43.998556 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:36:43.998524 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-27.ec2.internal\" not found"
Apr 17 21:36:44.018844 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:44.018822 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e6054250cf2e363eb96823aa69ef44c1-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-27.ec2.internal\" (UID: \"e6054250cf2e363eb96823aa69ef44c1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-27.ec2.internal"
Apr 17 21:36:44.018924 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:44.018850 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/2064d33e84866d45f0ed11cc547caffc-config\") pod \"kube-apiserver-proxy-ip-10-0-132-27.ec2.internal\" (UID: \"2064d33e84866d45f0ed11cc547caffc\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-27.ec2.internal"
Apr 17 21:36:44.018924 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:44.018869 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e6054250cf2e363eb96823aa69ef44c1-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-27.ec2.internal\" (UID: \"e6054250cf2e363eb96823aa69ef44c1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-27.ec2.internal"
Apr 17 21:36:44.018924 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:44.018913 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/2064d33e84866d45f0ed11cc547caffc-config\") pod \"kube-apiserver-proxy-ip-10-0-132-27.ec2.internal\" (UID: \"2064d33e84866d45f0ed11cc547caffc\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-27.ec2.internal"
Apr 17 21:36:44.019020 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:44.018951 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e6054250cf2e363eb96823aa69ef44c1-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-27.ec2.internal\" (UID: \"e6054250cf2e363eb96823aa69ef44c1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-27.ec2.internal"
Apr 17 21:36:44.019020 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:44.018970 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e6054250cf2e363eb96823aa69ef44c1-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-27.ec2.internal\" (UID: \"e6054250cf2e363eb96823aa69ef44c1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-27.ec2.internal"
Apr 17 21:36:44.099227 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:36:44.099199 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-27.ec2.internal\" not found"
Apr 17 21:36:44.121663 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:44.121631 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-27.ec2.internal"
Apr 17 21:36:44.127432 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:44.127417 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-27.ec2.internal"
Apr 17 21:36:44.200160 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:36:44.200130 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-27.ec2.internal\" not found"
Apr 17 21:36:44.300637 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:36:44.300600 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-27.ec2.internal\" not found"
Apr 17 21:36:44.401104 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:36:44.401023 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-27.ec2.internal\" not found"
Apr 17 21:36:44.496612 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:44.496575 2564 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 21:36:44.502141 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:36:44.502118 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-27.ec2.internal\" not found"
Apr 17 21:36:44.528647 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:44.528625 2564 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 17 21:36:44.529091 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:44.528765 2564 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 21:36:44.529091 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:44.528798 2564 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 21:36:44.594341 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:44.594309 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6054250cf2e363eb96823aa69ef44c1.slice/crio-2201e1fb8d3c62ac90805b437d3a68849a1ba0878fe82a26ea2cfe9828fa5e99 WatchSource:0}: Error finding container 2201e1fb8d3c62ac90805b437d3a68849a1ba0878fe82a26ea2cfe9828fa5e99: Status 404 returned error can't find the container with id 2201e1fb8d3c62ac90805b437d3a68849a1ba0878fe82a26ea2cfe9828fa5e99
Apr 17 21:36:44.594771 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:44.594743 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2064d33e84866d45f0ed11cc547caffc.slice/crio-4606f12d1f916ed38043bc5c5ce51f748d32f239b259184a099a186eda0256dd WatchSource:0}: Error finding container 4606f12d1f916ed38043bc5c5ce51f748d32f239b259184a099a186eda0256dd: Status 404 returned error can't find the container with id 4606f12d1f916ed38043bc5c5ce51f748d32f239b259184a099a186eda0256dd
Apr 17 21:36:44.600018 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:44.600003 2564 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 21:36:44.603199 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:36:44.603183 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-27.ec2.internal\" not found"
Apr 17 21:36:44.613601 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:44.613557 2564 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 21:31:43 +0000 UTC" deadline="2027-12-19 04:12:30.24725986 +0000 UTC"
Apr 17 21:36:44.613680 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:44.613603 2564 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14646h35m45.633672931s"
Apr 17 21:36:44.616544 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:44.616531 2564 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 17 21:36:44.625979 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:44.625961 2564 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 21:36:44.642952 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:44.642932 2564 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-bjtzs"
Apr 17 21:36:44.650008 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:44.649988 2564 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-bjtzs"
Apr 17 21:36:44.691344 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:44.691260 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-27.ec2.internal" event={"ID":"2064d33e84866d45f0ed11cc547caffc","Type":"ContainerStarted","Data":"4606f12d1f916ed38043bc5c5ce51f748d32f239b259184a099a186eda0256dd"}
Apr 17 21:36:44.692065 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:44.692044 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-27.ec2.internal" event={"ID":"e6054250cf2e363eb96823aa69ef44c1","Type":"ContainerStarted","Data":"2201e1fb8d3c62ac90805b437d3a68849a1ba0878fe82a26ea2cfe9828fa5e99"}
Apr 17 21:36:44.704233 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:36:44.704212 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-27.ec2.internal\" not found"
Apr 17 21:36:44.804759 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:36:44.804723 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-27.ec2.internal\" not found"
Apr 17 21:36:44.884609 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:44.884572 2564 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 21:36:44.917865 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:44.917840 2564 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-27.ec2.internal"
Apr 17 21:36:44.928336 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:44.928315 2564 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 21:36:44.929901 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:44.929890 2564 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-27.ec2.internal"
Apr 17 21:36:44.935663 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:44.935650 2564 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 21:36:45.122345 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.122318 2564 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 21:36:45.596276 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.596250 2564 apiserver.go:52] "Watching apiserver"
Apr 17 21:36:45.605311 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.605285 2564 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 17 21:36:45.606564 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.606535 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-bmcmp","kube-system/kube-apiserver-proxy-ip-10-0-132-27.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g55xk","openshift-cluster-node-tuning-operator/tuned-rghrv","openshift-dns/node-resolver-l7tk5","openshift-image-registry/node-ca-nlczj","openshift-multus/network-metrics-daemon-hqbt5","openshift-network-operator/iptables-alerter-5dwgb","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-27.ec2.internal","openshift-multus/multus-additional-cni-plugins-4wvmp","openshift-multus/multus-v9gzn","openshift-network-diagnostics/network-check-target-ddrrn","openshift-ovn-kubernetes/ovnkube-node-kwgmn"]
Apr 17 21:36:45.609670 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.609650 2564 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/network-metrics-daemon-hqbt5" Apr 17 21:36:45.609761 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:36:45.609734 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hqbt5" podUID="ee2090c8-65ec-46e0-9614-f6f0ddae32d7" Apr 17 21:36:45.611914 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.611896 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g55xk" Apr 17 21:36:45.614053 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.614032 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-rghrv" Apr 17 21:36:45.614517 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.614492 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-shwkf\"" Apr 17 21:36:45.614637 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.614533 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 17 21:36:45.614637 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.614496 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 17 21:36:45.614834 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.614817 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 17 21:36:45.616261 ip-10-0-132-27 kubenswrapper[2564]: I0417 
21:36:45.616245 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-l7tk5" Apr 17 21:36:45.616353 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.616270 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-jqp4x\"" Apr 17 21:36:45.616475 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.616457 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 17 21:36:45.616979 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.616961 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 17 21:36:45.618844 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.618479 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-nlczj" Apr 17 21:36:45.618844 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.618605 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 17 21:36:45.618844 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.618649 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 17 21:36:45.619049 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.618871 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-t6ql7\"" Apr 17 21:36:45.620873 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.620842 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-bmcmp" Apr 17 21:36:45.621096 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.621028 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 17 21:36:45.621251 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.621236 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 17 21:36:45.621480 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.621462 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-9pl4m\"" Apr 17 21:36:45.621577 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.621501 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 17 21:36:45.624040 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.623445 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-5dwgb" Apr 17 21:36:45.624565 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.624362 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 17 21:36:45.624667 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.624651 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 17 21:36:45.624721 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.624673 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-mk7h2\"" Apr 17 21:36:45.626092 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.626060 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ae12babd-ac52-4639-8c7c-50eb1dd9fb6b-host\") pod \"tuned-rghrv\" (UID: \"ae12babd-ac52-4639-8c7c-50eb1dd9fb6b\") " pod="openshift-cluster-node-tuning-operator/tuned-rghrv" Apr 17 21:36:45.626193 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.626110 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ae12babd-ac52-4639-8c7c-50eb1dd9fb6b-etc-systemd\") pod \"tuned-rghrv\" (UID: \"ae12babd-ac52-4639-8c7c-50eb1dd9fb6b\") " pod="openshift-cluster-node-tuning-operator/tuned-rghrv" Apr 17 21:36:45.626673 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.626650 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f1458f6c-b1bc-4b38-8141-1f70918a345d-sys-fs\") pod \"aws-ebs-csi-driver-node-g55xk\" (UID: \"f1458f6c-b1bc-4b38-8141-1f70918a345d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g55xk" Apr 17 21:36:45.626755 ip-10-0-132-27 kubenswrapper[2564]: 
I0417 21:36:45.626690 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ae12babd-ac52-4639-8c7c-50eb1dd9fb6b-etc-kubernetes\") pod \"tuned-rghrv\" (UID: \"ae12babd-ac52-4639-8c7c-50eb1dd9fb6b\") " pod="openshift-cluster-node-tuning-operator/tuned-rghrv" Apr 17 21:36:45.626755 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.626715 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ae12babd-ac52-4639-8c7c-50eb1dd9fb6b-etc-tuned\") pod \"tuned-rghrv\" (UID: \"ae12babd-ac52-4639-8c7c-50eb1dd9fb6b\") " pod="openshift-cluster-node-tuning-operator/tuned-rghrv" Apr 17 21:36:45.626755 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.626739 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ae12babd-ac52-4639-8c7c-50eb1dd9fb6b-tmp\") pod \"tuned-rghrv\" (UID: \"ae12babd-ac52-4639-8c7c-50eb1dd9fb6b\") " pod="openshift-cluster-node-tuning-operator/tuned-rghrv" Apr 17 21:36:45.626870 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.626764 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f1458f6c-b1bc-4b38-8141-1f70918a345d-device-dir\") pod \"aws-ebs-csi-driver-node-g55xk\" (UID: \"f1458f6c-b1bc-4b38-8141-1f70918a345d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g55xk" Apr 17 21:36:45.626870 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.626782 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 17 21:36:45.626870 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.626788 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ae12babd-ac52-4639-8c7c-50eb1dd9fb6b-etc-sysctl-d\") pod \"tuned-rghrv\" (UID: \"ae12babd-ac52-4639-8c7c-50eb1dd9fb6b\") " pod="openshift-cluster-node-tuning-operator/tuned-rghrv" Apr 17 21:36:45.626870 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.626829 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f1458f6c-b1bc-4b38-8141-1f70918a345d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-g55xk\" (UID: \"f1458f6c-b1bc-4b38-8141-1f70918a345d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g55xk" Apr 17 21:36:45.626870 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.626854 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8bpm\" (UniqueName: \"kubernetes.io/projected/ae12babd-ac52-4639-8c7c-50eb1dd9fb6b-kube-api-access-c8bpm\") pod \"tuned-rghrv\" (UID: \"ae12babd-ac52-4639-8c7c-50eb1dd9fb6b\") " pod="openshift-cluster-node-tuning-operator/tuned-rghrv" Apr 17 21:36:45.627009 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.626880 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/116be4c2-a389-4822-bd06-12d2e0fcf15a-hosts-file\") pod \"node-resolver-l7tk5\" (UID: \"116be4c2-a389-4822-bd06-12d2e0fcf15a\") " pod="openshift-dns/node-resolver-l7tk5" Apr 17 21:36:45.627009 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.626903 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkcgj\" (UniqueName: \"kubernetes.io/projected/116be4c2-a389-4822-bd06-12d2e0fcf15a-kube-api-access-qkcgj\") pod \"node-resolver-l7tk5\" (UID: \"116be4c2-a389-4822-bd06-12d2e0fcf15a\") " pod="openshift-dns/node-resolver-l7tk5" Apr 17 21:36:45.627009 ip-10-0-132-27 
kubenswrapper[2564]: I0417 21:36:45.626925 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/960eb4ac-0adf-443b-8b6e-b34cc770fb3a-serviceca\") pod \"node-ca-nlczj\" (UID: \"960eb4ac-0adf-443b-8b6e-b34cc770fb3a\") " pod="openshift-image-registry/node-ca-nlczj" Apr 17 21:36:45.627009 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.626948 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f1458f6c-b1bc-4b38-8141-1f70918a345d-etc-selinux\") pod \"aws-ebs-csi-driver-node-g55xk\" (UID: \"f1458f6c-b1bc-4b38-8141-1f70918a345d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g55xk" Apr 17 21:36:45.627009 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.626972 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ae12babd-ac52-4639-8c7c-50eb1dd9fb6b-etc-modprobe-d\") pod \"tuned-rghrv\" (UID: \"ae12babd-ac52-4639-8c7c-50eb1dd9fb6b\") " pod="openshift-cluster-node-tuning-operator/tuned-rghrv" Apr 17 21:36:45.627009 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.626977 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 17 21:36:45.627009 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.626995 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/960eb4ac-0adf-443b-8b6e-b34cc770fb3a-host\") pod \"node-ca-nlczj\" (UID: \"960eb4ac-0adf-443b-8b6e-b34cc770fb3a\") " pod="openshift-image-registry/node-ca-nlczj" Apr 17 21:36:45.627264 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.627018 2564 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6g849\" (UniqueName: \"kubernetes.io/projected/960eb4ac-0adf-443b-8b6e-b34cc770fb3a-kube-api-access-6g849\") pod \"node-ca-nlczj\" (UID: \"960eb4ac-0adf-443b-8b6e-b34cc770fb3a\") " pod="openshift-image-registry/node-ca-nlczj" Apr 17 21:36:45.627264 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.627042 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f1458f6c-b1bc-4b38-8141-1f70918a345d-socket-dir\") pod \"aws-ebs-csi-driver-node-g55xk\" (UID: \"f1458f6c-b1bc-4b38-8141-1f70918a345d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g55xk" Apr 17 21:36:45.627264 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.627064 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ae12babd-ac52-4639-8c7c-50eb1dd9fb6b-etc-sysconfig\") pod \"tuned-rghrv\" (UID: \"ae12babd-ac52-4639-8c7c-50eb1dd9fb6b\") " pod="openshift-cluster-node-tuning-operator/tuned-rghrv" Apr 17 21:36:45.627264 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.627088 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ae12babd-ac52-4639-8c7c-50eb1dd9fb6b-run\") pod \"tuned-rghrv\" (UID: \"ae12babd-ac52-4639-8c7c-50eb1dd9fb6b\") " pod="openshift-cluster-node-tuning-operator/tuned-rghrv" Apr 17 21:36:45.627264 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.627127 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ae12babd-ac52-4639-8c7c-50eb1dd9fb6b-var-lib-kubelet\") pod \"tuned-rghrv\" (UID: \"ae12babd-ac52-4639-8c7c-50eb1dd9fb6b\") " pod="openshift-cluster-node-tuning-operator/tuned-rghrv" Apr 
17 21:36:45.627264 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.627149 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/116be4c2-a389-4822-bd06-12d2e0fcf15a-tmp-dir\") pod \"node-resolver-l7tk5\" (UID: \"116be4c2-a389-4822-bd06-12d2e0fcf15a\") " pod="openshift-dns/node-resolver-l7tk5" Apr 17 21:36:45.627264 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.627178 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ee2090c8-65ec-46e0-9614-f6f0ddae32d7-metrics-certs\") pod \"network-metrics-daemon-hqbt5\" (UID: \"ee2090c8-65ec-46e0-9614-f6f0ddae32d7\") " pod="openshift-multus/network-metrics-daemon-hqbt5" Apr 17 21:36:45.627570 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.627294 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jx8ff\" (UniqueName: \"kubernetes.io/projected/ee2090c8-65ec-46e0-9614-f6f0ddae32d7-kube-api-access-jx8ff\") pod \"network-metrics-daemon-hqbt5\" (UID: \"ee2090c8-65ec-46e0-9614-f6f0ddae32d7\") " pod="openshift-multus/network-metrics-daemon-hqbt5" Apr 17 21:36:45.627570 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.627321 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-68vhj\"" Apr 17 21:36:45.627570 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.627331 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 17 21:36:45.627570 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.627344 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/f1458f6c-b1bc-4b38-8141-1f70918a345d-registration-dir\") pod \"aws-ebs-csi-driver-node-g55xk\" (UID: \"f1458f6c-b1bc-4b38-8141-1f70918a345d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g55xk" Apr 17 21:36:45.627570 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.627384 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ae12babd-ac52-4639-8c7c-50eb1dd9fb6b-lib-modules\") pod \"tuned-rghrv\" (UID: \"ae12babd-ac52-4639-8c7c-50eb1dd9fb6b\") " pod="openshift-cluster-node-tuning-operator/tuned-rghrv" Apr 17 21:36:45.627570 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.627412 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqp4z\" (UniqueName: \"kubernetes.io/projected/f1458f6c-b1bc-4b38-8141-1f70918a345d-kube-api-access-rqp4z\") pod \"aws-ebs-csi-driver-node-g55xk\" (UID: \"f1458f6c-b1bc-4b38-8141-1f70918a345d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g55xk" Apr 17 21:36:45.627570 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.627445 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ae12babd-ac52-4639-8c7c-50eb1dd9fb6b-etc-sysctl-conf\") pod \"tuned-rghrv\" (UID: \"ae12babd-ac52-4639-8c7c-50eb1dd9fb6b\") " pod="openshift-cluster-node-tuning-operator/tuned-rghrv" Apr 17 21:36:45.627570 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.627469 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ae12babd-ac52-4639-8c7c-50eb1dd9fb6b-sys\") pod \"tuned-rghrv\" (UID: \"ae12babd-ac52-4639-8c7c-50eb1dd9fb6b\") " pod="openshift-cluster-node-tuning-operator/tuned-rghrv" Apr 17 21:36:45.629372 ip-10-0-132-27 kubenswrapper[2564]: 
I0417 21:36:45.629034 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-4wvmp" Apr 17 21:36:45.629372 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.629125 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-v9gzn" Apr 17 21:36:45.631300 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.631268 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 17 21:36:45.631376 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.631342 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ddrrn" Apr 17 21:36:45.631429 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:36:45.631413 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-ddrrn" podUID="803936af-5a7f-4be9-bc47-8ca0f94064a9" Apr 17 21:36:45.631497 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.631427 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 17 21:36:45.631497 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.631461 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 17 21:36:45.631765 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.631738 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-mkjg4\"" Apr 17 21:36:45.631848 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.631818 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-5lft9\"" Apr 17 21:36:45.631999 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.631982 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 17 21:36:45.632059 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.632019 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 17 21:36:45.632108 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.632022 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 17 21:36:45.633846 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.633828 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kwgmn" Apr 17 21:36:45.635957 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.635921 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 17 21:36:45.636233 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.636190 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-757tt\"" Apr 17 21:36:45.636301 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.636252 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 17 21:36:45.636459 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.636441 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 17 21:36:45.636518 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.636508 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 17 21:36:45.637555 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.637536 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 17 21:36:45.637670 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.637607 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 17 21:36:45.650562 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.650536 2564 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 21:31:44 +0000 UTC" deadline="2027-10-25 23:29:25.863076888 +0000 UTC" Apr 17 21:36:45.650671 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.650562 2564 certificate_manager.go:431] 
"Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13345h52m40.212517761s" Apr 17 21:36:45.718203 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.718178 2564 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 17 21:36:45.727752 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.727724 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ae12babd-ac52-4639-8c7c-50eb1dd9fb6b-var-lib-kubelet\") pod \"tuned-rghrv\" (UID: \"ae12babd-ac52-4639-8c7c-50eb1dd9fb6b\") " pod="openshift-cluster-node-tuning-operator/tuned-rghrv" Apr 17 21:36:45.727861 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.727778 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzrnh\" (UniqueName: \"kubernetes.io/projected/1ced47e5-b9ea-4efa-8587-2c824560fd6c-kube-api-access-mzrnh\") pod \"multus-additional-cni-plugins-4wvmp\" (UID: \"1ced47e5-b9ea-4efa-8587-2c824560fd6c\") " pod="openshift-multus/multus-additional-cni-plugins-4wvmp" Apr 17 21:36:45.727861 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.727807 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/87fbd26a-4a22-4878-91ae-b4b73c69c322-os-release\") pod \"multus-v9gzn\" (UID: \"87fbd26a-4a22-4878-91ae-b4b73c69c322\") " pod="openshift-multus/multus-v9gzn" Apr 17 21:36:45.727861 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.727804 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ae12babd-ac52-4639-8c7c-50eb1dd9fb6b-var-lib-kubelet\") pod \"tuned-rghrv\" (UID: \"ae12babd-ac52-4639-8c7c-50eb1dd9fb6b\") " pod="openshift-cluster-node-tuning-operator/tuned-rghrv" Apr 17 21:36:45.727861 ip-10-0-132-27 
kubenswrapper[2564]: I0417 21:36:45.727835 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fb2a6908-4966-4b74-86c6-f31cd952cff2-host-slash\") pod \"iptables-alerter-5dwgb\" (UID: \"fb2a6908-4966-4b74-86c6-f31cd952cff2\") " pod="openshift-network-operator/iptables-alerter-5dwgb"
Apr 17 21:36:45.728038 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.727863 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5m7bz\" (UniqueName: \"kubernetes.io/projected/fb2a6908-4966-4b74-86c6-f31cd952cff2-kube-api-access-5m7bz\") pod \"iptables-alerter-5dwgb\" (UID: \"fb2a6908-4966-4b74-86c6-f31cd952cff2\") " pod="openshift-network-operator/iptables-alerter-5dwgb"
Apr 17 21:36:45.728038 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.727890 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/60e0c58d-3db8-4433-a617-00082bd25488-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kwgmn\" (UID: \"60e0c58d-3db8-4433-a617-00082bd25488\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwgmn"
Apr 17 21:36:45.728038 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.727911 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/60e0c58d-3db8-4433-a617-00082bd25488-ovnkube-config\") pod \"ovnkube-node-kwgmn\" (UID: \"60e0c58d-3db8-4433-a617-00082bd25488\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwgmn"
Apr 17 21:36:45.728038 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.727927 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/87fbd26a-4a22-4878-91ae-b4b73c69c322-multus-conf-dir\") pod \"multus-v9gzn\" (UID: \"87fbd26a-4a22-4878-91ae-b4b73c69c322\") " pod="openshift-multus/multus-v9gzn"
Apr 17 21:36:45.728038 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.727994 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jx8ff\" (UniqueName: \"kubernetes.io/projected/ee2090c8-65ec-46e0-9614-f6f0ddae32d7-kube-api-access-jx8ff\") pod \"network-metrics-daemon-hqbt5\" (UID: \"ee2090c8-65ec-46e0-9614-f6f0ddae32d7\") " pod="openshift-multus/network-metrics-daemon-hqbt5"
Apr 17 21:36:45.728038 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.728031 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f1458f6c-b1bc-4b38-8141-1f70918a345d-registration-dir\") pod \"aws-ebs-csi-driver-node-g55xk\" (UID: \"f1458f6c-b1bc-4b38-8141-1f70918a345d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g55xk"
Apr 17 21:36:45.728319 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.728059 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ae12babd-ac52-4639-8c7c-50eb1dd9fb6b-lib-modules\") pod \"tuned-rghrv\" (UID: \"ae12babd-ac52-4639-8c7c-50eb1dd9fb6b\") " pod="openshift-cluster-node-tuning-operator/tuned-rghrv"
Apr 17 21:36:45.728319 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.728086 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f1458f6c-b1bc-4b38-8141-1f70918a345d-registration-dir\") pod \"aws-ebs-csi-driver-node-g55xk\" (UID: \"f1458f6c-b1bc-4b38-8141-1f70918a345d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g55xk"
Apr 17 21:36:45.728319 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.728094 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1ced47e5-b9ea-4efa-8587-2c824560fd6c-system-cni-dir\") pod \"multus-additional-cni-plugins-4wvmp\" (UID: \"1ced47e5-b9ea-4efa-8587-2c824560fd6c\") " pod="openshift-multus/multus-additional-cni-plugins-4wvmp"
Apr 17 21:36:45.728319 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.728122 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/87fbd26a-4a22-4878-91ae-b4b73c69c322-host-run-multus-certs\") pod \"multus-v9gzn\" (UID: \"87fbd26a-4a22-4878-91ae-b4b73c69c322\") " pod="openshift-multus/multus-v9gzn"
Apr 17 21:36:45.728319 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.728146 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/60e0c58d-3db8-4433-a617-00082bd25488-run-systemd\") pod \"ovnkube-node-kwgmn\" (UID: \"60e0c58d-3db8-4433-a617-00082bd25488\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwgmn"
Apr 17 21:36:45.728319 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.728171 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrx8w\" (UniqueName: \"kubernetes.io/projected/60e0c58d-3db8-4433-a617-00082bd25488-kube-api-access-vrx8w\") pod \"ovnkube-node-kwgmn\" (UID: \"60e0c58d-3db8-4433-a617-00082bd25488\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwgmn"
Apr 17 21:36:45.728319 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.728186 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ae12babd-ac52-4639-8c7c-50eb1dd9fb6b-lib-modules\") pod \"tuned-rghrv\" (UID: \"ae12babd-ac52-4639-8c7c-50eb1dd9fb6b\") " pod="openshift-cluster-node-tuning-operator/tuned-rghrv"
Apr 17 21:36:45.728319 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.728198 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ae12babd-ac52-4639-8c7c-50eb1dd9fb6b-sys\") pod \"tuned-rghrv\" (UID: \"ae12babd-ac52-4639-8c7c-50eb1dd9fb6b\") " pod="openshift-cluster-node-tuning-operator/tuned-rghrv"
Apr 17 21:36:45.728319 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.728251 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1ced47e5-b9ea-4efa-8587-2c824560fd6c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4wvmp\" (UID: \"1ced47e5-b9ea-4efa-8587-2c824560fd6c\") " pod="openshift-multus/multus-additional-cni-plugins-4wvmp"
Apr 17 21:36:45.728319 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.728266 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ae12babd-ac52-4639-8c7c-50eb1dd9fb6b-sys\") pod \"tuned-rghrv\" (UID: \"ae12babd-ac52-4639-8c7c-50eb1dd9fb6b\") " pod="openshift-cluster-node-tuning-operator/tuned-rghrv"
Apr 17 21:36:45.728319 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.728276 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/87fbd26a-4a22-4878-91ae-b4b73c69c322-multus-cni-dir\") pod \"multus-v9gzn\" (UID: \"87fbd26a-4a22-4878-91ae-b4b73c69c322\") " pod="openshift-multus/multus-v9gzn"
Apr 17 21:36:45.728319 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.728310 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/fb2a6908-4966-4b74-86c6-f31cd952cff2-iptables-alerter-script\") pod \"iptables-alerter-5dwgb\" (UID: \"fb2a6908-4966-4b74-86c6-f31cd952cff2\") " pod="openshift-network-operator/iptables-alerter-5dwgb"
Apr 17 21:36:45.728793 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.728336 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/60e0c58d-3db8-4433-a617-00082bd25488-ovnkube-script-lib\") pod \"ovnkube-node-kwgmn\" (UID: \"60e0c58d-3db8-4433-a617-00082bd25488\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwgmn"
Apr 17 21:36:45.728793 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.728362 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ae12babd-ac52-4639-8c7c-50eb1dd9fb6b-tmp\") pod \"tuned-rghrv\" (UID: \"ae12babd-ac52-4639-8c7c-50eb1dd9fb6b\") " pod="openshift-cluster-node-tuning-operator/tuned-rghrv"
Apr 17 21:36:45.728793 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.728404 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8q6z\" (UniqueName: \"kubernetes.io/projected/87fbd26a-4a22-4878-91ae-b4b73c69c322-kube-api-access-x8q6z\") pod \"multus-v9gzn\" (UID: \"87fbd26a-4a22-4878-91ae-b4b73c69c322\") " pod="openshift-multus/multus-v9gzn"
Apr 17 21:36:45.728793 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.728460 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnbk6\" (UniqueName: \"kubernetes.io/projected/803936af-5a7f-4be9-bc47-8ca0f94064a9-kube-api-access-cnbk6\") pod \"network-check-target-ddrrn\" (UID: \"803936af-5a7f-4be9-bc47-8ca0f94064a9\") " pod="openshift-network-diagnostics/network-check-target-ddrrn"
Apr 17 21:36:45.728793 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.728488 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/60e0c58d-3db8-4433-a617-00082bd25488-env-overrides\") pod \"ovnkube-node-kwgmn\" (UID: \"60e0c58d-3db8-4433-a617-00082bd25488\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwgmn"
Apr 17 21:36:45.728793 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.728510 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/60e0c58d-3db8-4433-a617-00082bd25488-host-slash\") pod \"ovnkube-node-kwgmn\" (UID: \"60e0c58d-3db8-4433-a617-00082bd25488\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwgmn"
Apr 17 21:36:45.728793 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.728533 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/60e0c58d-3db8-4433-a617-00082bd25488-node-log\") pod \"ovnkube-node-kwgmn\" (UID: \"60e0c58d-3db8-4433-a617-00082bd25488\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwgmn"
Apr 17 21:36:45.728793 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.728555 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/116be4c2-a389-4822-bd06-12d2e0fcf15a-hosts-file\") pod \"node-resolver-l7tk5\" (UID: \"116be4c2-a389-4822-bd06-12d2e0fcf15a\") " pod="openshift-dns/node-resolver-l7tk5"
Apr 17 21:36:45.728793 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.728572 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qkcgj\" (UniqueName: \"kubernetes.io/projected/116be4c2-a389-4822-bd06-12d2e0fcf15a-kube-api-access-qkcgj\") pod \"node-resolver-l7tk5\" (UID: \"116be4c2-a389-4822-bd06-12d2e0fcf15a\") " pod="openshift-dns/node-resolver-l7tk5"
Apr 17 21:36:45.728793 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.728588 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/960eb4ac-0adf-443b-8b6e-b34cc770fb3a-serviceca\") pod \"node-ca-nlczj\" (UID: \"960eb4ac-0adf-443b-8b6e-b34cc770fb3a\") " pod="openshift-image-registry/node-ca-nlczj"
Apr 17 21:36:45.728793 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.728628 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1ced47e5-b9ea-4efa-8587-2c824560fd6c-cni-binary-copy\") pod \"multus-additional-cni-plugins-4wvmp\" (UID: \"1ced47e5-b9ea-4efa-8587-2c824560fd6c\") " pod="openshift-multus/multus-additional-cni-plugins-4wvmp"
Apr 17 21:36:45.728793 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.728647 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/60e0c58d-3db8-4433-a617-00082bd25488-host-kubelet\") pod \"ovnkube-node-kwgmn\" (UID: \"60e0c58d-3db8-4433-a617-00082bd25488\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwgmn"
Apr 17 21:36:45.728793 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.728655 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/116be4c2-a389-4822-bd06-12d2e0fcf15a-hosts-file\") pod \"node-resolver-l7tk5\" (UID: \"116be4c2-a389-4822-bd06-12d2e0fcf15a\") " pod="openshift-dns/node-resolver-l7tk5"
Apr 17 21:36:45.728793 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.728674 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/60e0c58d-3db8-4433-a617-00082bd25488-etc-openvswitch\") pod \"ovnkube-node-kwgmn\" (UID: \"60e0c58d-3db8-4433-a617-00082bd25488\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwgmn"
Apr 17 21:36:45.728793 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.728711 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/60e0c58d-3db8-4433-a617-00082bd25488-host-run-ovn-kubernetes\") pod \"ovnkube-node-kwgmn\" (UID: \"60e0c58d-3db8-4433-a617-00082bd25488\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwgmn"
Apr 17 21:36:45.728793 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.728746 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f1458f6c-b1bc-4b38-8141-1f70918a345d-etc-selinux\") pod \"aws-ebs-csi-driver-node-g55xk\" (UID: \"f1458f6c-b1bc-4b38-8141-1f70918a345d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g55xk"
Apr 17 21:36:45.728793 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.728774 2564 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 17 21:36:45.729503 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.728784 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6g849\" (UniqueName: \"kubernetes.io/projected/960eb4ac-0adf-443b-8b6e-b34cc770fb3a-kube-api-access-6g849\") pod \"node-ca-nlczj\" (UID: \"960eb4ac-0adf-443b-8b6e-b34cc770fb3a\") " pod="openshift-image-registry/node-ca-nlczj"
Apr 17 21:36:45.729503 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.728841 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f1458f6c-b1bc-4b38-8141-1f70918a345d-etc-selinux\") pod \"aws-ebs-csi-driver-node-g55xk\" (UID: \"f1458f6c-b1bc-4b38-8141-1f70918a345d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g55xk"
Apr 17 21:36:45.729503 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.728879 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/60e0c58d-3db8-4433-a617-00082bd25488-host-run-netns\") pod \"ovnkube-node-kwgmn\" (UID: \"60e0c58d-3db8-4433-a617-00082bd25488\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwgmn"
Apr 17 21:36:45.729503 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.728911 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f1458f6c-b1bc-4b38-8141-1f70918a345d-socket-dir\") pod \"aws-ebs-csi-driver-node-g55xk\" (UID: \"f1458f6c-b1bc-4b38-8141-1f70918a345d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g55xk"
Apr 17 21:36:45.729503 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.728953 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ae12babd-ac52-4639-8c7c-50eb1dd9fb6b-etc-sysconfig\") pod \"tuned-rghrv\" (UID: \"ae12babd-ac52-4639-8c7c-50eb1dd9fb6b\") " pod="openshift-cluster-node-tuning-operator/tuned-rghrv"
Apr 17 21:36:45.729503 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.728978 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ae12babd-ac52-4639-8c7c-50eb1dd9fb6b-run\") pod \"tuned-rghrv\" (UID: \"ae12babd-ac52-4639-8c7c-50eb1dd9fb6b\") " pod="openshift-cluster-node-tuning-operator/tuned-rghrv"
Apr 17 21:36:45.729503 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.728998 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/960eb4ac-0adf-443b-8b6e-b34cc770fb3a-serviceca\") pod \"node-ca-nlczj\" (UID: \"960eb4ac-0adf-443b-8b6e-b34cc770fb3a\") " pod="openshift-image-registry/node-ca-nlczj"
Apr 17 21:36:45.729503 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.729015 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1ced47e5-b9ea-4efa-8587-2c824560fd6c-cnibin\") pod \"multus-additional-cni-plugins-4wvmp\" (UID: \"1ced47e5-b9ea-4efa-8587-2c824560fd6c\") " pod="openshift-multus/multus-additional-cni-plugins-4wvmp"
Apr 17 21:36:45.729503 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.729041 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ae12babd-ac52-4639-8c7c-50eb1dd9fb6b-etc-sysconfig\") pod \"tuned-rghrv\" (UID: \"ae12babd-ac52-4639-8c7c-50eb1dd9fb6b\") " pod="openshift-cluster-node-tuning-operator/tuned-rghrv"
Apr 17 21:36:45.729503 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.729047 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f1458f6c-b1bc-4b38-8141-1f70918a345d-socket-dir\") pod \"aws-ebs-csi-driver-node-g55xk\" (UID: \"f1458f6c-b1bc-4b38-8141-1f70918a345d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g55xk"
Apr 17 21:36:45.729503 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.729060 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/87fbd26a-4a22-4878-91ae-b4b73c69c322-host-var-lib-cni-bin\") pod \"multus-v9gzn\" (UID: \"87fbd26a-4a22-4878-91ae-b4b73c69c322\") " pod="openshift-multus/multus-v9gzn"
Apr 17 21:36:45.729503 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.729107 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ae12babd-ac52-4639-8c7c-50eb1dd9fb6b-run\") pod \"tuned-rghrv\" (UID: \"ae12babd-ac52-4639-8c7c-50eb1dd9fb6b\") " pod="openshift-cluster-node-tuning-operator/tuned-rghrv"
Apr 17 21:36:45.729503 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.729147 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/60e0c58d-3db8-4433-a617-00082bd25488-log-socket\") pod \"ovnkube-node-kwgmn\" (UID: \"60e0c58d-3db8-4433-a617-00082bd25488\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwgmn"
Apr 17 21:36:45.729503 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.729175 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/116be4c2-a389-4822-bd06-12d2e0fcf15a-tmp-dir\") pod \"node-resolver-l7tk5\" (UID: \"116be4c2-a389-4822-bd06-12d2e0fcf15a\") " pod="openshift-dns/node-resolver-l7tk5"
Apr 17 21:36:45.729503 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.729194 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ee2090c8-65ec-46e0-9614-f6f0ddae32d7-metrics-certs\") pod \"network-metrics-daemon-hqbt5\" (UID: \"ee2090c8-65ec-46e0-9614-f6f0ddae32d7\") " pod="openshift-multus/network-metrics-daemon-hqbt5"
Apr 17 21:36:45.729503 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.729262 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1ced47e5-b9ea-4efa-8587-2c824560fd6c-os-release\") pod \"multus-additional-cni-plugins-4wvmp\" (UID: \"1ced47e5-b9ea-4efa-8587-2c824560fd6c\") " pod="openshift-multus/multus-additional-cni-plugins-4wvmp"
Apr 17 21:36:45.729503 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:36:45.729279 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 21:36:45.729503 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.729293 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/87fbd26a-4a22-4878-91ae-b4b73c69c322-host-run-k8s-cni-cncf-io\") pod \"multus-v9gzn\" (UID: \"87fbd26a-4a22-4878-91ae-b4b73c69c322\") " pod="openshift-multus/multus-v9gzn"
Apr 17 21:36:45.730218 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.729317 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/87fbd26a-4a22-4878-91ae-b4b73c69c322-etc-kubernetes\") pod \"multus-v9gzn\" (UID: \"87fbd26a-4a22-4878-91ae-b4b73c69c322\") " pod="openshift-multus/multus-v9gzn"
Apr 17 21:36:45.730218 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:36:45.729379 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee2090c8-65ec-46e0-9614-f6f0ddae32d7-metrics-certs podName:ee2090c8-65ec-46e0-9614-f6f0ddae32d7 nodeName:}" failed. No retries permitted until 2026-04-17 21:36:46.229318945 +0000 UTC m=+2.992078421 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ee2090c8-65ec-46e0-9614-f6f0ddae32d7-metrics-certs") pod "network-metrics-daemon-hqbt5" (UID: "ee2090c8-65ec-46e0-9614-f6f0ddae32d7") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 21:36:45.730218 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.729433 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/60e0c58d-3db8-4433-a617-00082bd25488-var-lib-openvswitch\") pod \"ovnkube-node-kwgmn\" (UID: \"60e0c58d-3db8-4433-a617-00082bd25488\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwgmn"
Apr 17 21:36:45.730218 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.729450 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/116be4c2-a389-4822-bd06-12d2e0fcf15a-tmp-dir\") pod \"node-resolver-l7tk5\" (UID: \"116be4c2-a389-4822-bd06-12d2e0fcf15a\") " pod="openshift-dns/node-resolver-l7tk5"
Apr 17 21:36:45.730218 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.729461 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/60e0c58d-3db8-4433-a617-00082bd25488-ovn-node-metrics-cert\") pod \"ovnkube-node-kwgmn\" (UID: \"60e0c58d-3db8-4433-a617-00082bd25488\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwgmn"
Apr 17 21:36:45.730218 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.729490 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rqp4z\" (UniqueName: \"kubernetes.io/projected/f1458f6c-b1bc-4b38-8141-1f70918a345d-kube-api-access-rqp4z\") pod \"aws-ebs-csi-driver-node-g55xk\" (UID: \"f1458f6c-b1bc-4b38-8141-1f70918a345d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g55xk"
Apr 17 21:36:45.730218 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.729546 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ae12babd-ac52-4639-8c7c-50eb1dd9fb6b-etc-sysctl-conf\") pod \"tuned-rghrv\" (UID: \"ae12babd-ac52-4639-8c7c-50eb1dd9fb6b\") " pod="openshift-cluster-node-tuning-operator/tuned-rghrv"
Apr 17 21:36:45.730218 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.729570 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ae12babd-ac52-4639-8c7c-50eb1dd9fb6b-host\") pod \"tuned-rghrv\" (UID: \"ae12babd-ac52-4639-8c7c-50eb1dd9fb6b\") " pod="openshift-cluster-node-tuning-operator/tuned-rghrv"
Apr 17 21:36:45.730218 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.729613 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/87fbd26a-4a22-4878-91ae-b4b73c69c322-system-cni-dir\") pod \"multus-v9gzn\" (UID: \"87fbd26a-4a22-4878-91ae-b4b73c69c322\") " pod="openshift-multus/multus-v9gzn"
Apr 17 21:36:45.730218 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.729662 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/87fbd26a-4a22-4878-91ae-b4b73c69c322-multus-socket-dir-parent\") pod \"multus-v9gzn\" (UID: \"87fbd26a-4a22-4878-91ae-b4b73c69c322\") " pod="openshift-multus/multus-v9gzn"
Apr 17 21:36:45.730218 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.729665 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ae12babd-ac52-4639-8c7c-50eb1dd9fb6b-host\") pod \"tuned-rghrv\" (UID: \"ae12babd-ac52-4639-8c7c-50eb1dd9fb6b\") " pod="openshift-cluster-node-tuning-operator/tuned-rghrv"
Apr 17 21:36:45.730218 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.729697 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/87fbd26a-4a22-4878-91ae-b4b73c69c322-host-run-netns\") pod \"multus-v9gzn\" (UID: \"87fbd26a-4a22-4878-91ae-b4b73c69c322\") " pod="openshift-multus/multus-v9gzn"
Apr 17 21:36:45.730218 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.729747 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/87fbd26a-4a22-4878-91ae-b4b73c69c322-multus-daemon-config\") pod \"multus-v9gzn\" (UID: \"87fbd26a-4a22-4878-91ae-b4b73c69c322\") " pod="openshift-multus/multus-v9gzn"
Apr 17 21:36:45.730218 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.729748 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ae12babd-ac52-4639-8c7c-50eb1dd9fb6b-etc-sysctl-conf\") pod \"tuned-rghrv\" (UID: \"ae12babd-ac52-4639-8c7c-50eb1dd9fb6b\") " pod="openshift-cluster-node-tuning-operator/tuned-rghrv"
Apr 17 21:36:45.730218 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.729794 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/60e0c58d-3db8-4433-a617-00082bd25488-run-openvswitch\") pod \"ovnkube-node-kwgmn\" (UID: \"60e0c58d-3db8-4433-a617-00082bd25488\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwgmn"
Apr 17 21:36:45.730218 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.729823 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ae12babd-ac52-4639-8c7c-50eb1dd9fb6b-etc-systemd\") pod \"tuned-rghrv\" (UID: \"ae12babd-ac52-4639-8c7c-50eb1dd9fb6b\") " pod="openshift-cluster-node-tuning-operator/tuned-rghrv"
Apr 17 21:36:45.730855 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.729843 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/87fbd26a-4a22-4878-91ae-b4b73c69c322-host-var-lib-cni-multus\") pod \"multus-v9gzn\" (UID: \"87fbd26a-4a22-4878-91ae-b4b73c69c322\") " pod="openshift-multus/multus-v9gzn"
Apr 17 21:36:45.730855 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.729866 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/60e0c58d-3db8-4433-a617-00082bd25488-run-ovn\") pod \"ovnkube-node-kwgmn\" (UID: \"60e0c58d-3db8-4433-a617-00082bd25488\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwgmn"
Apr 17 21:36:45.730855 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.729884 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f1458f6c-b1bc-4b38-8141-1f70918a345d-sys-fs\") pod \"aws-ebs-csi-driver-node-g55xk\" (UID: \"f1458f6c-b1bc-4b38-8141-1f70918a345d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g55xk"
Apr 17 21:36:45.730855 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.729873 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ae12babd-ac52-4639-8c7c-50eb1dd9fb6b-etc-systemd\") pod \"tuned-rghrv\" (UID: \"ae12babd-ac52-4639-8c7c-50eb1dd9fb6b\") " pod="openshift-cluster-node-tuning-operator/tuned-rghrv"
Apr 17 21:36:45.730855 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.729903 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ae12babd-ac52-4639-8c7c-50eb1dd9fb6b-etc-kubernetes\") pod \"tuned-rghrv\" (UID: \"ae12babd-ac52-4639-8c7c-50eb1dd9fb6b\") " pod="openshift-cluster-node-tuning-operator/tuned-rghrv"
Apr 17 21:36:45.730855 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.729932 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ae12babd-ac52-4639-8c7c-50eb1dd9fb6b-etc-tuned\") pod \"tuned-rghrv\" (UID: \"ae12babd-ac52-4639-8c7c-50eb1dd9fb6b\") " pod="openshift-cluster-node-tuning-operator/tuned-rghrv"
Apr 17 21:36:45.730855 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.729957 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/ab744ec2-8a2e-4824-b6e1-3ec78e188e1e-agent-certs\") pod \"konnectivity-agent-bmcmp\" (UID: \"ab744ec2-8a2e-4824-b6e1-3ec78e188e1e\") " pod="kube-system/konnectivity-agent-bmcmp"
Apr 17 21:36:45.730855 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.729967 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f1458f6c-b1bc-4b38-8141-1f70918a345d-sys-fs\") pod \"aws-ebs-csi-driver-node-g55xk\" (UID: \"f1458f6c-b1bc-4b38-8141-1f70918a345d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g55xk"
Apr 17 21:36:45.730855 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.729995 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/1ced47e5-b9ea-4efa-8587-2c824560fd6c-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-4wvmp\" (UID: \"1ced47e5-b9ea-4efa-8587-2c824560fd6c\") " pod="openshift-multus/multus-additional-cni-plugins-4wvmp"
Apr 17 21:36:45.730855 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.730023 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/87fbd26a-4a22-4878-91ae-b4b73c69c322-cni-binary-copy\") pod \"multus-v9gzn\" (UID: \"87fbd26a-4a22-4878-91ae-b4b73c69c322\") " pod="openshift-multus/multus-v9gzn"
Apr 17 21:36:45.730855 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.730043 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/60e0c58d-3db8-4433-a617-00082bd25488-host-cni-netd\") pod \"ovnkube-node-kwgmn\" (UID: \"60e0c58d-3db8-4433-a617-00082bd25488\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwgmn"
Apr 17 21:36:45.730855 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.730076 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f1458f6c-b1bc-4b38-8141-1f70918a345d-device-dir\") pod \"aws-ebs-csi-driver-node-g55xk\" (UID: \"f1458f6c-b1bc-4b38-8141-1f70918a345d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g55xk"
Apr 17 21:36:45.730855 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.730100 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ae12babd-ac52-4639-8c7c-50eb1dd9fb6b-etc-sysctl-d\") pod \"tuned-rghrv\" (UID: \"ae12babd-ac52-4639-8c7c-50eb1dd9fb6b\") " pod="openshift-cluster-node-tuning-operator/tuned-rghrv"
Apr 17 21:36:45.730855 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.730129 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1ced47e5-b9ea-4efa-8587-2c824560fd6c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4wvmp\" (UID: \"1ced47e5-b9ea-4efa-8587-2c824560fd6c\") " pod="openshift-multus/multus-additional-cni-plugins-4wvmp"
Apr 17 21:36:45.730855 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.730137 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ae12babd-ac52-4639-8c7c-50eb1dd9fb6b-etc-kubernetes\") pod \"tuned-rghrv\" (UID: \"ae12babd-ac52-4639-8c7c-50eb1dd9fb6b\") " pod="openshift-cluster-node-tuning-operator/tuned-rghrv"
Apr 17 21:36:45.730855 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.730145 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f1458f6c-b1bc-4b38-8141-1f70918a345d-device-dir\") pod \"aws-ebs-csi-driver-node-g55xk\" (UID: \"f1458f6c-b1bc-4b38-8141-1f70918a345d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g55xk"
Apr 17 21:36:45.730855 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.730154 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/60e0c58d-3db8-4433-a617-00082bd25488-systemd-units\") pod \"ovnkube-node-kwgmn\" (UID: \"60e0c58d-3db8-4433-a617-00082bd25488\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwgmn"
Apr 17 21:36:45.731451 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.730196 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f1458f6c-b1bc-4b38-8141-1f70918a345d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-g55xk\" (UID: \"f1458f6c-b1bc-4b38-8141-1f70918a345d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g55xk"
Apr 17 21:36:45.731451 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.730220 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c8bpm\" (UniqueName: \"kubernetes.io/projected/ae12babd-ac52-4639-8c7c-50eb1dd9fb6b-kube-api-access-c8bpm\") pod \"tuned-rghrv\" (UID: \"ae12babd-ac52-4639-8c7c-50eb1dd9fb6b\") " pod="openshift-cluster-node-tuning-operator/tuned-rghrv"
Apr 17 21:36:45.731451 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.730250 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/ab744ec2-8a2e-4824-b6e1-3ec78e188e1e-konnectivity-ca\") pod \"konnectivity-agent-bmcmp\" (UID: \"ab744ec2-8a2e-4824-b6e1-3ec78e188e1e\") " pod="kube-system/konnectivity-agent-bmcmp"
Apr 17 21:36:45.731451 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.730265 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f1458f6c-b1bc-4b38-8141-1f70918a345d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-g55xk\" (UID: \"f1458f6c-b1bc-4b38-8141-1f70918a345d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g55xk"
Apr 17 21:36:45.731451 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.730277 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/87fbd26a-4a22-4878-91ae-b4b73c69c322-cnibin\") pod \"multus-v9gzn\" (UID: \"87fbd26a-4a22-4878-91ae-b4b73c69c322\") " pod="openshift-multus/multus-v9gzn"
Apr 17 21:36:45.731451 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.730251 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ae12babd-ac52-4639-8c7c-50eb1dd9fb6b-etc-sysctl-d\") pod \"tuned-rghrv\" (UID: \"ae12babd-ac52-4639-8c7c-50eb1dd9fb6b\") " pod="openshift-cluster-node-tuning-operator/tuned-rghrv"
Apr 17 21:36:45.731451 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.730301 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/87fbd26a-4a22-4878-91ae-b4b73c69c322-hostroot\") pod \"multus-v9gzn\" (UID: \"87fbd26a-4a22-4878-91ae-b4b73c69c322\") " pod="openshift-multus/multus-v9gzn"
Apr 17 21:36:45.731451 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.730327 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ae12babd-ac52-4639-8c7c-50eb1dd9fb6b-etc-modprobe-d\") pod \"tuned-rghrv\" (UID: \"ae12babd-ac52-4639-8c7c-50eb1dd9fb6b\") " pod="openshift-cluster-node-tuning-operator/tuned-rghrv"
Apr 17 21:36:45.731451 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.730349 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/960eb4ac-0adf-443b-8b6e-b34cc770fb3a-host\") pod \"node-ca-nlczj\" (UID: \"960eb4ac-0adf-443b-8b6e-b34cc770fb3a\") " pod="openshift-image-registry/node-ca-nlczj"
Apr 17 21:36:45.731451 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.730397 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/87fbd26a-4a22-4878-91ae-b4b73c69c322-host-var-lib-kubelet\") pod \"multus-v9gzn\" (UID: \"87fbd26a-4a22-4878-91ae-b4b73c69c322\") " pod="openshift-multus/multus-v9gzn"
Apr 17 21:36:45.731451 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.730427 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/60e0c58d-3db8-4433-a617-00082bd25488-host-cni-bin\") pod \"ovnkube-node-kwgmn\" (UID: \"60e0c58d-3db8-4433-a617-00082bd25488\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwgmn"
Apr 17 21:36:45.731451 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.730526 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/960eb4ac-0adf-443b-8b6e-b34cc770fb3a-host\") pod \"node-ca-nlczj\" (UID: \"960eb4ac-0adf-443b-8b6e-b34cc770fb3a\") " pod="openshift-image-registry/node-ca-nlczj"
Apr 17 21:36:45.731451 ip-10-0-132-27
kubenswrapper[2564]: I0417 21:36:45.730542 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ae12babd-ac52-4639-8c7c-50eb1dd9fb6b-etc-modprobe-d\") pod \"tuned-rghrv\" (UID: \"ae12babd-ac52-4639-8c7c-50eb1dd9fb6b\") " pod="openshift-cluster-node-tuning-operator/tuned-rghrv" Apr 17 21:36:45.732043 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.731937 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ae12babd-ac52-4639-8c7c-50eb1dd9fb6b-tmp\") pod \"tuned-rghrv\" (UID: \"ae12babd-ac52-4639-8c7c-50eb1dd9fb6b\") " pod="openshift-cluster-node-tuning-operator/tuned-rghrv" Apr 17 21:36:45.732889 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.732870 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ae12babd-ac52-4639-8c7c-50eb1dd9fb6b-etc-tuned\") pod \"tuned-rghrv\" (UID: \"ae12babd-ac52-4639-8c7c-50eb1dd9fb6b\") " pod="openshift-cluster-node-tuning-operator/tuned-rghrv" Apr 17 21:36:45.738614 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.738567 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6g849\" (UniqueName: \"kubernetes.io/projected/960eb4ac-0adf-443b-8b6e-b34cc770fb3a-kube-api-access-6g849\") pod \"node-ca-nlczj\" (UID: \"960eb4ac-0adf-443b-8b6e-b34cc770fb3a\") " pod="openshift-image-registry/node-ca-nlczj" Apr 17 21:36:45.738614 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.738582 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkcgj\" (UniqueName: \"kubernetes.io/projected/116be4c2-a389-4822-bd06-12d2e0fcf15a-kube-api-access-qkcgj\") pod \"node-resolver-l7tk5\" (UID: \"116be4c2-a389-4822-bd06-12d2e0fcf15a\") " pod="openshift-dns/node-resolver-l7tk5" Apr 17 21:36:45.738862 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.738842 2564 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqp4z\" (UniqueName: \"kubernetes.io/projected/f1458f6c-b1bc-4b38-8141-1f70918a345d-kube-api-access-rqp4z\") pod \"aws-ebs-csi-driver-node-g55xk\" (UID: \"f1458f6c-b1bc-4b38-8141-1f70918a345d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g55xk" Apr 17 21:36:45.738934 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.738915 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8bpm\" (UniqueName: \"kubernetes.io/projected/ae12babd-ac52-4639-8c7c-50eb1dd9fb6b-kube-api-access-c8bpm\") pod \"tuned-rghrv\" (UID: \"ae12babd-ac52-4639-8c7c-50eb1dd9fb6b\") " pod="openshift-cluster-node-tuning-operator/tuned-rghrv" Apr 17 21:36:45.739254 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.739234 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jx8ff\" (UniqueName: \"kubernetes.io/projected/ee2090c8-65ec-46e0-9614-f6f0ddae32d7-kube-api-access-jx8ff\") pod \"network-metrics-daemon-hqbt5\" (UID: \"ee2090c8-65ec-46e0-9614-f6f0ddae32d7\") " pod="openshift-multus/network-metrics-daemon-hqbt5" Apr 17 21:36:45.756178 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.756159 2564 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 21:36:45.831354 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.831328 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1ced47e5-b9ea-4efa-8587-2c824560fd6c-cni-binary-copy\") pod \"multus-additional-cni-plugins-4wvmp\" (UID: \"1ced47e5-b9ea-4efa-8587-2c824560fd6c\") " pod="openshift-multus/multus-additional-cni-plugins-4wvmp" Apr 17 21:36:45.831354 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.831357 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/60e0c58d-3db8-4433-a617-00082bd25488-host-kubelet\") pod \"ovnkube-node-kwgmn\" (UID: \"60e0c58d-3db8-4433-a617-00082bd25488\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwgmn" Apr 17 21:36:45.831605 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.831375 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/60e0c58d-3db8-4433-a617-00082bd25488-etc-openvswitch\") pod \"ovnkube-node-kwgmn\" (UID: \"60e0c58d-3db8-4433-a617-00082bd25488\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwgmn" Apr 17 21:36:45.831605 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.831394 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/60e0c58d-3db8-4433-a617-00082bd25488-host-run-ovn-kubernetes\") pod \"ovnkube-node-kwgmn\" (UID: \"60e0c58d-3db8-4433-a617-00082bd25488\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwgmn" Apr 17 21:36:45.831605 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.831410 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/60e0c58d-3db8-4433-a617-00082bd25488-host-run-netns\") pod \"ovnkube-node-kwgmn\" (UID: \"60e0c58d-3db8-4433-a617-00082bd25488\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwgmn" Apr 17 21:36:45.831605 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.831436 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1ced47e5-b9ea-4efa-8587-2c824560fd6c-cnibin\") pod \"multus-additional-cni-plugins-4wvmp\" (UID: \"1ced47e5-b9ea-4efa-8587-2c824560fd6c\") " pod="openshift-multus/multus-additional-cni-plugins-4wvmp" Apr 17 21:36:45.831605 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.831456 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/60e0c58d-3db8-4433-a617-00082bd25488-host-kubelet\") pod \"ovnkube-node-kwgmn\" (UID: \"60e0c58d-3db8-4433-a617-00082bd25488\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwgmn" Apr 17 21:36:45.831605 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.831456 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/60e0c58d-3db8-4433-a617-00082bd25488-etc-openvswitch\") pod \"ovnkube-node-kwgmn\" (UID: \"60e0c58d-3db8-4433-a617-00082bd25488\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwgmn" Apr 17 21:36:45.831605 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.831464 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/60e0c58d-3db8-4433-a617-00082bd25488-host-run-ovn-kubernetes\") pod \"ovnkube-node-kwgmn\" (UID: \"60e0c58d-3db8-4433-a617-00082bd25488\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwgmn" Apr 17 21:36:45.831605 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.831459 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/87fbd26a-4a22-4878-91ae-b4b73c69c322-host-var-lib-cni-bin\") pod \"multus-v9gzn\" (UID: \"87fbd26a-4a22-4878-91ae-b4b73c69c322\") " pod="openshift-multus/multus-v9gzn" Apr 17 21:36:45.831605 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.831478 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/60e0c58d-3db8-4433-a617-00082bd25488-host-run-netns\") pod \"ovnkube-node-kwgmn\" (UID: \"60e0c58d-3db8-4433-a617-00082bd25488\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwgmn" Apr 17 21:36:45.831605 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.831505 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" 
(UniqueName: \"kubernetes.io/host-path/1ced47e5-b9ea-4efa-8587-2c824560fd6c-cnibin\") pod \"multus-additional-cni-plugins-4wvmp\" (UID: \"1ced47e5-b9ea-4efa-8587-2c824560fd6c\") " pod="openshift-multus/multus-additional-cni-plugins-4wvmp" Apr 17 21:36:45.831605 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.831503 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/87fbd26a-4a22-4878-91ae-b4b73c69c322-host-var-lib-cni-bin\") pod \"multus-v9gzn\" (UID: \"87fbd26a-4a22-4878-91ae-b4b73c69c322\") " pod="openshift-multus/multus-v9gzn" Apr 17 21:36:45.831605 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.831523 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/60e0c58d-3db8-4433-a617-00082bd25488-log-socket\") pod \"ovnkube-node-kwgmn\" (UID: \"60e0c58d-3db8-4433-a617-00082bd25488\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwgmn" Apr 17 21:36:45.831605 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.831563 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1ced47e5-b9ea-4efa-8587-2c824560fd6c-os-release\") pod \"multus-additional-cni-plugins-4wvmp\" (UID: \"1ced47e5-b9ea-4efa-8587-2c824560fd6c\") " pod="openshift-multus/multus-additional-cni-plugins-4wvmp" Apr 17 21:36:45.831605 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.831606 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/60e0c58d-3db8-4433-a617-00082bd25488-log-socket\") pod \"ovnkube-node-kwgmn\" (UID: \"60e0c58d-3db8-4433-a617-00082bd25488\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwgmn" Apr 17 21:36:45.832157 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.831616 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/87fbd26a-4a22-4878-91ae-b4b73c69c322-host-run-k8s-cni-cncf-io\") pod \"multus-v9gzn\" (UID: \"87fbd26a-4a22-4878-91ae-b4b73c69c322\") " pod="openshift-multus/multus-v9gzn" Apr 17 21:36:45.832157 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.831640 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/87fbd26a-4a22-4878-91ae-b4b73c69c322-etc-kubernetes\") pod \"multus-v9gzn\" (UID: \"87fbd26a-4a22-4878-91ae-b4b73c69c322\") " pod="openshift-multus/multus-v9gzn" Apr 17 21:36:45.832157 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.831664 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/60e0c58d-3db8-4433-a617-00082bd25488-var-lib-openvswitch\") pod \"ovnkube-node-kwgmn\" (UID: \"60e0c58d-3db8-4433-a617-00082bd25488\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwgmn" Apr 17 21:36:45.832157 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.831683 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1ced47e5-b9ea-4efa-8587-2c824560fd6c-os-release\") pod \"multus-additional-cni-plugins-4wvmp\" (UID: \"1ced47e5-b9ea-4efa-8587-2c824560fd6c\") " pod="openshift-multus/multus-additional-cni-plugins-4wvmp" Apr 17 21:36:45.832157 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.831699 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/87fbd26a-4a22-4878-91ae-b4b73c69c322-host-run-k8s-cni-cncf-io\") pod \"multus-v9gzn\" (UID: \"87fbd26a-4a22-4878-91ae-b4b73c69c322\") " pod="openshift-multus/multus-v9gzn" Apr 17 21:36:45.832157 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.831708 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/60e0c58d-3db8-4433-a617-00082bd25488-var-lib-openvswitch\") pod \"ovnkube-node-kwgmn\" (UID: \"60e0c58d-3db8-4433-a617-00082bd25488\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwgmn" Apr 17 21:36:45.832157 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.831710 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/87fbd26a-4a22-4878-91ae-b4b73c69c322-etc-kubernetes\") pod \"multus-v9gzn\" (UID: \"87fbd26a-4a22-4878-91ae-b4b73c69c322\") " pod="openshift-multus/multus-v9gzn" Apr 17 21:36:45.832157 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.831725 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/60e0c58d-3db8-4433-a617-00082bd25488-ovn-node-metrics-cert\") pod \"ovnkube-node-kwgmn\" (UID: \"60e0c58d-3db8-4433-a617-00082bd25488\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwgmn" Apr 17 21:36:45.832157 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.831744 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/87fbd26a-4a22-4878-91ae-b4b73c69c322-system-cni-dir\") pod \"multus-v9gzn\" (UID: \"87fbd26a-4a22-4878-91ae-b4b73c69c322\") " pod="openshift-multus/multus-v9gzn" Apr 17 21:36:45.832157 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.831767 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/87fbd26a-4a22-4878-91ae-b4b73c69c322-multus-socket-dir-parent\") pod \"multus-v9gzn\" (UID: \"87fbd26a-4a22-4878-91ae-b4b73c69c322\") " pod="openshift-multus/multus-v9gzn" Apr 17 21:36:45.832157 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.831786 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/87fbd26a-4a22-4878-91ae-b4b73c69c322-host-run-netns\") pod \"multus-v9gzn\" (UID: \"87fbd26a-4a22-4878-91ae-b4b73c69c322\") " pod="openshift-multus/multus-v9gzn" Apr 17 21:36:45.832157 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.831832 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/87fbd26a-4a22-4878-91ae-b4b73c69c322-system-cni-dir\") pod \"multus-v9gzn\" (UID: \"87fbd26a-4a22-4878-91ae-b4b73c69c322\") " pod="openshift-multus/multus-v9gzn" Apr 17 21:36:45.832157 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.831840 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/87fbd26a-4a22-4878-91ae-b4b73c69c322-multus-socket-dir-parent\") pod \"multus-v9gzn\" (UID: \"87fbd26a-4a22-4878-91ae-b4b73c69c322\") " pod="openshift-multus/multus-v9gzn" Apr 17 21:36:45.832157 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.831841 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/87fbd26a-4a22-4878-91ae-b4b73c69c322-multus-daemon-config\") pod \"multus-v9gzn\" (UID: \"87fbd26a-4a22-4878-91ae-b4b73c69c322\") " pod="openshift-multus/multus-v9gzn" Apr 17 21:36:45.832157 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.831848 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/87fbd26a-4a22-4878-91ae-b4b73c69c322-host-run-netns\") pod \"multus-v9gzn\" (UID: \"87fbd26a-4a22-4878-91ae-b4b73c69c322\") " pod="openshift-multus/multus-v9gzn" Apr 17 21:36:45.832157 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.831869 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/60e0c58d-3db8-4433-a617-00082bd25488-run-openvswitch\") pod \"ovnkube-node-kwgmn\" (UID: \"60e0c58d-3db8-4433-a617-00082bd25488\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwgmn" Apr 17 21:36:45.832157 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.831894 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/87fbd26a-4a22-4878-91ae-b4b73c69c322-host-var-lib-cni-multus\") pod \"multus-v9gzn\" (UID: \"87fbd26a-4a22-4878-91ae-b4b73c69c322\") " pod="openshift-multus/multus-v9gzn" Apr 17 21:36:45.832157 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.831930 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/60e0c58d-3db8-4433-a617-00082bd25488-run-openvswitch\") pod \"ovnkube-node-kwgmn\" (UID: \"60e0c58d-3db8-4433-a617-00082bd25488\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwgmn" Apr 17 21:36:45.833011 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.831935 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/60e0c58d-3db8-4433-a617-00082bd25488-run-ovn\") pod \"ovnkube-node-kwgmn\" (UID: \"60e0c58d-3db8-4433-a617-00082bd25488\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwgmn" Apr 17 21:36:45.833011 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.831968 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/60e0c58d-3db8-4433-a617-00082bd25488-run-ovn\") pod \"ovnkube-node-kwgmn\" (UID: \"60e0c58d-3db8-4433-a617-00082bd25488\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwgmn" Apr 17 21:36:45.833011 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.831970 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/87fbd26a-4a22-4878-91ae-b4b73c69c322-host-var-lib-cni-multus\") pod \"multus-v9gzn\" (UID: \"87fbd26a-4a22-4878-91ae-b4b73c69c322\") " pod="openshift-multus/multus-v9gzn" Apr 17 21:36:45.833011 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.831970 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/ab744ec2-8a2e-4824-b6e1-3ec78e188e1e-agent-certs\") pod \"konnectivity-agent-bmcmp\" (UID: \"ab744ec2-8a2e-4824-b6e1-3ec78e188e1e\") " pod="kube-system/konnectivity-agent-bmcmp" Apr 17 21:36:45.833011 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.832006 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/1ced47e5-b9ea-4efa-8587-2c824560fd6c-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-4wvmp\" (UID: \"1ced47e5-b9ea-4efa-8587-2c824560fd6c\") " pod="openshift-multus/multus-additional-cni-plugins-4wvmp" Apr 17 21:36:45.833011 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.832031 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/87fbd26a-4a22-4878-91ae-b4b73c69c322-cni-binary-copy\") pod \"multus-v9gzn\" (UID: \"87fbd26a-4a22-4878-91ae-b4b73c69c322\") " pod="openshift-multus/multus-v9gzn" Apr 17 21:36:45.833011 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.832033 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1ced47e5-b9ea-4efa-8587-2c824560fd6c-cni-binary-copy\") pod \"multus-additional-cni-plugins-4wvmp\" (UID: \"1ced47e5-b9ea-4efa-8587-2c824560fd6c\") " pod="openshift-multus/multus-additional-cni-plugins-4wvmp" Apr 17 21:36:45.833011 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.832055 2564 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/60e0c58d-3db8-4433-a617-00082bd25488-host-cni-netd\") pod \"ovnkube-node-kwgmn\" (UID: \"60e0c58d-3db8-4433-a617-00082bd25488\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwgmn" Apr 17 21:36:45.833011 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.832097 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1ced47e5-b9ea-4efa-8587-2c824560fd6c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4wvmp\" (UID: \"1ced47e5-b9ea-4efa-8587-2c824560fd6c\") " pod="openshift-multus/multus-additional-cni-plugins-4wvmp" Apr 17 21:36:45.833011 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.832129 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/60e0c58d-3db8-4433-a617-00082bd25488-systemd-units\") pod \"ovnkube-node-kwgmn\" (UID: \"60e0c58d-3db8-4433-a617-00082bd25488\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwgmn" Apr 17 21:36:45.833011 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.832159 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/ab744ec2-8a2e-4824-b6e1-3ec78e188e1e-konnectivity-ca\") pod \"konnectivity-agent-bmcmp\" (UID: \"ab744ec2-8a2e-4824-b6e1-3ec78e188e1e\") " pod="kube-system/konnectivity-agent-bmcmp" Apr 17 21:36:45.833011 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.832183 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/87fbd26a-4a22-4878-91ae-b4b73c69c322-cnibin\") pod \"multus-v9gzn\" (UID: \"87fbd26a-4a22-4878-91ae-b4b73c69c322\") " pod="openshift-multus/multus-v9gzn" Apr 17 21:36:45.833011 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.832207 2564 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/87fbd26a-4a22-4878-91ae-b4b73c69c322-hostroot\") pod \"multus-v9gzn\" (UID: \"87fbd26a-4a22-4878-91ae-b4b73c69c322\") " pod="openshift-multus/multus-v9gzn" Apr 17 21:36:45.833011 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.832232 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/87fbd26a-4a22-4878-91ae-b4b73c69c322-host-var-lib-kubelet\") pod \"multus-v9gzn\" (UID: \"87fbd26a-4a22-4878-91ae-b4b73c69c322\") " pod="openshift-multus/multus-v9gzn" Apr 17 21:36:45.833011 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.832261 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/60e0c58d-3db8-4433-a617-00082bd25488-host-cni-bin\") pod \"ovnkube-node-kwgmn\" (UID: \"60e0c58d-3db8-4433-a617-00082bd25488\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwgmn" Apr 17 21:36:45.833011 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.832289 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mzrnh\" (UniqueName: \"kubernetes.io/projected/1ced47e5-b9ea-4efa-8587-2c824560fd6c-kube-api-access-mzrnh\") pod \"multus-additional-cni-plugins-4wvmp\" (UID: \"1ced47e5-b9ea-4efa-8587-2c824560fd6c\") " pod="openshift-multus/multus-additional-cni-plugins-4wvmp" Apr 17 21:36:45.833011 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.832314 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/87fbd26a-4a22-4878-91ae-b4b73c69c322-os-release\") pod \"multus-v9gzn\" (UID: \"87fbd26a-4a22-4878-91ae-b4b73c69c322\") " pod="openshift-multus/multus-v9gzn" Apr 17 21:36:45.833773 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.832363 2564 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fb2a6908-4966-4b74-86c6-f31cd952cff2-host-slash\") pod \"iptables-alerter-5dwgb\" (UID: \"fb2a6908-4966-4b74-86c6-f31cd952cff2\") " pod="openshift-network-operator/iptables-alerter-5dwgb" Apr 17 21:36:45.833773 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.832387 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/87fbd26a-4a22-4878-91ae-b4b73c69c322-multus-daemon-config\") pod \"multus-v9gzn\" (UID: \"87fbd26a-4a22-4878-91ae-b4b73c69c322\") " pod="openshift-multus/multus-v9gzn" Apr 17 21:36:45.833773 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.832389 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5m7bz\" (UniqueName: \"kubernetes.io/projected/fb2a6908-4966-4b74-86c6-f31cd952cff2-kube-api-access-5m7bz\") pod \"iptables-alerter-5dwgb\" (UID: \"fb2a6908-4966-4b74-86c6-f31cd952cff2\") " pod="openshift-network-operator/iptables-alerter-5dwgb" Apr 17 21:36:45.833773 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.832441 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/60e0c58d-3db8-4433-a617-00082bd25488-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kwgmn\" (UID: \"60e0c58d-3db8-4433-a617-00082bd25488\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwgmn" Apr 17 21:36:45.833773 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.832469 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/60e0c58d-3db8-4433-a617-00082bd25488-ovnkube-config\") pod \"ovnkube-node-kwgmn\" (UID: \"60e0c58d-3db8-4433-a617-00082bd25488\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwgmn" Apr 17 21:36:45.833773 ip-10-0-132-27 
kubenswrapper[2564]: I0417 21:36:45.832496 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/87fbd26a-4a22-4878-91ae-b4b73c69c322-multus-conf-dir\") pod \"multus-v9gzn\" (UID: \"87fbd26a-4a22-4878-91ae-b4b73c69c322\") " pod="openshift-multus/multus-v9gzn" Apr 17 21:36:45.833773 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.832524 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1ced47e5-b9ea-4efa-8587-2c824560fd6c-system-cni-dir\") pod \"multus-additional-cni-plugins-4wvmp\" (UID: \"1ced47e5-b9ea-4efa-8587-2c824560fd6c\") " pod="openshift-multus/multus-additional-cni-plugins-4wvmp" Apr 17 21:36:45.833773 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.832561 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/1ced47e5-b9ea-4efa-8587-2c824560fd6c-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-4wvmp\" (UID: \"1ced47e5-b9ea-4efa-8587-2c824560fd6c\") " pod="openshift-multus/multus-additional-cni-plugins-4wvmp" Apr 17 21:36:45.833773 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.832572 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/87fbd26a-4a22-4878-91ae-b4b73c69c322-host-run-multus-certs\") pod \"multus-v9gzn\" (UID: \"87fbd26a-4a22-4878-91ae-b4b73c69c322\") " pod="openshift-multus/multus-v9gzn" Apr 17 21:36:45.833773 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.832584 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/87fbd26a-4a22-4878-91ae-b4b73c69c322-cni-binary-copy\") pod \"multus-v9gzn\" (UID: \"87fbd26a-4a22-4878-91ae-b4b73c69c322\") " pod="openshift-multus/multus-v9gzn" 
Apr 17 21:36:45.833773 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.832617 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/60e0c58d-3db8-4433-a617-00082bd25488-run-systemd\") pod \"ovnkube-node-kwgmn\" (UID: \"60e0c58d-3db8-4433-a617-00082bd25488\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwgmn" Apr 17 21:36:45.833773 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.832643 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/60e0c58d-3db8-4433-a617-00082bd25488-systemd-units\") pod \"ovnkube-node-kwgmn\" (UID: \"60e0c58d-3db8-4433-a617-00082bd25488\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwgmn" Apr 17 21:36:45.833773 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.832660 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/60e0c58d-3db8-4433-a617-00082bd25488-host-cni-netd\") pod \"ovnkube-node-kwgmn\" (UID: \"60e0c58d-3db8-4433-a617-00082bd25488\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwgmn" Apr 17 21:36:45.833773 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.832665 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/1ced47e5-b9ea-4efa-8587-2c824560fd6c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4wvmp\" (UID: \"1ced47e5-b9ea-4efa-8587-2c824560fd6c\") " pod="openshift-multus/multus-additional-cni-plugins-4wvmp" Apr 17 21:36:45.833773 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.832646 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vrx8w\" (UniqueName: \"kubernetes.io/projected/60e0c58d-3db8-4433-a617-00082bd25488-kube-api-access-vrx8w\") pod \"ovnkube-node-kwgmn\" (UID: \"60e0c58d-3db8-4433-a617-00082bd25488\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-kwgmn" Apr 17 21:36:45.833773 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.832687 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/60e0c58d-3db8-4433-a617-00082bd25488-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kwgmn\" (UID: \"60e0c58d-3db8-4433-a617-00082bd25488\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwgmn" Apr 17 21:36:45.833773 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.832707 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/87fbd26a-4a22-4878-91ae-b4b73c69c322-multus-conf-dir\") pod \"multus-v9gzn\" (UID: \"87fbd26a-4a22-4878-91ae-b4b73c69c322\") " pod="openshift-multus/multus-v9gzn" Apr 17 21:36:45.834289 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.832712 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1ced47e5-b9ea-4efa-8587-2c824560fd6c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4wvmp\" (UID: \"1ced47e5-b9ea-4efa-8587-2c824560fd6c\") " pod="openshift-multus/multus-additional-cni-plugins-4wvmp" Apr 17 21:36:45.834289 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.832726 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/87fbd26a-4a22-4878-91ae-b4b73c69c322-cnibin\") pod \"multus-v9gzn\" (UID: \"87fbd26a-4a22-4878-91ae-b4b73c69c322\") " pod="openshift-multus/multus-v9gzn" Apr 17 21:36:45.834289 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.832733 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1ced47e5-b9ea-4efa-8587-2c824560fd6c-system-cni-dir\") pod \"multus-additional-cni-plugins-4wvmp\" (UID: 
\"1ced47e5-b9ea-4efa-8587-2c824560fd6c\") " pod="openshift-multus/multus-additional-cni-plugins-4wvmp" Apr 17 21:36:45.834289 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.832746 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/ab744ec2-8a2e-4824-b6e1-3ec78e188e1e-konnectivity-ca\") pod \"konnectivity-agent-bmcmp\" (UID: \"ab744ec2-8a2e-4824-b6e1-3ec78e188e1e\") " pod="kube-system/konnectivity-agent-bmcmp" Apr 17 21:36:45.834289 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.832757 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/60e0c58d-3db8-4433-a617-00082bd25488-run-systemd\") pod \"ovnkube-node-kwgmn\" (UID: \"60e0c58d-3db8-4433-a617-00082bd25488\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwgmn" Apr 17 21:36:45.834289 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.832767 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/87fbd26a-4a22-4878-91ae-b4b73c69c322-hostroot\") pod \"multus-v9gzn\" (UID: \"87fbd26a-4a22-4878-91ae-b4b73c69c322\") " pod="openshift-multus/multus-v9gzn" Apr 17 21:36:45.834289 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.832788 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/87fbd26a-4a22-4878-91ae-b4b73c69c322-multus-cni-dir\") pod \"multus-v9gzn\" (UID: \"87fbd26a-4a22-4878-91ae-b4b73c69c322\") " pod="openshift-multus/multus-v9gzn" Apr 17 21:36:45.834289 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.832818 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fb2a6908-4966-4b74-86c6-f31cd952cff2-host-slash\") pod \"iptables-alerter-5dwgb\" (UID: \"fb2a6908-4966-4b74-86c6-f31cd952cff2\") " 
pod="openshift-network-operator/iptables-alerter-5dwgb" Apr 17 21:36:45.834289 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.832837 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/fb2a6908-4966-4b74-86c6-f31cd952cff2-iptables-alerter-script\") pod \"iptables-alerter-5dwgb\" (UID: \"fb2a6908-4966-4b74-86c6-f31cd952cff2\") " pod="openshift-network-operator/iptables-alerter-5dwgb" Apr 17 21:36:45.834289 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.832849 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1ced47e5-b9ea-4efa-8587-2c824560fd6c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4wvmp\" (UID: \"1ced47e5-b9ea-4efa-8587-2c824560fd6c\") " pod="openshift-multus/multus-additional-cni-plugins-4wvmp" Apr 17 21:36:45.834289 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.832857 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/87fbd26a-4a22-4878-91ae-b4b73c69c322-host-var-lib-kubelet\") pod \"multus-v9gzn\" (UID: \"87fbd26a-4a22-4878-91ae-b4b73c69c322\") " pod="openshift-multus/multus-v9gzn" Apr 17 21:36:45.834289 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.832864 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/60e0c58d-3db8-4433-a617-00082bd25488-ovnkube-script-lib\") pod \"ovnkube-node-kwgmn\" (UID: \"60e0c58d-3db8-4433-a617-00082bd25488\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwgmn" Apr 17 21:36:45.834289 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.832914 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/87fbd26a-4a22-4878-91ae-b4b73c69c322-multus-cni-dir\") pod \"multus-v9gzn\" (UID: 
\"87fbd26a-4a22-4878-91ae-b4b73c69c322\") " pod="openshift-multus/multus-v9gzn" Apr 17 21:36:45.834289 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.832953 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/60e0c58d-3db8-4433-a617-00082bd25488-host-cni-bin\") pod \"ovnkube-node-kwgmn\" (UID: \"60e0c58d-3db8-4433-a617-00082bd25488\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwgmn" Apr 17 21:36:45.834289 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.833005 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/87fbd26a-4a22-4878-91ae-b4b73c69c322-os-release\") pod \"multus-v9gzn\" (UID: \"87fbd26a-4a22-4878-91ae-b4b73c69c322\") " pod="openshift-multus/multus-v9gzn" Apr 17 21:36:45.834289 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.833018 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x8q6z\" (UniqueName: \"kubernetes.io/projected/87fbd26a-4a22-4878-91ae-b4b73c69c322-kube-api-access-x8q6z\") pod \"multus-v9gzn\" (UID: \"87fbd26a-4a22-4878-91ae-b4b73c69c322\") " pod="openshift-multus/multus-v9gzn" Apr 17 21:36:45.834289 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.833055 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cnbk6\" (UniqueName: \"kubernetes.io/projected/803936af-5a7f-4be9-bc47-8ca0f94064a9-kube-api-access-cnbk6\") pod \"network-check-target-ddrrn\" (UID: \"803936af-5a7f-4be9-bc47-8ca0f94064a9\") " pod="openshift-network-diagnostics/network-check-target-ddrrn" Apr 17 21:36:45.834289 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.833083 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/60e0c58d-3db8-4433-a617-00082bd25488-env-overrides\") pod \"ovnkube-node-kwgmn\" (UID: 
\"60e0c58d-3db8-4433-a617-00082bd25488\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwgmn" Apr 17 21:36:45.834857 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.833147 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/60e0c58d-3db8-4433-a617-00082bd25488-ovnkube-config\") pod \"ovnkube-node-kwgmn\" (UID: \"60e0c58d-3db8-4433-a617-00082bd25488\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwgmn" Apr 17 21:36:45.834857 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.833161 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/60e0c58d-3db8-4433-a617-00082bd25488-host-slash\") pod \"ovnkube-node-kwgmn\" (UID: \"60e0c58d-3db8-4433-a617-00082bd25488\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwgmn" Apr 17 21:36:45.834857 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.833185 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/60e0c58d-3db8-4433-a617-00082bd25488-node-log\") pod \"ovnkube-node-kwgmn\" (UID: \"60e0c58d-3db8-4433-a617-00082bd25488\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwgmn" Apr 17 21:36:45.834857 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.833196 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/87fbd26a-4a22-4878-91ae-b4b73c69c322-host-run-multus-certs\") pod \"multus-v9gzn\" (UID: \"87fbd26a-4a22-4878-91ae-b4b73c69c322\") " pod="openshift-multus/multus-v9gzn" Apr 17 21:36:45.834857 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.833246 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/60e0c58d-3db8-4433-a617-00082bd25488-node-log\") pod \"ovnkube-node-kwgmn\" (UID: \"60e0c58d-3db8-4433-a617-00082bd25488\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-kwgmn" Apr 17 21:36:45.834857 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.833257 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/60e0c58d-3db8-4433-a617-00082bd25488-host-slash\") pod \"ovnkube-node-kwgmn\" (UID: \"60e0c58d-3db8-4433-a617-00082bd25488\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwgmn" Apr 17 21:36:45.834857 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.833368 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/fb2a6908-4966-4b74-86c6-f31cd952cff2-iptables-alerter-script\") pod \"iptables-alerter-5dwgb\" (UID: \"fb2a6908-4966-4b74-86c6-f31cd952cff2\") " pod="openshift-network-operator/iptables-alerter-5dwgb" Apr 17 21:36:45.834857 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.833375 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/60e0c58d-3db8-4433-a617-00082bd25488-ovnkube-script-lib\") pod \"ovnkube-node-kwgmn\" (UID: \"60e0c58d-3db8-4433-a617-00082bd25488\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwgmn" Apr 17 21:36:45.834857 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.833616 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/60e0c58d-3db8-4433-a617-00082bd25488-env-overrides\") pod \"ovnkube-node-kwgmn\" (UID: \"60e0c58d-3db8-4433-a617-00082bd25488\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwgmn" Apr 17 21:36:45.834857 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.834717 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/ab744ec2-8a2e-4824-b6e1-3ec78e188e1e-agent-certs\") pod \"konnectivity-agent-bmcmp\" (UID: \"ab744ec2-8a2e-4824-b6e1-3ec78e188e1e\") " 
pod="kube-system/konnectivity-agent-bmcmp" Apr 17 21:36:45.835175 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.834899 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/60e0c58d-3db8-4433-a617-00082bd25488-ovn-node-metrics-cert\") pod \"ovnkube-node-kwgmn\" (UID: \"60e0c58d-3db8-4433-a617-00082bd25488\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwgmn" Apr 17 21:36:45.838393 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:36:45.838371 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 21:36:45.838393 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:36:45.838396 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 21:36:45.838881 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:36:45.838862 2564 projected.go:194] Error preparing data for projected volume kube-api-access-cnbk6 for pod openshift-network-diagnostics/network-check-target-ddrrn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 21:36:45.838989 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:36:45.838950 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/803936af-5a7f-4be9-bc47-8ca0f94064a9-kube-api-access-cnbk6 podName:803936af-5a7f-4be9-bc47-8ca0f94064a9 nodeName:}" failed. No retries permitted until 2026-04-17 21:36:46.338920063 +0000 UTC m=+3.101679518 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cnbk6" (UniqueName: "kubernetes.io/projected/803936af-5a7f-4be9-bc47-8ca0f94064a9-kube-api-access-cnbk6") pod "network-check-target-ddrrn" (UID: "803936af-5a7f-4be9-bc47-8ca0f94064a9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 21:36:45.840958 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.840918 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrx8w\" (UniqueName: \"kubernetes.io/projected/60e0c58d-3db8-4433-a617-00082bd25488-kube-api-access-vrx8w\") pod \"ovnkube-node-kwgmn\" (UID: \"60e0c58d-3db8-4433-a617-00082bd25488\") " pod="openshift-ovn-kubernetes/ovnkube-node-kwgmn" Apr 17 21:36:45.841096 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.841075 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8q6z\" (UniqueName: \"kubernetes.io/projected/87fbd26a-4a22-4878-91ae-b4b73c69c322-kube-api-access-x8q6z\") pod \"multus-v9gzn\" (UID: \"87fbd26a-4a22-4878-91ae-b4b73c69c322\") " pod="openshift-multus/multus-v9gzn" Apr 17 21:36:45.841578 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.841550 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzrnh\" (UniqueName: \"kubernetes.io/projected/1ced47e5-b9ea-4efa-8587-2c824560fd6c-kube-api-access-mzrnh\") pod \"multus-additional-cni-plugins-4wvmp\" (UID: \"1ced47e5-b9ea-4efa-8587-2c824560fd6c\") " pod="openshift-multus/multus-additional-cni-plugins-4wvmp" Apr 17 21:36:45.841899 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.841874 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5m7bz\" (UniqueName: \"kubernetes.io/projected/fb2a6908-4966-4b74-86c6-f31cd952cff2-kube-api-access-5m7bz\") pod \"iptables-alerter-5dwgb\" (UID: \"fb2a6908-4966-4b74-86c6-f31cd952cff2\") " 
pod="openshift-network-operator/iptables-alerter-5dwgb" Apr 17 21:36:45.924190 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.924126 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g55xk" Apr 17 21:36:45.931743 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.931723 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-l7tk5" Apr 17 21:36:45.940433 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.940414 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-rghrv" Apr 17 21:36:45.944969 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.944952 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-nlczj" Apr 17 21:36:45.951468 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.951450 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-bmcmp" Apr 17 21:36:45.957976 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.957958 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-5dwgb" Apr 17 21:36:45.964517 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.964499 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-v9gzn" Apr 17 21:36:45.970981 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.970965 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-4wvmp" Apr 17 21:36:45.975514 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:45.975496 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kwgmn" Apr 17 21:36:46.236622 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:46.236546 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ee2090c8-65ec-46e0-9614-f6f0ddae32d7-metrics-certs\") pod \"network-metrics-daemon-hqbt5\" (UID: \"ee2090c8-65ec-46e0-9614-f6f0ddae32d7\") " pod="openshift-multus/network-metrics-daemon-hqbt5" Apr 17 21:36:46.236759 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:36:46.236668 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 21:36:46.236759 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:36:46.236717 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee2090c8-65ec-46e0-9614-f6f0ddae32d7-metrics-certs podName:ee2090c8-65ec-46e0-9614-f6f0ddae32d7 nodeName:}" failed. No retries permitted until 2026-04-17 21:36:47.236703713 +0000 UTC m=+3.999463169 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ee2090c8-65ec-46e0-9614-f6f0ddae32d7-metrics-certs") pod "network-metrics-daemon-hqbt5" (UID: "ee2090c8-65ec-46e0-9614-f6f0ddae32d7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 21:36:46.294799 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:46.294766 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod960eb4ac_0adf_443b_8b6e_b34cc770fb3a.slice/crio-f7d65cc7859297c698e2942cd25020f58ac702e64f1b2511d85723667b18ad46 WatchSource:0}: Error finding container f7d65cc7859297c698e2942cd25020f58ac702e64f1b2511d85723667b18ad46: Status 404 returned error can't find the container with id f7d65cc7859297c698e2942cd25020f58ac702e64f1b2511d85723667b18ad46 Apr 17 21:36:46.301442 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:46.301412 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb2a6908_4966_4b74_86c6_f31cd952cff2.slice/crio-c8caa4d2492f2e3f477afea5543e31c04463d70023c6c7ef5670241fc26581c4 WatchSource:0}: Error finding container c8caa4d2492f2e3f477afea5543e31c04463d70023c6c7ef5670241fc26581c4: Status 404 returned error can't find the container with id c8caa4d2492f2e3f477afea5543e31c04463d70023c6c7ef5670241fc26581c4 Apr 17 21:36:46.302420 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:46.302393 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87fbd26a_4a22_4878_91ae_b4b73c69c322.slice/crio-540a72d4c59c7240486bc083b6bcf14f5283893d5d1dd01cffa001da9c01c9aa WatchSource:0}: Error finding container 540a72d4c59c7240486bc083b6bcf14f5283893d5d1dd01cffa001da9c01c9aa: Status 404 returned error can't find the container with id 540a72d4c59c7240486bc083b6bcf14f5283893d5d1dd01cffa001da9c01c9aa Apr 17 21:36:46.303140 
ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:46.303094 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod116be4c2_a389_4822_bd06_12d2e0fcf15a.slice/crio-f8350f419d9cd3ea1858cdbb9387ad61e6dcc159e8b48337be9e0831665608db WatchSource:0}: Error finding container f8350f419d9cd3ea1858cdbb9387ad61e6dcc159e8b48337be9e0831665608db: Status 404 returned error can't find the container with id f8350f419d9cd3ea1858cdbb9387ad61e6dcc159e8b48337be9e0831665608db Apr 17 21:36:46.303878 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:46.303859 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ced47e5_b9ea_4efa_8587_2c824560fd6c.slice/crio-4460fdf8033659a4c271511ff5e74ce49d3a98b422cadcfe16442aa786b2e080 WatchSource:0}: Error finding container 4460fdf8033659a4c271511ff5e74ce49d3a98b422cadcfe16442aa786b2e080: Status 404 returned error can't find the container with id 4460fdf8033659a4c271511ff5e74ce49d3a98b422cadcfe16442aa786b2e080 Apr 17 21:36:46.304529 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:46.304431 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae12babd_ac52_4639_8c7c_50eb1dd9fb6b.slice/crio-5cfaf2a6bbee3499519eb05f30c371f38b20359822665ec7d2908c4c4091a718 WatchSource:0}: Error finding container 5cfaf2a6bbee3499519eb05f30c371f38b20359822665ec7d2908c4c4091a718: Status 404 returned error can't find the container with id 5cfaf2a6bbee3499519eb05f30c371f38b20359822665ec7d2908c4c4091a718 Apr 17 21:36:46.305782 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:46.305616 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1458f6c_b1bc_4b38_8141_1f70918a345d.slice/crio-a4ee34356cc74717d46003dc43560303f77becc08bf9e0f2b8c4b0b1016058cd WatchSource:0}: Error 
finding container a4ee34356cc74717d46003dc43560303f77becc08bf9e0f2b8c4b0b1016058cd: Status 404 returned error can't find the container with id a4ee34356cc74717d46003dc43560303f77becc08bf9e0f2b8c4b0b1016058cd Apr 17 21:36:46.306895 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:36:46.306812 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60e0c58d_3db8_4433_a617_00082bd25488.slice/crio-adcafabd939d9681eb72259bc2bfc2a131e5f4bc548935de880c6dfb4e825658 WatchSource:0}: Error finding container adcafabd939d9681eb72259bc2bfc2a131e5f4bc548935de880c6dfb4e825658: Status 404 returned error can't find the container with id adcafabd939d9681eb72259bc2bfc2a131e5f4bc548935de880c6dfb4e825658 Apr 17 21:36:46.439139 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:46.438956 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cnbk6\" (UniqueName: \"kubernetes.io/projected/803936af-5a7f-4be9-bc47-8ca0f94064a9-kube-api-access-cnbk6\") pod \"network-check-target-ddrrn\" (UID: \"803936af-5a7f-4be9-bc47-8ca0f94064a9\") " pod="openshift-network-diagnostics/network-check-target-ddrrn" Apr 17 21:36:46.439311 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:36:46.439113 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 21:36:46.439311 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:36:46.439204 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 21:36:46.439311 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:36:46.439217 2564 projected.go:194] Error preparing data for projected volume kube-api-access-cnbk6 for pod openshift-network-diagnostics/network-check-target-ddrrn: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 21:36:46.439311 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:36:46.439263 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/803936af-5a7f-4be9-bc47-8ca0f94064a9-kube-api-access-cnbk6 podName:803936af-5a7f-4be9-bc47-8ca0f94064a9 nodeName:}" failed. No retries permitted until 2026-04-17 21:36:47.439249089 +0000 UTC m=+4.202008550 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cnbk6" (UniqueName: "kubernetes.io/projected/803936af-5a7f-4be9-bc47-8ca0f94064a9-kube-api-access-cnbk6") pod "network-check-target-ddrrn" (UID: "803936af-5a7f-4be9-bc47-8ca0f94064a9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 21:36:46.651050 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:46.651013 2564 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 21:31:44 +0000 UTC" deadline="2027-11-15 22:37:57.759784194 +0000 UTC" Apr 17 21:36:46.651050 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:46.651043 2564 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13849h1m11.108743241s" Apr 17 21:36:46.695610 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:46.695553 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-v9gzn" event={"ID":"87fbd26a-4a22-4878-91ae-b4b73c69c322","Type":"ContainerStarted","Data":"540a72d4c59c7240486bc083b6bcf14f5283893d5d1dd01cffa001da9c01c9aa"} Apr 17 21:36:46.696630 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:46.696576 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-5dwgb" 
event={"ID":"fb2a6908-4966-4b74-86c6-f31cd952cff2","Type":"ContainerStarted","Data":"c8caa4d2492f2e3f477afea5543e31c04463d70023c6c7ef5670241fc26581c4"} Apr 17 21:36:46.698263 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:46.698239 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-27.ec2.internal" event={"ID":"2064d33e84866d45f0ed11cc547caffc","Type":"ContainerStarted","Data":"b384e3fb36eb02333e601ed464eb0a87ef6ca04cdf8071c3e41fc7d8fd9a173f"} Apr 17 21:36:46.699640 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:46.699607 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-bmcmp" event={"ID":"ab744ec2-8a2e-4824-b6e1-3ec78e188e1e","Type":"ContainerStarted","Data":"891bacd1ffb9f2fbf16e047737efc391142b1706ef33039d8b8e7d14e3ab8fef"} Apr 17 21:36:46.700777 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:46.700745 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-nlczj" event={"ID":"960eb4ac-0adf-443b-8b6e-b34cc770fb3a","Type":"ContainerStarted","Data":"f7d65cc7859297c698e2942cd25020f58ac702e64f1b2511d85723667b18ad46"} Apr 17 21:36:46.701900 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:46.701875 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwgmn" event={"ID":"60e0c58d-3db8-4433-a617-00082bd25488","Type":"ContainerStarted","Data":"adcafabd939d9681eb72259bc2bfc2a131e5f4bc548935de880c6dfb4e825658"} Apr 17 21:36:46.703161 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:46.703134 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-rghrv" event={"ID":"ae12babd-ac52-4639-8c7c-50eb1dd9fb6b","Type":"ContainerStarted","Data":"5cfaf2a6bbee3499519eb05f30c371f38b20359822665ec7d2908c4c4091a718"} Apr 17 21:36:46.704527 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:46.704252 2564 reflector.go:430] "Caches populated" type="*v1.CSIDriver" 
reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 21:36:46.706387 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:46.705857 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g55xk" event={"ID":"f1458f6c-b1bc-4b38-8141-1f70918a345d","Type":"ContainerStarted","Data":"a4ee34356cc74717d46003dc43560303f77becc08bf9e0f2b8c4b0b1016058cd"} Apr 17 21:36:46.709015 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:46.708989 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-l7tk5" event={"ID":"116be4c2-a389-4822-bd06-12d2e0fcf15a","Type":"ContainerStarted","Data":"f8350f419d9cd3ea1858cdbb9387ad61e6dcc159e8b48337be9e0831665608db"} Apr 17 21:36:46.709982 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:46.709938 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-27.ec2.internal" podStartSLOduration=2.709922079 podStartE2EDuration="2.709922079s" podCreationTimestamp="2026-04-17 21:36:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 21:36:46.709409714 +0000 UTC m=+3.472169194" watchObservedRunningTime="2026-04-17 21:36:46.709922079 +0000 UTC m=+3.472681561" Apr 17 21:36:46.711159 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:46.711134 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4wvmp" event={"ID":"1ced47e5-b9ea-4efa-8587-2c824560fd6c","Type":"ContainerStarted","Data":"4460fdf8033659a4c271511ff5e74ce49d3a98b422cadcfe16442aa786b2e080"} Apr 17 21:36:47.247193 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:47.246585 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ee2090c8-65ec-46e0-9614-f6f0ddae32d7-metrics-certs\") pod \"network-metrics-daemon-hqbt5\" (UID: 
\"ee2090c8-65ec-46e0-9614-f6f0ddae32d7\") " pod="openshift-multus/network-metrics-daemon-hqbt5" Apr 17 21:36:47.247193 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:36:47.246724 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 21:36:47.247193 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:36:47.246802 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee2090c8-65ec-46e0-9614-f6f0ddae32d7-metrics-certs podName:ee2090c8-65ec-46e0-9614-f6f0ddae32d7 nodeName:}" failed. No retries permitted until 2026-04-17 21:36:49.246778294 +0000 UTC m=+6.009537755 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ee2090c8-65ec-46e0-9614-f6f0ddae32d7-metrics-certs") pod "network-metrics-daemon-hqbt5" (UID: "ee2090c8-65ec-46e0-9614-f6f0ddae32d7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 21:36:47.448488 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:47.448448 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cnbk6\" (UniqueName: \"kubernetes.io/projected/803936af-5a7f-4be9-bc47-8ca0f94064a9-kube-api-access-cnbk6\") pod \"network-check-target-ddrrn\" (UID: \"803936af-5a7f-4be9-bc47-8ca0f94064a9\") " pod="openshift-network-diagnostics/network-check-target-ddrrn" Apr 17 21:36:47.448711 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:36:47.448692 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 21:36:47.448783 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:36:47.448719 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 21:36:47.448783 ip-10-0-132-27 
kubenswrapper[2564]: E0417 21:36:47.448734 2564 projected.go:194] Error preparing data for projected volume kube-api-access-cnbk6 for pod openshift-network-diagnostics/network-check-target-ddrrn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 21:36:47.448882 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:36:47.448794 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/803936af-5a7f-4be9-bc47-8ca0f94064a9-kube-api-access-cnbk6 podName:803936af-5a7f-4be9-bc47-8ca0f94064a9 nodeName:}" failed. No retries permitted until 2026-04-17 21:36:49.448774834 +0000 UTC m=+6.211534294 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cnbk6" (UniqueName: "kubernetes.io/projected/803936af-5a7f-4be9-bc47-8ca0f94064a9-kube-api-access-cnbk6") pod "network-check-target-ddrrn" (UID: "803936af-5a7f-4be9-bc47-8ca0f94064a9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 21:36:47.692068 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:47.692039 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ddrrn" Apr 17 21:36:47.692500 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:36:47.692157 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ddrrn" podUID="803936af-5a7f-4be9-bc47-8ca0f94064a9" Apr 17 21:36:47.692609 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:47.692578 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-hqbt5" Apr 17 21:36:47.692725 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:36:47.692704 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hqbt5" podUID="ee2090c8-65ec-46e0-9614-f6f0ddae32d7" Apr 17 21:36:47.722217 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:47.722185 2564 generic.go:358] "Generic (PLEG): container finished" podID="e6054250cf2e363eb96823aa69ef44c1" containerID="2da87258e7d9e9c73cdd41c05a5f023ca50ff450455730b295cd408e7af865f1" exitCode=0 Apr 17 21:36:47.723081 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:47.723057 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-27.ec2.internal" event={"ID":"e6054250cf2e363eb96823aa69ef44c1","Type":"ContainerDied","Data":"2da87258e7d9e9c73cdd41c05a5f023ca50ff450455730b295cd408e7af865f1"} Apr 17 21:36:48.739832 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:48.739788 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-27.ec2.internal" event={"ID":"e6054250cf2e363eb96823aa69ef44c1","Type":"ContainerStarted","Data":"4bf46f15322eca1d65fb3a904f40bc92c2300441fad1cddb92fa7f37594960d2"} Apr 17 21:36:49.263664 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:49.263094 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ee2090c8-65ec-46e0-9614-f6f0ddae32d7-metrics-certs\") pod \"network-metrics-daemon-hqbt5\" (UID: \"ee2090c8-65ec-46e0-9614-f6f0ddae32d7\") " pod="openshift-multus/network-metrics-daemon-hqbt5" Apr 17 21:36:49.263664 
ip-10-0-132-27 kubenswrapper[2564]: E0417 21:36:49.263229 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 21:36:49.263664 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:36:49.263291 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee2090c8-65ec-46e0-9614-f6f0ddae32d7-metrics-certs podName:ee2090c8-65ec-46e0-9614-f6f0ddae32d7 nodeName:}" failed. No retries permitted until 2026-04-17 21:36:53.263273125 +0000 UTC m=+10.026032586 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ee2090c8-65ec-46e0-9614-f6f0ddae32d7-metrics-certs") pod "network-metrics-daemon-hqbt5" (UID: "ee2090c8-65ec-46e0-9614-f6f0ddae32d7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 21:36:49.465450 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:49.465251 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cnbk6\" (UniqueName: \"kubernetes.io/projected/803936af-5a7f-4be9-bc47-8ca0f94064a9-kube-api-access-cnbk6\") pod \"network-check-target-ddrrn\" (UID: \"803936af-5a7f-4be9-bc47-8ca0f94064a9\") " pod="openshift-network-diagnostics/network-check-target-ddrrn" Apr 17 21:36:49.466180 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:36:49.465705 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 21:36:49.466180 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:36:49.465733 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 21:36:49.466180 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:36:49.465749 2564 projected.go:194] Error preparing data for projected volume 
kube-api-access-cnbk6 for pod openshift-network-diagnostics/network-check-target-ddrrn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 21:36:49.466180 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:36:49.465829 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/803936af-5a7f-4be9-bc47-8ca0f94064a9-kube-api-access-cnbk6 podName:803936af-5a7f-4be9-bc47-8ca0f94064a9 nodeName:}" failed. No retries permitted until 2026-04-17 21:36:53.465809651 +0000 UTC m=+10.228569108 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cnbk6" (UniqueName: "kubernetes.io/projected/803936af-5a7f-4be9-bc47-8ca0f94064a9-kube-api-access-cnbk6") pod "network-check-target-ddrrn" (UID: "803936af-5a7f-4be9-bc47-8ca0f94064a9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 21:36:49.690383 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:49.690189 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ddrrn" Apr 17 21:36:49.690383 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:36:49.690305 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ddrrn" podUID="803936af-5a7f-4be9-bc47-8ca0f94064a9" Apr 17 21:36:49.690649 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:49.690565 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-hqbt5" Apr 17 21:36:49.690810 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:36:49.690759 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hqbt5" podUID="ee2090c8-65ec-46e0-9614-f6f0ddae32d7" Apr 17 21:36:51.689650 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:51.689617 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ddrrn" Apr 17 21:36:51.689650 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:51.689653 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hqbt5" Apr 17 21:36:51.690058 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:36:51.689749 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ddrrn" podUID="803936af-5a7f-4be9-bc47-8ca0f94064a9" Apr 17 21:36:51.690058 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:36:51.689880 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hqbt5" podUID="ee2090c8-65ec-46e0-9614-f6f0ddae32d7" Apr 17 21:36:53.298680 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:53.298640 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ee2090c8-65ec-46e0-9614-f6f0ddae32d7-metrics-certs\") pod \"network-metrics-daemon-hqbt5\" (UID: \"ee2090c8-65ec-46e0-9614-f6f0ddae32d7\") " pod="openshift-multus/network-metrics-daemon-hqbt5" Apr 17 21:36:53.299135 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:36:53.298813 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 21:36:53.299135 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:36:53.298896 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee2090c8-65ec-46e0-9614-f6f0ddae32d7-metrics-certs podName:ee2090c8-65ec-46e0-9614-f6f0ddae32d7 nodeName:}" failed. No retries permitted until 2026-04-17 21:37:01.298874787 +0000 UTC m=+18.061634244 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ee2090c8-65ec-46e0-9614-f6f0ddae32d7-metrics-certs") pod "network-metrics-daemon-hqbt5" (UID: "ee2090c8-65ec-46e0-9614-f6f0ddae32d7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 21:36:53.500430 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:53.500346 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cnbk6\" (UniqueName: \"kubernetes.io/projected/803936af-5a7f-4be9-bc47-8ca0f94064a9-kube-api-access-cnbk6\") pod \"network-check-target-ddrrn\" (UID: \"803936af-5a7f-4be9-bc47-8ca0f94064a9\") " pod="openshift-network-diagnostics/network-check-target-ddrrn" Apr 17 21:36:53.500628 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:36:53.500477 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 21:36:53.500628 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:36:53.500496 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 21:36:53.500628 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:36:53.500509 2564 projected.go:194] Error preparing data for projected volume kube-api-access-cnbk6 for pod openshift-network-diagnostics/network-check-target-ddrrn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 21:36:53.500628 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:36:53.500580 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/803936af-5a7f-4be9-bc47-8ca0f94064a9-kube-api-access-cnbk6 podName:803936af-5a7f-4be9-bc47-8ca0f94064a9 nodeName:}" failed. 
No retries permitted until 2026-04-17 21:37:01.50055325 +0000 UTC m=+18.263312710 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cnbk6" (UniqueName: "kubernetes.io/projected/803936af-5a7f-4be9-bc47-8ca0f94064a9-kube-api-access-cnbk6") pod "network-check-target-ddrrn" (UID: "803936af-5a7f-4be9-bc47-8ca0f94064a9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 21:36:53.692119 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:53.692089 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ddrrn" Apr 17 21:36:53.692292 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:36:53.692192 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ddrrn" podUID="803936af-5a7f-4be9-bc47-8ca0f94064a9" Apr 17 21:36:53.692292 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:53.692221 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hqbt5" Apr 17 21:36:53.692405 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:36:53.692317 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hqbt5" podUID="ee2090c8-65ec-46e0-9614-f6f0ddae32d7" Apr 17 21:36:55.689734 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:55.689682 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ddrrn" Apr 17 21:36:55.690285 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:55.689682 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hqbt5" Apr 17 21:36:55.690285 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:36:55.689816 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ddrrn" podUID="803936af-5a7f-4be9-bc47-8ca0f94064a9" Apr 17 21:36:55.690285 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:36:55.689911 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hqbt5" podUID="ee2090c8-65ec-46e0-9614-f6f0ddae32d7" Apr 17 21:36:57.689587 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:57.689553 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ddrrn" Apr 17 21:36:57.690072 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:57.689633 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-hqbt5" Apr 17 21:36:57.690072 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:36:57.689734 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hqbt5" podUID="ee2090c8-65ec-46e0-9614-f6f0ddae32d7" Apr 17 21:36:57.690072 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:36:57.689871 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ddrrn" podUID="803936af-5a7f-4be9-bc47-8ca0f94064a9" Apr 17 21:36:59.689326 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:59.689296 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ddrrn" Apr 17 21:36:59.689738 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:36:59.689344 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hqbt5" Apr 17 21:36:59.689738 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:36:59.689423 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hqbt5" podUID="ee2090c8-65ec-46e0-9614-f6f0ddae32d7" Apr 17 21:36:59.689738 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:36:59.689584 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ddrrn" podUID="803936af-5a7f-4be9-bc47-8ca0f94064a9" Apr 17 21:37:01.358820 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:01.358787 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ee2090c8-65ec-46e0-9614-f6f0ddae32d7-metrics-certs\") pod \"network-metrics-daemon-hqbt5\" (UID: \"ee2090c8-65ec-46e0-9614-f6f0ddae32d7\") " pod="openshift-multus/network-metrics-daemon-hqbt5" Apr 17 21:37:01.359269 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:37:01.358916 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 21:37:01.359269 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:37:01.358992 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee2090c8-65ec-46e0-9614-f6f0ddae32d7-metrics-certs podName:ee2090c8-65ec-46e0-9614-f6f0ddae32d7 nodeName:}" failed. No retries permitted until 2026-04-17 21:37:17.3589714 +0000 UTC m=+34.121730874 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ee2090c8-65ec-46e0-9614-f6f0ddae32d7-metrics-certs") pod "network-metrics-daemon-hqbt5" (UID: "ee2090c8-65ec-46e0-9614-f6f0ddae32d7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 21:37:01.560136 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:01.560102 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cnbk6\" (UniqueName: \"kubernetes.io/projected/803936af-5a7f-4be9-bc47-8ca0f94064a9-kube-api-access-cnbk6\") pod \"network-check-target-ddrrn\" (UID: \"803936af-5a7f-4be9-bc47-8ca0f94064a9\") " pod="openshift-network-diagnostics/network-check-target-ddrrn" Apr 17 21:37:01.560307 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:37:01.560279 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 21:37:01.560307 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:37:01.560303 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 21:37:01.560388 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:37:01.560317 2564 projected.go:194] Error preparing data for projected volume kube-api-access-cnbk6 for pod openshift-network-diagnostics/network-check-target-ddrrn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 21:37:01.560388 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:37:01.560372 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/803936af-5a7f-4be9-bc47-8ca0f94064a9-kube-api-access-cnbk6 podName:803936af-5a7f-4be9-bc47-8ca0f94064a9 nodeName:}" failed. 
No retries permitted until 2026-04-17 21:37:17.560354351 +0000 UTC m=+34.323113828 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cnbk6" (UniqueName: "kubernetes.io/projected/803936af-5a7f-4be9-bc47-8ca0f94064a9-kube-api-access-cnbk6") pod "network-check-target-ddrrn" (UID: "803936af-5a7f-4be9-bc47-8ca0f94064a9") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 21:37:01.689886 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:01.689778 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ddrrn" Apr 17 21:37:01.689886 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:01.689806 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hqbt5" Apr 17 21:37:01.690087 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:37:01.689898 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ddrrn" podUID="803936af-5a7f-4be9-bc47-8ca0f94064a9" Apr 17 21:37:01.690087 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:37:01.689988 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hqbt5" podUID="ee2090c8-65ec-46e0-9614-f6f0ddae32d7" Apr 17 21:37:03.692291 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:03.692126 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ddrrn" Apr 17 21:37:03.692805 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:03.692126 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hqbt5" Apr 17 21:37:03.692805 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:37:03.692371 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ddrrn" podUID="803936af-5a7f-4be9-bc47-8ca0f94064a9" Apr 17 21:37:03.692805 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:37:03.692442 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hqbt5" podUID="ee2090c8-65ec-46e0-9614-f6f0ddae32d7" Apr 17 21:37:03.765754 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:03.765733 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwgmn_60e0c58d-3db8-4433-a617-00082bd25488/ovn-acl-logging/0.log" Apr 17 21:37:03.766024 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:03.766004 2564 generic.go:358] "Generic (PLEG): container finished" podID="60e0c58d-3db8-4433-a617-00082bd25488" containerID="a7023459655d0af4adad7aa1084d574bdd81eb834452f43b6e123aae998de983" exitCode=1 Apr 17 21:37:03.766101 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:03.766067 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwgmn" event={"ID":"60e0c58d-3db8-4433-a617-00082bd25488","Type":"ContainerStarted","Data":"34cc001817cef15c31f1e11a9dd27ae7b75e6efd8c04e79a23aefdc159d062b0"} Apr 17 21:37:03.766148 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:03.766099 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwgmn" event={"ID":"60e0c58d-3db8-4433-a617-00082bd25488","Type":"ContainerStarted","Data":"b39423fa9d5d3fe9b0c1140813f6cc6e252542ba7b8763eb1f4e8c5078fdf06a"} Apr 17 21:37:03.766148 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:03.766113 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwgmn" event={"ID":"60e0c58d-3db8-4433-a617-00082bd25488","Type":"ContainerStarted","Data":"bbf1e3ffa89681594bd42f1f0b202cb4bb1514e63a82ebe2e382f861b608a65a"} Apr 17 21:37:03.766148 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:03.766125 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwgmn" event={"ID":"60e0c58d-3db8-4433-a617-00082bd25488","Type":"ContainerDied","Data":"a7023459655d0af4adad7aa1084d574bdd81eb834452f43b6e123aae998de983"} Apr 17 21:37:03.766148 
ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:03.766139 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwgmn" event={"ID":"60e0c58d-3db8-4433-a617-00082bd25488","Type":"ContainerStarted","Data":"324da8b2303443dbd2c053bb8e304d2318176f0134d9165ce2b1ba32d5b05c2f"} Apr 17 21:37:03.767214 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:03.767195 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-rghrv" event={"ID":"ae12babd-ac52-4639-8c7c-50eb1dd9fb6b","Type":"ContainerStarted","Data":"f8791acbd3c856e5d550934508a2981ffc28e41db41f1b95c766411f8dca0add"} Apr 17 21:37:03.768483 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:03.768466 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g55xk" event={"ID":"f1458f6c-b1bc-4b38-8141-1f70918a345d","Type":"ContainerStarted","Data":"559f3658a10daac582a44a30ed7888aa7e37e1017cfe466832b99b8e33e75297"} Apr 17 21:37:03.769545 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:03.769527 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-l7tk5" event={"ID":"116be4c2-a389-4822-bd06-12d2e0fcf15a","Type":"ContainerStarted","Data":"255174a145df61c4b9560bcd6a20295057f1c15feb74e3612b08f2dc18f318ce"} Apr 17 21:37:03.770879 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:03.770857 2564 generic.go:358] "Generic (PLEG): container finished" podID="1ced47e5-b9ea-4efa-8587-2c824560fd6c" containerID="f2be63fa587901fa8c2fe624b4c74a8a133b9012ac6ee10b9b26a7ab89d656e3" exitCode=0 Apr 17 21:37:03.770964 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:03.770881 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4wvmp" event={"ID":"1ced47e5-b9ea-4efa-8587-2c824560fd6c","Type":"ContainerDied","Data":"f2be63fa587901fa8c2fe624b4c74a8a133b9012ac6ee10b9b26a7ab89d656e3"} Apr 17 21:37:03.772192 ip-10-0-132-27 
kubenswrapper[2564]: I0417 21:37:03.772130 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-v9gzn" event={"ID":"87fbd26a-4a22-4878-91ae-b4b73c69c322","Type":"ContainerStarted","Data":"a96b5949ea22c7d5d39e6f740768e37a1b0111f0d3aa965cbaf2f06788721f5b"} Apr 17 21:37:03.773299 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:03.773278 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-bmcmp" event={"ID":"ab744ec2-8a2e-4824-b6e1-3ec78e188e1e","Type":"ContainerStarted","Data":"d9f459df44efe246ec7205abc6160a2b844a3b5f31040753e7e33d6680b64e87"} Apr 17 21:37:03.774558 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:03.774535 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-nlczj" event={"ID":"960eb4ac-0adf-443b-8b6e-b34cc770fb3a","Type":"ContainerStarted","Data":"9f898984c2c1efc14debdb54ca4b344e5d78e27a36628798ce52e95ad8cc9467"} Apr 17 21:37:03.782722 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:03.782691 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-rghrv" podStartSLOduration=4.020317306 podStartE2EDuration="20.782681889s" podCreationTimestamp="2026-04-17 21:36:43 +0000 UTC" firstStartedPulling="2026-04-17 21:36:46.306338638 +0000 UTC m=+3.069098093" lastFinishedPulling="2026-04-17 21:37:03.068703208 +0000 UTC m=+19.831462676" observedRunningTime="2026-04-17 21:37:03.782467581 +0000 UTC m=+20.545227049" watchObservedRunningTime="2026-04-17 21:37:03.782681889 +0000 UTC m=+20.545441366" Apr 17 21:37:03.782928 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:03.782909 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-27.ec2.internal" podStartSLOduration=19.78290479 podStartE2EDuration="19.78290479s" podCreationTimestamp="2026-04-17 21:36:44 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 21:36:48.766150689 +0000 UTC m=+5.528910168" watchObservedRunningTime="2026-04-17 21:37:03.78290479 +0000 UTC m=+20.545664267" Apr 17 21:37:03.799004 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:03.798939 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-bmcmp" podStartSLOduration=4.036756513 podStartE2EDuration="20.798929067s" podCreationTimestamp="2026-04-17 21:36:43 +0000 UTC" firstStartedPulling="2026-04-17 21:36:46.300534357 +0000 UTC m=+3.063293814" lastFinishedPulling="2026-04-17 21:37:03.062706905 +0000 UTC m=+19.825466368" observedRunningTime="2026-04-17 21:37:03.79842278 +0000 UTC m=+20.561182258" watchObservedRunningTime="2026-04-17 21:37:03.798929067 +0000 UTC m=+20.561688544" Apr 17 21:37:03.831137 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:03.831099 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-l7tk5" podStartSLOduration=4.073330861 podStartE2EDuration="20.831087082s" podCreationTimestamp="2026-04-17 21:36:43 +0000 UTC" firstStartedPulling="2026-04-17 21:36:46.304968046 +0000 UTC m=+3.067727516" lastFinishedPulling="2026-04-17 21:37:03.062724275 +0000 UTC m=+19.825483737" observedRunningTime="2026-04-17 21:37:03.830855766 +0000 UTC m=+20.593615243" watchObservedRunningTime="2026-04-17 21:37:03.831087082 +0000 UTC m=+20.593846560" Apr 17 21:37:03.844953 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:03.844907 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-nlczj" podStartSLOduration=11.961319471 podStartE2EDuration="20.844889743s" podCreationTimestamp="2026-04-17 21:36:43 +0000 UTC" firstStartedPulling="2026-04-17 21:36:46.300049344 +0000 UTC m=+3.062808800" lastFinishedPulling="2026-04-17 21:36:55.183619605 +0000 UTC m=+11.946379072" 
observedRunningTime="2026-04-17 21:37:03.844298361 +0000 UTC m=+20.607057839" watchObservedRunningTime="2026-04-17 21:37:03.844889743 +0000 UTC m=+20.607649224" Apr 17 21:37:03.860234 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:03.860189 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-v9gzn" podStartSLOduration=4.091726272 podStartE2EDuration="20.860174739s" podCreationTimestamp="2026-04-17 21:36:43 +0000 UTC" firstStartedPulling="2026-04-17 21:36:46.304451015 +0000 UTC m=+3.067210483" lastFinishedPulling="2026-04-17 21:37:03.072899482 +0000 UTC m=+19.835658950" observedRunningTime="2026-04-17 21:37:03.859376389 +0000 UTC m=+20.622135867" watchObservedRunningTime="2026-04-17 21:37:03.860174739 +0000 UTC m=+20.622934220" Apr 17 21:37:04.357742 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:04.357719 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-bmcmp" Apr 17 21:37:04.778001 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:04.777964 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-5dwgb" event={"ID":"fb2a6908-4966-4b74-86c6-f31cd952cff2","Type":"ContainerStarted","Data":"38289880f53f08187f22d10ca01586ff6fd1383e9b19bdd7711bccdbebd48a5a"} Apr 17 21:37:04.780747 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:04.780725 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwgmn_60e0c58d-3db8-4433-a617-00082bd25488/ovn-acl-logging/0.log" Apr 17 21:37:04.781258 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:04.781221 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwgmn" event={"ID":"60e0c58d-3db8-4433-a617-00082bd25488","Type":"ContainerStarted","Data":"b8a2fd6e7a2e5ad08d6fc0bfa9f5e5623cac54978bdf7b899053f905c06c42f5"} Apr 17 21:37:04.790945 ip-10-0-132-27 kubenswrapper[2564]: I0417 
21:37:04.790910 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-5dwgb" podStartSLOduration=5.03142836 podStartE2EDuration="21.790896827s" podCreationTimestamp="2026-04-17 21:36:43 +0000 UTC" firstStartedPulling="2026-04-17 21:36:46.303387256 +0000 UTC m=+3.066146726" lastFinishedPulling="2026-04-17 21:37:03.06285573 +0000 UTC m=+19.825615193" observedRunningTime="2026-04-17 21:37:04.790889551 +0000 UTC m=+21.553649035" watchObservedRunningTime="2026-04-17 21:37:04.790896827 +0000 UTC m=+21.553656307" Apr 17 21:37:04.836743 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:04.836672 2564 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 17 21:37:05.636655 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:05.636615 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-bmcmp" Apr 17 21:37:05.637321 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:05.637304 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-bmcmp" Apr 17 21:37:05.690000 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:05.689885 2564 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T21:37:04.836693006Z","UUID":"e402511d-8afb-443a-944c-758e19829792","Handler":null,"Name":"","Endpoint":""} Apr 17 21:37:05.693775 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:05.693748 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-hqbt5" Apr 17 21:37:05.693923 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:05.693748 2564 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 17 21:37:05.693923 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:05.693862 2564 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 17 21:37:05.693923 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:05.693917 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ddrrn" Apr 17 21:37:05.694120 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:37:05.693855 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hqbt5" podUID="ee2090c8-65ec-46e0-9614-f6f0ddae32d7" Apr 17 21:37:05.694120 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:37:05.694007 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-ddrrn" podUID="803936af-5a7f-4be9-bc47-8ca0f94064a9" Apr 17 21:37:05.785160 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:05.785126 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g55xk" event={"ID":"f1458f6c-b1bc-4b38-8141-1f70918a345d","Type":"ContainerStarted","Data":"b05ad4b925595123b399a2dd23c1c0d0063c604dcbd9a129c8f93de1bbb814d5"} Apr 17 21:37:05.786002 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:05.785978 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-bmcmp" Apr 17 21:37:06.790002 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:06.789810 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwgmn_60e0c58d-3db8-4433-a617-00082bd25488/ovn-acl-logging/0.log" Apr 17 21:37:06.790475 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:06.790379 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwgmn" event={"ID":"60e0c58d-3db8-4433-a617-00082bd25488","Type":"ContainerStarted","Data":"773fa6e5dbf3d2a38dc7e2ee0fbfeb28051e4fe6cfceb2c8e352bafed345f3ca"} Apr 17 21:37:06.792424 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:06.792396 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g55xk" event={"ID":"f1458f6c-b1bc-4b38-8141-1f70918a345d","Type":"ContainerStarted","Data":"b3de9f3bf6576a02acedda993b466654521b0d73983d9b807f1e2b9346d57dc0"} Apr 17 21:37:06.808496 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:06.808454 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-g55xk" podStartSLOduration=4.120068452 podStartE2EDuration="23.808440477s" podCreationTimestamp="2026-04-17 21:36:43 +0000 UTC" firstStartedPulling="2026-04-17 21:36:46.308894928 +0000 
UTC m=+3.071654397" lastFinishedPulling="2026-04-17 21:37:05.997266967 +0000 UTC m=+22.760026422" observedRunningTime="2026-04-17 21:37:06.808120773 +0000 UTC m=+23.570880252" watchObservedRunningTime="2026-04-17 21:37:06.808440477 +0000 UTC m=+23.571199953" Apr 17 21:37:07.689227 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:07.689198 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ddrrn" Apr 17 21:37:07.689413 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:37:07.689314 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ddrrn" podUID="803936af-5a7f-4be9-bc47-8ca0f94064a9" Apr 17 21:37:07.689413 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:07.689403 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hqbt5" Apr 17 21:37:07.689544 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:37:07.689519 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-hqbt5" podUID="ee2090c8-65ec-46e0-9614-f6f0ddae32d7" Apr 17 21:37:08.798452 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:08.798214 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwgmn_60e0c58d-3db8-4433-a617-00082bd25488/ovn-acl-logging/0.log" Apr 17 21:37:08.799168 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:08.798729 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwgmn" event={"ID":"60e0c58d-3db8-4433-a617-00082bd25488","Type":"ContainerStarted","Data":"e94078f387654d7f6bd21511c275ffd11a127016dba2ecf3602ae285271cd613"} Apr 17 21:37:08.799168 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:08.799038 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-kwgmn" Apr 17 21:37:08.799361 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:08.799344 2564 scope.go:117] "RemoveContainer" containerID="a7023459655d0af4adad7aa1084d574bdd81eb834452f43b6e123aae998de983" Apr 17 21:37:08.814153 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:08.814133 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kwgmn" Apr 17 21:37:09.689811 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:09.689781 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hqbt5" Apr 17 21:37:09.689999 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:09.689788 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ddrrn" Apr 17 21:37:09.689999 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:37:09.689906 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hqbt5" podUID="ee2090c8-65ec-46e0-9614-f6f0ddae32d7" Apr 17 21:37:09.689999 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:37:09.689979 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ddrrn" podUID="803936af-5a7f-4be9-bc47-8ca0f94064a9" Apr 17 21:37:09.804294 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:09.803203 2564 generic.go:358] "Generic (PLEG): container finished" podID="1ced47e5-b9ea-4efa-8587-2c824560fd6c" containerID="8a8c406b61569f640ce0dc477892714a01f16ca5aec0c55ef6112f4ecdcfba54" exitCode=0 Apr 17 21:37:09.804294 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:09.803331 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4wvmp" event={"ID":"1ced47e5-b9ea-4efa-8587-2c824560fd6c","Type":"ContainerDied","Data":"8a8c406b61569f640ce0dc477892714a01f16ca5aec0c55ef6112f4ecdcfba54"} Apr 17 21:37:09.812386 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:09.811801 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwgmn_60e0c58d-3db8-4433-a617-00082bd25488/ovn-acl-logging/0.log" Apr 17 21:37:09.812526 ip-10-0-132-27 kubenswrapper[2564]: I0417 
21:37:09.812502 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kwgmn" event={"ID":"60e0c58d-3db8-4433-a617-00082bd25488","Type":"ContainerStarted","Data":"3c2acc2b0055391e2138f3317ea0b1118e717170de69291676876dda73a75309"} Apr 17 21:37:09.812915 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:09.812893 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-kwgmn" Apr 17 21:37:09.813414 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:09.813013 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-kwgmn" Apr 17 21:37:09.836499 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:09.836476 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kwgmn" Apr 17 21:37:09.855763 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:09.855727 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-kwgmn" podStartSLOduration=10.032774524 podStartE2EDuration="26.855714803s" podCreationTimestamp="2026-04-17 21:36:43 +0000 UTC" firstStartedPulling="2026-04-17 21:36:46.309370655 +0000 UTC m=+3.072130111" lastFinishedPulling="2026-04-17 21:37:03.13231092 +0000 UTC m=+19.895070390" observedRunningTime="2026-04-17 21:37:09.854052239 +0000 UTC m=+26.616811720" watchObservedRunningTime="2026-04-17 21:37:09.855714803 +0000 UTC m=+26.618474312" Apr 17 21:37:10.467602 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:10.467560 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-hqbt5"] Apr 17 21:37:10.467787 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:10.467698 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-hqbt5" Apr 17 21:37:10.467859 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:37:10.467796 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hqbt5" podUID="ee2090c8-65ec-46e0-9614-f6f0ddae32d7" Apr 17 21:37:10.470060 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:10.470034 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-ddrrn"] Apr 17 21:37:10.470162 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:10.470106 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ddrrn" Apr 17 21:37:10.470199 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:37:10.470172 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-ddrrn" podUID="803936af-5a7f-4be9-bc47-8ca0f94064a9" Apr 17 21:37:11.817544 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:11.817510 2564 generic.go:358] "Generic (PLEG): container finished" podID="1ced47e5-b9ea-4efa-8587-2c824560fd6c" containerID="b558b038191982430cf3e63f8a5841ea9cc48f88033363b7df8b3da3e5e59448" exitCode=0 Apr 17 21:37:11.817996 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:11.817607 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4wvmp" event={"ID":"1ced47e5-b9ea-4efa-8587-2c824560fd6c","Type":"ContainerDied","Data":"b558b038191982430cf3e63f8a5841ea9cc48f88033363b7df8b3da3e5e59448"} Apr 17 21:37:12.689855 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:12.689681 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ddrrn" Apr 17 21:37:12.690025 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:12.689745 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hqbt5" Apr 17 21:37:12.690025 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:37:12.689928 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ddrrn" podUID="803936af-5a7f-4be9-bc47-8ca0f94064a9" Apr 17 21:37:12.690025 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:37:12.690015 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hqbt5" podUID="ee2090c8-65ec-46e0-9614-f6f0ddae32d7" Apr 17 21:37:13.823869 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:13.823836 2564 generic.go:358] "Generic (PLEG): container finished" podID="1ced47e5-b9ea-4efa-8587-2c824560fd6c" containerID="7a5f122d8de0bcbc7b1809a8aaaa9e0898052e42de13403433614c7d70246498" exitCode=0 Apr 17 21:37:13.824303 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:13.823882 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4wvmp" event={"ID":"1ced47e5-b9ea-4efa-8587-2c824560fd6c","Type":"ContainerDied","Data":"7a5f122d8de0bcbc7b1809a8aaaa9e0898052e42de13403433614c7d70246498"} Apr 17 21:37:14.689433 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:14.689402 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hqbt5" Apr 17 21:37:14.689623 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:37:14.689549 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-hqbt5" podUID="ee2090c8-65ec-46e0-9614-f6f0ddae32d7" Apr 17 21:37:14.689675 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:14.689619 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ddrrn" Apr 17 21:37:14.689765 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:37:14.689728 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ddrrn" podUID="803936af-5a7f-4be9-bc47-8ca0f94064a9" Apr 17 21:37:16.026648 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.026562 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-l7tk5_116be4c2-a389-4822-bd06-12d2e0fcf15a/dns-node-resolver/0.log" Apr 17 21:37:16.097929 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.097901 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-27.ec2.internal" event="NodeReady" Apr 17 21:37:16.098132 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.098045 2564 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 17 21:37:16.135397 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.135368 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-674876ddd5-8bzls"] Apr 17 21:37:16.154850 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.154821 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-wcgv2"] Apr 17 21:37:16.155032 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.154994 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-674876ddd5-8bzls" Apr 17 21:37:16.158677 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.158649 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 17 21:37:16.158816 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.158702 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-lmbh2\"" Apr 17 21:37:16.159261 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.159242 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 17 21:37:16.159422 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.159403 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 17 21:37:16.169095 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.169067 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 17 21:37:16.174156 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.174132 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-9fcp4"] Apr 17 21:37:16.174321 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.174300 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-wcgv2" Apr 17 21:37:16.176752 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.176735 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 17 21:37:16.176962 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.176832 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-wjbk2\"" Apr 17 21:37:16.177043 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.176836 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 17 21:37:16.188435 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.188417 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-674876ddd5-8bzls"] Apr 17 21:37:16.188537 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.188442 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-9fcp4"] Apr 17 21:37:16.188537 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.188457 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-wcgv2"] Apr 17 21:37:16.188642 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.188544 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9fcp4" Apr 17 21:37:16.190995 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.190943 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 17 21:37:16.190995 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.190977 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 17 21:37:16.191159 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.191024 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-m9tzd\"" Apr 17 21:37:16.191159 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.191110 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 17 21:37:16.277972 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.277879 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d95e33bb-b1a4-4e97-8fbf-eab2a80b4bdf-cert\") pod \"ingress-canary-9fcp4\" (UID: \"d95e33bb-b1a4-4e97-8fbf-eab2a80b4bdf\") " pod="openshift-ingress-canary/ingress-canary-9fcp4" Apr 17 21:37:16.277972 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.277927 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7lrm\" (UniqueName: \"kubernetes.io/projected/d95e33bb-b1a4-4e97-8fbf-eab2a80b4bdf-kube-api-access-c7lrm\") pod \"ingress-canary-9fcp4\" (UID: \"d95e33bb-b1a4-4e97-8fbf-eab2a80b4bdf\") " pod="openshift-ingress-canary/ingress-canary-9fcp4" Apr 17 21:37:16.277972 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.277955 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb-bound-sa-token\") pod \"image-registry-674876ddd5-8bzls\" (UID: \"b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb\") " pod="openshift-image-registry/image-registry-674876ddd5-8bzls" Apr 17 21:37:16.278238 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.278040 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb-registry-certificates\") pod \"image-registry-674876ddd5-8bzls\" (UID: \"b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb\") " pod="openshift-image-registry/image-registry-674876ddd5-8bzls" Apr 17 21:37:16.278238 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.278087 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb-trusted-ca\") pod \"image-registry-674876ddd5-8bzls\" (UID: \"b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb\") " pod="openshift-image-registry/image-registry-674876ddd5-8bzls" Apr 17 21:37:16.278238 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.278125 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb-installation-pull-secrets\") pod \"image-registry-674876ddd5-8bzls\" (UID: \"b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb\") " pod="openshift-image-registry/image-registry-674876ddd5-8bzls" Apr 17 21:37:16.278238 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.278158 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d9e8a464-4161-4835-bb2a-311f468b76b3-metrics-tls\") pod \"dns-default-wcgv2\" (UID: \"d9e8a464-4161-4835-bb2a-311f468b76b3\") " pod="openshift-dns/dns-default-wcgv2" 
Apr 17 21:37:16.278238 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.278203 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb-ca-trust-extracted\") pod \"image-registry-674876ddd5-8bzls\" (UID: \"b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb\") " pod="openshift-image-registry/image-registry-674876ddd5-8bzls" Apr 17 21:37:16.278238 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.278228 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dk52m\" (UniqueName: \"kubernetes.io/projected/b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb-kube-api-access-dk52m\") pod \"image-registry-674876ddd5-8bzls\" (UID: \"b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb\") " pod="openshift-image-registry/image-registry-674876ddd5-8bzls" Apr 17 21:37:16.278511 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.278250 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrw44\" (UniqueName: \"kubernetes.io/projected/d9e8a464-4161-4835-bb2a-311f468b76b3-kube-api-access-lrw44\") pod \"dns-default-wcgv2\" (UID: \"d9e8a464-4161-4835-bb2a-311f468b76b3\") " pod="openshift-dns/dns-default-wcgv2" Apr 17 21:37:16.278511 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.278294 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d9e8a464-4161-4835-bb2a-311f468b76b3-tmp-dir\") pod \"dns-default-wcgv2\" (UID: \"d9e8a464-4161-4835-bb2a-311f468b76b3\") " pod="openshift-dns/dns-default-wcgv2" Apr 17 21:37:16.278511 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.278320 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/d9e8a464-4161-4835-bb2a-311f468b76b3-config-volume\") pod \"dns-default-wcgv2\" (UID: \"d9e8a464-4161-4835-bb2a-311f468b76b3\") " pod="openshift-dns/dns-default-wcgv2" Apr 17 21:37:16.278511 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.278362 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb-image-registry-private-configuration\") pod \"image-registry-674876ddd5-8bzls\" (UID: \"b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb\") " pod="openshift-image-registry/image-registry-674876ddd5-8bzls" Apr 17 21:37:16.278511 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.278389 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb-registry-tls\") pod \"image-registry-674876ddd5-8bzls\" (UID: \"b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb\") " pod="openshift-image-registry/image-registry-674876ddd5-8bzls" Apr 17 21:37:16.379517 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.379482 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d95e33bb-b1a4-4e97-8fbf-eab2a80b4bdf-cert\") pod \"ingress-canary-9fcp4\" (UID: \"d95e33bb-b1a4-4e97-8fbf-eab2a80b4bdf\") " pod="openshift-ingress-canary/ingress-canary-9fcp4" Apr 17 21:37:16.379517 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.379528 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c7lrm\" (UniqueName: \"kubernetes.io/projected/d95e33bb-b1a4-4e97-8fbf-eab2a80b4bdf-kube-api-access-c7lrm\") pod \"ingress-canary-9fcp4\" (UID: \"d95e33bb-b1a4-4e97-8fbf-eab2a80b4bdf\") " pod="openshift-ingress-canary/ingress-canary-9fcp4" Apr 17 21:37:16.379781 ip-10-0-132-27 kubenswrapper[2564]: I0417 
21:37:16.379554 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb-bound-sa-token\") pod \"image-registry-674876ddd5-8bzls\" (UID: \"b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb\") " pod="openshift-image-registry/image-registry-674876ddd5-8bzls" Apr 17 21:37:16.379781 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:37:16.379659 2564 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 21:37:16.379781 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:37:16.379720 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d95e33bb-b1a4-4e97-8fbf-eab2a80b4bdf-cert podName:d95e33bb-b1a4-4e97-8fbf-eab2a80b4bdf nodeName:}" failed. No retries permitted until 2026-04-17 21:37:16.879699652 +0000 UTC m=+33.642459125 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d95e33bb-b1a4-4e97-8fbf-eab2a80b4bdf-cert") pod "ingress-canary-9fcp4" (UID: "d95e33bb-b1a4-4e97-8fbf-eab2a80b4bdf") : secret "canary-serving-cert" not found Apr 17 21:37:16.379781 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.379714 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb-registry-certificates\") pod \"image-registry-674876ddd5-8bzls\" (UID: \"b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb\") " pod="openshift-image-registry/image-registry-674876ddd5-8bzls" Apr 17 21:37:16.379996 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.379802 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb-trusted-ca\") pod \"image-registry-674876ddd5-8bzls\" (UID: \"b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb\") " 
pod="openshift-image-registry/image-registry-674876ddd5-8bzls" Apr 17 21:37:16.379996 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.379841 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb-installation-pull-secrets\") pod \"image-registry-674876ddd5-8bzls\" (UID: \"b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb\") " pod="openshift-image-registry/image-registry-674876ddd5-8bzls" Apr 17 21:37:16.379996 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.379941 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d9e8a464-4161-4835-bb2a-311f468b76b3-metrics-tls\") pod \"dns-default-wcgv2\" (UID: \"d9e8a464-4161-4835-bb2a-311f468b76b3\") " pod="openshift-dns/dns-default-wcgv2" Apr 17 21:37:16.380206 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.379996 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb-ca-trust-extracted\") pod \"image-registry-674876ddd5-8bzls\" (UID: \"b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb\") " pod="openshift-image-registry/image-registry-674876ddd5-8bzls" Apr 17 21:37:16.380206 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.380022 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dk52m\" (UniqueName: \"kubernetes.io/projected/b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb-kube-api-access-dk52m\") pod \"image-registry-674876ddd5-8bzls\" (UID: \"b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb\") " pod="openshift-image-registry/image-registry-674876ddd5-8bzls" Apr 17 21:37:16.380206 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.380048 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lrw44\" (UniqueName: 
\"kubernetes.io/projected/d9e8a464-4161-4835-bb2a-311f468b76b3-kube-api-access-lrw44\") pod \"dns-default-wcgv2\" (UID: \"d9e8a464-4161-4835-bb2a-311f468b76b3\") " pod="openshift-dns/dns-default-wcgv2" Apr 17 21:37:16.380206 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:37:16.380081 2564 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 21:37:16.380206 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.380087 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d9e8a464-4161-4835-bb2a-311f468b76b3-tmp-dir\") pod \"dns-default-wcgv2\" (UID: \"d9e8a464-4161-4835-bb2a-311f468b76b3\") " pod="openshift-dns/dns-default-wcgv2" Apr 17 21:37:16.380206 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.380113 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d9e8a464-4161-4835-bb2a-311f468b76b3-config-volume\") pod \"dns-default-wcgv2\" (UID: \"d9e8a464-4161-4835-bb2a-311f468b76b3\") " pod="openshift-dns/dns-default-wcgv2" Apr 17 21:37:16.380206 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:37:16.380140 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9e8a464-4161-4835-bb2a-311f468b76b3-metrics-tls podName:d9e8a464-4161-4835-bb2a-311f468b76b3 nodeName:}" failed. No retries permitted until 2026-04-17 21:37:16.880125884 +0000 UTC m=+33.642885354 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d9e8a464-4161-4835-bb2a-311f468b76b3-metrics-tls") pod "dns-default-wcgv2" (UID: "d9e8a464-4161-4835-bb2a-311f468b76b3") : secret "dns-default-metrics-tls" not found Apr 17 21:37:16.380206 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.380178 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb-image-registry-private-configuration\") pod \"image-registry-674876ddd5-8bzls\" (UID: \"b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb\") " pod="openshift-image-registry/image-registry-674876ddd5-8bzls" Apr 17 21:37:16.380206 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.380201 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb-registry-tls\") pod \"image-registry-674876ddd5-8bzls\" (UID: \"b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb\") " pod="openshift-image-registry/image-registry-674876ddd5-8bzls" Apr 17 21:37:16.380668 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:37:16.380301 2564 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 21:37:16.380668 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:37:16.380311 2564 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-674876ddd5-8bzls: secret "image-registry-tls" not found Apr 17 21:37:16.380668 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:37:16.380340 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb-registry-tls podName:b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb nodeName:}" failed. 
No retries permitted until 2026-04-17 21:37:16.880330452 +0000 UTC m=+33.643089922 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb-registry-tls") pod "image-registry-674876ddd5-8bzls" (UID: "b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb") : secret "image-registry-tls" not found Apr 17 21:37:16.380668 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.380399 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb-registry-certificates\") pod \"image-registry-674876ddd5-8bzls\" (UID: \"b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb\") " pod="openshift-image-registry/image-registry-674876ddd5-8bzls" Apr 17 21:37:16.380874 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.380677 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb-trusted-ca\") pod \"image-registry-674876ddd5-8bzls\" (UID: \"b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb\") " pod="openshift-image-registry/image-registry-674876ddd5-8bzls" Apr 17 21:37:16.380874 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.380842 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d9e8a464-4161-4835-bb2a-311f468b76b3-config-volume\") pod \"dns-default-wcgv2\" (UID: \"d9e8a464-4161-4835-bb2a-311f468b76b3\") " pod="openshift-dns/dns-default-wcgv2" Apr 17 21:37:16.380970 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.380887 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d9e8a464-4161-4835-bb2a-311f468b76b3-tmp-dir\") pod \"dns-default-wcgv2\" (UID: \"d9e8a464-4161-4835-bb2a-311f468b76b3\") " pod="openshift-dns/dns-default-wcgv2" Apr 17 21:37:16.380970 
ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.380957 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb-ca-trust-extracted\") pod \"image-registry-674876ddd5-8bzls\" (UID: \"b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb\") " pod="openshift-image-registry/image-registry-674876ddd5-8bzls" Apr 17 21:37:16.384569 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.384549 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb-image-registry-private-configuration\") pod \"image-registry-674876ddd5-8bzls\" (UID: \"b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb\") " pod="openshift-image-registry/image-registry-674876ddd5-8bzls" Apr 17 21:37:16.384705 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.384572 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb-installation-pull-secrets\") pod \"image-registry-674876ddd5-8bzls\" (UID: \"b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb\") " pod="openshift-image-registry/image-registry-674876ddd5-8bzls" Apr 17 21:37:16.393447 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.393425 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dk52m\" (UniqueName: \"kubernetes.io/projected/b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb-kube-api-access-dk52m\") pod \"image-registry-674876ddd5-8bzls\" (UID: \"b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb\") " pod="openshift-image-registry/image-registry-674876ddd5-8bzls" Apr 17 21:37:16.393812 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.393792 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb-bound-sa-token\") pod \"image-registry-674876ddd5-8bzls\" (UID: \"b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb\") " pod="openshift-image-registry/image-registry-674876ddd5-8bzls" Apr 17 21:37:16.394028 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.394004 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrw44\" (UniqueName: \"kubernetes.io/projected/d9e8a464-4161-4835-bb2a-311f468b76b3-kube-api-access-lrw44\") pod \"dns-default-wcgv2\" (UID: \"d9e8a464-4161-4835-bb2a-311f468b76b3\") " pod="openshift-dns/dns-default-wcgv2" Apr 17 21:37:16.394777 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.394758 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7lrm\" (UniqueName: \"kubernetes.io/projected/d95e33bb-b1a4-4e97-8fbf-eab2a80b4bdf-kube-api-access-c7lrm\") pod \"ingress-canary-9fcp4\" (UID: \"d95e33bb-b1a4-4e97-8fbf-eab2a80b4bdf\") " pod="openshift-ingress-canary/ingress-canary-9fcp4" Apr 17 21:37:16.640938 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.640903 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-8d4bc"] Apr 17 21:37:16.665834 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.665796 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-8d4bc"] Apr 17 21:37:16.665999 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.665937 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8d4bc" Apr 17 21:37:16.668579 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.668548 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 17 21:37:16.689259 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.689235 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-hqbt5" Apr 17 21:37:16.689381 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.689241 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ddrrn" Apr 17 21:37:16.692069 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.692049 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-nlncx\"" Apr 17 21:37:16.692069 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.692063 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 21:37:16.692220 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.692082 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-9r26l\"" Apr 17 21:37:16.692356 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.692336 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 21:37:16.692449 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.692373 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 21:37:16.783256 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.783206 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/202e1a9f-d233-463c-8e71-87d017274c62-original-pull-secret\") pod \"global-pull-secret-syncer-8d4bc\" (UID: \"202e1a9f-d233-463c-8e71-87d017274c62\") " pod="kube-system/global-pull-secret-syncer-8d4bc" Apr 17 21:37:16.783444 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.783322 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/202e1a9f-d233-463c-8e71-87d017274c62-kubelet-config\") pod \"global-pull-secret-syncer-8d4bc\" (UID: \"202e1a9f-d233-463c-8e71-87d017274c62\") " pod="kube-system/global-pull-secret-syncer-8d4bc" Apr 17 21:37:16.783444 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.783415 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/202e1a9f-d233-463c-8e71-87d017274c62-dbus\") pod \"global-pull-secret-syncer-8d4bc\" (UID: \"202e1a9f-d233-463c-8e71-87d017274c62\") " pod="kube-system/global-pull-secret-syncer-8d4bc" Apr 17 21:37:16.814174 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.814129 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-nlczj_960eb4ac-0adf-443b-8b6e-b34cc770fb3a/node-ca/0.log" Apr 17 21:37:16.883205 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.883174 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-tlkjn"] Apr 17 21:37:16.884406 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.884385 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/202e1a9f-d233-463c-8e71-87d017274c62-original-pull-secret\") pod \"global-pull-secret-syncer-8d4bc\" (UID: \"202e1a9f-d233-463c-8e71-87d017274c62\") " pod="kube-system/global-pull-secret-syncer-8d4bc" Apr 17 21:37:16.884491 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.884432 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb-registry-tls\") pod \"image-registry-674876ddd5-8bzls\" (UID: \"b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb\") " pod="openshift-image-registry/image-registry-674876ddd5-8bzls" Apr 17 21:37:16.884491 ip-10-0-132-27 
kubenswrapper[2564]: I0417 21:37:16.884450 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/202e1a9f-d233-463c-8e71-87d017274c62-kubelet-config\") pod \"global-pull-secret-syncer-8d4bc\" (UID: \"202e1a9f-d233-463c-8e71-87d017274c62\") " pod="kube-system/global-pull-secret-syncer-8d4bc" Apr 17 21:37:16.884491 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.884477 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/202e1a9f-d233-463c-8e71-87d017274c62-dbus\") pod \"global-pull-secret-syncer-8d4bc\" (UID: \"202e1a9f-d233-463c-8e71-87d017274c62\") " pod="kube-system/global-pull-secret-syncer-8d4bc" Apr 17 21:37:16.884636 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.884499 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d95e33bb-b1a4-4e97-8fbf-eab2a80b4bdf-cert\") pod \"ingress-canary-9fcp4\" (UID: \"d95e33bb-b1a4-4e97-8fbf-eab2a80b4bdf\") " pod="openshift-ingress-canary/ingress-canary-9fcp4" Apr 17 21:37:16.884636 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.884520 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d9e8a464-4161-4835-bb2a-311f468b76b3-metrics-tls\") pod \"dns-default-wcgv2\" (UID: \"d9e8a464-4161-4835-bb2a-311f468b76b3\") " pod="openshift-dns/dns-default-wcgv2" Apr 17 21:37:16.884636 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:37:16.884540 2564 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 21:37:16.884636 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:37:16.884558 2564 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-674876ddd5-8bzls: secret "image-registry-tls" not 
found Apr 17 21:37:16.884636 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:37:16.884629 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb-registry-tls podName:b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb nodeName:}" failed. No retries permitted until 2026-04-17 21:37:17.884611544 +0000 UTC m=+34.647371023 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb-registry-tls") pod "image-registry-674876ddd5-8bzls" (UID: "b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb") : secret "image-registry-tls" not found Apr 17 21:37:16.884636 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.884629 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/202e1a9f-d233-463c-8e71-87d017274c62-kubelet-config\") pod \"global-pull-secret-syncer-8d4bc\" (UID: \"202e1a9f-d233-463c-8e71-87d017274c62\") " pod="kube-system/global-pull-secret-syncer-8d4bc" Apr 17 21:37:16.884889 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:37:16.884636 2564 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 21:37:16.884889 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:37:16.884695 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9e8a464-4161-4835-bb2a-311f468b76b3-metrics-tls podName:d9e8a464-4161-4835-bb2a-311f468b76b3 nodeName:}" failed. No retries permitted until 2026-04-17 21:37:17.884683128 +0000 UTC m=+34.647442587 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d9e8a464-4161-4835-bb2a-311f468b76b3-metrics-tls") pod "dns-default-wcgv2" (UID: "d9e8a464-4161-4835-bb2a-311f468b76b3") : secret "dns-default-metrics-tls" not found Apr 17 21:37:16.884889 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:37:16.884652 2564 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 21:37:16.884889 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:37:16.884732 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d95e33bb-b1a4-4e97-8fbf-eab2a80b4bdf-cert podName:d95e33bb-b1a4-4e97-8fbf-eab2a80b4bdf nodeName:}" failed. No retries permitted until 2026-04-17 21:37:17.884723374 +0000 UTC m=+34.647482830 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d95e33bb-b1a4-4e97-8fbf-eab2a80b4bdf-cert") pod "ingress-canary-9fcp4" (UID: "d95e33bb-b1a4-4e97-8fbf-eab2a80b4bdf") : secret "canary-serving-cert" not found Apr 17 21:37:16.884889 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.884779 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/202e1a9f-d233-463c-8e71-87d017274c62-dbus\") pod \"global-pull-secret-syncer-8d4bc\" (UID: \"202e1a9f-d233-463c-8e71-87d017274c62\") " pod="kube-system/global-pull-secret-syncer-8d4bc" Apr 17 21:37:16.886580 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.886561 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/202e1a9f-d233-463c-8e71-87d017274c62-original-pull-secret\") pod \"global-pull-secret-syncer-8d4bc\" (UID: \"202e1a9f-d233-463c-8e71-87d017274c62\") " pod="kube-system/global-pull-secret-syncer-8d4bc" Apr 17 21:37:16.900973 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.900915 2564 kubelet.go:2544] 
"SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-tlkjn"] Apr 17 21:37:16.900973 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.900970 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-tlkjn" Apr 17 21:37:16.903618 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.903583 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 17 21:37:16.903618 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.903608 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 17 21:37:16.903618 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.903614 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-z6th7\"" Apr 17 21:37:16.903843 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.903623 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 17 21:37:16.903990 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.903973 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 17 21:37:16.976604 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.976562 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-8d4bc" Apr 17 21:37:16.985728 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.985700 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/2296bc55-e2c4-4c73-b08d-0e0583540f74-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-tlkjn\" (UID: \"2296bc55-e2c4-4c73-b08d-0e0583540f74\") " pod="openshift-insights/insights-runtime-extractor-tlkjn" Apr 17 21:37:16.985861 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.985805 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-949mk\" (UniqueName: \"kubernetes.io/projected/2296bc55-e2c4-4c73-b08d-0e0583540f74-kube-api-access-949mk\") pod \"insights-runtime-extractor-tlkjn\" (UID: \"2296bc55-e2c4-4c73-b08d-0e0583540f74\") " pod="openshift-insights/insights-runtime-extractor-tlkjn" Apr 17 21:37:16.985927 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.985867 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/2296bc55-e2c4-4c73-b08d-0e0583540f74-data-volume\") pod \"insights-runtime-extractor-tlkjn\" (UID: \"2296bc55-e2c4-4c73-b08d-0e0583540f74\") " pod="openshift-insights/insights-runtime-extractor-tlkjn" Apr 17 21:37:16.985978 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.985932 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/2296bc55-e2c4-4c73-b08d-0e0583540f74-crio-socket\") pod \"insights-runtime-extractor-tlkjn\" (UID: \"2296bc55-e2c4-4c73-b08d-0e0583540f74\") " pod="openshift-insights/insights-runtime-extractor-tlkjn" Apr 17 21:37:16.985978 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:16.985972 2564 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/2296bc55-e2c4-4c73-b08d-0e0583540f74-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-tlkjn\" (UID: \"2296bc55-e2c4-4c73-b08d-0e0583540f74\") " pod="openshift-insights/insights-runtime-extractor-tlkjn" Apr 17 21:37:17.092096 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:17.091731 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/2296bc55-e2c4-4c73-b08d-0e0583540f74-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-tlkjn\" (UID: \"2296bc55-e2c4-4c73-b08d-0e0583540f74\") " pod="openshift-insights/insights-runtime-extractor-tlkjn" Apr 17 21:37:17.092096 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:17.091855 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-949mk\" (UniqueName: \"kubernetes.io/projected/2296bc55-e2c4-4c73-b08d-0e0583540f74-kube-api-access-949mk\") pod \"insights-runtime-extractor-tlkjn\" (UID: \"2296bc55-e2c4-4c73-b08d-0e0583540f74\") " pod="openshift-insights/insights-runtime-extractor-tlkjn" Apr 17 21:37:17.092096 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:17.091892 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/2296bc55-e2c4-4c73-b08d-0e0583540f74-data-volume\") pod \"insights-runtime-extractor-tlkjn\" (UID: \"2296bc55-e2c4-4c73-b08d-0e0583540f74\") " pod="openshift-insights/insights-runtime-extractor-tlkjn" Apr 17 21:37:17.092096 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:17.091971 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/2296bc55-e2c4-4c73-b08d-0e0583540f74-crio-socket\") pod \"insights-runtime-extractor-tlkjn\" (UID: 
\"2296bc55-e2c4-4c73-b08d-0e0583540f74\") " pod="openshift-insights/insights-runtime-extractor-tlkjn" Apr 17 21:37:17.092096 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:17.092024 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/2296bc55-e2c4-4c73-b08d-0e0583540f74-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-tlkjn\" (UID: \"2296bc55-e2c4-4c73-b08d-0e0583540f74\") " pod="openshift-insights/insights-runtime-extractor-tlkjn" Apr 17 21:37:17.092810 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:37:17.092150 2564 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 17 21:37:17.092810 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:37:17.092237 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2296bc55-e2c4-4c73-b08d-0e0583540f74-insights-runtime-extractor-tls podName:2296bc55-e2c4-4c73-b08d-0e0583540f74 nodeName:}" failed. No retries permitted until 2026-04-17 21:37:17.592216087 +0000 UTC m=+34.354975560 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/2296bc55-e2c4-4c73-b08d-0e0583540f74-insights-runtime-extractor-tls") pod "insights-runtime-extractor-tlkjn" (UID: "2296bc55-e2c4-4c73-b08d-0e0583540f74") : secret "insights-runtime-extractor-tls" not found Apr 17 21:37:17.092810 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:17.092243 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/2296bc55-e2c4-4c73-b08d-0e0583540f74-crio-socket\") pod \"insights-runtime-extractor-tlkjn\" (UID: \"2296bc55-e2c4-4c73-b08d-0e0583540f74\") " pod="openshift-insights/insights-runtime-extractor-tlkjn" Apr 17 21:37:17.092966 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:17.092899 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/2296bc55-e2c4-4c73-b08d-0e0583540f74-data-volume\") pod \"insights-runtime-extractor-tlkjn\" (UID: \"2296bc55-e2c4-4c73-b08d-0e0583540f74\") " pod="openshift-insights/insights-runtime-extractor-tlkjn" Apr 17 21:37:17.093503 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:17.093486 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/2296bc55-e2c4-4c73-b08d-0e0583540f74-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-tlkjn\" (UID: \"2296bc55-e2c4-4c73-b08d-0e0583540f74\") " pod="openshift-insights/insights-runtime-extractor-tlkjn" Apr 17 21:37:17.104525 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:17.104500 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-949mk\" (UniqueName: \"kubernetes.io/projected/2296bc55-e2c4-4c73-b08d-0e0583540f74-kube-api-access-949mk\") pod \"insights-runtime-extractor-tlkjn\" (UID: \"2296bc55-e2c4-4c73-b08d-0e0583540f74\") " pod="openshift-insights/insights-runtime-extractor-tlkjn" Apr 17 
21:37:17.135740 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:17.135688 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-8d4bc"] Apr 17 21:37:17.142185 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:37:17.142158 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod202e1a9f_d233_463c_8e71_87d017274c62.slice/crio-c5c85a3ef088a30900c182328fc9fb287f32ea5a0828b61e21a9acad51f1494b WatchSource:0}: Error finding container c5c85a3ef088a30900c182328fc9fb287f32ea5a0828b61e21a9acad51f1494b: Status 404 returned error can't find the container with id c5c85a3ef088a30900c182328fc9fb287f32ea5a0828b61e21a9acad51f1494b Apr 17 21:37:17.394625 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:17.394547 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ee2090c8-65ec-46e0-9614-f6f0ddae32d7-metrics-certs\") pod \"network-metrics-daemon-hqbt5\" (UID: \"ee2090c8-65ec-46e0-9614-f6f0ddae32d7\") " pod="openshift-multus/network-metrics-daemon-hqbt5" Apr 17 21:37:17.394796 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:37:17.394717 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 21:37:17.394796 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:37:17.394795 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ee2090c8-65ec-46e0-9614-f6f0ddae32d7-metrics-certs podName:ee2090c8-65ec-46e0-9614-f6f0ddae32d7 nodeName:}" failed. No retries permitted until 2026-04-17 21:37:49.394774399 +0000 UTC m=+66.157533861 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ee2090c8-65ec-46e0-9614-f6f0ddae32d7-metrics-certs") pod "network-metrics-daemon-hqbt5" (UID: "ee2090c8-65ec-46e0-9614-f6f0ddae32d7") : secret "metrics-daemon-secret" not found Apr 17 21:37:17.482295 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:17.482260 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-g4qml"] Apr 17 21:37:17.510929 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:17.510899 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-g4qml"] Apr 17 21:37:17.511084 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:17.510936 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-g4qml" Apr 17 21:37:17.513736 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:17.513713 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 17 21:37:17.513989 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:17.513975 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 17 21:37:17.514078 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:17.514004 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 17 21:37:17.515024 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:17.515006 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-ttsv7\"" Apr 17 21:37:17.515119 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:17.515035 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 17 21:37:17.596609 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:17.596566 
2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/2296bc55-e2c4-4c73-b08d-0e0583540f74-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-tlkjn\" (UID: \"2296bc55-e2c4-4c73-b08d-0e0583540f74\") " pod="openshift-insights/insights-runtime-extractor-tlkjn" Apr 17 21:37:17.596778 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:17.596654 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cnbk6\" (UniqueName: \"kubernetes.io/projected/803936af-5a7f-4be9-bc47-8ca0f94064a9-kube-api-access-cnbk6\") pod \"network-check-target-ddrrn\" (UID: \"803936af-5a7f-4be9-bc47-8ca0f94064a9\") " pod="openshift-network-diagnostics/network-check-target-ddrrn" Apr 17 21:37:17.596778 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:37:17.596721 2564 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 17 21:37:17.596867 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:37:17.596817 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2296bc55-e2c4-4c73-b08d-0e0583540f74-insights-runtime-extractor-tls podName:2296bc55-e2c4-4c73-b08d-0e0583540f74 nodeName:}" failed. No retries permitted until 2026-04-17 21:37:18.596795717 +0000 UTC m=+35.359555177 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/2296bc55-e2c4-4c73-b08d-0e0583540f74-insights-runtime-extractor-tls") pod "insights-runtime-extractor-tlkjn" (UID: "2296bc55-e2c4-4c73-b08d-0e0583540f74") : secret "insights-runtime-extractor-tls" not found Apr 17 21:37:17.599159 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:17.599132 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnbk6\" (UniqueName: \"kubernetes.io/projected/803936af-5a7f-4be9-bc47-8ca0f94064a9-kube-api-access-cnbk6\") pod \"network-check-target-ddrrn\" (UID: \"803936af-5a7f-4be9-bc47-8ca0f94064a9\") " pod="openshift-network-diagnostics/network-check-target-ddrrn" Apr 17 21:37:17.606147 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:17.606124 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ddrrn" Apr 17 21:37:17.697415 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:17.697333 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/43bdc0ea-3cbf-4656-afdd-8e08712a10ed-signing-cabundle\") pod \"service-ca-865cb79987-g4qml\" (UID: \"43bdc0ea-3cbf-4656-afdd-8e08712a10ed\") " pod="openshift-service-ca/service-ca-865cb79987-g4qml" Apr 17 21:37:17.697415 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:17.697390 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/43bdc0ea-3cbf-4656-afdd-8e08712a10ed-signing-key\") pod \"service-ca-865cb79987-g4qml\" (UID: \"43bdc0ea-3cbf-4656-afdd-8e08712a10ed\") " pod="openshift-service-ca/service-ca-865cb79987-g4qml" Apr 17 21:37:17.697655 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:17.697422 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-27kzr\" (UniqueName: \"kubernetes.io/projected/43bdc0ea-3cbf-4656-afdd-8e08712a10ed-kube-api-access-27kzr\") pod \"service-ca-865cb79987-g4qml\" (UID: \"43bdc0ea-3cbf-4656-afdd-8e08712a10ed\") " pod="openshift-service-ca/service-ca-865cb79987-g4qml" Apr 17 21:37:17.798755 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:17.798702 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/43bdc0ea-3cbf-4656-afdd-8e08712a10ed-signing-cabundle\") pod \"service-ca-865cb79987-g4qml\" (UID: \"43bdc0ea-3cbf-4656-afdd-8e08712a10ed\") " pod="openshift-service-ca/service-ca-865cb79987-g4qml" Apr 17 21:37:17.799012 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:17.798784 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/43bdc0ea-3cbf-4656-afdd-8e08712a10ed-signing-key\") pod \"service-ca-865cb79987-g4qml\" (UID: \"43bdc0ea-3cbf-4656-afdd-8e08712a10ed\") " pod="openshift-service-ca/service-ca-865cb79987-g4qml" Apr 17 21:37:17.799012 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:17.798811 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-27kzr\" (UniqueName: \"kubernetes.io/projected/43bdc0ea-3cbf-4656-afdd-8e08712a10ed-kube-api-access-27kzr\") pod \"service-ca-865cb79987-g4qml\" (UID: \"43bdc0ea-3cbf-4656-afdd-8e08712a10ed\") " pod="openshift-service-ca/service-ca-865cb79987-g4qml" Apr 17 21:37:17.799612 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:17.799548 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/43bdc0ea-3cbf-4656-afdd-8e08712a10ed-signing-cabundle\") pod \"service-ca-865cb79987-g4qml\" (UID: \"43bdc0ea-3cbf-4656-afdd-8e08712a10ed\") " pod="openshift-service-ca/service-ca-865cb79987-g4qml" Apr 17 21:37:17.801844 ip-10-0-132-27 
kubenswrapper[2564]: I0417 21:37:17.801821 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/43bdc0ea-3cbf-4656-afdd-8e08712a10ed-signing-key\") pod \"service-ca-865cb79987-g4qml\" (UID: \"43bdc0ea-3cbf-4656-afdd-8e08712a10ed\") " pod="openshift-service-ca/service-ca-865cb79987-g4qml" Apr 17 21:37:17.813306 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:17.813279 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-27kzr\" (UniqueName: \"kubernetes.io/projected/43bdc0ea-3cbf-4656-afdd-8e08712a10ed-kube-api-access-27kzr\") pod \"service-ca-865cb79987-g4qml\" (UID: \"43bdc0ea-3cbf-4656-afdd-8e08712a10ed\") " pod="openshift-service-ca/service-ca-865cb79987-g4qml" Apr 17 21:37:17.819951 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:17.819927 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-g4qml" Apr 17 21:37:17.831867 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:17.831836 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-8d4bc" event={"ID":"202e1a9f-d233-463c-8e71-87d017274c62","Type":"ContainerStarted","Data":"c5c85a3ef088a30900c182328fc9fb287f32ea5a0828b61e21a9acad51f1494b"} Apr 17 21:37:17.899415 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:17.899379 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d95e33bb-b1a4-4e97-8fbf-eab2a80b4bdf-cert\") pod \"ingress-canary-9fcp4\" (UID: \"d95e33bb-b1a4-4e97-8fbf-eab2a80b4bdf\") " pod="openshift-ingress-canary/ingress-canary-9fcp4" Apr 17 21:37:17.899415 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:17.899428 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d9e8a464-4161-4835-bb2a-311f468b76b3-metrics-tls\") pod 
\"dns-default-wcgv2\" (UID: \"d9e8a464-4161-4835-bb2a-311f468b76b3\") " pod="openshift-dns/dns-default-wcgv2" Apr 17 21:37:17.899701 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:17.899475 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb-registry-tls\") pod \"image-registry-674876ddd5-8bzls\" (UID: \"b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb\") " pod="openshift-image-registry/image-registry-674876ddd5-8bzls" Apr 17 21:37:17.899701 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:37:17.899557 2564 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 21:37:17.899701 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:37:17.899575 2564 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 21:37:17.899701 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:37:17.899583 2564 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 21:37:17.899701 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:37:17.899587 2564 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-674876ddd5-8bzls: secret "image-registry-tls" not found Apr 17 21:37:17.899701 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:37:17.899644 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d95e33bb-b1a4-4e97-8fbf-eab2a80b4bdf-cert podName:d95e33bb-b1a4-4e97-8fbf-eab2a80b4bdf nodeName:}" failed. No retries permitted until 2026-04-17 21:37:19.899622205 +0000 UTC m=+36.662381678 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d95e33bb-b1a4-4e97-8fbf-eab2a80b4bdf-cert") pod "ingress-canary-9fcp4" (UID: "d95e33bb-b1a4-4e97-8fbf-eab2a80b4bdf") : secret "canary-serving-cert" not found Apr 17 21:37:17.899701 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:37:17.899662 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9e8a464-4161-4835-bb2a-311f468b76b3-metrics-tls podName:d9e8a464-4161-4835-bb2a-311f468b76b3 nodeName:}" failed. No retries permitted until 2026-04-17 21:37:19.899653623 +0000 UTC m=+36.662413085 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d9e8a464-4161-4835-bb2a-311f468b76b3-metrics-tls") pod "dns-default-wcgv2" (UID: "d9e8a464-4161-4835-bb2a-311f468b76b3") : secret "dns-default-metrics-tls" not found Apr 17 21:37:17.899701 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:37:17.899678 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb-registry-tls podName:b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb nodeName:}" failed. No retries permitted until 2026-04-17 21:37:19.899668666 +0000 UTC m=+36.662428133 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb-registry-tls") pod "image-registry-674876ddd5-8bzls" (UID: "b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb") : secret "image-registry-tls" not found Apr 17 21:37:18.605060 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:18.605010 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/2296bc55-e2c4-4c73-b08d-0e0583540f74-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-tlkjn\" (UID: \"2296bc55-e2c4-4c73-b08d-0e0583540f74\") " pod="openshift-insights/insights-runtime-extractor-tlkjn" Apr 17 21:37:18.605537 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:37:18.605157 2564 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 17 21:37:18.605537 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:37:18.605239 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2296bc55-e2c4-4c73-b08d-0e0583540f74-insights-runtime-extractor-tls podName:2296bc55-e2c4-4c73-b08d-0e0583540f74 nodeName:}" failed. No retries permitted until 2026-04-17 21:37:20.605218529 +0000 UTC m=+37.367978000 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/2296bc55-e2c4-4c73-b08d-0e0583540f74-insights-runtime-extractor-tls") pod "insights-runtime-extractor-tlkjn" (UID: "2296bc55-e2c4-4c73-b08d-0e0583540f74") : secret "insights-runtime-extractor-tls" not found Apr 17 21:37:19.917542 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:19.917512 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d9e8a464-4161-4835-bb2a-311f468b76b3-metrics-tls\") pod \"dns-default-wcgv2\" (UID: \"d9e8a464-4161-4835-bb2a-311f468b76b3\") " pod="openshift-dns/dns-default-wcgv2" Apr 17 21:37:19.918088 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:19.917588 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb-registry-tls\") pod \"image-registry-674876ddd5-8bzls\" (UID: \"b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb\") " pod="openshift-image-registry/image-registry-674876ddd5-8bzls" Apr 17 21:37:19.918088 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:19.917676 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d95e33bb-b1a4-4e97-8fbf-eab2a80b4bdf-cert\") pod \"ingress-canary-9fcp4\" (UID: \"d95e33bb-b1a4-4e97-8fbf-eab2a80b4bdf\") " pod="openshift-ingress-canary/ingress-canary-9fcp4" Apr 17 21:37:19.918088 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:37:19.917699 2564 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 21:37:19.918088 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:37:19.917720 2564 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-674876ddd5-8bzls: secret "image-registry-tls" not found Apr 17 21:37:19.918088 
ip-10-0-132-27 kubenswrapper[2564]: E0417 21:37:19.917773 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb-registry-tls podName:b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb nodeName:}" failed. No retries permitted until 2026-04-17 21:37:23.917755041 +0000 UTC m=+40.680514512 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb-registry-tls") pod "image-registry-674876ddd5-8bzls" (UID: "b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb") : secret "image-registry-tls" not found Apr 17 21:37:19.918088 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:37:19.917697 2564 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 21:37:19.918088 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:37:19.917787 2564 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 21:37:19.918088 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:37:19.917835 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d95e33bb-b1a4-4e97-8fbf-eab2a80b4bdf-cert podName:d95e33bb-b1a4-4e97-8fbf-eab2a80b4bdf nodeName:}" failed. No retries permitted until 2026-04-17 21:37:23.917820044 +0000 UTC m=+40.680579500 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d95e33bb-b1a4-4e97-8fbf-eab2a80b4bdf-cert") pod "ingress-canary-9fcp4" (UID: "d95e33bb-b1a4-4e97-8fbf-eab2a80b4bdf") : secret "canary-serving-cert" not found Apr 17 21:37:19.918088 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:37:19.917850 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9e8a464-4161-4835-bb2a-311f468b76b3-metrics-tls podName:d9e8a464-4161-4835-bb2a-311f468b76b3 nodeName:}" failed. 
No retries permitted until 2026-04-17 21:37:23.917843997 +0000 UTC m=+40.680603453 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d9e8a464-4161-4835-bb2a-311f468b76b3-metrics-tls") pod "dns-default-wcgv2" (UID: "d9e8a464-4161-4835-bb2a-311f468b76b3") : secret "dns-default-metrics-tls" not found Apr 17 21:37:20.624162 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:20.624129 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/2296bc55-e2c4-4c73-b08d-0e0583540f74-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-tlkjn\" (UID: \"2296bc55-e2c4-4c73-b08d-0e0583540f74\") " pod="openshift-insights/insights-runtime-extractor-tlkjn" Apr 17 21:37:20.624337 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:37:20.624302 2564 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 17 21:37:20.624390 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:37:20.624380 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2296bc55-e2c4-4c73-b08d-0e0583540f74-insights-runtime-extractor-tls podName:2296bc55-e2c4-4c73-b08d-0e0583540f74 nodeName:}" failed. No retries permitted until 2026-04-17 21:37:24.624363939 +0000 UTC m=+41.387123395 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/2296bc55-e2c4-4c73-b08d-0e0583540f74-insights-runtime-extractor-tls") pod "insights-runtime-extractor-tlkjn" (UID: "2296bc55-e2c4-4c73-b08d-0e0583540f74") : secret "insights-runtime-extractor-tls" not found Apr 17 21:37:21.636361 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:21.636281 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-g4qml"] Apr 17 21:37:21.639517 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:37:21.639483 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43bdc0ea_3cbf_4656_afdd_8e08712a10ed.slice/crio-218667d2b79548346af9e281e8349f9e207fb50330c5245c2d9169a9532cb5dc WatchSource:0}: Error finding container 218667d2b79548346af9e281e8349f9e207fb50330c5245c2d9169a9532cb5dc: Status 404 returned error can't find the container with id 218667d2b79548346af9e281e8349f9e207fb50330c5245c2d9169a9532cb5dc Apr 17 21:37:21.645247 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:21.645219 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-ddrrn"] Apr 17 21:37:21.661471 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:37:21.661436 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod803936af_5a7f_4be9_bc47_8ca0f94064a9.slice/crio-19cdf197fcf7c42d7504d2f731eaaa3a7303928cabac060f75fd4d4ec79175d6 WatchSource:0}: Error finding container 19cdf197fcf7c42d7504d2f731eaaa3a7303928cabac060f75fd4d4ec79175d6: Status 404 returned error can't find the container with id 19cdf197fcf7c42d7504d2f731eaaa3a7303928cabac060f75fd4d4ec79175d6 Apr 17 21:37:21.841926 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:21.841896 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-4wvmp" event={"ID":"1ced47e5-b9ea-4efa-8587-2c824560fd6c","Type":"ContainerStarted","Data":"4132fc8c8a43a649e2005ed811f96a6f337edfb1004ffe3a90c109c4ca8bbab3"} Apr 17 21:37:21.843251 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:21.843228 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-8d4bc" event={"ID":"202e1a9f-d233-463c-8e71-87d017274c62","Type":"ContainerStarted","Data":"a727cfb06dda6621ab41c4675ab88d119c685c8bed9c97cff236beabcc11df36"} Apr 17 21:37:21.844226 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:21.844202 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-ddrrn" event={"ID":"803936af-5a7f-4be9-bc47-8ca0f94064a9","Type":"ContainerStarted","Data":"19cdf197fcf7c42d7504d2f731eaaa3a7303928cabac060f75fd4d4ec79175d6"} Apr 17 21:37:21.845035 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:21.845015 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-g4qml" event={"ID":"43bdc0ea-3cbf-4656-afdd-8e08712a10ed","Type":"ContainerStarted","Data":"218667d2b79548346af9e281e8349f9e207fb50330c5245c2d9169a9532cb5dc"} Apr 17 21:37:21.875913 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:21.875865 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-8d4bc" podStartSLOduration=1.5310295310000002 podStartE2EDuration="5.875853245s" podCreationTimestamp="2026-04-17 21:37:16 +0000 UTC" firstStartedPulling="2026-04-17 21:37:17.144265728 +0000 UTC m=+33.907025183" lastFinishedPulling="2026-04-17 21:37:21.489089432 +0000 UTC m=+38.251848897" observedRunningTime="2026-04-17 21:37:21.874989122 +0000 UTC m=+38.637748599" watchObservedRunningTime="2026-04-17 21:37:21.875853245 +0000 UTC m=+38.638612723" Apr 17 21:37:22.850847 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:22.850742 2564 generic.go:358] 
"Generic (PLEG): container finished" podID="1ced47e5-b9ea-4efa-8587-2c824560fd6c" containerID="4132fc8c8a43a649e2005ed811f96a6f337edfb1004ffe3a90c109c4ca8bbab3" exitCode=0 Apr 17 21:37:22.851282 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:22.850750 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4wvmp" event={"ID":"1ced47e5-b9ea-4efa-8587-2c824560fd6c","Type":"ContainerDied","Data":"4132fc8c8a43a649e2005ed811f96a6f337edfb1004ffe3a90c109c4ca8bbab3"} Apr 17 21:37:23.856310 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:23.856265 2564 generic.go:358] "Generic (PLEG): container finished" podID="1ced47e5-b9ea-4efa-8587-2c824560fd6c" containerID="00036510831fe318cee837ec73c3222d61a83d260b0f2151914994399247ecef" exitCode=0 Apr 17 21:37:23.856791 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:23.856331 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4wvmp" event={"ID":"1ced47e5-b9ea-4efa-8587-2c824560fd6c","Type":"ContainerDied","Data":"00036510831fe318cee837ec73c3222d61a83d260b0f2151914994399247ecef"} Apr 17 21:37:23.952643 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:23.952602 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d95e33bb-b1a4-4e97-8fbf-eab2a80b4bdf-cert\") pod \"ingress-canary-9fcp4\" (UID: \"d95e33bb-b1a4-4e97-8fbf-eab2a80b4bdf\") " pod="openshift-ingress-canary/ingress-canary-9fcp4" Apr 17 21:37:23.952830 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:23.952664 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d9e8a464-4161-4835-bb2a-311f468b76b3-metrics-tls\") pod \"dns-default-wcgv2\" (UID: \"d9e8a464-4161-4835-bb2a-311f468b76b3\") " pod="openshift-dns/dns-default-wcgv2" Apr 17 21:37:23.952830 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:23.952764 2564 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb-registry-tls\") pod \"image-registry-674876ddd5-8bzls\" (UID: \"b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb\") " pod="openshift-image-registry/image-registry-674876ddd5-8bzls" Apr 17 21:37:23.952830 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:37:23.952819 2564 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 21:37:23.952993 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:37:23.952860 2564 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 21:37:23.952993 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:37:23.952882 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d95e33bb-b1a4-4e97-8fbf-eab2a80b4bdf-cert podName:d95e33bb-b1a4-4e97-8fbf-eab2a80b4bdf nodeName:}" failed. No retries permitted until 2026-04-17 21:37:31.95286497 +0000 UTC m=+48.715624429 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/d95e33bb-b1a4-4e97-8fbf-eab2a80b4bdf-cert") pod "ingress-canary-9fcp4" (UID: "d95e33bb-b1a4-4e97-8fbf-eab2a80b4bdf") : secret "canary-serving-cert" not found Apr 17 21:37:23.952993 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:37:23.952915 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9e8a464-4161-4835-bb2a-311f468b76b3-metrics-tls podName:d9e8a464-4161-4835-bb2a-311f468b76b3 nodeName:}" failed. No retries permitted until 2026-04-17 21:37:31.952898342 +0000 UTC m=+48.715657798 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/d9e8a464-4161-4835-bb2a-311f468b76b3-metrics-tls") pod "dns-default-wcgv2" (UID: "d9e8a464-4161-4835-bb2a-311f468b76b3") : secret "dns-default-metrics-tls" not found Apr 17 21:37:23.952993 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:37:23.952966 2564 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 21:37:23.952993 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:37:23.952983 2564 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-674876ddd5-8bzls: secret "image-registry-tls" not found Apr 17 21:37:23.953167 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:37:23.953028 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb-registry-tls podName:b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb nodeName:}" failed. No retries permitted until 2026-04-17 21:37:31.953020832 +0000 UTC m=+48.715780288 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb-registry-tls") pod "image-registry-674876ddd5-8bzls" (UID: "b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb") : secret "image-registry-tls" not found Apr 17 21:37:24.656555 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:24.656532 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/2296bc55-e2c4-4c73-b08d-0e0583540f74-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-tlkjn\" (UID: \"2296bc55-e2c4-4c73-b08d-0e0583540f74\") " pod="openshift-insights/insights-runtime-extractor-tlkjn" Apr 17 21:37:24.656699 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:37:24.656678 2564 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found Apr 17 21:37:24.656807 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:37:24.656750 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2296bc55-e2c4-4c73-b08d-0e0583540f74-insights-runtime-extractor-tls podName:2296bc55-e2c4-4c73-b08d-0e0583540f74 nodeName:}" failed. No retries permitted until 2026-04-17 21:37:32.656731632 +0000 UTC m=+49.419491101 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/2296bc55-e2c4-4c73-b08d-0e0583540f74-insights-runtime-extractor-tls") pod "insights-runtime-extractor-tlkjn" (UID: "2296bc55-e2c4-4c73-b08d-0e0583540f74") : secret "insights-runtime-extractor-tls" not found Apr 17 21:37:24.859462 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:24.859425 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-ddrrn" event={"ID":"803936af-5a7f-4be9-bc47-8ca0f94064a9","Type":"ContainerStarted","Data":"1fd7eb7b2fa5e360dc9d152bd9cd9707d5cc7ae4608e358355ff040c40fb4272"} Apr 17 21:37:24.859936 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:24.859541 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-ddrrn" Apr 17 21:37:24.860812 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:24.860789 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-g4qml" event={"ID":"43bdc0ea-3cbf-4656-afdd-8e08712a10ed","Type":"ContainerStarted","Data":"257bfbbeb198d44940ede2ef7226bf966be384db57bd38b3780be1794830e135"} Apr 17 21:37:24.863755 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:24.863735 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4wvmp" event={"ID":"1ced47e5-b9ea-4efa-8587-2c824560fd6c","Type":"ContainerStarted","Data":"33fb3d751a02ad2ca9c20d818f33e3204d4d3a3cbc8deb4e9c71af31dd33d6b5"} Apr 17 21:37:24.874261 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:24.874224 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-ddrrn" podStartSLOduration=38.954289613 podStartE2EDuration="41.874211584s" podCreationTimestamp="2026-04-17 21:36:43 +0000 UTC" firstStartedPulling="2026-04-17 21:37:21.663410529 +0000 UTC m=+38.426169986" 
lastFinishedPulling="2026-04-17 21:37:24.583332488 +0000 UTC m=+41.346091957" observedRunningTime="2026-04-17 21:37:24.873285797 +0000 UTC m=+41.636045277" watchObservedRunningTime="2026-04-17 21:37:24.874211584 +0000 UTC m=+41.636971097" Apr 17 21:37:24.893894 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:24.893854 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-4wvmp" podStartSLOduration=6.707118266 podStartE2EDuration="41.893841388s" podCreationTimestamp="2026-04-17 21:36:43 +0000 UTC" firstStartedPulling="2026-04-17 21:36:46.306003248 +0000 UTC m=+3.068762709" lastFinishedPulling="2026-04-17 21:37:21.492726372 +0000 UTC m=+38.255485831" observedRunningTime="2026-04-17 21:37:24.892000715 +0000 UTC m=+41.654760195" watchObservedRunningTime="2026-04-17 21:37:24.893841388 +0000 UTC m=+41.656600890" Apr 17 21:37:24.906491 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:24.906452 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-g4qml" podStartSLOduration=4.970948898 podStartE2EDuration="7.906439399s" podCreationTimestamp="2026-04-17 21:37:17 +0000 UTC" firstStartedPulling="2026-04-17 21:37:21.641380604 +0000 UTC m=+38.404140060" lastFinishedPulling="2026-04-17 21:37:24.576871092 +0000 UTC m=+41.339630561" observedRunningTime="2026-04-17 21:37:24.905974193 +0000 UTC m=+41.668733672" watchObservedRunningTime="2026-04-17 21:37:24.906439399 +0000 UTC m=+41.669198876" Apr 17 21:37:32.016304 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:32.016267 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb-registry-tls\") pod \"image-registry-674876ddd5-8bzls\" (UID: \"b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb\") " pod="openshift-image-registry/image-registry-674876ddd5-8bzls" Apr 17 21:37:32.016814 ip-10-0-132-27 
kubenswrapper[2564]: I0417 21:37:32.016365 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d95e33bb-b1a4-4e97-8fbf-eab2a80b4bdf-cert\") pod \"ingress-canary-9fcp4\" (UID: \"d95e33bb-b1a4-4e97-8fbf-eab2a80b4bdf\") " pod="openshift-ingress-canary/ingress-canary-9fcp4" Apr 17 21:37:32.016814 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:32.016387 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d9e8a464-4161-4835-bb2a-311f468b76b3-metrics-tls\") pod \"dns-default-wcgv2\" (UID: \"d9e8a464-4161-4835-bb2a-311f468b76b3\") " pod="openshift-dns/dns-default-wcgv2" Apr 17 21:37:32.019354 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:32.019320 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d9e8a464-4161-4835-bb2a-311f468b76b3-metrics-tls\") pod \"dns-default-wcgv2\" (UID: \"d9e8a464-4161-4835-bb2a-311f468b76b3\") " pod="openshift-dns/dns-default-wcgv2" Apr 17 21:37:32.019478 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:32.019383 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb-registry-tls\") pod \"image-registry-674876ddd5-8bzls\" (UID: \"b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb\") " pod="openshift-image-registry/image-registry-674876ddd5-8bzls" Apr 17 21:37:32.019478 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:32.019401 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d95e33bb-b1a4-4e97-8fbf-eab2a80b4bdf-cert\") pod \"ingress-canary-9fcp4\" (UID: \"d95e33bb-b1a4-4e97-8fbf-eab2a80b4bdf\") " pod="openshift-ingress-canary/ingress-canary-9fcp4" Apr 17 21:37:32.070487 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:32.070454 2564 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-674876ddd5-8bzls" Apr 17 21:37:32.096232 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:32.096144 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-wcgv2" Apr 17 21:37:32.097757 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:32.097738 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9fcp4" Apr 17 21:37:32.203979 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:32.203947 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-674876ddd5-8bzls"] Apr 17 21:37:32.207250 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:37:32.207218 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7940b5f_ae17_4c7d_92e0_c15c7b32c5cb.slice/crio-45e50b47eab516b20f6f2b020d92cfc20b974e212ea1c6a92a380bd071dd1fe3 WatchSource:0}: Error finding container 45e50b47eab516b20f6f2b020d92cfc20b974e212ea1c6a92a380bd071dd1fe3: Status 404 returned error can't find the container with id 45e50b47eab516b20f6f2b020d92cfc20b974e212ea1c6a92a380bd071dd1fe3 Apr 17 21:37:32.221540 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:32.221509 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-9fcp4"] Apr 17 21:37:32.224402 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:37:32.224372 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd95e33bb_b1a4_4e97_8fbf_eab2a80b4bdf.slice/crio-0af32229a63b5b8b8b9e8ddf2b933ca796031d844544edaa76749b3ee81a9135 WatchSource:0}: Error finding container 0af32229a63b5b8b8b9e8ddf2b933ca796031d844544edaa76749b3ee81a9135: Status 404 returned error can't find the container with id 
0af32229a63b5b8b8b9e8ddf2b933ca796031d844544edaa76749b3ee81a9135 Apr 17 21:37:32.239290 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:32.239268 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-wcgv2"] Apr 17 21:37:32.243231 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:37:32.243208 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9e8a464_4161_4835_bb2a_311f468b76b3.slice/crio-bd6803a101c5adc5c224015c7c09bf1b1355f51c5d6457b0ce0c504250742ded WatchSource:0}: Error finding container bd6803a101c5adc5c224015c7c09bf1b1355f51c5d6457b0ce0c504250742ded: Status 404 returned error can't find the container with id bd6803a101c5adc5c224015c7c09bf1b1355f51c5d6457b0ce0c504250742ded Apr 17 21:37:32.721586 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:32.721548 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/2296bc55-e2c4-4c73-b08d-0e0583540f74-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-tlkjn\" (UID: \"2296bc55-e2c4-4c73-b08d-0e0583540f74\") " pod="openshift-insights/insights-runtime-extractor-tlkjn" Apr 17 21:37:32.724181 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:32.724122 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/2296bc55-e2c4-4c73-b08d-0e0583540f74-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-tlkjn\" (UID: \"2296bc55-e2c4-4c73-b08d-0e0583540f74\") " pod="openshift-insights/insights-runtime-extractor-tlkjn" Apr 17 21:37:32.811877 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:32.811843 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-tlkjn" Apr 17 21:37:32.880648 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:32.880584 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-9fcp4" event={"ID":"d95e33bb-b1a4-4e97-8fbf-eab2a80b4bdf","Type":"ContainerStarted","Data":"0af32229a63b5b8b8b9e8ddf2b933ca796031d844544edaa76749b3ee81a9135"} Apr 17 21:37:32.881865 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:32.881819 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wcgv2" event={"ID":"d9e8a464-4161-4835-bb2a-311f468b76b3","Type":"ContainerStarted","Data":"bd6803a101c5adc5c224015c7c09bf1b1355f51c5d6457b0ce0c504250742ded"} Apr 17 21:37:32.885097 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:32.885068 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-674876ddd5-8bzls" event={"ID":"b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb","Type":"ContainerStarted","Data":"f1fa4dedcc0404539606f77d3b746395059742f55f8816b200538a2da82d2f88"} Apr 17 21:37:32.885188 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:32.885105 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-674876ddd5-8bzls" event={"ID":"b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb","Type":"ContainerStarted","Data":"45e50b47eab516b20f6f2b020d92cfc20b974e212ea1c6a92a380bd071dd1fe3"} Apr 17 21:37:32.885314 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:32.885292 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-674876ddd5-8bzls" Apr 17 21:37:32.906049 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:32.905699 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-674876ddd5-8bzls" podStartSLOduration=45.905687075 podStartE2EDuration="45.905687075s" podCreationTimestamp="2026-04-17 
21:36:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 21:37:32.903989942 +0000 UTC m=+49.666749419" watchObservedRunningTime="2026-04-17 21:37:32.905687075 +0000 UTC m=+49.668446552" Apr 17 21:37:32.961300 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:32.961266 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-tlkjn"] Apr 17 21:37:32.964849 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:37:32.964817 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2296bc55_e2c4_4c73_b08d_0e0583540f74.slice/crio-413f6e84c4856bb76cccecd5b2aafd108a1f5e4a623b8d09586200a27a064c98 WatchSource:0}: Error finding container 413f6e84c4856bb76cccecd5b2aafd108a1f5e4a623b8d09586200a27a064c98: Status 404 returned error can't find the container with id 413f6e84c4856bb76cccecd5b2aafd108a1f5e4a623b8d09586200a27a064c98 Apr 17 21:37:33.890844 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:33.890810 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-tlkjn" event={"ID":"2296bc55-e2c4-4c73-b08d-0e0583540f74","Type":"ContainerStarted","Data":"71cab35d0a1f63b0acfa20669d82f76fc29135286834cc72445b22054794bbde"} Apr 17 21:37:33.890844 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:33.890847 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-tlkjn" event={"ID":"2296bc55-e2c4-4c73-b08d-0e0583540f74","Type":"ContainerStarted","Data":"413f6e84c4856bb76cccecd5b2aafd108a1f5e4a623b8d09586200a27a064c98"} Apr 17 21:37:34.894831 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:34.894789 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wcgv2" 
event={"ID":"d9e8a464-4161-4835-bb2a-311f468b76b3","Type":"ContainerStarted","Data":"c0a07fe182fd89555bc58e8d5152fd4ea0c4735a49dcd2dc552245b164e5757b"} Apr 17 21:37:34.896256 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:34.896227 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-9fcp4" event={"ID":"d95e33bb-b1a4-4e97-8fbf-eab2a80b4bdf","Type":"ContainerStarted","Data":"5072d10033acb1fdc0495dfbdf9281bd416499a76d2826f6defb329a60e6d4b5"} Apr 17 21:37:34.911779 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:34.911719 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-9fcp4" podStartSLOduration=16.384713027 podStartE2EDuration="18.911701763s" podCreationTimestamp="2026-04-17 21:37:16 +0000 UTC" firstStartedPulling="2026-04-17 21:37:32.226388243 +0000 UTC m=+48.989147698" lastFinishedPulling="2026-04-17 21:37:34.753376974 +0000 UTC m=+51.516136434" observedRunningTime="2026-04-17 21:37:34.911660395 +0000 UTC m=+51.674419884" watchObservedRunningTime="2026-04-17 21:37:34.911701763 +0000 UTC m=+51.674461241" Apr 17 21:37:35.902620 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:35.902568 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-tlkjn" event={"ID":"2296bc55-e2c4-4c73-b08d-0e0583540f74","Type":"ContainerStarted","Data":"f132747fcaebad220c8467a10d1f3eabadebe4d09723b119af9e566b920d4a3c"} Apr 17 21:37:35.904654 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:35.904617 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-wcgv2" event={"ID":"d9e8a464-4161-4835-bb2a-311f468b76b3","Type":"ContainerStarted","Data":"cc0de9c08ffa7527bec0c617893e0ad3a1bec2316277d56d33567e3359c48831"} Apr 17 21:37:35.922720 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:35.922663 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-dns/dns-default-wcgv2" podStartSLOduration=17.42247825 podStartE2EDuration="19.922646523s" podCreationTimestamp="2026-04-17 21:37:16 +0000 UTC" firstStartedPulling="2026-04-17 21:37:32.247871876 +0000 UTC m=+49.010631335" lastFinishedPulling="2026-04-17 21:37:34.748040152 +0000 UTC m=+51.510799608" observedRunningTime="2026-04-17 21:37:35.920896004 +0000 UTC m=+52.683655485" watchObservedRunningTime="2026-04-17 21:37:35.922646523 +0000 UTC m=+52.685406004" Apr 17 21:37:36.907389 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:36.907364 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-wcgv2" Apr 17 21:37:37.741886 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:37.741856 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-674876ddd5-8bzls"] Apr 17 21:37:37.747177 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:37.747147 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-lkn7k"] Apr 17 21:37:37.769869 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:37.769846 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-lkn7k"] Apr 17 21:37:37.770000 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:37.769937 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-lkn7k" Apr 17 21:37:37.772702 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:37.772678 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-qmdrt\"" Apr 17 21:37:37.773001 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:37.772986 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 17 21:37:37.773263 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:37.773245 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 17 21:37:37.783340 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:37.783322 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-798d665f68-m7frd"] Apr 17 21:37:37.800469 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:37.800431 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-798d665f68-m7frd" Apr 17 21:37:37.801404 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:37.801382 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-798d665f68-m7frd"] Apr 17 21:37:37.844283 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:37.844256 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-cqv24"] Apr 17 21:37:37.862291 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:37.862269 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-cqv24"] Apr 17 21:37:37.862415 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:37.862373 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-cqv24" Apr 17 21:37:37.863348 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:37.863328 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ab02dab3-df62-4fe4-91ca-ca5f92bce3f2-installation-pull-secrets\") pod \"image-registry-798d665f68-m7frd\" (UID: \"ab02dab3-df62-4fe4-91ca-ca5f92bce3f2\") " pod="openshift-image-registry/image-registry-798d665f68-m7frd" Apr 17 21:37:37.863462 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:37.863364 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/9140d7c7-facf-4eae-9286-221c2b1004b9-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-lkn7k\" (UID: \"9140d7c7-facf-4eae-9286-221c2b1004b9\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-lkn7k" Apr 17 21:37:37.863462 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:37.863391 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwppz\" (UniqueName: \"kubernetes.io/projected/ab02dab3-df62-4fe4-91ca-ca5f92bce3f2-kube-api-access-qwppz\") pod \"image-registry-798d665f68-m7frd\" (UID: \"ab02dab3-df62-4fe4-91ca-ca5f92bce3f2\") " pod="openshift-image-registry/image-registry-798d665f68-m7frd" Apr 17 21:37:37.863558 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:37.863482 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ab02dab3-df62-4fe4-91ca-ca5f92bce3f2-image-registry-private-configuration\") pod \"image-registry-798d665f68-m7frd\" (UID: \"ab02dab3-df62-4fe4-91ca-ca5f92bce3f2\") " pod="openshift-image-registry/image-registry-798d665f68-m7frd" Apr 17 
21:37:37.863558 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:37.863508 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ab02dab3-df62-4fe4-91ca-ca5f92bce3f2-bound-sa-token\") pod \"image-registry-798d665f68-m7frd\" (UID: \"ab02dab3-df62-4fe4-91ca-ca5f92bce3f2\") " pod="openshift-image-registry/image-registry-798d665f68-m7frd" Apr 17 21:37:37.863558 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:37.863531 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/9140d7c7-facf-4eae-9286-221c2b1004b9-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-lkn7k\" (UID: \"9140d7c7-facf-4eae-9286-221c2b1004b9\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-lkn7k" Apr 17 21:37:37.863730 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:37.863556 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ab02dab3-df62-4fe4-91ca-ca5f92bce3f2-trusted-ca\") pod \"image-registry-798d665f68-m7frd\" (UID: \"ab02dab3-df62-4fe4-91ca-ca5f92bce3f2\") " pod="openshift-image-registry/image-registry-798d665f68-m7frd" Apr 17 21:37:37.863730 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:37.863615 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ab02dab3-df62-4fe4-91ca-ca5f92bce3f2-ca-trust-extracted\") pod \"image-registry-798d665f68-m7frd\" (UID: \"ab02dab3-df62-4fe4-91ca-ca5f92bce3f2\") " pod="openshift-image-registry/image-registry-798d665f68-m7frd" Apr 17 21:37:37.863730 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:37.863652 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" 
(UniqueName: \"kubernetes.io/projected/ab02dab3-df62-4fe4-91ca-ca5f92bce3f2-registry-tls\") pod \"image-registry-798d665f68-m7frd\" (UID: \"ab02dab3-df62-4fe4-91ca-ca5f92bce3f2\") " pod="openshift-image-registry/image-registry-798d665f68-m7frd" Apr 17 21:37:37.863730 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:37.863689 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ab02dab3-df62-4fe4-91ca-ca5f92bce3f2-registry-certificates\") pod \"image-registry-798d665f68-m7frd\" (UID: \"ab02dab3-df62-4fe4-91ca-ca5f92bce3f2\") " pod="openshift-image-registry/image-registry-798d665f68-m7frd" Apr 17 21:37:37.865722 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:37.865704 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-bslkv\"" Apr 17 21:37:37.865916 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:37.865903 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 17 21:37:37.865989 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:37.865954 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 17 21:37:37.911024 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:37.910990 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-tlkjn" event={"ID":"2296bc55-e2c4-4c73-b08d-0e0583540f74","Type":"ContainerStarted","Data":"ab4f515929cac9ad3dcfc86a9a6b725b134ebd7c7fc02c3adec5248937bf82c9"} Apr 17 21:37:37.934507 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:37.934457 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-tlkjn" podStartSLOduration=18.053594331 podStartE2EDuration="21.93444298s" podCreationTimestamp="2026-04-17 
21:37:16 +0000 UTC" firstStartedPulling="2026-04-17 21:37:33.108652329 +0000 UTC m=+49.871411785" lastFinishedPulling="2026-04-17 21:37:36.98950097 +0000 UTC m=+53.752260434" observedRunningTime="2026-04-17 21:37:37.933495293 +0000 UTC m=+54.696254770" watchObservedRunningTime="2026-04-17 21:37:37.93444298 +0000 UTC m=+54.697202461" Apr 17 21:37:37.964279 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:37.964250 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/9140d7c7-facf-4eae-9286-221c2b1004b9-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-lkn7k\" (UID: \"9140d7c7-facf-4eae-9286-221c2b1004b9\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-lkn7k" Apr 17 21:37:37.964279 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:37.964282 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ab02dab3-df62-4fe4-91ca-ca5f92bce3f2-trusted-ca\") pod \"image-registry-798d665f68-m7frd\" (UID: \"ab02dab3-df62-4fe4-91ca-ca5f92bce3f2\") " pod="openshift-image-registry/image-registry-798d665f68-m7frd" Apr 17 21:37:37.964495 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:37.964308 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jhtz\" (UniqueName: \"kubernetes.io/projected/db3b129d-4a40-4b37-82a8-f37d592345aa-kube-api-access-8jhtz\") pod \"downloads-6bcc868b7-cqv24\" (UID: \"db3b129d-4a40-4b37-82a8-f37d592345aa\") " pod="openshift-console/downloads-6bcc868b7-cqv24" Apr 17 21:37:37.964495 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:37.964423 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ab02dab3-df62-4fe4-91ca-ca5f92bce3f2-ca-trust-extracted\") pod \"image-registry-798d665f68-m7frd\" (UID: 
\"ab02dab3-df62-4fe4-91ca-ca5f92bce3f2\") " pod="openshift-image-registry/image-registry-798d665f68-m7frd" Apr 17 21:37:37.964495 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:37.964468 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ab02dab3-df62-4fe4-91ca-ca5f92bce3f2-registry-tls\") pod \"image-registry-798d665f68-m7frd\" (UID: \"ab02dab3-df62-4fe4-91ca-ca5f92bce3f2\") " pod="openshift-image-registry/image-registry-798d665f68-m7frd" Apr 17 21:37:37.964674 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:37.964493 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ab02dab3-df62-4fe4-91ca-ca5f92bce3f2-registry-certificates\") pod \"image-registry-798d665f68-m7frd\" (UID: \"ab02dab3-df62-4fe4-91ca-ca5f92bce3f2\") " pod="openshift-image-registry/image-registry-798d665f68-m7frd" Apr 17 21:37:37.965500 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:37.964829 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/ab02dab3-df62-4fe4-91ca-ca5f92bce3f2-ca-trust-extracted\") pod \"image-registry-798d665f68-m7frd\" (UID: \"ab02dab3-df62-4fe4-91ca-ca5f92bce3f2\") " pod="openshift-image-registry/image-registry-798d665f68-m7frd" Apr 17 21:37:37.965500 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:37.965000 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/9140d7c7-facf-4eae-9286-221c2b1004b9-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-lkn7k\" (UID: \"9140d7c7-facf-4eae-9286-221c2b1004b9\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-lkn7k" Apr 17 21:37:37.965500 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:37.965118 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ab02dab3-df62-4fe4-91ca-ca5f92bce3f2-installation-pull-secrets\") pod \"image-registry-798d665f68-m7frd\" (UID: \"ab02dab3-df62-4fe4-91ca-ca5f92bce3f2\") " pod="openshift-image-registry/image-registry-798d665f68-m7frd" Apr 17 21:37:37.965500 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:37.965160 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/9140d7c7-facf-4eae-9286-221c2b1004b9-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-lkn7k\" (UID: \"9140d7c7-facf-4eae-9286-221c2b1004b9\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-lkn7k" Apr 17 21:37:37.965500 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:37.965188 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qwppz\" (UniqueName: \"kubernetes.io/projected/ab02dab3-df62-4fe4-91ca-ca5f92bce3f2-kube-api-access-qwppz\") pod \"image-registry-798d665f68-m7frd\" (UID: \"ab02dab3-df62-4fe4-91ca-ca5f92bce3f2\") " pod="openshift-image-registry/image-registry-798d665f68-m7frd" Apr 17 21:37:37.965500 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:37.965240 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ab02dab3-df62-4fe4-91ca-ca5f92bce3f2-image-registry-private-configuration\") pod \"image-registry-798d665f68-m7frd\" (UID: \"ab02dab3-df62-4fe4-91ca-ca5f92bce3f2\") " pod="openshift-image-registry/image-registry-798d665f68-m7frd" Apr 17 21:37:37.965500 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:37.965251 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/ab02dab3-df62-4fe4-91ca-ca5f92bce3f2-registry-certificates\") pod \"image-registry-798d665f68-m7frd\" 
(UID: \"ab02dab3-df62-4fe4-91ca-ca5f92bce3f2\") " pod="openshift-image-registry/image-registry-798d665f68-m7frd" Apr 17 21:37:37.965500 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:37.965283 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ab02dab3-df62-4fe4-91ca-ca5f92bce3f2-bound-sa-token\") pod \"image-registry-798d665f68-m7frd\" (UID: \"ab02dab3-df62-4fe4-91ca-ca5f92bce3f2\") " pod="openshift-image-registry/image-registry-798d665f68-m7frd" Apr 17 21:37:37.965500 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:37.965433 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ab02dab3-df62-4fe4-91ca-ca5f92bce3f2-trusted-ca\") pod \"image-registry-798d665f68-m7frd\" (UID: \"ab02dab3-df62-4fe4-91ca-ca5f92bce3f2\") " pod="openshift-image-registry/image-registry-798d665f68-m7frd" Apr 17 21:37:37.967611 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:37.967552 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/9140d7c7-facf-4eae-9286-221c2b1004b9-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-lkn7k\" (UID: \"9140d7c7-facf-4eae-9286-221c2b1004b9\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-lkn7k" Apr 17 21:37:37.967748 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:37.967687 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/ab02dab3-df62-4fe4-91ca-ca5f92bce3f2-registry-tls\") pod \"image-registry-798d665f68-m7frd\" (UID: \"ab02dab3-df62-4fe4-91ca-ca5f92bce3f2\") " pod="openshift-image-registry/image-registry-798d665f68-m7frd" Apr 17 21:37:37.967748 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:37.967705 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/ab02dab3-df62-4fe4-91ca-ca5f92bce3f2-installation-pull-secrets\") pod \"image-registry-798d665f68-m7frd\" (UID: \"ab02dab3-df62-4fe4-91ca-ca5f92bce3f2\") " pod="openshift-image-registry/image-registry-798d665f68-m7frd" Apr 17 21:37:37.968127 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:37.968107 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/ab02dab3-df62-4fe4-91ca-ca5f92bce3f2-image-registry-private-configuration\") pod \"image-registry-798d665f68-m7frd\" (UID: \"ab02dab3-df62-4fe4-91ca-ca5f92bce3f2\") " pod="openshift-image-registry/image-registry-798d665f68-m7frd" Apr 17 21:37:37.981377 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:37.981356 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ab02dab3-df62-4fe4-91ca-ca5f92bce3f2-bound-sa-token\") pod \"image-registry-798d665f68-m7frd\" (UID: \"ab02dab3-df62-4fe4-91ca-ca5f92bce3f2\") " pod="openshift-image-registry/image-registry-798d665f68-m7frd" Apr 17 21:37:37.982450 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:37.982425 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwppz\" (UniqueName: \"kubernetes.io/projected/ab02dab3-df62-4fe4-91ca-ca5f92bce3f2-kube-api-access-qwppz\") pod \"image-registry-798d665f68-m7frd\" (UID: \"ab02dab3-df62-4fe4-91ca-ca5f92bce3f2\") " pod="openshift-image-registry/image-registry-798d665f68-m7frd" Apr 17 21:37:38.065877 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:38.065798 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8jhtz\" (UniqueName: \"kubernetes.io/projected/db3b129d-4a40-4b37-82a8-f37d592345aa-kube-api-access-8jhtz\") pod \"downloads-6bcc868b7-cqv24\" (UID: \"db3b129d-4a40-4b37-82a8-f37d592345aa\") " 
pod="openshift-console/downloads-6bcc868b7-cqv24" Apr 17 21:37:38.075995 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:38.075973 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jhtz\" (UniqueName: \"kubernetes.io/projected/db3b129d-4a40-4b37-82a8-f37d592345aa-kube-api-access-8jhtz\") pod \"downloads-6bcc868b7-cqv24\" (UID: \"db3b129d-4a40-4b37-82a8-f37d592345aa\") " pod="openshift-console/downloads-6bcc868b7-cqv24" Apr 17 21:37:38.078733 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:38.078719 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-lkn7k" Apr 17 21:37:38.109629 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:38.109586 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-798d665f68-m7frd" Apr 17 21:37:38.172222 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:38.171705 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-cqv24"
Apr 17 21:37:38.224210 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:38.224169 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-lkn7k"]
Apr 17 21:37:38.229059 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:37:38.229023 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9140d7c7_facf_4eae_9286_221c2b1004b9.slice/crio-9dde7a2c52563c1668b93aeb39ae2c34102268dce050330694ed9b7287e90727 WatchSource:0}: Error finding container 9dde7a2c52563c1668b93aeb39ae2c34102268dce050330694ed9b7287e90727: Status 404 returned error can't find the container with id 9dde7a2c52563c1668b93aeb39ae2c34102268dce050330694ed9b7287e90727
Apr 17 21:37:38.239142 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:38.239117 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-798d665f68-m7frd"]
Apr 17 21:37:38.241528 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:37:38.241501 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab02dab3_df62_4fe4_91ca_ca5f92bce3f2.slice/crio-564da8c0f23284bed7d98cc1d9e8b81ef8a9518207cf93f87b1ae5c0881ac673 WatchSource:0}: Error finding container 564da8c0f23284bed7d98cc1d9e8b81ef8a9518207cf93f87b1ae5c0881ac673: Status 404 returned error can't find the container with id 564da8c0f23284bed7d98cc1d9e8b81ef8a9518207cf93f87b1ae5c0881ac673
Apr 17 21:37:38.296552 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:38.296527 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-cqv24"]
Apr 17 21:37:38.299370 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:37:38.299340 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb3b129d_4a40_4b37_82a8_f37d592345aa.slice/crio-f6082a43596557534a62f936fa0b4bb1128a986290c13157f5dab17c7a6a2160 WatchSource:0}: Error finding container f6082a43596557534a62f936fa0b4bb1128a986290c13157f5dab17c7a6a2160: Status 404 returned error can't find the container with id f6082a43596557534a62f936fa0b4bb1128a986290c13157f5dab17c7a6a2160
Apr 17 21:37:38.916645 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:38.916576 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-cqv24" event={"ID":"db3b129d-4a40-4b37-82a8-f37d592345aa","Type":"ContainerStarted","Data":"f6082a43596557534a62f936fa0b4bb1128a986290c13157f5dab17c7a6a2160"}
Apr 17 21:37:38.918854 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:38.918767 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-798d665f68-m7frd" event={"ID":"ab02dab3-df62-4fe4-91ca-ca5f92bce3f2","Type":"ContainerStarted","Data":"60156a22127815ba64a8339b58b581ddbeb6178fac99e82b96c4fed82ecd91da"}
Apr 17 21:37:38.918854 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:38.918807 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-798d665f68-m7frd" event={"ID":"ab02dab3-df62-4fe4-91ca-ca5f92bce3f2","Type":"ContainerStarted","Data":"564da8c0f23284bed7d98cc1d9e8b81ef8a9518207cf93f87b1ae5c0881ac673"}
Apr 17 21:37:38.918854 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:38.918826 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-798d665f68-m7frd"
Apr 17 21:37:38.920186 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:38.920156 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-lkn7k" event={"ID":"9140d7c7-facf-4eae-9286-221c2b1004b9","Type":"ContainerStarted","Data":"9dde7a2c52563c1668b93aeb39ae2c34102268dce050330694ed9b7287e90727"}
Apr 17 21:37:38.937484 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:38.937430 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-798d665f68-m7frd" podStartSLOduration=1.937412873 podStartE2EDuration="1.937412873s" podCreationTimestamp="2026-04-17 21:37:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 21:37:38.936430081 +0000 UTC m=+55.699189559" watchObservedRunningTime="2026-04-17 21:37:38.937412873 +0000 UTC m=+55.700172348"
Apr 17 21:37:39.924696 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:39.924585 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-lkn7k" event={"ID":"9140d7c7-facf-4eae-9286-221c2b1004b9","Type":"ContainerStarted","Data":"6be1d5d8677e59d6bf1286fe110e4fb3e269609cfaecc3f39fb848f8c0740f41"}
Apr 17 21:37:39.939346 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:39.939297 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-lkn7k" podStartSLOduration=1.521112927 podStartE2EDuration="2.939284054s" podCreationTimestamp="2026-04-17 21:37:37 +0000 UTC" firstStartedPulling="2026-04-17 21:37:38.231344586 +0000 UTC m=+54.994104045" lastFinishedPulling="2026-04-17 21:37:39.649515705 +0000 UTC m=+56.412275172" observedRunningTime="2026-04-17 21:37:39.938394225 +0000 UTC m=+56.701153702" watchObservedRunningTime="2026-04-17 21:37:39.939284054 +0000 UTC m=+56.702043568"
Apr 17 21:37:41.835718 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:41.835688 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kwgmn"
Apr 17 21:37:46.913612 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:46.913562 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-wcgv2"
Apr 17 21:37:47.748521 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:47.748485 2564 patch_prober.go:28] interesting pod/image-registry-674876ddd5-8bzls container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 17 21:37:47.748728 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:47.748537 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-674876ddd5-8bzls" podUID="b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 21:37:49.462254 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:49.462212 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ee2090c8-65ec-46e0-9614-f6f0ddae32d7-metrics-certs\") pod \"network-metrics-daemon-hqbt5\" (UID: \"ee2090c8-65ec-46e0-9614-f6f0ddae32d7\") " pod="openshift-multus/network-metrics-daemon-hqbt5"
Apr 17 21:37:49.464725 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:49.464701 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ee2090c8-65ec-46e0-9614-f6f0ddae32d7-metrics-certs\") pod \"network-metrics-daemon-hqbt5\" (UID: \"ee2090c8-65ec-46e0-9614-f6f0ddae32d7\") " pod="openshift-multus/network-metrics-daemon-hqbt5"
Apr 17 21:37:49.702567 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:49.702538 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-nlncx\""
Apr 17 21:37:49.710144 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:49.710118 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-hqbt5"
Apr 17 21:37:53.611093 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:53.611058 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-hqbt5"]
Apr 17 21:37:53.615243 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:37:53.615220 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee2090c8_65ec_46e0_9614_f6f0ddae32d7.slice/crio-5b9c3bf3eb1b6337fc9467284951569d0224517828480ec091f3f26ab8fbcd88 WatchSource:0}: Error finding container 5b9c3bf3eb1b6337fc9467284951569d0224517828480ec091f3f26ab8fbcd88: Status 404 returned error can't find the container with id 5b9c3bf3eb1b6337fc9467284951569d0224517828480ec091f3f26ab8fbcd88
Apr 17 21:37:53.968565 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:53.968521 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-cqv24" event={"ID":"db3b129d-4a40-4b37-82a8-f37d592345aa","Type":"ContainerStarted","Data":"633603be6b89e299e58bcfcaacf1592c1a6123e3e72a14429270982d7585ed1d"}
Apr 17 21:37:53.968772 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:53.968575 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-cqv24"
Apr 17 21:37:53.970039 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:53.970012 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-hqbt5" event={"ID":"ee2090c8-65ec-46e0-9614-f6f0ddae32d7","Type":"ContainerStarted","Data":"5b9c3bf3eb1b6337fc9467284951569d0224517828480ec091f3f26ab8fbcd88"}
Apr 17 21:37:53.983657 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:53.983627 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-cqv24"
Apr 17 21:37:53.988359 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:53.988309 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-cqv24" podStartSLOduration=1.695881161 podStartE2EDuration="16.988291192s" podCreationTimestamp="2026-04-17 21:37:37 +0000 UTC" firstStartedPulling="2026-04-17 21:37:38.301328308 +0000 UTC m=+55.064087768" lastFinishedPulling="2026-04-17 21:37:53.593738329 +0000 UTC m=+70.356497799" observedRunningTime="2026-04-17 21:37:53.986189016 +0000 UTC m=+70.748948495" watchObservedRunningTime="2026-04-17 21:37:53.988291192 +0000 UTC m=+70.751050672"
Apr 17 21:37:54.978096 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:54.978009 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-hqbt5" event={"ID":"ee2090c8-65ec-46e0-9614-f6f0ddae32d7","Type":"ContainerStarted","Data":"840ee4cbd815e36058343cf1db752d91b968cd155b7648af2f3909cf664bcbd2"}
Apr 17 21:37:55.868232 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:55.868198 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-ddrrn"
Apr 17 21:37:55.983446 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:55.983403 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-hqbt5" event={"ID":"ee2090c8-65ec-46e0-9614-f6f0ddae32d7","Type":"ContainerStarted","Data":"b5d74338576496254308f7387cb59e117598a79c9a4a14479fb4a50cb50511b1"}
Apr 17 21:37:56.002555 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:56.002493 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-hqbt5" podStartSLOduration=71.873548449 podStartE2EDuration="1m13.002475404s" podCreationTimestamp="2026-04-17 21:36:43 +0000 UTC" firstStartedPulling="2026-04-17 21:37:53.616912372 +0000 UTC m=+70.379671832" lastFinishedPulling="2026-04-17 21:37:54.745839326 +0000 UTC m=+71.508598787" observedRunningTime="2026-04-17 21:37:55.998021418 +0000 UTC m=+72.760780899" watchObservedRunningTime="2026-04-17 21:37:56.002475404 +0000 UTC m=+72.765234885"
Apr 17 21:37:57.747130 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:57.747101 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-674876ddd5-8bzls"
Apr 17 21:37:59.929176 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:37:59.929146 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-798d665f68-m7frd"
Apr 17 21:38:02.764110 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:02.764046 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-674876ddd5-8bzls" podUID="b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb" containerName="registry" containerID="cri-o://f1fa4dedcc0404539606f77d3b746395059742f55f8816b200538a2da82d2f88" gracePeriod=30
Apr 17 21:38:03.010392 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:03.010351 2564 generic.go:358] "Generic (PLEG): container finished" podID="b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb" containerID="f1fa4dedcc0404539606f77d3b746395059742f55f8816b200538a2da82d2f88" exitCode=0
Apr 17 21:38:03.010509 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:03.010406 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-674876ddd5-8bzls" event={"ID":"b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb","Type":"ContainerDied","Data":"f1fa4dedcc0404539606f77d3b746395059742f55f8816b200538a2da82d2f88"}
Apr 17 21:38:03.020893 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:03.020835 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-674876ddd5-8bzls"
Apr 17 21:38:03.177090 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:03.177060 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb-installation-pull-secrets\") pod \"b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb\" (UID: \"b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb\") "
Apr 17 21:38:03.177090 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:03.177090 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb-image-registry-private-configuration\") pod \"b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb\" (UID: \"b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb\") "
Apr 17 21:38:03.177384 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:03.177122 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb-registry-certificates\") pod \"b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb\" (UID: \"b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb\") "
Apr 17 21:38:03.177384 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:03.177146 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb-ca-trust-extracted\") pod \"b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb\" (UID: \"b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb\") "
Apr 17 21:38:03.177384 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:03.177168 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb-registry-tls\") pod \"b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb\" (UID: \"b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb\") "
Apr 17 21:38:03.177384 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:03.177192 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb-bound-sa-token\") pod \"b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb\" (UID: \"b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb\") "
Apr 17 21:38:03.177384 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:03.177218 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dk52m\" (UniqueName: \"kubernetes.io/projected/b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb-kube-api-access-dk52m\") pod \"b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb\" (UID: \"b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb\") "
Apr 17 21:38:03.177384 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:03.177249 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb-trusted-ca\") pod \"b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb\" (UID: \"b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb\") "
Apr 17 21:38:03.177750 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:03.177569 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb" (UID: "b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 21:38:03.177844 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:03.177816 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb" (UID: "b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 21:38:03.179779 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:03.179731 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb-kube-api-access-dk52m" (OuterVolumeSpecName: "kube-api-access-dk52m") pod "b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb" (UID: "b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb"). InnerVolumeSpecName "kube-api-access-dk52m". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 21:38:03.179913 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:03.179858 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb" (UID: "b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 21:38:03.179973 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:03.179907 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb" (UID: "b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 21:38:03.179973 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:03.179906 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb" (UID: "b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 21:38:03.179973 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:03.179931 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb" (UID: "b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 21:38:03.188210 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:03.188181 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb" (UID: "b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 21:38:03.278379 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:03.278303 2564 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb-ca-trust-extracted\") on node \"ip-10-0-132-27.ec2.internal\" DevicePath \"\""
Apr 17 21:38:03.278379 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:03.278328 2564 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb-registry-tls\") on node \"ip-10-0-132-27.ec2.internal\" DevicePath \"\""
Apr 17 21:38:03.278379 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:03.278338 2564 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb-bound-sa-token\") on node \"ip-10-0-132-27.ec2.internal\" DevicePath \"\""
Apr 17 21:38:03.278379 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:03.278346 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dk52m\" (UniqueName: \"kubernetes.io/projected/b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb-kube-api-access-dk52m\") on node \"ip-10-0-132-27.ec2.internal\" DevicePath \"\""
Apr 17 21:38:03.278379 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:03.278355 2564 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb-trusted-ca\") on node \"ip-10-0-132-27.ec2.internal\" DevicePath \"\""
Apr 17 21:38:03.278379 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:03.278365 2564 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb-installation-pull-secrets\") on node \"ip-10-0-132-27.ec2.internal\" DevicePath \"\""
Apr 17 21:38:03.278379 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:03.278383 2564 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb-image-registry-private-configuration\") on node \"ip-10-0-132-27.ec2.internal\" DevicePath \"\""
Apr 17 21:38:03.278712 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:03.278393 2564 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb-registry-certificates\") on node \"ip-10-0-132-27.ec2.internal\" DevicePath \"\""
Apr 17 21:38:04.014086 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:04.014002 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-674876ddd5-8bzls" event={"ID":"b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb","Type":"ContainerDied","Data":"45e50b47eab516b20f6f2b020d92cfc20b974e212ea1c6a92a380bd071dd1fe3"}
Apr 17 21:38:04.014086 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:04.014031 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-674876ddd5-8bzls"
Apr 17 21:38:04.014086 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:04.014045 2564 scope.go:117] "RemoveContainer" containerID="f1fa4dedcc0404539606f77d3b746395059742f55f8816b200538a2da82d2f88"
Apr 17 21:38:04.037109 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:04.035431 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-674876ddd5-8bzls"]
Apr 17 21:38:04.038820 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:04.038793 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-674876ddd5-8bzls"]
Apr 17 21:38:05.694969 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:05.694931 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb" path="/var/lib/kubelet/pods/b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb/volumes"
Apr 17 21:38:09.875323 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:09.875287 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-2sqmx"]
Apr 17 21:38:09.875887 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:09.875634 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb" containerName="registry"
Apr 17 21:38:09.875887 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:09.875653 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb" containerName="registry"
Apr 17 21:38:09.875887 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:09.875726 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="b7940b5f-ae17-4c7d-92e0-c15c7b32c5cb" containerName="registry"
Apr 17 21:38:09.911055 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:09.911023 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-2sqmx"]
Apr 17 21:38:09.911189 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:09.911123 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-2sqmx"
Apr 17 21:38:09.914138 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:09.914118 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 17 21:38:09.914258 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:09.914191 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 17 21:38:09.914418 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:09.914404 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 17 21:38:09.915353 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:09.915336 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Apr 17 21:38:09.915453 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:09.915366 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Apr 17 21:38:09.915698 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:09.915682 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-9wzfb\""
Apr 17 21:38:10.023102 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:10.023060 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/8fed131c-a37b-449e-a94e-f54a719a120d-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-2sqmx\" (UID: \"8fed131c-a37b-449e-a94e-f54a719a120d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-2sqmx"
Apr 17 21:38:10.023289 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:10.023114 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8fed131c-a37b-449e-a94e-f54a719a120d-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-2sqmx\" (UID: \"8fed131c-a37b-449e-a94e-f54a719a120d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-2sqmx"
Apr 17 21:38:10.023289 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:10.023181 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8fed131c-a37b-449e-a94e-f54a719a120d-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-2sqmx\" (UID: \"8fed131c-a37b-449e-a94e-f54a719a120d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-2sqmx"
Apr 17 21:38:10.023289 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:10.023215 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnfv4\" (UniqueName: \"kubernetes.io/projected/8fed131c-a37b-449e-a94e-f54a719a120d-kube-api-access-xnfv4\") pod \"prometheus-operator-5676c8c784-2sqmx\" (UID: \"8fed131c-a37b-449e-a94e-f54a719a120d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-2sqmx"
Apr 17 21:38:10.123942 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:10.123908 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8fed131c-a37b-449e-a94e-f54a719a120d-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-2sqmx\" (UID: \"8fed131c-a37b-449e-a94e-f54a719a120d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-2sqmx"
Apr 17 21:38:10.124140 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:10.123963 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8fed131c-a37b-449e-a94e-f54a719a120d-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-2sqmx\" (UID: \"8fed131c-a37b-449e-a94e-f54a719a120d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-2sqmx"
Apr 17 21:38:10.124140 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:10.124001 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xnfv4\" (UniqueName: \"kubernetes.io/projected/8fed131c-a37b-449e-a94e-f54a719a120d-kube-api-access-xnfv4\") pod \"prometheus-operator-5676c8c784-2sqmx\" (UID: \"8fed131c-a37b-449e-a94e-f54a719a120d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-2sqmx"
Apr 17 21:38:10.124140 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:10.124030 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/8fed131c-a37b-449e-a94e-f54a719a120d-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-2sqmx\" (UID: \"8fed131c-a37b-449e-a94e-f54a719a120d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-2sqmx"
Apr 17 21:38:10.124710 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:10.124687 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8fed131c-a37b-449e-a94e-f54a719a120d-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-2sqmx\" (UID: \"8fed131c-a37b-449e-a94e-f54a719a120d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-2sqmx"
Apr 17 21:38:10.126507 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:10.126445 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8fed131c-a37b-449e-a94e-f54a719a120d-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-2sqmx\" (UID: \"8fed131c-a37b-449e-a94e-f54a719a120d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-2sqmx"
Apr 17 21:38:10.126507 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:10.126451 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/8fed131c-a37b-449e-a94e-f54a719a120d-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-2sqmx\" (UID: \"8fed131c-a37b-449e-a94e-f54a719a120d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-2sqmx"
Apr 17 21:38:10.132117 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:10.132094 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnfv4\" (UniqueName: \"kubernetes.io/projected/8fed131c-a37b-449e-a94e-f54a719a120d-kube-api-access-xnfv4\") pod \"prometheus-operator-5676c8c784-2sqmx\" (UID: \"8fed131c-a37b-449e-a94e-f54a719a120d\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-2sqmx"
Apr 17 21:38:10.220087 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:10.220049 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-2sqmx"
Apr 17 21:38:10.334721 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:10.334686 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-2sqmx"]
Apr 17 21:38:10.338001 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:38:10.337967 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8fed131c_a37b_449e_a94e_f54a719a120d.slice/crio-d6067b4f7be76481d5ae6ed16c45aa7cb8e48f39332de5ce517c16f63454c543 WatchSource:0}: Error finding container d6067b4f7be76481d5ae6ed16c45aa7cb8e48f39332de5ce517c16f63454c543: Status 404 returned error can't find the container with id d6067b4f7be76481d5ae6ed16c45aa7cb8e48f39332de5ce517c16f63454c543
Apr 17 21:38:11.033286 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:11.033245 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-2sqmx" event={"ID":"8fed131c-a37b-449e-a94e-f54a719a120d","Type":"ContainerStarted","Data":"d6067b4f7be76481d5ae6ed16c45aa7cb8e48f39332de5ce517c16f63454c543"}
Apr 17 21:38:13.040023 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:13.039989 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-2sqmx" event={"ID":"8fed131c-a37b-449e-a94e-f54a719a120d","Type":"ContainerStarted","Data":"9e30f6c157d5993deec000b277a5d6ed50f1410049b8210a6d39937f91f32826"}
Apr 17 21:38:13.040023 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:13.040026 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-2sqmx" event={"ID":"8fed131c-a37b-449e-a94e-f54a719a120d","Type":"ContainerStarted","Data":"f69aac134c936340c35e95a59223e42be46148e10cc7f547ec8100a404f9ea4a"}
Apr 17 21:38:13.056976 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:13.056933 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-2sqmx" podStartSLOduration=2.35711882 podStartE2EDuration="4.056920069s" podCreationTimestamp="2026-04-17 21:38:09 +0000 UTC" firstStartedPulling="2026-04-17 21:38:10.33977602 +0000 UTC m=+87.102535475" lastFinishedPulling="2026-04-17 21:38:12.039577266 +0000 UTC m=+88.802336724" observedRunningTime="2026-04-17 21:38:13.055174224 +0000 UTC m=+89.817933701" watchObservedRunningTime="2026-04-17 21:38:13.056920069 +0000 UTC m=+89.819679546"
Apr 17 21:38:15.245023 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:15.244990 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-hdrt9"]
Apr 17 21:38:15.248827 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:15.248805 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-hdrt9"
Apr 17 21:38:15.251610 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:15.251570 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-n69pg\""
Apr 17 21:38:15.251883 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:15.251853 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 17 21:38:15.251982 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:15.251854 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 17 21:38:15.251982 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:15.251959 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 17 21:38:15.362536 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:15.362504 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/ea58c7c4-3af0-45e9-a977-19e7daff6f40-root\") pod \"node-exporter-hdrt9\" (UID: \"ea58c7c4-3af0-45e9-a977-19e7daff6f40\") " pod="openshift-monitoring/node-exporter-hdrt9"
Apr 17 21:38:15.362536 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:15.362545 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ea58c7c4-3af0-45e9-a977-19e7daff6f40-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-hdrt9\" (UID: \"ea58c7c4-3af0-45e9-a977-19e7daff6f40\") " pod="openshift-monitoring/node-exporter-hdrt9"
Apr 17 21:38:15.362787 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:15.362570 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/ea58c7c4-3af0-45e9-a977-19e7daff6f40-node-exporter-textfile\") pod \"node-exporter-hdrt9\" (UID: \"ea58c7c4-3af0-45e9-a977-19e7daff6f40\") " pod="openshift-monitoring/node-exporter-hdrt9"
Apr 17 21:38:15.362787 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:15.362641 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/ea58c7c4-3af0-45e9-a977-19e7daff6f40-node-exporter-wtmp\") pod \"node-exporter-hdrt9\" (UID: \"ea58c7c4-3af0-45e9-a977-19e7daff6f40\") " pod="openshift-monitoring/node-exporter-hdrt9"
Apr 17 21:38:15.362787 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:15.362736 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ea58c7c4-3af0-45e9-a977-19e7daff6f40-node-exporter-tls\") pod \"node-exporter-hdrt9\" (UID: \"ea58c7c4-3af0-45e9-a977-19e7daff6f40\") " pod="openshift-monitoring/node-exporter-hdrt9"
Apr 17 21:38:15.362787 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:15.362773 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ea58c7c4-3af0-45e9-a977-19e7daff6f40-sys\") pod \"node-exporter-hdrt9\" (UID: \"ea58c7c4-3af0-45e9-a977-19e7daff6f40\") " pod="openshift-monitoring/node-exporter-hdrt9"
Apr 17 21:38:15.363004 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:15.362819 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/ea58c7c4-3af0-45e9-a977-19e7daff6f40-node-exporter-accelerators-collector-config\") pod \"node-exporter-hdrt9\" (UID: \"ea58c7c4-3af0-45e9-a977-19e7daff6f40\") " pod="openshift-monitoring/node-exporter-hdrt9"
Apr 17 21:38:15.363004 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:15.362848 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pbvt\" (UniqueName: \"kubernetes.io/projected/ea58c7c4-3af0-45e9-a977-19e7daff6f40-kube-api-access-7pbvt\") pod \"node-exporter-hdrt9\" (UID: \"ea58c7c4-3af0-45e9-a977-19e7daff6f40\") " pod="openshift-monitoring/node-exporter-hdrt9"
Apr 17 21:38:15.363004 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:15.362874 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ea58c7c4-3af0-45e9-a977-19e7daff6f40-metrics-client-ca\") pod \"node-exporter-hdrt9\" (UID: \"ea58c7c4-3af0-45e9-a977-19e7daff6f40\") " pod="openshift-monitoring/node-exporter-hdrt9"
Apr 17 21:38:15.463490 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:15.463453 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName:
\"kubernetes.io/secret/ea58c7c4-3af0-45e9-a977-19e7daff6f40-node-exporter-tls\") pod \"node-exporter-hdrt9\" (UID: \"ea58c7c4-3af0-45e9-a977-19e7daff6f40\") " pod="openshift-monitoring/node-exporter-hdrt9" Apr 17 21:38:15.463490 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:15.463494 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ea58c7c4-3af0-45e9-a977-19e7daff6f40-sys\") pod \"node-exporter-hdrt9\" (UID: \"ea58c7c4-3af0-45e9-a977-19e7daff6f40\") " pod="openshift-monitoring/node-exporter-hdrt9" Apr 17 21:38:15.463757 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:15.463522 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/ea58c7c4-3af0-45e9-a977-19e7daff6f40-node-exporter-accelerators-collector-config\") pod \"node-exporter-hdrt9\" (UID: \"ea58c7c4-3af0-45e9-a977-19e7daff6f40\") " pod="openshift-monitoring/node-exporter-hdrt9" Apr 17 21:38:15.463757 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:15.463541 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7pbvt\" (UniqueName: \"kubernetes.io/projected/ea58c7c4-3af0-45e9-a977-19e7daff6f40-kube-api-access-7pbvt\") pod \"node-exporter-hdrt9\" (UID: \"ea58c7c4-3af0-45e9-a977-19e7daff6f40\") " pod="openshift-monitoring/node-exporter-hdrt9" Apr 17 21:38:15.463757 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:15.463560 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ea58c7c4-3af0-45e9-a977-19e7daff6f40-metrics-client-ca\") pod \"node-exporter-hdrt9\" (UID: \"ea58c7c4-3af0-45e9-a977-19e7daff6f40\") " pod="openshift-monitoring/node-exporter-hdrt9" Apr 17 21:38:15.463757 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:15.463604 2564 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/ea58c7c4-3af0-45e9-a977-19e7daff6f40-root\") pod \"node-exporter-hdrt9\" (UID: \"ea58c7c4-3af0-45e9-a977-19e7daff6f40\") " pod="openshift-monitoring/node-exporter-hdrt9" Apr 17 21:38:15.463757 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:38:15.463630 2564 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 17 21:38:15.463757 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:15.463643 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ea58c7c4-3af0-45e9-a977-19e7daff6f40-sys\") pod \"node-exporter-hdrt9\" (UID: \"ea58c7c4-3af0-45e9-a977-19e7daff6f40\") " pod="openshift-monitoring/node-exporter-hdrt9" Apr 17 21:38:15.463757 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:15.463677 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ea58c7c4-3af0-45e9-a977-19e7daff6f40-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-hdrt9\" (UID: \"ea58c7c4-3af0-45e9-a977-19e7daff6f40\") " pod="openshift-monitoring/node-exporter-hdrt9" Apr 17 21:38:15.463757 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:38:15.463715 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ea58c7c4-3af0-45e9-a977-19e7daff6f40-node-exporter-tls podName:ea58c7c4-3af0-45e9-a977-19e7daff6f40 nodeName:}" failed. No retries permitted until 2026-04-17 21:38:15.963693225 +0000 UTC m=+92.726452701 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/ea58c7c4-3af0-45e9-a977-19e7daff6f40-node-exporter-tls") pod "node-exporter-hdrt9" (UID: "ea58c7c4-3af0-45e9-a977-19e7daff6f40") : secret "node-exporter-tls" not found Apr 17 21:38:15.463757 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:15.463720 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/ea58c7c4-3af0-45e9-a977-19e7daff6f40-root\") pod \"node-exporter-hdrt9\" (UID: \"ea58c7c4-3af0-45e9-a977-19e7daff6f40\") " pod="openshift-monitoring/node-exporter-hdrt9" Apr 17 21:38:15.464116 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:15.463807 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/ea58c7c4-3af0-45e9-a977-19e7daff6f40-node-exporter-textfile\") pod \"node-exporter-hdrt9\" (UID: \"ea58c7c4-3af0-45e9-a977-19e7daff6f40\") " pod="openshift-monitoring/node-exporter-hdrt9" Apr 17 21:38:15.464116 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:15.463841 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/ea58c7c4-3af0-45e9-a977-19e7daff6f40-node-exporter-wtmp\") pod \"node-exporter-hdrt9\" (UID: \"ea58c7c4-3af0-45e9-a977-19e7daff6f40\") " pod="openshift-monitoring/node-exporter-hdrt9" Apr 17 21:38:15.464116 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:15.463998 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/ea58c7c4-3af0-45e9-a977-19e7daff6f40-node-exporter-wtmp\") pod \"node-exporter-hdrt9\" (UID: \"ea58c7c4-3af0-45e9-a977-19e7daff6f40\") " pod="openshift-monitoring/node-exporter-hdrt9" Apr 17 21:38:15.464222 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:15.464138 2564 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/ea58c7c4-3af0-45e9-a977-19e7daff6f40-node-exporter-textfile\") pod \"node-exporter-hdrt9\" (UID: \"ea58c7c4-3af0-45e9-a977-19e7daff6f40\") " pod="openshift-monitoring/node-exporter-hdrt9" Apr 17 21:38:15.464958 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:15.464937 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ea58c7c4-3af0-45e9-a977-19e7daff6f40-metrics-client-ca\") pod \"node-exporter-hdrt9\" (UID: \"ea58c7c4-3af0-45e9-a977-19e7daff6f40\") " pod="openshift-monitoring/node-exporter-hdrt9" Apr 17 21:38:15.465017 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:15.464979 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/ea58c7c4-3af0-45e9-a977-19e7daff6f40-node-exporter-accelerators-collector-config\") pod \"node-exporter-hdrt9\" (UID: \"ea58c7c4-3af0-45e9-a977-19e7daff6f40\") " pod="openshift-monitoring/node-exporter-hdrt9" Apr 17 21:38:15.466221 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:15.466192 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ea58c7c4-3af0-45e9-a977-19e7daff6f40-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-hdrt9\" (UID: \"ea58c7c4-3af0-45e9-a977-19e7daff6f40\") " pod="openshift-monitoring/node-exporter-hdrt9" Apr 17 21:38:15.473003 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:15.472977 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pbvt\" (UniqueName: \"kubernetes.io/projected/ea58c7c4-3af0-45e9-a977-19e7daff6f40-kube-api-access-7pbvt\") pod \"node-exporter-hdrt9\" (UID: \"ea58c7c4-3af0-45e9-a977-19e7daff6f40\") " pod="openshift-monitoring/node-exporter-hdrt9" Apr 17 21:38:15.968130 ip-10-0-132-27 
kubenswrapper[2564]: I0417 21:38:15.968097 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ea58c7c4-3af0-45e9-a977-19e7daff6f40-node-exporter-tls\") pod \"node-exporter-hdrt9\" (UID: \"ea58c7c4-3af0-45e9-a977-19e7daff6f40\") " pod="openshift-monitoring/node-exporter-hdrt9" Apr 17 21:38:15.970287 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:15.970261 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ea58c7c4-3af0-45e9-a977-19e7daff6f40-node-exporter-tls\") pod \"node-exporter-hdrt9\" (UID: \"ea58c7c4-3af0-45e9-a977-19e7daff6f40\") " pod="openshift-monitoring/node-exporter-hdrt9" Apr 17 21:38:16.158335 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:16.158297 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-hdrt9" Apr 17 21:38:16.168627 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:38:16.168580 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea58c7c4_3af0_45e9_a977_19e7daff6f40.slice/crio-18017b63cdcb4e98fbee09310bb807c3cafb77f5654aefebaea93a0f6d6f10f9 WatchSource:0}: Error finding container 18017b63cdcb4e98fbee09310bb807c3cafb77f5654aefebaea93a0f6d6f10f9: Status 404 returned error can't find the container with id 18017b63cdcb4e98fbee09310bb807c3cafb77f5654aefebaea93a0f6d6f10f9 Apr 17 21:38:17.051657 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:17.051585 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hdrt9" event={"ID":"ea58c7c4-3af0-45e9-a977-19e7daff6f40","Type":"ContainerStarted","Data":"7b60762b634566cf5c90bbbdb77b1503bf5855d066e78bac6c756d56d590c302"} Apr 17 21:38:17.051657 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:17.051662 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/node-exporter-hdrt9" event={"ID":"ea58c7c4-3af0-45e9-a977-19e7daff6f40","Type":"ContainerStarted","Data":"18017b63cdcb4e98fbee09310bb807c3cafb77f5654aefebaea93a0f6d6f10f9"} Apr 17 21:38:18.055007 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:18.054974 2564 generic.go:358] "Generic (PLEG): container finished" podID="ea58c7c4-3af0-45e9-a977-19e7daff6f40" containerID="7b60762b634566cf5c90bbbdb77b1503bf5855d066e78bac6c756d56d590c302" exitCode=0 Apr 17 21:38:18.055498 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:18.055057 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hdrt9" event={"ID":"ea58c7c4-3af0-45e9-a977-19e7daff6f40","Type":"ContainerDied","Data":"7b60762b634566cf5c90bbbdb77b1503bf5855d066e78bac6c756d56d590c302"} Apr 17 21:38:19.060046 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:19.060007 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hdrt9" event={"ID":"ea58c7c4-3af0-45e9-a977-19e7daff6f40","Type":"ContainerStarted","Data":"b9721f5938f612370e945f1beaf16668e9a74b80216d9664fe091c00f609f7ed"} Apr 17 21:38:19.060046 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:19.060051 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hdrt9" event={"ID":"ea58c7c4-3af0-45e9-a977-19e7daff6f40","Type":"ContainerStarted","Data":"91cdf80d0a7aaccd1050391e5c4a758133419d4394ef486b77b5a1302371c8a8"} Apr 17 21:38:19.082004 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:19.081963 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-hdrt9" podStartSLOduration=3.335205676 podStartE2EDuration="4.08194892s" podCreationTimestamp="2026-04-17 21:38:15 +0000 UTC" firstStartedPulling="2026-04-17 21:38:16.170747317 +0000 UTC m=+92.933506774" lastFinishedPulling="2026-04-17 21:38:16.917490557 +0000 UTC m=+93.680250018" observedRunningTime="2026-04-17 
21:38:19.080158478 +0000 UTC m=+95.842917979" watchObservedRunningTime="2026-04-17 21:38:19.08194892 +0000 UTC m=+95.844708398" Apr 17 21:38:19.599611 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:19.599573 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-7695548645-6xkws"] Apr 17 21:38:19.621247 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:19.621213 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-7695548645-6xkws"] Apr 17 21:38:19.621390 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:19.621357 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-7695548645-6xkws" Apr 17 21:38:19.625310 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:19.625279 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-6aiid5ir88fan\"" Apr 17 21:38:19.625310 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:19.625300 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 17 21:38:19.625511 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:19.625361 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 17 21:38:19.625511 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:19.625365 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 17 21:38:19.625511 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:19.625422 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-n649s\"" Apr 17 21:38:19.625511 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:19.625436 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 17 21:38:19.796530 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:19.796495 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c525n\" (UniqueName: \"kubernetes.io/projected/d1610621-e8c1-4e95-b7a7-8c1d05baf41e-kube-api-access-c525n\") pod \"metrics-server-7695548645-6xkws\" (UID: \"d1610621-e8c1-4e95-b7a7-8c1d05baf41e\") " pod="openshift-monitoring/metrics-server-7695548645-6xkws" Apr 17 21:38:19.796530 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:19.796535 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/d1610621-e8c1-4e95-b7a7-8c1d05baf41e-metrics-server-audit-profiles\") pod \"metrics-server-7695548645-6xkws\" (UID: \"d1610621-e8c1-4e95-b7a7-8c1d05baf41e\") " pod="openshift-monitoring/metrics-server-7695548645-6xkws" Apr 17 21:38:19.796767 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:19.796554 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/d1610621-e8c1-4e95-b7a7-8c1d05baf41e-secret-metrics-server-client-certs\") pod \"metrics-server-7695548645-6xkws\" (UID: \"d1610621-e8c1-4e95-b7a7-8c1d05baf41e\") " pod="openshift-monitoring/metrics-server-7695548645-6xkws" Apr 17 21:38:19.796767 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:19.796671 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/d1610621-e8c1-4e95-b7a7-8c1d05baf41e-secret-metrics-server-tls\") pod \"metrics-server-7695548645-6xkws\" (UID: \"d1610621-e8c1-4e95-b7a7-8c1d05baf41e\") " pod="openshift-monitoring/metrics-server-7695548645-6xkws" Apr 17 21:38:19.796767 ip-10-0-132-27 
kubenswrapper[2564]: I0417 21:38:19.796699 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1610621-e8c1-4e95-b7a7-8c1d05baf41e-client-ca-bundle\") pod \"metrics-server-7695548645-6xkws\" (UID: \"d1610621-e8c1-4e95-b7a7-8c1d05baf41e\") " pod="openshift-monitoring/metrics-server-7695548645-6xkws" Apr 17 21:38:19.796767 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:19.796731 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d1610621-e8c1-4e95-b7a7-8c1d05baf41e-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7695548645-6xkws\" (UID: \"d1610621-e8c1-4e95-b7a7-8c1d05baf41e\") " pod="openshift-monitoring/metrics-server-7695548645-6xkws" Apr 17 21:38:19.796896 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:19.796776 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/d1610621-e8c1-4e95-b7a7-8c1d05baf41e-audit-log\") pod \"metrics-server-7695548645-6xkws\" (UID: \"d1610621-e8c1-4e95-b7a7-8c1d05baf41e\") " pod="openshift-monitoring/metrics-server-7695548645-6xkws" Apr 17 21:38:19.897198 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:19.897120 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/d1610621-e8c1-4e95-b7a7-8c1d05baf41e-secret-metrics-server-tls\") pod \"metrics-server-7695548645-6xkws\" (UID: \"d1610621-e8c1-4e95-b7a7-8c1d05baf41e\") " pod="openshift-monitoring/metrics-server-7695548645-6xkws" Apr 17 21:38:19.897198 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:19.897153 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d1610621-e8c1-4e95-b7a7-8c1d05baf41e-client-ca-bundle\") pod \"metrics-server-7695548645-6xkws\" (UID: \"d1610621-e8c1-4e95-b7a7-8c1d05baf41e\") " pod="openshift-monitoring/metrics-server-7695548645-6xkws" Apr 17 21:38:19.897198 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:19.897176 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d1610621-e8c1-4e95-b7a7-8c1d05baf41e-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7695548645-6xkws\" (UID: \"d1610621-e8c1-4e95-b7a7-8c1d05baf41e\") " pod="openshift-monitoring/metrics-server-7695548645-6xkws" Apr 17 21:38:19.897399 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:19.897309 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/d1610621-e8c1-4e95-b7a7-8c1d05baf41e-audit-log\") pod \"metrics-server-7695548645-6xkws\" (UID: \"d1610621-e8c1-4e95-b7a7-8c1d05baf41e\") " pod="openshift-monitoring/metrics-server-7695548645-6xkws" Apr 17 21:38:19.897399 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:19.897355 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c525n\" (UniqueName: \"kubernetes.io/projected/d1610621-e8c1-4e95-b7a7-8c1d05baf41e-kube-api-access-c525n\") pod \"metrics-server-7695548645-6xkws\" (UID: \"d1610621-e8c1-4e95-b7a7-8c1d05baf41e\") " pod="openshift-monitoring/metrics-server-7695548645-6xkws" Apr 17 21:38:19.897399 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:19.897394 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/d1610621-e8c1-4e95-b7a7-8c1d05baf41e-metrics-server-audit-profiles\") pod \"metrics-server-7695548645-6xkws\" (UID: \"d1610621-e8c1-4e95-b7a7-8c1d05baf41e\") " 
pod="openshift-monitoring/metrics-server-7695548645-6xkws" Apr 17 21:38:19.897527 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:19.897419 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/d1610621-e8c1-4e95-b7a7-8c1d05baf41e-secret-metrics-server-client-certs\") pod \"metrics-server-7695548645-6xkws\" (UID: \"d1610621-e8c1-4e95-b7a7-8c1d05baf41e\") " pod="openshift-monitoring/metrics-server-7695548645-6xkws" Apr 17 21:38:19.897690 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:19.897670 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/d1610621-e8c1-4e95-b7a7-8c1d05baf41e-audit-log\") pod \"metrics-server-7695548645-6xkws\" (UID: \"d1610621-e8c1-4e95-b7a7-8c1d05baf41e\") " pod="openshift-monitoring/metrics-server-7695548645-6xkws" Apr 17 21:38:19.897919 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:19.897902 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d1610621-e8c1-4e95-b7a7-8c1d05baf41e-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7695548645-6xkws\" (UID: \"d1610621-e8c1-4e95-b7a7-8c1d05baf41e\") " pod="openshift-monitoring/metrics-server-7695548645-6xkws" Apr 17 21:38:19.898482 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:19.898455 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/d1610621-e8c1-4e95-b7a7-8c1d05baf41e-metrics-server-audit-profiles\") pod \"metrics-server-7695548645-6xkws\" (UID: \"d1610621-e8c1-4e95-b7a7-8c1d05baf41e\") " pod="openshift-monitoring/metrics-server-7695548645-6xkws" Apr 17 21:38:19.899747 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:19.899723 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/d1610621-e8c1-4e95-b7a7-8c1d05baf41e-secret-metrics-server-client-certs\") pod \"metrics-server-7695548645-6xkws\" (UID: \"d1610621-e8c1-4e95-b7a7-8c1d05baf41e\") " pod="openshift-monitoring/metrics-server-7695548645-6xkws" Apr 17 21:38:19.899830 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:19.899789 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d1610621-e8c1-4e95-b7a7-8c1d05baf41e-client-ca-bundle\") pod \"metrics-server-7695548645-6xkws\" (UID: \"d1610621-e8c1-4e95-b7a7-8c1d05baf41e\") " pod="openshift-monitoring/metrics-server-7695548645-6xkws" Apr 17 21:38:19.899893 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:19.899873 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/d1610621-e8c1-4e95-b7a7-8c1d05baf41e-secret-metrics-server-tls\") pod \"metrics-server-7695548645-6xkws\" (UID: \"d1610621-e8c1-4e95-b7a7-8c1d05baf41e\") " pod="openshift-monitoring/metrics-server-7695548645-6xkws" Apr 17 21:38:19.914885 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:19.914862 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c525n\" (UniqueName: \"kubernetes.io/projected/d1610621-e8c1-4e95-b7a7-8c1d05baf41e-kube-api-access-c525n\") pod \"metrics-server-7695548645-6xkws\" (UID: \"d1610621-e8c1-4e95-b7a7-8c1d05baf41e\") " pod="openshift-monitoring/metrics-server-7695548645-6xkws" Apr 17 21:38:19.930016 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:19.929997 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-7695548645-6xkws" Apr 17 21:38:19.997372 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:19.997337 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-8xn87"] Apr 17 21:38:20.019211 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:20.019182 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-8xn87"] Apr 17 21:38:20.019370 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:20.019343 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-8xn87" Apr 17 21:38:20.022051 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:20.022004 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 17 21:38:20.022151 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:20.022119 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-zn4df\"" Apr 17 21:38:20.069367 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:20.069334 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-7695548645-6xkws"] Apr 17 21:38:20.082050 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:38:20.082021 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1610621_e8c1_4e95_b7a7_8c1d05baf41e.slice/crio-807201c4a70d50051f8f9cd8d3747eceb9f3843adf99c2dc1f3e8f8162e5d2af WatchSource:0}: Error finding container 807201c4a70d50051f8f9cd8d3747eceb9f3843adf99c2dc1f3e8f8162e5d2af: Status 404 returned error can't find the container with id 807201c4a70d50051f8f9cd8d3747eceb9f3843adf99c2dc1f3e8f8162e5d2af Apr 17 21:38:20.199653 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:20.199570 2564 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/0c19899f-0a5f-4d4c-a8fb-a19bb5647096-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-8xn87\" (UID: \"0c19899f-0a5f-4d4c-a8fb-a19bb5647096\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-8xn87" Apr 17 21:38:20.300899 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:20.300864 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/0c19899f-0a5f-4d4c-a8fb-a19bb5647096-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-8xn87\" (UID: \"0c19899f-0a5f-4d4c-a8fb-a19bb5647096\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-8xn87" Apr 17 21:38:20.301037 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:38:20.300985 2564 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 17 21:38:20.301082 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:38:20.301072 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0c19899f-0a5f-4d4c-a8fb-a19bb5647096-monitoring-plugin-cert podName:0c19899f-0a5f-4d4c-a8fb-a19bb5647096 nodeName:}" failed. No retries permitted until 2026-04-17 21:38:20.801053568 +0000 UTC m=+97.563813023 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/0c19899f-0a5f-4d4c-a8fb-a19bb5647096-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-8xn87" (UID: "0c19899f-0a5f-4d4c-a8fb-a19bb5647096") : secret "monitoring-plugin-cert" not found
Apr 17 21:38:20.805731 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:20.805694 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/0c19899f-0a5f-4d4c-a8fb-a19bb5647096-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-8xn87\" (UID: \"0c19899f-0a5f-4d4c-a8fb-a19bb5647096\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-8xn87"
Apr 17 21:38:20.808372 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:20.808337 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/0c19899f-0a5f-4d4c-a8fb-a19bb5647096-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-8xn87\" (UID: \"0c19899f-0a5f-4d4c-a8fb-a19bb5647096\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-8xn87"
Apr 17 21:38:20.931547 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:20.931509 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-8xn87"
Apr 17 21:38:21.061554 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:21.061521 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-8xn87"]
Apr 17 21:38:21.064374 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:38:21.064327 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c19899f_0a5f_4d4c_a8fb_a19bb5647096.slice/crio-181c3e189f26e1e179ce203a7ed4036ddb0f7ae860318ab549ed0e0e4ad48f4d WatchSource:0}: Error finding container 181c3e189f26e1e179ce203a7ed4036ddb0f7ae860318ab549ed0e0e4ad48f4d: Status 404 returned error can't find the container with id 181c3e189f26e1e179ce203a7ed4036ddb0f7ae860318ab549ed0e0e4ad48f4d
Apr 17 21:38:21.067014 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:21.066987 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7695548645-6xkws" event={"ID":"d1610621-e8c1-4e95-b7a7-8c1d05baf41e","Type":"ContainerStarted","Data":"807201c4a70d50051f8f9cd8d3747eceb9f3843adf99c2dc1f3e8f8162e5d2af"}
Apr 17 21:38:22.070510 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:22.070476 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-8xn87" event={"ID":"0c19899f-0a5f-4d4c-a8fb-a19bb5647096","Type":"ContainerStarted","Data":"181c3e189f26e1e179ce203a7ed4036ddb0f7ae860318ab549ed0e0e4ad48f4d"}
Apr 17 21:38:23.074125 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:23.074087 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7695548645-6xkws" event={"ID":"d1610621-e8c1-4e95-b7a7-8c1d05baf41e","Type":"ContainerStarted","Data":"82911ded860596d38d9c359f38396227d8f9d9620832bca6dd0513f796c57566"}
Apr 17 21:38:23.090931 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:23.090880 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-7695548645-6xkws" podStartSLOduration=2.052369083 podStartE2EDuration="4.090866613s" podCreationTimestamp="2026-04-17 21:38:19 +0000 UTC" firstStartedPulling="2026-04-17 21:38:20.084078532 +0000 UTC m=+96.846837992" lastFinishedPulling="2026-04-17 21:38:22.122576065 +0000 UTC m=+98.885335522" observedRunningTime="2026-04-17 21:38:23.090224459 +0000 UTC m=+99.852983938" watchObservedRunningTime="2026-04-17 21:38:23.090866613 +0000 UTC m=+99.853626090"
Apr 17 21:38:24.077778 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:24.077743 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-8xn87" event={"ID":"0c19899f-0a5f-4d4c-a8fb-a19bb5647096","Type":"ContainerStarted","Data":"38ac9febda7c94b48bc865aa4330e0c6540a3d0018cdcf45ce00afb92dba51a2"}
Apr 17 21:38:24.078245 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:24.078004 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-8xn87"
Apr 17 21:38:24.083076 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:24.083054 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-8xn87"
Apr 17 21:38:24.093032 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:24.092995 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-8xn87" podStartSLOduration=2.921406343 podStartE2EDuration="5.092985385s" podCreationTimestamp="2026-04-17 21:38:19 +0000 UTC" firstStartedPulling="2026-04-17 21:38:21.06708511 +0000 UTC m=+97.829844573" lastFinishedPulling="2026-04-17 21:38:23.238664155 +0000 UTC m=+100.001423615" observedRunningTime="2026-04-17 21:38:24.091161451 +0000 UTC m=+100.853920928" watchObservedRunningTime="2026-04-17 21:38:24.092985385 +0000 UTC m=+100.855744863"
Apr 17 21:38:25.761098 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:25.761067 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-9fcp4_d95e33bb-b1a4-4e97-8fbf-eab2a80b4bdf/serve-healthcheck-canary/0.log"
Apr 17 21:38:39.930778 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:39.930739 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-7695548645-6xkws"
Apr 17 21:38:39.930778 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:39.930785 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-7695548645-6xkws"
Apr 17 21:38:59.935834 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:59.935805 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-7695548645-6xkws"
Apr 17 21:38:59.939708 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:38:59.939683 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-7695548645-6xkws"
Apr 17 21:40:44.960373 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:40:44.960341 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vvc2g"]
Apr 17 21:40:44.963531 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:40:44.963516 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vvc2g"
Apr 17 21:40:44.966156 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:40:44.966129 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 17 21:40:44.966282 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:40:44.966179 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 17 21:40:44.967302 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:40:44.967285 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-b2qj7\""
Apr 17 21:40:44.970490 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:40:44.970261 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vvc2g"]
Apr 17 21:40:45.025632 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:40:45.025569 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c0b2b86a-1179-4426-921c-37a0389f4766-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vvc2g\" (UID: \"c0b2b86a-1179-4426-921c-37a0389f4766\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vvc2g"
Apr 17 21:40:45.025632 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:40:45.025625 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shgbz\" (UniqueName: \"kubernetes.io/projected/c0b2b86a-1179-4426-921c-37a0389f4766-kube-api-access-shgbz\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vvc2g\" (UID: \"c0b2b86a-1179-4426-921c-37a0389f4766\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vvc2g"
Apr 17 21:40:45.025825 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:40:45.025647 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c0b2b86a-1179-4426-921c-37a0389f4766-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vvc2g\" (UID: \"c0b2b86a-1179-4426-921c-37a0389f4766\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vvc2g"
Apr 17 21:40:45.126727 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:40:45.126689 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c0b2b86a-1179-4426-921c-37a0389f4766-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vvc2g\" (UID: \"c0b2b86a-1179-4426-921c-37a0389f4766\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vvc2g"
Apr 17 21:40:45.126917 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:40:45.126740 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-shgbz\" (UniqueName: \"kubernetes.io/projected/c0b2b86a-1179-4426-921c-37a0389f4766-kube-api-access-shgbz\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vvc2g\" (UID: \"c0b2b86a-1179-4426-921c-37a0389f4766\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vvc2g"
Apr 17 21:40:45.126917 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:40:45.126855 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c0b2b86a-1179-4426-921c-37a0389f4766-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vvc2g\" (UID: \"c0b2b86a-1179-4426-921c-37a0389f4766\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vvc2g"
Apr 17 21:40:45.127429 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:40:45.127401 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c0b2b86a-1179-4426-921c-37a0389f4766-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vvc2g\" (UID: \"c0b2b86a-1179-4426-921c-37a0389f4766\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vvc2g"
Apr 17 21:40:45.131086 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:40:45.127446 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c0b2b86a-1179-4426-921c-37a0389f4766-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vvc2g\" (UID: \"c0b2b86a-1179-4426-921c-37a0389f4766\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vvc2g"
Apr 17 21:40:45.136090 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:40:45.136066 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-shgbz\" (UniqueName: \"kubernetes.io/projected/c0b2b86a-1179-4426-921c-37a0389f4766-kube-api-access-shgbz\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vvc2g\" (UID: \"c0b2b86a-1179-4426-921c-37a0389f4766\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vvc2g"
Apr 17 21:40:45.272918 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:40:45.272840 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vvc2g"
Apr 17 21:40:45.387965 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:40:45.387927 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vvc2g"]
Apr 17 21:40:45.392555 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:40:45.392528 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0b2b86a_1179_4426_921c_37a0389f4766.slice/crio-f8d6aa8290a74bb927990dfc19133feb2dc08d9c44884b1ec10ed825f630d024 WatchSource:0}: Error finding container f8d6aa8290a74bb927990dfc19133feb2dc08d9c44884b1ec10ed825f630d024: Status 404 returned error can't find the container with id f8d6aa8290a74bb927990dfc19133feb2dc08d9c44884b1ec10ed825f630d024
Apr 17 21:40:45.447049 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:40:45.447018 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vvc2g" event={"ID":"c0b2b86a-1179-4426-921c-37a0389f4766","Type":"ContainerStarted","Data":"f8d6aa8290a74bb927990dfc19133feb2dc08d9c44884b1ec10ed825f630d024"}
Apr 17 21:40:51.466370 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:40:51.466329 2564 generic.go:358] "Generic (PLEG): container finished" podID="c0b2b86a-1179-4426-921c-37a0389f4766" containerID="29cf92b2242f1840dfef61712a10fc40b7f251dfd9b3338fbc53fd3061c57c3d" exitCode=0
Apr 17 21:40:51.466867 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:40:51.466409 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vvc2g" event={"ID":"c0b2b86a-1179-4426-921c-37a0389f4766","Type":"ContainerDied","Data":"29cf92b2242f1840dfef61712a10fc40b7f251dfd9b3338fbc53fd3061c57c3d"}
Apr 17 21:40:54.475861 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:40:54.475780 2564 generic.go:358] "Generic (PLEG): container finished" podID="c0b2b86a-1179-4426-921c-37a0389f4766" containerID="9b6ebd971f381a7b0b2db540b01c22dfad1f61eb5dadd550c8d64af2fa819c5e" exitCode=0
Apr 17 21:40:54.476224 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:40:54.475876 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vvc2g" event={"ID":"c0b2b86a-1179-4426-921c-37a0389f4766","Type":"ContainerDied","Data":"9b6ebd971f381a7b0b2db540b01c22dfad1f61eb5dadd550c8d64af2fa819c5e"}
Apr 17 21:41:01.497433 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:01.497344 2564 generic.go:358] "Generic (PLEG): container finished" podID="c0b2b86a-1179-4426-921c-37a0389f4766" containerID="4678c03d639dd2a1c872293eef0084f976d4ebf5bb47bf6dc41aef8462e01ef6" exitCode=0
Apr 17 21:41:01.497433 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:01.497395 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vvc2g" event={"ID":"c0b2b86a-1179-4426-921c-37a0389f4766","Type":"ContainerDied","Data":"4678c03d639dd2a1c872293eef0084f976d4ebf5bb47bf6dc41aef8462e01ef6"}
Apr 17 21:41:02.615987 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:02.615961 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vvc2g"
Apr 17 21:41:02.655294 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:02.655267 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c0b2b86a-1179-4426-921c-37a0389f4766-bundle\") pod \"c0b2b86a-1179-4426-921c-37a0389f4766\" (UID: \"c0b2b86a-1179-4426-921c-37a0389f4766\") "
Apr 17 21:41:02.655438 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:02.655307 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-shgbz\" (UniqueName: \"kubernetes.io/projected/c0b2b86a-1179-4426-921c-37a0389f4766-kube-api-access-shgbz\") pod \"c0b2b86a-1179-4426-921c-37a0389f4766\" (UID: \"c0b2b86a-1179-4426-921c-37a0389f4766\") "
Apr 17 21:41:02.655438 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:02.655345 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c0b2b86a-1179-4426-921c-37a0389f4766-util\") pod \"c0b2b86a-1179-4426-921c-37a0389f4766\" (UID: \"c0b2b86a-1179-4426-921c-37a0389f4766\") "
Apr 17 21:41:02.655861 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:02.655836 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0b2b86a-1179-4426-921c-37a0389f4766-bundle" (OuterVolumeSpecName: "bundle") pod "c0b2b86a-1179-4426-921c-37a0389f4766" (UID: "c0b2b86a-1179-4426-921c-37a0389f4766"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 21:41:02.657451 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:02.657427 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0b2b86a-1179-4426-921c-37a0389f4766-kube-api-access-shgbz" (OuterVolumeSpecName: "kube-api-access-shgbz") pod "c0b2b86a-1179-4426-921c-37a0389f4766" (UID: "c0b2b86a-1179-4426-921c-37a0389f4766"). InnerVolumeSpecName "kube-api-access-shgbz". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 21:41:02.659173 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:02.659155 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c0b2b86a-1179-4426-921c-37a0389f4766-util" (OuterVolumeSpecName: "util") pod "c0b2b86a-1179-4426-921c-37a0389f4766" (UID: "c0b2b86a-1179-4426-921c-37a0389f4766"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 21:41:02.756466 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:02.756389 2564 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c0b2b86a-1179-4426-921c-37a0389f4766-util\") on node \"ip-10-0-132-27.ec2.internal\" DevicePath \"\""
Apr 17 21:41:02.756466 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:02.756417 2564 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c0b2b86a-1179-4426-921c-37a0389f4766-bundle\") on node \"ip-10-0-132-27.ec2.internal\" DevicePath \"\""
Apr 17 21:41:02.756466 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:02.756427 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-shgbz\" (UniqueName: \"kubernetes.io/projected/c0b2b86a-1179-4426-921c-37a0389f4766-kube-api-access-shgbz\") on node \"ip-10-0-132-27.ec2.internal\" DevicePath \"\""
Apr 17 21:41:03.504152 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:03.504114 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vvc2g" event={"ID":"c0b2b86a-1179-4426-921c-37a0389f4766","Type":"ContainerDied","Data":"f8d6aa8290a74bb927990dfc19133feb2dc08d9c44884b1ec10ed825f630d024"}
Apr 17 21:41:03.504152 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:03.504146 2564 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f8d6aa8290a74bb927990dfc19133feb2dc08d9c44884b1ec10ed825f630d024"
Apr 17 21:41:03.504393 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:03.504166 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5vvc2g"
Apr 17 21:41:07.374867 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:07.374830 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-z9v7v"]
Apr 17 21:41:07.375320 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:07.375137 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c0b2b86a-1179-4426-921c-37a0389f4766" containerName="pull"
Apr 17 21:41:07.375320 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:07.375151 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0b2b86a-1179-4426-921c-37a0389f4766" containerName="pull"
Apr 17 21:41:07.375320 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:07.375160 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c0b2b86a-1179-4426-921c-37a0389f4766" containerName="util"
Apr 17 21:41:07.375320 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:07.375166 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0b2b86a-1179-4426-921c-37a0389f4766" containerName="util"
Apr 17 21:41:07.375320 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:07.375174 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c0b2b86a-1179-4426-921c-37a0389f4766" containerName="extract"
Apr 17 21:41:07.375320 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:07.375180 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0b2b86a-1179-4426-921c-37a0389f4766" containerName="extract"
Apr 17 21:41:07.375320 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:07.375223 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="c0b2b86a-1179-4426-921c-37a0389f4766" containerName="extract"
Apr 17 21:41:07.408471 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:07.408447 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-z9v7v"]
Apr 17 21:41:07.408636 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:07.408554 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-z9v7v"
Apr 17 21:41:07.411423 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:07.411400 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\""
Apr 17 21:41:07.411524 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:07.411459 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-qjpk6\""
Apr 17 21:41:07.411524 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:07.411477 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\""
Apr 17 21:41:07.490402 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:07.490364 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qj7t4\" (UniqueName: \"kubernetes.io/projected/3f025a86-2a64-485b-bb10-68c979cf03fa-kube-api-access-qj7t4\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-z9v7v\" (UID: \"3f025a86-2a64-485b-bb10-68c979cf03fa\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-z9v7v"
Apr 17 21:41:07.490558 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:07.490421 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3f025a86-2a64-485b-bb10-68c979cf03fa-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-z9v7v\" (UID: \"3f025a86-2a64-485b-bb10-68c979cf03fa\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-z9v7v"
Apr 17 21:41:07.590984 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:07.590955 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qj7t4\" (UniqueName: \"kubernetes.io/projected/3f025a86-2a64-485b-bb10-68c979cf03fa-kube-api-access-qj7t4\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-z9v7v\" (UID: \"3f025a86-2a64-485b-bb10-68c979cf03fa\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-z9v7v"
Apr 17 21:41:07.591163 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:07.591006 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3f025a86-2a64-485b-bb10-68c979cf03fa-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-z9v7v\" (UID: \"3f025a86-2a64-485b-bb10-68c979cf03fa\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-z9v7v"
Apr 17 21:41:07.591329 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:07.591311 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3f025a86-2a64-485b-bb10-68c979cf03fa-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-z9v7v\" (UID: \"3f025a86-2a64-485b-bb10-68c979cf03fa\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-z9v7v"
Apr 17 21:41:07.598682 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:07.598656 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qj7t4\" (UniqueName: \"kubernetes.io/projected/3f025a86-2a64-485b-bb10-68c979cf03fa-kube-api-access-qj7t4\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-z9v7v\" (UID: \"3f025a86-2a64-485b-bb10-68c979cf03fa\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-z9v7v"
Apr 17 21:41:07.717824 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:07.717751 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-z9v7v"
Apr 17 21:41:07.840587 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:07.840559 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-z9v7v"]
Apr 17 21:41:07.843699 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:41:07.843671 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f025a86_2a64_485b_bb10_68c979cf03fa.slice/crio-ed9fbf3a1064cf5f117467cf3540456baed2b1432e1b859734070ecf053c0f4f WatchSource:0}: Error finding container ed9fbf3a1064cf5f117467cf3540456baed2b1432e1b859734070ecf053c0f4f: Status 404 returned error can't find the container with id ed9fbf3a1064cf5f117467cf3540456baed2b1432e1b859734070ecf053c0f4f
Apr 17 21:41:08.518990 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:08.518950 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-z9v7v" event={"ID":"3f025a86-2a64-485b-bb10-68c979cf03fa","Type":"ContainerStarted","Data":"ed9fbf3a1064cf5f117467cf3540456baed2b1432e1b859734070ecf053c0f4f"}
Apr 17 21:41:11.529640 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:11.529586 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-z9v7v" event={"ID":"3f025a86-2a64-485b-bb10-68c979cf03fa","Type":"ContainerStarted","Data":"4185575eca798a55b88124d041248e05ae24ab92006806f63a460cda7baba316"}
Apr 17 21:41:11.555045 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:11.554979 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-z9v7v" podStartSLOduration=1.550724324 podStartE2EDuration="4.55496023s" podCreationTimestamp="2026-04-17 21:41:07 +0000 UTC" firstStartedPulling="2026-04-17 21:41:07.846021292 +0000 UTC m=+264.608780748" lastFinishedPulling="2026-04-17 21:41:10.850257191 +0000 UTC m=+267.613016654" observedRunningTime="2026-04-17 21:41:11.548454747 +0000 UTC m=+268.311214225" watchObservedRunningTime="2026-04-17 21:41:11.55496023 +0000 UTC m=+268.317719709"
Apr 17 21:41:13.695045 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:13.695008 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffl6jb"]
Apr 17 21:41:13.698451 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:13.698432 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffl6jb"
Apr 17 21:41:13.701054 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:13.701033 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\""
Apr 17 21:41:13.702321 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:13.702306 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-b2qj7\""
Apr 17 21:41:13.702407 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:13.702306 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\""
Apr 17 21:41:13.706216 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:13.706182 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffl6jb"]
Apr 17 21:41:13.739833 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:13.739799 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/647d76f3-6336-4954-af84-f3dbfd465f68-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffl6jb\" (UID: \"647d76f3-6336-4954-af84-f3dbfd465f68\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffl6jb"
Apr 17 21:41:13.740025 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:13.739856 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/647d76f3-6336-4954-af84-f3dbfd465f68-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffl6jb\" (UID: \"647d76f3-6336-4954-af84-f3dbfd465f68\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffl6jb"
Apr 17 21:41:13.740025 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:13.739882 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8tfx\" (UniqueName: \"kubernetes.io/projected/647d76f3-6336-4954-af84-f3dbfd465f68-kube-api-access-q8tfx\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffl6jb\" (UID: \"647d76f3-6336-4954-af84-f3dbfd465f68\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffl6jb"
Apr 17 21:41:13.841349 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:13.841315 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/647d76f3-6336-4954-af84-f3dbfd465f68-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffl6jb\" (UID: \"647d76f3-6336-4954-af84-f3dbfd465f68\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffl6jb"
Apr 17 21:41:13.841349 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:13.841352 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/647d76f3-6336-4954-af84-f3dbfd465f68-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffl6jb\" (UID: \"647d76f3-6336-4954-af84-f3dbfd465f68\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffl6jb"
Apr 17 21:41:13.841558 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:13.841373 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q8tfx\" (UniqueName: \"kubernetes.io/projected/647d76f3-6336-4954-af84-f3dbfd465f68-kube-api-access-q8tfx\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffl6jb\" (UID: \"647d76f3-6336-4954-af84-f3dbfd465f68\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffl6jb"
Apr 17 21:41:13.841727 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:13.841705 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/647d76f3-6336-4954-af84-f3dbfd465f68-util\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffl6jb\" (UID: \"647d76f3-6336-4954-af84-f3dbfd465f68\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffl6jb"
Apr 17 21:41:13.841786 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:13.841773 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/647d76f3-6336-4954-af84-f3dbfd465f68-bundle\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffl6jb\" (UID: \"647d76f3-6336-4954-af84-f3dbfd465f68\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffl6jb"
Apr 17 21:41:13.856544 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:13.856516 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8tfx\" (UniqueName: \"kubernetes.io/projected/647d76f3-6336-4954-af84-f3dbfd465f68-kube-api-access-q8tfx\") pod \"77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffl6jb\" (UID: \"647d76f3-6336-4954-af84-f3dbfd465f68\") " pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffl6jb"
Apr 17 21:41:14.007692 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:14.007584 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffl6jb"
Apr 17 21:41:14.121272 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:14.121241 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffl6jb"]
Apr 17 21:41:14.124391 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:41:14.124362 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod647d76f3_6336_4954_af84_f3dbfd465f68.slice/crio-2ba5f62eabaab1d7f60313bd17a89b38b6858f0b84b02d462d6e3799eff540ac WatchSource:0}: Error finding container 2ba5f62eabaab1d7f60313bd17a89b38b6858f0b84b02d462d6e3799eff540ac: Status 404 returned error can't find the container with id 2ba5f62eabaab1d7f60313bd17a89b38b6858f0b84b02d462d6e3799eff540ac
Apr 17 21:41:14.502583 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:14.502551 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-xc9bk"]
Apr 17 21:41:14.506021 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:14.505996 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-xc9bk"
Apr 17 21:41:14.508459 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:14.508435 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\""
Apr 17 21:41:14.508558 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:14.508438 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-fw9wx\""
Apr 17 21:41:14.509771 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:14.509756 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\""
Apr 17 21:41:14.512196 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:14.512174 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-xc9bk"]
Apr 17 21:41:14.538923 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:14.538895 2564 generic.go:358] "Generic (PLEG): container finished" podID="647d76f3-6336-4954-af84-f3dbfd465f68" containerID="4e5dcaac019b9d171985c0425909a1669c000870342e35170ffebd578bb015d6" exitCode=0
Apr 17 21:41:14.539066 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:14.538965 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffl6jb" event={"ID":"647d76f3-6336-4954-af84-f3dbfd465f68","Type":"ContainerDied","Data":"4e5dcaac019b9d171985c0425909a1669c000870342e35170ffebd578bb015d6"}
Apr 17 21:41:14.539066 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:14.538992 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffl6jb" event={"ID":"647d76f3-6336-4954-af84-f3dbfd465f68","Type":"ContainerStarted","Data":"2ba5f62eabaab1d7f60313bd17a89b38b6858f0b84b02d462d6e3799eff540ac"}
Apr 17 21:41:14.547789 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:14.547770 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2lhz\" (UniqueName: \"kubernetes.io/projected/34b4cdfc-e883-44ab-9d97-6608b982ebe0-kube-api-access-c2lhz\") pod \"cert-manager-webhook-597b96b99b-xc9bk\" (UID: \"34b4cdfc-e883-44ab-9d97-6608b982ebe0\") " pod="cert-manager/cert-manager-webhook-597b96b99b-xc9bk"
Apr 17 21:41:14.547876 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:14.547801 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/34b4cdfc-e883-44ab-9d97-6608b982ebe0-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-xc9bk\" (UID: \"34b4cdfc-e883-44ab-9d97-6608b982ebe0\") " pod="cert-manager/cert-manager-webhook-597b96b99b-xc9bk"
Apr 17 21:41:14.648747 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:14.648715 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c2lhz\" (UniqueName: \"kubernetes.io/projected/34b4cdfc-e883-44ab-9d97-6608b982ebe0-kube-api-access-c2lhz\") pod \"cert-manager-webhook-597b96b99b-xc9bk\" (UID: \"34b4cdfc-e883-44ab-9d97-6608b982ebe0\") " pod="cert-manager/cert-manager-webhook-597b96b99b-xc9bk"
Apr 17 21:41:14.648896 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:14.648763 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/34b4cdfc-e883-44ab-9d97-6608b982ebe0-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-xc9bk\" (UID: \"34b4cdfc-e883-44ab-9d97-6608b982ebe0\") " pod="cert-manager/cert-manager-webhook-597b96b99b-xc9bk"
Apr 17 21:41:14.656356 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:14.656333 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/34b4cdfc-e883-44ab-9d97-6608b982ebe0-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-xc9bk\" (UID: \"34b4cdfc-e883-44ab-9d97-6608b982ebe0\") " pod="cert-manager/cert-manager-webhook-597b96b99b-xc9bk"
Apr 17 21:41:14.656487 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:14.656472 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2lhz\" (UniqueName: \"kubernetes.io/projected/34b4cdfc-e883-44ab-9d97-6608b982ebe0-kube-api-access-c2lhz\") pod \"cert-manager-webhook-597b96b99b-xc9bk\" (UID: \"34b4cdfc-e883-44ab-9d97-6608b982ebe0\") " pod="cert-manager/cert-manager-webhook-597b96b99b-xc9bk"
Apr 17 21:41:14.827485 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:14.827409 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-xc9bk"
Apr 17 21:41:14.948438 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:14.948353 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-xc9bk"]
Apr 17 21:41:14.950868 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:41:14.950839 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34b4cdfc_e883_44ab_9d97_6608b982ebe0.slice/crio-0c9966737bfe3835e1da3479f43de8a14482f9062bc1a8977a5ff51e11216ecf WatchSource:0}: Error finding container 0c9966737bfe3835e1da3479f43de8a14482f9062bc1a8977a5ff51e11216ecf: Status 404 returned error can't find the container with id 0c9966737bfe3835e1da3479f43de8a14482f9062bc1a8977a5ff51e11216ecf
Apr 17 21:41:15.543560 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:15.543524 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-xc9bk" event={"ID":"34b4cdfc-e883-44ab-9d97-6608b982ebe0","Type":"ContainerStarted","Data":"0c9966737bfe3835e1da3479f43de8a14482f9062bc1a8977a5ff51e11216ecf"}
Apr 17 21:41:18.560233 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:18.560198 2564 generic.go:358]
"Generic (PLEG): container finished" podID="647d76f3-6336-4954-af84-f3dbfd465f68" containerID="cf7891edbebe427d6df6ab8655c807d91514b1855f1f85bb5771ba2048a4e292" exitCode=0 Apr 17 21:41:18.560682 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:18.560290 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffl6jb" event={"ID":"647d76f3-6336-4954-af84-f3dbfd465f68","Type":"ContainerDied","Data":"cf7891edbebe427d6df6ab8655c807d91514b1855f1f85bb5771ba2048a4e292"} Apr 17 21:41:18.561611 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:18.561576 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-xc9bk" event={"ID":"34b4cdfc-e883-44ab-9d97-6608b982ebe0","Type":"ContainerStarted","Data":"2290ed83bac5a52cf12e41fc325682fab01aab2e15bd3720bb0587fcb3bec8d6"} Apr 17 21:41:18.561788 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:18.561714 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-597b96b99b-xc9bk" Apr 17 21:41:18.590145 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:18.590107 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-597b96b99b-xc9bk" podStartSLOduration=1.559521994 podStartE2EDuration="4.590094927s" podCreationTimestamp="2026-04-17 21:41:14 +0000 UTC" firstStartedPulling="2026-04-17 21:41:14.952678813 +0000 UTC m=+271.715438269" lastFinishedPulling="2026-04-17 21:41:17.983251745 +0000 UTC m=+274.746011202" observedRunningTime="2026-04-17 21:41:18.58872649 +0000 UTC m=+275.351485967" watchObservedRunningTime="2026-04-17 21:41:18.590094927 +0000 UTC m=+275.352854405" Apr 17 21:41:19.567003 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:19.566967 2564 generic.go:358] "Generic (PLEG): container finished" podID="647d76f3-6336-4954-af84-f3dbfd465f68" 
containerID="c8895a4aa2f4dc7384b677bff76c94ff3f169c9bf95710267a4a938340488528" exitCode=0 Apr 17 21:41:19.567377 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:19.567052 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffl6jb" event={"ID":"647d76f3-6336-4954-af84-f3dbfd465f68","Type":"ContainerDied","Data":"c8895a4aa2f4dc7384b677bff76c94ff3f169c9bf95710267a4a938340488528"} Apr 17 21:41:20.693812 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:20.693786 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffl6jb" Apr 17 21:41:20.798062 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:20.798028 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8tfx\" (UniqueName: \"kubernetes.io/projected/647d76f3-6336-4954-af84-f3dbfd465f68-kube-api-access-q8tfx\") pod \"647d76f3-6336-4954-af84-f3dbfd465f68\" (UID: \"647d76f3-6336-4954-af84-f3dbfd465f68\") " Apr 17 21:41:20.798282 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:20.798100 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/647d76f3-6336-4954-af84-f3dbfd465f68-util\") pod \"647d76f3-6336-4954-af84-f3dbfd465f68\" (UID: \"647d76f3-6336-4954-af84-f3dbfd465f68\") " Apr 17 21:41:20.798282 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:20.798214 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/647d76f3-6336-4954-af84-f3dbfd465f68-bundle\") pod \"647d76f3-6336-4954-af84-f3dbfd465f68\" (UID: \"647d76f3-6336-4954-af84-f3dbfd465f68\") " Apr 17 21:41:20.798659 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:20.798631 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/647d76f3-6336-4954-af84-f3dbfd465f68-bundle" (OuterVolumeSpecName: "bundle") pod "647d76f3-6336-4954-af84-f3dbfd465f68" (UID: "647d76f3-6336-4954-af84-f3dbfd465f68"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 21:41:20.800197 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:20.800175 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/647d76f3-6336-4954-af84-f3dbfd465f68-kube-api-access-q8tfx" (OuterVolumeSpecName: "kube-api-access-q8tfx") pod "647d76f3-6336-4954-af84-f3dbfd465f68" (UID: "647d76f3-6336-4954-af84-f3dbfd465f68"). InnerVolumeSpecName "kube-api-access-q8tfx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 21:41:20.802389 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:20.802364 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/647d76f3-6336-4954-af84-f3dbfd465f68-util" (OuterVolumeSpecName: "util") pod "647d76f3-6336-4954-af84-f3dbfd465f68" (UID: "647d76f3-6336-4954-af84-f3dbfd465f68"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 21:41:20.898943 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:20.898908 2564 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/647d76f3-6336-4954-af84-f3dbfd465f68-bundle\") on node \"ip-10-0-132-27.ec2.internal\" DevicePath \"\"" Apr 17 21:41:20.898943 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:20.898939 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-q8tfx\" (UniqueName: \"kubernetes.io/projected/647d76f3-6336-4954-af84-f3dbfd465f68-kube-api-access-q8tfx\") on node \"ip-10-0-132-27.ec2.internal\" DevicePath \"\"" Apr 17 21:41:20.898943 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:20.898948 2564 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/647d76f3-6336-4954-af84-f3dbfd465f68-util\") on node \"ip-10-0-132-27.ec2.internal\" DevicePath \"\"" Apr 17 21:41:21.575117 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:21.575083 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffl6jb" event={"ID":"647d76f3-6336-4954-af84-f3dbfd465f68","Type":"ContainerDied","Data":"2ba5f62eabaab1d7f60313bd17a89b38b6858f0b84b02d462d6e3799eff540ac"} Apr 17 21:41:21.575117 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:21.575120 2564 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2ba5f62eabaab1d7f60313bd17a89b38b6858f0b84b02d462d6e3799eff540ac" Apr 17 21:41:21.575317 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:21.575125 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/77defbb6647882b321b46d16bdefe62633c3425354d3d93571a1d6a87ffl6jb" Apr 17 21:41:24.569680 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:24.569647 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-597b96b99b-xc9bk" Apr 17 21:41:26.141366 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:26.141330 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-82jnl"] Apr 17 21:41:26.141774 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:26.141603 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="647d76f3-6336-4954-af84-f3dbfd465f68" containerName="util" Apr 17 21:41:26.141774 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:26.141615 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="647d76f3-6336-4954-af84-f3dbfd465f68" containerName="util" Apr 17 21:41:26.141774 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:26.141625 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="647d76f3-6336-4954-af84-f3dbfd465f68" containerName="pull" Apr 17 21:41:26.141774 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:26.141630 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="647d76f3-6336-4954-af84-f3dbfd465f68" containerName="pull" Apr 17 21:41:26.141774 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:26.141638 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="647d76f3-6336-4954-af84-f3dbfd465f68" containerName="extract" Apr 17 21:41:26.141774 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:26.141644 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="647d76f3-6336-4954-af84-f3dbfd465f68" containerName="extract" Apr 17 21:41:26.141774 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:26.141684 2564 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="647d76f3-6336-4954-af84-f3dbfd465f68" containerName="extract" Apr 17 21:41:26.144428 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:26.144412 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-82jnl" Apr 17 21:41:26.147070 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:26.147042 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 17 21:41:26.147194 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:26.147048 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 17 21:41:26.147194 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:26.147150 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-kjqql\"" Apr 17 21:41:26.153332 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:26.153311 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-82jnl"] Apr 17 21:41:26.235465 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:26.235428 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bf52de7f-83de-4f9e-9f54-61cc669988ec-tmp\") pod \"openshift-lws-operator-bfc7f696d-82jnl\" (UID: \"bf52de7f-83de-4f9e-9f54-61cc669988ec\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-82jnl" Apr 17 21:41:26.235634 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:26.235494 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kk87l\" (UniqueName: \"kubernetes.io/projected/bf52de7f-83de-4f9e-9f54-61cc669988ec-kube-api-access-kk87l\") pod \"openshift-lws-operator-bfc7f696d-82jnl\" (UID: \"bf52de7f-83de-4f9e-9f54-61cc669988ec\") 
" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-82jnl" Apr 17 21:41:26.336710 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:26.336671 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kk87l\" (UniqueName: \"kubernetes.io/projected/bf52de7f-83de-4f9e-9f54-61cc669988ec-kube-api-access-kk87l\") pod \"openshift-lws-operator-bfc7f696d-82jnl\" (UID: \"bf52de7f-83de-4f9e-9f54-61cc669988ec\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-82jnl" Apr 17 21:41:26.336880 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:26.336733 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bf52de7f-83de-4f9e-9f54-61cc669988ec-tmp\") pod \"openshift-lws-operator-bfc7f696d-82jnl\" (UID: \"bf52de7f-83de-4f9e-9f54-61cc669988ec\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-82jnl" Apr 17 21:41:26.337074 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:26.337056 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bf52de7f-83de-4f9e-9f54-61cc669988ec-tmp\") pod \"openshift-lws-operator-bfc7f696d-82jnl\" (UID: \"bf52de7f-83de-4f9e-9f54-61cc669988ec\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-82jnl" Apr 17 21:41:26.344655 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:26.344629 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kk87l\" (UniqueName: \"kubernetes.io/projected/bf52de7f-83de-4f9e-9f54-61cc669988ec-kube-api-access-kk87l\") pod \"openshift-lws-operator-bfc7f696d-82jnl\" (UID: \"bf52de7f-83de-4f9e-9f54-61cc669988ec\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-82jnl" Apr 17 21:41:26.453229 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:26.453137 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-82jnl" Apr 17 21:41:26.584658 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:26.584618 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-82jnl"] Apr 17 21:41:26.588643 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:41:26.588611 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbf52de7f_83de_4f9e_9f54_61cc669988ec.slice/crio-db308b8ae42a6e56bb3138dcd74ea0cd69e814144530d10fbe9279607ed30737 WatchSource:0}: Error finding container db308b8ae42a6e56bb3138dcd74ea0cd69e814144530d10fbe9279607ed30737: Status 404 returned error can't find the container with id db308b8ae42a6e56bb3138dcd74ea0cd69e814144530d10fbe9279607ed30737 Apr 17 21:41:26.596679 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:26.596649 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-82jnl" event={"ID":"bf52de7f-83de-4f9e-9f54-61cc669988ec","Type":"ContainerStarted","Data":"db308b8ae42a6e56bb3138dcd74ea0cd69e814144530d10fbe9279607ed30737"} Apr 17 21:41:29.608203 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:29.608166 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-82jnl" event={"ID":"bf52de7f-83de-4f9e-9f54-61cc669988ec","Type":"ContainerStarted","Data":"f05a2fc957a752661ad7053dce08aa26ee7ca2d550af5f7de520fec9474352f7"} Apr 17 21:41:29.624356 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:29.624308 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-82jnl" podStartSLOduration=1.408877592 podStartE2EDuration="3.624296539s" podCreationTimestamp="2026-04-17 21:41:26 +0000 UTC" firstStartedPulling="2026-04-17 21:41:26.59011181 +0000 UTC m=+283.352871265" 
lastFinishedPulling="2026-04-17 21:41:28.805530753 +0000 UTC m=+285.568290212" observedRunningTime="2026-04-17 21:41:29.622404447 +0000 UTC m=+286.385163925" watchObservedRunningTime="2026-04-17 21:41:29.624296539 +0000 UTC m=+286.387056016" Apr 17 21:41:31.925059 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:31.925025 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5pvln9"] Apr 17 21:41:31.928443 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:31.928425 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5pvln9" Apr 17 21:41:31.930960 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:31.930937 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-b2qj7\"" Apr 17 21:41:31.931215 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:31.931200 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 17 21:41:31.932608 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:31.932566 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 17 21:41:31.934657 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:31.934633 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5pvln9"] Apr 17 21:41:32.085401 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:32.085370 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m6bhf\" (UniqueName: \"kubernetes.io/projected/a5308c43-32a5-47f7-9798-e12533c6d795-kube-api-access-m6bhf\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5pvln9\" (UID: 
\"a5308c43-32a5-47f7-9798-e12533c6d795\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5pvln9" Apr 17 21:41:32.085401 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:32.085409 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a5308c43-32a5-47f7-9798-e12533c6d795-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5pvln9\" (UID: \"a5308c43-32a5-47f7-9798-e12533c6d795\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5pvln9" Apr 17 21:41:32.085574 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:32.085432 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a5308c43-32a5-47f7-9798-e12533c6d795-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5pvln9\" (UID: \"a5308c43-32a5-47f7-9798-e12533c6d795\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5pvln9" Apr 17 21:41:32.186010 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:32.185930 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m6bhf\" (UniqueName: \"kubernetes.io/projected/a5308c43-32a5-47f7-9798-e12533c6d795-kube-api-access-m6bhf\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5pvln9\" (UID: \"a5308c43-32a5-47f7-9798-e12533c6d795\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5pvln9" Apr 17 21:41:32.186010 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:32.185972 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a5308c43-32a5-47f7-9798-e12533c6d795-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5pvln9\" (UID: \"a5308c43-32a5-47f7-9798-e12533c6d795\") " 
pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5pvln9" Apr 17 21:41:32.186190 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:32.186023 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a5308c43-32a5-47f7-9798-e12533c6d795-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5pvln9\" (UID: \"a5308c43-32a5-47f7-9798-e12533c6d795\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5pvln9" Apr 17 21:41:32.186342 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:32.186314 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a5308c43-32a5-47f7-9798-e12533c6d795-bundle\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5pvln9\" (UID: \"a5308c43-32a5-47f7-9798-e12533c6d795\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5pvln9" Apr 17 21:41:32.186398 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:32.186382 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a5308c43-32a5-47f7-9798-e12533c6d795-util\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5pvln9\" (UID: \"a5308c43-32a5-47f7-9798-e12533c6d795\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5pvln9" Apr 17 21:41:32.193200 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:32.193176 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m6bhf\" (UniqueName: \"kubernetes.io/projected/a5308c43-32a5-47f7-9798-e12533c6d795-kube-api-access-m6bhf\") pod \"3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5pvln9\" (UID: \"a5308c43-32a5-47f7-9798-e12533c6d795\") " pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5pvln9" Apr 17 
21:41:32.239132 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:32.239104 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5pvln9" Apr 17 21:41:32.355166 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:32.355134 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5pvln9"] Apr 17 21:41:32.357973 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:41:32.357933 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda5308c43_32a5_47f7_9798_e12533c6d795.slice/crio-233c07a4907755207747b89660dabf4b5b7de02e95da291ad517ce2254fa6fdc WatchSource:0}: Error finding container 233c07a4907755207747b89660dabf4b5b7de02e95da291ad517ce2254fa6fdc: Status 404 returned error can't find the container with id 233c07a4907755207747b89660dabf4b5b7de02e95da291ad517ce2254fa6fdc Apr 17 21:41:32.619537 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:32.619502 2564 generic.go:358] "Generic (PLEG): container finished" podID="a5308c43-32a5-47f7-9798-e12533c6d795" containerID="0c070b3ce692e1b63ce51a10da06e78ab6f7669b7076f9045dd7285e4720cc85" exitCode=0 Apr 17 21:41:32.619698 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:32.619567 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5pvln9" event={"ID":"a5308c43-32a5-47f7-9798-e12533c6d795","Type":"ContainerDied","Data":"0c070b3ce692e1b63ce51a10da06e78ab6f7669b7076f9045dd7285e4720cc85"} Apr 17 21:41:32.619698 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:32.619608 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5pvln9" 
event={"ID":"a5308c43-32a5-47f7-9798-e12533c6d795","Type":"ContainerStarted","Data":"233c07a4907755207747b89660dabf4b5b7de02e95da291ad517ce2254fa6fdc"} Apr 17 21:41:33.284826 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:33.284803 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-759f64656b-77zgj"] Apr 17 21:41:33.287798 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:33.287782 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-77zgj" Apr 17 21:41:33.290195 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:33.290177 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-j72ht\"" Apr 17 21:41:33.294152 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:33.294129 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-77zgj"] Apr 17 21:41:33.395415 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:33.395389 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfbrf\" (UniqueName: \"kubernetes.io/projected/3b28c75a-ba0c-4bc7-8880-2fad9d2e8945-kube-api-access-cfbrf\") pod \"cert-manager-759f64656b-77zgj\" (UID: \"3b28c75a-ba0c-4bc7-8880-2fad9d2e8945\") " pod="cert-manager/cert-manager-759f64656b-77zgj" Apr 17 21:41:33.395558 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:33.395431 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3b28c75a-ba0c-4bc7-8880-2fad9d2e8945-bound-sa-token\") pod \"cert-manager-759f64656b-77zgj\" (UID: \"3b28c75a-ba0c-4bc7-8880-2fad9d2e8945\") " pod="cert-manager/cert-manager-759f64656b-77zgj" Apr 17 21:41:33.496395 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:33.496316 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-cfbrf\" (UniqueName: \"kubernetes.io/projected/3b28c75a-ba0c-4bc7-8880-2fad9d2e8945-kube-api-access-cfbrf\") pod \"cert-manager-759f64656b-77zgj\" (UID: \"3b28c75a-ba0c-4bc7-8880-2fad9d2e8945\") " pod="cert-manager/cert-manager-759f64656b-77zgj" Apr 17 21:41:33.496395 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:33.496360 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3b28c75a-ba0c-4bc7-8880-2fad9d2e8945-bound-sa-token\") pod \"cert-manager-759f64656b-77zgj\" (UID: \"3b28c75a-ba0c-4bc7-8880-2fad9d2e8945\") " pod="cert-manager/cert-manager-759f64656b-77zgj" Apr 17 21:41:33.504163 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:33.504129 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/3b28c75a-ba0c-4bc7-8880-2fad9d2e8945-bound-sa-token\") pod \"cert-manager-759f64656b-77zgj\" (UID: \"3b28c75a-ba0c-4bc7-8880-2fad9d2e8945\") " pod="cert-manager/cert-manager-759f64656b-77zgj" Apr 17 21:41:33.504278 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:33.504241 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfbrf\" (UniqueName: \"kubernetes.io/projected/3b28c75a-ba0c-4bc7-8880-2fad9d2e8945-kube-api-access-cfbrf\") pod \"cert-manager-759f64656b-77zgj\" (UID: \"3b28c75a-ba0c-4bc7-8880-2fad9d2e8945\") " pod="cert-manager/cert-manager-759f64656b-77zgj" Apr 17 21:41:33.623895 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:33.623861 2564 generic.go:358] "Generic (PLEG): container finished" podID="a5308c43-32a5-47f7-9798-e12533c6d795" containerID="4acfb93fdec636066e3cedb5d27e2fcdc44edceee9e26328a50d498c2aa8c8b8" exitCode=0 Apr 17 21:41:33.624059 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:33.623953 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5pvln9" 
event={"ID":"a5308c43-32a5-47f7-9798-e12533c6d795","Type":"ContainerDied","Data":"4acfb93fdec636066e3cedb5d27e2fcdc44edceee9e26328a50d498c2aa8c8b8"} Apr 17 21:41:33.637248 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:33.637231 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-77zgj" Apr 17 21:41:33.767072 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:33.767046 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-77zgj"] Apr 17 21:41:33.769454 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:41:33.769424 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3b28c75a_ba0c_4bc7_8880_2fad9d2e8945.slice/crio-f3e979150d08d7d1fa670c02a78e5b78b0a6bd29f8dc03631d834886022096da WatchSource:0}: Error finding container f3e979150d08d7d1fa670c02a78e5b78b0a6bd29f8dc03631d834886022096da: Status 404 returned error can't find the container with id f3e979150d08d7d1fa670c02a78e5b78b0a6bd29f8dc03631d834886022096da Apr 17 21:41:34.629194 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:34.629155 2564 generic.go:358] "Generic (PLEG): container finished" podID="a5308c43-32a5-47f7-9798-e12533c6d795" containerID="fac4d8be83afe5da0b055ff94243d9d090ef8c36a75695356d1110bef3ac3d3a" exitCode=0 Apr 17 21:41:34.629638 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:34.629240 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5pvln9" event={"ID":"a5308c43-32a5-47f7-9798-e12533c6d795","Type":"ContainerDied","Data":"fac4d8be83afe5da0b055ff94243d9d090ef8c36a75695356d1110bef3ac3d3a"} Apr 17 21:41:34.630517 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:34.630493 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-77zgj" 
event={"ID":"3b28c75a-ba0c-4bc7-8880-2fad9d2e8945","Type":"ContainerStarted","Data":"09d08f16b1b8f2f67ef0439c862054d9d783048fdf42734d8135782d440a11a3"} Apr 17 21:41:34.630640 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:34.630524 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-77zgj" event={"ID":"3b28c75a-ba0c-4bc7-8880-2fad9d2e8945","Type":"ContainerStarted","Data":"f3e979150d08d7d1fa670c02a78e5b78b0a6bd29f8dc03631d834886022096da"} Apr 17 21:41:34.657796 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:34.657755 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-759f64656b-77zgj" podStartSLOduration=1.657741804 podStartE2EDuration="1.657741804s" podCreationTimestamp="2026-04-17 21:41:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 21:41:34.65624867 +0000 UTC m=+291.419008147" watchObservedRunningTime="2026-04-17 21:41:34.657741804 +0000 UTC m=+291.420501282" Apr 17 21:41:35.753802 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:35.753780 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5pvln9" Apr 17 21:41:35.914972 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:35.914881 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a5308c43-32a5-47f7-9798-e12533c6d795-bundle\") pod \"a5308c43-32a5-47f7-9798-e12533c6d795\" (UID: \"a5308c43-32a5-47f7-9798-e12533c6d795\") " Apr 17 21:41:35.915137 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:35.914978 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m6bhf\" (UniqueName: \"kubernetes.io/projected/a5308c43-32a5-47f7-9798-e12533c6d795-kube-api-access-m6bhf\") pod \"a5308c43-32a5-47f7-9798-e12533c6d795\" (UID: \"a5308c43-32a5-47f7-9798-e12533c6d795\") " Apr 17 21:41:35.915137 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:35.915013 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a5308c43-32a5-47f7-9798-e12533c6d795-util\") pod \"a5308c43-32a5-47f7-9798-e12533c6d795\" (UID: \"a5308c43-32a5-47f7-9798-e12533c6d795\") " Apr 17 21:41:35.915695 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:35.915636 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5308c43-32a5-47f7-9798-e12533c6d795-bundle" (OuterVolumeSpecName: "bundle") pod "a5308c43-32a5-47f7-9798-e12533c6d795" (UID: "a5308c43-32a5-47f7-9798-e12533c6d795"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 21:41:35.917150 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:35.917126 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5308c43-32a5-47f7-9798-e12533c6d795-kube-api-access-m6bhf" (OuterVolumeSpecName: "kube-api-access-m6bhf") pod "a5308c43-32a5-47f7-9798-e12533c6d795" (UID: "a5308c43-32a5-47f7-9798-e12533c6d795"). InnerVolumeSpecName "kube-api-access-m6bhf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 21:41:35.919798 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:35.919775 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5308c43-32a5-47f7-9798-e12533c6d795-util" (OuterVolumeSpecName: "util") pod "a5308c43-32a5-47f7-9798-e12533c6d795" (UID: "a5308c43-32a5-47f7-9798-e12533c6d795"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 21:41:36.015721 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:36.015685 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-m6bhf\" (UniqueName: \"kubernetes.io/projected/a5308c43-32a5-47f7-9798-e12533c6d795-kube-api-access-m6bhf\") on node \"ip-10-0-132-27.ec2.internal\" DevicePath \"\"" Apr 17 21:41:36.015721 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:36.015716 2564 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/a5308c43-32a5-47f7-9798-e12533c6d795-util\") on node \"ip-10-0-132-27.ec2.internal\" DevicePath \"\"" Apr 17 21:41:36.015721 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:36.015726 2564 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/a5308c43-32a5-47f7-9798-e12533c6d795-bundle\") on node \"ip-10-0-132-27.ec2.internal\" DevicePath \"\"" Apr 17 21:41:36.638110 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:36.638078 2564 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5pvln9" event={"ID":"a5308c43-32a5-47f7-9798-e12533c6d795","Type":"ContainerDied","Data":"233c07a4907755207747b89660dabf4b5b7de02e95da291ad517ce2254fa6fdc"} Apr 17 21:41:36.638110 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:36.638106 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3081035efbc3f76b8ae1b663abbd2a6bc5b4896fe94818011b247755c5pvln9" Apr 17 21:41:36.638313 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:36.638112 2564 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="233c07a4907755207747b89660dabf4b5b7de02e95da291ad517ce2254fa6fdc" Apr 17 21:41:43.631412 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:43.631379 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwgmn_60e0c58d-3db8-4433-a617-00082bd25488/ovn-acl-logging/0.log" Apr 17 21:41:43.632003 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:43.631985 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwgmn_60e0c58d-3db8-4433-a617-00082bd25488/ovn-acl-logging/0.log" Apr 17 21:41:43.639306 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:43.639284 2564 kubelet.go:1628] "Image garbage collection succeeded" Apr 17 21:41:45.929257 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:45.929223 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9n6rz9"] Apr 17 21:41:45.933178 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:45.929620 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a5308c43-32a5-47f7-9798-e12533c6d795" containerName="pull" Apr 17 21:41:45.933178 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:45.929649 2564 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a5308c43-32a5-47f7-9798-e12533c6d795" containerName="pull" Apr 17 21:41:45.933178 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:45.929667 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a5308c43-32a5-47f7-9798-e12533c6d795" containerName="extract" Apr 17 21:41:45.933178 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:45.929675 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5308c43-32a5-47f7-9798-e12533c6d795" containerName="extract" Apr 17 21:41:45.933178 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:45.929689 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a5308c43-32a5-47f7-9798-e12533c6d795" containerName="util" Apr 17 21:41:45.933178 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:45.929697 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5308c43-32a5-47f7-9798-e12533c6d795" containerName="util" Apr 17 21:41:45.933178 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:45.929753 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="a5308c43-32a5-47f7-9798-e12533c6d795" containerName="extract" Apr 17 21:41:45.934655 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:45.934636 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9n6rz9" Apr 17 21:41:45.937263 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:45.937240 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 17 21:41:45.937369 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:45.937240 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 17 21:41:45.938503 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:45.938483 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-b2qj7\"" Apr 17 21:41:45.940355 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:45.940334 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9n6rz9"] Apr 17 21:41:46.095035 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:46.095001 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c9eeec88-2982-483c-aab4-d120c66eb552-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9n6rz9\" (UID: \"c9eeec88-2982-483c-aab4-d120c66eb552\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9n6rz9" Apr 17 21:41:46.095220 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:46.095047 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kwd7\" (UniqueName: \"kubernetes.io/projected/c9eeec88-2982-483c-aab4-d120c66eb552-kube-api-access-9kwd7\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9n6rz9\" (UID: \"c9eeec88-2982-483c-aab4-d120c66eb552\") " 
pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9n6rz9" Apr 17 21:41:46.095220 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:46.095072 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c9eeec88-2982-483c-aab4-d120c66eb552-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9n6rz9\" (UID: \"c9eeec88-2982-483c-aab4-d120c66eb552\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9n6rz9" Apr 17 21:41:46.195781 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:46.195696 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c9eeec88-2982-483c-aab4-d120c66eb552-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9n6rz9\" (UID: \"c9eeec88-2982-483c-aab4-d120c66eb552\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9n6rz9" Apr 17 21:41:46.195781 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:46.195751 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9kwd7\" (UniqueName: \"kubernetes.io/projected/c9eeec88-2982-483c-aab4-d120c66eb552-kube-api-access-9kwd7\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9n6rz9\" (UID: \"c9eeec88-2982-483c-aab4-d120c66eb552\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9n6rz9" Apr 17 21:41:46.195968 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:46.195795 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c9eeec88-2982-483c-aab4-d120c66eb552-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9n6rz9\" (UID: \"c9eeec88-2982-483c-aab4-d120c66eb552\") " 
pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9n6rz9" Apr 17 21:41:46.196101 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:46.196078 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c9eeec88-2982-483c-aab4-d120c66eb552-bundle\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9n6rz9\" (UID: \"c9eeec88-2982-483c-aab4-d120c66eb552\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9n6rz9" Apr 17 21:41:46.196223 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:46.196120 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c9eeec88-2982-483c-aab4-d120c66eb552-util\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9n6rz9\" (UID: \"c9eeec88-2982-483c-aab4-d120c66eb552\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9n6rz9" Apr 17 21:41:46.208671 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:46.208642 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kwd7\" (UniqueName: \"kubernetes.io/projected/c9eeec88-2982-483c-aab4-d120c66eb552-kube-api-access-9kwd7\") pod \"f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9n6rz9\" (UID: \"c9eeec88-2982-483c-aab4-d120c66eb552\") " pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9n6rz9" Apr 17 21:41:46.244587 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:46.244552 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9n6rz9" Apr 17 21:41:46.366411 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:46.366385 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9n6rz9"] Apr 17 21:41:46.368569 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:41:46.368533 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9eeec88_2982_483c_aab4_d120c66eb552.slice/crio-c268038b5fba4da57bc5fe58ec4b0873aec1e1d407386f89fb6c565c9b443481 WatchSource:0}: Error finding container c268038b5fba4da57bc5fe58ec4b0873aec1e1d407386f89fb6c565c9b443481: Status 404 returned error can't find the container with id c268038b5fba4da57bc5fe58ec4b0873aec1e1d407386f89fb6c565c9b443481 Apr 17 21:41:46.370478 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:46.370459 2564 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 21:41:46.656131 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:46.656097 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6bfddf7b9f-btzv9"] Apr 17 21:41:46.659922 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:46.659901 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-6bfddf7b9f-btzv9" Apr 17 21:41:46.662759 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:46.662737 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 17 21:41:46.663073 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:46.663057 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 17 21:41:46.663164 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:46.663136 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-lx4wk\"" Apr 17 21:41:46.663164 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:46.663135 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 17 21:41:46.663408 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:46.663388 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 17 21:41:46.668250 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:46.668222 2564 generic.go:358] "Generic (PLEG): container finished" podID="c9eeec88-2982-483c-aab4-d120c66eb552" containerID="a001e827bbeb7a997c872fe9709c1f20511c43b6df47b2b94b0087cfbb445263" exitCode=0 Apr 17 21:41:46.668343 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:46.668303 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9n6rz9" event={"ID":"c9eeec88-2982-483c-aab4-d120c66eb552","Type":"ContainerDied","Data":"a001e827bbeb7a997c872fe9709c1f20511c43b6df47b2b94b0087cfbb445263"} Apr 17 21:41:46.668392 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:46.668339 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9n6rz9" event={"ID":"c9eeec88-2982-483c-aab4-d120c66eb552","Type":"ContainerStarted","Data":"c268038b5fba4da57bc5fe58ec4b0873aec1e1d407386f89fb6c565c9b443481"} Apr 17 21:41:46.673047 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:46.673026 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6bfddf7b9f-btzv9"] Apr 17 21:41:46.802417 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:46.802307 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c75pc\" (UniqueName: \"kubernetes.io/projected/6d0fa348-40af-4bc6-a265-834e1ef67d2b-kube-api-access-c75pc\") pod \"opendatahub-operator-controller-manager-6bfddf7b9f-btzv9\" (UID: \"6d0fa348-40af-4bc6-a265-834e1ef67d2b\") " pod="opendatahub/opendatahub-operator-controller-manager-6bfddf7b9f-btzv9" Apr 17 21:41:46.802417 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:46.802410 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6d0fa348-40af-4bc6-a265-834e1ef67d2b-webhook-cert\") pod \"opendatahub-operator-controller-manager-6bfddf7b9f-btzv9\" (UID: \"6d0fa348-40af-4bc6-a265-834e1ef67d2b\") " pod="opendatahub/opendatahub-operator-controller-manager-6bfddf7b9f-btzv9" Apr 17 21:41:46.802695 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:46.802456 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6d0fa348-40af-4bc6-a265-834e1ef67d2b-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6bfddf7b9f-btzv9\" (UID: \"6d0fa348-40af-4bc6-a265-834e1ef67d2b\") " pod="opendatahub/opendatahub-operator-controller-manager-6bfddf7b9f-btzv9" Apr 17 21:41:46.903676 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:46.903645 2564 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/6d0fa348-40af-4bc6-a265-834e1ef67d2b-webhook-cert\") pod \"opendatahub-operator-controller-manager-6bfddf7b9f-btzv9\" (UID: \"6d0fa348-40af-4bc6-a265-834e1ef67d2b\") " pod="opendatahub/opendatahub-operator-controller-manager-6bfddf7b9f-btzv9" Apr 17 21:41:46.903868 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:46.903698 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6d0fa348-40af-4bc6-a265-834e1ef67d2b-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6bfddf7b9f-btzv9\" (UID: \"6d0fa348-40af-4bc6-a265-834e1ef67d2b\") " pod="opendatahub/opendatahub-operator-controller-manager-6bfddf7b9f-btzv9" Apr 17 21:41:46.903868 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:46.903774 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c75pc\" (UniqueName: \"kubernetes.io/projected/6d0fa348-40af-4bc6-a265-834e1ef67d2b-kube-api-access-c75pc\") pod \"opendatahub-operator-controller-manager-6bfddf7b9f-btzv9\" (UID: \"6d0fa348-40af-4bc6-a265-834e1ef67d2b\") " pod="opendatahub/opendatahub-operator-controller-manager-6bfddf7b9f-btzv9" Apr 17 21:41:46.906093 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:46.906070 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/6d0fa348-40af-4bc6-a265-834e1ef67d2b-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6bfddf7b9f-btzv9\" (UID: \"6d0fa348-40af-4bc6-a265-834e1ef67d2b\") " pod="opendatahub/opendatahub-operator-controller-manager-6bfddf7b9f-btzv9" Apr 17 21:41:46.906212 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:46.906124 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/6d0fa348-40af-4bc6-a265-834e1ef67d2b-webhook-cert\") pod \"opendatahub-operator-controller-manager-6bfddf7b9f-btzv9\" (UID: \"6d0fa348-40af-4bc6-a265-834e1ef67d2b\") " pod="opendatahub/opendatahub-operator-controller-manager-6bfddf7b9f-btzv9" Apr 17 21:41:46.911250 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:46.911231 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c75pc\" (UniqueName: \"kubernetes.io/projected/6d0fa348-40af-4bc6-a265-834e1ef67d2b-kube-api-access-c75pc\") pod \"opendatahub-operator-controller-manager-6bfddf7b9f-btzv9\" (UID: \"6d0fa348-40af-4bc6-a265-834e1ef67d2b\") " pod="opendatahub/opendatahub-operator-controller-manager-6bfddf7b9f-btzv9" Apr 17 21:41:46.970907 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:46.970871 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-6bfddf7b9f-btzv9" Apr 17 21:41:47.093165 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:47.093131 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6bfddf7b9f-btzv9"] Apr 17 21:41:47.095836 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:41:47.095805 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d0fa348_40af_4bc6_a265_834e1ef67d2b.slice/crio-90597ddb83749bad160148f9923a45f423c5b18f699954b6f1735a0ac657ea9f WatchSource:0}: Error finding container 90597ddb83749bad160148f9923a45f423c5b18f699954b6f1735a0ac657ea9f: Status 404 returned error can't find the container with id 90597ddb83749bad160148f9923a45f423c5b18f699954b6f1735a0ac657ea9f Apr 17 21:41:47.674619 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:47.674508 2564 generic.go:358] "Generic (PLEG): container finished" podID="c9eeec88-2982-483c-aab4-d120c66eb552" 
containerID="088e424f0eb93493e52e48075b85973dfa341a98ee28da94df0fa4769e9d7b05" exitCode=0 Apr 17 21:41:47.674778 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:47.674635 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9n6rz9" event={"ID":"c9eeec88-2982-483c-aab4-d120c66eb552","Type":"ContainerDied","Data":"088e424f0eb93493e52e48075b85973dfa341a98ee28da94df0fa4769e9d7b05"} Apr 17 21:41:47.676105 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:47.676073 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-6bfddf7b9f-btzv9" event={"ID":"6d0fa348-40af-4bc6-a265-834e1ef67d2b","Type":"ContainerStarted","Data":"90597ddb83749bad160148f9923a45f423c5b18f699954b6f1735a0ac657ea9f"} Apr 17 21:41:48.681497 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:48.681464 2564 generic.go:358] "Generic (PLEG): container finished" podID="c9eeec88-2982-483c-aab4-d120c66eb552" containerID="c8f8f476afb1aab5f9c4782c3476cbce50d5d19aa871a2ac2a5669089f126744" exitCode=0 Apr 17 21:41:48.681994 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:48.681572 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9n6rz9" event={"ID":"c9eeec88-2982-483c-aab4-d120c66eb552","Type":"ContainerDied","Data":"c8f8f476afb1aab5f9c4782c3476cbce50d5d19aa871a2ac2a5669089f126744"} Apr 17 21:41:49.686478 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:49.686439 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-6bfddf7b9f-btzv9" event={"ID":"6d0fa348-40af-4bc6-a265-834e1ef67d2b","Type":"ContainerStarted","Data":"5eae4c6a964466699958657c7e38749db1f35a93072cb63cf61d9328c7f88153"} Apr 17 21:41:49.686971 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:49.686715 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not 
ready" pod="opendatahub/opendatahub-operator-controller-manager-6bfddf7b9f-btzv9" Apr 17 21:41:49.711823 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:49.711759 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-6bfddf7b9f-btzv9" podStartSLOduration=1.283160422 podStartE2EDuration="3.711744527s" podCreationTimestamp="2026-04-17 21:41:46 +0000 UTC" firstStartedPulling="2026-04-17 21:41:47.097545577 +0000 UTC m=+303.860305033" lastFinishedPulling="2026-04-17 21:41:49.52612968 +0000 UTC m=+306.288889138" observedRunningTime="2026-04-17 21:41:49.70921917 +0000 UTC m=+306.471978648" watchObservedRunningTime="2026-04-17 21:41:49.711744527 +0000 UTC m=+306.474504002" Apr 17 21:41:49.803387 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:49.803363 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9n6rz9" Apr 17 21:41:49.931524 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:49.931494 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c9eeec88-2982-483c-aab4-d120c66eb552-bundle\") pod \"c9eeec88-2982-483c-aab4-d120c66eb552\" (UID: \"c9eeec88-2982-483c-aab4-d120c66eb552\") " Apr 17 21:41:49.931760 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:49.931550 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9kwd7\" (UniqueName: \"kubernetes.io/projected/c9eeec88-2982-483c-aab4-d120c66eb552-kube-api-access-9kwd7\") pod \"c9eeec88-2982-483c-aab4-d120c66eb552\" (UID: \"c9eeec88-2982-483c-aab4-d120c66eb552\") " Apr 17 21:41:49.931760 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:49.931587 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/c9eeec88-2982-483c-aab4-d120c66eb552-util\") pod \"c9eeec88-2982-483c-aab4-d120c66eb552\" (UID: \"c9eeec88-2982-483c-aab4-d120c66eb552\") " Apr 17 21:41:49.932369 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:49.932342 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9eeec88-2982-483c-aab4-d120c66eb552-bundle" (OuterVolumeSpecName: "bundle") pod "c9eeec88-2982-483c-aab4-d120c66eb552" (UID: "c9eeec88-2982-483c-aab4-d120c66eb552"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 21:41:49.933749 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:49.933717 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9eeec88-2982-483c-aab4-d120c66eb552-kube-api-access-9kwd7" (OuterVolumeSpecName: "kube-api-access-9kwd7") pod "c9eeec88-2982-483c-aab4-d120c66eb552" (UID: "c9eeec88-2982-483c-aab4-d120c66eb552"). InnerVolumeSpecName "kube-api-access-9kwd7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 21:41:49.937408 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:49.937383 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9eeec88-2982-483c-aab4-d120c66eb552-util" (OuterVolumeSpecName: "util") pod "c9eeec88-2982-483c-aab4-d120c66eb552" (UID: "c9eeec88-2982-483c-aab4-d120c66eb552"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 21:41:50.032301 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:50.032272 2564 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c9eeec88-2982-483c-aab4-d120c66eb552-bundle\") on node \"ip-10-0-132-27.ec2.internal\" DevicePath \"\"" Apr 17 21:41:50.032301 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:50.032297 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9kwd7\" (UniqueName: \"kubernetes.io/projected/c9eeec88-2982-483c-aab4-d120c66eb552-kube-api-access-9kwd7\") on node \"ip-10-0-132-27.ec2.internal\" DevicePath \"\"" Apr 17 21:41:50.032301 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:50.032306 2564 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c9eeec88-2982-483c-aab4-d120c66eb552-util\") on node \"ip-10-0-132-27.ec2.internal\" DevicePath \"\"" Apr 17 21:41:50.691207 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:50.691179 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9n6rz9" Apr 17 21:41:50.691636 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:50.691213 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f77c1a82ade775f7608969957d57ee0b8db93eeada9825bd6f7f7156c9n6rz9" event={"ID":"c9eeec88-2982-483c-aab4-d120c66eb552","Type":"ContainerDied","Data":"c268038b5fba4da57bc5fe58ec4b0873aec1e1d407386f89fb6c565c9b443481"} Apr 17 21:41:50.691636 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:50.691239 2564 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c268038b5fba4da57bc5fe58ec4b0873aec1e1d407386f89fb6c565c9b443481" Apr 17 21:41:56.458239 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:56.458201 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-64dbd89fbc-44bpv"] Apr 17 21:41:56.458718 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:56.458610 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c9eeec88-2982-483c-aab4-d120c66eb552" containerName="pull" Apr 17 21:41:56.458718 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:56.458628 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9eeec88-2982-483c-aab4-d120c66eb552" containerName="pull" Apr 17 21:41:56.458718 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:56.458652 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c9eeec88-2982-483c-aab4-d120c66eb552" containerName="extract" Apr 17 21:41:56.458718 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:56.458661 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9eeec88-2982-483c-aab4-d120c66eb552" containerName="extract" Apr 17 21:41:56.458718 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:56.458683 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c9eeec88-2982-483c-aab4-d120c66eb552" 
containerName="util" Apr 17 21:41:56.458718 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:56.458691 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9eeec88-2982-483c-aab4-d120c66eb552" containerName="util" Apr 17 21:41:56.459013 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:56.458760 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="c9eeec88-2982-483c-aab4-d120c66eb552" containerName="extract" Apr 17 21:41:56.461711 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:56.461690 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-64dbd89fbc-44bpv" Apr 17 21:41:56.468202 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:56.468178 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 17 21:41:56.468202 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:56.468195 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-4bhq5\"" Apr 17 21:41:56.468337 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:56.468210 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 17 21:41:56.468337 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:56.468222 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 17 21:41:56.486139 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:56.486113 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/8d9ad950-c742-40fe-9770-d484bfdea043-metrics-cert\") pod \"lws-controller-manager-64dbd89fbc-44bpv\" (UID: \"8d9ad950-c742-40fe-9770-d484bfdea043\") " pod="openshift-lws-operator/lws-controller-manager-64dbd89fbc-44bpv" Apr 17 
21:41:56.486237 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:56.486148 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8d9ad950-c742-40fe-9770-d484bfdea043-cert\") pod \"lws-controller-manager-64dbd89fbc-44bpv\" (UID: \"8d9ad950-c742-40fe-9770-d484bfdea043\") " pod="openshift-lws-operator/lws-controller-manager-64dbd89fbc-44bpv" Apr 17 21:41:56.486237 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:56.486212 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xlp9\" (UniqueName: \"kubernetes.io/projected/8d9ad950-c742-40fe-9770-d484bfdea043-kube-api-access-7xlp9\") pod \"lws-controller-manager-64dbd89fbc-44bpv\" (UID: \"8d9ad950-c742-40fe-9770-d484bfdea043\") " pod="openshift-lws-operator/lws-controller-manager-64dbd89fbc-44bpv" Apr 17 21:41:56.486314 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:56.486235 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/8d9ad950-c742-40fe-9770-d484bfdea043-manager-config\") pod \"lws-controller-manager-64dbd89fbc-44bpv\" (UID: \"8d9ad950-c742-40fe-9770-d484bfdea043\") " pod="openshift-lws-operator/lws-controller-manager-64dbd89fbc-44bpv" Apr 17 21:41:56.490397 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:56.490371 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-64dbd89fbc-44bpv"] Apr 17 21:41:56.587013 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:56.586975 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7xlp9\" (UniqueName: \"kubernetes.io/projected/8d9ad950-c742-40fe-9770-d484bfdea043-kube-api-access-7xlp9\") pod \"lws-controller-manager-64dbd89fbc-44bpv\" (UID: \"8d9ad950-c742-40fe-9770-d484bfdea043\") " 
pod="openshift-lws-operator/lws-controller-manager-64dbd89fbc-44bpv" Apr 17 21:41:56.587013 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:56.587013 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/8d9ad950-c742-40fe-9770-d484bfdea043-manager-config\") pod \"lws-controller-manager-64dbd89fbc-44bpv\" (UID: \"8d9ad950-c742-40fe-9770-d484bfdea043\") " pod="openshift-lws-operator/lws-controller-manager-64dbd89fbc-44bpv" Apr 17 21:41:56.587256 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:56.587053 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/8d9ad950-c742-40fe-9770-d484bfdea043-metrics-cert\") pod \"lws-controller-manager-64dbd89fbc-44bpv\" (UID: \"8d9ad950-c742-40fe-9770-d484bfdea043\") " pod="openshift-lws-operator/lws-controller-manager-64dbd89fbc-44bpv" Apr 17 21:41:56.587256 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:56.587076 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8d9ad950-c742-40fe-9770-d484bfdea043-cert\") pod \"lws-controller-manager-64dbd89fbc-44bpv\" (UID: \"8d9ad950-c742-40fe-9770-d484bfdea043\") " pod="openshift-lws-operator/lws-controller-manager-64dbd89fbc-44bpv" Apr 17 21:41:56.587735 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:56.587707 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/8d9ad950-c742-40fe-9770-d484bfdea043-manager-config\") pod \"lws-controller-manager-64dbd89fbc-44bpv\" (UID: \"8d9ad950-c742-40fe-9770-d484bfdea043\") " pod="openshift-lws-operator/lws-controller-manager-64dbd89fbc-44bpv" Apr 17 21:41:56.589571 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:56.589556 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/8d9ad950-c742-40fe-9770-d484bfdea043-metrics-cert\") pod \"lws-controller-manager-64dbd89fbc-44bpv\" (UID: \"8d9ad950-c742-40fe-9770-d484bfdea043\") " pod="openshift-lws-operator/lws-controller-manager-64dbd89fbc-44bpv" Apr 17 21:41:56.589659 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:56.589615 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8d9ad950-c742-40fe-9770-d484bfdea043-cert\") pod \"lws-controller-manager-64dbd89fbc-44bpv\" (UID: \"8d9ad950-c742-40fe-9770-d484bfdea043\") " pod="openshift-lws-operator/lws-controller-manager-64dbd89fbc-44bpv" Apr 17 21:41:56.600473 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:56.600452 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xlp9\" (UniqueName: \"kubernetes.io/projected/8d9ad950-c742-40fe-9770-d484bfdea043-kube-api-access-7xlp9\") pod \"lws-controller-manager-64dbd89fbc-44bpv\" (UID: \"8d9ad950-c742-40fe-9770-d484bfdea043\") " pod="openshift-lws-operator/lws-controller-manager-64dbd89fbc-44bpv" Apr 17 21:41:56.770433 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:56.770346 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-64dbd89fbc-44bpv" Apr 17 21:41:56.887706 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:56.887678 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-64dbd89fbc-44bpv"] Apr 17 21:41:56.890025 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:41:56.889993 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d9ad950_c742_40fe_9770_d484bfdea043.slice/crio-4daed4d7735581ef29337cf06a3d1a080cd4395dba6d154fd6ad520478eedb55 WatchSource:0}: Error finding container 4daed4d7735581ef29337cf06a3d1a080cd4395dba6d154fd6ad520478eedb55: Status 404 returned error can't find the container with id 4daed4d7735581ef29337cf06a3d1a080cd4395dba6d154fd6ad520478eedb55 Apr 17 21:41:57.714129 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:57.714085 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-64dbd89fbc-44bpv" event={"ID":"8d9ad950-c742-40fe-9770-d484bfdea043","Type":"ContainerStarted","Data":"4daed4d7735581ef29337cf06a3d1a080cd4395dba6d154fd6ad520478eedb55"} Apr 17 21:41:58.718780 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:58.718693 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-64dbd89fbc-44bpv" event={"ID":"8d9ad950-c742-40fe-9770-d484bfdea043","Type":"ContainerStarted","Data":"30e5b16953faa7031d7addc2e2d4167fc6e656e539231969f05d80efde6e2477"} Apr 17 21:41:58.719201 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:58.718922 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-64dbd89fbc-44bpv" Apr 17 21:41:58.737234 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:41:58.737191 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-lws-operator/lws-controller-manager-64dbd89fbc-44bpv" podStartSLOduration=1.19945684 podStartE2EDuration="2.737178839s" podCreationTimestamp="2026-04-17 21:41:56 +0000 UTC" firstStartedPulling="2026-04-17 21:41:56.891827164 +0000 UTC m=+313.654586620" lastFinishedPulling="2026-04-17 21:41:58.429549151 +0000 UTC m=+315.192308619" observedRunningTime="2026-04-17 21:41:58.735858164 +0000 UTC m=+315.498617643" watchObservedRunningTime="2026-04-17 21:41:58.737178839 +0000 UTC m=+315.499938317" Apr 17 21:42:00.693744 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:00.693715 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-6bfddf7b9f-btzv9" Apr 17 21:42:07.134439 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:07.134398 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835b544c"] Apr 17 21:42:07.138682 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:07.138660 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835b544c" Apr 17 21:42:07.141693 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:07.141668 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 17 21:42:07.141788 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:07.141728 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-b2qj7\"" Apr 17 21:42:07.142753 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:07.142734 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 17 21:42:07.146256 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:07.146235 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835b544c"] Apr 17 21:42:07.157654 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:07.157614 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-std4z\" (UniqueName: \"kubernetes.io/projected/f189a27e-bc2b-4684-bb19-54de56f94452-kube-api-access-std4z\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835b544c\" (UID: \"f189a27e-bc2b-4684-bb19-54de56f94452\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835b544c" Apr 17 21:42:07.157755 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:07.157731 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f189a27e-bc2b-4684-bb19-54de56f94452-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835b544c\" (UID: \"f189a27e-bc2b-4684-bb19-54de56f94452\") " 
pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835b544c" Apr 17 21:42:07.157814 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:07.157781 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f189a27e-bc2b-4684-bb19-54de56f94452-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835b544c\" (UID: \"f189a27e-bc2b-4684-bb19-54de56f94452\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835b544c" Apr 17 21:42:07.258459 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:07.258426 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f189a27e-bc2b-4684-bb19-54de56f94452-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835b544c\" (UID: \"f189a27e-bc2b-4684-bb19-54de56f94452\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835b544c" Apr 17 21:42:07.258663 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:07.258488 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f189a27e-bc2b-4684-bb19-54de56f94452-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835b544c\" (UID: \"f189a27e-bc2b-4684-bb19-54de56f94452\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835b544c" Apr 17 21:42:07.258663 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:07.258538 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-std4z\" (UniqueName: \"kubernetes.io/projected/f189a27e-bc2b-4684-bb19-54de56f94452-kube-api-access-std4z\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835b544c\" (UID: \"f189a27e-bc2b-4684-bb19-54de56f94452\") " 
pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835b544c" Apr 17 21:42:07.258824 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:07.258804 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f189a27e-bc2b-4684-bb19-54de56f94452-bundle\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835b544c\" (UID: \"f189a27e-bc2b-4684-bb19-54de56f94452\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835b544c" Apr 17 21:42:07.258906 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:07.258832 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f189a27e-bc2b-4684-bb19-54de56f94452-util\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835b544c\" (UID: \"f189a27e-bc2b-4684-bb19-54de56f94452\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835b544c" Apr 17 21:42:07.267134 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:07.267113 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-std4z\" (UniqueName: \"kubernetes.io/projected/f189a27e-bc2b-4684-bb19-54de56f94452-kube-api-access-std4z\") pod \"4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835b544c\" (UID: \"f189a27e-bc2b-4684-bb19-54de56f94452\") " pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835b544c" Apr 17 21:42:07.448803 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:07.448721 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835b544c" Apr 17 21:42:07.566275 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:07.566247 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835b544c"] Apr 17 21:42:07.568358 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:42:07.568331 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf189a27e_bc2b_4684_bb19_54de56f94452.slice/crio-805f9f3060716db66acf251c2664979ab1009b00d47d0a40a732d35a2415c5f4 WatchSource:0}: Error finding container 805f9f3060716db66acf251c2664979ab1009b00d47d0a40a732d35a2415c5f4: Status 404 returned error can't find the container with id 805f9f3060716db66acf251c2664979ab1009b00d47d0a40a732d35a2415c5f4 Apr 17 21:42:07.750055 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:07.749974 2564 generic.go:358] "Generic (PLEG): container finished" podID="f189a27e-bc2b-4684-bb19-54de56f94452" containerID="e8c496a8828e5d3163bf554b34aa9a50d5dd46eeac9a701b5dbd43d5546ed3af" exitCode=0 Apr 17 21:42:07.750055 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:07.750028 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835b544c" event={"ID":"f189a27e-bc2b-4684-bb19-54de56f94452","Type":"ContainerDied","Data":"e8c496a8828e5d3163bf554b34aa9a50d5dd46eeac9a701b5dbd43d5546ed3af"} Apr 17 21:42:07.750055 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:07.750051 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835b544c" event={"ID":"f189a27e-bc2b-4684-bb19-54de56f94452","Type":"ContainerStarted","Data":"805f9f3060716db66acf251c2664979ab1009b00d47d0a40a732d35a2415c5f4"} Apr 17 21:42:08.754685 ip-10-0-132-27 kubenswrapper[2564]: I0417 
21:42:08.754650 2564 generic.go:358] "Generic (PLEG): container finished" podID="f189a27e-bc2b-4684-bb19-54de56f94452" containerID="8aa2a79d8cc1413267e1b8870913f2bd7e29233d45a3b8107d25f0a6afdb2822" exitCode=0 Apr 17 21:42:08.755102 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:08.754739 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835b544c" event={"ID":"f189a27e-bc2b-4684-bb19-54de56f94452","Type":"ContainerDied","Data":"8aa2a79d8cc1413267e1b8870913f2bd7e29233d45a3b8107d25f0a6afdb2822"} Apr 17 21:42:09.724170 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:09.724141 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-64dbd89fbc-44bpv" Apr 17 21:42:09.759549 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:09.759518 2564 generic.go:358] "Generic (PLEG): container finished" podID="f189a27e-bc2b-4684-bb19-54de56f94452" containerID="cd81d167448e0051d4e3e5f26f88ea8a2746ef0a9e3e84ffbc6de684e7e21a4b" exitCode=0 Apr 17 21:42:09.759995 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:09.759608 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835b544c" event={"ID":"f189a27e-bc2b-4684-bb19-54de56f94452","Type":"ContainerDied","Data":"cd81d167448e0051d4e3e5f26f88ea8a2746ef0a9e3e84ffbc6de684e7e21a4b"} Apr 17 21:42:10.881248 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:10.881226 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835b544c" Apr 17 21:42:10.986723 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:10.986691 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f189a27e-bc2b-4684-bb19-54de56f94452-bundle\") pod \"f189a27e-bc2b-4684-bb19-54de56f94452\" (UID: \"f189a27e-bc2b-4684-bb19-54de56f94452\") " Apr 17 21:42:10.986892 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:10.986746 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-std4z\" (UniqueName: \"kubernetes.io/projected/f189a27e-bc2b-4684-bb19-54de56f94452-kube-api-access-std4z\") pod \"f189a27e-bc2b-4684-bb19-54de56f94452\" (UID: \"f189a27e-bc2b-4684-bb19-54de56f94452\") " Apr 17 21:42:10.986892 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:10.986779 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f189a27e-bc2b-4684-bb19-54de56f94452-util\") pod \"f189a27e-bc2b-4684-bb19-54de56f94452\" (UID: \"f189a27e-bc2b-4684-bb19-54de56f94452\") " Apr 17 21:42:10.987662 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:10.987632 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f189a27e-bc2b-4684-bb19-54de56f94452-bundle" (OuterVolumeSpecName: "bundle") pod "f189a27e-bc2b-4684-bb19-54de56f94452" (UID: "f189a27e-bc2b-4684-bb19-54de56f94452"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 21:42:10.988827 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:10.988808 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f189a27e-bc2b-4684-bb19-54de56f94452-kube-api-access-std4z" (OuterVolumeSpecName: "kube-api-access-std4z") pod "f189a27e-bc2b-4684-bb19-54de56f94452" (UID: "f189a27e-bc2b-4684-bb19-54de56f94452"). InnerVolumeSpecName "kube-api-access-std4z". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 21:42:10.994275 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:10.994249 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f189a27e-bc2b-4684-bb19-54de56f94452-util" (OuterVolumeSpecName: "util") pod "f189a27e-bc2b-4684-bb19-54de56f94452" (UID: "f189a27e-bc2b-4684-bb19-54de56f94452"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 21:42:11.087402 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:11.087318 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-std4z\" (UniqueName: \"kubernetes.io/projected/f189a27e-bc2b-4684-bb19-54de56f94452-kube-api-access-std4z\") on node \"ip-10-0-132-27.ec2.internal\" DevicePath \"\"" Apr 17 21:42:11.087402 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:11.087346 2564 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f189a27e-bc2b-4684-bb19-54de56f94452-util\") on node \"ip-10-0-132-27.ec2.internal\" DevicePath \"\"" Apr 17 21:42:11.087402 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:11.087356 2564 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f189a27e-bc2b-4684-bb19-54de56f94452-bundle\") on node \"ip-10-0-132-27.ec2.internal\" DevicePath \"\"" Apr 17 21:42:11.768486 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:11.768460 2564 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835b544c" Apr 17 21:42:11.768486 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:11.768466 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4c892a6d2d8a57acbc49427d3a7d24e253bab25be2c607ef405d2c4835b544c" event={"ID":"f189a27e-bc2b-4684-bb19-54de56f94452","Type":"ContainerDied","Data":"805f9f3060716db66acf251c2664979ab1009b00d47d0a40a732d35a2415c5f4"} Apr 17 21:42:11.768486 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:11.768492 2564 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="805f9f3060716db66acf251c2664979ab1009b00d47d0a40a732d35a2415c5f4" Apr 17 21:42:16.823882 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:16.823849 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2jxnzv"] Apr 17 21:42:16.824329 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:16.824138 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f189a27e-bc2b-4684-bb19-54de56f94452" containerName="util" Apr 17 21:42:16.824329 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:16.824149 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="f189a27e-bc2b-4684-bb19-54de56f94452" containerName="util" Apr 17 21:42:16.824329 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:16.824164 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f189a27e-bc2b-4684-bb19-54de56f94452" containerName="extract" Apr 17 21:42:16.824329 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:16.824169 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="f189a27e-bc2b-4684-bb19-54de56f94452" containerName="extract" Apr 17 21:42:16.824329 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:16.824180 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="f189a27e-bc2b-4684-bb19-54de56f94452" containerName="pull" Apr 17 21:42:16.824329 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:16.824186 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="f189a27e-bc2b-4684-bb19-54de56f94452" containerName="pull" Apr 17 21:42:16.824329 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:16.824238 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="f189a27e-bc2b-4684-bb19-54de56f94452" containerName="extract" Apr 17 21:42:16.828572 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:16.828555 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2jxnzv" Apr 17 21:42:16.831469 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:16.831437 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 17 21:42:16.832204 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:16.832180 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 17 21:42:16.832357 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:16.832238 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-b2qj7\"" Apr 17 21:42:16.837954 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:16.837929 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2jxnzv"] Apr 17 21:42:16.932133 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:16.932092 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3bd4536b-e8cd-4f25-b01c-9a7bd0883826-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2jxnzv\" (UID: \"3bd4536b-e8cd-4f25-b01c-9a7bd0883826\") " 
pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2jxnzv" Apr 17 21:42:16.932133 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:16.932138 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3bd4536b-e8cd-4f25-b01c-9a7bd0883826-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2jxnzv\" (UID: \"3bd4536b-e8cd-4f25-b01c-9a7bd0883826\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2jxnzv" Apr 17 21:42:16.932354 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:16.932160 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxmd6\" (UniqueName: \"kubernetes.io/projected/3bd4536b-e8cd-4f25-b01c-9a7bd0883826-kube-api-access-mxmd6\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2jxnzv\" (UID: \"3bd4536b-e8cd-4f25-b01c-9a7bd0883826\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2jxnzv" Apr 17 21:42:17.033466 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:17.033412 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3bd4536b-e8cd-4f25-b01c-9a7bd0883826-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2jxnzv\" (UID: \"3bd4536b-e8cd-4f25-b01c-9a7bd0883826\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2jxnzv" Apr 17 21:42:17.033466 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:17.033470 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3bd4536b-e8cd-4f25-b01c-9a7bd0883826-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2jxnzv\" (UID: \"3bd4536b-e8cd-4f25-b01c-9a7bd0883826\") " 
pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2jxnzv" Apr 17 21:42:17.033765 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:17.033621 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mxmd6\" (UniqueName: \"kubernetes.io/projected/3bd4536b-e8cd-4f25-b01c-9a7bd0883826-kube-api-access-mxmd6\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2jxnzv\" (UID: \"3bd4536b-e8cd-4f25-b01c-9a7bd0883826\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2jxnzv" Apr 17 21:42:17.033829 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:17.033814 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3bd4536b-e8cd-4f25-b01c-9a7bd0883826-bundle\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2jxnzv\" (UID: \"3bd4536b-e8cd-4f25-b01c-9a7bd0883826\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2jxnzv" Apr 17 21:42:17.033867 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:17.033846 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3bd4536b-e8cd-4f25-b01c-9a7bd0883826-util\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2jxnzv\" (UID: \"3bd4536b-e8cd-4f25-b01c-9a7bd0883826\") " pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2jxnzv" Apr 17 21:42:17.042844 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:17.042814 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxmd6\" (UniqueName: \"kubernetes.io/projected/3bd4536b-e8cd-4f25-b01c-9a7bd0883826-kube-api-access-mxmd6\") pod \"d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2jxnzv\" (UID: \"3bd4536b-e8cd-4f25-b01c-9a7bd0883826\") " 
pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2jxnzv" Apr 17 21:42:17.138946 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:17.138909 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2jxnzv" Apr 17 21:42:17.306059 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:17.306026 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2jxnzv"] Apr 17 21:42:17.311346 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:42:17.311316 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3bd4536b_e8cd_4f25_b01c_9a7bd0883826.slice/crio-302d60e0ec64807df9e140727c37e332d6945de45fae7d45c7be408175261e06 WatchSource:0}: Error finding container 302d60e0ec64807df9e140727c37e332d6945de45fae7d45c7be408175261e06: Status 404 returned error can't find the container with id 302d60e0ec64807df9e140727c37e332d6945de45fae7d45c7be408175261e06 Apr 17 21:42:17.790934 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:17.790836 2564 generic.go:358] "Generic (PLEG): container finished" podID="3bd4536b-e8cd-4f25-b01c-9a7bd0883826" containerID="edb54b6bebfebae9a4d49c0a2e0894196c6d66597387a9b35e44051766384015" exitCode=0 Apr 17 21:42:17.791116 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:17.790928 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2jxnzv" event={"ID":"3bd4536b-e8cd-4f25-b01c-9a7bd0883826","Type":"ContainerDied","Data":"edb54b6bebfebae9a4d49c0a2e0894196c6d66597387a9b35e44051766384015"} Apr 17 21:42:17.791116 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:17.790970 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2jxnzv" event={"ID":"3bd4536b-e8cd-4f25-b01c-9a7bd0883826","Type":"ContainerStarted","Data":"302d60e0ec64807df9e140727c37e332d6945de45fae7d45c7be408175261e06"} Apr 17 21:42:19.799496 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:19.799461 2564 generic.go:358] "Generic (PLEG): container finished" podID="3bd4536b-e8cd-4f25-b01c-9a7bd0883826" containerID="34ffbf549895ffe0257baf53d495e7a80e2a930ea6f25e1864011ebe6729dd35" exitCode=0 Apr 17 21:42:19.799959 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:19.799515 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2jxnzv" event={"ID":"3bd4536b-e8cd-4f25-b01c-9a7bd0883826","Type":"ContainerDied","Data":"34ffbf549895ffe0257baf53d495e7a80e2a930ea6f25e1864011ebe6729dd35"} Apr 17 21:42:20.805656 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:20.805624 2564 generic.go:358] "Generic (PLEG): container finished" podID="3bd4536b-e8cd-4f25-b01c-9a7bd0883826" containerID="9fdd1053a79ae7d5b19a6496045265bc9eff4406d3874c67dba46af732c14e86" exitCode=0 Apr 17 21:42:20.806056 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:20.805691 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2jxnzv" event={"ID":"3bd4536b-e8cd-4f25-b01c-9a7bd0883826","Type":"ContainerDied","Data":"9fdd1053a79ae7d5b19a6496045265bc9eff4406d3874c67dba46af732c14e86"} Apr 17 21:42:21.932642 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:21.932616 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2jxnzv" Apr 17 21:42:21.973548 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:21.973524 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3bd4536b-e8cd-4f25-b01c-9a7bd0883826-bundle\") pod \"3bd4536b-e8cd-4f25-b01c-9a7bd0883826\" (UID: \"3bd4536b-e8cd-4f25-b01c-9a7bd0883826\") " Apr 17 21:42:21.973697 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:21.973560 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxmd6\" (UniqueName: \"kubernetes.io/projected/3bd4536b-e8cd-4f25-b01c-9a7bd0883826-kube-api-access-mxmd6\") pod \"3bd4536b-e8cd-4f25-b01c-9a7bd0883826\" (UID: \"3bd4536b-e8cd-4f25-b01c-9a7bd0883826\") " Apr 17 21:42:21.973697 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:21.973660 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3bd4536b-e8cd-4f25-b01c-9a7bd0883826-util\") pod \"3bd4536b-e8cd-4f25-b01c-9a7bd0883826\" (UID: \"3bd4536b-e8cd-4f25-b01c-9a7bd0883826\") " Apr 17 21:42:21.974475 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:21.974443 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3bd4536b-e8cd-4f25-b01c-9a7bd0883826-bundle" (OuterVolumeSpecName: "bundle") pod "3bd4536b-e8cd-4f25-b01c-9a7bd0883826" (UID: "3bd4536b-e8cd-4f25-b01c-9a7bd0883826"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 21:42:21.975558 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:21.975540 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bd4536b-e8cd-4f25-b01c-9a7bd0883826-kube-api-access-mxmd6" (OuterVolumeSpecName: "kube-api-access-mxmd6") pod "3bd4536b-e8cd-4f25-b01c-9a7bd0883826" (UID: "3bd4536b-e8cd-4f25-b01c-9a7bd0883826"). InnerVolumeSpecName "kube-api-access-mxmd6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 21:42:21.978835 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:21.978800 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3bd4536b-e8cd-4f25-b01c-9a7bd0883826-util" (OuterVolumeSpecName: "util") pod "3bd4536b-e8cd-4f25-b01c-9a7bd0883826" (UID: "3bd4536b-e8cd-4f25-b01c-9a7bd0883826"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 21:42:22.074239 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:22.074162 2564 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3bd4536b-e8cd-4f25-b01c-9a7bd0883826-util\") on node \"ip-10-0-132-27.ec2.internal\" DevicePath \"\"" Apr 17 21:42:22.074239 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:22.074188 2564 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3bd4536b-e8cd-4f25-b01c-9a7bd0883826-bundle\") on node \"ip-10-0-132-27.ec2.internal\" DevicePath \"\"" Apr 17 21:42:22.074239 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:22.074200 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mxmd6\" (UniqueName: \"kubernetes.io/projected/3bd4536b-e8cd-4f25-b01c-9a7bd0883826-kube-api-access-mxmd6\") on node \"ip-10-0-132-27.ec2.internal\" DevicePath \"\"" Apr 17 21:42:22.815229 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:22.815199 2564 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2jxnzv" event={"ID":"3bd4536b-e8cd-4f25-b01c-9a7bd0883826","Type":"ContainerDied","Data":"302d60e0ec64807df9e140727c37e332d6945de45fae7d45c7be408175261e06"} Apr 17 21:42:22.815229 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:22.815217 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d813cbca2f2d4ae5bb9c3e9ca6bc9dc97fa22f4f10cc797dd3b2c1f0c2jxnzv" Apr 17 21:42:22.815229 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:22.815234 2564 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="302d60e0ec64807df9e140727c37e332d6945de45fae7d45c7be408175261e06" Apr 17 21:42:22.849945 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:42:22.849916 2564 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3bd4536b_e8cd_4f25_b01c_9a7bd0883826.slice\": RecentStats: unable to find data in memory cache]" Apr 17 21:42:32.507468 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:32.507428 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfhr4ds"] Apr 17 21:42:32.507955 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:32.507759 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3bd4536b-e8cd-4f25-b01c-9a7bd0883826" containerName="extract" Apr 17 21:42:32.507955 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:32.507772 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bd4536b-e8cd-4f25-b01c-9a7bd0883826" containerName="extract" Apr 17 21:42:32.507955 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:32.507786 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3bd4536b-e8cd-4f25-b01c-9a7bd0883826" containerName="pull" Apr 17 
21:42:32.507955 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:32.507792 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bd4536b-e8cd-4f25-b01c-9a7bd0883826" containerName="pull" Apr 17 21:42:32.507955 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:32.507804 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3bd4536b-e8cd-4f25-b01c-9a7bd0883826" containerName="util" Apr 17 21:42:32.507955 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:32.507810 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bd4536b-e8cd-4f25-b01c-9a7bd0883826" containerName="util" Apr 17 21:42:32.507955 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:32.507858 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="3bd4536b-e8cd-4f25-b01c-9a7bd0883826" containerName="extract" Apr 17 21:42:32.511751 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:32.511727 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfhr4ds" Apr 17 21:42:32.514498 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:32.514323 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"data-science-gateway-data-science-gateway-class-dockercfg-p2pg9\"" Apr 17 21:42:32.514498 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:32.514353 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 17 21:42:32.514498 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:32.514382 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 17 21:42:32.514498 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:32.514396 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 17 21:42:32.520738 ip-10-0-132-27 
kubenswrapper[2564]: I0417 21:42:32.520714 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfhr4ds"] Apr 17 21:42:32.661505 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:32.661469 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/6d13e4eb-609d-477a-832d-fdb2831db5a9-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfhr4ds\" (UID: \"6d13e4eb-609d-477a-832d-fdb2831db5a9\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfhr4ds" Apr 17 21:42:32.661698 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:32.661511 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/6d13e4eb-609d-477a-832d-fdb2831db5a9-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfhr4ds\" (UID: \"6d13e4eb-609d-477a-832d-fdb2831db5a9\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfhr4ds" Apr 17 21:42:32.661698 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:32.661580 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/6d13e4eb-609d-477a-832d-fdb2831db5a9-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfhr4ds\" (UID: \"6d13e4eb-609d-477a-832d-fdb2831db5a9\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfhr4ds" Apr 17 21:42:32.661698 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:32.661613 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/6d13e4eb-609d-477a-832d-fdb2831db5a9-istio-token\") pod 
\"data-science-gateway-data-science-gateway-class-5cb8b776cfhr4ds\" (UID: \"6d13e4eb-609d-477a-832d-fdb2831db5a9\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfhr4ds" Apr 17 21:42:32.661698 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:32.661656 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/6d13e4eb-609d-477a-832d-fdb2831db5a9-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfhr4ds\" (UID: \"6d13e4eb-609d-477a-832d-fdb2831db5a9\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfhr4ds" Apr 17 21:42:32.661698 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:32.661690 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/6d13e4eb-609d-477a-832d-fdb2831db5a9-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfhr4ds\" (UID: \"6d13e4eb-609d-477a-832d-fdb2831db5a9\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfhr4ds" Apr 17 21:42:32.661905 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:32.661713 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/6d13e4eb-609d-477a-832d-fdb2831db5a9-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfhr4ds\" (UID: \"6d13e4eb-609d-477a-832d-fdb2831db5a9\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfhr4ds" Apr 17 21:42:32.661905 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:32.661744 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/6d13e4eb-609d-477a-832d-fdb2831db5a9-istio-data\") 
pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfhr4ds\" (UID: \"6d13e4eb-609d-477a-832d-fdb2831db5a9\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfhr4ds" Apr 17 21:42:32.661905 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:32.661779 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfsxd\" (UniqueName: \"kubernetes.io/projected/6d13e4eb-609d-477a-832d-fdb2831db5a9-kube-api-access-wfsxd\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfhr4ds\" (UID: \"6d13e4eb-609d-477a-832d-fdb2831db5a9\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfhr4ds" Apr 17 21:42:32.762364 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:32.762274 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/6d13e4eb-609d-477a-832d-fdb2831db5a9-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfhr4ds\" (UID: \"6d13e4eb-609d-477a-832d-fdb2831db5a9\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfhr4ds" Apr 17 21:42:32.762364 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:32.762321 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/6d13e4eb-609d-477a-832d-fdb2831db5a9-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfhr4ds\" (UID: \"6d13e4eb-609d-477a-832d-fdb2831db5a9\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfhr4ds" Apr 17 21:42:32.762615 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:32.762470 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wfsxd\" (UniqueName: \"kubernetes.io/projected/6d13e4eb-609d-477a-832d-fdb2831db5a9-kube-api-access-wfsxd\") pod 
\"data-science-gateway-data-science-gateway-class-5cb8b776cfhr4ds\" (UID: \"6d13e4eb-609d-477a-832d-fdb2831db5a9\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfhr4ds" Apr 17 21:42:32.762615 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:32.762517 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/6d13e4eb-609d-477a-832d-fdb2831db5a9-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfhr4ds\" (UID: \"6d13e4eb-609d-477a-832d-fdb2831db5a9\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfhr4ds" Apr 17 21:42:32.762615 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:32.762552 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/6d13e4eb-609d-477a-832d-fdb2831db5a9-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfhr4ds\" (UID: \"6d13e4eb-609d-477a-832d-fdb2831db5a9\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfhr4ds" Apr 17 21:42:32.762615 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:32.762609 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/6d13e4eb-609d-477a-832d-fdb2831db5a9-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfhr4ds\" (UID: \"6d13e4eb-609d-477a-832d-fdb2831db5a9\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfhr4ds" Apr 17 21:42:32.762829 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:32.762634 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/6d13e4eb-609d-477a-832d-fdb2831db5a9-istio-token\") pod 
\"data-science-gateway-data-science-gateway-class-5cb8b776cfhr4ds\" (UID: \"6d13e4eb-609d-477a-832d-fdb2831db5a9\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfhr4ds" Apr 17 21:42:32.762829 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:32.762666 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/6d13e4eb-609d-477a-832d-fdb2831db5a9-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfhr4ds\" (UID: \"6d13e4eb-609d-477a-832d-fdb2831db5a9\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfhr4ds" Apr 17 21:42:32.762829 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:32.762706 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/6d13e4eb-609d-477a-832d-fdb2831db5a9-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfhr4ds\" (UID: \"6d13e4eb-609d-477a-832d-fdb2831db5a9\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfhr4ds" Apr 17 21:42:32.762829 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:32.762725 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/6d13e4eb-609d-477a-832d-fdb2831db5a9-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfhr4ds\" (UID: \"6d13e4eb-609d-477a-832d-fdb2831db5a9\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfhr4ds" Apr 17 21:42:32.762829 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:32.762812 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/6d13e4eb-609d-477a-832d-fdb2831db5a9-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfhr4ds\" (UID: 
\"6d13e4eb-609d-477a-832d-fdb2831db5a9\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfhr4ds" Apr 17 21:42:32.763068 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:32.762909 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/6d13e4eb-609d-477a-832d-fdb2831db5a9-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfhr4ds\" (UID: \"6d13e4eb-609d-477a-832d-fdb2831db5a9\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfhr4ds" Apr 17 21:42:32.763125 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:32.763096 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/6d13e4eb-609d-477a-832d-fdb2831db5a9-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfhr4ds\" (UID: \"6d13e4eb-609d-477a-832d-fdb2831db5a9\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfhr4ds" Apr 17 21:42:32.763212 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:32.763194 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/6d13e4eb-609d-477a-832d-fdb2831db5a9-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfhr4ds\" (UID: \"6d13e4eb-609d-477a-832d-fdb2831db5a9\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfhr4ds" Apr 17 21:42:32.764845 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:32.764827 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/6d13e4eb-609d-477a-832d-fdb2831db5a9-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfhr4ds\" (UID: \"6d13e4eb-609d-477a-832d-fdb2831db5a9\") " 
pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfhr4ds" Apr 17 21:42:32.764990 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:32.764969 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/6d13e4eb-609d-477a-832d-fdb2831db5a9-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfhr4ds\" (UID: \"6d13e4eb-609d-477a-832d-fdb2831db5a9\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfhr4ds" Apr 17 21:42:32.770079 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:32.770058 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/6d13e4eb-609d-477a-832d-fdb2831db5a9-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfhr4ds\" (UID: \"6d13e4eb-609d-477a-832d-fdb2831db5a9\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfhr4ds" Apr 17 21:42:32.770278 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:32.770260 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfsxd\" (UniqueName: \"kubernetes.io/projected/6d13e4eb-609d-477a-832d-fdb2831db5a9-kube-api-access-wfsxd\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cfhr4ds\" (UID: \"6d13e4eb-609d-477a-832d-fdb2831db5a9\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfhr4ds" Apr 17 21:42:32.823773 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:32.823745 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfhr4ds" Apr 17 21:42:32.944456 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:32.944429 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfhr4ds"] Apr 17 21:42:32.947001 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:42:32.946975 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d13e4eb_609d_477a_832d_fdb2831db5a9.slice/crio-929a5158b7520c7765045d3dbeb8975cecc4391068659dae07bfc3cac452ce54 WatchSource:0}: Error finding container 929a5158b7520c7765045d3dbeb8975cecc4391068659dae07bfc3cac452ce54: Status 404 returned error can't find the container with id 929a5158b7520c7765045d3dbeb8975cecc4391068659dae07bfc3cac452ce54 Apr 17 21:42:33.854899 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:33.854863 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfhr4ds" event={"ID":"6d13e4eb-609d-477a-832d-fdb2831db5a9","Type":"ContainerStarted","Data":"929a5158b7520c7765045d3dbeb8975cecc4391068659dae07bfc3cac452ce54"} Apr 17 21:42:35.306877 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:35.306841 2564 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 17 21:42:35.307229 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:35.306941 2564 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 17 21:42:35.307229 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:35.306990 2564 kubelet_resources.go:45] "Allocatable" 
allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 17 21:42:35.863156 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:35.863120 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfhr4ds" event={"ID":"6d13e4eb-609d-477a-832d-fdb2831db5a9","Type":"ContainerStarted","Data":"3187b2c9067b66ccd69d4885250b9eaa328d26508dd3da0b3e24f65d03f96cf3"} Apr 17 21:42:35.884662 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:35.884614 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfhr4ds" podStartSLOduration=1.526796875 podStartE2EDuration="3.884578305s" podCreationTimestamp="2026-04-17 21:42:32 +0000 UTC" firstStartedPulling="2026-04-17 21:42:32.948797302 +0000 UTC m=+349.711556767" lastFinishedPulling="2026-04-17 21:42:35.306578737 +0000 UTC m=+352.069338197" observedRunningTime="2026-04-17 21:42:35.882533159 +0000 UTC m=+352.645292636" watchObservedRunningTime="2026-04-17 21:42:35.884578305 +0000 UTC m=+352.647337783" Apr 17 21:42:36.824126 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:36.824087 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfhr4ds" Apr 17 21:42:36.828480 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:36.828456 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfhr4ds" Apr 17 21:42:36.867020 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:42:36.866987 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfhr4ds" Apr 17 21:42:36.867881 ip-10-0-132-27 kubenswrapper[2564]: I0417 
21:42:36.867863 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cfhr4ds" Apr 17 21:43:04.945353 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:04.945320 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-kcq29"] Apr 17 21:43:04.948829 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:04.948814 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-kcq29" Apr 17 21:43:04.951580 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:04.951553 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-catalog-dockercfg-zhw9w\"" Apr 17 21:43:04.951580 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:04.951567 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 17 21:43:04.952801 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:04.952785 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 17 21:43:04.966712 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:04.966682 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-kcq29"] Apr 17 21:43:05.025692 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:05.025661 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5zf9\" (UniqueName: \"kubernetes.io/projected/2fe18da0-b2e5-4531-8595-c128045bbfa5-kube-api-access-f5zf9\") pod \"kuadrant-operator-catalog-kcq29\" (UID: \"2fe18da0-b2e5-4531-8595-c128045bbfa5\") " pod="kuadrant-system/kuadrant-operator-catalog-kcq29" Apr 17 21:43:05.126139 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:05.126107 2564 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-f5zf9\" (UniqueName: \"kubernetes.io/projected/2fe18da0-b2e5-4531-8595-c128045bbfa5-kube-api-access-f5zf9\") pod \"kuadrant-operator-catalog-kcq29\" (UID: \"2fe18da0-b2e5-4531-8595-c128045bbfa5\") " pod="kuadrant-system/kuadrant-operator-catalog-kcq29" Apr 17 21:43:05.133640 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:05.133616 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5zf9\" (UniqueName: \"kubernetes.io/projected/2fe18da0-b2e5-4531-8595-c128045bbfa5-kube-api-access-f5zf9\") pod \"kuadrant-operator-catalog-kcq29\" (UID: \"2fe18da0-b2e5-4531-8595-c128045bbfa5\") " pod="kuadrant-system/kuadrant-operator-catalog-kcq29" Apr 17 21:43:05.257880 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:05.257805 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-kcq29" Apr 17 21:43:05.316677 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:05.316635 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-kcq29"] Apr 17 21:43:05.373814 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:05.373786 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-kcq29"] Apr 17 21:43:05.375867 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:43:05.375837 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2fe18da0_b2e5_4531_8595_c128045bbfa5.slice/crio-11a730dd16d8c281fa86b55aa4e035a253f662428c1285e7257061245d0de699 WatchSource:0}: Error finding container 11a730dd16d8c281fa86b55aa4e035a253f662428c1285e7257061245d0de699: Status 404 returned error can't find the container with id 11a730dd16d8c281fa86b55aa4e035a253f662428c1285e7257061245d0de699 Apr 17 21:43:05.523394 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:05.523319 2564 kubelet.go:2537] 
"SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-dkhf2"] Apr 17 21:43:05.533770 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:05.533747 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-dkhf2"] Apr 17 21:43:05.533895 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:05.533848 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-dkhf2" Apr 17 21:43:05.629800 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:05.629771 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqq4p\" (UniqueName: \"kubernetes.io/projected/ec702473-f90c-46e0-b46a-cca01ac0f169-kube-api-access-hqq4p\") pod \"kuadrant-operator-catalog-dkhf2\" (UID: \"ec702473-f90c-46e0-b46a-cca01ac0f169\") " pod="kuadrant-system/kuadrant-operator-catalog-dkhf2" Apr 17 21:43:05.731097 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:05.731065 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hqq4p\" (UniqueName: \"kubernetes.io/projected/ec702473-f90c-46e0-b46a-cca01ac0f169-kube-api-access-hqq4p\") pod \"kuadrant-operator-catalog-dkhf2\" (UID: \"ec702473-f90c-46e0-b46a-cca01ac0f169\") " pod="kuadrant-system/kuadrant-operator-catalog-dkhf2" Apr 17 21:43:05.738693 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:05.738664 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqq4p\" (UniqueName: \"kubernetes.io/projected/ec702473-f90c-46e0-b46a-cca01ac0f169-kube-api-access-hqq4p\") pod \"kuadrant-operator-catalog-dkhf2\" (UID: \"ec702473-f90c-46e0-b46a-cca01ac0f169\") " pod="kuadrant-system/kuadrant-operator-catalog-dkhf2" Apr 17 21:43:05.843988 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:05.843960 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-dkhf2" Apr 17 21:43:05.962658 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:05.962621 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-kcq29" event={"ID":"2fe18da0-b2e5-4531-8595-c128045bbfa5","Type":"ContainerStarted","Data":"11a730dd16d8c281fa86b55aa4e035a253f662428c1285e7257061245d0de699"} Apr 17 21:43:05.969951 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:05.969899 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-dkhf2"] Apr 17 21:43:06.006873 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:43:06.006838 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec702473_f90c_46e0_b46a_cca01ac0f169.slice/crio-c79d7b08c6cb53f42e3becc1b975e56fcd0020bde2f48898b12e6c1a6637df08 WatchSource:0}: Error finding container c79d7b08c6cb53f42e3becc1b975e56fcd0020bde2f48898b12e6c1a6637df08: Status 404 returned error can't find the container with id c79d7b08c6cb53f42e3becc1b975e56fcd0020bde2f48898b12e6c1a6637df08 Apr 17 21:43:06.967407 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:06.967367 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-dkhf2" event={"ID":"ec702473-f90c-46e0-b46a-cca01ac0f169","Type":"ContainerStarted","Data":"c79d7b08c6cb53f42e3becc1b975e56fcd0020bde2f48898b12e6c1a6637df08"} Apr 17 21:43:07.971800 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:07.971764 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-kcq29" event={"ID":"2fe18da0-b2e5-4531-8595-c128045bbfa5","Type":"ContainerStarted","Data":"6f277b50a216f85209e499452243bdf0b4b249698f597f78eedda128aba3a269"} Apr 17 21:43:07.972260 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:07.971828 2564 kuberuntime_container.go:864] "Killing container with a 
grace period" pod="kuadrant-system/kuadrant-operator-catalog-kcq29" podUID="2fe18da0-b2e5-4531-8595-c128045bbfa5" containerName="registry-server" containerID="cri-o://6f277b50a216f85209e499452243bdf0b4b249698f597f78eedda128aba3a269" gracePeriod=2 Apr 17 21:43:07.973139 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:07.973112 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-dkhf2" event={"ID":"ec702473-f90c-46e0-b46a-cca01ac0f169","Type":"ContainerStarted","Data":"906e255a29cdb80f5ed76043abc017d185dd60332f681a347d2c9718be3f62dd"} Apr 17 21:43:07.986010 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:07.985963 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-kcq29" podStartSLOduration=2.091787958 podStartE2EDuration="3.985946931s" podCreationTimestamp="2026-04-17 21:43:04 +0000 UTC" firstStartedPulling="2026-04-17 21:43:05.377087803 +0000 UTC m=+382.139847260" lastFinishedPulling="2026-04-17 21:43:07.271246766 +0000 UTC m=+384.034006233" observedRunningTime="2026-04-17 21:43:07.985316789 +0000 UTC m=+384.748076280" watchObservedRunningTime="2026-04-17 21:43:07.985946931 +0000 UTC m=+384.748706410" Apr 17 21:43:07.999235 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:07.999195 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-dkhf2" podStartSLOduration=1.7357782 podStartE2EDuration="2.99918365s" podCreationTimestamp="2026-04-17 21:43:05 +0000 UTC" firstStartedPulling="2026-04-17 21:43:06.008403794 +0000 UTC m=+382.771163265" lastFinishedPulling="2026-04-17 21:43:07.271809256 +0000 UTC m=+384.034568715" observedRunningTime="2026-04-17 21:43:07.998094254 +0000 UTC m=+384.760853731" watchObservedRunningTime="2026-04-17 21:43:07.99918365 +0000 UTC m=+384.761943128" Apr 17 21:43:08.205862 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:08.205840 2564 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-kcq29" Apr 17 21:43:08.252205 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:08.252127 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f5zf9\" (UniqueName: \"kubernetes.io/projected/2fe18da0-b2e5-4531-8595-c128045bbfa5-kube-api-access-f5zf9\") pod \"2fe18da0-b2e5-4531-8595-c128045bbfa5\" (UID: \"2fe18da0-b2e5-4531-8595-c128045bbfa5\") " Apr 17 21:43:08.254216 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:08.254196 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fe18da0-b2e5-4531-8595-c128045bbfa5-kube-api-access-f5zf9" (OuterVolumeSpecName: "kube-api-access-f5zf9") pod "2fe18da0-b2e5-4531-8595-c128045bbfa5" (UID: "2fe18da0-b2e5-4531-8595-c128045bbfa5"). InnerVolumeSpecName "kube-api-access-f5zf9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 21:43:08.353645 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:08.353615 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-f5zf9\" (UniqueName: \"kubernetes.io/projected/2fe18da0-b2e5-4531-8595-c128045bbfa5-kube-api-access-f5zf9\") on node \"ip-10-0-132-27.ec2.internal\" DevicePath \"\"" Apr 17 21:43:08.979918 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:08.979880 2564 generic.go:358] "Generic (PLEG): container finished" podID="2fe18da0-b2e5-4531-8595-c128045bbfa5" containerID="6f277b50a216f85209e499452243bdf0b4b249698f597f78eedda128aba3a269" exitCode=0 Apr 17 21:43:08.980346 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:08.979943 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-kcq29" Apr 17 21:43:08.980346 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:08.979962 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-kcq29" event={"ID":"2fe18da0-b2e5-4531-8595-c128045bbfa5","Type":"ContainerDied","Data":"6f277b50a216f85209e499452243bdf0b4b249698f597f78eedda128aba3a269"} Apr 17 21:43:08.980346 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:08.980000 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-kcq29" event={"ID":"2fe18da0-b2e5-4531-8595-c128045bbfa5","Type":"ContainerDied","Data":"11a730dd16d8c281fa86b55aa4e035a253f662428c1285e7257061245d0de699"} Apr 17 21:43:08.980346 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:08.980016 2564 scope.go:117] "RemoveContainer" containerID="6f277b50a216f85209e499452243bdf0b4b249698f597f78eedda128aba3a269" Apr 17 21:43:08.988713 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:08.988695 2564 scope.go:117] "RemoveContainer" containerID="6f277b50a216f85209e499452243bdf0b4b249698f597f78eedda128aba3a269" Apr 17 21:43:08.988954 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:43:08.988933 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f277b50a216f85209e499452243bdf0b4b249698f597f78eedda128aba3a269\": container with ID starting with 6f277b50a216f85209e499452243bdf0b4b249698f597f78eedda128aba3a269 not found: ID does not exist" containerID="6f277b50a216f85209e499452243bdf0b4b249698f597f78eedda128aba3a269" Apr 17 21:43:08.989006 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:08.988963 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f277b50a216f85209e499452243bdf0b4b249698f597f78eedda128aba3a269"} err="failed to get container status \"6f277b50a216f85209e499452243bdf0b4b249698f597f78eedda128aba3a269\": rpc error: 
code = NotFound desc = could not find container \"6f277b50a216f85209e499452243bdf0b4b249698f597f78eedda128aba3a269\": container with ID starting with 6f277b50a216f85209e499452243bdf0b4b249698f597f78eedda128aba3a269 not found: ID does not exist" Apr 17 21:43:09.000321 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:09.000296 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-kcq29"] Apr 17 21:43:09.001907 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:09.001888 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-kcq29"] Apr 17 21:43:09.693422 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:09.693390 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fe18da0-b2e5-4531-8595-c128045bbfa5" path="/var/lib/kubelet/pods/2fe18da0-b2e5-4531-8595-c128045bbfa5/volumes" Apr 17 21:43:15.844696 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:15.844654 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kuadrant-system/kuadrant-operator-catalog-dkhf2" Apr 17 21:43:15.844696 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:15.844697 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-catalog-dkhf2" Apr 17 21:43:15.866111 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:15.866084 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kuadrant-system/kuadrant-operator-catalog-dkhf2" Apr 17 21:43:16.025226 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:16.025200 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-catalog-dkhf2" Apr 17 21:43:20.555685 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:20.555646 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kmrgh"] Apr 17 21:43:20.556164 ip-10-0-132-27 
kubenswrapper[2564]: I0417 21:43:20.556001 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2fe18da0-b2e5-4531-8595-c128045bbfa5" containerName="registry-server" Apr 17 21:43:20.556164 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:20.556017 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fe18da0-b2e5-4531-8595-c128045bbfa5" containerName="registry-server" Apr 17 21:43:20.556164 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:20.556094 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="2fe18da0-b2e5-4531-8595-c128045bbfa5" containerName="registry-server" Apr 17 21:43:20.560717 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:20.560701 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kmrgh" Apr 17 21:43:20.563702 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:20.563680 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-qh6pg\"" Apr 17 21:43:20.564646 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:20.564625 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kmrgh"] Apr 17 21:43:20.654828 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:20.654797 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/95d66543-a4d5-4050-bb06-e7dff0084985-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kmrgh\" (UID: \"95d66543-a4d5-4050-bb06-e7dff0084985\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kmrgh" Apr 17 21:43:20.655032 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:20.654841 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/95d66543-a4d5-4050-bb06-e7dff0084985-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kmrgh\" (UID: \"95d66543-a4d5-4050-bb06-e7dff0084985\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kmrgh" Apr 17 21:43:20.655032 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:20.654913 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jz2bq\" (UniqueName: \"kubernetes.io/projected/95d66543-a4d5-4050-bb06-e7dff0084985-kube-api-access-jz2bq\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kmrgh\" (UID: \"95d66543-a4d5-4050-bb06-e7dff0084985\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kmrgh" Apr 17 21:43:20.756228 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:20.756197 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/95d66543-a4d5-4050-bb06-e7dff0084985-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kmrgh\" (UID: \"95d66543-a4d5-4050-bb06-e7dff0084985\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kmrgh" Apr 17 21:43:20.756392 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:20.756254 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jz2bq\" (UniqueName: \"kubernetes.io/projected/95d66543-a4d5-4050-bb06-e7dff0084985-kube-api-access-jz2bq\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kmrgh\" (UID: \"95d66543-a4d5-4050-bb06-e7dff0084985\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kmrgh" Apr 17 21:43:20.756392 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:20.756285 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/95d66543-a4d5-4050-bb06-e7dff0084985-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kmrgh\" (UID: \"95d66543-a4d5-4050-bb06-e7dff0084985\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kmrgh" Apr 17 21:43:20.756625 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:20.756570 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/95d66543-a4d5-4050-bb06-e7dff0084985-bundle\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kmrgh\" (UID: \"95d66543-a4d5-4050-bb06-e7dff0084985\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kmrgh" Apr 17 21:43:20.756625 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:20.756588 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/95d66543-a4d5-4050-bb06-e7dff0084985-util\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kmrgh\" (UID: \"95d66543-a4d5-4050-bb06-e7dff0084985\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kmrgh" Apr 17 21:43:20.763848 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:20.763817 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jz2bq\" (UniqueName: \"kubernetes.io/projected/95d66543-a4d5-4050-bb06-e7dff0084985-kube-api-access-jz2bq\") pod \"0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kmrgh\" (UID: \"95d66543-a4d5-4050-bb06-e7dff0084985\") " pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kmrgh" Apr 17 21:43:20.870623 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:20.870583 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kmrgh" Apr 17 21:43:20.988618 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:20.988581 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kmrgh"] Apr 17 21:43:20.990675 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:43:20.990647 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95d66543_a4d5_4050_bb06_e7dff0084985.slice/crio-d1cc8248466b8c499beb057a2cb4f8455d443f95d9216fd11856151f868b1396 WatchSource:0}: Error finding container d1cc8248466b8c499beb057a2cb4f8455d443f95d9216fd11856151f868b1396: Status 404 returned error can't find the container with id d1cc8248466b8c499beb057a2cb4f8455d443f95d9216fd11856151f868b1396 Apr 17 21:43:21.021373 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:21.021347 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kmrgh" event={"ID":"95d66543-a4d5-4050-bb06-e7dff0084985","Type":"ContainerStarted","Data":"d1cc8248466b8c499beb057a2cb4f8455d443f95d9216fd11856151f868b1396"} Apr 17 21:43:21.356190 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:21.356155 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7366wlx"] Apr 17 21:43:21.359436 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:21.359419 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7366wlx" Apr 17 21:43:21.368910 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:21.368885 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7366wlx"] Apr 17 21:43:21.463219 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:21.463180 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/32c546ce-12e5-4717-8def-5b55510d8aeb-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7366wlx\" (UID: \"32c546ce-12e5-4717-8def-5b55510d8aeb\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7366wlx" Apr 17 21:43:21.463394 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:21.463236 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/32c546ce-12e5-4717-8def-5b55510d8aeb-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7366wlx\" (UID: \"32c546ce-12e5-4717-8def-5b55510d8aeb\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7366wlx" Apr 17 21:43:21.463394 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:21.463278 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqnqf\" (UniqueName: \"kubernetes.io/projected/32c546ce-12e5-4717-8def-5b55510d8aeb-kube-api-access-jqnqf\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7366wlx\" (UID: \"32c546ce-12e5-4717-8def-5b55510d8aeb\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7366wlx" Apr 17 21:43:21.564429 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:21.564397 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" 
(UniqueName: \"kubernetes.io/empty-dir/32c546ce-12e5-4717-8def-5b55510d8aeb-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7366wlx\" (UID: \"32c546ce-12e5-4717-8def-5b55510d8aeb\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7366wlx" Apr 17 21:43:21.564915 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:21.564444 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/32c546ce-12e5-4717-8def-5b55510d8aeb-bundle\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7366wlx\" (UID: \"32c546ce-12e5-4717-8def-5b55510d8aeb\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7366wlx" Apr 17 21:43:21.564915 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:21.564485 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jqnqf\" (UniqueName: \"kubernetes.io/projected/32c546ce-12e5-4717-8def-5b55510d8aeb-kube-api-access-jqnqf\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7366wlx\" (UID: \"32c546ce-12e5-4717-8def-5b55510d8aeb\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7366wlx" Apr 17 21:43:21.564915 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:21.564773 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/32c546ce-12e5-4717-8def-5b55510d8aeb-util\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7366wlx\" (UID: \"32c546ce-12e5-4717-8def-5b55510d8aeb\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7366wlx" Apr 17 21:43:21.564915 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:21.564858 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/32c546ce-12e5-4717-8def-5b55510d8aeb-bundle\") pod 
\"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7366wlx\" (UID: \"32c546ce-12e5-4717-8def-5b55510d8aeb\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7366wlx" Apr 17 21:43:21.573693 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:21.573674 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqnqf\" (UniqueName: \"kubernetes.io/projected/32c546ce-12e5-4717-8def-5b55510d8aeb-kube-api-access-jqnqf\") pod \"5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7366wlx\" (UID: \"32c546ce-12e5-4717-8def-5b55510d8aeb\") " pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7366wlx" Apr 17 21:43:21.668865 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:21.668773 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7366wlx" Apr 17 21:43:21.788727 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:21.788706 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7366wlx"] Apr 17 21:43:21.791035 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:43:21.791004 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32c546ce_12e5_4717_8def_5b55510d8aeb.slice/crio-5dc75224e0a90b5710e58d06e83693a470032ee80de9172a933221d87e8a0735 WatchSource:0}: Error finding container 5dc75224e0a90b5710e58d06e83693a470032ee80de9172a933221d87e8a0735: Status 404 returned error can't find the container with id 5dc75224e0a90b5710e58d06e83693a470032ee80de9172a933221d87e8a0735 Apr 17 21:43:21.956329 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:21.956260 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1kxsmx"] Apr 17 21:43:21.959568 ip-10-0-132-27 kubenswrapper[2564]: 
I0417 21:43:21.959551 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1kxsmx" Apr 17 21:43:21.966498 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:21.966470 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1kxsmx"] Apr 17 21:43:22.030833 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:22.030795 2564 generic.go:358] "Generic (PLEG): container finished" podID="32c546ce-12e5-4717-8def-5b55510d8aeb" containerID="b5e4157540e9b93f9dcf16bbb5af8d03bb2feafc53b024ed0567037981049dd2" exitCode=0 Apr 17 21:43:22.030994 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:22.030880 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7366wlx" event={"ID":"32c546ce-12e5-4717-8def-5b55510d8aeb","Type":"ContainerDied","Data":"b5e4157540e9b93f9dcf16bbb5af8d03bb2feafc53b024ed0567037981049dd2"} Apr 17 21:43:22.030994 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:22.030918 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7366wlx" event={"ID":"32c546ce-12e5-4717-8def-5b55510d8aeb","Type":"ContainerStarted","Data":"5dc75224e0a90b5710e58d06e83693a470032ee80de9172a933221d87e8a0735"} Apr 17 21:43:22.032299 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:22.032276 2564 generic.go:358] "Generic (PLEG): container finished" podID="95d66543-a4d5-4050-bb06-e7dff0084985" containerID="c2f2d92955948e9e94503ba8df7bb3af0c4ae0d465da784fbad0d30448919889" exitCode=0 Apr 17 21:43:22.032441 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:22.032307 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kmrgh" 
event={"ID":"95d66543-a4d5-4050-bb06-e7dff0084985","Type":"ContainerDied","Data":"c2f2d92955948e9e94503ba8df7bb3af0c4ae0d465da784fbad0d30448919889"} Apr 17 21:43:22.068807 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:22.068780 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvwdv\" (UniqueName: \"kubernetes.io/projected/70e4b3a8-b823-4c1b-8bc7-6b87cc668888-kube-api-access-rvwdv\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1kxsmx\" (UID: \"70e4b3a8-b823-4c1b-8bc7-6b87cc668888\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1kxsmx" Apr 17 21:43:22.068934 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:22.068832 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/70e4b3a8-b823-4c1b-8bc7-6b87cc668888-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1kxsmx\" (UID: \"70e4b3a8-b823-4c1b-8bc7-6b87cc668888\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1kxsmx" Apr 17 21:43:22.068934 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:22.068880 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/70e4b3a8-b823-4c1b-8bc7-6b87cc668888-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1kxsmx\" (UID: \"70e4b3a8-b823-4c1b-8bc7-6b87cc668888\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1kxsmx" Apr 17 21:43:22.169686 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:22.169654 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rvwdv\" (UniqueName: \"kubernetes.io/projected/70e4b3a8-b823-4c1b-8bc7-6b87cc668888-kube-api-access-rvwdv\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1kxsmx\" (UID: 
\"70e4b3a8-b823-4c1b-8bc7-6b87cc668888\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1kxsmx" Apr 17 21:43:22.169854 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:22.169712 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/70e4b3a8-b823-4c1b-8bc7-6b87cc668888-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1kxsmx\" (UID: \"70e4b3a8-b823-4c1b-8bc7-6b87cc668888\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1kxsmx" Apr 17 21:43:22.169854 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:22.169747 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/70e4b3a8-b823-4c1b-8bc7-6b87cc668888-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1kxsmx\" (UID: \"70e4b3a8-b823-4c1b-8bc7-6b87cc668888\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1kxsmx" Apr 17 21:43:22.170152 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:22.170133 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/70e4b3a8-b823-4c1b-8bc7-6b87cc668888-bundle\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1kxsmx\" (UID: \"70e4b3a8-b823-4c1b-8bc7-6b87cc668888\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1kxsmx" Apr 17 21:43:22.170210 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:22.170191 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/70e4b3a8-b823-4c1b-8bc7-6b87cc668888-util\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1kxsmx\" (UID: \"70e4b3a8-b823-4c1b-8bc7-6b87cc668888\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1kxsmx" Apr 17 21:43:22.177180 
ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:22.177147 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvwdv\" (UniqueName: \"kubernetes.io/projected/70e4b3a8-b823-4c1b-8bc7-6b87cc668888-kube-api-access-rvwdv\") pod \"9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1kxsmx\" (UID: \"70e4b3a8-b823-4c1b-8bc7-6b87cc668888\") " pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1kxsmx" Apr 17 21:43:22.269427 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:22.269337 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1kxsmx" Apr 17 21:43:22.356808 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:22.356780 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e09rmmr"] Apr 17 21:43:22.362752 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:22.362724 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e09rmmr" Apr 17 21:43:22.367212 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:22.367187 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e09rmmr"] Apr 17 21:43:22.396802 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:22.396777 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1kxsmx"] Apr 17 21:43:22.398578 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:43:22.398552 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70e4b3a8_b823_4c1b_8bc7_6b87cc668888.slice/crio-0289df1f912947919a7cec6e9da9fb5e3305e5eae924d165d3f28a0b5003d60b WatchSource:0}: Error finding container 0289df1f912947919a7cec6e9da9fb5e3305e5eae924d165d3f28a0b5003d60b: Status 404 returned error can't find the container with id 0289df1f912947919a7cec6e9da9fb5e3305e5eae924d165d3f28a0b5003d60b Apr 17 21:43:22.472458 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:22.472434 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkvp8\" (UniqueName: \"kubernetes.io/projected/2d6243aa-b495-4a71-8b42-309b97bacf8d-kube-api-access-gkvp8\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e09rmmr\" (UID: \"2d6243aa-b495-4a71-8b42-309b97bacf8d\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e09rmmr" Apr 17 21:43:22.472616 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:22.472520 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2d6243aa-b495-4a71-8b42-309b97bacf8d-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e09rmmr\" (UID: 
\"2d6243aa-b495-4a71-8b42-309b97bacf8d\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e09rmmr" Apr 17 21:43:22.472616 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:22.472541 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2d6243aa-b495-4a71-8b42-309b97bacf8d-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e09rmmr\" (UID: \"2d6243aa-b495-4a71-8b42-309b97bacf8d\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e09rmmr" Apr 17 21:43:22.573105 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:22.573023 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2d6243aa-b495-4a71-8b42-309b97bacf8d-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e09rmmr\" (UID: \"2d6243aa-b495-4a71-8b42-309b97bacf8d\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e09rmmr" Apr 17 21:43:22.573105 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:22.573069 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2d6243aa-b495-4a71-8b42-309b97bacf8d-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e09rmmr\" (UID: \"2d6243aa-b495-4a71-8b42-309b97bacf8d\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e09rmmr" Apr 17 21:43:22.573572 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:22.573121 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gkvp8\" (UniqueName: \"kubernetes.io/projected/2d6243aa-b495-4a71-8b42-309b97bacf8d-kube-api-access-gkvp8\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e09rmmr\" (UID: \"2d6243aa-b495-4a71-8b42-309b97bacf8d\") " 
pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e09rmmr" Apr 17 21:43:22.573572 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:22.573367 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2d6243aa-b495-4a71-8b42-309b97bacf8d-util\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e09rmmr\" (UID: \"2d6243aa-b495-4a71-8b42-309b97bacf8d\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e09rmmr" Apr 17 21:43:22.573572 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:22.573420 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2d6243aa-b495-4a71-8b42-309b97bacf8d-bundle\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e09rmmr\" (UID: \"2d6243aa-b495-4a71-8b42-309b97bacf8d\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e09rmmr" Apr 17 21:43:22.584543 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:22.584519 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkvp8\" (UniqueName: \"kubernetes.io/projected/2d6243aa-b495-4a71-8b42-309b97bacf8d-kube-api-access-gkvp8\") pod \"19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e09rmmr\" (UID: \"2d6243aa-b495-4a71-8b42-309b97bacf8d\") " pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e09rmmr" Apr 17 21:43:22.675854 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:22.675818 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e09rmmr" Apr 17 21:43:22.936666 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:22.936494 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e09rmmr"] Apr 17 21:43:22.996434 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:43:22.996393 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d6243aa_b495_4a71_8b42_309b97bacf8d.slice/crio-ea0ed15969675a954190e8c195f9ffdfd4417073be1a9ee7df0af3ed7328b3f7 WatchSource:0}: Error finding container ea0ed15969675a954190e8c195f9ffdfd4417073be1a9ee7df0af3ed7328b3f7: Status 404 returned error can't find the container with id ea0ed15969675a954190e8c195f9ffdfd4417073be1a9ee7df0af3ed7328b3f7 Apr 17 21:43:23.037585 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:23.037557 2564 generic.go:358] "Generic (PLEG): container finished" podID="70e4b3a8-b823-4c1b-8bc7-6b87cc668888" containerID="68157db0e14db8051e8fc5199fc9e928437ce13feaa8dca53f42aa630137c8cb" exitCode=0 Apr 17 21:43:23.037715 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:23.037640 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1kxsmx" event={"ID":"70e4b3a8-b823-4c1b-8bc7-6b87cc668888","Type":"ContainerDied","Data":"68157db0e14db8051e8fc5199fc9e928437ce13feaa8dca53f42aa630137c8cb"} Apr 17 21:43:23.037715 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:23.037674 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1kxsmx" event={"ID":"70e4b3a8-b823-4c1b-8bc7-6b87cc668888","Type":"ContainerStarted","Data":"0289df1f912947919a7cec6e9da9fb5e3305e5eae924d165d3f28a0b5003d60b"} Apr 17 21:43:23.038885 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:23.038822 2564 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e09rmmr" event={"ID":"2d6243aa-b495-4a71-8b42-309b97bacf8d","Type":"ContainerStarted","Data":"ea0ed15969675a954190e8c195f9ffdfd4417073be1a9ee7df0af3ed7328b3f7"} Apr 17 21:43:23.040634 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:23.040560 2564 generic.go:358] "Generic (PLEG): container finished" podID="32c546ce-12e5-4717-8def-5b55510d8aeb" containerID="6044aef9c2380a316d476af2adb94a14095b47edb69a7c117f637cb72ad54ab9" exitCode=0 Apr 17 21:43:23.040739 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:23.040632 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7366wlx" event={"ID":"32c546ce-12e5-4717-8def-5b55510d8aeb","Type":"ContainerDied","Data":"6044aef9c2380a316d476af2adb94a14095b47edb69a7c117f637cb72ad54ab9"} Apr 17 21:43:23.042355 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:23.042328 2564 generic.go:358] "Generic (PLEG): container finished" podID="95d66543-a4d5-4050-bb06-e7dff0084985" containerID="084c4723a6cf831de3d474d679ee931c88c880f4ce979f90527b07d654c3706a" exitCode=0 Apr 17 21:43:23.042424 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:23.042399 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kmrgh" event={"ID":"95d66543-a4d5-4050-bb06-e7dff0084985","Type":"ContainerDied","Data":"084c4723a6cf831de3d474d679ee931c88c880f4ce979f90527b07d654c3706a"} Apr 17 21:43:24.047470 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:24.047375 2564 generic.go:358] "Generic (PLEG): container finished" podID="95d66543-a4d5-4050-bb06-e7dff0084985" containerID="1d0a5ddc923c4331d4c7ebbbeb84ce84ba7ea7fc63bf394d397a2e37b2cae2ba" exitCode=0 Apr 17 21:43:24.047470 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:24.047459 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kmrgh" event={"ID":"95d66543-a4d5-4050-bb06-e7dff0084985","Type":"ContainerDied","Data":"1d0a5ddc923c4331d4c7ebbbeb84ce84ba7ea7fc63bf394d397a2e37b2cae2ba"} Apr 17 21:43:24.049153 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:24.049132 2564 generic.go:358] "Generic (PLEG): container finished" podID="70e4b3a8-b823-4c1b-8bc7-6b87cc668888" containerID="7517596bd9c94e14e6cb97cf35605d44651b4b2a2a09aff8a329ae86f0a5812a" exitCode=0 Apr 17 21:43:24.049252 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:24.049198 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1kxsmx" event={"ID":"70e4b3a8-b823-4c1b-8bc7-6b87cc668888","Type":"ContainerDied","Data":"7517596bd9c94e14e6cb97cf35605d44651b4b2a2a09aff8a329ae86f0a5812a"} Apr 17 21:43:24.050391 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:24.050369 2564 generic.go:358] "Generic (PLEG): container finished" podID="2d6243aa-b495-4a71-8b42-309b97bacf8d" containerID="a873dcfbe6c4275dd10eac947b00f1b0278651c891d18e216edeb221e41429b4" exitCode=0 Apr 17 21:43:24.050481 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:24.050448 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e09rmmr" event={"ID":"2d6243aa-b495-4a71-8b42-309b97bacf8d","Type":"ContainerDied","Data":"a873dcfbe6c4275dd10eac947b00f1b0278651c891d18e216edeb221e41429b4"} Apr 17 21:43:24.052427 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:24.052410 2564 generic.go:358] "Generic (PLEG): container finished" podID="32c546ce-12e5-4717-8def-5b55510d8aeb" containerID="7ec140d72cf86030aafe247aa778662c28d971cb1f4214894a6dcf811cf04bb5" exitCode=0 Apr 17 21:43:24.052512 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:24.052436 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7366wlx" event={"ID":"32c546ce-12e5-4717-8def-5b55510d8aeb","Type":"ContainerDied","Data":"7ec140d72cf86030aafe247aa778662c28d971cb1f4214894a6dcf811cf04bb5"} Apr 17 21:43:25.060544 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:25.060458 2564 generic.go:358] "Generic (PLEG): container finished" podID="70e4b3a8-b823-4c1b-8bc7-6b87cc668888" containerID="4d62342588009b622c399f862d168ae41bfd8321993c7e1c92e4d6e25f3eca23" exitCode=0 Apr 17 21:43:25.060544 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:25.060500 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1kxsmx" event={"ID":"70e4b3a8-b823-4c1b-8bc7-6b87cc668888","Type":"ContainerDied","Data":"4d62342588009b622c399f862d168ae41bfd8321993c7e1c92e4d6e25f3eca23"} Apr 17 21:43:25.062069 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:25.062051 2564 generic.go:358] "Generic (PLEG): container finished" podID="2d6243aa-b495-4a71-8b42-309b97bacf8d" containerID="1b849fd4b40c91b1609d8c66fdbee63fa989956f9652f8958540e39463876f3d" exitCode=0 Apr 17 21:43:25.062150 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:25.062129 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e09rmmr" event={"ID":"2d6243aa-b495-4a71-8b42-309b97bacf8d","Type":"ContainerDied","Data":"1b849fd4b40c91b1609d8c66fdbee63fa989956f9652f8958540e39463876f3d"} Apr 17 21:43:25.214775 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:25.214747 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kmrgh" Apr 17 21:43:25.220395 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:25.220371 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7366wlx" Apr 17 21:43:25.296232 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:25.296204 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/32c546ce-12e5-4717-8def-5b55510d8aeb-bundle\") pod \"32c546ce-12e5-4717-8def-5b55510d8aeb\" (UID: \"32c546ce-12e5-4717-8def-5b55510d8aeb\") " Apr 17 21:43:25.296397 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:25.296243 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqnqf\" (UniqueName: \"kubernetes.io/projected/32c546ce-12e5-4717-8def-5b55510d8aeb-kube-api-access-jqnqf\") pod \"32c546ce-12e5-4717-8def-5b55510d8aeb\" (UID: \"32c546ce-12e5-4717-8def-5b55510d8aeb\") " Apr 17 21:43:25.296397 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:25.296294 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/32c546ce-12e5-4717-8def-5b55510d8aeb-util\") pod \"32c546ce-12e5-4717-8def-5b55510d8aeb\" (UID: \"32c546ce-12e5-4717-8def-5b55510d8aeb\") " Apr 17 21:43:25.296397 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:25.296316 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jz2bq\" (UniqueName: \"kubernetes.io/projected/95d66543-a4d5-4050-bb06-e7dff0084985-kube-api-access-jz2bq\") pod \"95d66543-a4d5-4050-bb06-e7dff0084985\" (UID: \"95d66543-a4d5-4050-bb06-e7dff0084985\") " Apr 17 21:43:25.296397 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:25.296341 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/95d66543-a4d5-4050-bb06-e7dff0084985-bundle\") pod \"95d66543-a4d5-4050-bb06-e7dff0084985\" (UID: \"95d66543-a4d5-4050-bb06-e7dff0084985\") " Apr 17 21:43:25.296648 ip-10-0-132-27 
kubenswrapper[2564]: I0417 21:43:25.296513 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/95d66543-a4d5-4050-bb06-e7dff0084985-util\") pod \"95d66543-a4d5-4050-bb06-e7dff0084985\" (UID: \"95d66543-a4d5-4050-bb06-e7dff0084985\") " Apr 17 21:43:25.297065 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:25.296963 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95d66543-a4d5-4050-bb06-e7dff0084985-bundle" (OuterVolumeSpecName: "bundle") pod "95d66543-a4d5-4050-bb06-e7dff0084985" (UID: "95d66543-a4d5-4050-bb06-e7dff0084985"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 21:43:25.297065 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:25.297027 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32c546ce-12e5-4717-8def-5b55510d8aeb-bundle" (OuterVolumeSpecName: "bundle") pod "32c546ce-12e5-4717-8def-5b55510d8aeb" (UID: "32c546ce-12e5-4717-8def-5b55510d8aeb"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 21:43:25.298548 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:25.298522 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32c546ce-12e5-4717-8def-5b55510d8aeb-kube-api-access-jqnqf" (OuterVolumeSpecName: "kube-api-access-jqnqf") pod "32c546ce-12e5-4717-8def-5b55510d8aeb" (UID: "32c546ce-12e5-4717-8def-5b55510d8aeb"). InnerVolumeSpecName "kube-api-access-jqnqf". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 21:43:25.298786 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:25.298767 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95d66543-a4d5-4050-bb06-e7dff0084985-kube-api-access-jz2bq" (OuterVolumeSpecName: "kube-api-access-jz2bq") pod "95d66543-a4d5-4050-bb06-e7dff0084985" (UID: "95d66543-a4d5-4050-bb06-e7dff0084985"). InnerVolumeSpecName "kube-api-access-jz2bq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 21:43:25.301583 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:25.301555 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/32c546ce-12e5-4717-8def-5b55510d8aeb-util" (OuterVolumeSpecName: "util") pod "32c546ce-12e5-4717-8def-5b55510d8aeb" (UID: "32c546ce-12e5-4717-8def-5b55510d8aeb"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 21:43:25.302086 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:25.302067 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95d66543-a4d5-4050-bb06-e7dff0084985-util" (OuterVolumeSpecName: "util") pod "95d66543-a4d5-4050-bb06-e7dff0084985" (UID: "95d66543-a4d5-4050-bb06-e7dff0084985"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 21:43:25.397819 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:25.397791 2564 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/95d66543-a4d5-4050-bb06-e7dff0084985-util\") on node \"ip-10-0-132-27.ec2.internal\" DevicePath \"\"" Apr 17 21:43:25.397819 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:25.397818 2564 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/32c546ce-12e5-4717-8def-5b55510d8aeb-bundle\") on node \"ip-10-0-132-27.ec2.internal\" DevicePath \"\"" Apr 17 21:43:25.397994 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:25.397828 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jqnqf\" (UniqueName: \"kubernetes.io/projected/32c546ce-12e5-4717-8def-5b55510d8aeb-kube-api-access-jqnqf\") on node \"ip-10-0-132-27.ec2.internal\" DevicePath \"\"" Apr 17 21:43:25.397994 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:25.397838 2564 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/32c546ce-12e5-4717-8def-5b55510d8aeb-util\") on node \"ip-10-0-132-27.ec2.internal\" DevicePath \"\"" Apr 17 21:43:25.397994 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:25.397847 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jz2bq\" (UniqueName: \"kubernetes.io/projected/95d66543-a4d5-4050-bb06-e7dff0084985-kube-api-access-jz2bq\") on node \"ip-10-0-132-27.ec2.internal\" DevicePath \"\"" Apr 17 21:43:25.397994 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:25.397856 2564 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/95d66543-a4d5-4050-bb06-e7dff0084985-bundle\") on node \"ip-10-0-132-27.ec2.internal\" DevicePath \"\"" Apr 17 21:43:26.068527 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:26.068453 2564 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kmrgh" Apr 17 21:43:26.068527 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:26.068500 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kmrgh" event={"ID":"95d66543-a4d5-4050-bb06-e7dff0084985","Type":"ContainerDied","Data":"d1cc8248466b8c499beb057a2cb4f8455d443f95d9216fd11856151f868b1396"} Apr 17 21:43:26.068972 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:26.068530 2564 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d1cc8248466b8c499beb057a2cb4f8455d443f95d9216fd11856151f868b1396" Apr 17 21:43:26.070448 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:26.070412 2564 generic.go:358] "Generic (PLEG): container finished" podID="2d6243aa-b495-4a71-8b42-309b97bacf8d" containerID="7544cfbe8e2a26abd2b0d8d4b8d442214959ecbb76ed1c2534f5e33aae1bd707" exitCode=0 Apr 17 21:43:26.070583 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:26.070445 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e09rmmr" event={"ID":"2d6243aa-b495-4a71-8b42-309b97bacf8d","Type":"ContainerDied","Data":"7544cfbe8e2a26abd2b0d8d4b8d442214959ecbb76ed1c2534f5e33aae1bd707"} Apr 17 21:43:26.072143 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:26.072113 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7366wlx" event={"ID":"32c546ce-12e5-4717-8def-5b55510d8aeb","Type":"ContainerDied","Data":"5dc75224e0a90b5710e58d06e83693a470032ee80de9172a933221d87e8a0735"} Apr 17 21:43:26.072143 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:26.072133 2564 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5dc75224e0a90b5710e58d06e83693a470032ee80de9172a933221d87e8a0735" Apr 17 
21:43:26.072356 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:26.072341 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7366wlx" Apr 17 21:43:26.195335 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:26.195312 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1kxsmx" Apr 17 21:43:26.304921 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:26.304884 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/70e4b3a8-b823-4c1b-8bc7-6b87cc668888-bundle\") pod \"70e4b3a8-b823-4c1b-8bc7-6b87cc668888\" (UID: \"70e4b3a8-b823-4c1b-8bc7-6b87cc668888\") " Apr 17 21:43:26.305078 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:26.304995 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvwdv\" (UniqueName: \"kubernetes.io/projected/70e4b3a8-b823-4c1b-8bc7-6b87cc668888-kube-api-access-rvwdv\") pod \"70e4b3a8-b823-4c1b-8bc7-6b87cc668888\" (UID: \"70e4b3a8-b823-4c1b-8bc7-6b87cc668888\") " Apr 17 21:43:26.305078 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:26.305044 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/70e4b3a8-b823-4c1b-8bc7-6b87cc668888-util\") pod \"70e4b3a8-b823-4c1b-8bc7-6b87cc668888\" (UID: \"70e4b3a8-b823-4c1b-8bc7-6b87cc668888\") " Apr 17 21:43:26.305505 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:26.305470 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70e4b3a8-b823-4c1b-8bc7-6b87cc668888-bundle" (OuterVolumeSpecName: "bundle") pod "70e4b3a8-b823-4c1b-8bc7-6b87cc668888" (UID: "70e4b3a8-b823-4c1b-8bc7-6b87cc668888"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 21:43:26.307126 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:26.307098 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70e4b3a8-b823-4c1b-8bc7-6b87cc668888-kube-api-access-rvwdv" (OuterVolumeSpecName: "kube-api-access-rvwdv") pod "70e4b3a8-b823-4c1b-8bc7-6b87cc668888" (UID: "70e4b3a8-b823-4c1b-8bc7-6b87cc668888"). InnerVolumeSpecName "kube-api-access-rvwdv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 21:43:26.310667 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:26.310632 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/70e4b3a8-b823-4c1b-8bc7-6b87cc668888-util" (OuterVolumeSpecName: "util") pod "70e4b3a8-b823-4c1b-8bc7-6b87cc668888" (UID: "70e4b3a8-b823-4c1b-8bc7-6b87cc668888"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 21:43:26.406056 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:26.406029 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rvwdv\" (UniqueName: \"kubernetes.io/projected/70e4b3a8-b823-4c1b-8bc7-6b87cc668888-kube-api-access-rvwdv\") on node \"ip-10-0-132-27.ec2.internal\" DevicePath \"\"" Apr 17 21:43:26.406056 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:26.406057 2564 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/70e4b3a8-b823-4c1b-8bc7-6b87cc668888-util\") on node \"ip-10-0-132-27.ec2.internal\" DevicePath \"\"" Apr 17 21:43:26.406233 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:26.406069 2564 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/70e4b3a8-b823-4c1b-8bc7-6b87cc668888-bundle\") on node \"ip-10-0-132-27.ec2.internal\" DevicePath \"\"" Apr 17 21:43:27.076913 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:27.076877 2564 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1kxsmx" Apr 17 21:43:27.077402 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:27.076878 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1kxsmx" event={"ID":"70e4b3a8-b823-4c1b-8bc7-6b87cc668888","Type":"ContainerDied","Data":"0289df1f912947919a7cec6e9da9fb5e3305e5eae924d165d3f28a0b5003d60b"} Apr 17 21:43:27.077402 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:27.076996 2564 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0289df1f912947919a7cec6e9da9fb5e3305e5eae924d165d3f28a0b5003d60b" Apr 17 21:43:27.201628 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:27.201606 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e09rmmr" Apr 17 21:43:27.313256 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:27.313216 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2d6243aa-b495-4a71-8b42-309b97bacf8d-util\") pod \"2d6243aa-b495-4a71-8b42-309b97bacf8d\" (UID: \"2d6243aa-b495-4a71-8b42-309b97bacf8d\") " Apr 17 21:43:27.313256 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:27.313258 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2d6243aa-b495-4a71-8b42-309b97bacf8d-bundle\") pod \"2d6243aa-b495-4a71-8b42-309b97bacf8d\" (UID: \"2d6243aa-b495-4a71-8b42-309b97bacf8d\") " Apr 17 21:43:27.313459 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:27.313329 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gkvp8\" (UniqueName: \"kubernetes.io/projected/2d6243aa-b495-4a71-8b42-309b97bacf8d-kube-api-access-gkvp8\") pod 
\"2d6243aa-b495-4a71-8b42-309b97bacf8d\" (UID: \"2d6243aa-b495-4a71-8b42-309b97bacf8d\") " Apr 17 21:43:27.313886 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:27.313855 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d6243aa-b495-4a71-8b42-309b97bacf8d-bundle" (OuterVolumeSpecName: "bundle") pod "2d6243aa-b495-4a71-8b42-309b97bacf8d" (UID: "2d6243aa-b495-4a71-8b42-309b97bacf8d"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 21:43:27.315428 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:27.315407 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d6243aa-b495-4a71-8b42-309b97bacf8d-kube-api-access-gkvp8" (OuterVolumeSpecName: "kube-api-access-gkvp8") pod "2d6243aa-b495-4a71-8b42-309b97bacf8d" (UID: "2d6243aa-b495-4a71-8b42-309b97bacf8d"). InnerVolumeSpecName "kube-api-access-gkvp8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 21:43:27.321032 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:27.321007 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2d6243aa-b495-4a71-8b42-309b97bacf8d-util" (OuterVolumeSpecName: "util") pod "2d6243aa-b495-4a71-8b42-309b97bacf8d" (UID: "2d6243aa-b495-4a71-8b42-309b97bacf8d"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 21:43:27.413937 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:27.413901 2564 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2d6243aa-b495-4a71-8b42-309b97bacf8d-util\") on node \"ip-10-0-132-27.ec2.internal\" DevicePath \"\"" Apr 17 21:43:27.413937 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:27.413931 2564 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2d6243aa-b495-4a71-8b42-309b97bacf8d-bundle\") on node \"ip-10-0-132-27.ec2.internal\" DevicePath \"\"" Apr 17 21:43:27.414096 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:27.413950 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gkvp8\" (UniqueName: \"kubernetes.io/projected/2d6243aa-b495-4a71-8b42-309b97bacf8d-kube-api-access-gkvp8\") on node \"ip-10-0-132-27.ec2.internal\" DevicePath \"\"" Apr 17 21:43:28.082351 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:28.082258 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e09rmmr" event={"ID":"2d6243aa-b495-4a71-8b42-309b97bacf8d","Type":"ContainerDied","Data":"ea0ed15969675a954190e8c195f9ffdfd4417073be1a9ee7df0af3ed7328b3f7"} Apr 17 21:43:28.082351 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:28.082276 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e09rmmr" Apr 17 21:43:28.082351 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:28.082291 2564 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea0ed15969675a954190e8c195f9ffdfd4417073be1a9ee7df0af3ed7328b3f7" Apr 17 21:43:36.396374 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:36.396334 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-tm25p"] Apr 17 21:43:36.396813 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:36.396643 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="32c546ce-12e5-4717-8def-5b55510d8aeb" containerName="extract" Apr 17 21:43:36.396813 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:36.396655 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="32c546ce-12e5-4717-8def-5b55510d8aeb" containerName="extract" Apr 17 21:43:36.396813 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:36.396664 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2d6243aa-b495-4a71-8b42-309b97bacf8d" containerName="pull" Apr 17 21:43:36.396813 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:36.396670 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d6243aa-b495-4a71-8b42-309b97bacf8d" containerName="pull" Apr 17 21:43:36.396813 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:36.396678 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="70e4b3a8-b823-4c1b-8bc7-6b87cc668888" containerName="util" Apr 17 21:43:36.396813 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:36.396684 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="70e4b3a8-b823-4c1b-8bc7-6b87cc668888" containerName="util" Apr 17 21:43:36.396813 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:36.396692 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="32c546ce-12e5-4717-8def-5b55510d8aeb" containerName="pull" Apr 17 21:43:36.396813 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:36.396697 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="32c546ce-12e5-4717-8def-5b55510d8aeb" containerName="pull" Apr 17 21:43:36.396813 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:36.396704 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2d6243aa-b495-4a71-8b42-309b97bacf8d" containerName="extract" Apr 17 21:43:36.396813 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:36.396709 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d6243aa-b495-4a71-8b42-309b97bacf8d" containerName="extract" Apr 17 21:43:36.396813 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:36.396716 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="32c546ce-12e5-4717-8def-5b55510d8aeb" containerName="util" Apr 17 21:43:36.396813 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:36.396721 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="32c546ce-12e5-4717-8def-5b55510d8aeb" containerName="util" Apr 17 21:43:36.396813 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:36.396728 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="95d66543-a4d5-4050-bb06-e7dff0084985" containerName="util" Apr 17 21:43:36.396813 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:36.396733 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="95d66543-a4d5-4050-bb06-e7dff0084985" containerName="util" Apr 17 21:43:36.396813 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:36.396740 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="95d66543-a4d5-4050-bb06-e7dff0084985" containerName="extract" Apr 17 21:43:36.396813 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:36.396745 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="95d66543-a4d5-4050-bb06-e7dff0084985" containerName="extract" Apr 17 
21:43:36.396813 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:36.396754 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="70e4b3a8-b823-4c1b-8bc7-6b87cc668888" containerName="pull" Apr 17 21:43:36.396813 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:36.396758 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="70e4b3a8-b823-4c1b-8bc7-6b87cc668888" containerName="pull" Apr 17 21:43:36.396813 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:36.396763 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="70e4b3a8-b823-4c1b-8bc7-6b87cc668888" containerName="extract" Apr 17 21:43:36.396813 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:36.396768 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="70e4b3a8-b823-4c1b-8bc7-6b87cc668888" containerName="extract" Apr 17 21:43:36.396813 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:36.396774 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2d6243aa-b495-4a71-8b42-309b97bacf8d" containerName="util" Apr 17 21:43:36.396813 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:36.396780 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d6243aa-b495-4a71-8b42-309b97bacf8d" containerName="util" Apr 17 21:43:36.396813 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:36.396789 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="95d66543-a4d5-4050-bb06-e7dff0084985" containerName="pull" Apr 17 21:43:36.396813 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:36.396793 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="95d66543-a4d5-4050-bb06-e7dff0084985" containerName="pull" Apr 17 21:43:36.397541 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:36.396838 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="70e4b3a8-b823-4c1b-8bc7-6b87cc668888" containerName="extract" Apr 17 21:43:36.397541 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:36.396847 2564 
memory_manager.go:356] "RemoveStaleState removing state" podUID="95d66543-a4d5-4050-bb06-e7dff0084985" containerName="extract" Apr 17 21:43:36.397541 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:36.396854 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="2d6243aa-b495-4a71-8b42-309b97bacf8d" containerName="extract" Apr 17 21:43:36.397541 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:36.396860 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="32c546ce-12e5-4717-8def-5b55510d8aeb" containerName="extract" Apr 17 21:43:36.403531 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:36.403510 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-tm25p" Apr 17 21:43:36.428961 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:36.428924 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-cnlnv\"" Apr 17 21:43:36.429191 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:36.429175 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\"" Apr 17 21:43:36.432433 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:36.432413 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-tm25p"] Apr 17 21:43:36.593685 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:36.593646 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6pjr\" (UniqueName: \"kubernetes.io/projected/d7abbdd5-20c3-43dc-ab24-9b5945b9a147-kube-api-access-h6pjr\") pod \"dns-operator-controller-manager-648d5c98bc-tm25p\" (UID: \"d7abbdd5-20c3-43dc-ab24-9b5945b9a147\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-tm25p" Apr 17 21:43:36.694489 ip-10-0-132-27 kubenswrapper[2564]: I0417 
21:43:36.694405 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h6pjr\" (UniqueName: \"kubernetes.io/projected/d7abbdd5-20c3-43dc-ab24-9b5945b9a147-kube-api-access-h6pjr\") pod \"dns-operator-controller-manager-648d5c98bc-tm25p\" (UID: \"d7abbdd5-20c3-43dc-ab24-9b5945b9a147\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-tm25p" Apr 17 21:43:36.709090 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:36.709066 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6pjr\" (UniqueName: \"kubernetes.io/projected/d7abbdd5-20c3-43dc-ab24-9b5945b9a147-kube-api-access-h6pjr\") pod \"dns-operator-controller-manager-648d5c98bc-tm25p\" (UID: \"d7abbdd5-20c3-43dc-ab24-9b5945b9a147\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-tm25p" Apr 17 21:43:36.712841 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:36.712813 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-tm25p" Apr 17 21:43:36.838547 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:36.838522 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-tm25p"] Apr 17 21:43:36.840445 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:43:36.840418 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7abbdd5_20c3_43dc_ab24_9b5945b9a147.slice/crio-558617e1234b0e57ac0da7e12c916fd5520e7b37d545e754f3f491b2e542efbd WatchSource:0}: Error finding container 558617e1234b0e57ac0da7e12c916fd5520e7b37d545e754f3f491b2e542efbd: Status 404 returned error can't find the container with id 558617e1234b0e57ac0da7e12c916fd5520e7b37d545e754f3f491b2e542efbd Apr 17 21:43:37.113084 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:37.113041 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-tm25p" event={"ID":"d7abbdd5-20c3-43dc-ab24-9b5945b9a147","Type":"ContainerStarted","Data":"558617e1234b0e57ac0da7e12c916fd5520e7b37d545e754f3f491b2e542efbd"} Apr 17 21:43:39.123587 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:39.123556 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-tm25p" event={"ID":"d7abbdd5-20c3-43dc-ab24-9b5945b9a147","Type":"ContainerStarted","Data":"83eb3eff41de31ad49aee6bb0e05fc2037919fe0f9700af5269c276022c9e59a"} Apr 17 21:43:39.123986 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:39.123634 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-tm25p" Apr 17 21:43:39.143780 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:39.143731 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-tm25p" podStartSLOduration=1.022893331 podStartE2EDuration="3.143717926s" podCreationTimestamp="2026-04-17 21:43:36 +0000 UTC" firstStartedPulling="2026-04-17 21:43:36.842295915 +0000 UTC m=+413.605055372" lastFinishedPulling="2026-04-17 21:43:38.963120502 +0000 UTC m=+415.725879967" observedRunningTime="2026-04-17 21:43:39.140964009 +0000 UTC m=+415.903723490" watchObservedRunningTime="2026-04-17 21:43:39.143717926 +0000 UTC m=+415.906477404" Apr 17 21:43:41.672005 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:41.671970 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-nhj9f"] Apr 17 21:43:41.677762 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:41.677743 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-nhj9f" Apr 17 21:43:41.680325 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:41.680306 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-v5tkv\"" Apr 17 21:43:41.685046 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:41.685022 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-nhj9f"] Apr 17 21:43:41.736508 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:41.736479 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/e1a056c0-0049-4901-8fc4-6e8f3d367a9b-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-nhj9f\" (UID: \"e1a056c0-0049-4901-8fc4-6e8f3d367a9b\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-nhj9f" Apr 17 21:43:41.736867 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:41.736518 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tj8zv\" (UniqueName: \"kubernetes.io/projected/e1a056c0-0049-4901-8fc4-6e8f3d367a9b-kube-api-access-tj8zv\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-nhj9f\" (UID: \"e1a056c0-0049-4901-8fc4-6e8f3d367a9b\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-nhj9f" Apr 17 21:43:41.836895 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:41.836866 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/e1a056c0-0049-4901-8fc4-6e8f3d367a9b-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-nhj9f\" (UID: \"e1a056c0-0049-4901-8fc4-6e8f3d367a9b\") " 
pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-nhj9f" Apr 17 21:43:41.837054 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:41.836904 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tj8zv\" (UniqueName: \"kubernetes.io/projected/e1a056c0-0049-4901-8fc4-6e8f3d367a9b-kube-api-access-tj8zv\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-nhj9f\" (UID: \"e1a056c0-0049-4901-8fc4-6e8f3d367a9b\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-nhj9f" Apr 17 21:43:41.837214 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:41.837195 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/e1a056c0-0049-4901-8fc4-6e8f3d367a9b-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-nhj9f\" (UID: \"e1a056c0-0049-4901-8fc4-6e8f3d367a9b\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-nhj9f" Apr 17 21:43:41.850466 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:41.850440 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tj8zv\" (UniqueName: \"kubernetes.io/projected/e1a056c0-0049-4901-8fc4-6e8f3d367a9b-kube-api-access-tj8zv\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-nhj9f\" (UID: \"e1a056c0-0049-4901-8fc4-6e8f3d367a9b\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-nhj9f" Apr 17 21:43:41.989772 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:41.989688 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-nhj9f" Apr 17 21:43:42.120296 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:42.120269 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-nhj9f"] Apr 17 21:43:42.122053 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:43:42.122024 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1a056c0_0049_4901_8fc4_6e8f3d367a9b.slice/crio-cf2cd86628baf6716f125b42ced52fce9a6795d3b1dfd581ddf15e5d414d1dc9 WatchSource:0}: Error finding container cf2cd86628baf6716f125b42ced52fce9a6795d3b1dfd581ddf15e5d414d1dc9: Status 404 returned error can't find the container with id cf2cd86628baf6716f125b42ced52fce9a6795d3b1dfd581ddf15e5d414d1dc9 Apr 17 21:43:42.137380 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:42.137354 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-nhj9f" event={"ID":"e1a056c0-0049-4901-8fc4-6e8f3d367a9b","Type":"ContainerStarted","Data":"cf2cd86628baf6716f125b42ced52fce9a6795d3b1dfd581ddf15e5d414d1dc9"} Apr 17 21:43:48.162372 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:48.162323 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-nhj9f" event={"ID":"e1a056c0-0049-4901-8fc4-6e8f3d367a9b","Type":"ContainerStarted","Data":"7057afb500180164edfc6d9fe693edc969761d94511ce5afa14f68b30fadac08"} Apr 17 21:43:48.162832 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:48.162410 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-nhj9f" Apr 17 21:43:48.180871 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:48.180827 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-nhj9f" podStartSLOduration=2.166636767 podStartE2EDuration="7.18081151s" podCreationTimestamp="2026-04-17 21:43:41 +0000 UTC" firstStartedPulling="2026-04-17 21:43:42.124530335 +0000 UTC m=+418.887289794" lastFinishedPulling="2026-04-17 21:43:47.13870508 +0000 UTC m=+423.901464537" observedRunningTime="2026-04-17 21:43:48.178830135 +0000 UTC m=+424.941589613" watchObservedRunningTime="2026-04-17 21:43:48.18081151 +0000 UTC m=+424.943570989" Apr 17 21:43:48.792266 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:48.792231 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-qbt98"] Apr 17 21:43:48.796077 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:48.796057 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-qbt98" Apr 17 21:43:48.798657 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:48.798638 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-d24dr\"" Apr 17 21:43:48.805407 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:48.805384 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-qbt98"] Apr 17 21:43:48.895019 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:48.894985 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsxrt\" (UniqueName: \"kubernetes.io/projected/69318013-30f1-47fd-94e0-e17179bc801e-kube-api-access-rsxrt\") pod \"authorino-operator-657f44b778-qbt98\" (UID: \"69318013-30f1-47fd-94e0-e17179bc801e\") " pod="kuadrant-system/authorino-operator-657f44b778-qbt98" Apr 17 21:43:48.995646 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:48.995588 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rsxrt\" (UniqueName: 
\"kubernetes.io/projected/69318013-30f1-47fd-94e0-e17179bc801e-kube-api-access-rsxrt\") pod \"authorino-operator-657f44b778-qbt98\" (UID: \"69318013-30f1-47fd-94e0-e17179bc801e\") " pod="kuadrant-system/authorino-operator-657f44b778-qbt98" Apr 17 21:43:49.003879 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:49.003846 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsxrt\" (UniqueName: \"kubernetes.io/projected/69318013-30f1-47fd-94e0-e17179bc801e-kube-api-access-rsxrt\") pod \"authorino-operator-657f44b778-qbt98\" (UID: \"69318013-30f1-47fd-94e0-e17179bc801e\") " pod="kuadrant-system/authorino-operator-657f44b778-qbt98" Apr 17 21:43:49.107113 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:49.107080 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-qbt98" Apr 17 21:43:49.229115 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:49.229084 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-qbt98"] Apr 17 21:43:49.231566 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:43:49.231537 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69318013_30f1_47fd_94e0_e17179bc801e.slice/crio-fb5177fa7dcea8cadc241fc1ac4cefad8008b7f652e578f7af213ddcb844ab61 WatchSource:0}: Error finding container fb5177fa7dcea8cadc241fc1ac4cefad8008b7f652e578f7af213ddcb844ab61: Status 404 returned error can't find the container with id fb5177fa7dcea8cadc241fc1ac4cefad8008b7f652e578f7af213ddcb844ab61 Apr 17 21:43:50.130845 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:50.130811 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-tm25p" Apr 17 21:43:50.173444 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:50.173382 2564 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="kuadrant-system/authorino-operator-657f44b778-qbt98" event={"ID":"69318013-30f1-47fd-94e0-e17179bc801e","Type":"ContainerStarted","Data":"fb5177fa7dcea8cadc241fc1ac4cefad8008b7f652e578f7af213ddcb844ab61"} Apr 17 21:43:52.182035 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:52.181999 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-qbt98" event={"ID":"69318013-30f1-47fd-94e0-e17179bc801e","Type":"ContainerStarted","Data":"e5059a65eff7c63de9b09d64e2cd1a6d23c35544303adcc383901916c9751662"} Apr 17 21:43:52.182438 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:52.182122 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-657f44b778-qbt98" Apr 17 21:43:52.198258 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:52.198206 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-657f44b778-qbt98" podStartSLOduration=2.008228721 podStartE2EDuration="4.198191199s" podCreationTimestamp="2026-04-17 21:43:48 +0000 UTC" firstStartedPulling="2026-04-17 21:43:49.234060812 +0000 UTC m=+425.996820278" lastFinishedPulling="2026-04-17 21:43:51.424023281 +0000 UTC m=+428.186782756" observedRunningTime="2026-04-17 21:43:52.196495622 +0000 UTC m=+428.959255099" watchObservedRunningTime="2026-04-17 21:43:52.198191199 +0000 UTC m=+428.960950677" Apr 17 21:43:59.169817 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:43:59.169738 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-nhj9f" Apr 17 21:44:00.855912 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:00.855875 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-nhj9f"] Apr 17 21:44:00.856329 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:00.856131 2564 kuberuntime_container.go:864] "Killing 
container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-nhj9f" podUID="e1a056c0-0049-4901-8fc4-6e8f3d367a9b" containerName="manager" containerID="cri-o://7057afb500180164edfc6d9fe693edc969761d94511ce5afa14f68b30fadac08" gracePeriod=2 Apr 17 21:44:00.865951 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:00.865893 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-nhj9f"] Apr 17 21:44:00.878294 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:00.878156 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d6hv4"] Apr 17 21:44:00.878485 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:00.878471 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e1a056c0-0049-4901-8fc4-6e8f3d367a9b" containerName="manager" Apr 17 21:44:00.878537 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:00.878487 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1a056c0-0049-4901-8fc4-6e8f3d367a9b" containerName="manager" Apr 17 21:44:00.878577 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:00.878547 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="e1a056c0-0049-4901-8fc4-6e8f3d367a9b" containerName="manager" Apr 17 21:44:00.881470 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:00.881455 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d6hv4" Apr 17 21:44:00.898067 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:00.898039 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d6hv4"] Apr 17 21:44:00.911643 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:00.911584 2564 status_manager.go:895] "Failed to get status for pod" podUID="e1a056c0-0049-4901-8fc4-6e8f3d367a9b" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-nhj9f" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-nhj9f\" is forbidden: User \"system:node:ip-10-0-132-27.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-132-27.ec2.internal' and this object" Apr 17 21:44:00.990174 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:00.990145 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/301794ce-1781-47c6-9311-5c6df4a3cd7e-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-d6hv4\" (UID: \"301794ce-1781-47c6-9311-5c6df4a3cd7e\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d6hv4" Apr 17 21:44:00.990313 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:00.990185 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5pr4\" (UniqueName: \"kubernetes.io/projected/301794ce-1781-47c6-9311-5c6df4a3cd7e-kube-api-access-n5pr4\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-d6hv4\" (UID: \"301794ce-1781-47c6-9311-5c6df4a3cd7e\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d6hv4" Apr 17 21:44:01.079428 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:01.079405 2564 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-nhj9f" Apr 17 21:44:01.081880 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:01.081855 2564 status_manager.go:895] "Failed to get status for pod" podUID="e1a056c0-0049-4901-8fc4-6e8f3d367a9b" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-nhj9f" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-nhj9f\" is forbidden: User \"system:node:ip-10-0-132-27.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-132-27.ec2.internal' and this object" Apr 17 21:44:01.091272 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:01.091251 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/301794ce-1781-47c6-9311-5c6df4a3cd7e-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-d6hv4\" (UID: \"301794ce-1781-47c6-9311-5c6df4a3cd7e\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d6hv4" Apr 17 21:44:01.091343 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:01.091288 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n5pr4\" (UniqueName: \"kubernetes.io/projected/301794ce-1781-47c6-9311-5c6df4a3cd7e-kube-api-access-n5pr4\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-d6hv4\" (UID: \"301794ce-1781-47c6-9311-5c6df4a3cd7e\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d6hv4" Apr 17 21:44:01.091643 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:01.091619 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/301794ce-1781-47c6-9311-5c6df4a3cd7e-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-d6hv4\" (UID: 
\"301794ce-1781-47c6-9311-5c6df4a3cd7e\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d6hv4" Apr 17 21:44:01.107624 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:01.107539 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5pr4\" (UniqueName: \"kubernetes.io/projected/301794ce-1781-47c6-9311-5c6df4a3cd7e-kube-api-access-n5pr4\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-d6hv4\" (UID: \"301794ce-1781-47c6-9311-5c6df4a3cd7e\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d6hv4" Apr 17 21:44:01.192344 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:01.192317 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/e1a056c0-0049-4901-8fc4-6e8f3d367a9b-extensions-socket-volume\") pod \"e1a056c0-0049-4901-8fc4-6e8f3d367a9b\" (UID: \"e1a056c0-0049-4901-8fc4-6e8f3d367a9b\") " Apr 17 21:44:01.192531 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:01.192351 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tj8zv\" (UniqueName: \"kubernetes.io/projected/e1a056c0-0049-4901-8fc4-6e8f3d367a9b-kube-api-access-tj8zv\") pod \"e1a056c0-0049-4901-8fc4-6e8f3d367a9b\" (UID: \"e1a056c0-0049-4901-8fc4-6e8f3d367a9b\") " Apr 17 21:44:01.192820 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:01.192795 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1a056c0-0049-4901-8fc4-6e8f3d367a9b-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "e1a056c0-0049-4901-8fc4-6e8f3d367a9b" (UID: "e1a056c0-0049-4901-8fc4-6e8f3d367a9b"). InnerVolumeSpecName "extensions-socket-volume". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 21:44:01.194423 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:01.194401 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1a056c0-0049-4901-8fc4-6e8f3d367a9b-kube-api-access-tj8zv" (OuterVolumeSpecName: "kube-api-access-tj8zv") pod "e1a056c0-0049-4901-8fc4-6e8f3d367a9b" (UID: "e1a056c0-0049-4901-8fc4-6e8f3d367a9b"). InnerVolumeSpecName "kube-api-access-tj8zv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 21:44:01.215877 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:01.215848 2564 generic.go:358] "Generic (PLEG): container finished" podID="e1a056c0-0049-4901-8fc4-6e8f3d367a9b" containerID="7057afb500180164edfc6d9fe693edc969761d94511ce5afa14f68b30fadac08" exitCode=0 Apr 17 21:44:01.216003 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:01.215896 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-nhj9f" Apr 17 21:44:01.216003 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:01.215939 2564 scope.go:117] "RemoveContainer" containerID="7057afb500180164edfc6d9fe693edc969761d94511ce5afa14f68b30fadac08" Apr 17 21:44:01.218523 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:01.218494 2564 status_manager.go:895] "Failed to get status for pod" podUID="e1a056c0-0049-4901-8fc4-6e8f3d367a9b" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-nhj9f" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-nhj9f\" is forbidden: User \"system:node:ip-10-0-132-27.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-132-27.ec2.internal' and this object" Apr 17 21:44:01.224863 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:01.224846 2564 scope.go:117] "RemoveContainer" 
containerID="7057afb500180164edfc6d9fe693edc969761d94511ce5afa14f68b30fadac08" Apr 17 21:44:01.225115 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:44:01.225098 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7057afb500180164edfc6d9fe693edc969761d94511ce5afa14f68b30fadac08\": container with ID starting with 7057afb500180164edfc6d9fe693edc969761d94511ce5afa14f68b30fadac08 not found: ID does not exist" containerID="7057afb500180164edfc6d9fe693edc969761d94511ce5afa14f68b30fadac08" Apr 17 21:44:01.225152 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:01.225124 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7057afb500180164edfc6d9fe693edc969761d94511ce5afa14f68b30fadac08"} err="failed to get container status \"7057afb500180164edfc6d9fe693edc969761d94511ce5afa14f68b30fadac08\": rpc error: code = NotFound desc = could not find container \"7057afb500180164edfc6d9fe693edc969761d94511ce5afa14f68b30fadac08\": container with ID starting with 7057afb500180164edfc6d9fe693edc969761d94511ce5afa14f68b30fadac08 not found: ID does not exist" Apr 17 21:44:01.225864 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:01.225844 2564 status_manager.go:895] "Failed to get status for pod" podUID="e1a056c0-0049-4901-8fc4-6e8f3d367a9b" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-nhj9f" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-nhj9f\" is forbidden: User \"system:node:ip-10-0-132-27.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-132-27.ec2.internal' and this object" Apr 17 21:44:01.235028 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:01.234999 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d6hv4" Apr 17 21:44:01.293341 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:01.293307 2564 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/e1a056c0-0049-4901-8fc4-6e8f3d367a9b-extensions-socket-volume\") on node \"ip-10-0-132-27.ec2.internal\" DevicePath \"\"" Apr 17 21:44:01.293341 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:01.293336 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tj8zv\" (UniqueName: \"kubernetes.io/projected/e1a056c0-0049-4901-8fc4-6e8f3d367a9b-kube-api-access-tj8zv\") on node \"ip-10-0-132-27.ec2.internal\" DevicePath \"\"" Apr 17 21:44:01.361276 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:01.361249 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d6hv4"] Apr 17 21:44:01.363571 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:44:01.363548 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod301794ce_1781_47c6_9311_5c6df4a3cd7e.slice/crio-091285be0cc89cfab065371c697e5ca2860976d843a3eaae5836869d2f68a728 WatchSource:0}: Error finding container 091285be0cc89cfab065371c697e5ca2860976d843a3eaae5836869d2f68a728: Status 404 returned error can't find the container with id 091285be0cc89cfab065371c697e5ca2860976d843a3eaae5836869d2f68a728 Apr 17 21:44:01.694579 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:01.694500 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1a056c0-0049-4901-8fc4-6e8f3d367a9b" path="/var/lib/kubelet/pods/e1a056c0-0049-4901-8fc4-6e8f3d367a9b/volumes" Apr 17 21:44:02.221930 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:02.221897 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d6hv4" event={"ID":"301794ce-1781-47c6-9311-5c6df4a3cd7e","Type":"ContainerStarted","Data":"497cca448b420256a648a7b6211259d3456e5b0571a1eb813607ffa667aa3e5f"} Apr 17 21:44:02.221930 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:02.221934 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d6hv4" event={"ID":"301794ce-1781-47c6-9311-5c6df4a3cd7e","Type":"ContainerStarted","Data":"091285be0cc89cfab065371c697e5ca2860976d843a3eaae5836869d2f68a728"} Apr 17 21:44:02.222328 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:02.221965 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d6hv4" Apr 17 21:44:02.245924 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:02.245874 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d6hv4" podStartSLOduration=2.24584022 podStartE2EDuration="2.24584022s" podCreationTimestamp="2026-04-17 21:44:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 21:44:02.245103645 +0000 UTC m=+439.007863123" watchObservedRunningTime="2026-04-17 21:44:02.24584022 +0000 UTC m=+439.008599697" Apr 17 21:44:03.188092 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:03.188061 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-657f44b778-qbt98" Apr 17 21:44:13.228434 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:13.228406 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d6hv4" Apr 17 21:44:31.928357 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:31.928324 2564 kubelet.go:2553] "SyncLoop 
DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d6hv4"] Apr 17 21:44:31.928772 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:31.928547 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d6hv4" podUID="301794ce-1781-47c6-9311-5c6df4a3cd7e" containerName="manager" containerID="cri-o://497cca448b420256a648a7b6211259d3456e5b0571a1eb813607ffa667aa3e5f" gracePeriod=10 Apr 17 21:44:32.163477 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:32.163452 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d6hv4" Apr 17 21:44:32.242264 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:32.242200 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/301794ce-1781-47c6-9311-5c6df4a3cd7e-extensions-socket-volume\") pod \"301794ce-1781-47c6-9311-5c6df4a3cd7e\" (UID: \"301794ce-1781-47c6-9311-5c6df4a3cd7e\") " Apr 17 21:44:32.242397 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:32.242272 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5pr4\" (UniqueName: \"kubernetes.io/projected/301794ce-1781-47c6-9311-5c6df4a3cd7e-kube-api-access-n5pr4\") pod \"301794ce-1781-47c6-9311-5c6df4a3cd7e\" (UID: \"301794ce-1781-47c6-9311-5c6df4a3cd7e\") " Apr 17 21:44:32.242642 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:32.242585 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/301794ce-1781-47c6-9311-5c6df4a3cd7e-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "301794ce-1781-47c6-9311-5c6df4a3cd7e" (UID: "301794ce-1781-47c6-9311-5c6df4a3cd7e"). InnerVolumeSpecName "extensions-socket-volume". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 21:44:32.244323 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:32.244300 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/301794ce-1781-47c6-9311-5c6df4a3cd7e-kube-api-access-n5pr4" (OuterVolumeSpecName: "kube-api-access-n5pr4") pod "301794ce-1781-47c6-9311-5c6df4a3cd7e" (UID: "301794ce-1781-47c6-9311-5c6df4a3cd7e"). InnerVolumeSpecName "kube-api-access-n5pr4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 21:44:32.331447 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:32.331412 2564 generic.go:358] "Generic (PLEG): container finished" podID="301794ce-1781-47c6-9311-5c6df4a3cd7e" containerID="497cca448b420256a648a7b6211259d3456e5b0571a1eb813607ffa667aa3e5f" exitCode=0 Apr 17 21:44:32.331624 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:32.331479 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d6hv4" Apr 17 21:44:32.331624 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:32.331499 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d6hv4" event={"ID":"301794ce-1781-47c6-9311-5c6df4a3cd7e","Type":"ContainerDied","Data":"497cca448b420256a648a7b6211259d3456e5b0571a1eb813607ffa667aa3e5f"} Apr 17 21:44:32.331624 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:32.331539 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d6hv4" event={"ID":"301794ce-1781-47c6-9311-5c6df4a3cd7e","Type":"ContainerDied","Data":"091285be0cc89cfab065371c697e5ca2860976d843a3eaae5836869d2f68a728"} Apr 17 21:44:32.331624 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:32.331557 2564 scope.go:117] "RemoveContainer" containerID="497cca448b420256a648a7b6211259d3456e5b0571a1eb813607ffa667aa3e5f" Apr 17 21:44:32.340111 
ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:32.340098 2564 scope.go:117] "RemoveContainer" containerID="497cca448b420256a648a7b6211259d3456e5b0571a1eb813607ffa667aa3e5f" Apr 17 21:44:32.340369 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:44:32.340351 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"497cca448b420256a648a7b6211259d3456e5b0571a1eb813607ffa667aa3e5f\": container with ID starting with 497cca448b420256a648a7b6211259d3456e5b0571a1eb813607ffa667aa3e5f not found: ID does not exist" containerID="497cca448b420256a648a7b6211259d3456e5b0571a1eb813607ffa667aa3e5f" Apr 17 21:44:32.340410 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:32.340378 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"497cca448b420256a648a7b6211259d3456e5b0571a1eb813607ffa667aa3e5f"} err="failed to get container status \"497cca448b420256a648a7b6211259d3456e5b0571a1eb813607ffa667aa3e5f\": rpc error: code = NotFound desc = could not find container \"497cca448b420256a648a7b6211259d3456e5b0571a1eb813607ffa667aa3e5f\": container with ID starting with 497cca448b420256a648a7b6211259d3456e5b0571a1eb813607ffa667aa3e5f not found: ID does not exist" Apr 17 21:44:32.342849 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:32.342832 2564 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/301794ce-1781-47c6-9311-5c6df4a3cd7e-extensions-socket-volume\") on node \"ip-10-0-132-27.ec2.internal\" DevicePath \"\"" Apr 17 21:44:32.342914 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:32.342850 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-n5pr4\" (UniqueName: \"kubernetes.io/projected/301794ce-1781-47c6-9311-5c6df4a3cd7e-kube-api-access-n5pr4\") on node \"ip-10-0-132-27.ec2.internal\" DevicePath \"\"" Apr 17 21:44:32.354282 ip-10-0-132-27 kubenswrapper[2564]: I0417 
21:44:32.354260 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d6hv4"] Apr 17 21:44:32.356427 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:32.356407 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d6hv4"] Apr 17 21:44:33.694557 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:33.694524 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="301794ce-1781-47c6-9311-5c6df4a3cd7e" path="/var/lib/kubelet/pods/301794ce-1781-47c6-9311-5c6df4a3cd7e/volumes" Apr 17 21:44:48.150476 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:48.150440 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-rqwh8"] Apr 17 21:44:48.150890 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:48.150765 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="301794ce-1781-47c6-9311-5c6df4a3cd7e" containerName="manager" Apr 17 21:44:48.150890 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:48.150776 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="301794ce-1781-47c6-9311-5c6df4a3cd7e" containerName="manager" Apr 17 21:44:48.150890 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:48.150844 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="301794ce-1781-47c6-9311-5c6df4a3cd7e" containerName="manager" Apr 17 21:44:48.155192 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:48.155177 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-rqwh8" Apr 17 21:44:48.158033 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:48.157992 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"maas-default-gateway-openshift-default-dockercfg-bn97q\"" Apr 17 21:44:48.165429 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:48.165397 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-rqwh8"] Apr 17 21:44:48.266939 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:48.266904 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/a23de3a7-3163-4907-9eb0-72991fc680a6-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-rqwh8\" (UID: \"a23de3a7-3163-4907-9eb0-72991fc680a6\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-rqwh8" Apr 17 21:44:48.266939 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:48.266939 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/a23de3a7-3163-4907-9eb0-72991fc680a6-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-rqwh8\" (UID: \"a23de3a7-3163-4907-9eb0-72991fc680a6\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-rqwh8" Apr 17 21:44:48.267159 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:48.266971 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/a23de3a7-3163-4907-9eb0-72991fc680a6-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-rqwh8\" (UID: \"a23de3a7-3163-4907-9eb0-72991fc680a6\") " 
pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-rqwh8" Apr 17 21:44:48.267159 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:48.266987 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/a23de3a7-3163-4907-9eb0-72991fc680a6-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-rqwh8\" (UID: \"a23de3a7-3163-4907-9eb0-72991fc680a6\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-rqwh8" Apr 17 21:44:48.267159 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:48.267060 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/a23de3a7-3163-4907-9eb0-72991fc680a6-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-rqwh8\" (UID: \"a23de3a7-3163-4907-9eb0-72991fc680a6\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-rqwh8" Apr 17 21:44:48.267159 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:48.267095 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/a23de3a7-3163-4907-9eb0-72991fc680a6-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-rqwh8\" (UID: \"a23de3a7-3163-4907-9eb0-72991fc680a6\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-rqwh8" Apr 17 21:44:48.267159 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:48.267117 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/a23de3a7-3163-4907-9eb0-72991fc680a6-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-rqwh8\" (UID: \"a23de3a7-3163-4907-9eb0-72991fc680a6\") " 
pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-rqwh8" Apr 17 21:44:48.267316 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:48.267160 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8n978\" (UniqueName: \"kubernetes.io/projected/a23de3a7-3163-4907-9eb0-72991fc680a6-kube-api-access-8n978\") pod \"maas-default-gateway-openshift-default-58b6f876-rqwh8\" (UID: \"a23de3a7-3163-4907-9eb0-72991fc680a6\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-rqwh8" Apr 17 21:44:48.267316 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:48.267193 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/a23de3a7-3163-4907-9eb0-72991fc680a6-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-rqwh8\" (UID: \"a23de3a7-3163-4907-9eb0-72991fc680a6\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-rqwh8" Apr 17 21:44:48.368004 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:48.367956 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/a23de3a7-3163-4907-9eb0-72991fc680a6-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-rqwh8\" (UID: \"a23de3a7-3163-4907-9eb0-72991fc680a6\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-rqwh8" Apr 17 21:44:48.368004 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:48.368006 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/a23de3a7-3163-4907-9eb0-72991fc680a6-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-rqwh8\" (UID: \"a23de3a7-3163-4907-9eb0-72991fc680a6\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-rqwh8" Apr 17 
21:44:48.368004 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:48.368023 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/a23de3a7-3163-4907-9eb0-72991fc680a6-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-rqwh8\" (UID: \"a23de3a7-3163-4907-9eb0-72991fc680a6\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-rqwh8" Apr 17 21:44:48.368302 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:48.368052 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/a23de3a7-3163-4907-9eb0-72991fc680a6-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-rqwh8\" (UID: \"a23de3a7-3163-4907-9eb0-72991fc680a6\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-rqwh8" Apr 17 21:44:48.368302 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:48.368067 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/a23de3a7-3163-4907-9eb0-72991fc680a6-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-rqwh8\" (UID: \"a23de3a7-3163-4907-9eb0-72991fc680a6\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-rqwh8" Apr 17 21:44:48.368302 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:48.368094 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/a23de3a7-3163-4907-9eb0-72991fc680a6-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-rqwh8\" (UID: \"a23de3a7-3163-4907-9eb0-72991fc680a6\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-rqwh8" Apr 17 21:44:48.368302 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:48.368115 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/a23de3a7-3163-4907-9eb0-72991fc680a6-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-rqwh8\" (UID: \"a23de3a7-3163-4907-9eb0-72991fc680a6\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-rqwh8" Apr 17 21:44:48.368302 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:48.368133 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/a23de3a7-3163-4907-9eb0-72991fc680a6-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-rqwh8\" (UID: \"a23de3a7-3163-4907-9eb0-72991fc680a6\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-rqwh8" Apr 17 21:44:48.368302 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:48.368182 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8n978\" (UniqueName: \"kubernetes.io/projected/a23de3a7-3163-4907-9eb0-72991fc680a6-kube-api-access-8n978\") pod \"maas-default-gateway-openshift-default-58b6f876-rqwh8\" (UID: \"a23de3a7-3163-4907-9eb0-72991fc680a6\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-rqwh8" Apr 17 21:44:48.368625 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:48.368416 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/a23de3a7-3163-4907-9eb0-72991fc680a6-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-rqwh8\" (UID: \"a23de3a7-3163-4907-9eb0-72991fc680a6\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-rqwh8" Apr 17 21:44:48.368625 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:48.368544 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/a23de3a7-3163-4907-9eb0-72991fc680a6-istio-data\") pod 
\"maas-default-gateway-openshift-default-58b6f876-rqwh8\" (UID: \"a23de3a7-3163-4907-9eb0-72991fc680a6\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-rqwh8" Apr 17 21:44:48.368738 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:48.368668 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/a23de3a7-3163-4907-9eb0-72991fc680a6-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-rqwh8\" (UID: \"a23de3a7-3163-4907-9eb0-72991fc680a6\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-rqwh8" Apr 17 21:44:48.368738 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:48.368684 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/a23de3a7-3163-4907-9eb0-72991fc680a6-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-rqwh8\" (UID: \"a23de3a7-3163-4907-9eb0-72991fc680a6\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-rqwh8" Apr 17 21:44:48.368819 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:48.368800 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/a23de3a7-3163-4907-9eb0-72991fc680a6-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-rqwh8\" (UID: \"a23de3a7-3163-4907-9eb0-72991fc680a6\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-rqwh8" Apr 17 21:44:48.370388 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:48.370370 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/a23de3a7-3163-4907-9eb0-72991fc680a6-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-rqwh8\" (UID: \"a23de3a7-3163-4907-9eb0-72991fc680a6\") " 
pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-rqwh8" Apr 17 21:44:48.370534 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:48.370517 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/a23de3a7-3163-4907-9eb0-72991fc680a6-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-rqwh8\" (UID: \"a23de3a7-3163-4907-9eb0-72991fc680a6\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-rqwh8" Apr 17 21:44:48.375401 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:48.375377 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/a23de3a7-3163-4907-9eb0-72991fc680a6-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-rqwh8\" (UID: \"a23de3a7-3163-4907-9eb0-72991fc680a6\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-rqwh8" Apr 17 21:44:48.375522 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:48.375379 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8n978\" (UniqueName: \"kubernetes.io/projected/a23de3a7-3163-4907-9eb0-72991fc680a6-kube-api-access-8n978\") pod \"maas-default-gateway-openshift-default-58b6f876-rqwh8\" (UID: \"a23de3a7-3163-4907-9eb0-72991fc680a6\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-rqwh8" Apr 17 21:44:48.470212 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:48.470145 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-rqwh8" Apr 17 21:44:48.597337 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:48.597304 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-rqwh8"] Apr 17 21:44:48.598995 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:44:48.598967 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda23de3a7_3163_4907_9eb0_72991fc680a6.slice/crio-e80b8786c7f8ede82318835a2c12488183dccec050925ed3cbc2c08ebb85632d WatchSource:0}: Error finding container e80b8786c7f8ede82318835a2c12488183dccec050925ed3cbc2c08ebb85632d: Status 404 returned error can't find the container with id e80b8786c7f8ede82318835a2c12488183dccec050925ed3cbc2c08ebb85632d Apr 17 21:44:48.601245 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:48.601213 2564 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 17 21:44:48.601358 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:48.601305 2564 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 17 21:44:48.601419 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:48.601363 2564 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 17 21:44:49.399755 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:49.399716 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-rqwh8" 
event={"ID":"a23de3a7-3163-4907-9eb0-72991fc680a6","Type":"ContainerStarted","Data":"0091710431db073dbd710af70b40a1291e799881b787983ba9ffb7115a365893"} Apr 17 21:44:49.399755 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:49.399754 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-rqwh8" event={"ID":"a23de3a7-3163-4907-9eb0-72991fc680a6","Type":"ContainerStarted","Data":"e80b8786c7f8ede82318835a2c12488183dccec050925ed3cbc2c08ebb85632d"} Apr 17 21:44:49.419206 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:49.419156 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-rqwh8" podStartSLOduration=1.419143142 podStartE2EDuration="1.419143142s" podCreationTimestamp="2026-04-17 21:44:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 21:44:49.417537777 +0000 UTC m=+486.180297256" watchObservedRunningTime="2026-04-17 21:44:49.419143142 +0000 UTC m=+486.181902619" Apr 17 21:44:49.470826 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:49.470794 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-rqwh8" Apr 17 21:44:49.475579 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:49.475558 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-rqwh8" Apr 17 21:44:50.403425 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:50.403394 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-rqwh8" Apr 17 21:44:50.404428 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:50.404409 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-rqwh8" Apr 17 21:44:52.320939 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:52.320858 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-nsvj5"] Apr 17 21:44:52.324251 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:52.324234 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-nsvj5" Apr 17 21:44:52.326714 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:52.326695 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 17 21:44:52.326812 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:52.326700 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-qh6pg\"" Apr 17 21:44:52.332907 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:52.332625 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-nsvj5"] Apr 17 21:44:52.395217 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:52.395182 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-nsvj5"] Apr 17 21:44:52.504662 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:52.504623 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/dadc5d5b-671c-4a25-a5c1-32a1a54fe4ab-config-file\") pod \"limitador-limitador-7d549b5b-nsvj5\" (UID: \"dadc5d5b-671c-4a25-a5c1-32a1a54fe4ab\") " pod="kuadrant-system/limitador-limitador-7d549b5b-nsvj5" Apr 17 21:44:52.504835 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:52.504679 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnf4g\" (UniqueName: 
\"kubernetes.io/projected/dadc5d5b-671c-4a25-a5c1-32a1a54fe4ab-kube-api-access-tnf4g\") pod \"limitador-limitador-7d549b5b-nsvj5\" (UID: \"dadc5d5b-671c-4a25-a5c1-32a1a54fe4ab\") " pod="kuadrant-system/limitador-limitador-7d549b5b-nsvj5" Apr 17 21:44:52.605205 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:52.605166 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/dadc5d5b-671c-4a25-a5c1-32a1a54fe4ab-config-file\") pod \"limitador-limitador-7d549b5b-nsvj5\" (UID: \"dadc5d5b-671c-4a25-a5c1-32a1a54fe4ab\") " pod="kuadrant-system/limitador-limitador-7d549b5b-nsvj5" Apr 17 21:44:52.605409 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:52.605259 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tnf4g\" (UniqueName: \"kubernetes.io/projected/dadc5d5b-671c-4a25-a5c1-32a1a54fe4ab-kube-api-access-tnf4g\") pod \"limitador-limitador-7d549b5b-nsvj5\" (UID: \"dadc5d5b-671c-4a25-a5c1-32a1a54fe4ab\") " pod="kuadrant-system/limitador-limitador-7d549b5b-nsvj5" Apr 17 21:44:52.605819 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:52.605800 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/dadc5d5b-671c-4a25-a5c1-32a1a54fe4ab-config-file\") pod \"limitador-limitador-7d549b5b-nsvj5\" (UID: \"dadc5d5b-671c-4a25-a5c1-32a1a54fe4ab\") " pod="kuadrant-system/limitador-limitador-7d549b5b-nsvj5" Apr 17 21:44:52.613625 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:52.613605 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnf4g\" (UniqueName: \"kubernetes.io/projected/dadc5d5b-671c-4a25-a5c1-32a1a54fe4ab-kube-api-access-tnf4g\") pod \"limitador-limitador-7d549b5b-nsvj5\" (UID: \"dadc5d5b-671c-4a25-a5c1-32a1a54fe4ab\") " pod="kuadrant-system/limitador-limitador-7d549b5b-nsvj5" Apr 17 21:44:52.637443 ip-10-0-132-27 kubenswrapper[2564]: 
I0417 21:44:52.637418 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-nsvj5" Apr 17 21:44:52.783641 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:52.783615 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-nsvj5"] Apr 17 21:44:52.785108 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:44:52.785080 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddadc5d5b_671c_4a25_a5c1_32a1a54fe4ab.slice/crio-f40a2c5ce11347b94f2dede34537f936c07283eedfcb79f5e58e912cc8846a7a WatchSource:0}: Error finding container f40a2c5ce11347b94f2dede34537f936c07283eedfcb79f5e58e912cc8846a7a: Status 404 returned error can't find the container with id f40a2c5ce11347b94f2dede34537f936c07283eedfcb79f5e58e912cc8846a7a Apr 17 21:44:53.168248 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:53.168210 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7498df8756-7ldkp"] Apr 17 21:44:53.172831 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:53.172815 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-7ldkp" Apr 17 21:44:53.175333 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:53.175310 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-c5c6h\"" Apr 17 21:44:53.176826 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:53.176803 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-7ldkp"] Apr 17 21:44:53.309246 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:53.309215 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jm6c5\" (UniqueName: \"kubernetes.io/projected/c0b07de7-4ca9-4a1c-9a22-b0fa3377106f-kube-api-access-jm6c5\") pod \"authorino-7498df8756-7ldkp\" (UID: \"c0b07de7-4ca9-4a1c-9a22-b0fa3377106f\") " pod="kuadrant-system/authorino-7498df8756-7ldkp" Apr 17 21:44:53.409846 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:53.409814 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jm6c5\" (UniqueName: \"kubernetes.io/projected/c0b07de7-4ca9-4a1c-9a22-b0fa3377106f-kube-api-access-jm6c5\") pod \"authorino-7498df8756-7ldkp\" (UID: \"c0b07de7-4ca9-4a1c-9a22-b0fa3377106f\") " pod="kuadrant-system/authorino-7498df8756-7ldkp" Apr 17 21:44:53.414097 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:53.414070 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-nsvj5" event={"ID":"dadc5d5b-671c-4a25-a5c1-32a1a54fe4ab","Type":"ContainerStarted","Data":"f40a2c5ce11347b94f2dede34537f936c07283eedfcb79f5e58e912cc8846a7a"} Apr 17 21:44:53.418995 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:53.418948 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jm6c5\" (UniqueName: \"kubernetes.io/projected/c0b07de7-4ca9-4a1c-9a22-b0fa3377106f-kube-api-access-jm6c5\") pod 
\"authorino-7498df8756-7ldkp\" (UID: \"c0b07de7-4ca9-4a1c-9a22-b0fa3377106f\") " pod="kuadrant-system/authorino-7498df8756-7ldkp" Apr 17 21:44:53.483012 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:53.482986 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-7ldkp" Apr 17 21:44:53.661478 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:53.661446 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-7ldkp"] Apr 17 21:44:53.666315 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:44:53.666273 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc0b07de7_4ca9_4a1c_9a22_b0fa3377106f.slice/crio-e3f01a0c18a030ed9af19624b0fb80e3c8099d6237696edee3d7efc7c435843d WatchSource:0}: Error finding container e3f01a0c18a030ed9af19624b0fb80e3c8099d6237696edee3d7efc7c435843d: Status 404 returned error can't find the container with id e3f01a0c18a030ed9af19624b0fb80e3c8099d6237696edee3d7efc7c435843d Apr 17 21:44:54.421458 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:54.421421 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-7ldkp" event={"ID":"c0b07de7-4ca9-4a1c-9a22-b0fa3377106f","Type":"ContainerStarted","Data":"e3f01a0c18a030ed9af19624b0fb80e3c8099d6237696edee3d7efc7c435843d"} Apr 17 21:44:57.434491 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:57.434448 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-nsvj5" event={"ID":"dadc5d5b-671c-4a25-a5c1-32a1a54fe4ab","Type":"ContainerStarted","Data":"314a8ecfaac929c4f7e004986d7ba47d66d851df109f7dccf7f934f781db1f0d"} Apr 17 21:44:57.435029 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:57.434576 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-7d549b5b-nsvj5" Apr 17 
21:44:57.435773 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:57.435748 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-7ldkp" event={"ID":"c0b07de7-4ca9-4a1c-9a22-b0fa3377106f","Type":"ContainerStarted","Data":"168dccd01a62eb8321981e2d98cbd9d8c931c2a761f3955f6df3a8c17e6c8988"} Apr 17 21:44:57.453584 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:57.453540 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-7d549b5b-nsvj5" podStartSLOduration=1.6453363859999999 podStartE2EDuration="5.453530168s" podCreationTimestamp="2026-04-17 21:44:52 +0000 UTC" firstStartedPulling="2026-04-17 21:44:52.786858607 +0000 UTC m=+489.549618062" lastFinishedPulling="2026-04-17 21:44:56.595052388 +0000 UTC m=+493.357811844" observedRunningTime="2026-04-17 21:44:57.45184128 +0000 UTC m=+494.214600757" watchObservedRunningTime="2026-04-17 21:44:57.453530168 +0000 UTC m=+494.216289645" Apr 17 21:44:57.466067 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:44:57.466024 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-7498df8756-7ldkp" podStartSLOduration=1.541936254 podStartE2EDuration="4.466012038s" podCreationTimestamp="2026-04-17 21:44:53 +0000 UTC" firstStartedPulling="2026-04-17 21:44:53.668693607 +0000 UTC m=+490.431453068" lastFinishedPulling="2026-04-17 21:44:56.592769381 +0000 UTC m=+493.355528852" observedRunningTime="2026-04-17 21:44:57.464361808 +0000 UTC m=+494.227121286" watchObservedRunningTime="2026-04-17 21:44:57.466012038 +0000 UTC m=+494.228771515" Apr 17 21:45:07.626260 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:07.626228 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-nsvj5"] Apr 17 21:45:07.626723 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:07.626472 2564 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kuadrant-system/limitador-limitador-7d549b5b-nsvj5" podUID="dadc5d5b-671c-4a25-a5c1-32a1a54fe4ab" containerName="limitador" containerID="cri-o://314a8ecfaac929c4f7e004986d7ba47d66d851df109f7dccf7f934f781db1f0d" gracePeriod=30 Apr 17 21:45:07.627160 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:07.627065 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-7d549b5b-nsvj5" Apr 17 21:45:08.165177 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:08.165153 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-nsvj5" Apr 17 21:45:08.235117 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:08.235044 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/dadc5d5b-671c-4a25-a5c1-32a1a54fe4ab-config-file\") pod \"dadc5d5b-671c-4a25-a5c1-32a1a54fe4ab\" (UID: \"dadc5d5b-671c-4a25-a5c1-32a1a54fe4ab\") " Apr 17 21:45:08.235117 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:08.235093 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnf4g\" (UniqueName: \"kubernetes.io/projected/dadc5d5b-671c-4a25-a5c1-32a1a54fe4ab-kube-api-access-tnf4g\") pod \"dadc5d5b-671c-4a25-a5c1-32a1a54fe4ab\" (UID: \"dadc5d5b-671c-4a25-a5c1-32a1a54fe4ab\") " Apr 17 21:45:08.235391 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:08.235366 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dadc5d5b-671c-4a25-a5c1-32a1a54fe4ab-config-file" (OuterVolumeSpecName: "config-file") pod "dadc5d5b-671c-4a25-a5c1-32a1a54fe4ab" (UID: "dadc5d5b-671c-4a25-a5c1-32a1a54fe4ab"). InnerVolumeSpecName "config-file". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 21:45:08.237115 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:08.237094 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dadc5d5b-671c-4a25-a5c1-32a1a54fe4ab-kube-api-access-tnf4g" (OuterVolumeSpecName: "kube-api-access-tnf4g") pod "dadc5d5b-671c-4a25-a5c1-32a1a54fe4ab" (UID: "dadc5d5b-671c-4a25-a5c1-32a1a54fe4ab"). InnerVolumeSpecName "kube-api-access-tnf4g". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 21:45:08.335951 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:08.335923 2564 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/dadc5d5b-671c-4a25-a5c1-32a1a54fe4ab-config-file\") on node \"ip-10-0-132-27.ec2.internal\" DevicePath \"\"" Apr 17 21:45:08.335951 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:08.335952 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tnf4g\" (UniqueName: \"kubernetes.io/projected/dadc5d5b-671c-4a25-a5c1-32a1a54fe4ab-kube-api-access-tnf4g\") on node \"ip-10-0-132-27.ec2.internal\" DevicePath \"\"" Apr 17 21:45:08.477115 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:08.477079 2564 generic.go:358] "Generic (PLEG): container finished" podID="dadc5d5b-671c-4a25-a5c1-32a1a54fe4ab" containerID="314a8ecfaac929c4f7e004986d7ba47d66d851df109f7dccf7f934f781db1f0d" exitCode=0 Apr 17 21:45:08.477278 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:08.477147 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-nsvj5" Apr 17 21:45:08.477278 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:08.477161 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-nsvj5" event={"ID":"dadc5d5b-671c-4a25-a5c1-32a1a54fe4ab","Type":"ContainerDied","Data":"314a8ecfaac929c4f7e004986d7ba47d66d851df109f7dccf7f934f781db1f0d"} Apr 17 21:45:08.477278 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:08.477200 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-nsvj5" event={"ID":"dadc5d5b-671c-4a25-a5c1-32a1a54fe4ab","Type":"ContainerDied","Data":"f40a2c5ce11347b94f2dede34537f936c07283eedfcb79f5e58e912cc8846a7a"} Apr 17 21:45:08.477278 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:08.477216 2564 scope.go:117] "RemoveContainer" containerID="314a8ecfaac929c4f7e004986d7ba47d66d851df109f7dccf7f934f781db1f0d" Apr 17 21:45:08.486191 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:08.486170 2564 scope.go:117] "RemoveContainer" containerID="314a8ecfaac929c4f7e004986d7ba47d66d851df109f7dccf7f934f781db1f0d" Apr 17 21:45:08.486474 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:45:08.486455 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"314a8ecfaac929c4f7e004986d7ba47d66d851df109f7dccf7f934f781db1f0d\": container with ID starting with 314a8ecfaac929c4f7e004986d7ba47d66d851df109f7dccf7f934f781db1f0d not found: ID does not exist" containerID="314a8ecfaac929c4f7e004986d7ba47d66d851df109f7dccf7f934f781db1f0d" Apr 17 21:45:08.486523 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:08.486483 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"314a8ecfaac929c4f7e004986d7ba47d66d851df109f7dccf7f934f781db1f0d"} err="failed to get container status \"314a8ecfaac929c4f7e004986d7ba47d66d851df109f7dccf7f934f781db1f0d\": 
rpc error: code = NotFound desc = could not find container \"314a8ecfaac929c4f7e004986d7ba47d66d851df109f7dccf7f934f781db1f0d\": container with ID starting with 314a8ecfaac929c4f7e004986d7ba47d66d851df109f7dccf7f934f781db1f0d not found: ID does not exist" Apr 17 21:45:08.498362 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:08.498333 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-nsvj5"] Apr 17 21:45:08.501292 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:08.501271 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-nsvj5"] Apr 17 21:45:09.694207 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:09.694177 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dadc5d5b-671c-4a25-a5c1-32a1a54fe4ab" path="/var/lib/kubelet/pods/dadc5d5b-671c-4a25-a5c1-32a1a54fe4ab/volumes" Apr 17 21:45:13.392059 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:13.392024 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/postgres-868db5846d-rkqpw"] Apr 17 21:45:13.392441 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:13.392338 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dadc5d5b-671c-4a25-a5c1-32a1a54fe4ab" containerName="limitador" Apr 17 21:45:13.392441 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:13.392349 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="dadc5d5b-671c-4a25-a5c1-32a1a54fe4ab" containerName="limitador" Apr 17 21:45:13.392441 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:13.392417 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="dadc5d5b-671c-4a25-a5c1-32a1a54fe4ab" containerName="limitador" Apr 17 21:45:13.396488 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:13.396472 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/postgres-868db5846d-rkqpw" Apr 17 21:45:13.399185 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:13.399159 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"default-dockercfg-pf959\"" Apr 17 21:45:13.399306 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:13.399162 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"postgres-creds\"" Apr 17 21:45:13.400804 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:13.400781 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-rkqpw"] Apr 17 21:45:13.479027 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:13.478989 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/6be905ab-e9cd-42c4-982a-d80fc553f8f9-data\") pod \"postgres-868db5846d-rkqpw\" (UID: \"6be905ab-e9cd-42c4-982a-d80fc553f8f9\") " pod="opendatahub/postgres-868db5846d-rkqpw" Apr 17 21:45:13.479216 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:13.479051 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qpgb\" (UniqueName: \"kubernetes.io/projected/6be905ab-e9cd-42c4-982a-d80fc553f8f9-kube-api-access-9qpgb\") pod \"postgres-868db5846d-rkqpw\" (UID: \"6be905ab-e9cd-42c4-982a-d80fc553f8f9\") " pod="opendatahub/postgres-868db5846d-rkqpw" Apr 17 21:45:13.580115 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:13.580080 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/6be905ab-e9cd-42c4-982a-d80fc553f8f9-data\") pod \"postgres-868db5846d-rkqpw\" (UID: \"6be905ab-e9cd-42c4-982a-d80fc553f8f9\") " pod="opendatahub/postgres-868db5846d-rkqpw" Apr 17 21:45:13.580289 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:13.580140 2564 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-9qpgb\" (UniqueName: \"kubernetes.io/projected/6be905ab-e9cd-42c4-982a-d80fc553f8f9-kube-api-access-9qpgb\") pod \"postgres-868db5846d-rkqpw\" (UID: \"6be905ab-e9cd-42c4-982a-d80fc553f8f9\") " pod="opendatahub/postgres-868db5846d-rkqpw" Apr 17 21:45:13.580511 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:13.580489 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/6be905ab-e9cd-42c4-982a-d80fc553f8f9-data\") pod \"postgres-868db5846d-rkqpw\" (UID: \"6be905ab-e9cd-42c4-982a-d80fc553f8f9\") " pod="opendatahub/postgres-868db5846d-rkqpw" Apr 17 21:45:13.587621 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:13.587569 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qpgb\" (UniqueName: \"kubernetes.io/projected/6be905ab-e9cd-42c4-982a-d80fc553f8f9-kube-api-access-9qpgb\") pod \"postgres-868db5846d-rkqpw\" (UID: \"6be905ab-e9cd-42c4-982a-d80fc553f8f9\") " pod="opendatahub/postgres-868db5846d-rkqpw" Apr 17 21:45:13.708436 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:13.708345 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/postgres-868db5846d-rkqpw" Apr 17 21:45:13.832307 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:13.832282 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-rkqpw"] Apr 17 21:45:13.834139 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:45:13.834111 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6be905ab_e9cd_42c4_982a_d80fc553f8f9.slice/crio-3281260c557b946adae9183311243dc6a7f2a8b4de5673a7270e8b21dd8073c0 WatchSource:0}: Error finding container 3281260c557b946adae9183311243dc6a7f2a8b4de5673a7270e8b21dd8073c0: Status 404 returned error can't find the container with id 3281260c557b946adae9183311243dc6a7f2a8b4de5673a7270e8b21dd8073c0 Apr 17 21:45:14.507510 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:14.507471 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-rkqpw" event={"ID":"6be905ab-e9cd-42c4-982a-d80fc553f8f9","Type":"ContainerStarted","Data":"3281260c557b946adae9183311243dc6a7f2a8b4de5673a7270e8b21dd8073c0"} Apr 17 21:45:19.528954 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:19.528914 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-rkqpw" event={"ID":"6be905ab-e9cd-42c4-982a-d80fc553f8f9","Type":"ContainerStarted","Data":"14b110c337eca2d22ce7b21ff5b9f72f9a99f7991f3895001ef70f5ec274de18"} Apr 17 21:45:19.529456 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:19.528967 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/postgres-868db5846d-rkqpw" Apr 17 21:45:19.545421 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:19.545370 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/postgres-868db5846d-rkqpw" podStartSLOduration=1.7978781160000001 podStartE2EDuration="6.545355568s" podCreationTimestamp="2026-04-17 21:45:13 +0000 UTC" 
firstStartedPulling="2026-04-17 21:45:13.835759559 +0000 UTC m=+510.598519016" lastFinishedPulling="2026-04-17 21:45:18.583237012 +0000 UTC m=+515.345996468" observedRunningTime="2026-04-17 21:45:19.543090921 +0000 UTC m=+516.305850400" watchObservedRunningTime="2026-04-17 21:45:19.545355568 +0000 UTC m=+516.308115045" Apr 17 21:45:25.563706 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:25.563631 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/postgres-868db5846d-rkqpw" Apr 17 21:45:26.329652 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:26.329611 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-5b656864f6-7gq75"] Apr 17 21:45:26.333048 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:26.333030 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-5b656864f6-7gq75" Apr 17 21:45:26.340138 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:26.339895 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-5b656864f6-7gq75"] Apr 17 21:45:26.400585 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:26.400546 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4lpc\" (UniqueName: \"kubernetes.io/projected/80f9981b-bbe7-42b4-937b-b740e292f587-kube-api-access-r4lpc\") pod \"authorino-5b656864f6-7gq75\" (UID: \"80f9981b-bbe7-42b4-937b-b740e292f587\") " pod="kuadrant-system/authorino-5b656864f6-7gq75" Apr 17 21:45:26.420381 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:26.420348 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-5b656864f6-7gq75"] Apr 17 21:45:26.420628 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:45:26.420609 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-r4lpc], unattached volumes=[], failed to process volumes=[]: context canceled" 
pod="kuadrant-system/authorino-5b656864f6-7gq75" podUID="80f9981b-bbe7-42b4-937b-b740e292f587" Apr 17 21:45:26.445228 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:26.445190 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-55b44f5d48-lx5xx"] Apr 17 21:45:26.448699 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:26.448681 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-55b44f5d48-lx5xx" Apr 17 21:45:26.451312 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:26.451290 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\"" Apr 17 21:45:26.455970 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:26.455812 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-55b44f5d48-lx5xx"] Apr 17 21:45:26.501438 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:26.501402 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxqvp\" (UniqueName: \"kubernetes.io/projected/b3cbbc88-5d35-42e3-a3ac-e14e31ce1d19-kube-api-access-lxqvp\") pod \"authorino-55b44f5d48-lx5xx\" (UID: \"b3cbbc88-5d35-42e3-a3ac-e14e31ce1d19\") " pod="kuadrant-system/authorino-55b44f5d48-lx5xx" Apr 17 21:45:26.501633 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:26.501459 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/b3cbbc88-5d35-42e3-a3ac-e14e31ce1d19-tls-cert\") pod \"authorino-55b44f5d48-lx5xx\" (UID: \"b3cbbc88-5d35-42e3-a3ac-e14e31ce1d19\") " pod="kuadrant-system/authorino-55b44f5d48-lx5xx" Apr 17 21:45:26.501633 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:26.501577 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r4lpc\" (UniqueName: 
\"kubernetes.io/projected/80f9981b-bbe7-42b4-937b-b740e292f587-kube-api-access-r4lpc\") pod \"authorino-5b656864f6-7gq75\" (UID: \"80f9981b-bbe7-42b4-937b-b740e292f587\") " pod="kuadrant-system/authorino-5b656864f6-7gq75" Apr 17 21:45:26.510930 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:26.510909 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4lpc\" (UniqueName: \"kubernetes.io/projected/80f9981b-bbe7-42b4-937b-b740e292f587-kube-api-access-r4lpc\") pod \"authorino-5b656864f6-7gq75\" (UID: \"80f9981b-bbe7-42b4-937b-b740e292f587\") " pod="kuadrant-system/authorino-5b656864f6-7gq75" Apr 17 21:45:26.556536 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:26.556506 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-5b656864f6-7gq75" Apr 17 21:45:26.561583 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:26.561558 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-5b656864f6-7gq75" Apr 17 21:45:26.602851 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:26.602819 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lxqvp\" (UniqueName: \"kubernetes.io/projected/b3cbbc88-5d35-42e3-a3ac-e14e31ce1d19-kube-api-access-lxqvp\") pod \"authorino-55b44f5d48-lx5xx\" (UID: \"b3cbbc88-5d35-42e3-a3ac-e14e31ce1d19\") " pod="kuadrant-system/authorino-55b44f5d48-lx5xx" Apr 17 21:45:26.603233 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:26.602873 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/b3cbbc88-5d35-42e3-a3ac-e14e31ce1d19-tls-cert\") pod \"authorino-55b44f5d48-lx5xx\" (UID: \"b3cbbc88-5d35-42e3-a3ac-e14e31ce1d19\") " pod="kuadrant-system/authorino-55b44f5d48-lx5xx" Apr 17 21:45:26.605218 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:26.605199 2564 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/b3cbbc88-5d35-42e3-a3ac-e14e31ce1d19-tls-cert\") pod \"authorino-55b44f5d48-lx5xx\" (UID: \"b3cbbc88-5d35-42e3-a3ac-e14e31ce1d19\") " pod="kuadrant-system/authorino-55b44f5d48-lx5xx" Apr 17 21:45:26.614793 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:26.614774 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxqvp\" (UniqueName: \"kubernetes.io/projected/b3cbbc88-5d35-42e3-a3ac-e14e31ce1d19-kube-api-access-lxqvp\") pod \"authorino-55b44f5d48-lx5xx\" (UID: \"b3cbbc88-5d35-42e3-a3ac-e14e31ce1d19\") " pod="kuadrant-system/authorino-55b44f5d48-lx5xx" Apr 17 21:45:26.704115 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:26.704083 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4lpc\" (UniqueName: \"kubernetes.io/projected/80f9981b-bbe7-42b4-937b-b740e292f587-kube-api-access-r4lpc\") pod \"80f9981b-bbe7-42b4-937b-b740e292f587\" (UID: \"80f9981b-bbe7-42b4-937b-b740e292f587\") " Apr 17 21:45:26.706337 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:26.706303 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80f9981b-bbe7-42b4-937b-b740e292f587-kube-api-access-r4lpc" (OuterVolumeSpecName: "kube-api-access-r4lpc") pod "80f9981b-bbe7-42b4-937b-b740e292f587" (UID: "80f9981b-bbe7-42b4-937b-b740e292f587"). InnerVolumeSpecName "kube-api-access-r4lpc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 21:45:26.759655 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:26.759617 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-55b44f5d48-lx5xx" Apr 17 21:45:26.804968 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:26.804927 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-r4lpc\" (UniqueName: \"kubernetes.io/projected/80f9981b-bbe7-42b4-937b-b740e292f587-kube-api-access-r4lpc\") on node \"ip-10-0-132-27.ec2.internal\" DevicePath \"\"" Apr 17 21:45:26.881521 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:26.881494 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-55b44f5d48-lx5xx"] Apr 17 21:45:26.883205 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:45:26.883175 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb3cbbc88_5d35_42e3_a3ac_e14e31ce1d19.slice/crio-de79d1660ed250ffe664a1de4a2638a31bcfa2e2758808a237c39ab80dcb3c82 WatchSource:0}: Error finding container de79d1660ed250ffe664a1de4a2638a31bcfa2e2758808a237c39ab80dcb3c82: Status 404 returned error can't find the container with id de79d1660ed250ffe664a1de4a2638a31bcfa2e2758808a237c39ab80dcb3c82 Apr 17 21:45:27.561646 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:27.561622 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-5b656864f6-7gq75" Apr 17 21:45:27.561750 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:27.561621 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-55b44f5d48-lx5xx" event={"ID":"b3cbbc88-5d35-42e3-a3ac-e14e31ce1d19","Type":"ContainerStarted","Data":"de79d1660ed250ffe664a1de4a2638a31bcfa2e2758808a237c39ab80dcb3c82"} Apr 17 21:45:27.588452 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:27.588426 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-5b656864f6-7gq75"] Apr 17 21:45:27.592763 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:27.592743 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-5b656864f6-7gq75"] Apr 17 21:45:27.694427 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:27.694385 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80f9981b-bbe7-42b4-937b-b740e292f587" path="/var/lib/kubelet/pods/80f9981b-bbe7-42b4-937b-b740e292f587/volumes" Apr 17 21:45:28.565985 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:28.565946 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-55b44f5d48-lx5xx" event={"ID":"b3cbbc88-5d35-42e3-a3ac-e14e31ce1d19","Type":"ContainerStarted","Data":"09ff32434f6c70e067ffa222e0dfbc9c2c74d7f35977ca016ffbac0942af5761"} Apr 17 21:45:28.583259 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:28.583207 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-55b44f5d48-lx5xx" podStartSLOduration=2.007424568 podStartE2EDuration="2.583192198s" podCreationTimestamp="2026-04-17 21:45:26 +0000 UTC" firstStartedPulling="2026-04-17 21:45:26.884564488 +0000 UTC m=+523.647323956" lastFinishedPulling="2026-04-17 21:45:27.460332127 +0000 UTC m=+524.223091586" observedRunningTime="2026-04-17 21:45:28.582108541 +0000 UTC m=+525.344868019" watchObservedRunningTime="2026-04-17 
21:45:28.583192198 +0000 UTC m=+525.345951675" Apr 17 21:45:28.607718 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:28.607582 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-7ldkp"] Apr 17 21:45:28.607962 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:28.607937 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-7498df8756-7ldkp" podUID="c0b07de7-4ca9-4a1c-9a22-b0fa3377106f" containerName="authorino" containerID="cri-o://168dccd01a62eb8321981e2d98cbd9d8c931c2a761f3955f6df3a8c17e6c8988" gracePeriod=30 Apr 17 21:45:28.890572 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:28.890550 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-7ldkp" Apr 17 21:45:29.026429 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:29.026398 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jm6c5\" (UniqueName: \"kubernetes.io/projected/c0b07de7-4ca9-4a1c-9a22-b0fa3377106f-kube-api-access-jm6c5\") pod \"c0b07de7-4ca9-4a1c-9a22-b0fa3377106f\" (UID: \"c0b07de7-4ca9-4a1c-9a22-b0fa3377106f\") " Apr 17 21:45:29.028399 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:29.028368 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0b07de7-4ca9-4a1c-9a22-b0fa3377106f-kube-api-access-jm6c5" (OuterVolumeSpecName: "kube-api-access-jm6c5") pod "c0b07de7-4ca9-4a1c-9a22-b0fa3377106f" (UID: "c0b07de7-4ca9-4a1c-9a22-b0fa3377106f"). InnerVolumeSpecName "kube-api-access-jm6c5". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 21:45:29.126964 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:29.126897 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jm6c5\" (UniqueName: \"kubernetes.io/projected/c0b07de7-4ca9-4a1c-9a22-b0fa3377106f-kube-api-access-jm6c5\") on node \"ip-10-0-132-27.ec2.internal\" DevicePath \"\"" Apr 17 21:45:29.570285 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:29.570200 2564 generic.go:358] "Generic (PLEG): container finished" podID="c0b07de7-4ca9-4a1c-9a22-b0fa3377106f" containerID="168dccd01a62eb8321981e2d98cbd9d8c931c2a761f3955f6df3a8c17e6c8988" exitCode=0 Apr 17 21:45:29.570285 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:29.570251 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-7ldkp" Apr 17 21:45:29.570511 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:29.570284 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-7ldkp" event={"ID":"c0b07de7-4ca9-4a1c-9a22-b0fa3377106f","Type":"ContainerDied","Data":"168dccd01a62eb8321981e2d98cbd9d8c931c2a761f3955f6df3a8c17e6c8988"} Apr 17 21:45:29.570511 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:29.570327 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-7ldkp" event={"ID":"c0b07de7-4ca9-4a1c-9a22-b0fa3377106f","Type":"ContainerDied","Data":"e3f01a0c18a030ed9af19624b0fb80e3c8099d6237696edee3d7efc7c435843d"} Apr 17 21:45:29.570511 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:29.570346 2564 scope.go:117] "RemoveContainer" containerID="168dccd01a62eb8321981e2d98cbd9d8c931c2a761f3955f6df3a8c17e6c8988" Apr 17 21:45:29.579078 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:29.579063 2564 scope.go:117] "RemoveContainer" containerID="168dccd01a62eb8321981e2d98cbd9d8c931c2a761f3955f6df3a8c17e6c8988" Apr 17 21:45:29.579316 ip-10-0-132-27 kubenswrapper[2564]: E0417 
21:45:29.579297 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"168dccd01a62eb8321981e2d98cbd9d8c931c2a761f3955f6df3a8c17e6c8988\": container with ID starting with 168dccd01a62eb8321981e2d98cbd9d8c931c2a761f3955f6df3a8c17e6c8988 not found: ID does not exist" containerID="168dccd01a62eb8321981e2d98cbd9d8c931c2a761f3955f6df3a8c17e6c8988" Apr 17 21:45:29.579386 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:29.579321 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"168dccd01a62eb8321981e2d98cbd9d8c931c2a761f3955f6df3a8c17e6c8988"} err="failed to get container status \"168dccd01a62eb8321981e2d98cbd9d8c931c2a761f3955f6df3a8c17e6c8988\": rpc error: code = NotFound desc = could not find container \"168dccd01a62eb8321981e2d98cbd9d8c931c2a761f3955f6df3a8c17e6c8988\": container with ID starting with 168dccd01a62eb8321981e2d98cbd9d8c931c2a761f3955f6df3a8c17e6c8988 not found: ID does not exist" Apr 17 21:45:29.587628 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:29.587583 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-c44cf996f-dlq2f"] Apr 17 21:45:29.587940 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:29.587927 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c0b07de7-4ca9-4a1c-9a22-b0fa3377106f" containerName="authorino" Apr 17 21:45:29.587982 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:29.587942 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0b07de7-4ca9-4a1c-9a22-b0fa3377106f" containerName="authorino" Apr 17 21:45:29.588039 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:29.588027 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="c0b07de7-4ca9-4a1c-9a22-b0fa3377106f" containerName="authorino" Apr 17 21:45:29.642696 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:29.642665 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kuadrant-system/authorino-7498df8756-7ldkp"]
Apr 17 21:45:29.642696 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:29.642699 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-c44cf996f-dlq2f"]
Apr 17 21:45:29.642887 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:29.642715 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-7498df8756-7ldkp"]
Apr 17 21:45:29.642887 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:29.642756 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-c44cf996f-dlq2f"
Apr 17 21:45:29.645416 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:29.645398 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-w5d4j\""
Apr 17 21:45:29.694083 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:29.694055 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0b07de7-4ca9-4a1c-9a22-b0fa3377106f" path="/var/lib/kubelet/pods/c0b07de7-4ca9-4a1c-9a22-b0fa3377106f/volumes"
Apr 17 21:45:29.731729 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:29.731699 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qh2bs\" (UniqueName: \"kubernetes.io/projected/72455840-4261-47a5-aea1-a4b6e1ffdb11-kube-api-access-qh2bs\") pod \"maas-controller-c44cf996f-dlq2f\" (UID: \"72455840-4261-47a5-aea1-a4b6e1ffdb11\") " pod="opendatahub/maas-controller-c44cf996f-dlq2f"
Apr 17 21:45:29.810136 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:29.810109 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-545987446b-wvtj2"]
Apr 17 21:45:29.832965 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:29.832886 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qh2bs\" (UniqueName: \"kubernetes.io/projected/72455840-4261-47a5-aea1-a4b6e1ffdb11-kube-api-access-qh2bs\") pod \"maas-controller-c44cf996f-dlq2f\" (UID: \"72455840-4261-47a5-aea1-a4b6e1ffdb11\") " pod="opendatahub/maas-controller-c44cf996f-dlq2f"
Apr 17 21:45:29.836628 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:29.836607 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-545987446b-wvtj2"]
Apr 17 21:45:29.836723 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:29.836713 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-545987446b-wvtj2"
Apr 17 21:45:29.840875 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:29.840856 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qh2bs\" (UniqueName: \"kubernetes.io/projected/72455840-4261-47a5-aea1-a4b6e1ffdb11-kube-api-access-qh2bs\") pod \"maas-controller-c44cf996f-dlq2f\" (UID: \"72455840-4261-47a5-aea1-a4b6e1ffdb11\") " pod="opendatahub/maas-controller-c44cf996f-dlq2f"
Apr 17 21:45:29.933548 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:29.933516 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxcl9\" (UniqueName: \"kubernetes.io/projected/c52a8b73-2de6-4436-b9ee-88b9412369eb-kube-api-access-rxcl9\") pod \"maas-controller-545987446b-wvtj2\" (UID: \"c52a8b73-2de6-4436-b9ee-88b9412369eb\") " pod="opendatahub/maas-controller-545987446b-wvtj2"
Apr 17 21:45:29.952765 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:29.952741 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-c44cf996f-dlq2f"
Apr 17 21:45:30.034414 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:30.034365 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rxcl9\" (UniqueName: \"kubernetes.io/projected/c52a8b73-2de6-4436-b9ee-88b9412369eb-kube-api-access-rxcl9\") pod \"maas-controller-545987446b-wvtj2\" (UID: \"c52a8b73-2de6-4436-b9ee-88b9412369eb\") " pod="opendatahub/maas-controller-545987446b-wvtj2"
Apr 17 21:45:30.042768 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:30.042741 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxcl9\" (UniqueName: \"kubernetes.io/projected/c52a8b73-2de6-4436-b9ee-88b9412369eb-kube-api-access-rxcl9\") pod \"maas-controller-545987446b-wvtj2\" (UID: \"c52a8b73-2de6-4436-b9ee-88b9412369eb\") " pod="opendatahub/maas-controller-545987446b-wvtj2"
Apr 17 21:45:30.076699 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:30.076676 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-c44cf996f-dlq2f"]
Apr 17 21:45:30.078473 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:45:30.078448 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72455840_4261_47a5_aea1_a4b6e1ffdb11.slice/crio-429ee18f7b2b5aeb7a2a5227acf4c8197df8e5de37e8c26f7fe949662b0d3fe1 WatchSource:0}: Error finding container 429ee18f7b2b5aeb7a2a5227acf4c8197df8e5de37e8c26f7fe949662b0d3fe1: Status 404 returned error can't find the container with id 429ee18f7b2b5aeb7a2a5227acf4c8197df8e5de37e8c26f7fe949662b0d3fe1
Apr 17 21:45:30.153639 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:30.153577 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-545987446b-wvtj2"
Apr 17 21:45:30.272995 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:30.272971 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-545987446b-wvtj2"]
Apr 17 21:45:30.274776 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:45:30.274747 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc52a8b73_2de6_4436_b9ee_88b9412369eb.slice/crio-6546484c47a2fb63ef2894dd3899122873d4c433615f6f367d67a04f5bd90836 WatchSource:0}: Error finding container 6546484c47a2fb63ef2894dd3899122873d4c433615f6f367d67a04f5bd90836: Status 404 returned error can't find the container with id 6546484c47a2fb63ef2894dd3899122873d4c433615f6f367d67a04f5bd90836
Apr 17 21:45:30.581079 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:30.580984 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-c44cf996f-dlq2f" event={"ID":"72455840-4261-47a5-aea1-a4b6e1ffdb11","Type":"ContainerStarted","Data":"429ee18f7b2b5aeb7a2a5227acf4c8197df8e5de37e8c26f7fe949662b0d3fe1"}
Apr 17 21:45:30.581947 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:30.581927 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-545987446b-wvtj2" event={"ID":"c52a8b73-2de6-4436-b9ee-88b9412369eb","Type":"ContainerStarted","Data":"6546484c47a2fb63ef2894dd3899122873d4c433615f6f367d67a04f5bd90836"}
Apr 17 21:45:33.597840 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:33.597784 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-c44cf996f-dlq2f" event={"ID":"72455840-4261-47a5-aea1-a4b6e1ffdb11","Type":"ContainerStarted","Data":"be6873982c6cf4feef2537a465a158b9dfeae3dfc77b5dff819965013f047d37"}
Apr 17 21:45:33.598304 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:33.597905 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-c44cf996f-dlq2f"
Apr 17 21:45:33.599101 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:33.599081 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-545987446b-wvtj2" event={"ID":"c52a8b73-2de6-4436-b9ee-88b9412369eb","Type":"ContainerStarted","Data":"697a60f748b097f3026621aff93b302528724fa383d33553e7b8212676c6a86d"}
Apr 17 21:45:33.599241 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:33.599224 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-545987446b-wvtj2"
Apr 17 21:45:33.613449 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:33.613397 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-c44cf996f-dlq2f" podStartSLOduration=1.835799233 podStartE2EDuration="4.613384655s" podCreationTimestamp="2026-04-17 21:45:29 +0000 UTC" firstStartedPulling="2026-04-17 21:45:30.079861526 +0000 UTC m=+526.842620985" lastFinishedPulling="2026-04-17 21:45:32.857446941 +0000 UTC m=+529.620206407" observedRunningTime="2026-04-17 21:45:33.612150336 +0000 UTC m=+530.374909814" watchObservedRunningTime="2026-04-17 21:45:33.613384655 +0000 UTC m=+530.376144133"
Apr 17 21:45:33.626005 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:33.625952 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-545987446b-wvtj2" podStartSLOduration=2.034148613 podStartE2EDuration="4.625937525s" podCreationTimestamp="2026-04-17 21:45:29 +0000 UTC" firstStartedPulling="2026-04-17 21:45:30.276108345 +0000 UTC m=+527.038867802" lastFinishedPulling="2026-04-17 21:45:32.867897255 +0000 UTC m=+529.630656714" observedRunningTime="2026-04-17 21:45:33.625692628 +0000 UTC m=+530.388452117" watchObservedRunningTime="2026-04-17 21:45:33.625937525 +0000 UTC m=+530.388697004"
Apr 17 21:45:44.609138 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:44.609101 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-545987446b-wvtj2"
Apr 17 21:45:44.609575 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:44.609156 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-c44cf996f-dlq2f"
Apr 17 21:45:44.657556 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:44.657522 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-c44cf996f-dlq2f"]
Apr 17 21:45:44.657748 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:44.657727 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-c44cf996f-dlq2f" podUID="72455840-4261-47a5-aea1-a4b6e1ffdb11" containerName="manager" containerID="cri-o://be6873982c6cf4feef2537a465a158b9dfeae3dfc77b5dff819965013f047d37" gracePeriod=10
Apr 17 21:45:44.903177 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:44.903153 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-c44cf996f-dlq2f"
Apr 17 21:45:44.950472 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:44.950437 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-5c4844f674-6pgs5"]
Apr 17 21:45:44.950795 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:44.950783 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="72455840-4261-47a5-aea1-a4b6e1ffdb11" containerName="manager"
Apr 17 21:45:44.950839 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:44.950797 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="72455840-4261-47a5-aea1-a4b6e1ffdb11" containerName="manager"
Apr 17 21:45:44.950878 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:44.950857 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="72455840-4261-47a5-aea1-a4b6e1ffdb11" containerName="manager"
Apr 17 21:45:44.953973 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:44.953958 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-5c4844f674-6pgs5"
Apr 17 21:45:44.960084 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:44.960059 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-5c4844f674-6pgs5"]
Apr 17 21:45:45.057097 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:45.057060 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qh2bs\" (UniqueName: \"kubernetes.io/projected/72455840-4261-47a5-aea1-a4b6e1ffdb11-kube-api-access-qh2bs\") pod \"72455840-4261-47a5-aea1-a4b6e1ffdb11\" (UID: \"72455840-4261-47a5-aea1-a4b6e1ffdb11\") "
Apr 17 21:45:45.057299 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:45.057154 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v5jkc\" (UniqueName: \"kubernetes.io/projected/78909686-b704-4628-b0bb-919e316cf769-kube-api-access-v5jkc\") pod \"maas-controller-5c4844f674-6pgs5\" (UID: \"78909686-b704-4628-b0bb-919e316cf769\") " pod="opendatahub/maas-controller-5c4844f674-6pgs5"
Apr 17 21:45:45.059183 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:45.059148 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72455840-4261-47a5-aea1-a4b6e1ffdb11-kube-api-access-qh2bs" (OuterVolumeSpecName: "kube-api-access-qh2bs") pod "72455840-4261-47a5-aea1-a4b6e1ffdb11" (UID: "72455840-4261-47a5-aea1-a4b6e1ffdb11"). InnerVolumeSpecName "kube-api-access-qh2bs". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 21:45:45.157790 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:45.157699 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v5jkc\" (UniqueName: \"kubernetes.io/projected/78909686-b704-4628-b0bb-919e316cf769-kube-api-access-v5jkc\") pod \"maas-controller-5c4844f674-6pgs5\" (UID: \"78909686-b704-4628-b0bb-919e316cf769\") " pod="opendatahub/maas-controller-5c4844f674-6pgs5"
Apr 17 21:45:45.157790 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:45.157787 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qh2bs\" (UniqueName: \"kubernetes.io/projected/72455840-4261-47a5-aea1-a4b6e1ffdb11-kube-api-access-qh2bs\") on node \"ip-10-0-132-27.ec2.internal\" DevicePath \"\""
Apr 17 21:45:45.165882 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:45.165860 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v5jkc\" (UniqueName: \"kubernetes.io/projected/78909686-b704-4628-b0bb-919e316cf769-kube-api-access-v5jkc\") pod \"maas-controller-5c4844f674-6pgs5\" (UID: \"78909686-b704-4628-b0bb-919e316cf769\") " pod="opendatahub/maas-controller-5c4844f674-6pgs5"
Apr 17 21:45:45.264942 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:45.264894 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-5c4844f674-6pgs5"
Apr 17 21:45:45.385304 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:45.385279 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-5c4844f674-6pgs5"]
Apr 17 21:45:45.387226 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:45:45.387198 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78909686_b704_4628_b0bb_919e316cf769.slice/crio-4cddcaf1643a4299ad7c0b21d4db942a86f1ec2bb149fcce6991f280fad4f90f WatchSource:0}: Error finding container 4cddcaf1643a4299ad7c0b21d4db942a86f1ec2bb149fcce6991f280fad4f90f: Status 404 returned error can't find the container with id 4cddcaf1643a4299ad7c0b21d4db942a86f1ec2bb149fcce6991f280fad4f90f
Apr 17 21:45:45.642025 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:45.641990 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-5c4844f674-6pgs5" event={"ID":"78909686-b704-4628-b0bb-919e316cf769","Type":"ContainerStarted","Data":"4cddcaf1643a4299ad7c0b21d4db942a86f1ec2bb149fcce6991f280fad4f90f"}
Apr 17 21:45:45.643090 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:45.643062 2564 generic.go:358] "Generic (PLEG): container finished" podID="72455840-4261-47a5-aea1-a4b6e1ffdb11" containerID="be6873982c6cf4feef2537a465a158b9dfeae3dfc77b5dff819965013f047d37" exitCode=0
Apr 17 21:45:45.643205 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:45.643121 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-c44cf996f-dlq2f"
Apr 17 21:45:45.643205 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:45.643129 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-c44cf996f-dlq2f" event={"ID":"72455840-4261-47a5-aea1-a4b6e1ffdb11","Type":"ContainerDied","Data":"be6873982c6cf4feef2537a465a158b9dfeae3dfc77b5dff819965013f047d37"}
Apr 17 21:45:45.643205 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:45.643153 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-c44cf996f-dlq2f" event={"ID":"72455840-4261-47a5-aea1-a4b6e1ffdb11","Type":"ContainerDied","Data":"429ee18f7b2b5aeb7a2a5227acf4c8197df8e5de37e8c26f7fe949662b0d3fe1"}
Apr 17 21:45:45.643205 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:45.643169 2564 scope.go:117] "RemoveContainer" containerID="be6873982c6cf4feef2537a465a158b9dfeae3dfc77b5dff819965013f047d37"
Apr 17 21:45:45.651639 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:45.651563 2564 scope.go:117] "RemoveContainer" containerID="be6873982c6cf4feef2537a465a158b9dfeae3dfc77b5dff819965013f047d37"
Apr 17 21:45:45.651866 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:45:45.651846 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be6873982c6cf4feef2537a465a158b9dfeae3dfc77b5dff819965013f047d37\": container with ID starting with be6873982c6cf4feef2537a465a158b9dfeae3dfc77b5dff819965013f047d37 not found: ID does not exist" containerID="be6873982c6cf4feef2537a465a158b9dfeae3dfc77b5dff819965013f047d37"
Apr 17 21:45:45.651936 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:45.651877 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be6873982c6cf4feef2537a465a158b9dfeae3dfc77b5dff819965013f047d37"} err="failed to get container status \"be6873982c6cf4feef2537a465a158b9dfeae3dfc77b5dff819965013f047d37\": rpc error: code = NotFound desc = could not find container \"be6873982c6cf4feef2537a465a158b9dfeae3dfc77b5dff819965013f047d37\": container with ID starting with be6873982c6cf4feef2537a465a158b9dfeae3dfc77b5dff819965013f047d37 not found: ID does not exist"
Apr 17 21:45:45.666499 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:45.666475 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-c44cf996f-dlq2f"]
Apr 17 21:45:45.668687 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:45.668665 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-c44cf996f-dlq2f"]
Apr 17 21:45:45.694148 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:45.694112 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72455840-4261-47a5-aea1-a4b6e1ffdb11" path="/var/lib/kubelet/pods/72455840-4261-47a5-aea1-a4b6e1ffdb11/volumes"
Apr 17 21:45:46.647657 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:46.647618 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-5c4844f674-6pgs5" event={"ID":"78909686-b704-4628-b0bb-919e316cf769","Type":"ContainerStarted","Data":"7eda1f4e8effade9461b13e3c374270dc77dd43942e385d6ded6a401bf3f1092"}
Apr 17 21:45:46.648145 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:46.647759 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-5c4844f674-6pgs5"
Apr 17 21:45:46.663885 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:46.663838 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-5c4844f674-6pgs5" podStartSLOduration=2.3698798549999998 podStartE2EDuration="2.663825387s" podCreationTimestamp="2026-04-17 21:45:44 +0000 UTC" firstStartedPulling="2026-04-17 21:45:45.388666368 +0000 UTC m=+542.151425824" lastFinishedPulling="2026-04-17 21:45:45.6826119 +0000 UTC m=+542.445371356" observedRunningTime="2026-04-17 21:45:46.661340576 +0000 UTC m=+543.424100067" watchObservedRunningTime="2026-04-17 21:45:46.663825387 +0000 UTC m=+543.426584865"
Apr 17 21:45:51.292265 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:51.292229 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-85cd687b77-l2zwx"]
Apr 17 21:45:51.295827 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:51.295812 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-85cd687b77-l2zwx"
Apr 17 21:45:51.298185 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:51.298161 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-lpm9t\""
Apr 17 21:45:51.298310 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:51.298163 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\""
Apr 17 21:45:51.298310 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:51.298283 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\""
Apr 17 21:45:51.303095 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:51.303073 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-85cd687b77-l2zwx"]
Apr 17 21:45:51.409955 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:51.409923 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqkxd\" (UniqueName: \"kubernetes.io/projected/02f59b14-6871-4ebd-b3e6-2ebd6104de37-kube-api-access-vqkxd\") pod \"maas-api-85cd687b77-l2zwx\" (UID: \"02f59b14-6871-4ebd-b3e6-2ebd6104de37\") " pod="opendatahub/maas-api-85cd687b77-l2zwx"
Apr 17 21:45:51.410136 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:51.409993 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/02f59b14-6871-4ebd-b3e6-2ebd6104de37-maas-api-tls\") pod \"maas-api-85cd687b77-l2zwx\" (UID: \"02f59b14-6871-4ebd-b3e6-2ebd6104de37\") " pod="opendatahub/maas-api-85cd687b77-l2zwx"
Apr 17 21:45:51.511170 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:51.511136 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/02f59b14-6871-4ebd-b3e6-2ebd6104de37-maas-api-tls\") pod \"maas-api-85cd687b77-l2zwx\" (UID: \"02f59b14-6871-4ebd-b3e6-2ebd6104de37\") " pod="opendatahub/maas-api-85cd687b77-l2zwx"
Apr 17 21:45:51.511350 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:51.511182 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vqkxd\" (UniqueName: \"kubernetes.io/projected/02f59b14-6871-4ebd-b3e6-2ebd6104de37-kube-api-access-vqkxd\") pod \"maas-api-85cd687b77-l2zwx\" (UID: \"02f59b14-6871-4ebd-b3e6-2ebd6104de37\") " pod="opendatahub/maas-api-85cd687b77-l2zwx"
Apr 17 21:45:51.513820 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:51.513795 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/02f59b14-6871-4ebd-b3e6-2ebd6104de37-maas-api-tls\") pod \"maas-api-85cd687b77-l2zwx\" (UID: \"02f59b14-6871-4ebd-b3e6-2ebd6104de37\") " pod="opendatahub/maas-api-85cd687b77-l2zwx"
Apr 17 21:45:51.518301 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:51.518279 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqkxd\" (UniqueName: \"kubernetes.io/projected/02f59b14-6871-4ebd-b3e6-2ebd6104de37-kube-api-access-vqkxd\") pod \"maas-api-85cd687b77-l2zwx\" (UID: \"02f59b14-6871-4ebd-b3e6-2ebd6104de37\") " pod="opendatahub/maas-api-85cd687b77-l2zwx"
Apr 17 21:45:51.608066 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:51.608034 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-85cd687b77-l2zwx"
Apr 17 21:45:51.741517 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:51.741484 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-85cd687b77-l2zwx"]
Apr 17 21:45:51.743340 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:45:51.743304 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02f59b14_6871_4ebd_b3e6_2ebd6104de37.slice/crio-72e856219e2ec1239d22e1336db1e0bfeefd0d59a2970325622e40fb45a64918 WatchSource:0}: Error finding container 72e856219e2ec1239d22e1336db1e0bfeefd0d59a2970325622e40fb45a64918: Status 404 returned error can't find the container with id 72e856219e2ec1239d22e1336db1e0bfeefd0d59a2970325622e40fb45a64918
Apr 17 21:45:52.673627 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:52.673572 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-85cd687b77-l2zwx" event={"ID":"02f59b14-6871-4ebd-b3e6-2ebd6104de37","Type":"ContainerStarted","Data":"72e856219e2ec1239d22e1336db1e0bfeefd0d59a2970325622e40fb45a64918"}
Apr 17 21:45:53.678240 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:53.678204 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-85cd687b77-l2zwx" event={"ID":"02f59b14-6871-4ebd-b3e6-2ebd6104de37","Type":"ContainerStarted","Data":"7d7fd1f10d6ae20ddf63cab23c67a20220349943d117ef7d8ca9680d98912541"}
Apr 17 21:45:53.678658 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:53.678294 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-85cd687b77-l2zwx"
Apr 17 21:45:53.692509 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:53.692449 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-85cd687b77-l2zwx" podStartSLOduration=1.270402984 podStartE2EDuration="2.692435152s" podCreationTimestamp="2026-04-17 21:45:51 +0000 UTC" firstStartedPulling="2026-04-17 21:45:51.745174469 +0000 UTC m=+548.507933925" lastFinishedPulling="2026-04-17 21:45:53.167206634 +0000 UTC m=+549.929966093" observedRunningTime="2026-04-17 21:45:53.692120288 +0000 UTC m=+550.454879765" watchObservedRunningTime="2026-04-17 21:45:53.692435152 +0000 UTC m=+550.455194629"
Apr 17 21:45:57.657665 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:57.657631 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-5c4844f674-6pgs5"
Apr 17 21:45:57.694669 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:57.694641 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-545987446b-wvtj2"]
Apr 17 21:45:57.694884 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:57.694861 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-545987446b-wvtj2" podUID="c52a8b73-2de6-4436-b9ee-88b9412369eb" containerName="manager" containerID="cri-o://697a60f748b097f3026621aff93b302528724fa383d33553e7b8212676c6a86d" gracePeriod=10
Apr 17 21:45:57.936340 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:57.936314 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-545987446b-wvtj2"
Apr 17 21:45:58.073722 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:58.073691 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rxcl9\" (UniqueName: \"kubernetes.io/projected/c52a8b73-2de6-4436-b9ee-88b9412369eb-kube-api-access-rxcl9\") pod \"c52a8b73-2de6-4436-b9ee-88b9412369eb\" (UID: \"c52a8b73-2de6-4436-b9ee-88b9412369eb\") "
Apr 17 21:45:58.075748 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:58.075716 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c52a8b73-2de6-4436-b9ee-88b9412369eb-kube-api-access-rxcl9" (OuterVolumeSpecName: "kube-api-access-rxcl9") pod "c52a8b73-2de6-4436-b9ee-88b9412369eb" (UID: "c52a8b73-2de6-4436-b9ee-88b9412369eb"). InnerVolumeSpecName "kube-api-access-rxcl9". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 21:45:58.175049 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:58.174977 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rxcl9\" (UniqueName: \"kubernetes.io/projected/c52a8b73-2de6-4436-b9ee-88b9412369eb-kube-api-access-rxcl9\") on node \"ip-10-0-132-27.ec2.internal\" DevicePath \"\""
Apr 17 21:45:58.698653 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:58.698620 2564 generic.go:358] "Generic (PLEG): container finished" podID="c52a8b73-2de6-4436-b9ee-88b9412369eb" containerID="697a60f748b097f3026621aff93b302528724fa383d33553e7b8212676c6a86d" exitCode=0
Apr 17 21:45:58.699099 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:58.698673 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-545987446b-wvtj2" event={"ID":"c52a8b73-2de6-4436-b9ee-88b9412369eb","Type":"ContainerDied","Data":"697a60f748b097f3026621aff93b302528724fa383d33553e7b8212676c6a86d"}
Apr 17 21:45:58.699099 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:58.698676 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-545987446b-wvtj2"
Apr 17 21:45:58.699099 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:58.698696 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-545987446b-wvtj2" event={"ID":"c52a8b73-2de6-4436-b9ee-88b9412369eb","Type":"ContainerDied","Data":"6546484c47a2fb63ef2894dd3899122873d4c433615f6f367d67a04f5bd90836"}
Apr 17 21:45:58.699099 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:58.698711 2564 scope.go:117] "RemoveContainer" containerID="697a60f748b097f3026621aff93b302528724fa383d33553e7b8212676c6a86d"
Apr 17 21:45:58.707575 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:58.707539 2564 scope.go:117] "RemoveContainer" containerID="697a60f748b097f3026621aff93b302528724fa383d33553e7b8212676c6a86d"
Apr 17 21:45:58.707813 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:45:58.707794 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"697a60f748b097f3026621aff93b302528724fa383d33553e7b8212676c6a86d\": container with ID starting with 697a60f748b097f3026621aff93b302528724fa383d33553e7b8212676c6a86d not found: ID does not exist" containerID="697a60f748b097f3026621aff93b302528724fa383d33553e7b8212676c6a86d"
Apr 17 21:45:58.707860 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:58.707821 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"697a60f748b097f3026621aff93b302528724fa383d33553e7b8212676c6a86d"} err="failed to get container status \"697a60f748b097f3026621aff93b302528724fa383d33553e7b8212676c6a86d\": rpc error: code = NotFound desc = could not find container \"697a60f748b097f3026621aff93b302528724fa383d33553e7b8212676c6a86d\": container with ID starting with 697a60f748b097f3026621aff93b302528724fa383d33553e7b8212676c6a86d not found: ID does not exist"
Apr 17 21:45:58.719161 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:58.719134 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-545987446b-wvtj2"]
Apr 17 21:45:58.724443 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:58.724423 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-545987446b-wvtj2"]
Apr 17 21:45:59.687112 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:59.687077 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-85cd687b77-l2zwx"
Apr 17 21:45:59.694342 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:45:59.694314 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c52a8b73-2de6-4436-b9ee-88b9412369eb" path="/var/lib/kubelet/pods/c52a8b73-2de6-4436-b9ee-88b9412369eb/volumes"
Apr 17 21:46:15.889028 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:46:15.888988 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-6c569d4cc5-ssl9t"]
Apr 17 21:46:15.889648 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:46:15.889481 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c52a8b73-2de6-4436-b9ee-88b9412369eb" containerName="manager"
Apr 17 21:46:15.889648 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:46:15.889502 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="c52a8b73-2de6-4436-b9ee-88b9412369eb" containerName="manager"
Apr 17 21:46:15.889773 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:46:15.889656 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="c52a8b73-2de6-4436-b9ee-88b9412369eb" containerName="manager"
Apr 17 21:46:15.896327 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:46:15.896303 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-6c569d4cc5-ssl9t"
Apr 17 21:46:15.906341 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:46:15.906309 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-6c569d4cc5-ssl9t"]
Apr 17 21:46:15.914874 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:46:15.914844 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/f67211fa-d3ff-4bb1-9b40-3dd3202edc94-maas-api-tls\") pod \"maas-api-6c569d4cc5-ssl9t\" (UID: \"f67211fa-d3ff-4bb1-9b40-3dd3202edc94\") " pod="opendatahub/maas-api-6c569d4cc5-ssl9t"
Apr 17 21:46:15.915032 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:46:15.914897 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f26zv\" (UniqueName: \"kubernetes.io/projected/f67211fa-d3ff-4bb1-9b40-3dd3202edc94-kube-api-access-f26zv\") pod \"maas-api-6c569d4cc5-ssl9t\" (UID: \"f67211fa-d3ff-4bb1-9b40-3dd3202edc94\") " pod="opendatahub/maas-api-6c569d4cc5-ssl9t"
Apr 17 21:46:16.015455 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:46:16.015419 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/f67211fa-d3ff-4bb1-9b40-3dd3202edc94-maas-api-tls\") pod \"maas-api-6c569d4cc5-ssl9t\" (UID: \"f67211fa-d3ff-4bb1-9b40-3dd3202edc94\") " pod="opendatahub/maas-api-6c569d4cc5-ssl9t"
Apr 17 21:46:16.015667 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:46:16.015474 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f26zv\" (UniqueName: \"kubernetes.io/projected/f67211fa-d3ff-4bb1-9b40-3dd3202edc94-kube-api-access-f26zv\") pod \"maas-api-6c569d4cc5-ssl9t\" (UID: \"f67211fa-d3ff-4bb1-9b40-3dd3202edc94\") " pod="opendatahub/maas-api-6c569d4cc5-ssl9t"
Apr 17 21:46:16.017788 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:46:16.017766 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/f67211fa-d3ff-4bb1-9b40-3dd3202edc94-maas-api-tls\") pod \"maas-api-6c569d4cc5-ssl9t\" (UID: \"f67211fa-d3ff-4bb1-9b40-3dd3202edc94\") " pod="opendatahub/maas-api-6c569d4cc5-ssl9t"
Apr 17 21:46:16.022944 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:46:16.022922 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f26zv\" (UniqueName: \"kubernetes.io/projected/f67211fa-d3ff-4bb1-9b40-3dd3202edc94-kube-api-access-f26zv\") pod \"maas-api-6c569d4cc5-ssl9t\" (UID: \"f67211fa-d3ff-4bb1-9b40-3dd3202edc94\") " pod="opendatahub/maas-api-6c569d4cc5-ssl9t"
Apr 17 21:46:16.208076 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:46:16.207990 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-6c569d4cc5-ssl9t"
Apr 17 21:46:16.330056 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:46:16.330028 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-6c569d4cc5-ssl9t"]
Apr 17 21:46:16.332370 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:46:16.332323 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf67211fa_d3ff_4bb1_9b40_3dd3202edc94.slice/crio-38b977e7b1f346b36e1a9bfc7d75aac5dbc651c70ec9289d9e38208aa3e3ad31 WatchSource:0}: Error finding container 38b977e7b1f346b36e1a9bfc7d75aac5dbc651c70ec9289d9e38208aa3e3ad31: Status 404 returned error can't find the container with id 38b977e7b1f346b36e1a9bfc7d75aac5dbc651c70ec9289d9e38208aa3e3ad31
Apr 17 21:46:16.767420 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:46:16.767381 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-6c569d4cc5-ssl9t" event={"ID":"f67211fa-d3ff-4bb1-9b40-3dd3202edc94","Type":"ContainerStarted","Data":"38b977e7b1f346b36e1a9bfc7d75aac5dbc651c70ec9289d9e38208aa3e3ad31"}
Apr 17 21:46:18.776430 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:46:18.776395 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-6c569d4cc5-ssl9t" event={"ID":"f67211fa-d3ff-4bb1-9b40-3dd3202edc94","Type":"ContainerStarted","Data":"cb09eede282de08701ca47319b6ef0eb5ffcba8de362e187360cc4a7cecc1c03"}
Apr 17 21:46:18.776960 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:46:18.776440 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-6c569d4cc5-ssl9t"
Apr 17 21:46:18.793434 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:46:18.793385 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-6c569d4cc5-ssl9t" podStartSLOduration=2.317430044 podStartE2EDuration="3.793370574s" podCreationTimestamp="2026-04-17 21:46:15 +0000 UTC" firstStartedPulling="2026-04-17 21:46:16.333688559 +0000 UTC m=+573.096448015" lastFinishedPulling="2026-04-17 21:46:17.809629089 +0000 UTC m=+574.572388545" observedRunningTime="2026-04-17 21:46:18.792559873 +0000 UTC m=+575.555319372" watchObservedRunningTime="2026-04-17 21:46:18.793370574 +0000 UTC m=+575.556130052"
Apr 17 21:46:24.785266 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:46:24.785238 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-6c569d4cc5-ssl9t"
Apr 17 21:46:24.826225 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:46:24.826180 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-85cd687b77-l2zwx"]
Apr 17 21:46:24.826555 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:46:24.826492 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-85cd687b77-l2zwx" podUID="02f59b14-6871-4ebd-b3e6-2ebd6104de37" containerName="maas-api" containerID="cri-o://7d7fd1f10d6ae20ddf63cab23c67a20220349943d117ef7d8ca9680d98912541" gracePeriod=30
Apr 17 21:46:25.067194 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:46:25.067166 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-85cd687b77-l2zwx"
Apr 17 21:46:25.199113 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:46:25.199079 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqkxd\" (UniqueName: \"kubernetes.io/projected/02f59b14-6871-4ebd-b3e6-2ebd6104de37-kube-api-access-vqkxd\") pod \"02f59b14-6871-4ebd-b3e6-2ebd6104de37\" (UID: \"02f59b14-6871-4ebd-b3e6-2ebd6104de37\") "
Apr 17 21:46:25.199288 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:46:25.199162 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/02f59b14-6871-4ebd-b3e6-2ebd6104de37-maas-api-tls\") pod \"02f59b14-6871-4ebd-b3e6-2ebd6104de37\" (UID: \"02f59b14-6871-4ebd-b3e6-2ebd6104de37\") "
Apr 17 21:46:25.201189 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:46:25.201165 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02f59b14-6871-4ebd-b3e6-2ebd6104de37-maas-api-tls" (OuterVolumeSpecName: "maas-api-tls") pod "02f59b14-6871-4ebd-b3e6-2ebd6104de37" (UID: "02f59b14-6871-4ebd-b3e6-2ebd6104de37"). InnerVolumeSpecName "maas-api-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 21:46:25.201302 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:46:25.201220 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02f59b14-6871-4ebd-b3e6-2ebd6104de37-kube-api-access-vqkxd" (OuterVolumeSpecName: "kube-api-access-vqkxd") pod "02f59b14-6871-4ebd-b3e6-2ebd6104de37" (UID: "02f59b14-6871-4ebd-b3e6-2ebd6104de37"). InnerVolumeSpecName "kube-api-access-vqkxd". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 21:46:25.300440 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:46:25.300361 2564 reconciler_common.go:299] "Volume detached for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/02f59b14-6871-4ebd-b3e6-2ebd6104de37-maas-api-tls\") on node \"ip-10-0-132-27.ec2.internal\" DevicePath \"\"" Apr 17 21:46:25.300440 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:46:25.300389 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vqkxd\" (UniqueName: \"kubernetes.io/projected/02f59b14-6871-4ebd-b3e6-2ebd6104de37-kube-api-access-vqkxd\") on node \"ip-10-0-132-27.ec2.internal\" DevicePath \"\"" Apr 17 21:46:25.803033 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:46:25.802998 2564 generic.go:358] "Generic (PLEG): container finished" podID="02f59b14-6871-4ebd-b3e6-2ebd6104de37" containerID="7d7fd1f10d6ae20ddf63cab23c67a20220349943d117ef7d8ca9680d98912541" exitCode=0 Apr 17 21:46:25.803422 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:46:25.803089 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-85cd687b77-l2zwx" Apr 17 21:46:25.803422 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:46:25.803081 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-85cd687b77-l2zwx" event={"ID":"02f59b14-6871-4ebd-b3e6-2ebd6104de37","Type":"ContainerDied","Data":"7d7fd1f10d6ae20ddf63cab23c67a20220349943d117ef7d8ca9680d98912541"} Apr 17 21:46:25.803422 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:46:25.803194 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-85cd687b77-l2zwx" event={"ID":"02f59b14-6871-4ebd-b3e6-2ebd6104de37","Type":"ContainerDied","Data":"72e856219e2ec1239d22e1336db1e0bfeefd0d59a2970325622e40fb45a64918"} Apr 17 21:46:25.803422 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:46:25.803215 2564 scope.go:117] "RemoveContainer" containerID="7d7fd1f10d6ae20ddf63cab23c67a20220349943d117ef7d8ca9680d98912541" Apr 17 21:46:25.811548 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:46:25.811522 2564 scope.go:117] "RemoveContainer" containerID="7d7fd1f10d6ae20ddf63cab23c67a20220349943d117ef7d8ca9680d98912541" Apr 17 21:46:25.811991 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:46:25.811971 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d7fd1f10d6ae20ddf63cab23c67a20220349943d117ef7d8ca9680d98912541\": container with ID starting with 7d7fd1f10d6ae20ddf63cab23c67a20220349943d117ef7d8ca9680d98912541 not found: ID does not exist" containerID="7d7fd1f10d6ae20ddf63cab23c67a20220349943d117ef7d8ca9680d98912541" Apr 17 21:46:25.812048 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:46:25.811999 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d7fd1f10d6ae20ddf63cab23c67a20220349943d117ef7d8ca9680d98912541"} err="failed to get container status \"7d7fd1f10d6ae20ddf63cab23c67a20220349943d117ef7d8ca9680d98912541\": rpc error: code = NotFound desc = could 
not find container \"7d7fd1f10d6ae20ddf63cab23c67a20220349943d117ef7d8ca9680d98912541\": container with ID starting with 7d7fd1f10d6ae20ddf63cab23c67a20220349943d117ef7d8ca9680d98912541 not found: ID does not exist" Apr 17 21:46:25.820695 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:46:25.820668 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-85cd687b77-l2zwx"] Apr 17 21:46:25.822811 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:46:25.822792 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-85cd687b77-l2zwx"] Apr 17 21:46:27.694008 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:46:27.693965 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02f59b14-6871-4ebd-b3e6-2ebd6104de37" path="/var/lib/kubelet/pods/02f59b14-6871-4ebd-b3e6-2ebd6104de37/volumes" Apr 17 21:46:43.664876 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:46:43.664845 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwgmn_60e0c58d-3db8-4433-a617-00082bd25488/ovn-acl-logging/0.log" Apr 17 21:46:43.665486 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:46:43.665471 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwgmn_60e0c58d-3db8-4433-a617-00082bd25488/ovn-acl-logging/0.log" Apr 17 21:47:39.918691 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:47:39.918659 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-74d5b49497-nkbvg"] Apr 17 21:47:39.919094 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:47:39.919023 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="02f59b14-6871-4ebd-b3e6-2ebd6104de37" containerName="maas-api" Apr 17 21:47:39.919094 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:47:39.919035 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="02f59b14-6871-4ebd-b3e6-2ebd6104de37" containerName="maas-api" Apr 17 21:47:39.919169 ip-10-0-132-27 
kubenswrapper[2564]: I0417 21:47:39.919104 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="02f59b14-6871-4ebd-b3e6-2ebd6104de37" containerName="maas-api" Apr 17 21:47:39.922133 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:47:39.922117 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-74d5b49497-nkbvg" Apr 17 21:47:39.928747 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:47:39.928721 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-74d5b49497-nkbvg"] Apr 17 21:47:40.036150 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:47:40.036117 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/1f3a7837-afa1-44ff-b315-12ad9060d0f1-tls-cert\") pod \"authorino-74d5b49497-nkbvg\" (UID: \"1f3a7837-afa1-44ff-b315-12ad9060d0f1\") " pod="kuadrant-system/authorino-74d5b49497-nkbvg" Apr 17 21:47:40.036150 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:47:40.036159 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7k2j\" (UniqueName: \"kubernetes.io/projected/1f3a7837-afa1-44ff-b315-12ad9060d0f1-kube-api-access-l7k2j\") pod \"authorino-74d5b49497-nkbvg\" (UID: \"1f3a7837-afa1-44ff-b315-12ad9060d0f1\") " pod="kuadrant-system/authorino-74d5b49497-nkbvg" Apr 17 21:47:40.136631 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:47:40.136603 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/1f3a7837-afa1-44ff-b315-12ad9060d0f1-tls-cert\") pod \"authorino-74d5b49497-nkbvg\" (UID: \"1f3a7837-afa1-44ff-b315-12ad9060d0f1\") " pod="kuadrant-system/authorino-74d5b49497-nkbvg" Apr 17 21:47:40.136800 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:47:40.136645 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-l7k2j\" (UniqueName: \"kubernetes.io/projected/1f3a7837-afa1-44ff-b315-12ad9060d0f1-kube-api-access-l7k2j\") pod \"authorino-74d5b49497-nkbvg\" (UID: \"1f3a7837-afa1-44ff-b315-12ad9060d0f1\") " pod="kuadrant-system/authorino-74d5b49497-nkbvg" Apr 17 21:47:40.139021 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:47:40.138996 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/1f3a7837-afa1-44ff-b315-12ad9060d0f1-tls-cert\") pod \"authorino-74d5b49497-nkbvg\" (UID: \"1f3a7837-afa1-44ff-b315-12ad9060d0f1\") " pod="kuadrant-system/authorino-74d5b49497-nkbvg" Apr 17 21:47:40.143931 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:47:40.143910 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7k2j\" (UniqueName: \"kubernetes.io/projected/1f3a7837-afa1-44ff-b315-12ad9060d0f1-kube-api-access-l7k2j\") pod \"authorino-74d5b49497-nkbvg\" (UID: \"1f3a7837-afa1-44ff-b315-12ad9060d0f1\") " pod="kuadrant-system/authorino-74d5b49497-nkbvg" Apr 17 21:47:40.232451 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:47:40.232378 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-74d5b49497-nkbvg" Apr 17 21:47:40.567241 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:47:40.567213 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-74d5b49497-nkbvg"] Apr 17 21:47:40.568397 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:47:40.568369 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1f3a7837_afa1_44ff_b315_12ad9060d0f1.slice/crio-e0ee5a177309ea3fba55665a53b710dd38cfa2fbe4da71683f49be8cbd57ea73 WatchSource:0}: Error finding container e0ee5a177309ea3fba55665a53b710dd38cfa2fbe4da71683f49be8cbd57ea73: Status 404 returned error can't find the container with id e0ee5a177309ea3fba55665a53b710dd38cfa2fbe4da71683f49be8cbd57ea73 Apr 17 21:47:40.569551 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:47:40.569533 2564 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 21:47:41.096016 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:47:41.095977 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-74d5b49497-nkbvg" event={"ID":"1f3a7837-afa1-44ff-b315-12ad9060d0f1","Type":"ContainerStarted","Data":"e7299ce219b18dcf8bc713476a11edcdf2be931a39df7cde0a2d76bb4c870bec"} Apr 17 21:47:41.096016 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:47:41.096018 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-74d5b49497-nkbvg" event={"ID":"1f3a7837-afa1-44ff-b315-12ad9060d0f1","Type":"ContainerStarted","Data":"e0ee5a177309ea3fba55665a53b710dd38cfa2fbe4da71683f49be8cbd57ea73"} Apr 17 21:47:41.114565 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:47:41.114498 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-74d5b49497-nkbvg" podStartSLOduration=1.680003547 podStartE2EDuration="2.114477444s" podCreationTimestamp="2026-04-17 21:47:39 +0000 UTC" 
firstStartedPulling="2026-04-17 21:47:40.56971072 +0000 UTC m=+657.332470176" lastFinishedPulling="2026-04-17 21:47:41.004184615 +0000 UTC m=+657.766944073" observedRunningTime="2026-04-17 21:47:41.112673987 +0000 UTC m=+657.875433466" watchObservedRunningTime="2026-04-17 21:47:41.114477444 +0000 UTC m=+657.877237007" Apr 17 21:47:41.138791 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:47:41.138757 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-55b44f5d48-lx5xx"] Apr 17 21:47:41.139266 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:47:41.139234 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-55b44f5d48-lx5xx" podUID="b3cbbc88-5d35-42e3-a3ac-e14e31ce1d19" containerName="authorino" containerID="cri-o://09ff32434f6c70e067ffa222e0dfbc9c2c74d7f35977ca016ffbac0942af5761" gracePeriod=30 Apr 17 21:47:41.394164 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:47:41.394132 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-55b44f5d48-lx5xx" Apr 17 21:47:41.550992 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:47:41.550915 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxqvp\" (UniqueName: \"kubernetes.io/projected/b3cbbc88-5d35-42e3-a3ac-e14e31ce1d19-kube-api-access-lxqvp\") pod \"b3cbbc88-5d35-42e3-a3ac-e14e31ce1d19\" (UID: \"b3cbbc88-5d35-42e3-a3ac-e14e31ce1d19\") " Apr 17 21:47:41.550992 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:47:41.550958 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/b3cbbc88-5d35-42e3-a3ac-e14e31ce1d19-tls-cert\") pod \"b3cbbc88-5d35-42e3-a3ac-e14e31ce1d19\" (UID: \"b3cbbc88-5d35-42e3-a3ac-e14e31ce1d19\") " Apr 17 21:47:41.552950 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:47:41.552913 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3cbbc88-5d35-42e3-a3ac-e14e31ce1d19-kube-api-access-lxqvp" (OuterVolumeSpecName: "kube-api-access-lxqvp") pod "b3cbbc88-5d35-42e3-a3ac-e14e31ce1d19" (UID: "b3cbbc88-5d35-42e3-a3ac-e14e31ce1d19"). InnerVolumeSpecName "kube-api-access-lxqvp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 21:47:41.560767 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:47:41.560740 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3cbbc88-5d35-42e3-a3ac-e14e31ce1d19-tls-cert" (OuterVolumeSpecName: "tls-cert") pod "b3cbbc88-5d35-42e3-a3ac-e14e31ce1d19" (UID: "b3cbbc88-5d35-42e3-a3ac-e14e31ce1d19"). InnerVolumeSpecName "tls-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 21:47:41.651567 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:47:41.651524 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lxqvp\" (UniqueName: \"kubernetes.io/projected/b3cbbc88-5d35-42e3-a3ac-e14e31ce1d19-kube-api-access-lxqvp\") on node \"ip-10-0-132-27.ec2.internal\" DevicePath \"\"" Apr 17 21:47:41.651567 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:47:41.651558 2564 reconciler_common.go:299] "Volume detached for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/b3cbbc88-5d35-42e3-a3ac-e14e31ce1d19-tls-cert\") on node \"ip-10-0-132-27.ec2.internal\" DevicePath \"\"" Apr 17 21:47:42.101101 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:47:42.101063 2564 generic.go:358] "Generic (PLEG): container finished" podID="b3cbbc88-5d35-42e3-a3ac-e14e31ce1d19" containerID="09ff32434f6c70e067ffa222e0dfbc9c2c74d7f35977ca016ffbac0942af5761" exitCode=0 Apr 17 21:47:42.101500 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:47:42.101127 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-55b44f5d48-lx5xx" Apr 17 21:47:42.101500 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:47:42.101144 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-55b44f5d48-lx5xx" event={"ID":"b3cbbc88-5d35-42e3-a3ac-e14e31ce1d19","Type":"ContainerDied","Data":"09ff32434f6c70e067ffa222e0dfbc9c2c74d7f35977ca016ffbac0942af5761"} Apr 17 21:47:42.101500 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:47:42.101184 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-55b44f5d48-lx5xx" event={"ID":"b3cbbc88-5d35-42e3-a3ac-e14e31ce1d19","Type":"ContainerDied","Data":"de79d1660ed250ffe664a1de4a2638a31bcfa2e2758808a237c39ab80dcb3c82"} Apr 17 21:47:42.101500 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:47:42.101205 2564 scope.go:117] "RemoveContainer" containerID="09ff32434f6c70e067ffa222e0dfbc9c2c74d7f35977ca016ffbac0942af5761" Apr 17 21:47:42.109430 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:47:42.109403 2564 scope.go:117] "RemoveContainer" containerID="09ff32434f6c70e067ffa222e0dfbc9c2c74d7f35977ca016ffbac0942af5761" Apr 17 21:47:42.109694 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:47:42.109676 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09ff32434f6c70e067ffa222e0dfbc9c2c74d7f35977ca016ffbac0942af5761\": container with ID starting with 09ff32434f6c70e067ffa222e0dfbc9c2c74d7f35977ca016ffbac0942af5761 not found: ID does not exist" containerID="09ff32434f6c70e067ffa222e0dfbc9c2c74d7f35977ca016ffbac0942af5761" Apr 17 21:47:42.109765 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:47:42.109710 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09ff32434f6c70e067ffa222e0dfbc9c2c74d7f35977ca016ffbac0942af5761"} err="failed to get container status \"09ff32434f6c70e067ffa222e0dfbc9c2c74d7f35977ca016ffbac0942af5761\": rpc error: code = 
NotFound desc = could not find container \"09ff32434f6c70e067ffa222e0dfbc9c2c74d7f35977ca016ffbac0942af5761\": container with ID starting with 09ff32434f6c70e067ffa222e0dfbc9c2c74d7f35977ca016ffbac0942af5761 not found: ID does not exist" Apr 17 21:47:42.117276 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:47:42.117251 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-55b44f5d48-lx5xx"] Apr 17 21:47:42.121810 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:47:42.121788 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-55b44f5d48-lx5xx"] Apr 17 21:47:43.695271 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:47:43.695234 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b3cbbc88-5d35-42e3-a3ac-e14e31ce1d19" path="/var/lib/kubelet/pods/b3cbbc88-5d35-42e3-a3ac-e14e31ce1d19/volumes" Apr 17 21:49:05.597385 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:49:05.597337 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-5c4844f674-6pgs5"] Apr 17 21:49:05.597951 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:49:05.597632 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-5c4844f674-6pgs5" podUID="78909686-b704-4628-b0bb-919e316cf769" containerName="manager" containerID="cri-o://7eda1f4e8effade9461b13e3c374270dc77dd43942e385d6ded6a401bf3f1092" gracePeriod=10 Apr 17 21:49:05.830561 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:49:05.830538 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-5c4844f674-6pgs5" Apr 17 21:49:05.856166 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:49:05.856098 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v5jkc\" (UniqueName: \"kubernetes.io/projected/78909686-b704-4628-b0bb-919e316cf769-kube-api-access-v5jkc\") pod \"78909686-b704-4628-b0bb-919e316cf769\" (UID: \"78909686-b704-4628-b0bb-919e316cf769\") " Apr 17 21:49:05.857950 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:49:05.857922 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78909686-b704-4628-b0bb-919e316cf769-kube-api-access-v5jkc" (OuterVolumeSpecName: "kube-api-access-v5jkc") pod "78909686-b704-4628-b0bb-919e316cf769" (UID: "78909686-b704-4628-b0bb-919e316cf769"). InnerVolumeSpecName "kube-api-access-v5jkc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 21:49:05.957120 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:49:05.957090 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-v5jkc\" (UniqueName: \"kubernetes.io/projected/78909686-b704-4628-b0bb-919e316cf769-kube-api-access-v5jkc\") on node \"ip-10-0-132-27.ec2.internal\" DevicePath \"\"" Apr 17 21:49:06.413123 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:49:06.413088 2564 generic.go:358] "Generic (PLEG): container finished" podID="78909686-b704-4628-b0bb-919e316cf769" containerID="7eda1f4e8effade9461b13e3c374270dc77dd43942e385d6ded6a401bf3f1092" exitCode=0 Apr 17 21:49:06.413297 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:49:06.413157 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-5c4844f674-6pgs5" Apr 17 21:49:06.413297 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:49:06.413178 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-5c4844f674-6pgs5" event={"ID":"78909686-b704-4628-b0bb-919e316cf769","Type":"ContainerDied","Data":"7eda1f4e8effade9461b13e3c374270dc77dd43942e385d6ded6a401bf3f1092"} Apr 17 21:49:06.413297 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:49:06.413211 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-5c4844f674-6pgs5" event={"ID":"78909686-b704-4628-b0bb-919e316cf769","Type":"ContainerDied","Data":"4cddcaf1643a4299ad7c0b21d4db942a86f1ec2bb149fcce6991f280fad4f90f"} Apr 17 21:49:06.413297 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:49:06.413227 2564 scope.go:117] "RemoveContainer" containerID="7eda1f4e8effade9461b13e3c374270dc77dd43942e385d6ded6a401bf3f1092" Apr 17 21:49:06.421708 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:49:06.421688 2564 scope.go:117] "RemoveContainer" containerID="7eda1f4e8effade9461b13e3c374270dc77dd43942e385d6ded6a401bf3f1092" Apr 17 21:49:06.421948 ip-10-0-132-27 kubenswrapper[2564]: E0417 21:49:06.421932 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7eda1f4e8effade9461b13e3c374270dc77dd43942e385d6ded6a401bf3f1092\": container with ID starting with 7eda1f4e8effade9461b13e3c374270dc77dd43942e385d6ded6a401bf3f1092 not found: ID does not exist" containerID="7eda1f4e8effade9461b13e3c374270dc77dd43942e385d6ded6a401bf3f1092" Apr 17 21:49:06.421989 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:49:06.421955 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7eda1f4e8effade9461b13e3c374270dc77dd43942e385d6ded6a401bf3f1092"} err="failed to get container status \"7eda1f4e8effade9461b13e3c374270dc77dd43942e385d6ded6a401bf3f1092\": rpc error: code = 
NotFound desc = could not find container \"7eda1f4e8effade9461b13e3c374270dc77dd43942e385d6ded6a401bf3f1092\": container with ID starting with 7eda1f4e8effade9461b13e3c374270dc77dd43942e385d6ded6a401bf3f1092 not found: ID does not exist" Apr 17 21:49:06.433803 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:49:06.433783 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-5c4844f674-6pgs5"] Apr 17 21:49:06.436183 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:49:06.436164 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-5c4844f674-6pgs5"] Apr 17 21:49:07.152366 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:49:07.152333 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-5c4844f674-rsfcc"] Apr 17 21:49:07.152755 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:49:07.152716 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="78909686-b704-4628-b0bb-919e316cf769" containerName="manager" Apr 17 21:49:07.152755 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:49:07.152729 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="78909686-b704-4628-b0bb-919e316cf769" containerName="manager" Apr 17 21:49:07.152755 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:49:07.152743 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b3cbbc88-5d35-42e3-a3ac-e14e31ce1d19" containerName="authorino" Apr 17 21:49:07.152755 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:49:07.152749 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3cbbc88-5d35-42e3-a3ac-e14e31ce1d19" containerName="authorino" Apr 17 21:49:07.152886 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:49:07.152803 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="78909686-b704-4628-b0bb-919e316cf769" containerName="manager" Apr 17 21:49:07.152886 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:49:07.152810 2564 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="b3cbbc88-5d35-42e3-a3ac-e14e31ce1d19" containerName="authorino" Apr 17 21:49:07.157095 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:49:07.157077 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-5c4844f674-rsfcc" Apr 17 21:49:07.159726 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:49:07.159688 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-w5d4j\"" Apr 17 21:49:07.162757 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:49:07.162734 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-5c4844f674-rsfcc"] Apr 17 21:49:07.267098 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:49:07.267064 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdgqg\" (UniqueName: \"kubernetes.io/projected/9aa35df7-0929-42d1-b8ad-f5ec2662f61d-kube-api-access-mdgqg\") pod \"maas-controller-5c4844f674-rsfcc\" (UID: \"9aa35df7-0929-42d1-b8ad-f5ec2662f61d\") " pod="opendatahub/maas-controller-5c4844f674-rsfcc" Apr 17 21:49:07.367677 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:49:07.367644 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mdgqg\" (UniqueName: \"kubernetes.io/projected/9aa35df7-0929-42d1-b8ad-f5ec2662f61d-kube-api-access-mdgqg\") pod \"maas-controller-5c4844f674-rsfcc\" (UID: \"9aa35df7-0929-42d1-b8ad-f5ec2662f61d\") " pod="opendatahub/maas-controller-5c4844f674-rsfcc" Apr 17 21:49:07.375809 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:49:07.375783 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdgqg\" (UniqueName: \"kubernetes.io/projected/9aa35df7-0929-42d1-b8ad-f5ec2662f61d-kube-api-access-mdgqg\") pod \"maas-controller-5c4844f674-rsfcc\" (UID: \"9aa35df7-0929-42d1-b8ad-f5ec2662f61d\") " 
pod="opendatahub/maas-controller-5c4844f674-rsfcc" Apr 17 21:49:07.468461 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:49:07.468380 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-5c4844f674-rsfcc" Apr 17 21:49:07.694863 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:49:07.694832 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78909686-b704-4628-b0bb-919e316cf769" path="/var/lib/kubelet/pods/78909686-b704-4628-b0bb-919e316cf769/volumes" Apr 17 21:49:07.793655 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:49:07.793624 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-5c4844f674-rsfcc"] Apr 17 21:49:07.797093 ip-10-0-132-27 kubenswrapper[2564]: W0417 21:49:07.797060 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9aa35df7_0929_42d1_b8ad_f5ec2662f61d.slice/crio-6712aabfe098bbc2f2b4f8cba58a37f552b222bc130a4213826d9204451b967f WatchSource:0}: Error finding container 6712aabfe098bbc2f2b4f8cba58a37f552b222bc130a4213826d9204451b967f: Status 404 returned error can't find the container with id 6712aabfe098bbc2f2b4f8cba58a37f552b222bc130a4213826d9204451b967f Apr 17 21:49:08.422773 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:49:08.422740 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-5c4844f674-rsfcc" event={"ID":"9aa35df7-0929-42d1-b8ad-f5ec2662f61d","Type":"ContainerStarted","Data":"d369bb04b1fe558cd4f57568c40c325b13aab92a8a92ac2b180e7ed95bc27fca"} Apr 17 21:49:08.422773 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:49:08.422776 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-5c4844f674-rsfcc" event={"ID":"9aa35df7-0929-42d1-b8ad-f5ec2662f61d","Type":"ContainerStarted","Data":"6712aabfe098bbc2f2b4f8cba58a37f552b222bc130a4213826d9204451b967f"} Apr 17 21:49:08.423289 ip-10-0-132-27 
kubenswrapper[2564]: I0417 21:49:08.422814 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-5c4844f674-rsfcc" Apr 17 21:49:08.438961 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:49:08.438911 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-5c4844f674-rsfcc" podStartSLOduration=1.040249213 podStartE2EDuration="1.438896903s" podCreationTimestamp="2026-04-17 21:49:07 +0000 UTC" firstStartedPulling="2026-04-17 21:49:07.79882738 +0000 UTC m=+744.561586851" lastFinishedPulling="2026-04-17 21:49:08.197475084 +0000 UTC m=+744.960234541" observedRunningTime="2026-04-17 21:49:08.436573961 +0000 UTC m=+745.199333440" watchObservedRunningTime="2026-04-17 21:49:08.438896903 +0000 UTC m=+745.201656387" Apr 17 21:49:19.431682 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:49:19.431651 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-5c4844f674-rsfcc" Apr 17 21:51:43.695568 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:51:43.695537 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwgmn_60e0c58d-3db8-4433-a617-00082bd25488/ovn-acl-logging/0.log" Apr 17 21:51:43.697237 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:51:43.697215 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwgmn_60e0c58d-3db8-4433-a617-00082bd25488/ovn-acl-logging/0.log" Apr 17 21:56:43.726107 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:56:43.726073 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwgmn_60e0c58d-3db8-4433-a617-00082bd25488/ovn-acl-logging/0.log" Apr 17 21:56:43.729014 ip-10-0-132-27 kubenswrapper[2564]: I0417 21:56:43.728995 2564 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwgmn_60e0c58d-3db8-4433-a617-00082bd25488/ovn-acl-logging/0.log" Apr 17 22:01:43.754580 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:01:43.754471 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwgmn_60e0c58d-3db8-4433-a617-00082bd25488/ovn-acl-logging/0.log" Apr 17 22:01:43.764651 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:01:43.764627 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwgmn_60e0c58d-3db8-4433-a617-00082bd25488/ovn-acl-logging/0.log" Apr 17 22:04:59.549213 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:04:59.549113 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-78n6k"] Apr 17 22:04:59.552548 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:04:59.552532 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-78n6k" Apr 17 22:04:59.555314 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:04:59.555290 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-v5tkv\"" Apr 17 22:04:59.564440 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:04:59.564417 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-78n6k"] Apr 17 22:04:59.618208 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:04:59.618173 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btp58\" (UniqueName: \"kubernetes.io/projected/f6a0918e-d06d-4d18-84f2-22baeb806fa1-kube-api-access-btp58\") pod \"kuadrant-operator-controller-manager-55c7f4c975-78n6k\" (UID: \"f6a0918e-d06d-4d18-84f2-22baeb806fa1\") " 
pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-78n6k" Apr 17 22:04:59.618380 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:04:59.618222 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/f6a0918e-d06d-4d18-84f2-22baeb806fa1-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-78n6k\" (UID: \"f6a0918e-d06d-4d18-84f2-22baeb806fa1\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-78n6k" Apr 17 22:04:59.719474 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:04:59.719442 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-btp58\" (UniqueName: \"kubernetes.io/projected/f6a0918e-d06d-4d18-84f2-22baeb806fa1-kube-api-access-btp58\") pod \"kuadrant-operator-controller-manager-55c7f4c975-78n6k\" (UID: \"f6a0918e-d06d-4d18-84f2-22baeb806fa1\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-78n6k" Apr 17 22:04:59.719659 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:04:59.719491 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/f6a0918e-d06d-4d18-84f2-22baeb806fa1-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-78n6k\" (UID: \"f6a0918e-d06d-4d18-84f2-22baeb806fa1\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-78n6k" Apr 17 22:04:59.719897 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:04:59.719879 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/f6a0918e-d06d-4d18-84f2-22baeb806fa1-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-78n6k\" (UID: \"f6a0918e-d06d-4d18-84f2-22baeb806fa1\") " 
pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-78n6k" Apr 17 22:04:59.728125 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:04:59.728106 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-btp58\" (UniqueName: \"kubernetes.io/projected/f6a0918e-d06d-4d18-84f2-22baeb806fa1-kube-api-access-btp58\") pod \"kuadrant-operator-controller-manager-55c7f4c975-78n6k\" (UID: \"f6a0918e-d06d-4d18-84f2-22baeb806fa1\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-78n6k" Apr 17 22:04:59.862749 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:04:59.862711 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-78n6k" Apr 17 22:04:59.997652 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:04:59.997627 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-78n6k"] Apr 17 22:04:59.999085 ip-10-0-132-27 kubenswrapper[2564]: W0417 22:04:59.999059 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6a0918e_d06d_4d18_84f2_22baeb806fa1.slice/crio-5cd669d8e74573b8fd148187ec709588ad2119a367eb3cb2b78acc88c0928b54 WatchSource:0}: Error finding container 5cd669d8e74573b8fd148187ec709588ad2119a367eb3cb2b78acc88c0928b54: Status 404 returned error can't find the container with id 5cd669d8e74573b8fd148187ec709588ad2119a367eb3cb2b78acc88c0928b54 Apr 17 22:05:00.001621 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:05:00.001605 2564 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 22:05:00.958459 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:05:00.958419 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-78n6k" 
event={"ID":"f6a0918e-d06d-4d18-84f2-22baeb806fa1","Type":"ContainerStarted","Data":"d5d6d1be5a5113b0f9e1fb8056f3b5d1071bea514a08cc33f4978c33b77a13cd"} Apr 17 22:05:00.958459 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:05:00.958462 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-78n6k" event={"ID":"f6a0918e-d06d-4d18-84f2-22baeb806fa1","Type":"ContainerStarted","Data":"5cd669d8e74573b8fd148187ec709588ad2119a367eb3cb2b78acc88c0928b54"} Apr 17 22:05:00.958910 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:05:00.958522 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-78n6k" Apr 17 22:05:00.985246 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:05:00.985191 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-78n6k" podStartSLOduration=1.9851772680000002 podStartE2EDuration="1.985177268s" podCreationTimestamp="2026-04-17 22:04:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 22:05:00.981726528 +0000 UTC m=+1697.744486006" watchObservedRunningTime="2026-04-17 22:05:00.985177268 +0000 UTC m=+1697.747936746" Apr 17 22:05:11.964277 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:05:11.964247 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-78n6k" Apr 17 22:06:24.432551 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:24.432475 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-74d5b49497-nkbvg_1f3a7837-afa1-44ff-b315-12ad9060d0f1/authorino/0.log" Apr 17 22:06:28.325478 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:28.325448 2564 log.go:25] "Finished parsing log file" 
path="/var/log/pods/opendatahub_maas-api-6c569d4cc5-ssl9t_f67211fa-d3ff-4bb1-9b40-3dd3202edc94/maas-api/0.log" Apr 17 22:06:28.439949 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:28.439908 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-5c4844f674-rsfcc_9aa35df7-0929-42d1-b8ad-f5ec2662f61d/manager/0.log" Apr 17 22:06:28.782213 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:28.782186 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-6bfddf7b9f-btzv9_6d0fa348-40af-4bc6-a265-834e1ef67d2b/manager/0.log" Apr 17 22:06:29.012601 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:29.012557 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-rkqpw_6be905ab-e9cd-42c4-982a-d80fc553f8f9/postgres/0.log" Apr 17 22:06:29.748279 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:29.748246 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kmrgh_95d66543-a4d5-4050-bb06-e7dff0084985/util/0.log" Apr 17 22:06:29.753722 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:29.753704 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kmrgh_95d66543-a4d5-4050-bb06-e7dff0084985/pull/0.log" Apr 17 22:06:29.759108 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:29.759090 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kmrgh_95d66543-a4d5-4050-bb06-e7dff0084985/extract/0.log" Apr 17 22:06:29.864530 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:29.864504 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e09rmmr_2d6243aa-b495-4a71-8b42-309b97bacf8d/util/0.log" Apr 17 22:06:29.870087 ip-10-0-132-27 
kubenswrapper[2564]: I0417 22:06:29.870064 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e09rmmr_2d6243aa-b495-4a71-8b42-309b97bacf8d/pull/0.log" Apr 17 22:06:29.875187 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:29.875128 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e09rmmr_2d6243aa-b495-4a71-8b42-309b97bacf8d/extract/0.log" Apr 17 22:06:29.977857 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:29.977825 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7366wlx_32c546ce-12e5-4717-8def-5b55510d8aeb/util/0.log" Apr 17 22:06:29.983328 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:29.983306 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7366wlx_32c546ce-12e5-4717-8def-5b55510d8aeb/pull/0.log" Apr 17 22:06:29.989200 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:29.989181 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7366wlx_32c546ce-12e5-4717-8def-5b55510d8aeb/extract/0.log" Apr 17 22:06:30.100959 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:30.100930 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1kxsmx_70e4b3a8-b823-4c1b-8bc7-6b87cc668888/pull/0.log" Apr 17 22:06:30.106798 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:30.106777 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1kxsmx_70e4b3a8-b823-4c1b-8bc7-6b87cc668888/extract/0.log" Apr 17 22:06:30.112637 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:30.112585 2564 log.go:25] "Finished 
parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1kxsmx_70e4b3a8-b823-4c1b-8bc7-6b87cc668888/util/0.log" Apr 17 22:06:30.224787 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:30.224758 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-74d5b49497-nkbvg_1f3a7837-afa1-44ff-b315-12ad9060d0f1/authorino/0.log" Apr 17 22:06:30.346156 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:30.346130 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-qbt98_69318013-30f1-47fd-94e0-e17179bc801e/manager/0.log" Apr 17 22:06:30.453758 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:30.453681 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-tm25p_d7abbdd5-20c3-43dc-ab24-9b5945b9a147/manager/0.log" Apr 17 22:06:30.670842 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:30.670814 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-dkhf2_ec702473-f90c-46e0-b46a-cca01ac0f169/registry-server/0.log" Apr 17 22:06:30.783091 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:30.783015 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-78n6k_f6a0918e-d06d-4d18-84f2-22baeb806fa1/manager/0.log" Apr 17 22:06:31.334161 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:31.334129 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-5cb8b776cfhr4ds_6d13e4eb-609d-477a-832d-fdb2831db5a9/istio-proxy/0.log" Apr 17 22:06:31.773702 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:31.773675 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-58b6f876-rqwh8_a23de3a7-3163-4907-9eb0-72991fc680a6/istio-proxy/0.log" Apr 
17 22:06:36.471313 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:36.471276 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-tfkqt/must-gather-6dxkv"] Apr 17 22:06:36.474818 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:36.474801 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tfkqt/must-gather-6dxkv" Apr 17 22:06:36.477696 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:36.477670 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-tfkqt\"/\"openshift-service-ca.crt\"" Apr 17 22:06:36.477865 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:36.477674 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-tfkqt\"/\"kube-root-ca.crt\"" Apr 17 22:06:36.479243 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:36.479215 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-tfkqt\"/\"default-dockercfg-km98p\"" Apr 17 22:06:36.482888 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:36.482867 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tfkqt/must-gather-6dxkv"] Apr 17 22:06:36.659903 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:36.659870 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4flkl\" (UniqueName: \"kubernetes.io/projected/0a730b44-a64f-4505-ad5c-5a23421e2b08-kube-api-access-4flkl\") pod \"must-gather-6dxkv\" (UID: \"0a730b44-a64f-4505-ad5c-5a23421e2b08\") " pod="openshift-must-gather-tfkqt/must-gather-6dxkv" Apr 17 22:06:36.660087 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:36.659984 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0a730b44-a64f-4505-ad5c-5a23421e2b08-must-gather-output\") pod 
\"must-gather-6dxkv\" (UID: \"0a730b44-a64f-4505-ad5c-5a23421e2b08\") " pod="openshift-must-gather-tfkqt/must-gather-6dxkv" Apr 17 22:06:36.760645 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:36.760522 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0a730b44-a64f-4505-ad5c-5a23421e2b08-must-gather-output\") pod \"must-gather-6dxkv\" (UID: \"0a730b44-a64f-4505-ad5c-5a23421e2b08\") " pod="openshift-must-gather-tfkqt/must-gather-6dxkv" Apr 17 22:06:36.760645 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:36.760578 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4flkl\" (UniqueName: \"kubernetes.io/projected/0a730b44-a64f-4505-ad5c-5a23421e2b08-kube-api-access-4flkl\") pod \"must-gather-6dxkv\" (UID: \"0a730b44-a64f-4505-ad5c-5a23421e2b08\") " pod="openshift-must-gather-tfkqt/must-gather-6dxkv" Apr 17 22:06:36.760891 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:36.760870 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0a730b44-a64f-4505-ad5c-5a23421e2b08-must-gather-output\") pod \"must-gather-6dxkv\" (UID: \"0a730b44-a64f-4505-ad5c-5a23421e2b08\") " pod="openshift-must-gather-tfkqt/must-gather-6dxkv" Apr 17 22:06:36.768248 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:36.768221 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4flkl\" (UniqueName: \"kubernetes.io/projected/0a730b44-a64f-4505-ad5c-5a23421e2b08-kube-api-access-4flkl\") pod \"must-gather-6dxkv\" (UID: \"0a730b44-a64f-4505-ad5c-5a23421e2b08\") " pod="openshift-must-gather-tfkqt/must-gather-6dxkv" Apr 17 22:06:36.785783 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:36.785762 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-tfkqt/must-gather-6dxkv" Apr 17 22:06:37.119209 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:37.119183 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tfkqt/must-gather-6dxkv"] Apr 17 22:06:37.122035 ip-10-0-132-27 kubenswrapper[2564]: W0417 22:06:37.121999 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a730b44_a64f_4505_ad5c_5a23421e2b08.slice/crio-a7fc4370396026d012de63d803bef2c74e9f904a24341109825f3fe4e72244c4 WatchSource:0}: Error finding container a7fc4370396026d012de63d803bef2c74e9f904a24341109825f3fe4e72244c4: Status 404 returned error can't find the container with id a7fc4370396026d012de63d803bef2c74e9f904a24341109825f3fe4e72244c4 Apr 17 22:06:37.332737 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:37.332693 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tfkqt/must-gather-6dxkv" event={"ID":"0a730b44-a64f-4505-ad5c-5a23421e2b08","Type":"ContainerStarted","Data":"a7fc4370396026d012de63d803bef2c74e9f904a24341109825f3fe4e72244c4"} Apr 17 22:06:38.338862 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:38.338816 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tfkqt/must-gather-6dxkv" event={"ID":"0a730b44-a64f-4505-ad5c-5a23421e2b08","Type":"ContainerStarted","Data":"d2b216dd179f658a004ac5f9a41cbb0614ad872fe7e27a9370958831281eae68"} Apr 17 22:06:38.338862 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:38.338867 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tfkqt/must-gather-6dxkv" event={"ID":"0a730b44-a64f-4505-ad5c-5a23421e2b08","Type":"ContainerStarted","Data":"81a61f5b10344cccfb9038f98ccdeef0d2f02b41708027207ac3684ae5e73540"} Apr 17 22:06:38.356284 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:38.356235 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-must-gather-tfkqt/must-gather-6dxkv" podStartSLOduration=1.564751898 podStartE2EDuration="2.356218601s" podCreationTimestamp="2026-04-17 22:06:36 +0000 UTC" firstStartedPulling="2026-04-17 22:06:37.123801678 +0000 UTC m=+1793.886561133" lastFinishedPulling="2026-04-17 22:06:37.915268375 +0000 UTC m=+1794.678027836" observedRunningTime="2026-04-17 22:06:38.35301628 +0000 UTC m=+1795.115775758" watchObservedRunningTime="2026-04-17 22:06:38.356218601 +0000 UTC m=+1795.118978099" Apr 17 22:06:39.466879 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:39.466854 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-8d4bc_202e1a9f-d233-463c-8e71-87d017274c62/global-pull-secret-syncer/0.log" Apr 17 22:06:39.623155 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:39.623094 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-bmcmp_ab744ec2-8a2e-4824-b6e1-3ec78e188e1e/konnectivity-agent/0.log" Apr 17 22:06:39.661242 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:39.661213 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-132-27.ec2.internal_2064d33e84866d45f0ed11cc547caffc/haproxy/0.log" Apr 17 22:06:42.754164 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:42.754137 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kmrgh_95d66543-a4d5-4050-bb06-e7dff0084985/extract/0.log" Apr 17 22:06:42.776987 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:42.776962 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kmrgh_95d66543-a4d5-4050-bb06-e7dff0084985/util/0.log" Apr 17 22:06:42.800573 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:42.800546 2564 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_0acee64185f523d1d1272e9af2e4d9333e0dcde792ba30e1fa9605b759kmrgh_95d66543-a4d5-4050-bb06-e7dff0084985/pull/0.log" Apr 17 22:06:42.832408 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:42.832381 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e09rmmr_2d6243aa-b495-4a71-8b42-309b97bacf8d/extract/0.log" Apr 17 22:06:42.856724 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:42.856697 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e09rmmr_2d6243aa-b495-4a71-8b42-309b97bacf8d/util/0.log" Apr 17 22:06:42.888797 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:42.888773 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_19cb86e64775c5699d5aacf881a09c2d51e7f55f9e1ff096f2a667c5e09rmmr_2d6243aa-b495-4a71-8b42-309b97bacf8d/pull/0.log" Apr 17 22:06:42.931319 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:42.931283 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7366wlx_32c546ce-12e5-4717-8def-5b55510d8aeb/extract/0.log" Apr 17 22:06:42.953264 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:42.953233 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7366wlx_32c546ce-12e5-4717-8def-5b55510d8aeb/util/0.log" Apr 17 22:06:42.981974 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:42.981833 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_5fc36eb1065777914bfe0ff85f9a202e2a3bafefb563d9e2994474ed7366wlx_32c546ce-12e5-4717-8def-5b55510d8aeb/pull/0.log" Apr 17 22:06:43.014340 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:43.014136 2564 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1kxsmx_70e4b3a8-b823-4c1b-8bc7-6b87cc668888/extract/0.log" Apr 17 22:06:43.034646 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:43.034582 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1kxsmx_70e4b3a8-b823-4c1b-8bc7-6b87cc668888/util/0.log" Apr 17 22:06:43.057093 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:43.057045 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_9438e18bbad664a016701a3153a6b421e2d977be7eb0117f80bcf45ef1kxsmx_70e4b3a8-b823-4c1b-8bc7-6b87cc668888/pull/0.log" Apr 17 22:06:43.248809 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:43.246181 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-74d5b49497-nkbvg_1f3a7837-afa1-44ff-b315-12ad9060d0f1/authorino/0.log" Apr 17 22:06:43.291440 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:43.291319 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-qbt98_69318013-30f1-47fd-94e0-e17179bc801e/manager/0.log" Apr 17 22:06:43.325886 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:43.325851 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-tm25p_d7abbdd5-20c3-43dc-ab24-9b5945b9a147/manager/0.log" Apr 17 22:06:43.394015 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:43.393966 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-dkhf2_ec702473-f90c-46e0-b46a-cca01ac0f169/registry-server/0.log" Apr 17 22:06:43.421463 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:43.421420 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-78n6k_f6a0918e-d06d-4d18-84f2-22baeb806fa1/manager/0.log" Apr 17 22:06:43.809315 
ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:43.809174 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwgmn_60e0c58d-3db8-4433-a617-00082bd25488/ovn-acl-logging/0.log" Apr 17 22:06:43.821191 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:43.810675 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwgmn_60e0c58d-3db8-4433-a617-00082bd25488/ovn-acl-logging/0.log" Apr 17 22:06:45.117461 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:45.117427 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-7695548645-6xkws_d1610621-e8c1-4e95-b7a7-8c1d05baf41e/metrics-server/0.log" Apr 17 22:06:45.160040 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:45.159994 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-8xn87_0c19899f-0a5f-4d4c-a8fb-a19bb5647096/monitoring-plugin/0.log" Apr 17 22:06:45.196055 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:45.196020 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-hdrt9_ea58c7c4-3af0-45e9-a977-19e7daff6f40/node-exporter/0.log" Apr 17 22:06:45.213650 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:45.213582 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-hdrt9_ea58c7c4-3af0-45e9-a977-19e7daff6f40/kube-rbac-proxy/0.log" Apr 17 22:06:45.236334 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:45.236304 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-hdrt9_ea58c7c4-3af0-45e9-a977-19e7daff6f40/init-textfile/0.log" Apr 17 22:06:45.720320 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:45.720071 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-2sqmx_8fed131c-a37b-449e-a94e-f54a719a120d/prometheus-operator/0.log" Apr 
17 22:06:45.738003 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:45.737885 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-2sqmx_8fed131c-a37b-449e-a94e-f54a719a120d/kube-rbac-proxy/0.log" Apr 17 22:06:47.255552 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:47.255520 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-lkn7k_9140d7c7-facf-4eae-9286-221c2b1004b9/networking-console-plugin/0.log" Apr 17 22:06:48.345713 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:48.345684 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-cqv24_db3b129d-4a40-4b37-82a8-f37d592345aa/download-server/0.log" Apr 17 22:06:48.625741 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:48.625653 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-tfkqt/perf-node-gather-daemonset-ghxh2"] Apr 17 22:06:48.633579 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:48.633545 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-tfkqt/perf-node-gather-daemonset-ghxh2"
Apr 17 22:06:48.638241 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:48.638208 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tfkqt/perf-node-gather-daemonset-ghxh2"]
Apr 17 22:06:48.683295 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:48.683264 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/cc6cb38e-0015-4510-8f89-0ef7fd82edf5-podres\") pod \"perf-node-gather-daemonset-ghxh2\" (UID: \"cc6cb38e-0015-4510-8f89-0ef7fd82edf5\") " pod="openshift-must-gather-tfkqt/perf-node-gather-daemonset-ghxh2"
Apr 17 22:06:48.683471 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:48.683313 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cc6cb38e-0015-4510-8f89-0ef7fd82edf5-sys\") pod \"perf-node-gather-daemonset-ghxh2\" (UID: \"cc6cb38e-0015-4510-8f89-0ef7fd82edf5\") " pod="openshift-must-gather-tfkqt/perf-node-gather-daemonset-ghxh2"
Apr 17 22:06:48.683471 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:48.683336 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cc6cb38e-0015-4510-8f89-0ef7fd82edf5-lib-modules\") pod \"perf-node-gather-daemonset-ghxh2\" (UID: \"cc6cb38e-0015-4510-8f89-0ef7fd82edf5\") " pod="openshift-must-gather-tfkqt/perf-node-gather-daemonset-ghxh2"
Apr 17 22:06:48.683471 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:48.683398 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfqs7\" (UniqueName: \"kubernetes.io/projected/cc6cb38e-0015-4510-8f89-0ef7fd82edf5-kube-api-access-mfqs7\") pod \"perf-node-gather-daemonset-ghxh2\" (UID: \"cc6cb38e-0015-4510-8f89-0ef7fd82edf5\") " pod="openshift-must-gather-tfkqt/perf-node-gather-daemonset-ghxh2"
Apr 17 22:06:48.683471 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:48.683435 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/cc6cb38e-0015-4510-8f89-0ef7fd82edf5-proc\") pod \"perf-node-gather-daemonset-ghxh2\" (UID: \"cc6cb38e-0015-4510-8f89-0ef7fd82edf5\") " pod="openshift-must-gather-tfkqt/perf-node-gather-daemonset-ghxh2"
Apr 17 22:06:48.783958 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:48.783926 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/cc6cb38e-0015-4510-8f89-0ef7fd82edf5-proc\") pod \"perf-node-gather-daemonset-ghxh2\" (UID: \"cc6cb38e-0015-4510-8f89-0ef7fd82edf5\") " pod="openshift-must-gather-tfkqt/perf-node-gather-daemonset-ghxh2"
Apr 17 22:06:48.784185 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:48.784157 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/cc6cb38e-0015-4510-8f89-0ef7fd82edf5-proc\") pod \"perf-node-gather-daemonset-ghxh2\" (UID: \"cc6cb38e-0015-4510-8f89-0ef7fd82edf5\") " pod="openshift-must-gather-tfkqt/perf-node-gather-daemonset-ghxh2"
Apr 17 22:06:48.784394 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:48.784357 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/cc6cb38e-0015-4510-8f89-0ef7fd82edf5-podres\") pod \"perf-node-gather-daemonset-ghxh2\" (UID: \"cc6cb38e-0015-4510-8f89-0ef7fd82edf5\") " pod="openshift-must-gather-tfkqt/perf-node-gather-daemonset-ghxh2"
Apr 17 22:06:48.784525 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:48.784424 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cc6cb38e-0015-4510-8f89-0ef7fd82edf5-sys\") pod \"perf-node-gather-daemonset-ghxh2\" (UID: \"cc6cb38e-0015-4510-8f89-0ef7fd82edf5\") " pod="openshift-must-gather-tfkqt/perf-node-gather-daemonset-ghxh2"
Apr 17 22:06:48.784525 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:48.784451 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cc6cb38e-0015-4510-8f89-0ef7fd82edf5-lib-modules\") pod \"perf-node-gather-daemonset-ghxh2\" (UID: \"cc6cb38e-0015-4510-8f89-0ef7fd82edf5\") " pod="openshift-must-gather-tfkqt/perf-node-gather-daemonset-ghxh2"
Apr 17 22:06:48.784525 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:48.784521 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mfqs7\" (UniqueName: \"kubernetes.io/projected/cc6cb38e-0015-4510-8f89-0ef7fd82edf5-kube-api-access-mfqs7\") pod \"perf-node-gather-daemonset-ghxh2\" (UID: \"cc6cb38e-0015-4510-8f89-0ef7fd82edf5\") " pod="openshift-must-gather-tfkqt/perf-node-gather-daemonset-ghxh2"
Apr 17 22:06:48.785020 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:48.784982 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/cc6cb38e-0015-4510-8f89-0ef7fd82edf5-podres\") pod \"perf-node-gather-daemonset-ghxh2\" (UID: \"cc6cb38e-0015-4510-8f89-0ef7fd82edf5\") " pod="openshift-must-gather-tfkqt/perf-node-gather-daemonset-ghxh2"
Apr 17 22:06:48.785238 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:48.785222 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cc6cb38e-0015-4510-8f89-0ef7fd82edf5-lib-modules\") pod \"perf-node-gather-daemonset-ghxh2\" (UID: \"cc6cb38e-0015-4510-8f89-0ef7fd82edf5\") " pod="openshift-must-gather-tfkqt/perf-node-gather-daemonset-ghxh2"
Apr 17 22:06:48.785393 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:48.785366 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cc6cb38e-0015-4510-8f89-0ef7fd82edf5-sys\") pod \"perf-node-gather-daemonset-ghxh2\" (UID: \"cc6cb38e-0015-4510-8f89-0ef7fd82edf5\") " pod="openshift-must-gather-tfkqt/perf-node-gather-daemonset-ghxh2"
Apr 17 22:06:48.793383 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:48.793355 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfqs7\" (UniqueName: \"kubernetes.io/projected/cc6cb38e-0015-4510-8f89-0ef7fd82edf5-kube-api-access-mfqs7\") pod \"perf-node-gather-daemonset-ghxh2\" (UID: \"cc6cb38e-0015-4510-8f89-0ef7fd82edf5\") " pod="openshift-must-gather-tfkqt/perf-node-gather-daemonset-ghxh2"
Apr 17 22:06:48.950711 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:48.950585 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-tfkqt/perf-node-gather-daemonset-ghxh2"
Apr 17 22:06:49.116647 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:49.113708 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-tfkqt/perf-node-gather-daemonset-ghxh2"]
Apr 17 22:06:49.406833 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:49.406033 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tfkqt/perf-node-gather-daemonset-ghxh2" event={"ID":"cc6cb38e-0015-4510-8f89-0ef7fd82edf5","Type":"ContainerStarted","Data":"777e2d3fb4229a6c58e544474f75e045fea346364eab0af881c9a347bfb5e078"}
Apr 17 22:06:49.406833 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:49.406082 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-tfkqt/perf-node-gather-daemonset-ghxh2" event={"ID":"cc6cb38e-0015-4510-8f89-0ef7fd82edf5","Type":"ContainerStarted","Data":"7c7783512f12b59a45fb06a1bf511a09bffc3761841075c6b95142844cdfbb31"}
Apr 17 22:06:49.407435 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:49.406879 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-tfkqt/perf-node-gather-daemonset-ghxh2"
Apr 17 22:06:49.424903 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:49.424847 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-tfkqt/perf-node-gather-daemonset-ghxh2" podStartSLOduration=1.4248277630000001 podStartE2EDuration="1.424827763s" podCreationTimestamp="2026-04-17 22:06:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 22:06:49.422638415 +0000 UTC m=+1806.185397893" watchObservedRunningTime="2026-04-17 22:06:49.424827763 +0000 UTC m=+1806.187587245"
Apr 17 22:06:49.821017 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:49.820985 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-wcgv2_d9e8a464-4161-4835-bb2a-311f468b76b3/dns/0.log"
Apr 17 22:06:49.840025 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:49.839983 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-wcgv2_d9e8a464-4161-4835-bb2a-311f468b76b3/kube-rbac-proxy/0.log"
Apr 17 22:06:49.890333 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:49.890302 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-l7tk5_116be4c2-a389-4822-bd06-12d2e0fcf15a/dns-node-resolver/0.log"
Apr 17 22:06:50.415290 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:50.415260 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-798d665f68-m7frd_ab02dab3-df62-4fe4-91ca-ca5f92bce3f2/registry/0.log"
Apr 17 22:06:50.473607 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:50.473562 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-nlczj_960eb4ac-0adf-443b-8b6e-b34cc770fb3a/node-ca/0.log"
Apr 17 22:06:51.280613 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:51.280559 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-5cb8b776cfhr4ds_6d13e4eb-609d-477a-832d-fdb2831db5a9/istio-proxy/0.log"
Apr 17 22:06:51.462603 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:51.462560 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-58b6f876-rqwh8_a23de3a7-3163-4907-9eb0-72991fc680a6/istio-proxy/0.log"
Apr 17 22:06:52.019562 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:52.019528 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-9fcp4_d95e33bb-b1a4-4e97-8fbf-eab2a80b4bdf/serve-healthcheck-canary/0.log"
Apr 17 22:06:52.604974 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:52.604945 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-tlkjn_2296bc55-e2c4-4c73-b08d-0e0583540f74/kube-rbac-proxy/0.log"
Apr 17 22:06:52.622993 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:52.622970 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-tlkjn_2296bc55-e2c4-4c73-b08d-0e0583540f74/exporter/0.log"
Apr 17 22:06:52.640663 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:52.640637 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-tlkjn_2296bc55-e2c4-4c73-b08d-0e0583540f74/extractor/0.log"
Apr 17 22:06:54.549677 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:54.549646 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-6c569d4cc5-ssl9t_f67211fa-d3ff-4bb1-9b40-3dd3202edc94/maas-api/0.log"
Apr 17 22:06:54.587301 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:54.587276 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-5c4844f674-rsfcc_9aa35df7-0929-42d1-b8ad-f5ec2662f61d/manager/0.log"
Apr 17 22:06:54.676376 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:54.676343 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-6bfddf7b9f-btzv9_6d0fa348-40af-4bc6-a265-834e1ef67d2b/manager/0.log"
Apr 17 22:06:54.720497 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:54.720468 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-rkqpw_6be905ab-e9cd-42c4-982a-d80fc553f8f9/postgres/0.log"
Apr 17 22:06:55.816119 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:55.816085 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-64dbd89fbc-44bpv_8d9ad950-c742-40fe-9770-d484bfdea043/manager/0.log"
Apr 17 22:06:55.859481 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:55.859451 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-82jnl_bf52de7f-83de-4f9e-9f54-61cc669988ec/openshift-lws-operator/0.log"
Apr 17 22:06:56.426719 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:06:56.426692 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-tfkqt/perf-node-gather-daemonset-ghxh2"
Apr 17 22:07:01.822579 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:07:01.822547 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4wvmp_1ced47e5-b9ea-4efa-8587-2c824560fd6c/kube-multus-additional-cni-plugins/0.log"
Apr 17 22:07:01.843407 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:07:01.843381 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4wvmp_1ced47e5-b9ea-4efa-8587-2c824560fd6c/egress-router-binary-copy/0.log"
Apr 17 22:07:01.863756 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:07:01.863727 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4wvmp_1ced47e5-b9ea-4efa-8587-2c824560fd6c/cni-plugins/0.log"
Apr 17 22:07:01.889499 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:07:01.889479 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4wvmp_1ced47e5-b9ea-4efa-8587-2c824560fd6c/bond-cni-plugin/0.log"
Apr 17 22:07:01.910450 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:07:01.910395 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4wvmp_1ced47e5-b9ea-4efa-8587-2c824560fd6c/routeoverride-cni/0.log"
Apr 17 22:07:01.936751 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:07:01.936722 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4wvmp_1ced47e5-b9ea-4efa-8587-2c824560fd6c/whereabouts-cni-bincopy/0.log"
Apr 17 22:07:01.956441 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:07:01.956412 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4wvmp_1ced47e5-b9ea-4efa-8587-2c824560fd6c/whereabouts-cni/0.log"
Apr 17 22:07:02.335886 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:07:02.335860 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-v9gzn_87fbd26a-4a22-4878-91ae-b4b73c69c322/kube-multus/0.log"
Apr 17 22:07:02.356729 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:07:02.356703 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-hqbt5_ee2090c8-65ec-46e0-9614-f6f0ddae32d7/network-metrics-daemon/0.log"
Apr 17 22:07:02.372464 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:07:02.372443 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-hqbt5_ee2090c8-65ec-46e0-9614-f6f0ddae32d7/kube-rbac-proxy/0.log"
Apr 17 22:07:03.508236 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:07:03.508207 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwgmn_60e0c58d-3db8-4433-a617-00082bd25488/ovn-controller/0.log"
Apr 17 22:07:03.524367 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:07:03.524343 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwgmn_60e0c58d-3db8-4433-a617-00082bd25488/ovn-acl-logging/0.log"
Apr 17 22:07:03.536670 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:07:03.536648 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwgmn_60e0c58d-3db8-4433-a617-00082bd25488/ovn-acl-logging/1.log"
Apr 17 22:07:03.553749 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:07:03.553721 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwgmn_60e0c58d-3db8-4433-a617-00082bd25488/kube-rbac-proxy-node/0.log"
Apr 17 22:07:03.573897 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:07:03.573873 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwgmn_60e0c58d-3db8-4433-a617-00082bd25488/kube-rbac-proxy-ovn-metrics/0.log"
Apr 17 22:07:03.590862 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:07:03.590839 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwgmn_60e0c58d-3db8-4433-a617-00082bd25488/northd/0.log"
Apr 17 22:07:03.608412 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:07:03.608393 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwgmn_60e0c58d-3db8-4433-a617-00082bd25488/nbdb/0.log"
Apr 17 22:07:03.626958 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:07:03.626937 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwgmn_60e0c58d-3db8-4433-a617-00082bd25488/sbdb/0.log"
Apr 17 22:07:03.731851 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:07:03.731815 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kwgmn_60e0c58d-3db8-4433-a617-00082bd25488/ovnkube-controller/0.log"
Apr 17 22:07:05.104932 ip-10-0-132-27 kubenswrapper[2564]: I0417 22:07:05.104908 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-ddrrn_803936af-5a7f-4be9-bc47-8ca0f94064a9/network-check-target-container/0.log"