Apr 20 14:52:53.208005 ip-10-0-133-163 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 20 14:52:53.208017 ip-10-0-133-163 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 20 14:52:53.208024 ip-10-0-133-163 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 20 14:52:53.208234 ip-10-0-133-163 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 20 14:53:03.350163 ip-10-0-133-163 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 20 14:53:03.350183 ip-10-0-133-163 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 8eb94e1b745543a4be643e3b8e13a4b3 --
Apr 20 14:55:16.214576 ip-10-0-133-163 systemd[1]: Starting Kubernetes Kubelet...
Apr 20 14:55:16.687209 ip-10-0-133-163 kubenswrapper[2570]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 14:55:16.687209 ip-10-0-133-163 kubenswrapper[2570]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 20 14:55:16.687209 ip-10-0-133-163 kubenswrapper[2570]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 14:55:16.687209 ip-10-0-133-163 kubenswrapper[2570]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 20 14:55:16.687209 ip-10-0-133-163 kubenswrapper[2570]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
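The five warnings above are harmless at startup: the kubelet is pointing out that most of these flags now belong in the config file passed via --config (here /etc/kubernetes/kubelet.conf, per the FLAG dump further down). As a minimal sketch, assuming PyYAML is available, the three "set via the config file" warnings translate into KubeletConfiguration (kubelet.config.k8s.io/v1beta1) fields like this; the values are the ones visible later in this log:

    # Sketch: KubeletConfiguration equivalents of the deprecated
    # --container-runtime-endpoint, --volume-plugin-dir and --system-reserved
    # flags warned about above. Values come from the FLAG dump below.
    import yaml  # PyYAML

    kubelet_config = {
        "apiVersion": "kubelet.config.k8s.io/v1beta1",
        "kind": "KubeletConfiguration",
        # replaces --container-runtime-endpoint="/var/run/crio/crio.sock"
        "containerRuntimeEndpoint": "/var/run/crio/crio.sock",
        # replaces --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
        "volumePluginDir": "/etc/kubernetes/kubelet-plugins/volume/exec",
        # replaces --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
        "systemReserved": {"cpu": "500m", "ephemeral-storage": "1Gi", "memory": "1Gi"},
    }

    print(yaml.safe_dump(kubelet_config, sort_keys=False))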
Apr 20 14:55:16.690700 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.690611 2570 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 20 14:55:16.696973 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.696951 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 14:55:16.696973 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.696970 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 14:55:16.696973 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.696974 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 14:55:16.696973 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.696977 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 14:55:16.696973 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.696980 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 14:55:16.696973 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.696983 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 14:55:16.697204 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.696986 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 14:55:16.697204 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.696989 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 14:55:16.697204 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.696992 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 14:55:16.697204 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.696995 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 14:55:16.697204 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.696997 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 14:55:16.697204 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697000 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 14:55:16.697204 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697003 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 14:55:16.697204 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697005 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 14:55:16.697204 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697008 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 14:55:16.697204 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697010 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 14:55:16.697204 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697013 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 14:55:16.697204 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697015 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 14:55:16.697204 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697017 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 14:55:16.697204 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697020 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 14:55:16.697204 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697023 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 14:55:16.697204 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697025 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 14:55:16.697204 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697028 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 14:55:16.697204 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697031 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 14:55:16.697204 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697033 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 14:55:16.697674 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697036 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 14:55:16.697674 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697041 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 14:55:16.697674 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697045 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 14:55:16.697674 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697049 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 14:55:16.697674 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697054 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 14:55:16.697674 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697057 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 14:55:16.697674 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697060 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 14:55:16.697674 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697062 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 14:55:16.697674 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697065 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 14:55:16.697674 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697068 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 14:55:16.697674 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697070 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 14:55:16.697674 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697073 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 14:55:16.697674 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697075 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 14:55:16.697674 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697078 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 14:55:16.697674 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697081 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 14:55:16.697674 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697084 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 14:55:16.697674 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697087 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 14:55:16.697674 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697089 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 14:55:16.697674 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697092 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 14:55:16.698165 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697094 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 14:55:16.698165 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697097 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 14:55:16.698165 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697099 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 14:55:16.698165 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697102 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 14:55:16.698165 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697104 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 14:55:16.698165 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697107 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 14:55:16.698165 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697110 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 14:55:16.698165 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697113 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 14:55:16.698165 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697115 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 14:55:16.698165 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697117 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 14:55:16.698165 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697120 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 14:55:16.698165 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697122 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 14:55:16.698165 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697125 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 14:55:16.698165 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697127 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 14:55:16.698165 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697129 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 14:55:16.698165 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697132 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 14:55:16.698165 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697135 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 14:55:16.698165 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697138 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 14:55:16.698165 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697140 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 14:55:16.698165 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697143 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 14:55:16.698680 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697145 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 14:55:16.698680 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697148 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 14:55:16.698680 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697150 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 14:55:16.698680 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697153 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 14:55:16.698680 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697155 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 14:55:16.698680 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697157 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 14:55:16.698680 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697160 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 14:55:16.698680 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697164 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 14:55:16.698680 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697167 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 14:55:16.698680 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697169 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 14:55:16.698680 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697172 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 14:55:16.698680 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697174 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 14:55:16.698680 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697176 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 14:55:16.698680 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697179 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 14:55:16.698680 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697188 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 14:55:16.698680 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697191 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 14:55:16.698680 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697194 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 14:55:16.698680 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697196 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 14:55:16.698680 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697199 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 14:55:16.698680 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697201 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 14:55:16.699158 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697204 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 14:55:16.699158 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697207 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
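Every gate named above is an OpenShift-level gate that this kubelet's feature_gate parser does not know, and warnings like these recur each time the gate set is parsed, so the journal gets noisy fast. A minimal sketch for summarizing a saved excerpt (the kubelet.log path is hypothetical, e.g. captured with journalctl -u kubelet):

    # Sketch: count how often each "unrecognized feature gate" warning
    # appears in a saved journal excerpt, one summary line per gate.
    import re
    from collections import Counter

    PATTERN = re.compile(r"unrecognized feature gate: (\S+)")

    counts = Counter()
    with open("kubelet.log") as f:  # hypothetical saved excerpt of this journal
        for line in f:
            match = PATTERN.search(line)
            if match:
                counts[match.group(1)] += 1

    for gate, n in counts.most_common():
        print(f"{n:3d}  {gate}")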
ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697850 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 14:55:16.700579 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697853 2570 feature_gate.go:328] unrecognized feature gate: Example Apr 20 14:55:16.700579 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697856 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 14:55:16.700579 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697858 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 14:55:16.700579 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697862 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 20 14:55:16.700579 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697865 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 14:55:16.700579 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697868 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 14:55:16.700579 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697872 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 14:55:16.701067 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697875 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 14:55:16.701067 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697878 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 14:55:16.701067 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697881 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 14:55:16.701067 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697884 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 14:55:16.701067 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697887 2570 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 14:55:16.701067 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697892 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 14:55:16.701067 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697895 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 14:55:16.701067 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697897 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 14:55:16.701067 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697900 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 14:55:16.701067 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.697903 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 14:55:16.701067 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698523 2570 flags.go:64] FLAG: --address="0.0.0.0" Apr 20 14:55:16.701067 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698533 2570 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Apr 20 14:55:16.701067 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698540 2570 flags.go:64] FLAG: --anonymous-auth="true" Apr 20 14:55:16.701067 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698544 2570 flags.go:64] FLAG: --application-metrics-count-limit="100" Apr 20 14:55:16.701067 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698549 2570 flags.go:64] FLAG: --authentication-token-webhook="false" Apr 20 14:55:16.701067 ip-10-0-133-163 
kubenswrapper[2570]: I0420 14:55:16.698552 2570 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Apr 20 14:55:16.701067 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698556 2570 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Apr 20 14:55:16.701067 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698561 2570 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Apr 20 14:55:16.701067 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698564 2570 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Apr 20 14:55:16.701067 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698567 2570 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Apr 20 14:55:16.701067 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698571 2570 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Apr 20 14:55:16.701597 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698574 2570 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Apr 20 14:55:16.701597 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698577 2570 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Apr 20 14:55:16.701597 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698580 2570 flags.go:64] FLAG: --cgroup-root="" Apr 20 14:55:16.701597 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698583 2570 flags.go:64] FLAG: --cgroups-per-qos="true" Apr 20 14:55:16.701597 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698586 2570 flags.go:64] FLAG: --client-ca-file="" Apr 20 14:55:16.701597 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698589 2570 flags.go:64] FLAG: --cloud-config="" Apr 20 14:55:16.701597 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698591 2570 flags.go:64] FLAG: --cloud-provider="external" Apr 20 14:55:16.701597 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698594 2570 flags.go:64] FLAG: --cluster-dns="[]" Apr 20 14:55:16.701597 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698598 2570 flags.go:64] FLAG: --cluster-domain="" Apr 20 14:55:16.701597 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698601 2570 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Apr 20 14:55:16.701597 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698604 2570 flags.go:64] FLAG: --config-dir="" Apr 20 14:55:16.701597 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698607 2570 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Apr 20 14:55:16.701597 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698610 2570 flags.go:64] FLAG: --container-log-max-files="5" Apr 20 14:55:16.701597 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698614 2570 flags.go:64] FLAG: --container-log-max-size="10Mi" Apr 20 14:55:16.701597 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698617 2570 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Apr 20 14:55:16.701597 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698620 2570 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Apr 20 14:55:16.701597 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698626 2570 flags.go:64] FLAG: --containerd-namespace="k8s.io" Apr 20 14:55:16.701597 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698629 2570 flags.go:64] FLAG: --contention-profiling="false" Apr 20 14:55:16.701597 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698632 2570 flags.go:64] FLAG: --cpu-cfs-quota="true" Apr 20 14:55:16.701597 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698635 
2570 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Apr 20 14:55:16.701597 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698638 2570 flags.go:64] FLAG: --cpu-manager-policy="none" Apr 20 14:55:16.701597 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698641 2570 flags.go:64] FLAG: --cpu-manager-policy-options="" Apr 20 14:55:16.701597 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698645 2570 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Apr 20 14:55:16.701597 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698648 2570 flags.go:64] FLAG: --enable-controller-attach-detach="true" Apr 20 14:55:16.701597 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698651 2570 flags.go:64] FLAG: --enable-debugging-handlers="true" Apr 20 14:55:16.702220 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698654 2570 flags.go:64] FLAG: --enable-load-reader="false" Apr 20 14:55:16.702220 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698657 2570 flags.go:64] FLAG: --enable-server="true" Apr 20 14:55:16.702220 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698660 2570 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Apr 20 14:55:16.702220 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698664 2570 flags.go:64] FLAG: --event-burst="100" Apr 20 14:55:16.702220 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698667 2570 flags.go:64] FLAG: --event-qps="50" Apr 20 14:55:16.702220 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698670 2570 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 20 14:55:16.702220 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698673 2570 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 20 14:55:16.702220 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698676 2570 flags.go:64] FLAG: --eviction-hard="" Apr 20 14:55:16.702220 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698680 2570 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 20 14:55:16.702220 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698682 2570 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 20 14:55:16.702220 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698685 2570 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 20 14:55:16.702220 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698689 2570 flags.go:64] FLAG: --eviction-soft="" Apr 20 14:55:16.702220 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698692 2570 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 20 14:55:16.702220 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698695 2570 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 20 14:55:16.702220 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698698 2570 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 20 14:55:16.702220 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698701 2570 flags.go:64] FLAG: --experimental-mounter-path="" Apr 20 14:55:16.702220 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698704 2570 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 20 14:55:16.702220 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698707 2570 flags.go:64] FLAG: --fail-swap-on="true" Apr 20 14:55:16.702220 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698709 2570 flags.go:64] FLAG: --feature-gates="" Apr 20 14:55:16.702220 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698716 2570 flags.go:64] FLAG: --file-check-frequency="20s" Apr 20 14:55:16.702220 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698719 
2570 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 20 14:55:16.702220 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698723 2570 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 20 14:55:16.702220 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698726 2570 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 20 14:55:16.702220 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698729 2570 flags.go:64] FLAG: --healthz-port="10248" Apr 20 14:55:16.702220 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698732 2570 flags.go:64] FLAG: --help="false" Apr 20 14:55:16.702849 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698735 2570 flags.go:64] FLAG: --hostname-override="ip-10-0-133-163.ec2.internal" Apr 20 14:55:16.702849 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698739 2570 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 20 14:55:16.702849 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698741 2570 flags.go:64] FLAG: --http-check-frequency="20s" Apr 20 14:55:16.702849 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698744 2570 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 20 14:55:16.702849 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698748 2570 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 20 14:55:16.702849 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698751 2570 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 20 14:55:16.702849 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698754 2570 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 20 14:55:16.702849 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698757 2570 flags.go:64] FLAG: --image-service-endpoint="" Apr 20 14:55:16.702849 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698759 2570 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 20 14:55:16.702849 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698763 2570 flags.go:64] FLAG: --kube-api-burst="100" Apr 20 14:55:16.702849 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698766 2570 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 20 14:55:16.702849 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698769 2570 flags.go:64] FLAG: --kube-api-qps="50" Apr 20 14:55:16.702849 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698772 2570 flags.go:64] FLAG: --kube-reserved="" Apr 20 14:55:16.702849 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698775 2570 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 20 14:55:16.702849 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698777 2570 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 20 14:55:16.702849 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698780 2570 flags.go:64] FLAG: --kubelet-cgroups="" Apr 20 14:55:16.702849 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698783 2570 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 20 14:55:16.702849 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698786 2570 flags.go:64] FLAG: --lock-file="" Apr 20 14:55:16.702849 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698788 2570 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 20 14:55:16.702849 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698792 2570 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 20 14:55:16.702849 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698795 
2570 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 20 14:55:16.702849 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698800 2570 flags.go:64] FLAG: --log-json-split-stream="false" Apr 20 14:55:16.702849 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698803 2570 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 20 14:55:16.703432 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698806 2570 flags.go:64] FLAG: --log-text-split-stream="false" Apr 20 14:55:16.703432 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698808 2570 flags.go:64] FLAG: --logging-format="text" Apr 20 14:55:16.703432 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698811 2570 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 20 14:55:16.703432 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698816 2570 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 20 14:55:16.703432 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698819 2570 flags.go:64] FLAG: --manifest-url="" Apr 20 14:55:16.703432 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698822 2570 flags.go:64] FLAG: --manifest-url-header="" Apr 20 14:55:16.703432 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698831 2570 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 20 14:55:16.703432 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698834 2570 flags.go:64] FLAG: --max-open-files="1000000" Apr 20 14:55:16.703432 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698839 2570 flags.go:64] FLAG: --max-pods="110" Apr 20 14:55:16.703432 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698842 2570 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 20 14:55:16.703432 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698845 2570 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 20 14:55:16.703432 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698847 2570 flags.go:64] FLAG: --memory-manager-policy="None" Apr 20 14:55:16.703432 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698850 2570 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 20 14:55:16.703432 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698853 2570 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 20 14:55:16.703432 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698856 2570 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 20 14:55:16.703432 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698859 2570 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 20 14:55:16.703432 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698867 2570 flags.go:64] FLAG: --node-status-max-images="50" Apr 20 14:55:16.703432 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698870 2570 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 20 14:55:16.703432 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698873 2570 flags.go:64] FLAG: --oom-score-adj="-999" Apr 20 14:55:16.703432 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698876 2570 flags.go:64] FLAG: --pod-cidr="" Apr 20 14:55:16.703432 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698879 2570 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 20 14:55:16.703432 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698884 2570 flags.go:64] FLAG: --pod-manifest-path="" Apr 20 14:55:16.703432 ip-10-0-133-163 kubenswrapper[2570]: 
I0420 14:55:16.698887 2570 flags.go:64] FLAG: --pod-max-pids="-1" Apr 20 14:55:16.703432 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698890 2570 flags.go:64] FLAG: --pods-per-core="0" Apr 20 14:55:16.704003 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698893 2570 flags.go:64] FLAG: --port="10250" Apr 20 14:55:16.704003 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698896 2570 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 20 14:55:16.704003 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698898 2570 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-09f51ac051133ff47" Apr 20 14:55:16.704003 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698901 2570 flags.go:64] FLAG: --qos-reserved="" Apr 20 14:55:16.704003 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698904 2570 flags.go:64] FLAG: --read-only-port="10255" Apr 20 14:55:16.704003 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698908 2570 flags.go:64] FLAG: --register-node="true" Apr 20 14:55:16.704003 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698911 2570 flags.go:64] FLAG: --register-schedulable="true" Apr 20 14:55:16.704003 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698913 2570 flags.go:64] FLAG: --register-with-taints="" Apr 20 14:55:16.704003 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698917 2570 flags.go:64] FLAG: --registry-burst="10" Apr 20 14:55:16.704003 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698920 2570 flags.go:64] FLAG: --registry-qps="5" Apr 20 14:55:16.704003 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698923 2570 flags.go:64] FLAG: --reserved-cpus="" Apr 20 14:55:16.704003 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698925 2570 flags.go:64] FLAG: --reserved-memory="" Apr 20 14:55:16.704003 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698930 2570 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 20 14:55:16.704003 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698933 2570 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 20 14:55:16.704003 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698936 2570 flags.go:64] FLAG: --rotate-certificates="false" Apr 20 14:55:16.704003 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698939 2570 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 20 14:55:16.704003 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698942 2570 flags.go:64] FLAG: --runonce="false" Apr 20 14:55:16.704003 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698945 2570 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 20 14:55:16.704003 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698948 2570 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 20 14:55:16.704003 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698950 2570 flags.go:64] FLAG: --seccomp-default="false" Apr 20 14:55:16.704003 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698953 2570 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 20 14:55:16.704003 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698956 2570 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 20 14:55:16.704003 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698959 2570 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 20 14:55:16.704003 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698962 2570 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 20 14:55:16.704003 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698965 2570 flags.go:64] FLAG: 
--storage-driver-password="root" Apr 20 14:55:16.704003 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698967 2570 flags.go:64] FLAG: --storage-driver-secure="false" Apr 20 14:55:16.704632 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698970 2570 flags.go:64] FLAG: --storage-driver-table="stats" Apr 20 14:55:16.704632 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698973 2570 flags.go:64] FLAG: --storage-driver-user="root" Apr 20 14:55:16.704632 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698976 2570 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 20 14:55:16.704632 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698979 2570 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 20 14:55:16.704632 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698982 2570 flags.go:64] FLAG: --system-cgroups="" Apr 20 14:55:16.704632 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698985 2570 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 20 14:55:16.704632 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698990 2570 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 20 14:55:16.704632 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698993 2570 flags.go:64] FLAG: --tls-cert-file="" Apr 20 14:55:16.704632 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698995 2570 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 20 14:55:16.704632 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.698999 2570 flags.go:64] FLAG: --tls-min-version="" Apr 20 14:55:16.704632 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.699002 2570 flags.go:64] FLAG: --tls-private-key-file="" Apr 20 14:55:16.704632 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.699005 2570 flags.go:64] FLAG: --topology-manager-policy="none" Apr 20 14:55:16.704632 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.699008 2570 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 20 14:55:16.704632 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.699010 2570 flags.go:64] FLAG: --topology-manager-scope="container" Apr 20 14:55:16.704632 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.699013 2570 flags.go:64] FLAG: --v="2" Apr 20 14:55:16.704632 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.699017 2570 flags.go:64] FLAG: --version="false" Apr 20 14:55:16.704632 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.699021 2570 flags.go:64] FLAG: --vmodule="" Apr 20 14:55:16.704632 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.699026 2570 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 20 14:55:16.704632 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.699030 2570 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 20 14:55:16.704632 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.699125 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 14:55:16.704632 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.699130 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 14:55:16.704632 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.699133 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 14:55:16.704632 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.699135 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 14:55:16.704632 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.699138 2570 feature_gate.go:328] unrecognized feature gate: 
AdditionalRoutingCapabilities
Apr 20 14:55:16.705605 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.699140 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 14:55:16.705605 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.699143 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 14:55:16.705605 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.699145 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 14:55:16.705605 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.699148 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 14:55:16.705605 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.699150 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 14:55:16.705605 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.699153 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 14:55:16.705605 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.699156 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 14:55:16.705605 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.699158 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 14:55:16.705605 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.699161 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 14:55:16.705605 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.699164 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 14:55:16.705605 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.699166 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 14:55:16.705605 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.699169 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 14:55:16.705605 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.699172 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 14:55:16.705605 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.699175 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 14:55:16.705605 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.699177 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 14:55:16.705605 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.699179 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 14:55:16.705605 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.699182 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 14:55:16.705605 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.699184 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 14:55:16.705605 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.699187 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 14:55:16.705605 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.699189 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 14:55:16.706497 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.699193 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 14:55:16.706497 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.699196 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 14:55:16.706497 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.699198 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 14:55:16.706497 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.699202 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 14:55:16.706497 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.699205 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 14:55:16.706497 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.699208 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 14:55:16.706497 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.699212 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 14:55:16.706497 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.699215 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 14:55:16.706497 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.699217 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 14:55:16.706497 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.699220 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 14:55:16.706497 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.699222 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 14:55:16.706497 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.699225 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 14:55:16.706497 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.699227 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 14:55:16.706497 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.699230 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 14:55:16.706497 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.699232 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 14:55:16.706497 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.699235 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 14:55:16.706497 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.699237 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 14:55:16.706497 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.699240 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 14:55:16.706497 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.699242 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 14:55:16.707337 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.699245 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 14:55:16.707337 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.699247 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 14:55:16.707337 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.699249 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 14:55:16.707337 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.699252 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 14:55:16.707337 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.699254 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 14:55:16.707337 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.699258 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 14:55:16.707337 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.699260 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 14:55:16.707337 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.699263 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 14:55:16.707337 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.699265 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 14:55:16.707337 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.699267 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 14:55:16.707337 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.699270 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 14:55:16.707337 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.699273 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 14:55:16.707337 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.699275 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 14:55:16.707337 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.699279 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 14:55:16.707337 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.699282 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 14:55:16.707337 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.699284 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 14:55:16.707337 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.699286 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 14:55:16.707337 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.699289 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 14:55:16.707337 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.699291 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 14:55:16.707337 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.699295 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 14:55:16.708164 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.699297 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 14:55:16.708164 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.699318 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 14:55:16.708164 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.699321 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 14:55:16.708164 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.699325 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
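
(Annotation: three distinct feature_gate messages are interleaved in the stream above. feature_gate.go:328 fires once per OpenShift cluster-level gate that is not registered in the kubelet's own gate table, and as far as this log shows each such gate is warned about and then ignored; feature_gate.go:351 flags a gate that has already graduated to GA (ServiceAccountTokenNodeBinding), and feature_gate.go:349 flags the deprecated KMSv1 gate, both of which are still applied but slated for removal. Below is a minimal, self-contained Go sketch of this tolerant parsing pattern; the kubelet's real implementation lives in k8s.io/component-base/featuregate, so the names spec, registry, and setFromMap here are invented for illustration, not that code.)

package main

import "fmt"

// spec mirrors the shape of a registered feature gate: a default
// value plus a maturity level that drives the warnings seen above.
type spec struct {
	def      bool
	maturity string // "ALPHA", "BETA", "GA", "DEPRECATED"
}

// registry stands in for the kubelet's known-gate table; anything
// absent from it corresponds to a feature_gate.go:328 warning.
var registry = map[string]spec{
	"ServiceAccountTokenNodeBinding": {def: true, maturity: "GA"},
	"KMSv1":                          {def: false, maturity: "DEPRECATED"},
	"NodeSwap":                       {def: false, maturity: "BETA"},
}

// setFromMap applies requested gate values, warning (not failing)
// on unknown, GA, or deprecated gates, matching the behavior in this log.
func setFromMap(requested map[string]bool) map[string]bool {
	effective := map[string]bool{}
	for name, s := range registry {
		effective[name] = s.def
	}
	for name, val := range requested {
		s, ok := registry[name]
		if !ok {
			fmt.Printf("W: unrecognized feature gate: %s\n", name)
			continue
		}
		switch s.maturity {
		case "GA":
			fmt.Printf("W: Setting GA feature gate %s=%v. It will be removed in a future release.\n", name, val)
		case "DEPRECATED":
			fmt.Printf("W: Setting deprecated feature gate %s=%v. It will be removed in a future release.\n", name, val)
		}
		effective[name] = val
	}
	return effective
}

func main() {
	eff := setFromMap(map[string]bool{
		"ServiceAccountTokenNodeBinding": true,
		"KMSv1":                          true,
		"GatewayAPI":                     true, // unknown to the kubelet: warned and dropped
	})
	fmt.Println(eff)
}
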
Apr 20 14:55:16.708164 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.699328 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 14:55:16.708164 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.699331 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 14:55:16.708164 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.699334 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 14:55:16.708164 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.699336 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 14:55:16.708164 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.699338 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 14:55:16.708164 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.699341 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 14:55:16.708164 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.699343 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 14:55:16.708164 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.699346 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 14:55:16.708164 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.699348 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 14:55:16.708164 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.699351 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 14:55:16.708164 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.699353 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 14:55:16.708164 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.699355 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 14:55:16.708164 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.699358 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 14:55:16.708164 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.699360 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 14:55:16.708164 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.699364 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 14:55:16.708164 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.699367 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 14:55:16.708678 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.699369 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 14:55:16.708678 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.699372 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 14:55:16.708678 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.699380 2570 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 14:55:16.708678 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.706867 2570 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 20 14:55:16.708678 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.706888 2570 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 20 14:55:16.708678 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.706957 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 14:55:16.708678 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.706969 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 14:55:16.708678 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.706975 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 14:55:16.708678 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.706980 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 14:55:16.708678 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.706985 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 14:55:16.708678 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.706990 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 14:55:16.708678 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.706994 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 14:55:16.708678 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707000 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 14:55:16.708678 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707007 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 14:55:16.708678 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707011 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 14:55:16.709153 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707015 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 14:55:16.709153 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707019 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 14:55:16.709153 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707023 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 14:55:16.709153 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707027 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 14:55:16.709153 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707031 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 14:55:16.709153 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707035 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 14:55:16.709153 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707039 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 14:55:16.709153 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707044 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 14:55:16.709153 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707048 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 14:55:16.709153 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707053 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 14:55:16.709153 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707057 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 14:55:16.709153 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707061 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 14:55:16.709153 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707065 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 14:55:16.709153 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707069 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 14:55:16.709153 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707073 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 14:55:16.709153 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707077 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 14:55:16.709153 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707081 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 14:55:16.709153 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707085 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 14:55:16.709153 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707089 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 14:55:16.709875 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707093 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 14:55:16.709875 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707097 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 14:55:16.709875 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707103 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 14:55:16.709875 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707108 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 14:55:16.709875 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707113 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 14:55:16.709875 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707117 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 14:55:16.709875 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707121 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 14:55:16.709875 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707125 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 14:55:16.709875 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707129 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 14:55:16.709875 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707133 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 14:55:16.709875 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707137 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 14:55:16.709875 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707141 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 14:55:16.709875 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707145 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 14:55:16.709875 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707149 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 14:55:16.709875 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707153 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 14:55:16.709875 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707157 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 14:55:16.709875 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707161 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 14:55:16.709875 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707165 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 14:55:16.709875 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707169 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 14:55:16.709875 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707173 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 14:55:16.710754 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707177 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 14:55:16.710754 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707181 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 14:55:16.710754 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707185 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 14:55:16.710754 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707189 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 14:55:16.710754 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707193 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 14:55:16.710754 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707198 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 14:55:16.710754 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707202 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 14:55:16.710754 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707206 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 14:55:16.710754 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707211 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 14:55:16.710754 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707215 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 14:55:16.710754 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707219 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 14:55:16.710754 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707223 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 14:55:16.710754 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707228 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 14:55:16.710754 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707232 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 14:55:16.710754 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707236 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 14:55:16.710754 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707241 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 14:55:16.710754 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707246 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 14:55:16.710754 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707250 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 14:55:16.710754 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707254 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 14:55:16.710754 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707258 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 14:55:16.711415 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707262 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 14:55:16.711415 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707266 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 14:55:16.711415 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707270 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 14:55:16.711415 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707274 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 14:55:16.711415 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707278 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 14:55:16.711415 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707282 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 14:55:16.711415 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707286 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 14:55:16.711415 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707290 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 14:55:16.711415 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707294 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 14:55:16.711415 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707318 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 14:55:16.711415 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707322 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 14:55:16.711415 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707326 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 14:55:16.711415 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707331 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 14:55:16.711415 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707334 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 14:55:16.711415 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707338 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 14:55:16.711415 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707342 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 14:55:16.711415 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707346 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 14:55:16.711986 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.707354 2570 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 14:55:16.711986 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707515 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 14:55:16.711986 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707523 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 14:55:16.711986 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707528 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 14:55:16.711986 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707532 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 14:55:16.711986 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707537 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 14:55:16.711986 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707541 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 14:55:16.711986 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707545 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 14:55:16.711986 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707549 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 14:55:16.711986 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707554 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 14:55:16.711986 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707558 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 14:55:16.711986 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707563 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 14:55:16.711986 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707567 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 14:55:16.711986 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707571 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 14:55:16.711986 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707575 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 14:55:16.712428 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707579 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 14:55:16.712428 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707583 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 14:55:16.712428 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707587 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 14:55:16.712428 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707591 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 14:55:16.712428 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707595 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 14:55:16.712428 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707599 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 14:55:16.712428 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707603 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 14:55:16.712428 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707607 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 14:55:16.712428 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707611 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 14:55:16.712428 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707618 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 14:55:16.712428 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707624 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 14:55:16.712428 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707628 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 14:55:16.712428 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707633 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 14:55:16.712428 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707637 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 14:55:16.712428 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707642 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 14:55:16.712428 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707646 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 14:55:16.712428 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707650 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 14:55:16.712428 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707654 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 14:55:16.712428 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707659 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 14:55:16.712428 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707663 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 14:55:16.712965 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707667 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 14:55:16.712965 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707671 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 14:55:16.712965 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707675 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 14:55:16.712965 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707679 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 14:55:16.712965 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707685 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
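
(Annotation: each parsing pass ends in a feature_gate.go:384 record that prints the effective kubelet-level gate map. The summary has now appeared twice with identical contents, and a third identical copy follows below; checking that these dumps match across passes, and across reboots, is the main thing worth doing with this flood. A small Go sketch for turning such a dump into a map so two dumps can be diffed; the {map[...]} format is assumed from this log alone, and parseGateDump is an invented helper, not a kubelet function.)

package main

import (
	"fmt"
	"strings"
)

// parseGateDump converts a feature_gate.go:384 summary such as
// {map[KMSv1:true NodeSwap:false ...]} into a map[string]bool.
func parseGateDump(s string) (map[string]bool, error) {
	s = strings.TrimPrefix(strings.TrimSpace(s), "{map[")
	s = strings.TrimSuffix(s, "]}")
	out := map[string]bool{}
	for _, pair := range strings.Fields(s) {
		name, val, ok := strings.Cut(pair, ":")
		if !ok {
			return nil, fmt.Errorf("malformed entry %q", pair)
		}
		out[name] = val == "true"
	}
	return out, nil
}

func main() {
	a, _ := parseGateDump("{map[KMSv1:true NodeSwap:false ImageVolume:true]}")
	b, _ := parseGateDump("{map[KMSv1:true NodeSwap:true ImageVolume:true]}")
	// Report entries whose value changed between the two dumps.
	for k, va := range a {
		if vb, ok := b[k]; ok && vb != va {
			fmt.Printf("%s: %v -> %v\n", k, va, vb)
		}
	}
}
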
Apr 20 14:55:16.712965 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707691 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 14:55:16.712965 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707695 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 14:55:16.712965 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707699 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 14:55:16.712965 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707703 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 14:55:16.712965 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707708 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 14:55:16.712965 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707713 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 14:55:16.712965 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707716 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 14:55:16.712965 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707721 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 14:55:16.712965 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707726 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 14:55:16.712965 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707730 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 14:55:16.712965 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707734 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 14:55:16.712965 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707738 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 14:55:16.712965 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707742 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 14:55:16.712965 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707746 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 14:55:16.713446 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707750 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 14:55:16.713446 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707754 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 14:55:16.713446 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707758 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 14:55:16.713446 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707762 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 14:55:16.713446 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707766 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 14:55:16.713446 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707770 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 14:55:16.713446 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707774 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 14:55:16.713446 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707778 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 14:55:16.713446 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707782 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 14:55:16.713446 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707786 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 14:55:16.713446 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707790 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 14:55:16.713446 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707794 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 14:55:16.713446 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707799 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 14:55:16.713446 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707802 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 14:55:16.713446 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707806 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 14:55:16.713446 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707811 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 14:55:16.713446 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707815 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 14:55:16.713446 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707818 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 14:55:16.713446 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707823 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 14:55:16.713446 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707827 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 14:55:16.713956 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707831 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 14:55:16.713956 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707835 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 14:55:16.713956 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707839 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 14:55:16.713956 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707844 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 14:55:16.713956 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707848 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 14:55:16.713956 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707852 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 14:55:16.713956 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707856 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 14:55:16.713956 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707860 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 14:55:16.713956 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707864 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 14:55:16.713956 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707868 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 14:55:16.713956 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707872 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 14:55:16.713956 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707876 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 14:55:16.713956 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:16.707879 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 14:55:16.713956 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.707887 2570 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 14:55:16.713956 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.708759 2570 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 20 14:55:16.714356 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.711546 2570 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 20 14:55:16.714356 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.712691 2570 server.go:1019] "Starting client certificate rotation"
Apr 20 14:55:16.714356 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.712785 2570 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 14:55:16.714356 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.712820 2570 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 14:55:16.738893 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.738876 2570 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 14:55:16.741466 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.741446 2570 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 14:55:16.758723 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.758702 2570 log.go:25] "Validated CRI v1 runtime API"
Apr 20 14:55:16.764433 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.764419 2570 log.go:25] "Validated CRI v1 image API"
Apr 20 14:55:16.765687 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.765676 2570 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 20 14:55:16.768046 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.768027 2570 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 20 14:55:16.768133 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.768085 2570 fs.go:135] Filesystem UUIDs: map[06c26ce0-c8c8-4220-b3f0-54a79c6278f0:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 7b38bdfb-72cc-443d-a41f-848d660d5af5:/dev/nvme0n1p3]
Apr 20 14:55:16.768133 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.768104 2570 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 20 14:55:16.774242 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.774132 2570 manager.go:217] Machine: {Timestamp:2026-04-20 14:55:16.772458008 +0000 UTC m=+0.428506309 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3099977 MemoryCapacity:32812167168 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec27bf73390d6e077c8802a73a6fa14a SystemUUID:ec27bf73-390d-6e07-7c88-02a73a6fa14a BootID:8eb94e1b-7455-43a4-be64-3e3b8e13a4b3 Filesystems:[{Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406081536 Type:vfs Inodes:4005391 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:02:07:3f:f5:5b Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:02:07:3f:f5:5b Speed:0 Mtu:9001} {Name:ovs-system MacAddress:3a:91:47:ee:bd:fe Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812167168 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 20 14:55:16.774242 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.774238 2570 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
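
(Annotation: the manager.go:217 Machine record above is cAdvisor's one-time hardware inventory for this node: 8 logical CPUs on 4 physical cores, 32812167168 bytes of RAM, no swap, the 120 GiB NVMe disk with its /boot and /var partitions, and the NUMA/cache topology. A rough Go sketch of how two of those fields can be derived on Linux; cAdvisor's actual probing is considerably more involved, and memoryCapacityBytes is an invented helper, so treat this as illustrative only.)

package main

import (
	"bufio"
	"fmt"
	"os"
	"runtime"
	"strconv"
	"strings"
)

// memoryCapacityBytes reads MemTotal from /proc/meminfo, which is
// where a MemoryCapacity figure like 32812167168 ultimately comes
// from (the value in the file is expressed in kB).
func memoryCapacityBytes() (uint64, error) {
	f, err := os.Open("/proc/meminfo")
	if err != nil {
		return 0, err
	}
	defer f.Close()
	sc := bufio.NewScanner(f)
	for sc.Scan() {
		fields := strings.Fields(sc.Text())
		if len(fields) >= 2 && fields[0] == "MemTotal:" {
			kb, err := strconv.ParseUint(fields[1], 10, 64)
			return kb * 1024, err
		}
	}
	return 0, fmt.Errorf("MemTotal not found")
}

func main() {
	mem, err := memoryCapacityBytes()
	if err != nil {
		panic(err)
	}
	// NumCores in the log counts logical CPUs (hyperthreads included).
	fmt.Printf("NumCores:%d MemoryCapacity:%d\n", runtime.NumCPU(), mem)
}
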
Apr 20 14:55:16.774360 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.774324 2570 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 20 14:55:16.777038 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.777020 2570 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 20 14:55:16.777186 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.777041 2570 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-133-163.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 20 14:55:16.777227 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.777192 2570 topology_manager.go:138] "Creating topology manager with none policy"
Apr 20 14:55:16.777227 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.777201 2570 container_manager_linux.go:306] "Creating device plugin manager"
Apr 20 14:55:16.777227 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.777214 2570 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 20 14:55:16.778292 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.778282 2570 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 20 14:55:16.779135 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.779125 2570 state_mem.go:36] "Initialized new in-memory state store"
Apr 20 14:55:16.779407 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.779397 2570 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 20 14:55:16.782126 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.782116 2570 kubelet.go:491] "Attempting to sync node with API server"
Apr 20 14:55:16.782158 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.782138 2570 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 20 14:55:16.782158 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.782150 2570 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 20 14:55:16.782228 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.782159 2570 kubelet.go:397] "Adding apiserver pod source"
Apr 20 14:55:16.782228 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.782173 2570 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 20 14:55:16.783450 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.783435 2570 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 20 14:55:16.783450 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.783453 2570 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 20 14:55:16.786635 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.786619 2570 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 20 14:55:16.787914 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.787901 2570 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 20 14:55:16.789676 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.789658 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 20 14:55:16.789676 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.789676 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 20 14:55:16.789676 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.789682 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 20 14:55:16.789873 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.789688 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 20 14:55:16.789873 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.789693 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 20 14:55:16.789873 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.789699 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 20 14:55:16.789873 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.789714 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 20 14:55:16.789873 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.789720 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 20 14:55:16.789873 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.789728 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 20 14:55:16.789873 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.789737 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 20 14:55:16.789873 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.789753 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 20 14:55:16.789873 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.789761 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 20 14:55:16.790514 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.790505 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 20 14:55:16.790514 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.790515 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 20 14:55:16.791501 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.791478 2570 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-rrb2z"
Apr 20 14:55:16.793724 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:16.793701 2570 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 20 14:55:16.793779 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:16.793723 2570 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-133-163.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 20 14:55:16.793987 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.793975 2570 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 20 14:55:16.794021 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.794014 2570 server.go:1295] "Started kubelet"
Apr 20 14:55:16.794132 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.794099 2570 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 20 14:55:16.794205 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.794159 2570 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 20 14:55:16.794251 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.794236 2570 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 20 14:55:16.794983 ip-10-0-133-163 systemd[1]: Started Kubernetes Kubelet.
Apr 20 14:55:16.795518 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.795318 2570 server.go:317] "Adding debug handlers to kubelet server"
Apr 20 14:55:16.795879 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.795852 2570 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 20 14:55:16.798672 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.798653 2570 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-rrb2z"
Apr 20 14:55:16.804375 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.804314 2570 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 20 14:55:16.804655 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:16.804628 2570 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 20 14:55:16.804802 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.804784 2570 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 20 14:55:16.805765 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.805736 2570 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 20 14:55:16.805765 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.805760 2570 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 20 14:55:16.805903 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.805883 2570 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 20 14:55:16.805955 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.805947 2570 reconstruct.go:97] "Volume reconstruction finished"
Apr 20 14:55:16.805955 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.805955 2570 reconciler.go:26] "Reconciler: start to sync state"
Apr 20 14:55:16.806118 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.806101 2570 factory.go:153] Registering CRI-O factory
Apr 20 14:55:16.806192 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.806128 2570 factory.go:223] Registration of the crio container factory successfully
Apr 20 14:55:16.806192 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.806179 2570 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 20 14:55:16.806192 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.806189 2570 factory.go:55] Registering systemd factory
Apr 20 14:55:16.806376 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.806199 2570 factory.go:223] Registration of the systemd container factory successfully
Apr 20 14:55:16.806376 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.806214 2570 factory.go:103] Registering Raw factory
Apr 20 14:55:16.806376 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.806234 2570 manager.go:1196] Started watching for new ooms in manager
Apr 20 14:55:16.806376 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:16.806263 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-163.ec2.internal\" not found"
Apr 20 14:55:16.806726 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.806704 2570 manager.go:319] Starting recovery of all containers
Apr 20 14:55:16.807666 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.807647 2570 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 14:55:16.810080 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:16.810058 2570 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-133-163.ec2.internal\" not found" node="ip-10-0-133-163.ec2.internal"
Apr 20 14:55:16.810931 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.810761 2570 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-133-163.ec2.internal" not found
Apr 20 14:55:16.815387 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.815352 2570 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 20 14:55:16.817244 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.817231 2570 manager.go:324] Recovery completed
Apr 20 14:55:16.819136 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:16.819094 2570 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory
Apr 20 14:55:16.822173 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.822156 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 14:55:16.824727 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.824712 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-163.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 14:55:16.824793 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.824739 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-163.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 14:55:16.824793 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.824751 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-163.ec2.internal" event="NodeHasSufficientPID"
Apr 20 14:55:16.825252 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.825239 2570 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 20 14:55:16.825252 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.825250 2570 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 20 14:55:16.825402 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.825290 2570 state_mem.go:36] "Initialized new in-memory state store"
Apr 20 14:55:16.825873 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.825861 2570 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-133-163.ec2.internal" not found
Apr 20 14:55:16.827784 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.827771 2570 policy_none.go:49] "None policy: Start"
Apr 20 14:55:16.827849 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.827789 2570 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 20 14:55:16.827849 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.827802 2570 state_mem.go:35] "Initializing new in-memory state store"
Apr 20 14:55:16.868519 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.868503 2570 manager.go:341] "Starting Device Plugin manager"
Apr 20 14:55:16.868618 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:16.868558 2570 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 20 14:55:16.868618 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.868574 2570 server.go:85] "Starting device plugin registration server"
Apr 20 14:55:16.868855 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.868841 2570 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 20 14:55:16.868907 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.868859 2570 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 20 14:55:16.868979 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.868966 2570 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 20 14:55:16.869270 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.869042 2570 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 20 14:55:16.869270 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.869053 2570 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 20 14:55:16.870214 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:16.870197 2570 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 20 14:55:16.870293 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:16.870250 2570 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-133-163.ec2.internal\" not found"
Apr 20 14:55:16.887023 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.887002 2570 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-133-163.ec2.internal" not found
Apr 20 14:55:16.969377 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.969330 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 14:55:16.970419 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.970390 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-163.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 14:55:16.970495 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.970429 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-163.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 14:55:16.970495 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.970440 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-163.ec2.internal" event="NodeHasSufficientPID"
Apr 20 14:55:16.970495 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.970465 2570 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-133-163.ec2.internal"
Apr 20 14:55:16.978826 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.978807 2570 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-133-163.ec2.internal"
Apr 20 14:55:16.978912 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:16.978829 2570 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-133-163.ec2.internal\": node \"ip-10-0-133-163.ec2.internal\" not found"
Apr 20 14:55:16.978955 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.978922 2570 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 20 14:55:16.978955 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.978945 2570 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 20 14:55:16.979013 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.978962 2570 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 20 14:55:16.979013 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.978969 2570 kubelet.go:2451] "Starting kubelet main sync loop" Apr 20 14:55:16.979013 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:16.978998 2570 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 20 14:55:16.982763 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:16.982740 2570 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 14:55:17.011814 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:17.011797 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-163.ec2.internal\" not found" Apr 20 14:55:17.080025 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:17.079995 2570 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-163.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-133-163.ec2.internal"] Apr 20 14:55:17.080106 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:17.080094 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 14:55:17.081650 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:17.081629 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-163.ec2.internal" event="NodeHasSufficientMemory" Apr 20 14:55:17.081745 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:17.081670 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-163.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 14:55:17.081745 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:17.081684 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-163.ec2.internal" event="NodeHasSufficientPID" Apr 20 14:55:17.082966 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:17.082951 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 14:55:17.083085 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:17.083068 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-163.ec2.internal" Apr 20 14:55:17.083128 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:17.083102 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 14:55:17.083676 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:17.083658 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-163.ec2.internal" event="NodeHasSufficientMemory" Apr 20 14:55:17.083736 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:17.083691 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-163.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 14:55:17.083736 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:17.083659 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-163.ec2.internal" event="NodeHasSufficientMemory" Apr 20 14:55:17.083736 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:17.083725 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-163.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 14:55:17.083837 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:17.083703 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-163.ec2.internal" event="NodeHasSufficientPID" Apr 20 14:55:17.083837 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:17.083757 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-163.ec2.internal" event="NodeHasSufficientPID" Apr 20 14:55:17.084965 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:17.084952 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-163.ec2.internal" Apr 20 14:55:17.085011 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:17.084976 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 14:55:17.085631 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:17.085616 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-163.ec2.internal" event="NodeHasSufficientMemory" Apr 20 14:55:17.085700 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:17.085638 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-163.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 14:55:17.085700 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:17.085651 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-163.ec2.internal" event="NodeHasSufficientPID" Apr 20 14:55:17.108832 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:17.108811 2570 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-163.ec2.internal\" not found" node="ip-10-0-133-163.ec2.internal" Apr 20 14:55:17.111896 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:17.111876 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-163.ec2.internal\" not found" Apr 20 14:55:17.113044 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:17.113030 2570 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-163.ec2.internal\" not found" node="ip-10-0-133-163.ec2.internal" Apr 20 14:55:17.207335 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:17.207289 2570 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/7b77433e362f2114f13c38a959650d25-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-163.ec2.internal\" (UID: \"7b77433e362f2114f13c38a959650d25\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-163.ec2.internal" Apr 20 14:55:17.207335 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:17.207340 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7b77433e362f2114f13c38a959650d25-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-163.ec2.internal\" (UID: \"7b77433e362f2114f13c38a959650d25\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-163.ec2.internal" Apr 20 14:55:17.207498 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:17.207405 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/8147dca2f1846ffe58ac40c8a9cdfc0b-config\") pod \"kube-apiserver-proxy-ip-10-0-133-163.ec2.internal\" (UID: \"8147dca2f1846ffe58ac40c8a9cdfc0b\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-163.ec2.internal" Apr 20 14:55:17.212379 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:17.212367 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-163.ec2.internal\" not found" Apr 20 14:55:17.308040 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:17.308008 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/7b77433e362f2114f13c38a959650d25-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-163.ec2.internal\" (UID: \"7b77433e362f2114f13c38a959650d25\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-163.ec2.internal" Apr 20 14:55:17.308136 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:17.308044 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7b77433e362f2114f13c38a959650d25-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-163.ec2.internal\" (UID: \"7b77433e362f2114f13c38a959650d25\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-163.ec2.internal" Apr 20 14:55:17.308136 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:17.308088 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/8147dca2f1846ffe58ac40c8a9cdfc0b-config\") pod \"kube-apiserver-proxy-ip-10-0-133-163.ec2.internal\" (UID: \"8147dca2f1846ffe58ac40c8a9cdfc0b\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-163.ec2.internal" Apr 20 14:55:17.308205 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:17.308138 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7b77433e362f2114f13c38a959650d25-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-163.ec2.internal\" (UID: \"7b77433e362f2114f13c38a959650d25\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-163.ec2.internal" Apr 20 14:55:17.308205 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:17.308155 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/host-path/8147dca2f1846ffe58ac40c8a9cdfc0b-config\") pod \"kube-apiserver-proxy-ip-10-0-133-163.ec2.internal\" (UID: \"8147dca2f1846ffe58ac40c8a9cdfc0b\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-163.ec2.internal" Apr 20 14:55:17.308205 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:17.308191 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/7b77433e362f2114f13c38a959650d25-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-163.ec2.internal\" (UID: \"7b77433e362f2114f13c38a959650d25\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-163.ec2.internal" Apr 20 14:55:17.313095 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:17.313075 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-163.ec2.internal\" not found" Apr 20 14:55:17.410593 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:17.410562 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-163.ec2.internal" Apr 20 14:55:17.413158 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:17.413137 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-163.ec2.internal\" not found" Apr 20 14:55:17.416342 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:17.416324 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-163.ec2.internal" Apr 20 14:55:17.513609 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:17.513571 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-163.ec2.internal\" not found" Apr 20 14:55:17.614114 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:17.614049 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-163.ec2.internal\" not found" Apr 20 14:55:17.713637 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:17.713604 2570 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 20 14:55:17.714192 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:17.713789 2570 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 20 14:55:17.714192 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:17.713789 2570 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 20 14:55:17.714701 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:17.714679 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-163.ec2.internal\" not found" Apr 20 14:55:17.800483 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:17.800423 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-19 14:50:16 +0000 UTC" deadline="2027-09-18 09:47:27.003366227 +0000 UTC" Apr 20 14:55:17.800483 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:17.800479 2570 
certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12378h52m9.202892201s" Apr 20 14:55:17.804517 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:17.804494 2570 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 20 14:55:17.814939 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:17.814912 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-163.ec2.internal\" not found" Apr 20 14:55:17.826840 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:17.826818 2570 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 20 14:55:17.846982 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:17.846962 2570 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-mpdph" Apr 20 14:55:17.855035 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:17.855014 2570 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-mpdph" Apr 20 14:55:17.915940 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:17.915912 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-163.ec2.internal\" not found" Apr 20 14:55:17.984688 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:17.984652 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8147dca2f1846ffe58ac40c8a9cdfc0b.slice/crio-2ba41b82a30825d00f32ecdcca0ebb1890d390fe48a89d106322a9cd7881a5af WatchSource:0}: Error finding container 2ba41b82a30825d00f32ecdcca0ebb1890d390fe48a89d106322a9cd7881a5af: Status 404 returned error can't find the container with id 2ba41b82a30825d00f32ecdcca0ebb1890d390fe48a89d106322a9cd7881a5af Apr 20 14:55:17.985188 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:17.985164 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b77433e362f2114f13c38a959650d25.slice/crio-6453dd9ee699eaaf8dc0fa0fbb905274a2c53c21ca81e3968d6b87829f6f3db0 WatchSource:0}: Error finding container 6453dd9ee699eaaf8dc0fa0fbb905274a2c53c21ca81e3968d6b87829f6f3db0: Status 404 returned error can't find the container with id 6453dd9ee699eaaf8dc0fa0fbb905274a2c53c21ca81e3968d6b87829f6f3db0 Apr 20 14:55:17.990524 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:17.990508 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 14:55:18.016521 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:18.016503 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-163.ec2.internal\" not found" Apr 20 14:55:18.068364 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.068348 2570 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 14:55:18.106345 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.106320 2570 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-163.ec2.internal" Apr 20 14:55:18.118071 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.118054 2570 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS 
label is recommended: [must not contain dots]" Apr 20 14:55:18.119911 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.119897 2570 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-163.ec2.internal" Apr 20 14:55:18.128753 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.128716 2570 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 20 14:55:18.324267 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.324238 2570 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 14:55:18.783285 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.783261 2570 apiserver.go:52] "Watching apiserver" Apr 20 14:55:18.790944 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.790923 2570 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 20 14:55:18.792146 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.792126 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-zb7gn","openshift-image-registry/node-ca-h82ph","openshift-multus/multus-chk28","openshift-ovn-kubernetes/ovnkube-node-g9x87","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-df9th","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-163.ec2.internal","openshift-multus/multus-additional-cni-plugins-lstvb","openshift-multus/network-metrics-daemon-sjhzf","openshift-network-diagnostics/network-check-target-d4wt8","openshift-network-operator/iptables-alerter-s775f","kube-system/konnectivity-agent-9vtqc","kube-system/kube-apiserver-proxy-ip-10-0-133-163.ec2.internal","openshift-cluster-node-tuning-operator/tuned-s6x9l"] Apr 20 14:55:18.793433 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.793411 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-s775f" Apr 20 14:55:18.794560 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.794533 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-9vtqc" Apr 20 14:55:18.795655 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.795638 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-g9x87" Apr 20 14:55:18.796074 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.796055 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 20 14:55:18.796430 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.796297 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 20 14:55:18.796430 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.796333 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-6zlnz\"" Apr 20 14:55:18.796430 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.796380 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 20 14:55:18.796681 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.796659 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 20 14:55:18.796997 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.796980 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-lb9tw\"" Apr 20 14:55:18.797096 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.796982 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 20 14:55:18.800767 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.798066 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-whf9x\"" Apr 20 14:55:18.800767 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.798230 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 20 14:55:18.800767 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.798398 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 20 14:55:18.800767 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.798652 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 20 14:55:18.800767 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.798678 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 20 14:55:18.800767 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.799453 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 20 14:55:18.800767 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.799744 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 20 14:55:18.800767 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.800088 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-zb7gn" Apr 20 14:55:18.800767 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.800197 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-df9th" Apr 20 14:55:18.801802 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.801780 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-lstvb" Apr 20 14:55:18.802267 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.802249 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 20 14:55:18.802401 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.802265 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 20 14:55:18.802747 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.802583 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 20 14:55:18.802747 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.802591 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-jc2sv\"" Apr 20 14:55:18.802747 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.802651 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-mx55k\"" Apr 20 14:55:18.802747 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.802598 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 20 14:55:18.802917 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.802873 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 20 14:55:18.803129 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.803108 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sjhzf" Apr 20 14:55:18.803228 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:18.803197 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sjhzf" podUID="6ae6c334-21b5-4f64-b2c3-68f797cd363b" Apr 20 14:55:18.804024 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.804008 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 20 14:55:18.804276 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.804259 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d4wt8" Apr 20 14:55:18.804373 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:18.804347 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-d4wt8" podUID="dbe6bf00-4b0b-4432-80f4-1085e83c9110" Apr 20 14:55:18.804975 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.804769 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 20 14:55:18.804975 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.804790 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 20 14:55:18.804975 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.804843 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 20 14:55:18.804975 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.804869 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-bwj5r\"" Apr 20 14:55:18.805174 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.805088 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 20 14:55:18.808382 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.808363 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-chk28" Apr 20 14:55:18.808502 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.808477 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-h82ph" Apr 20 14:55:18.809608 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.809588 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-s6x9l" Apr 20 14:55:18.812524 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.810512 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-4fsdr\"" Apr 20 14:55:18.812524 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.810804 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 20 14:55:18.812524 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.810885 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-mgzjq\"" Apr 20 14:55:18.812524 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.810919 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 20 14:55:18.812524 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.810888 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 20 14:55:18.812524 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.810807 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 20 14:55:18.812524 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.812352 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 20 14:55:18.812524 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.812355 2570 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-ljhlf\"" Apr 20 14:55:18.813014 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.812998 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 20 14:55:18.815971 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.815951 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3661ad3f-53ca-47ec-8a9b-15e3d3f054bd-serviceca\") pod \"node-ca-h82ph\" (UID: \"3661ad3f-53ca-47ec-8a9b-15e3d3f054bd\") " pod="openshift-image-registry/node-ca-h82ph" Apr 20 14:55:18.816065 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.815990 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f6944a1f-03f8-4115-899e-e5c61d0d6075-host-run-ovn-kubernetes\") pod \"ovnkube-node-g9x87\" (UID: \"f6944a1f-03f8-4115-899e-e5c61d0d6075\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9x87" Apr 20 14:55:18.816065 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.816014 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f6944a1f-03f8-4115-899e-e5c61d0d6075-systemd-units\") pod \"ovnkube-node-g9x87\" (UID: \"f6944a1f-03f8-4115-899e-e5c61d0d6075\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9x87" Apr 20 14:55:18.816065 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.816038 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/2db4df8a-cdb6-4503-9793-bc14f5983e3e-hostroot\") pod \"multus-chk28\" (UID: \"2db4df8a-cdb6-4503-9793-bc14f5983e3e\") " pod="openshift-multus/multus-chk28" Apr 20 14:55:18.816224 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.816090 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f6944a1f-03f8-4115-899e-e5c61d0d6075-var-lib-openvswitch\") pod \"ovnkube-node-g9x87\" (UID: \"f6944a1f-03f8-4115-899e-e5c61d0d6075\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9x87" Apr 20 14:55:18.816224 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.816115 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f6944a1f-03f8-4115-899e-e5c61d0d6075-run-openvswitch\") pod \"ovnkube-node-g9x87\" (UID: \"f6944a1f-03f8-4115-899e-e5c61d0d6075\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9x87" Apr 20 14:55:18.816224 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.816142 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2db4df8a-cdb6-4503-9793-bc14f5983e3e-system-cni-dir\") pod \"multus-chk28\" (UID: \"2db4df8a-cdb6-4503-9793-bc14f5983e3e\") " pod="openshift-multus/multus-chk28" Apr 20 14:55:18.816224 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.816167 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f6944a1f-03f8-4115-899e-e5c61d0d6075-host-cni-netd\") pod \"ovnkube-node-g9x87\" (UID: 
\"f6944a1f-03f8-4115-899e-e5c61d0d6075\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9x87" Apr 20 14:55:18.816224 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.816188 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f6c014f8-befe-4916-a8ed-bc592d3baacf-hosts-file\") pod \"node-resolver-zb7gn\" (UID: \"f6c014f8-befe-4916-a8ed-bc592d3baacf\") " pod="openshift-dns/node-resolver-zb7gn" Apr 20 14:55:18.816490 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.816241 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjj2x\" (UniqueName: \"kubernetes.io/projected/4aa972e0-3242-4b0c-87e7-b4ebc421bbce-kube-api-access-vjj2x\") pod \"multus-additional-cni-plugins-lstvb\" (UID: \"4aa972e0-3242-4b0c-87e7-b4ebc421bbce\") " pod="openshift-multus/multus-additional-cni-plugins-lstvb" Apr 20 14:55:18.816490 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.816286 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4aa972e0-3242-4b0c-87e7-b4ebc421bbce-system-cni-dir\") pod \"multus-additional-cni-plugins-lstvb\" (UID: \"4aa972e0-3242-4b0c-87e7-b4ebc421bbce\") " pod="openshift-multus/multus-additional-cni-plugins-lstvb" Apr 20 14:55:18.816490 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.816329 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2db4df8a-cdb6-4503-9793-bc14f5983e3e-cni-binary-copy\") pod \"multus-chk28\" (UID: \"2db4df8a-cdb6-4503-9793-bc14f5983e3e\") " pod="openshift-multus/multus-chk28" Apr 20 14:55:18.816490 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.816357 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2db4df8a-cdb6-4503-9793-bc14f5983e3e-etc-kubernetes\") pod \"multus-chk28\" (UID: \"2db4df8a-cdb6-4503-9793-bc14f5983e3e\") " pod="openshift-multus/multus-chk28" Apr 20 14:55:18.816490 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.816381 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f6944a1f-03f8-4115-899e-e5c61d0d6075-node-log\") pod \"ovnkube-node-g9x87\" (UID: \"f6944a1f-03f8-4115-899e-e5c61d0d6075\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9x87" Apr 20 14:55:18.816490 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.816405 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/d0ae6812-a112-4eba-84ef-a2eeea69630a-etc-selinux\") pod \"aws-ebs-csi-driver-node-df9th\" (UID: \"d0ae6812-a112-4eba-84ef-a2eeea69630a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-df9th" Apr 20 14:55:18.816490 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.816443 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/4aa972e0-3242-4b0c-87e7-b4ebc421bbce-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-lstvb\" (UID: \"4aa972e0-3242-4b0c-87e7-b4ebc421bbce\") " pod="openshift-multus/multus-additional-cni-plugins-lstvb" Apr 20 
14:55:18.816490 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.816469 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/50d12be2-afb1-4257-895a-8f2eed4865c3-agent-certs\") pod \"konnectivity-agent-9vtqc\" (UID: \"50d12be2-afb1-4257-895a-8f2eed4865c3\") " pod="kube-system/konnectivity-agent-9vtqc" Apr 20 14:55:18.816490 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.816492 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f6944a1f-03f8-4115-899e-e5c61d0d6075-host-run-netns\") pod \"ovnkube-node-g9x87\" (UID: \"f6944a1f-03f8-4115-899e-e5c61d0d6075\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9x87" Apr 20 14:55:18.816913 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.816536 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f6944a1f-03f8-4115-899e-e5c61d0d6075-run-ovn\") pod \"ovnkube-node-g9x87\" (UID: \"f6944a1f-03f8-4115-899e-e5c61d0d6075\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9x87" Apr 20 14:55:18.816913 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.816563 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f6944a1f-03f8-4115-899e-e5c61d0d6075-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-g9x87\" (UID: \"f6944a1f-03f8-4115-899e-e5c61d0d6075\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9x87" Apr 20 14:55:18.816913 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.816605 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2vf2\" (UniqueName: \"kubernetes.io/projected/f6944a1f-03f8-4115-899e-e5c61d0d6075-kube-api-access-v2vf2\") pod \"ovnkube-node-g9x87\" (UID: \"f6944a1f-03f8-4115-899e-e5c61d0d6075\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9x87" Apr 20 14:55:18.816913 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.816640 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4aa972e0-3242-4b0c-87e7-b4ebc421bbce-tuning-conf-dir\") pod \"multus-additional-cni-plugins-lstvb\" (UID: \"4aa972e0-3242-4b0c-87e7-b4ebc421bbce\") " pod="openshift-multus/multus-additional-cni-plugins-lstvb" Apr 20 14:55:18.816913 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.816661 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f6944a1f-03f8-4115-899e-e5c61d0d6075-log-socket\") pod \"ovnkube-node-g9x87\" (UID: \"f6944a1f-03f8-4115-899e-e5c61d0d6075\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9x87" Apr 20 14:55:18.816913 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.816678 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9jtk\" (UniqueName: \"kubernetes.io/projected/6ae6c334-21b5-4f64-b2c3-68f797cd363b-kube-api-access-r9jtk\") pod \"network-metrics-daemon-sjhzf\" (UID: \"6ae6c334-21b5-4f64-b2c3-68f797cd363b\") " pod="openshift-multus/network-metrics-daemon-sjhzf" Apr 20 14:55:18.816913 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.816698 2570 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d0ae6812-a112-4eba-84ef-a2eeea69630a-kubelet-dir\") pod \"aws-ebs-csi-driver-node-df9th\" (UID: \"d0ae6812-a112-4eba-84ef-a2eeea69630a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-df9th" Apr 20 14:55:18.816913 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.816717 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d0ae6812-a112-4eba-84ef-a2eeea69630a-registration-dir\") pod \"aws-ebs-csi-driver-node-df9th\" (UID: \"d0ae6812-a112-4eba-84ef-a2eeea69630a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-df9th" Apr 20 14:55:18.816913 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.816731 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2db4df8a-cdb6-4503-9793-bc14f5983e3e-host-run-netns\") pod \"multus-chk28\" (UID: \"2db4df8a-cdb6-4503-9793-bc14f5983e3e\") " pod="openshift-multus/multus-chk28" Apr 20 14:55:18.816913 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.816752 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2db4df8a-cdb6-4503-9793-bc14f5983e3e-multus-conf-dir\") pod \"multus-chk28\" (UID: \"2db4df8a-cdb6-4503-9793-bc14f5983e3e\") " pod="openshift-multus/multus-chk28" Apr 20 14:55:18.816913 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.816766 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f6944a1f-03f8-4115-899e-e5c61d0d6075-host-kubelet\") pod \"ovnkube-node-g9x87\" (UID: \"f6944a1f-03f8-4115-899e-e5c61d0d6075\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9x87" Apr 20 14:55:18.816913 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.816854 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f6c014f8-befe-4916-a8ed-bc592d3baacf-tmp-dir\") pod \"node-resolver-zb7gn\" (UID: \"f6c014f8-befe-4916-a8ed-bc592d3baacf\") " pod="openshift-dns/node-resolver-zb7gn" Apr 20 14:55:18.816913 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.816886 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8w6mm\" (UniqueName: \"kubernetes.io/projected/f6c014f8-befe-4916-a8ed-bc592d3baacf-kube-api-access-8w6mm\") pod \"node-resolver-zb7gn\" (UID: \"f6c014f8-befe-4916-a8ed-bc592d3baacf\") " pod="openshift-dns/node-resolver-zb7gn" Apr 20 14:55:18.816913 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.816915 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6ae6c334-21b5-4f64-b2c3-68f797cd363b-metrics-certs\") pod \"network-metrics-daemon-sjhzf\" (UID: \"6ae6c334-21b5-4f64-b2c3-68f797cd363b\") " pod="openshift-multus/network-metrics-daemon-sjhzf" Apr 20 14:55:18.817441 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.816939 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: 
\"kubernetes.io/configmap/db4252bf-5e13-4727-a83a-7f87874cf5c4-iptables-alerter-script\") pod \"iptables-alerter-s775f\" (UID: \"db4252bf-5e13-4727-a83a-7f87874cf5c4\") " pod="openshift-network-operator/iptables-alerter-s775f" Apr 20 14:55:18.817441 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.816963 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2db4df8a-cdb6-4503-9793-bc14f5983e3e-os-release\") pod \"multus-chk28\" (UID: \"2db4df8a-cdb6-4503-9793-bc14f5983e3e\") " pod="openshift-multus/multus-chk28" Apr 20 14:55:18.817441 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.816995 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8ncq\" (UniqueName: \"kubernetes.io/projected/d0ae6812-a112-4eba-84ef-a2eeea69630a-kube-api-access-k8ncq\") pod \"aws-ebs-csi-driver-node-df9th\" (UID: \"d0ae6812-a112-4eba-84ef-a2eeea69630a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-df9th" Apr 20 14:55:18.817441 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.817022 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4aa972e0-3242-4b0c-87e7-b4ebc421bbce-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-lstvb\" (UID: \"4aa972e0-3242-4b0c-87e7-b4ebc421bbce\") " pod="openshift-multus/multus-additional-cni-plugins-lstvb" Apr 20 14:55:18.817441 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.817045 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/2db4df8a-cdb6-4503-9793-bc14f5983e3e-host-run-k8s-cni-cncf-io\") pod \"multus-chk28\" (UID: \"2db4df8a-cdb6-4503-9793-bc14f5983e3e\") " pod="openshift-multus/multus-chk28" Apr 20 14:55:18.817441 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.817111 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f6944a1f-03f8-4115-899e-e5c61d0d6075-etc-openvswitch\") pod \"ovnkube-node-g9x87\" (UID: \"f6944a1f-03f8-4115-899e-e5c61d0d6075\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9x87" Apr 20 14:55:18.817441 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.817134 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/d0ae6812-a112-4eba-84ef-a2eeea69630a-device-dir\") pod \"aws-ebs-csi-driver-node-df9th\" (UID: \"d0ae6812-a112-4eba-84ef-a2eeea69630a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-df9th" Apr 20 14:55:18.817441 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.817156 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/d0ae6812-a112-4eba-84ef-a2eeea69630a-sys-fs\") pod \"aws-ebs-csi-driver-node-df9th\" (UID: \"d0ae6812-a112-4eba-84ef-a2eeea69630a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-df9th" Apr 20 14:55:18.817441 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.817181 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/4aa972e0-3242-4b0c-87e7-b4ebc421bbce-cni-binary-copy\") pod \"multus-additional-cni-plugins-lstvb\" (UID: \"4aa972e0-3242-4b0c-87e7-b4ebc421bbce\") " pod="openshift-multus/multus-additional-cni-plugins-lstvb" Apr 20 14:55:18.817441 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.817203 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/2db4df8a-cdb6-4503-9793-bc14f5983e3e-multus-socket-dir-parent\") pod \"multus-chk28\" (UID: \"2db4df8a-cdb6-4503-9793-bc14f5983e3e\") " pod="openshift-multus/multus-chk28" Apr 20 14:55:18.817441 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.817226 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2db4df8a-cdb6-4503-9793-bc14f5983e3e-host-var-lib-cni-bin\") pod \"multus-chk28\" (UID: \"2db4df8a-cdb6-4503-9793-bc14f5983e3e\") " pod="openshift-multus/multus-chk28" Apr 20 14:55:18.817441 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.817269 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2db4df8a-cdb6-4503-9793-bc14f5983e3e-host-var-lib-kubelet\") pod \"multus-chk28\" (UID: \"2db4df8a-cdb6-4503-9793-bc14f5983e3e\") " pod="openshift-multus/multus-chk28" Apr 20 14:55:18.817441 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.817342 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f6944a1f-03f8-4115-899e-e5c61d0d6075-host-cni-bin\") pod \"ovnkube-node-g9x87\" (UID: \"f6944a1f-03f8-4115-899e-e5c61d0d6075\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9x87" Apr 20 14:55:18.817441 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.817373 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/d0ae6812-a112-4eba-84ef-a2eeea69630a-socket-dir\") pod \"aws-ebs-csi-driver-node-df9th\" (UID: \"d0ae6812-a112-4eba-84ef-a2eeea69630a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-df9th" Apr 20 14:55:18.817441 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.817397 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4aa972e0-3242-4b0c-87e7-b4ebc421bbce-cnibin\") pod \"multus-additional-cni-plugins-lstvb\" (UID: \"4aa972e0-3242-4b0c-87e7-b4ebc421bbce\") " pod="openshift-multus/multus-additional-cni-plugins-lstvb" Apr 20 14:55:18.817441 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.817434 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2db4df8a-cdb6-4503-9793-bc14f5983e3e-cnibin\") pod \"multus-chk28\" (UID: \"2db4df8a-cdb6-4503-9793-bc14f5983e3e\") " pod="openshift-multus/multus-chk28" Apr 20 14:55:18.818122 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.817468 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jstsz\" (UniqueName: \"kubernetes.io/projected/3661ad3f-53ca-47ec-8a9b-15e3d3f054bd-kube-api-access-jstsz\") pod \"node-ca-h82ph\" (UID: \"3661ad3f-53ca-47ec-8a9b-15e3d3f054bd\") " 
pod="openshift-image-registry/node-ca-h82ph" Apr 20 14:55:18.818122 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.817493 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f6944a1f-03f8-4115-899e-e5c61d0d6075-host-slash\") pod \"ovnkube-node-g9x87\" (UID: \"f6944a1f-03f8-4115-899e-e5c61d0d6075\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9x87" Apr 20 14:55:18.818122 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.817515 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f6944a1f-03f8-4115-899e-e5c61d0d6075-ovnkube-config\") pod \"ovnkube-node-g9x87\" (UID: \"f6944a1f-03f8-4115-899e-e5c61d0d6075\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9x87" Apr 20 14:55:18.818122 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.817538 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f6944a1f-03f8-4115-899e-e5c61d0d6075-run-systemd\") pod \"ovnkube-node-g9x87\" (UID: \"f6944a1f-03f8-4115-899e-e5c61d0d6075\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9x87" Apr 20 14:55:18.818122 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.817560 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9wdm\" (UniqueName: \"kubernetes.io/projected/dbe6bf00-4b0b-4432-80f4-1085e83c9110-kube-api-access-g9wdm\") pod \"network-check-target-d4wt8\" (UID: \"dbe6bf00-4b0b-4432-80f4-1085e83c9110\") " pod="openshift-network-diagnostics/network-check-target-d4wt8" Apr 20 14:55:18.818122 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.817584 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/db4252bf-5e13-4727-a83a-7f87874cf5c4-host-slash\") pod \"iptables-alerter-s775f\" (UID: \"db4252bf-5e13-4727-a83a-7f87874cf5c4\") " pod="openshift-network-operator/iptables-alerter-s775f" Apr 20 14:55:18.818122 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.817624 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bxzf\" (UniqueName: \"kubernetes.io/projected/db4252bf-5e13-4727-a83a-7f87874cf5c4-kube-api-access-2bxzf\") pod \"iptables-alerter-s775f\" (UID: \"db4252bf-5e13-4727-a83a-7f87874cf5c4\") " pod="openshift-network-operator/iptables-alerter-s775f" Apr 20 14:55:18.818122 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.817661 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4aa972e0-3242-4b0c-87e7-b4ebc421bbce-os-release\") pod \"multus-additional-cni-plugins-lstvb\" (UID: \"4aa972e0-3242-4b0c-87e7-b4ebc421bbce\") " pod="openshift-multus/multus-additional-cni-plugins-lstvb" Apr 20 14:55:18.818122 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.817685 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3661ad3f-53ca-47ec-8a9b-15e3d3f054bd-host\") pod \"node-ca-h82ph\" (UID: \"3661ad3f-53ca-47ec-8a9b-15e3d3f054bd\") " pod="openshift-image-registry/node-ca-h82ph" Apr 20 14:55:18.818122 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.817704 2570 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/50d12be2-afb1-4257-895a-8f2eed4865c3-konnectivity-ca\") pod \"konnectivity-agent-9vtqc\" (UID: \"50d12be2-afb1-4257-895a-8f2eed4865c3\") " pod="kube-system/konnectivity-agent-9vtqc" Apr 20 14:55:18.818122 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.817724 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f6944a1f-03f8-4115-899e-e5c61d0d6075-env-overrides\") pod \"ovnkube-node-g9x87\" (UID: \"f6944a1f-03f8-4115-899e-e5c61d0d6075\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9x87" Apr 20 14:55:18.818122 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.817741 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f6944a1f-03f8-4115-899e-e5c61d0d6075-ovnkube-script-lib\") pod \"ovnkube-node-g9x87\" (UID: \"f6944a1f-03f8-4115-899e-e5c61d0d6075\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9x87" Apr 20 14:55:18.818122 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.817768 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/2db4df8a-cdb6-4503-9793-bc14f5983e3e-multus-daemon-config\") pod \"multus-chk28\" (UID: \"2db4df8a-cdb6-4503-9793-bc14f5983e3e\") " pod="openshift-multus/multus-chk28" Apr 20 14:55:18.818122 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.817790 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f6944a1f-03f8-4115-899e-e5c61d0d6075-ovn-node-metrics-cert\") pod \"ovnkube-node-g9x87\" (UID: \"f6944a1f-03f8-4115-899e-e5c61d0d6075\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9x87" Apr 20 14:55:18.818122 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.817825 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2db4df8a-cdb6-4503-9793-bc14f5983e3e-multus-cni-dir\") pod \"multus-chk28\" (UID: \"2db4df8a-cdb6-4503-9793-bc14f5983e3e\") " pod="openshift-multus/multus-chk28" Apr 20 14:55:18.818122 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.817866 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/2db4df8a-cdb6-4503-9793-bc14f5983e3e-host-var-lib-cni-multus\") pod \"multus-chk28\" (UID: \"2db4df8a-cdb6-4503-9793-bc14f5983e3e\") " pod="openshift-multus/multus-chk28" Apr 20 14:55:18.818704 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.817892 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/2db4df8a-cdb6-4503-9793-bc14f5983e3e-host-run-multus-certs\") pod \"multus-chk28\" (UID: \"2db4df8a-cdb6-4503-9793-bc14f5983e3e\") " pod="openshift-multus/multus-chk28" Apr 20 14:55:18.818704 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.817928 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ps4bl\" (UniqueName: 
\"kubernetes.io/projected/2db4df8a-cdb6-4503-9793-bc14f5983e3e-kube-api-access-ps4bl\") pod \"multus-chk28\" (UID: \"2db4df8a-cdb6-4503-9793-bc14f5983e3e\") " pod="openshift-multus/multus-chk28" Apr 20 14:55:18.856798 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.856767 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 14:50:17 +0000 UTC" deadline="2027-10-19 03:33:23.070632788 +0000 UTC" Apr 20 14:55:18.856877 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.856799 2570 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13116h38m4.213837172s" Apr 20 14:55:18.907054 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.907026 2570 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 20 14:55:18.918261 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.918237 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2db4df8a-cdb6-4503-9793-bc14f5983e3e-multus-conf-dir\") pod \"multus-chk28\" (UID: \"2db4df8a-cdb6-4503-9793-bc14f5983e3e\") " pod="openshift-multus/multus-chk28" Apr 20 14:55:18.918375 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.918265 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f6944a1f-03f8-4115-899e-e5c61d0d6075-host-kubelet\") pod \"ovnkube-node-g9x87\" (UID: \"f6944a1f-03f8-4115-899e-e5c61d0d6075\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9x87" Apr 20 14:55:18.918375 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.918281 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f6c014f8-befe-4916-a8ed-bc592d3baacf-tmp-dir\") pod \"node-resolver-zb7gn\" (UID: \"f6c014f8-befe-4916-a8ed-bc592d3baacf\") " pod="openshift-dns/node-resolver-zb7gn" Apr 20 14:55:18.918375 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.918297 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8w6mm\" (UniqueName: \"kubernetes.io/projected/f6c014f8-befe-4916-a8ed-bc592d3baacf-kube-api-access-8w6mm\") pod \"node-resolver-zb7gn\" (UID: \"f6c014f8-befe-4916-a8ed-bc592d3baacf\") " pod="openshift-dns/node-resolver-zb7gn" Apr 20 14:55:18.918375 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.918367 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f6944a1f-03f8-4115-899e-e5c61d0d6075-host-kubelet\") pod \"ovnkube-node-g9x87\" (UID: \"f6944a1f-03f8-4115-899e-e5c61d0d6075\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9x87" Apr 20 14:55:18.918514 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.918381 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2db4df8a-cdb6-4503-9793-bc14f5983e3e-multus-conf-dir\") pod \"multus-chk28\" (UID: \"2db4df8a-cdb6-4503-9793-bc14f5983e3e\") " pod="openshift-multus/multus-chk28" Apr 20 14:55:18.918514 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.918439 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6ae6c334-21b5-4f64-b2c3-68f797cd363b-metrics-certs\") pod \"network-metrics-daemon-sjhzf\" (UID: 
\"6ae6c334-21b5-4f64-b2c3-68f797cd363b\") " pod="openshift-multus/network-metrics-daemon-sjhzf" Apr 20 14:55:18.918514 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.918470 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/db4252bf-5e13-4727-a83a-7f87874cf5c4-iptables-alerter-script\") pod \"iptables-alerter-s775f\" (UID: \"db4252bf-5e13-4727-a83a-7f87874cf5c4\") " pod="openshift-network-operator/iptables-alerter-s775f" Apr 20 14:55:18.918514 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.918493 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2db4df8a-cdb6-4503-9793-bc14f5983e3e-os-release\") pod \"multus-chk28\" (UID: \"2db4df8a-cdb6-4503-9793-bc14f5983e3e\") " pod="openshift-multus/multus-chk28" Apr 20 14:55:18.918679 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.918524 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/48771f49-ba8f-4d89-b489-7d4f8dbd6d3b-etc-sysctl-conf\") pod \"tuned-s6x9l\" (UID: \"48771f49-ba8f-4d89-b489-7d4f8dbd6d3b\") " pod="openshift-cluster-node-tuning-operator/tuned-s6x9l" Apr 20 14:55:18.918679 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.918548 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/48771f49-ba8f-4d89-b489-7d4f8dbd6d3b-run\") pod \"tuned-s6x9l\" (UID: \"48771f49-ba8f-4d89-b489-7d4f8dbd6d3b\") " pod="openshift-cluster-node-tuning-operator/tuned-s6x9l" Apr 20 14:55:18.918679 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.918590 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k8ncq\" (UniqueName: \"kubernetes.io/projected/d0ae6812-a112-4eba-84ef-a2eeea69630a-kube-api-access-k8ncq\") pod \"aws-ebs-csi-driver-node-df9th\" (UID: \"d0ae6812-a112-4eba-84ef-a2eeea69630a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-df9th" Apr 20 14:55:18.918679 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.918625 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f6c014f8-befe-4916-a8ed-bc592d3baacf-tmp-dir\") pod \"node-resolver-zb7gn\" (UID: \"f6c014f8-befe-4916-a8ed-bc592d3baacf\") " pod="openshift-dns/node-resolver-zb7gn" Apr 20 14:55:18.918679 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.918645 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2db4df8a-cdb6-4503-9793-bc14f5983e3e-os-release\") pod \"multus-chk28\" (UID: \"2db4df8a-cdb6-4503-9793-bc14f5983e3e\") " pod="openshift-multus/multus-chk28" Apr 20 14:55:18.918894 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:18.918655 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:55:18.918894 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.918696 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4aa972e0-3242-4b0c-87e7-b4ebc421bbce-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-lstvb\" (UID: \"4aa972e0-3242-4b0c-87e7-b4ebc421bbce\") " 
pod="openshift-multus/multus-additional-cni-plugins-lstvb" Apr 20 14:55:18.918894 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.918730 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/2db4df8a-cdb6-4503-9793-bc14f5983e3e-host-run-k8s-cni-cncf-io\") pod \"multus-chk28\" (UID: \"2db4df8a-cdb6-4503-9793-bc14f5983e3e\") " pod="openshift-multus/multus-chk28" Apr 20 14:55:18.918894 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:18.918767 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6ae6c334-21b5-4f64-b2c3-68f797cd363b-metrics-certs podName:6ae6c334-21b5-4f64-b2c3-68f797cd363b nodeName:}" failed. No retries permitted until 2026-04-20 14:55:19.418746914 +0000 UTC m=+3.074795197 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6ae6c334-21b5-4f64-b2c3-68f797cd363b-metrics-certs") pod "network-metrics-daemon-sjhzf" (UID: "6ae6c334-21b5-4f64-b2c3-68f797cd363b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:55:18.918894 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.918781 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/2db4df8a-cdb6-4503-9793-bc14f5983e3e-host-run-k8s-cni-cncf-io\") pod \"multus-chk28\" (UID: \"2db4df8a-cdb6-4503-9793-bc14f5983e3e\") " pod="openshift-multus/multus-chk28" Apr 20 14:55:18.918894 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.918788 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/48771f49-ba8f-4d89-b489-7d4f8dbd6d3b-etc-kubernetes\") pod \"tuned-s6x9l\" (UID: \"48771f49-ba8f-4d89-b489-7d4f8dbd6d3b\") " pod="openshift-cluster-node-tuning-operator/tuned-s6x9l" Apr 20 14:55:18.918894 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.918823 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f6944a1f-03f8-4115-899e-e5c61d0d6075-etc-openvswitch\") pod \"ovnkube-node-g9x87\" (UID: \"f6944a1f-03f8-4115-899e-e5c61d0d6075\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9x87" Apr 20 14:55:18.918894 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.918839 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/d0ae6812-a112-4eba-84ef-a2eeea69630a-device-dir\") pod \"aws-ebs-csi-driver-node-df9th\" (UID: \"d0ae6812-a112-4eba-84ef-a2eeea69630a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-df9th" Apr 20 14:55:18.918894 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.918854 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/d0ae6812-a112-4eba-84ef-a2eeea69630a-sys-fs\") pod \"aws-ebs-csi-driver-node-df9th\" (UID: \"d0ae6812-a112-4eba-84ef-a2eeea69630a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-df9th" Apr 20 14:55:18.918894 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.918870 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4aa972e0-3242-4b0c-87e7-b4ebc421bbce-cni-binary-copy\") pod \"multus-additional-cni-plugins-lstvb\" (UID: 
\"4aa972e0-3242-4b0c-87e7-b4ebc421bbce\") " pod="openshift-multus/multus-additional-cni-plugins-lstvb" Apr 20 14:55:18.918894 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.918886 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/2db4df8a-cdb6-4503-9793-bc14f5983e3e-multus-socket-dir-parent\") pod \"multus-chk28\" (UID: \"2db4df8a-cdb6-4503-9793-bc14f5983e3e\") " pod="openshift-multus/multus-chk28" Apr 20 14:55:18.919420 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.918907 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/d0ae6812-a112-4eba-84ef-a2eeea69630a-device-dir\") pod \"aws-ebs-csi-driver-node-df9th\" (UID: \"d0ae6812-a112-4eba-84ef-a2eeea69630a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-df9th" Apr 20 14:55:18.919420 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.918909 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2db4df8a-cdb6-4503-9793-bc14f5983e3e-host-var-lib-cni-bin\") pod \"multus-chk28\" (UID: \"2db4df8a-cdb6-4503-9793-bc14f5983e3e\") " pod="openshift-multus/multus-chk28" Apr 20 14:55:18.919420 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.918939 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2db4df8a-cdb6-4503-9793-bc14f5983e3e-host-var-lib-cni-bin\") pod \"multus-chk28\" (UID: \"2db4df8a-cdb6-4503-9793-bc14f5983e3e\") " pod="openshift-multus/multus-chk28" Apr 20 14:55:18.919420 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.918934 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/d0ae6812-a112-4eba-84ef-a2eeea69630a-sys-fs\") pod \"aws-ebs-csi-driver-node-df9th\" (UID: \"d0ae6812-a112-4eba-84ef-a2eeea69630a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-df9th" Apr 20 14:55:18.919420 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.918953 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2db4df8a-cdb6-4503-9793-bc14f5983e3e-host-var-lib-kubelet\") pod \"multus-chk28\" (UID: \"2db4df8a-cdb6-4503-9793-bc14f5983e3e\") " pod="openshift-multus/multus-chk28" Apr 20 14:55:18.919420 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.918971 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f6944a1f-03f8-4115-899e-e5c61d0d6075-etc-openvswitch\") pod \"ovnkube-node-g9x87\" (UID: \"f6944a1f-03f8-4115-899e-e5c61d0d6075\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9x87" Apr 20 14:55:18.919420 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.918982 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f6944a1f-03f8-4115-899e-e5c61d0d6075-host-cni-bin\") pod \"ovnkube-node-g9x87\" (UID: \"f6944a1f-03f8-4115-899e-e5c61d0d6075\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9x87" Apr 20 14:55:18.919420 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.919005 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: 
\"kubernetes.io/host-path/d0ae6812-a112-4eba-84ef-a2eeea69630a-socket-dir\") pod \"aws-ebs-csi-driver-node-df9th\" (UID: \"d0ae6812-a112-4eba-84ef-a2eeea69630a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-df9th" Apr 20 14:55:18.919420 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.919005 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/2db4df8a-cdb6-4503-9793-bc14f5983e3e-multus-socket-dir-parent\") pod \"multus-chk28\" (UID: \"2db4df8a-cdb6-4503-9793-bc14f5983e3e\") " pod="openshift-multus/multus-chk28" Apr 20 14:55:18.919420 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.919023 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f6944a1f-03f8-4115-899e-e5c61d0d6075-host-cni-bin\") pod \"ovnkube-node-g9x87\" (UID: \"f6944a1f-03f8-4115-899e-e5c61d0d6075\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9x87" Apr 20 14:55:18.919420 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.919026 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4aa972e0-3242-4b0c-87e7-b4ebc421bbce-cnibin\") pod \"multus-additional-cni-plugins-lstvb\" (UID: \"4aa972e0-3242-4b0c-87e7-b4ebc421bbce\") " pod="openshift-multus/multus-additional-cni-plugins-lstvb" Apr 20 14:55:18.919420 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.919007 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/2db4df8a-cdb6-4503-9793-bc14f5983e3e-host-var-lib-kubelet\") pod \"multus-chk28\" (UID: \"2db4df8a-cdb6-4503-9793-bc14f5983e3e\") " pod="openshift-multus/multus-chk28" Apr 20 14:55:18.919420 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.919049 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2db4df8a-cdb6-4503-9793-bc14f5983e3e-cnibin\") pod \"multus-chk28\" (UID: \"2db4df8a-cdb6-4503-9793-bc14f5983e3e\") " pod="openshift-multus/multus-chk28" Apr 20 14:55:18.919420 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.919075 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jstsz\" (UniqueName: \"kubernetes.io/projected/3661ad3f-53ca-47ec-8a9b-15e3d3f054bd-kube-api-access-jstsz\") pod \"node-ca-h82ph\" (UID: \"3661ad3f-53ca-47ec-8a9b-15e3d3f054bd\") " pod="openshift-image-registry/node-ca-h82ph" Apr 20 14:55:18.919420 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.919079 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/4aa972e0-3242-4b0c-87e7-b4ebc421bbce-cnibin\") pod \"multus-additional-cni-plugins-lstvb\" (UID: \"4aa972e0-3242-4b0c-87e7-b4ebc421bbce\") " pod="openshift-multus/multus-additional-cni-plugins-lstvb" Apr 20 14:55:18.919420 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.919099 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f6944a1f-03f8-4115-899e-e5c61d0d6075-host-slash\") pod \"ovnkube-node-g9x87\" (UID: \"f6944a1f-03f8-4115-899e-e5c61d0d6075\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9x87" Apr 20 14:55:18.919420 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.919112 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"socket-dir\" (UniqueName: \"kubernetes.io/host-path/d0ae6812-a112-4eba-84ef-a2eeea69630a-socket-dir\") pod \"aws-ebs-csi-driver-node-df9th\" (UID: \"d0ae6812-a112-4eba-84ef-a2eeea69630a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-df9th" Apr 20 14:55:18.919420 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.919121 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f6944a1f-03f8-4115-899e-e5c61d0d6075-ovnkube-config\") pod \"ovnkube-node-g9x87\" (UID: \"f6944a1f-03f8-4115-899e-e5c61d0d6075\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9x87" Apr 20 14:55:18.920198 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.919132 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2db4df8a-cdb6-4503-9793-bc14f5983e3e-cnibin\") pod \"multus-chk28\" (UID: \"2db4df8a-cdb6-4503-9793-bc14f5983e3e\") " pod="openshift-multus/multus-chk28" Apr 20 14:55:18.920198 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.919142 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/db4252bf-5e13-4727-a83a-7f87874cf5c4-iptables-alerter-script\") pod \"iptables-alerter-s775f\" (UID: \"db4252bf-5e13-4727-a83a-7f87874cf5c4\") " pod="openshift-network-operator/iptables-alerter-s775f" Apr 20 14:55:18.920198 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.919175 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f6944a1f-03f8-4115-899e-e5c61d0d6075-host-slash\") pod \"ovnkube-node-g9x87\" (UID: \"f6944a1f-03f8-4115-899e-e5c61d0d6075\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9x87" Apr 20 14:55:18.920198 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.919147 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/48771f49-ba8f-4d89-b489-7d4f8dbd6d3b-etc-modprobe-d\") pod \"tuned-s6x9l\" (UID: \"48771f49-ba8f-4d89-b489-7d4f8dbd6d3b\") " pod="openshift-cluster-node-tuning-operator/tuned-s6x9l" Apr 20 14:55:18.920198 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.919229 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f6944a1f-03f8-4115-899e-e5c61d0d6075-run-systemd\") pod \"ovnkube-node-g9x87\" (UID: \"f6944a1f-03f8-4115-899e-e5c61d0d6075\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9x87" Apr 20 14:55:18.920198 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.919277 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g9wdm\" (UniqueName: \"kubernetes.io/projected/dbe6bf00-4b0b-4432-80f4-1085e83c9110-kube-api-access-g9wdm\") pod \"network-check-target-d4wt8\" (UID: \"dbe6bf00-4b0b-4432-80f4-1085e83c9110\") " pod="openshift-network-diagnostics/network-check-target-d4wt8" Apr 20 14:55:18.920198 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.919290 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f6944a1f-03f8-4115-899e-e5c61d0d6075-run-systemd\") pod \"ovnkube-node-g9x87\" (UID: \"f6944a1f-03f8-4115-899e-e5c61d0d6075\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9x87" Apr 20 14:55:18.920198 ip-10-0-133-163 kubenswrapper[2570]: I0420 
14:55:18.919321 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/db4252bf-5e13-4727-a83a-7f87874cf5c4-host-slash\") pod \"iptables-alerter-s775f\" (UID: \"db4252bf-5e13-4727-a83a-7f87874cf5c4\") " pod="openshift-network-operator/iptables-alerter-s775f" Apr 20 14:55:18.920198 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.919346 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/4aa972e0-3242-4b0c-87e7-b4ebc421bbce-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-lstvb\" (UID: \"4aa972e0-3242-4b0c-87e7-b4ebc421bbce\") " pod="openshift-multus/multus-additional-cni-plugins-lstvb" Apr 20 14:55:18.920198 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.919356 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2bxzf\" (UniqueName: \"kubernetes.io/projected/db4252bf-5e13-4727-a83a-7f87874cf5c4-kube-api-access-2bxzf\") pod \"iptables-alerter-s775f\" (UID: \"db4252bf-5e13-4727-a83a-7f87874cf5c4\") " pod="openshift-network-operator/iptables-alerter-s775f" Apr 20 14:55:18.920198 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.919414 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4aa972e0-3242-4b0c-87e7-b4ebc421bbce-os-release\") pod \"multus-additional-cni-plugins-lstvb\" (UID: \"4aa972e0-3242-4b0c-87e7-b4ebc421bbce\") " pod="openshift-multus/multus-additional-cni-plugins-lstvb" Apr 20 14:55:18.920198 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.919441 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3661ad3f-53ca-47ec-8a9b-15e3d3f054bd-host\") pod \"node-ca-h82ph\" (UID: \"3661ad3f-53ca-47ec-8a9b-15e3d3f054bd\") " pod="openshift-image-registry/node-ca-h82ph" Apr 20 14:55:18.920198 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.919451 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4aa972e0-3242-4b0c-87e7-b4ebc421bbce-cni-binary-copy\") pod \"multus-additional-cni-plugins-lstvb\" (UID: \"4aa972e0-3242-4b0c-87e7-b4ebc421bbce\") " pod="openshift-multus/multus-additional-cni-plugins-lstvb" Apr 20 14:55:18.920198 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.919466 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/50d12be2-afb1-4257-895a-8f2eed4865c3-konnectivity-ca\") pod \"konnectivity-agent-9vtqc\" (UID: \"50d12be2-afb1-4257-895a-8f2eed4865c3\") " pod="kube-system/konnectivity-agent-9vtqc" Apr 20 14:55:18.920198 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.919358 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/db4252bf-5e13-4727-a83a-7f87874cf5c4-host-slash\") pod \"iptables-alerter-s775f\" (UID: \"db4252bf-5e13-4727-a83a-7f87874cf5c4\") " pod="openshift-network-operator/iptables-alerter-s775f" Apr 20 14:55:18.920198 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.919493 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f6944a1f-03f8-4115-899e-e5c61d0d6075-env-overrides\") pod \"ovnkube-node-g9x87\" (UID: 
\"f6944a1f-03f8-4115-899e-e5c61d0d6075\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9x87" Apr 20 14:55:18.920198 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.919510 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/4aa972e0-3242-4b0c-87e7-b4ebc421bbce-os-release\") pod \"multus-additional-cni-plugins-lstvb\" (UID: \"4aa972e0-3242-4b0c-87e7-b4ebc421bbce\") " pod="openshift-multus/multus-additional-cni-plugins-lstvb" Apr 20 14:55:18.920909 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.919518 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3661ad3f-53ca-47ec-8a9b-15e3d3f054bd-host\") pod \"node-ca-h82ph\" (UID: \"3661ad3f-53ca-47ec-8a9b-15e3d3f054bd\") " pod="openshift-image-registry/node-ca-h82ph" Apr 20 14:55:18.920909 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.919520 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f6944a1f-03f8-4115-899e-e5c61d0d6075-ovnkube-script-lib\") pod \"ovnkube-node-g9x87\" (UID: \"f6944a1f-03f8-4115-899e-e5c61d0d6075\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9x87" Apr 20 14:55:18.920909 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.919582 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/2db4df8a-cdb6-4503-9793-bc14f5983e3e-multus-daemon-config\") pod \"multus-chk28\" (UID: \"2db4df8a-cdb6-4503-9793-bc14f5983e3e\") " pod="openshift-multus/multus-chk28" Apr 20 14:55:18.920909 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.919638 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f6944a1f-03f8-4115-899e-e5c61d0d6075-ovn-node-metrics-cert\") pod \"ovnkube-node-g9x87\" (UID: \"f6944a1f-03f8-4115-899e-e5c61d0d6075\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9x87" Apr 20 14:55:18.920909 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.919668 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/48771f49-ba8f-4d89-b489-7d4f8dbd6d3b-etc-sysconfig\") pod \"tuned-s6x9l\" (UID: \"48771f49-ba8f-4d89-b489-7d4f8dbd6d3b\") " pod="openshift-cluster-node-tuning-operator/tuned-s6x9l" Apr 20 14:55:18.920909 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.919695 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f6944a1f-03f8-4115-899e-e5c61d0d6075-ovnkube-config\") pod \"ovnkube-node-g9x87\" (UID: \"f6944a1f-03f8-4115-899e-e5c61d0d6075\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9x87" Apr 20 14:55:18.920909 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.919690 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/48771f49-ba8f-4d89-b489-7d4f8dbd6d3b-lib-modules\") pod \"tuned-s6x9l\" (UID: \"48771f49-ba8f-4d89-b489-7d4f8dbd6d3b\") " pod="openshift-cluster-node-tuning-operator/tuned-s6x9l" Apr 20 14:55:18.920909 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.919751 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/2db4df8a-cdb6-4503-9793-bc14f5983e3e-multus-cni-dir\") pod \"multus-chk28\" (UID: \"2db4df8a-cdb6-4503-9793-bc14f5983e3e\") " pod="openshift-multus/multus-chk28" Apr 20 14:55:18.920909 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.919797 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/2db4df8a-cdb6-4503-9793-bc14f5983e3e-host-var-lib-cni-multus\") pod \"multus-chk28\" (UID: \"2db4df8a-cdb6-4503-9793-bc14f5983e3e\") " pod="openshift-multus/multus-chk28" Apr 20 14:55:18.920909 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.919832 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/2db4df8a-cdb6-4503-9793-bc14f5983e3e-host-var-lib-cni-multus\") pod \"multus-chk28\" (UID: \"2db4df8a-cdb6-4503-9793-bc14f5983e3e\") " pod="openshift-multus/multus-chk28" Apr 20 14:55:18.920909 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.919841 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/2db4df8a-cdb6-4503-9793-bc14f5983e3e-host-run-multus-certs\") pod \"multus-chk28\" (UID: \"2db4df8a-cdb6-4503-9793-bc14f5983e3e\") " pod="openshift-multus/multus-chk28" Apr 20 14:55:18.920909 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.919872 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ps4bl\" (UniqueName: \"kubernetes.io/projected/2db4df8a-cdb6-4503-9793-bc14f5983e3e-kube-api-access-ps4bl\") pod \"multus-chk28\" (UID: \"2db4df8a-cdb6-4503-9793-bc14f5983e3e\") " pod="openshift-multus/multus-chk28" Apr 20 14:55:18.920909 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.919896 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2db4df8a-cdb6-4503-9793-bc14f5983e3e-multus-cni-dir\") pod \"multus-chk28\" (UID: \"2db4df8a-cdb6-4503-9793-bc14f5983e3e\") " pod="openshift-multus/multus-chk28" Apr 20 14:55:18.920909 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.919901 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/48771f49-ba8f-4d89-b489-7d4f8dbd6d3b-sys\") pod \"tuned-s6x9l\" (UID: \"48771f49-ba8f-4d89-b489-7d4f8dbd6d3b\") " pod="openshift-cluster-node-tuning-operator/tuned-s6x9l" Apr 20 14:55:18.920909 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.919941 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/2db4df8a-cdb6-4503-9793-bc14f5983e3e-host-run-multus-certs\") pod \"multus-chk28\" (UID: \"2db4df8a-cdb6-4503-9793-bc14f5983e3e\") " pod="openshift-multus/multus-chk28" Apr 20 14:55:18.920909 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.919955 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f6944a1f-03f8-4115-899e-e5c61d0d6075-env-overrides\") pod \"ovnkube-node-g9x87\" (UID: \"f6944a1f-03f8-4115-899e-e5c61d0d6075\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9x87" Apr 20 14:55:18.920909 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.919953 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: 
\"kubernetes.io/configmap/50d12be2-afb1-4257-895a-8f2eed4865c3-konnectivity-ca\") pod \"konnectivity-agent-9vtqc\" (UID: \"50d12be2-afb1-4257-895a-8f2eed4865c3\") " pod="kube-system/konnectivity-agent-9vtqc" Apr 20 14:55:18.920909 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.920058 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3661ad3f-53ca-47ec-8a9b-15e3d3f054bd-serviceca\") pod \"node-ca-h82ph\" (UID: \"3661ad3f-53ca-47ec-8a9b-15e3d3f054bd\") " pod="openshift-image-registry/node-ca-h82ph" Apr 20 14:55:18.921673 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.920086 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f6944a1f-03f8-4115-899e-e5c61d0d6075-host-run-ovn-kubernetes\") pod \"ovnkube-node-g9x87\" (UID: \"f6944a1f-03f8-4115-899e-e5c61d0d6075\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9x87" Apr 20 14:55:18.921673 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.920085 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f6944a1f-03f8-4115-899e-e5c61d0d6075-ovnkube-script-lib\") pod \"ovnkube-node-g9x87\" (UID: \"f6944a1f-03f8-4115-899e-e5c61d0d6075\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9x87" Apr 20 14:55:18.921673 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.920108 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/48771f49-ba8f-4d89-b489-7d4f8dbd6d3b-etc-systemd\") pod \"tuned-s6x9l\" (UID: \"48771f49-ba8f-4d89-b489-7d4f8dbd6d3b\") " pod="openshift-cluster-node-tuning-operator/tuned-s6x9l" Apr 20 14:55:18.921673 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.920134 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f6944a1f-03f8-4115-899e-e5c61d0d6075-systemd-units\") pod \"ovnkube-node-g9x87\" (UID: \"f6944a1f-03f8-4115-899e-e5c61d0d6075\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9x87" Apr 20 14:55:18.921673 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.920135 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/2db4df8a-cdb6-4503-9793-bc14f5983e3e-multus-daemon-config\") pod \"multus-chk28\" (UID: \"2db4df8a-cdb6-4503-9793-bc14f5983e3e\") " pod="openshift-multus/multus-chk28" Apr 20 14:55:18.921673 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.920152 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/48771f49-ba8f-4d89-b489-7d4f8dbd6d3b-etc-sysctl-d\") pod \"tuned-s6x9l\" (UID: \"48771f49-ba8f-4d89-b489-7d4f8dbd6d3b\") " pod="openshift-cluster-node-tuning-operator/tuned-s6x9l" Apr 20 14:55:18.921673 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.920170 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/48771f49-ba8f-4d89-b489-7d4f8dbd6d3b-etc-tuned\") pod \"tuned-s6x9l\" (UID: \"48771f49-ba8f-4d89-b489-7d4f8dbd6d3b\") " pod="openshift-cluster-node-tuning-operator/tuned-s6x9l" Apr 20 14:55:18.921673 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.920177 2570 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f6944a1f-03f8-4115-899e-e5c61d0d6075-systemd-units\") pod \"ovnkube-node-g9x87\" (UID: \"f6944a1f-03f8-4115-899e-e5c61d0d6075\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9x87" Apr 20 14:55:18.921673 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.920189 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/2db4df8a-cdb6-4503-9793-bc14f5983e3e-hostroot\") pod \"multus-chk28\" (UID: \"2db4df8a-cdb6-4503-9793-bc14f5983e3e\") " pod="openshift-multus/multus-chk28" Apr 20 14:55:18.921673 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.920226 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f6944a1f-03f8-4115-899e-e5c61d0d6075-host-run-ovn-kubernetes\") pod \"ovnkube-node-g9x87\" (UID: \"f6944a1f-03f8-4115-899e-e5c61d0d6075\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9x87" Apr 20 14:55:18.921673 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.920232 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f6944a1f-03f8-4115-899e-e5c61d0d6075-var-lib-openvswitch\") pod \"ovnkube-node-g9x87\" (UID: \"f6944a1f-03f8-4115-899e-e5c61d0d6075\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9x87" Apr 20 14:55:18.921673 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.920263 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/2db4df8a-cdb6-4503-9793-bc14f5983e3e-hostroot\") pod \"multus-chk28\" (UID: \"2db4df8a-cdb6-4503-9793-bc14f5983e3e\") " pod="openshift-multus/multus-chk28" Apr 20 14:55:18.921673 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.920297 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f6944a1f-03f8-4115-899e-e5c61d0d6075-run-openvswitch\") pod \"ovnkube-node-g9x87\" (UID: \"f6944a1f-03f8-4115-899e-e5c61d0d6075\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9x87" Apr 20 14:55:18.921673 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.920350 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f6944a1f-03f8-4115-899e-e5c61d0d6075-run-openvswitch\") pod \"ovnkube-node-g9x87\" (UID: \"f6944a1f-03f8-4115-899e-e5c61d0d6075\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9x87" Apr 20 14:55:18.921673 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.920414 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2db4df8a-cdb6-4503-9793-bc14f5983e3e-system-cni-dir\") pod \"multus-chk28\" (UID: \"2db4df8a-cdb6-4503-9793-bc14f5983e3e\") " pod="openshift-multus/multus-chk28" Apr 20 14:55:18.921673 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.920425 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f6944a1f-03f8-4115-899e-e5c61d0d6075-var-lib-openvswitch\") pod \"ovnkube-node-g9x87\" (UID: \"f6944a1f-03f8-4115-899e-e5c61d0d6075\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9x87" Apr 20 14:55:18.921673 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.920436 2570 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2db4df8a-cdb6-4503-9793-bc14f5983e3e-system-cni-dir\") pod \"multus-chk28\" (UID: \"2db4df8a-cdb6-4503-9793-bc14f5983e3e\") " pod="openshift-multus/multus-chk28" Apr 20 14:55:18.921673 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.920435 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f6944a1f-03f8-4115-899e-e5c61d0d6075-host-cni-netd\") pod \"ovnkube-node-g9x87\" (UID: \"f6944a1f-03f8-4115-899e-e5c61d0d6075\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9x87" Apr 20 14:55:18.922149 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.920457 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3661ad3f-53ca-47ec-8a9b-15e3d3f054bd-serviceca\") pod \"node-ca-h82ph\" (UID: \"3661ad3f-53ca-47ec-8a9b-15e3d3f054bd\") " pod="openshift-image-registry/node-ca-h82ph" Apr 20 14:55:18.922149 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.920483 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f6944a1f-03f8-4115-899e-e5c61d0d6075-host-cni-netd\") pod \"ovnkube-node-g9x87\" (UID: \"f6944a1f-03f8-4115-899e-e5c61d0d6075\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9x87" Apr 20 14:55:18.922149 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.920484 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f6c014f8-befe-4916-a8ed-bc592d3baacf-hosts-file\") pod \"node-resolver-zb7gn\" (UID: \"f6c014f8-befe-4916-a8ed-bc592d3baacf\") " pod="openshift-dns/node-resolver-zb7gn" Apr 20 14:55:18.922149 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.920490 2570 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 20 14:55:18.922149 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.920511 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/48771f49-ba8f-4d89-b489-7d4f8dbd6d3b-var-lib-kubelet\") pod \"tuned-s6x9l\" (UID: \"48771f49-ba8f-4d89-b489-7d4f8dbd6d3b\") " pod="openshift-cluster-node-tuning-operator/tuned-s6x9l" Apr 20 14:55:18.922149 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.920523 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f6c014f8-befe-4916-a8ed-bc592d3baacf-hosts-file\") pod \"node-resolver-zb7gn\" (UID: \"f6c014f8-befe-4916-a8ed-bc592d3baacf\") " pod="openshift-dns/node-resolver-zb7gn" Apr 20 14:55:18.922149 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.920533 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vjj2x\" (UniqueName: \"kubernetes.io/projected/4aa972e0-3242-4b0c-87e7-b4ebc421bbce-kube-api-access-vjj2x\") pod \"multus-additional-cni-plugins-lstvb\" (UID: \"4aa972e0-3242-4b0c-87e7-b4ebc421bbce\") " pod="openshift-multus/multus-additional-cni-plugins-lstvb" Apr 20 14:55:18.922149 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.920555 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/48771f49-ba8f-4d89-b489-7d4f8dbd6d3b-tmp\") pod \"tuned-s6x9l\" (UID: \"48771f49-ba8f-4d89-b489-7d4f8dbd6d3b\") " pod="openshift-cluster-node-tuning-operator/tuned-s6x9l" Apr 20 14:55:18.922149 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.920620 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lf54x\" (UniqueName: \"kubernetes.io/projected/48771f49-ba8f-4d89-b489-7d4f8dbd6d3b-kube-api-access-lf54x\") pod \"tuned-s6x9l\" (UID: \"48771f49-ba8f-4d89-b489-7d4f8dbd6d3b\") " pod="openshift-cluster-node-tuning-operator/tuned-s6x9l" Apr 20 14:55:18.922149 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.920648 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4aa972e0-3242-4b0c-87e7-b4ebc421bbce-system-cni-dir\") pod \"multus-additional-cni-plugins-lstvb\" (UID: \"4aa972e0-3242-4b0c-87e7-b4ebc421bbce\") " pod="openshift-multus/multus-additional-cni-plugins-lstvb" Apr 20 14:55:18.922149 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.920676 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2db4df8a-cdb6-4503-9793-bc14f5983e3e-cni-binary-copy\") pod \"multus-chk28\" (UID: \"2db4df8a-cdb6-4503-9793-bc14f5983e3e\") " pod="openshift-multus/multus-chk28" Apr 20 14:55:18.922149 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.920694 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/4aa972e0-3242-4b0c-87e7-b4ebc421bbce-system-cni-dir\") pod \"multus-additional-cni-plugins-lstvb\" (UID: \"4aa972e0-3242-4b0c-87e7-b4ebc421bbce\") " pod="openshift-multus/multus-additional-cni-plugins-lstvb" Apr 20 14:55:18.922149 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.920724 2570 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2db4df8a-cdb6-4503-9793-bc14f5983e3e-etc-kubernetes\") pod \"multus-chk28\" (UID: \"2db4df8a-cdb6-4503-9793-bc14f5983e3e\") " pod="openshift-multus/multus-chk28" Apr 20 14:55:18.922149 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.920789 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f6944a1f-03f8-4115-899e-e5c61d0d6075-node-log\") pod \"ovnkube-node-g9x87\" (UID: \"f6944a1f-03f8-4115-899e-e5c61d0d6075\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9x87" Apr 20 14:55:18.922149 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.920802 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2db4df8a-cdb6-4503-9793-bc14f5983e3e-etc-kubernetes\") pod \"multus-chk28\" (UID: \"2db4df8a-cdb6-4503-9793-bc14f5983e3e\") " pod="openshift-multus/multus-chk28" Apr 20 14:55:18.922149 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.920844 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/48771f49-ba8f-4d89-b489-7d4f8dbd6d3b-host\") pod \"tuned-s6x9l\" (UID: \"48771f49-ba8f-4d89-b489-7d4f8dbd6d3b\") " pod="openshift-cluster-node-tuning-operator/tuned-s6x9l" Apr 20 14:55:18.922149 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.920856 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f6944a1f-03f8-4115-899e-e5c61d0d6075-node-log\") pod \"ovnkube-node-g9x87\" (UID: \"f6944a1f-03f8-4115-899e-e5c61d0d6075\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9x87" Apr 20 14:55:18.922149 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.920890 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/d0ae6812-a112-4eba-84ef-a2eeea69630a-etc-selinux\") pod \"aws-ebs-csi-driver-node-df9th\" (UID: \"d0ae6812-a112-4eba-84ef-a2eeea69630a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-df9th" Apr 20 14:55:18.922777 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.920908 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/4aa972e0-3242-4b0c-87e7-b4ebc421bbce-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-lstvb\" (UID: \"4aa972e0-3242-4b0c-87e7-b4ebc421bbce\") " pod="openshift-multus/multus-additional-cni-plugins-lstvb" Apr 20 14:55:18.922777 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.920925 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/50d12be2-afb1-4257-895a-8f2eed4865c3-agent-certs\") pod \"konnectivity-agent-9vtqc\" (UID: \"50d12be2-afb1-4257-895a-8f2eed4865c3\") " pod="kube-system/konnectivity-agent-9vtqc" Apr 20 14:55:18.922777 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.920947 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f6944a1f-03f8-4115-899e-e5c61d0d6075-host-run-netns\") pod \"ovnkube-node-g9x87\" (UID: \"f6944a1f-03f8-4115-899e-e5c61d0d6075\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9x87" Apr 20 14:55:18.922777 
ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.920969 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f6944a1f-03f8-4115-899e-e5c61d0d6075-run-ovn\") pod \"ovnkube-node-g9x87\" (UID: \"f6944a1f-03f8-4115-899e-e5c61d0d6075\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9x87" Apr 20 14:55:18.922777 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.920981 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/d0ae6812-a112-4eba-84ef-a2eeea69630a-etc-selinux\") pod \"aws-ebs-csi-driver-node-df9th\" (UID: \"d0ae6812-a112-4eba-84ef-a2eeea69630a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-df9th" Apr 20 14:55:18.922777 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.920992 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f6944a1f-03f8-4115-899e-e5c61d0d6075-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-g9x87\" (UID: \"f6944a1f-03f8-4115-899e-e5c61d0d6075\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9x87" Apr 20 14:55:18.922777 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.921009 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v2vf2\" (UniqueName: \"kubernetes.io/projected/f6944a1f-03f8-4115-899e-e5c61d0d6075-kube-api-access-v2vf2\") pod \"ovnkube-node-g9x87\" (UID: \"f6944a1f-03f8-4115-899e-e5c61d0d6075\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9x87" Apr 20 14:55:18.922777 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.921031 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4aa972e0-3242-4b0c-87e7-b4ebc421bbce-tuning-conf-dir\") pod \"multus-additional-cni-plugins-lstvb\" (UID: \"4aa972e0-3242-4b0c-87e7-b4ebc421bbce\") " pod="openshift-multus/multus-additional-cni-plugins-lstvb" Apr 20 14:55:18.922777 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.921052 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f6944a1f-03f8-4115-899e-e5c61d0d6075-log-socket\") pod \"ovnkube-node-g9x87\" (UID: \"f6944a1f-03f8-4115-899e-e5c61d0d6075\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9x87" Apr 20 14:55:18.922777 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.921068 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r9jtk\" (UniqueName: \"kubernetes.io/projected/6ae6c334-21b5-4f64-b2c3-68f797cd363b-kube-api-access-r9jtk\") pod \"network-metrics-daemon-sjhzf\" (UID: \"6ae6c334-21b5-4f64-b2c3-68f797cd363b\") " pod="openshift-multus/network-metrics-daemon-sjhzf" Apr 20 14:55:18.922777 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.921086 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d0ae6812-a112-4eba-84ef-a2eeea69630a-kubelet-dir\") pod \"aws-ebs-csi-driver-node-df9th\" (UID: \"d0ae6812-a112-4eba-84ef-a2eeea69630a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-df9th" Apr 20 14:55:18.922777 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.921085 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/2db4df8a-cdb6-4503-9793-bc14f5983e3e-cni-binary-copy\") pod \"multus-chk28\" (UID: \"2db4df8a-cdb6-4503-9793-bc14f5983e3e\") " pod="openshift-multus/multus-chk28" Apr 20 14:55:18.922777 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.921106 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d0ae6812-a112-4eba-84ef-a2eeea69630a-registration-dir\") pod \"aws-ebs-csi-driver-node-df9th\" (UID: \"d0ae6812-a112-4eba-84ef-a2eeea69630a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-df9th" Apr 20 14:55:18.922777 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.921153 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d0ae6812-a112-4eba-84ef-a2eeea69630a-registration-dir\") pod \"aws-ebs-csi-driver-node-df9th\" (UID: \"d0ae6812-a112-4eba-84ef-a2eeea69630a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-df9th" Apr 20 14:55:18.922777 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.921032 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f6944a1f-03f8-4115-899e-e5c61d0d6075-host-run-netns\") pod \"ovnkube-node-g9x87\" (UID: \"f6944a1f-03f8-4115-899e-e5c61d0d6075\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9x87" Apr 20 14:55:18.922777 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.921213 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f6944a1f-03f8-4115-899e-e5c61d0d6075-log-socket\") pod \"ovnkube-node-g9x87\" (UID: \"f6944a1f-03f8-4115-899e-e5c61d0d6075\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9x87" Apr 20 14:55:18.922777 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.921212 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/4aa972e0-3242-4b0c-87e7-b4ebc421bbce-tuning-conf-dir\") pod \"multus-additional-cni-plugins-lstvb\" (UID: \"4aa972e0-3242-4b0c-87e7-b4ebc421bbce\") " pod="openshift-multus/multus-additional-cni-plugins-lstvb" Apr 20 14:55:18.923397 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.921268 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f6944a1f-03f8-4115-899e-e5c61d0d6075-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-g9x87\" (UID: \"f6944a1f-03f8-4115-899e-e5c61d0d6075\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9x87" Apr 20 14:55:18.923397 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.921362 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2db4df8a-cdb6-4503-9793-bc14f5983e3e-host-run-netns\") pod \"multus-chk28\" (UID: \"2db4df8a-cdb6-4503-9793-bc14f5983e3e\") " pod="openshift-multus/multus-chk28" Apr 20 14:55:18.923397 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.921383 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/4aa972e0-3242-4b0c-87e7-b4ebc421bbce-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-lstvb\" (UID: \"4aa972e0-3242-4b0c-87e7-b4ebc421bbce\") " pod="openshift-multus/multus-additional-cni-plugins-lstvb" Apr 20 
Apr 20 14:55:18.923397 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.921400 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f6944a1f-03f8-4115-899e-e5c61d0d6075-run-ovn\") pod \"ovnkube-node-g9x87\" (UID: \"f6944a1f-03f8-4115-899e-e5c61d0d6075\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9x87"
Apr 20 14:55:18.923397 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.921435 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2db4df8a-cdb6-4503-9793-bc14f5983e3e-host-run-netns\") pod \"multus-chk28\" (UID: \"2db4df8a-cdb6-4503-9793-bc14f5983e3e\") " pod="openshift-multus/multus-chk28"
Apr 20 14:55:18.923397 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.921461 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d0ae6812-a112-4eba-84ef-a2eeea69630a-kubelet-dir\") pod \"aws-ebs-csi-driver-node-df9th\" (UID: \"d0ae6812-a112-4eba-84ef-a2eeea69630a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-df9th"
Apr 20 14:55:18.923894 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.923875 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f6944a1f-03f8-4115-899e-e5c61d0d6075-ovn-node-metrics-cert\") pod \"ovnkube-node-g9x87\" (UID: \"f6944a1f-03f8-4115-899e-e5c61d0d6075\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9x87"
Apr 20 14:55:18.924012 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.924002 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/50d12be2-afb1-4257-895a-8f2eed4865c3-agent-certs\") pod \"konnectivity-agent-9vtqc\" (UID: \"50d12be2-afb1-4257-895a-8f2eed4865c3\") " pod="kube-system/konnectivity-agent-9vtqc"
Apr 20 14:55:18.925756 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:18.925733 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 14:55:18.925842 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:18.925760 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 14:55:18.925842 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:18.925773 2570 projected.go:194] Error preparing data for projected volume kube-api-access-g9wdm for pod openshift-network-diagnostics/network-check-target-d4wt8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 14:55:18.925842 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:18.925827 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dbe6bf00-4b0b-4432-80f4-1085e83c9110-kube-api-access-g9wdm podName:dbe6bf00-4b0b-4432-80f4-1085e83c9110 nodeName:}" failed. No retries permitted until 2026-04-20 14:55:19.425809899 +0000 UTC m=+3.081858200 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-g9wdm" (UniqueName: "kubernetes.io/projected/dbe6bf00-4b0b-4432-80f4-1085e83c9110-kube-api-access-g9wdm") pod "network-check-target-d4wt8" (UID: "dbe6bf00-4b0b-4432-80f4-1085e83c9110") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 14:55:18.927778 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.927752 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8w6mm\" (UniqueName: \"kubernetes.io/projected/f6c014f8-befe-4916-a8ed-bc592d3baacf-kube-api-access-8w6mm\") pod \"node-resolver-zb7gn\" (UID: \"f6c014f8-befe-4916-a8ed-bc592d3baacf\") " pod="openshift-dns/node-resolver-zb7gn"
Apr 20 14:55:18.928423 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.928401 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8ncq\" (UniqueName: \"kubernetes.io/projected/d0ae6812-a112-4eba-84ef-a2eeea69630a-kube-api-access-k8ncq\") pod \"aws-ebs-csi-driver-node-df9th\" (UID: \"d0ae6812-a112-4eba-84ef-a2eeea69630a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-df9th"
Apr 20 14:55:18.930250 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.930229 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2vf2\" (UniqueName: \"kubernetes.io/projected/f6944a1f-03f8-4115-899e-e5c61d0d6075-kube-api-access-v2vf2\") pod \"ovnkube-node-g9x87\" (UID: \"f6944a1f-03f8-4115-899e-e5c61d0d6075\") " pod="openshift-ovn-kubernetes/ovnkube-node-g9x87"
Apr 20 14:55:18.930351 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.930230 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjj2x\" (UniqueName: \"kubernetes.io/projected/4aa972e0-3242-4b0c-87e7-b4ebc421bbce-kube-api-access-vjj2x\") pod \"multus-additional-cni-plugins-lstvb\" (UID: \"4aa972e0-3242-4b0c-87e7-b4ebc421bbce\") " pod="openshift-multus/multus-additional-cni-plugins-lstvb"
Apr 20 14:55:18.930636 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.930617 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jstsz\" (UniqueName: \"kubernetes.io/projected/3661ad3f-53ca-47ec-8a9b-15e3d3f054bd-kube-api-access-jstsz\") pod \"node-ca-h82ph\" (UID: \"3661ad3f-53ca-47ec-8a9b-15e3d3f054bd\") " pod="openshift-image-registry/node-ca-h82ph"
Apr 20 14:55:18.931046 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.931026 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ps4bl\" (UniqueName: \"kubernetes.io/projected/2db4df8a-cdb6-4503-9793-bc14f5983e3e-kube-api-access-ps4bl\") pod \"multus-chk28\" (UID: \"2db4df8a-cdb6-4503-9793-bc14f5983e3e\") " pod="openshift-multus/multus-chk28"
Apr 20 14:55:18.931614 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.931598 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bxzf\" (UniqueName: \"kubernetes.io/projected/db4252bf-5e13-4727-a83a-7f87874cf5c4-kube-api-access-2bxzf\") pod \"iptables-alerter-s775f\" (UID: \"db4252bf-5e13-4727-a83a-7f87874cf5c4\") " pod="openshift-network-operator/iptables-alerter-s775f"
Apr 20 14:55:18.932693 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.932672 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9jtk\" (UniqueName: \"kubernetes.io/projected/6ae6c334-21b5-4f64-b2c3-68f797cd363b-kube-api-access-r9jtk\") pod \"network-metrics-daemon-sjhzf\" (UID: \"6ae6c334-21b5-4f64-b2c3-68f797cd363b\") " pod="openshift-multus/network-metrics-daemon-sjhzf"
Apr 20 14:55:18.983918 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.983874 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-163.ec2.internal" event={"ID":"7b77433e362f2114f13c38a959650d25","Type":"ContainerStarted","Data":"6453dd9ee699eaaf8dc0fa0fbb905274a2c53c21ca81e3968d6b87829f6f3db0"}
Apr 20 14:55:18.985054 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:18.985020 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-163.ec2.internal" event={"ID":"8147dca2f1846ffe58ac40c8a9cdfc0b","Type":"ContainerStarted","Data":"2ba41b82a30825d00f32ecdcca0ebb1890d390fe48a89d106322a9cd7881a5af"}
Apr 20 14:55:19.022315 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:19.022276 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/48771f49-ba8f-4d89-b489-7d4f8dbd6d3b-etc-sysconfig\") pod \"tuned-s6x9l\" (UID: \"48771f49-ba8f-4d89-b489-7d4f8dbd6d3b\") " pod="openshift-cluster-node-tuning-operator/tuned-s6x9l"
Apr 20 14:55:19.022432 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:19.022322 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/48771f49-ba8f-4d89-b489-7d4f8dbd6d3b-lib-modules\") pod \"tuned-s6x9l\" (UID: \"48771f49-ba8f-4d89-b489-7d4f8dbd6d3b\") " pod="openshift-cluster-node-tuning-operator/tuned-s6x9l"
Apr 20 14:55:19.022432 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:19.022349 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/48771f49-ba8f-4d89-b489-7d4f8dbd6d3b-sys\") pod \"tuned-s6x9l\" (UID: \"48771f49-ba8f-4d89-b489-7d4f8dbd6d3b\") " pod="openshift-cluster-node-tuning-operator/tuned-s6x9l"
Apr 20 14:55:19.022432 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:19.022375 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/48771f49-ba8f-4d89-b489-7d4f8dbd6d3b-etc-systemd\") pod \"tuned-s6x9l\" (UID: \"48771f49-ba8f-4d89-b489-7d4f8dbd6d3b\") " pod="openshift-cluster-node-tuning-operator/tuned-s6x9l"
Apr 20 14:55:19.022432 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:19.022399 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/48771f49-ba8f-4d89-b489-7d4f8dbd6d3b-etc-sysctl-d\") pod \"tuned-s6x9l\" (UID: \"48771f49-ba8f-4d89-b489-7d4f8dbd6d3b\") " pod="openshift-cluster-node-tuning-operator/tuned-s6x9l"
Apr 20 14:55:19.022432 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:19.022401 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/48771f49-ba8f-4d89-b489-7d4f8dbd6d3b-etc-sysconfig\") pod \"tuned-s6x9l\" (UID: \"48771f49-ba8f-4d89-b489-7d4f8dbd6d3b\") " pod="openshift-cluster-node-tuning-operator/tuned-s6x9l"
Apr 20 14:55:19.022606 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:19.022433 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName:
\"kubernetes.io/empty-dir/48771f49-ba8f-4d89-b489-7d4f8dbd6d3b-etc-tuned\") pod \"tuned-s6x9l\" (UID: \"48771f49-ba8f-4d89-b489-7d4f8dbd6d3b\") " pod="openshift-cluster-node-tuning-operator/tuned-s6x9l" Apr 20 14:55:19.022606 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:19.022463 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/48771f49-ba8f-4d89-b489-7d4f8dbd6d3b-var-lib-kubelet\") pod \"tuned-s6x9l\" (UID: \"48771f49-ba8f-4d89-b489-7d4f8dbd6d3b\") " pod="openshift-cluster-node-tuning-operator/tuned-s6x9l" Apr 20 14:55:19.022606 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:19.022486 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/48771f49-ba8f-4d89-b489-7d4f8dbd6d3b-tmp\") pod \"tuned-s6x9l\" (UID: \"48771f49-ba8f-4d89-b489-7d4f8dbd6d3b\") " pod="openshift-cluster-node-tuning-operator/tuned-s6x9l" Apr 20 14:55:19.022606 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:19.022506 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lf54x\" (UniqueName: \"kubernetes.io/projected/48771f49-ba8f-4d89-b489-7d4f8dbd6d3b-kube-api-access-lf54x\") pod \"tuned-s6x9l\" (UID: \"48771f49-ba8f-4d89-b489-7d4f8dbd6d3b\") " pod="openshift-cluster-node-tuning-operator/tuned-s6x9l" Apr 20 14:55:19.022606 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:19.022529 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/48771f49-ba8f-4d89-b489-7d4f8dbd6d3b-host\") pod \"tuned-s6x9l\" (UID: \"48771f49-ba8f-4d89-b489-7d4f8dbd6d3b\") " pod="openshift-cluster-node-tuning-operator/tuned-s6x9l" Apr 20 14:55:19.022606 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:19.022576 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/48771f49-ba8f-4d89-b489-7d4f8dbd6d3b-lib-modules\") pod \"tuned-s6x9l\" (UID: \"48771f49-ba8f-4d89-b489-7d4f8dbd6d3b\") " pod="openshift-cluster-node-tuning-operator/tuned-s6x9l" Apr 20 14:55:19.022606 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:19.022578 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/48771f49-ba8f-4d89-b489-7d4f8dbd6d3b-etc-sysctl-conf\") pod \"tuned-s6x9l\" (UID: \"48771f49-ba8f-4d89-b489-7d4f8dbd6d3b\") " pod="openshift-cluster-node-tuning-operator/tuned-s6x9l" Apr 20 14:55:19.022870 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:19.022610 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/48771f49-ba8f-4d89-b489-7d4f8dbd6d3b-run\") pod \"tuned-s6x9l\" (UID: \"48771f49-ba8f-4d89-b489-7d4f8dbd6d3b\") " pod="openshift-cluster-node-tuning-operator/tuned-s6x9l" Apr 20 14:55:19.022870 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:19.022633 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/48771f49-ba8f-4d89-b489-7d4f8dbd6d3b-etc-kubernetes\") pod \"tuned-s6x9l\" (UID: \"48771f49-ba8f-4d89-b489-7d4f8dbd6d3b\") " pod="openshift-cluster-node-tuning-operator/tuned-s6x9l" Apr 20 14:55:19.022870 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:19.022665 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" 
(UniqueName: \"kubernetes.io/host-path/48771f49-ba8f-4d89-b489-7d4f8dbd6d3b-etc-modprobe-d\") pod \"tuned-s6x9l\" (UID: \"48771f49-ba8f-4d89-b489-7d4f8dbd6d3b\") " pod="openshift-cluster-node-tuning-operator/tuned-s6x9l" Apr 20 14:55:19.022870 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:19.022697 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/48771f49-ba8f-4d89-b489-7d4f8dbd6d3b-etc-sysctl-conf\") pod \"tuned-s6x9l\" (UID: \"48771f49-ba8f-4d89-b489-7d4f8dbd6d3b\") " pod="openshift-cluster-node-tuning-operator/tuned-s6x9l" Apr 20 14:55:19.022870 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:19.022733 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/48771f49-ba8f-4d89-b489-7d4f8dbd6d3b-run\") pod \"tuned-s6x9l\" (UID: \"48771f49-ba8f-4d89-b489-7d4f8dbd6d3b\") " pod="openshift-cluster-node-tuning-operator/tuned-s6x9l" Apr 20 14:55:19.022870 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:19.022768 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/48771f49-ba8f-4d89-b489-7d4f8dbd6d3b-etc-modprobe-d\") pod \"tuned-s6x9l\" (UID: \"48771f49-ba8f-4d89-b489-7d4f8dbd6d3b\") " pod="openshift-cluster-node-tuning-operator/tuned-s6x9l" Apr 20 14:55:19.022870 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:19.022775 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/48771f49-ba8f-4d89-b489-7d4f8dbd6d3b-etc-kubernetes\") pod \"tuned-s6x9l\" (UID: \"48771f49-ba8f-4d89-b489-7d4f8dbd6d3b\") " pod="openshift-cluster-node-tuning-operator/tuned-s6x9l" Apr 20 14:55:19.022870 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:19.022465 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/48771f49-ba8f-4d89-b489-7d4f8dbd6d3b-sys\") pod \"tuned-s6x9l\" (UID: \"48771f49-ba8f-4d89-b489-7d4f8dbd6d3b\") " pod="openshift-cluster-node-tuning-operator/tuned-s6x9l" Apr 20 14:55:19.022870 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:19.022809 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/48771f49-ba8f-4d89-b489-7d4f8dbd6d3b-etc-systemd\") pod \"tuned-s6x9l\" (UID: \"48771f49-ba8f-4d89-b489-7d4f8dbd6d3b\") " pod="openshift-cluster-node-tuning-operator/tuned-s6x9l" Apr 20 14:55:19.022870 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:19.022866 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/48771f49-ba8f-4d89-b489-7d4f8dbd6d3b-etc-sysctl-d\") pod \"tuned-s6x9l\" (UID: \"48771f49-ba8f-4d89-b489-7d4f8dbd6d3b\") " pod="openshift-cluster-node-tuning-operator/tuned-s6x9l" Apr 20 14:55:19.023185 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:19.023159 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/48771f49-ba8f-4d89-b489-7d4f8dbd6d3b-var-lib-kubelet\") pod \"tuned-s6x9l\" (UID: \"48771f49-ba8f-4d89-b489-7d4f8dbd6d3b\") " pod="openshift-cluster-node-tuning-operator/tuned-s6x9l" Apr 20 14:55:19.023266 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:19.023244 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/48771f49-ba8f-4d89-b489-7d4f8dbd6d3b-host\") pod \"tuned-s6x9l\" (UID: \"48771f49-ba8f-4d89-b489-7d4f8dbd6d3b\") " pod="openshift-cluster-node-tuning-operator/tuned-s6x9l" Apr 20 14:55:19.025802 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:19.025784 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/48771f49-ba8f-4d89-b489-7d4f8dbd6d3b-tmp\") pod \"tuned-s6x9l\" (UID: \"48771f49-ba8f-4d89-b489-7d4f8dbd6d3b\") " pod="openshift-cluster-node-tuning-operator/tuned-s6x9l" Apr 20 14:55:19.025895 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:19.025830 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/48771f49-ba8f-4d89-b489-7d4f8dbd6d3b-etc-tuned\") pod \"tuned-s6x9l\" (UID: \"48771f49-ba8f-4d89-b489-7d4f8dbd6d3b\") " pod="openshift-cluster-node-tuning-operator/tuned-s6x9l" Apr 20 14:55:19.031267 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:19.031249 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lf54x\" (UniqueName: \"kubernetes.io/projected/48771f49-ba8f-4d89-b489-7d4f8dbd6d3b-kube-api-access-lf54x\") pod \"tuned-s6x9l\" (UID: \"48771f49-ba8f-4d89-b489-7d4f8dbd6d3b\") " pod="openshift-cluster-node-tuning-operator/tuned-s6x9l" Apr 20 14:55:19.108434 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:19.108372 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-s775f" Apr 20 14:55:19.116057 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:19.116035 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-9vtqc" Apr 20 14:55:19.123672 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:19.123652 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-g9x87" Apr 20 14:55:19.128191 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:19.128173 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-zb7gn" Apr 20 14:55:19.136453 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:19.136437 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-df9th" Apr 20 14:55:19.140962 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:19.140945 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-lstvb" Apr 20 14:55:19.150488 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:19.150466 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-chk28" Apr 20 14:55:19.156970 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:19.156953 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-h82ph" Apr 20 14:55:19.167469 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:19.167450 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-s6x9l" Apr 20 14:55:19.179633 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:19.179612 2570 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 14:55:19.274509 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:19.274478 2570 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 14:55:19.427228 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:19.427153 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g9wdm\" (UniqueName: \"kubernetes.io/projected/dbe6bf00-4b0b-4432-80f4-1085e83c9110-kube-api-access-g9wdm\") pod \"network-check-target-d4wt8\" (UID: \"dbe6bf00-4b0b-4432-80f4-1085e83c9110\") " pod="openshift-network-diagnostics/network-check-target-d4wt8" Apr 20 14:55:19.427403 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:19.427235 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6ae6c334-21b5-4f64-b2c3-68f797cd363b-metrics-certs\") pod \"network-metrics-daemon-sjhzf\" (UID: \"6ae6c334-21b5-4f64-b2c3-68f797cd363b\") " pod="openshift-multus/network-metrics-daemon-sjhzf" Apr 20 14:55:19.427403 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:19.427348 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:55:19.427403 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:19.427364 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 14:55:19.427403 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:19.427386 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 14:55:19.427403 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:19.427398 2570 projected.go:194] Error preparing data for projected volume kube-api-access-g9wdm for pod openshift-network-diagnostics/network-check-target-d4wt8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:55:19.427403 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:19.427404 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6ae6c334-21b5-4f64-b2c3-68f797cd363b-metrics-certs podName:6ae6c334-21b5-4f64-b2c3-68f797cd363b nodeName:}" failed. No retries permitted until 2026-04-20 14:55:20.427390006 +0000 UTC m=+4.083438286 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6ae6c334-21b5-4f64-b2c3-68f797cd363b-metrics-certs") pod "network-metrics-daemon-sjhzf" (UID: "6ae6c334-21b5-4f64-b2c3-68f797cd363b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:55:19.427640 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:19.427455 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dbe6bf00-4b0b-4432-80f4-1085e83c9110-kube-api-access-g9wdm podName:dbe6bf00-4b0b-4432-80f4-1085e83c9110 nodeName:}" failed. No retries permitted until 2026-04-20 14:55:20.427435029 +0000 UTC m=+4.083483325 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-g9wdm" (UniqueName: "kubernetes.io/projected/dbe6bf00-4b0b-4432-80f4-1085e83c9110-kube-api-access-g9wdm") pod "network-check-target-d4wt8" (UID: "dbe6bf00-4b0b-4432-80f4-1085e83c9110") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:55:19.635114 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:19.635008 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48771f49_ba8f_4d89_b489_7d4f8dbd6d3b.slice/crio-400e525b17f98e26971eadcaaa7d7f51095ee67f73a5a4032c3c757add5b132f WatchSource:0}: Error finding container 400e525b17f98e26971eadcaaa7d7f51095ee67f73a5a4032c3c757add5b132f: Status 404 returned error can't find the container with id 400e525b17f98e26971eadcaaa7d7f51095ee67f73a5a4032c3c757add5b132f Apr 20 14:55:19.636645 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:19.636436 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2db4df8a_cdb6_4503_9793_bc14f5983e3e.slice/crio-2d101ea2c56a4033723d4aa82574bdc549c01ad54c69ff3b1d161e216e4808d1 WatchSource:0}: Error finding container 2d101ea2c56a4033723d4aa82574bdc549c01ad54c69ff3b1d161e216e4808d1: Status 404 returned error can't find the container with id 2d101ea2c56a4033723d4aa82574bdc549c01ad54c69ff3b1d161e216e4808d1 Apr 20 14:55:19.638801 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:19.638713 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6c014f8_befe_4916_a8ed_bc592d3baacf.slice/crio-15fb5ad092ccf64ef4009c65d859fa3e092c75aad0cc402c7bc00329a68afe92 WatchSource:0}: Error finding container 15fb5ad092ccf64ef4009c65d859fa3e092c75aad0cc402c7bc00329a68afe92: Status 404 returned error can't find the container with id 15fb5ad092ccf64ef4009c65d859fa3e092c75aad0cc402c7bc00329a68afe92 Apr 20 14:55:19.639859 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:19.639557 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddb4252bf_5e13_4727_a83a_7f87874cf5c4.slice/crio-ce4ebff8572558c4ae5020d69d8c6ff120dcb7bb6ad73363c98f13296d7880ac WatchSource:0}: Error finding container ce4ebff8572558c4ae5020d69d8c6ff120dcb7bb6ad73363c98f13296d7880ac: Status 404 returned error can't find the container with id ce4ebff8572558c4ae5020d69d8c6ff120dcb7bb6ad73363c98f13296d7880ac Apr 20 14:55:19.642203 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:19.642084 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6944a1f_03f8_4115_899e_e5c61d0d6075.slice/crio-e73a921c540e7220cbc0ccee5169e167d9a6e68c0adb366a3578e5566f005cf5 WatchSource:0}: Error finding container e73a921c540e7220cbc0ccee5169e167d9a6e68c0adb366a3578e5566f005cf5: Status 404 returned error can't find the container with id e73a921c540e7220cbc0ccee5169e167d9a6e68c0adb366a3578e5566f005cf5 Apr 20 14:55:19.857839 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:19.857805 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 14:50:17 +0000 UTC" deadline="2027-12-06 18:53:48.893064263 +0000 UTC" Apr 20 14:55:19.857839 ip-10-0-133-163 kubenswrapper[2570]: I0420 
14:55:19.857834 2570 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14283h58m29.035232473s" Apr 20 14:55:19.979606 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:19.979523 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d4wt8" Apr 20 14:55:19.979751 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:19.979636 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d4wt8" podUID="dbe6bf00-4b0b-4432-80f4-1085e83c9110" Apr 20 14:55:19.988705 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:19.988662 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-df9th" event={"ID":"d0ae6812-a112-4eba-84ef-a2eeea69630a","Type":"ContainerStarted","Data":"99e40a152437b910812863e137fbc4b017f7c32cd371a11504728e6e1a808a5b"} Apr 20 14:55:19.990646 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:19.990620 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-h82ph" event={"ID":"3661ad3f-53ca-47ec-8a9b-15e3d3f054bd","Type":"ContainerStarted","Data":"49f4dead4e9fec9f5af46b4035361865e857b679c7280a93822e2559ee5c10f0"} Apr 20 14:55:19.992583 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:19.992556 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lstvb" event={"ID":"4aa972e0-3242-4b0c-87e7-b4ebc421bbce","Type":"ContainerStarted","Data":"9538a9c57ea914149da108d876dd259071e7c13ce9ef6bf2df7c29a8f2e71235"} Apr 20 14:55:19.994451 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:19.994421 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9x87" event={"ID":"f6944a1f-03f8-4115-899e-e5c61d0d6075","Type":"ContainerStarted","Data":"e73a921c540e7220cbc0ccee5169e167d9a6e68c0adb366a3578e5566f005cf5"} Apr 20 14:55:19.996341 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:19.995617 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-s775f" event={"ID":"db4252bf-5e13-4727-a83a-7f87874cf5c4","Type":"ContainerStarted","Data":"ce4ebff8572558c4ae5020d69d8c6ff120dcb7bb6ad73363c98f13296d7880ac"} Apr 20 14:55:19.997424 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:19.997394 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-zb7gn" event={"ID":"f6c014f8-befe-4916-a8ed-bc592d3baacf","Type":"ContainerStarted","Data":"15fb5ad092ccf64ef4009c65d859fa3e092c75aad0cc402c7bc00329a68afe92"} Apr 20 14:55:19.999492 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:19.999469 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-chk28" event={"ID":"2db4df8a-cdb6-4503-9793-bc14f5983e3e","Type":"ContainerStarted","Data":"2d101ea2c56a4033723d4aa82574bdc549c01ad54c69ff3b1d161e216e4808d1"} Apr 20 14:55:20.010587 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:20.010547 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-s6x9l" 
event={"ID":"48771f49-ba8f-4d89-b489-7d4f8dbd6d3b","Type":"ContainerStarted","Data":"400e525b17f98e26971eadcaaa7d7f51095ee67f73a5a4032c3c757add5b132f"} Apr 20 14:55:20.013578 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:20.013556 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-163.ec2.internal" event={"ID":"8147dca2f1846ffe58ac40c8a9cdfc0b","Type":"ContainerStarted","Data":"770d61202897121ff045d650189496d97f33b37061ffad5e3eb7557919f94889"} Apr 20 14:55:20.021087 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:20.021055 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-9vtqc" event={"ID":"50d12be2-afb1-4257-895a-8f2eed4865c3","Type":"ContainerStarted","Data":"449544c14fd7ca4d6e969f9a05f75af931ffcde9dfceeb215575d25a74e52486"} Apr 20 14:55:20.029999 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:20.029954 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-163.ec2.internal" podStartSLOduration=2.029938183 podStartE2EDuration="2.029938183s" podCreationTimestamp="2026-04-20 14:55:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 14:55:20.029372851 +0000 UTC m=+3.685421152" watchObservedRunningTime="2026-04-20 14:55:20.029938183 +0000 UTC m=+3.685986486" Apr 20 14:55:20.436912 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:20.436058 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6ae6c334-21b5-4f64-b2c3-68f797cd363b-metrics-certs\") pod \"network-metrics-daemon-sjhzf\" (UID: \"6ae6c334-21b5-4f64-b2c3-68f797cd363b\") " pod="openshift-multus/network-metrics-daemon-sjhzf" Apr 20 14:55:20.436912 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:20.436129 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g9wdm\" (UniqueName: \"kubernetes.io/projected/dbe6bf00-4b0b-4432-80f4-1085e83c9110-kube-api-access-g9wdm\") pod \"network-check-target-d4wt8\" (UID: \"dbe6bf00-4b0b-4432-80f4-1085e83c9110\") " pod="openshift-network-diagnostics/network-check-target-d4wt8" Apr 20 14:55:20.436912 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:20.436296 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 14:55:20.436912 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:20.436333 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 14:55:20.436912 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:20.436345 2570 projected.go:194] Error preparing data for projected volume kube-api-access-g9wdm for pod openshift-network-diagnostics/network-check-target-d4wt8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:55:20.436912 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:20.436415 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dbe6bf00-4b0b-4432-80f4-1085e83c9110-kube-api-access-g9wdm podName:dbe6bf00-4b0b-4432-80f4-1085e83c9110 nodeName:}" failed. 
No retries permitted until 2026-04-20 14:55:22.436396338 +0000 UTC m=+6.092444640 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-g9wdm" (UniqueName: "kubernetes.io/projected/dbe6bf00-4b0b-4432-80f4-1085e83c9110-kube-api-access-g9wdm") pod "network-check-target-d4wt8" (UID: "dbe6bf00-4b0b-4432-80f4-1085e83c9110") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:55:20.436912 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:20.436829 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:55:20.436912 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:20.436879 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6ae6c334-21b5-4f64-b2c3-68f797cd363b-metrics-certs podName:6ae6c334-21b5-4f64-b2c3-68f797cd363b nodeName:}" failed. No retries permitted until 2026-04-20 14:55:22.436863019 +0000 UTC m=+6.092911316 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6ae6c334-21b5-4f64-b2c3-68f797cd363b-metrics-certs") pod "network-metrics-daemon-sjhzf" (UID: "6ae6c334-21b5-4f64-b2c3-68f797cd363b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:55:20.983047 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:20.980377 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sjhzf" Apr 20 14:55:20.983047 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:20.980529 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sjhzf" podUID="6ae6c334-21b5-4f64-b2c3-68f797cd363b" Apr 20 14:55:21.058426 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:21.057960 2570 generic.go:358] "Generic (PLEG): container finished" podID="7b77433e362f2114f13c38a959650d25" containerID="f8de706e9abd0f97cb0d951e19a6d9c607c4f8a4b65c8e90ee80455fb7d3c2fa" exitCode=0 Apr 20 14:55:21.058426 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:21.058104 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-163.ec2.internal" event={"ID":"7b77433e362f2114f13c38a959650d25","Type":"ContainerDied","Data":"f8de706e9abd0f97cb0d951e19a6d9c607c4f8a4b65c8e90ee80455fb7d3c2fa"} Apr 20 14:55:21.625138 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:21.624236 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-cnpgl"] Apr 20 14:55:21.626842 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:21.626452 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cnpgl" Apr 20 14:55:21.626842 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:21.626525 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-cnpgl" podUID="8ff97bcf-86b2-437d-aad6-c51eae0b40b1" Apr 20 14:55:21.748093 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:21.747712 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/8ff97bcf-86b2-437d-aad6-c51eae0b40b1-kubelet-config\") pod \"global-pull-secret-syncer-cnpgl\" (UID: \"8ff97bcf-86b2-437d-aad6-c51eae0b40b1\") " pod="kube-system/global-pull-secret-syncer-cnpgl" Apr 20 14:55:21.748093 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:21.747758 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8ff97bcf-86b2-437d-aad6-c51eae0b40b1-original-pull-secret\") pod \"global-pull-secret-syncer-cnpgl\" (UID: \"8ff97bcf-86b2-437d-aad6-c51eae0b40b1\") " pod="kube-system/global-pull-secret-syncer-cnpgl" Apr 20 14:55:21.748093 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:21.747813 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/8ff97bcf-86b2-437d-aad6-c51eae0b40b1-dbus\") pod \"global-pull-secret-syncer-cnpgl\" (UID: \"8ff97bcf-86b2-437d-aad6-c51eae0b40b1\") " pod="kube-system/global-pull-secret-syncer-cnpgl" Apr 20 14:55:21.849664 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:21.848864 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/8ff97bcf-86b2-437d-aad6-c51eae0b40b1-dbus\") pod \"global-pull-secret-syncer-cnpgl\" (UID: \"8ff97bcf-86b2-437d-aad6-c51eae0b40b1\") " pod="kube-system/global-pull-secret-syncer-cnpgl" Apr 20 14:55:21.849664 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:21.848951 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/8ff97bcf-86b2-437d-aad6-c51eae0b40b1-kubelet-config\") pod \"global-pull-secret-syncer-cnpgl\" (UID: \"8ff97bcf-86b2-437d-aad6-c51eae0b40b1\") " pod="kube-system/global-pull-secret-syncer-cnpgl" Apr 20 14:55:21.849664 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:21.848977 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8ff97bcf-86b2-437d-aad6-c51eae0b40b1-original-pull-secret\") pod \"global-pull-secret-syncer-cnpgl\" (UID: \"8ff97bcf-86b2-437d-aad6-c51eae0b40b1\") " pod="kube-system/global-pull-secret-syncer-cnpgl" Apr 20 14:55:21.849664 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:21.849110 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 14:55:21.849664 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:21.849169 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ff97bcf-86b2-437d-aad6-c51eae0b40b1-original-pull-secret podName:8ff97bcf-86b2-437d-aad6-c51eae0b40b1 nodeName:}" failed. No retries permitted until 2026-04-20 14:55:22.349150906 +0000 UTC m=+6.005199190 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8ff97bcf-86b2-437d-aad6-c51eae0b40b1-original-pull-secret") pod "global-pull-secret-syncer-cnpgl" (UID: "8ff97bcf-86b2-437d-aad6-c51eae0b40b1") : object "kube-system"/"original-pull-secret" not registered Apr 20 14:55:21.849664 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:21.849538 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/8ff97bcf-86b2-437d-aad6-c51eae0b40b1-dbus\") pod \"global-pull-secret-syncer-cnpgl\" (UID: \"8ff97bcf-86b2-437d-aad6-c51eae0b40b1\") " pod="kube-system/global-pull-secret-syncer-cnpgl" Apr 20 14:55:21.849664 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:21.849613 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/8ff97bcf-86b2-437d-aad6-c51eae0b40b1-kubelet-config\") pod \"global-pull-secret-syncer-cnpgl\" (UID: \"8ff97bcf-86b2-437d-aad6-c51eae0b40b1\") " pod="kube-system/global-pull-secret-syncer-cnpgl" Apr 20 14:55:21.981655 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:21.981579 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d4wt8" Apr 20 14:55:21.981813 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:21.981712 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d4wt8" podUID="dbe6bf00-4b0b-4432-80f4-1085e83c9110" Apr 20 14:55:22.064806 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:22.064142 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-163.ec2.internal" event={"ID":"7b77433e362f2114f13c38a959650d25","Type":"ContainerStarted","Data":"210c92613b2ec2b8a7ec9d4fe187a189af499e61487eaa485c36410911fc5d81"} Apr 20 14:55:22.353514 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:22.352812 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8ff97bcf-86b2-437d-aad6-c51eae0b40b1-original-pull-secret\") pod \"global-pull-secret-syncer-cnpgl\" (UID: \"8ff97bcf-86b2-437d-aad6-c51eae0b40b1\") " pod="kube-system/global-pull-secret-syncer-cnpgl" Apr 20 14:55:22.353514 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:22.353062 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 14:55:22.353514 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:22.353144 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ff97bcf-86b2-437d-aad6-c51eae0b40b1-original-pull-secret podName:8ff97bcf-86b2-437d-aad6-c51eae0b40b1 nodeName:}" failed. No retries permitted until 2026-04-20 14:55:23.353122489 +0000 UTC m=+7.009170793 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8ff97bcf-86b2-437d-aad6-c51eae0b40b1-original-pull-secret") pod "global-pull-secret-syncer-cnpgl" (UID: "8ff97bcf-86b2-437d-aad6-c51eae0b40b1") : object "kube-system"/"original-pull-secret" not registered Apr 20 14:55:22.454200 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:22.453267 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6ae6c334-21b5-4f64-b2c3-68f797cd363b-metrics-certs\") pod \"network-metrics-daemon-sjhzf\" (UID: \"6ae6c334-21b5-4f64-b2c3-68f797cd363b\") " pod="openshift-multus/network-metrics-daemon-sjhzf" Apr 20 14:55:22.454200 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:22.453379 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g9wdm\" (UniqueName: \"kubernetes.io/projected/dbe6bf00-4b0b-4432-80f4-1085e83c9110-kube-api-access-g9wdm\") pod \"network-check-target-d4wt8\" (UID: \"dbe6bf00-4b0b-4432-80f4-1085e83c9110\") " pod="openshift-network-diagnostics/network-check-target-d4wt8" Apr 20 14:55:22.454200 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:22.453560 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 14:55:22.454200 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:22.453582 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 14:55:22.454200 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:22.453606 2570 projected.go:194] Error preparing data for projected volume kube-api-access-g9wdm for pod openshift-network-diagnostics/network-check-target-d4wt8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:55:22.454200 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:22.453666 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dbe6bf00-4b0b-4432-80f4-1085e83c9110-kube-api-access-g9wdm podName:dbe6bf00-4b0b-4432-80f4-1085e83c9110 nodeName:}" failed. No retries permitted until 2026-04-20 14:55:26.453646426 +0000 UTC m=+10.109694708 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-g9wdm" (UniqueName: "kubernetes.io/projected/dbe6bf00-4b0b-4432-80f4-1085e83c9110-kube-api-access-g9wdm") pod "network-check-target-d4wt8" (UID: "dbe6bf00-4b0b-4432-80f4-1085e83c9110") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:55:22.454200 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:22.454085 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:55:22.454200 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:22.454142 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6ae6c334-21b5-4f64-b2c3-68f797cd363b-metrics-certs podName:6ae6c334-21b5-4f64-b2c3-68f797cd363b nodeName:}" failed. No retries permitted until 2026-04-20 14:55:26.45412466 +0000 UTC m=+10.110172947 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6ae6c334-21b5-4f64-b2c3-68f797cd363b-metrics-certs") pod "network-metrics-daemon-sjhzf" (UID: "6ae6c334-21b5-4f64-b2c3-68f797cd363b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:55:22.980360 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:22.980256 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cnpgl" Apr 20 14:55:22.980516 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:22.980406 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-cnpgl" podUID="8ff97bcf-86b2-437d-aad6-c51eae0b40b1" Apr 20 14:55:22.982969 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:22.982900 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sjhzf" Apr 20 14:55:22.983117 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:22.983019 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sjhzf" podUID="6ae6c334-21b5-4f64-b2c3-68f797cd363b" Apr 20 14:55:23.361560 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:23.361518 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8ff97bcf-86b2-437d-aad6-c51eae0b40b1-original-pull-secret\") pod \"global-pull-secret-syncer-cnpgl\" (UID: \"8ff97bcf-86b2-437d-aad6-c51eae0b40b1\") " pod="kube-system/global-pull-secret-syncer-cnpgl" Apr 20 14:55:23.362077 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:23.361674 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 14:55:23.362077 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:23.361740 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ff97bcf-86b2-437d-aad6-c51eae0b40b1-original-pull-secret podName:8ff97bcf-86b2-437d-aad6-c51eae0b40b1 nodeName:}" failed. No retries permitted until 2026-04-20 14:55:25.361721581 +0000 UTC m=+9.017769879 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8ff97bcf-86b2-437d-aad6-c51eae0b40b1-original-pull-secret") pod "global-pull-secret-syncer-cnpgl" (UID: "8ff97bcf-86b2-437d-aad6-c51eae0b40b1") : object "kube-system"/"original-pull-secret" not registered Apr 20 14:55:23.979434 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:23.979402 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d4wt8" Apr 20 14:55:23.979639 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:23.979509 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d4wt8" podUID="dbe6bf00-4b0b-4432-80f4-1085e83c9110" Apr 20 14:55:24.980059 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:24.979504 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cnpgl" Apr 20 14:55:24.980059 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:24.979641 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-cnpgl" podUID="8ff97bcf-86b2-437d-aad6-c51eae0b40b1" Apr 20 14:55:24.980059 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:24.979720 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sjhzf" Apr 20 14:55:24.980059 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:24.979841 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sjhzf" podUID="6ae6c334-21b5-4f64-b2c3-68f797cd363b" Apr 20 14:55:25.379124 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:25.379085 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8ff97bcf-86b2-437d-aad6-c51eae0b40b1-original-pull-secret\") pod \"global-pull-secret-syncer-cnpgl\" (UID: \"8ff97bcf-86b2-437d-aad6-c51eae0b40b1\") " pod="kube-system/global-pull-secret-syncer-cnpgl" Apr 20 14:55:25.379349 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:25.379232 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 14:55:25.379349 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:25.379323 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ff97bcf-86b2-437d-aad6-c51eae0b40b1-original-pull-secret podName:8ff97bcf-86b2-437d-aad6-c51eae0b40b1 nodeName:}" failed. No retries permitted until 2026-04-20 14:55:29.379284536 +0000 UTC m=+13.035332837 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8ff97bcf-86b2-437d-aad6-c51eae0b40b1-original-pull-secret") pod "global-pull-secret-syncer-cnpgl" (UID: "8ff97bcf-86b2-437d-aad6-c51eae0b40b1") : object "kube-system"/"original-pull-secret" not registered Apr 20 14:55:25.980017 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:25.979551 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d4wt8" Apr 20 14:55:25.980017 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:25.979689 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d4wt8" podUID="dbe6bf00-4b0b-4432-80f4-1085e83c9110" Apr 20 14:55:26.488949 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:26.488137 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6ae6c334-21b5-4f64-b2c3-68f797cd363b-metrics-certs\") pod \"network-metrics-daemon-sjhzf\" (UID: \"6ae6c334-21b5-4f64-b2c3-68f797cd363b\") " pod="openshift-multus/network-metrics-daemon-sjhzf" Apr 20 14:55:26.488949 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:26.488200 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g9wdm\" (UniqueName: \"kubernetes.io/projected/dbe6bf00-4b0b-4432-80f4-1085e83c9110-kube-api-access-g9wdm\") pod \"network-check-target-d4wt8\" (UID: \"dbe6bf00-4b0b-4432-80f4-1085e83c9110\") " pod="openshift-network-diagnostics/network-check-target-d4wt8" Apr 20 14:55:26.488949 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:26.488388 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 14:55:26.488949 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:26.488406 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 14:55:26.488949 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:26.488418 2570 projected.go:194] Error preparing data for projected volume kube-api-access-g9wdm for pod openshift-network-diagnostics/network-check-target-d4wt8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:55:26.488949 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:26.488474 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dbe6bf00-4b0b-4432-80f4-1085e83c9110-kube-api-access-g9wdm podName:dbe6bf00-4b0b-4432-80f4-1085e83c9110 nodeName:}" failed. No retries permitted until 2026-04-20 14:55:34.48845661 +0000 UTC m=+18.144504898 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-g9wdm" (UniqueName: "kubernetes.io/projected/dbe6bf00-4b0b-4432-80f4-1085e83c9110-kube-api-access-g9wdm") pod "network-check-target-d4wt8" (UID: "dbe6bf00-4b0b-4432-80f4-1085e83c9110") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:55:26.488949 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:26.488872 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:55:26.488949 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:26.488918 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6ae6c334-21b5-4f64-b2c3-68f797cd363b-metrics-certs podName:6ae6c334-21b5-4f64-b2c3-68f797cd363b nodeName:}" failed. No retries permitted until 2026-04-20 14:55:34.488903439 +0000 UTC m=+18.144951727 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6ae6c334-21b5-4f64-b2c3-68f797cd363b-metrics-certs") pod "network-metrics-daemon-sjhzf" (UID: "6ae6c334-21b5-4f64-b2c3-68f797cd363b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:55:26.981483 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:26.981445 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sjhzf" Apr 20 14:55:26.981939 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:26.981585 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sjhzf" podUID="6ae6c334-21b5-4f64-b2c3-68f797cd363b" Apr 20 14:55:26.982297 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:26.982093 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cnpgl" Apr 20 14:55:26.982297 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:26.982211 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-cnpgl" podUID="8ff97bcf-86b2-437d-aad6-c51eae0b40b1" Apr 20 14:55:27.980231 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:27.979755 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d4wt8" Apr 20 14:55:27.980231 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:27.979881 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d4wt8" podUID="dbe6bf00-4b0b-4432-80f4-1085e83c9110" Apr 20 14:55:28.980166 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:28.980136 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-cnpgl" Apr 20 14:55:28.980575 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:28.980176 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sjhzf" Apr 20 14:55:28.980575 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:28.980259 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-cnpgl" podUID="8ff97bcf-86b2-437d-aad6-c51eae0b40b1" Apr 20 14:55:28.980575 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:28.980424 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sjhzf" podUID="6ae6c334-21b5-4f64-b2c3-68f797cd363b" Apr 20 14:55:29.409208 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:29.409168 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8ff97bcf-86b2-437d-aad6-c51eae0b40b1-original-pull-secret\") pod \"global-pull-secret-syncer-cnpgl\" (UID: \"8ff97bcf-86b2-437d-aad6-c51eae0b40b1\") " pod="kube-system/global-pull-secret-syncer-cnpgl" Apr 20 14:55:29.409369 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:29.409285 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 14:55:29.409369 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:29.409362 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ff97bcf-86b2-437d-aad6-c51eae0b40b1-original-pull-secret podName:8ff97bcf-86b2-437d-aad6-c51eae0b40b1 nodeName:}" failed. No retries permitted until 2026-04-20 14:55:37.409343991 +0000 UTC m=+21.065392286 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8ff97bcf-86b2-437d-aad6-c51eae0b40b1-original-pull-secret") pod "global-pull-secret-syncer-cnpgl" (UID: "8ff97bcf-86b2-437d-aad6-c51eae0b40b1") : object "kube-system"/"original-pull-secret" not registered Apr 20 14:55:29.980130 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:29.980096 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d4wt8" Apr 20 14:55:29.980282 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:29.980222 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d4wt8" podUID="dbe6bf00-4b0b-4432-80f4-1085e83c9110" Apr 20 14:55:30.980138 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:30.980104 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-cnpgl" Apr 20 14:55:30.980340 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:30.980226 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-cnpgl" podUID="8ff97bcf-86b2-437d-aad6-c51eae0b40b1" Apr 20 14:55:30.980340 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:30.980316 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sjhzf" Apr 20 14:55:30.980701 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:30.980445 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sjhzf" podUID="6ae6c334-21b5-4f64-b2c3-68f797cd363b" Apr 20 14:55:31.980136 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:31.980102 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d4wt8" Apr 20 14:55:31.980328 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:31.980224 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d4wt8" podUID="dbe6bf00-4b0b-4432-80f4-1085e83c9110" Apr 20 14:55:32.979861 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:32.979826 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cnpgl" Apr 20 14:55:32.980342 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:32.979939 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-cnpgl" podUID="8ff97bcf-86b2-437d-aad6-c51eae0b40b1" Apr 20 14:55:32.980342 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:32.980005 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sjhzf" Apr 20 14:55:32.980342 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:32.980113 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sjhzf" podUID="6ae6c334-21b5-4f64-b2c3-68f797cd363b" Apr 20 14:55:33.979454 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:33.979424 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d4wt8" Apr 20 14:55:33.979638 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:33.979564 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d4wt8" podUID="dbe6bf00-4b0b-4432-80f4-1085e83c9110" Apr 20 14:55:34.548648 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:34.548616 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6ae6c334-21b5-4f64-b2c3-68f797cd363b-metrics-certs\") pod \"network-metrics-daemon-sjhzf\" (UID: \"6ae6c334-21b5-4f64-b2c3-68f797cd363b\") " pod="openshift-multus/network-metrics-daemon-sjhzf" Apr 20 14:55:34.549092 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:34.548668 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g9wdm\" (UniqueName: \"kubernetes.io/projected/dbe6bf00-4b0b-4432-80f4-1085e83c9110-kube-api-access-g9wdm\") pod \"network-check-target-d4wt8\" (UID: \"dbe6bf00-4b0b-4432-80f4-1085e83c9110\") " pod="openshift-network-diagnostics/network-check-target-d4wt8" Apr 20 14:55:34.549092 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:34.548779 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:55:34.549092 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:34.548793 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 14:55:34.549092 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:34.548853 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6ae6c334-21b5-4f64-b2c3-68f797cd363b-metrics-certs podName:6ae6c334-21b5-4f64-b2c3-68f797cd363b nodeName:}" failed. No retries permitted until 2026-04-20 14:55:50.548833553 +0000 UTC m=+34.204881860 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6ae6c334-21b5-4f64-b2c3-68f797cd363b-metrics-certs") pod "network-metrics-daemon-sjhzf" (UID: "6ae6c334-21b5-4f64-b2c3-68f797cd363b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:55:34.549092 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:34.548871 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 14:55:34.549092 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:34.548948 2570 projected.go:194] Error preparing data for projected volume kube-api-access-g9wdm for pod openshift-network-diagnostics/network-check-target-d4wt8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:55:34.549092 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:34.549008 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dbe6bf00-4b0b-4432-80f4-1085e83c9110-kube-api-access-g9wdm podName:dbe6bf00-4b0b-4432-80f4-1085e83c9110 nodeName:}" failed. 
No retries permitted until 2026-04-20 14:55:50.548991129 +0000 UTC m=+34.205039428 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-g9wdm" (UniqueName: "kubernetes.io/projected/dbe6bf00-4b0b-4432-80f4-1085e83c9110-kube-api-access-g9wdm") pod "network-check-target-d4wt8" (UID: "dbe6bf00-4b0b-4432-80f4-1085e83c9110") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:55:34.979958 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:34.979158 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cnpgl" Apr 20 14:55:34.979958 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:34.979939 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sjhzf" Apr 20 14:55:34.980168 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:34.980060 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-cnpgl" podUID="8ff97bcf-86b2-437d-aad6-c51eae0b40b1" Apr 20 14:55:34.980258 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:34.980234 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sjhzf" podUID="6ae6c334-21b5-4f64-b2c3-68f797cd363b" Apr 20 14:55:35.979591 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:35.979560 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d4wt8" Apr 20 14:55:35.980035 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:35.979672 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d4wt8" podUID="dbe6bf00-4b0b-4432-80f4-1085e83c9110" Apr 20 14:55:36.980462 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:36.980439 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cnpgl" Apr 20 14:55:36.980757 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:36.980518 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-cnpgl" podUID="8ff97bcf-86b2-437d-aad6-c51eae0b40b1" Apr 20 14:55:36.980757 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:36.980532 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-sjhzf" Apr 20 14:55:36.980757 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:36.980649 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sjhzf" podUID="6ae6c334-21b5-4f64-b2c3-68f797cd363b" Apr 20 14:55:37.470026 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:37.469865 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8ff97bcf-86b2-437d-aad6-c51eae0b40b1-original-pull-secret\") pod \"global-pull-secret-syncer-cnpgl\" (UID: \"8ff97bcf-86b2-437d-aad6-c51eae0b40b1\") " pod="kube-system/global-pull-secret-syncer-cnpgl" Apr 20 14:55:37.470108 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:37.469996 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 14:55:37.470108 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:37.470093 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8ff97bcf-86b2-437d-aad6-c51eae0b40b1-original-pull-secret podName:8ff97bcf-86b2-437d-aad6-c51eae0b40b1 nodeName:}" failed. No retries permitted until 2026-04-20 14:55:53.470080755 +0000 UTC m=+37.126129048 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/8ff97bcf-86b2-437d-aad6-c51eae0b40b1-original-pull-secret") pod "global-pull-secret-syncer-cnpgl" (UID: "8ff97bcf-86b2-437d-aad6-c51eae0b40b1") : object "kube-system"/"original-pull-secret" not registered Apr 20 14:55:37.980160 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:37.980130 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d4wt8" Apr 20 14:55:37.980364 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:37.980253 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-d4wt8" podUID="dbe6bf00-4b0b-4432-80f4-1085e83c9110" Apr 20 14:55:38.095440 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:38.095405 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-9vtqc" event={"ID":"50d12be2-afb1-4257-895a-8f2eed4865c3","Type":"ContainerStarted","Data":"aa128b1614ef4b9285be1bb7c7a574284bc1ef3f298ebda148941f5ed6c86b93"} Apr 20 14:55:38.096967 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:38.096936 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-df9th" event={"ID":"d0ae6812-a112-4eba-84ef-a2eeea69630a","Type":"ContainerStarted","Data":"0364edf49dbf2610094155564f9c47f35186863f42eaa668090cbde0759ea483"} Apr 20 14:55:38.098367 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:38.098330 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-h82ph" event={"ID":"3661ad3f-53ca-47ec-8a9b-15e3d3f054bd","Type":"ContainerStarted","Data":"767315f7d226c395d534d867a8a09673363a0cc7f76d180b640a081d54149226"} Apr 20 14:55:38.099835 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:38.099790 2570 generic.go:358] "Generic (PLEG): container finished" podID="4aa972e0-3242-4b0c-87e7-b4ebc421bbce" containerID="a937ccffd7811b18cf8b210afcdbe1ba3e4c21f00f3d05bfb8b5542c77a70fcb" exitCode=0 Apr 20 14:55:38.100100 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:38.100079 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lstvb" event={"ID":"4aa972e0-3242-4b0c-87e7-b4ebc421bbce","Type":"ContainerDied","Data":"a937ccffd7811b18cf8b210afcdbe1ba3e4c21f00f3d05bfb8b5542c77a70fcb"} Apr 20 14:55:38.102873 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:38.102775 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9x87_f6944a1f-03f8-4115-899e-e5c61d0d6075/ovn-acl-logging/0.log" Apr 20 14:55:38.103134 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:38.103105 2570 generic.go:358] "Generic (PLEG): container finished" podID="f6944a1f-03f8-4115-899e-e5c61d0d6075" containerID="731a9d51e4dc4b2c3143556dc6411e02e3f68011c5369664a64b466ce6ac851e" exitCode=1 Apr 20 14:55:38.103229 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:38.103186 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9x87" event={"ID":"f6944a1f-03f8-4115-899e-e5c61d0d6075","Type":"ContainerStarted","Data":"37d3e24d016ec66329f766896566a043c05d52c20f09179ecacc4ff5db2c5553"} Apr 20 14:55:38.103229 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:38.103213 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9x87" event={"ID":"f6944a1f-03f8-4115-899e-e5c61d0d6075","Type":"ContainerStarted","Data":"c454163f6198e55b8889e0aaaad12eb69b3609378953641bbc2299c86bc1693c"} Apr 20 14:55:38.103229 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:38.103227 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9x87" event={"ID":"f6944a1f-03f8-4115-899e-e5c61d0d6075","Type":"ContainerStarted","Data":"979c7f9a48eaa0add6590eb5edfb8a20d8e60bc3ff9f9cc5fa6bb71b9abd10a4"} Apr 20 14:55:38.103379 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:38.103240 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9x87" 
event={"ID":"f6944a1f-03f8-4115-899e-e5c61d0d6075","Type":"ContainerStarted","Data":"bd734a910a1e26b93bbd7d12d8099bb469bce2d34d8018f1f9e6eb7f5588b2f6"} Apr 20 14:55:38.103379 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:38.103252 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9x87" event={"ID":"f6944a1f-03f8-4115-899e-e5c61d0d6075","Type":"ContainerDied","Data":"731a9d51e4dc4b2c3143556dc6411e02e3f68011c5369664a64b466ce6ac851e"} Apr 20 14:55:38.103379 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:38.103267 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9x87" event={"ID":"f6944a1f-03f8-4115-899e-e5c61d0d6075","Type":"ContainerStarted","Data":"ecb6d352e16a137a0971fb4503f0b05ef581441d83e5accd7590523deda47322"} Apr 20 14:55:38.104627 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:38.104601 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-zb7gn" event={"ID":"f6c014f8-befe-4916-a8ed-bc592d3baacf","Type":"ContainerStarted","Data":"987037220f4625e955aa52a9927ff444796a2973664bd131bfc8576fc83062a6"} Apr 20 14:55:38.105848 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:38.105826 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-chk28" event={"ID":"2db4df8a-cdb6-4503-9793-bc14f5983e3e","Type":"ContainerStarted","Data":"69fd572d31bb4891bd2e80230e2ac7d58c232570beb8ce6322be7ef9065a023e"} Apr 20 14:55:38.107061 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:38.107042 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-s6x9l" event={"ID":"48771f49-ba8f-4d89-b489-7d4f8dbd6d3b","Type":"ContainerStarted","Data":"8a2f8fa77b066a0eb78d52f96a5245f08108413045ce6797903b0ee2ea755586"} Apr 20 14:55:38.110263 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:38.110227 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-163.ec2.internal" podStartSLOduration=20.110217054 podStartE2EDuration="20.110217054s" podCreationTimestamp="2026-04-20 14:55:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 14:55:22.08243948 +0000 UTC m=+5.738487783" watchObservedRunningTime="2026-04-20 14:55:38.110217054 +0000 UTC m=+21.766265357" Apr 20 14:55:38.110541 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:38.110514 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-9vtqc" podStartSLOduration=4.749259236 podStartE2EDuration="22.110507574s" podCreationTimestamp="2026-04-20 14:55:16 +0000 UTC" firstStartedPulling="2026-04-20 14:55:19.650925023 +0000 UTC m=+3.306973303" lastFinishedPulling="2026-04-20 14:55:37.012173346 +0000 UTC m=+20.668221641" observedRunningTime="2026-04-20 14:55:38.109916471 +0000 UTC m=+21.765964786" watchObservedRunningTime="2026-04-20 14:55:38.110507574 +0000 UTC m=+21.766555872" Apr 20 14:55:38.126187 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:38.126149 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-chk28" podStartSLOduration=3.723367134 podStartE2EDuration="21.126139076s" podCreationTimestamp="2026-04-20 14:55:17 +0000 UTC" firstStartedPulling="2026-04-20 14:55:19.638429463 +0000 UTC m=+3.294477742" lastFinishedPulling="2026-04-20 14:55:37.041201401 +0000 UTC 
m=+20.697249684" observedRunningTime="2026-04-20 14:55:38.125655712 +0000 UTC m=+21.781704016" watchObservedRunningTime="2026-04-20 14:55:38.126139076 +0000 UTC m=+21.782187377" Apr 20 14:55:38.138865 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:38.138825 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-h82ph" podStartSLOduration=3.774293048 podStartE2EDuration="21.138814332s" podCreationTimestamp="2026-04-20 14:55:17 +0000 UTC" firstStartedPulling="2026-04-20 14:55:19.648048855 +0000 UTC m=+3.304097139" lastFinishedPulling="2026-04-20 14:55:37.012570126 +0000 UTC m=+20.668618423" observedRunningTime="2026-04-20 14:55:38.138426638 +0000 UTC m=+21.794474922" watchObservedRunningTime="2026-04-20 14:55:38.138814332 +0000 UTC m=+21.794862634" Apr 20 14:55:38.153454 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:38.153413 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-s6x9l" podStartSLOduration=3.778450701 podStartE2EDuration="21.153400828s" podCreationTimestamp="2026-04-20 14:55:17 +0000 UTC" firstStartedPulling="2026-04-20 14:55:19.637245184 +0000 UTC m=+3.293293466" lastFinishedPulling="2026-04-20 14:55:37.012195313 +0000 UTC m=+20.668243593" observedRunningTime="2026-04-20 14:55:38.153290793 +0000 UTC m=+21.809339096" watchObservedRunningTime="2026-04-20 14:55:38.153400828 +0000 UTC m=+21.809449131" Apr 20 14:55:38.166805 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:38.166755 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-zb7gn" podStartSLOduration=3.770790556 podStartE2EDuration="21.16674244s" podCreationTimestamp="2026-04-20 14:55:17 +0000 UTC" firstStartedPulling="2026-04-20 14:55:19.641529025 +0000 UTC m=+3.297577307" lastFinishedPulling="2026-04-20 14:55:37.037480908 +0000 UTC m=+20.693529191" observedRunningTime="2026-04-20 14:55:38.166198728 +0000 UTC m=+21.822247031" watchObservedRunningTime="2026-04-20 14:55:38.16674244 +0000 UTC m=+21.822790747" Apr 20 14:55:38.294860 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:38.294829 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-9vtqc" Apr 20 14:55:38.295407 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:38.295391 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-9vtqc" Apr 20 14:55:38.719709 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:38.719689 2570 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 20 14:55:38.881477 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:38.881379 2570 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-20T14:55:38.719706176Z","UUID":"0e39b3c9-18bd-4a96-a1fa-eb7706ae0852","Handler":null,"Name":"","Endpoint":""} Apr 20 14:55:38.883533 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:38.883508 2570 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 20 14:55:38.883661 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:38.883540 2570 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: 
/var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 20 14:55:38.979803 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:38.979735 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cnpgl" Apr 20 14:55:38.979944 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:38.979840 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-cnpgl" podUID="8ff97bcf-86b2-437d-aad6-c51eae0b40b1" Apr 20 14:55:38.979944 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:38.979735 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sjhzf" Apr 20 14:55:38.980063 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:38.979984 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sjhzf" podUID="6ae6c334-21b5-4f64-b2c3-68f797cd363b" Apr 20 14:55:39.111220 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:39.111186 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-df9th" event={"ID":"d0ae6812-a112-4eba-84ef-a2eeea69630a","Type":"ContainerStarted","Data":"c3106bcbfabcebb5cc7b9b566a2b5f391d6e11349cc09f5376bfd7b60f4b1004"} Apr 20 14:55:39.113979 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:39.113951 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-s775f" event={"ID":"db4252bf-5e13-4727-a83a-7f87874cf5c4","Type":"ContainerStarted","Data":"b8eebaa0d0689e1d2c6fc0439356ff650186cd31c99520b50a8cb8537a8e66b6"} Apr 20 14:55:39.114086 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:39.113992 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-9vtqc" Apr 20 14:55:39.114460 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:39.114428 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-9vtqc" Apr 20 14:55:39.142276 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:39.142227 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-s775f" podStartSLOduration=5.571039749 podStartE2EDuration="23.142211047s" podCreationTimestamp="2026-04-20 14:55:16 +0000 UTC" firstStartedPulling="2026-04-20 14:55:19.641527061 +0000 UTC m=+3.297575352" lastFinishedPulling="2026-04-20 14:55:37.212698357 +0000 UTC m=+20.868746650" observedRunningTime="2026-04-20 14:55:39.129124912 +0000 UTC m=+22.785173216" watchObservedRunningTime="2026-04-20 14:55:39.142211047 +0000 UTC m=+22.798259350" Apr 20 14:55:39.979848 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:39.979826 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d4wt8" Apr 20 14:55:39.979982 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:39.979947 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d4wt8" podUID="dbe6bf00-4b0b-4432-80f4-1085e83c9110" Apr 20 14:55:40.117223 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:40.116978 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-df9th" event={"ID":"d0ae6812-a112-4eba-84ef-a2eeea69630a","Type":"ContainerStarted","Data":"08fcd3023181d0ce63fa874c8fd03b54659735a869033a7692d5d540b7f93d01"} Apr 20 14:55:40.120863 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:40.120786 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9x87_f6944a1f-03f8-4115-899e-e5c61d0d6075/ovn-acl-logging/0.log" Apr 20 14:55:40.121485 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:40.121085 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9x87" event={"ID":"f6944a1f-03f8-4115-899e-e5c61d0d6075","Type":"ContainerStarted","Data":"e02e2420d17976f64a74e2882416399ae25abc7dfb3420e5030e48e13f751288"} Apr 20 14:55:40.133899 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:40.133861 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-df9th" podStartSLOduration=2.852352303 podStartE2EDuration="23.133848535s" podCreationTimestamp="2026-04-20 14:55:17 +0000 UTC" firstStartedPulling="2026-04-20 14:55:19.65095824 +0000 UTC m=+3.307006522" lastFinishedPulling="2026-04-20 14:55:39.932454469 +0000 UTC m=+23.588502754" observedRunningTime="2026-04-20 14:55:40.133826258 +0000 UTC m=+23.789874561" watchObservedRunningTime="2026-04-20 14:55:40.133848535 +0000 UTC m=+23.789896818" Apr 20 14:55:40.979831 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:40.979802 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cnpgl" Apr 20 14:55:40.980044 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:40.979918 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-cnpgl" podUID="8ff97bcf-86b2-437d-aad6-c51eae0b40b1" Apr 20 14:55:40.980044 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:40.979974 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sjhzf" Apr 20 14:55:40.980153 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:40.980079 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sjhzf" podUID="6ae6c334-21b5-4f64-b2c3-68f797cd363b" Apr 20 14:55:41.979593 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:41.979556 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d4wt8" Apr 20 14:55:41.980153 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:41.979660 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d4wt8" podUID="dbe6bf00-4b0b-4432-80f4-1085e83c9110" Apr 20 14:55:42.980414 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:42.980100 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sjhzf" Apr 20 14:55:42.980910 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:42.980139 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cnpgl" Apr 20 14:55:42.980910 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:42.980493 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sjhzf" podUID="6ae6c334-21b5-4f64-b2c3-68f797cd363b" Apr 20 14:55:42.980910 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:42.980594 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-cnpgl" podUID="8ff97bcf-86b2-437d-aad6-c51eae0b40b1" Apr 20 14:55:43.127325 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:43.127273 2570 generic.go:358] "Generic (PLEG): container finished" podID="4aa972e0-3242-4b0c-87e7-b4ebc421bbce" containerID="d73e4f2d0325d7af07153f97aa2387f4093ade9b7014aca821a528a5cb212a1f" exitCode=0 Apr 20 14:55:43.127491 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:43.127363 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lstvb" event={"ID":"4aa972e0-3242-4b0c-87e7-b4ebc421bbce","Type":"ContainerDied","Data":"d73e4f2d0325d7af07153f97aa2387f4093ade9b7014aca821a528a5cb212a1f"} Apr 20 14:55:43.130319 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:43.130208 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9x87_f6944a1f-03f8-4115-899e-e5c61d0d6075/ovn-acl-logging/0.log" Apr 20 14:55:43.130564 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:43.130546 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9x87" event={"ID":"f6944a1f-03f8-4115-899e-e5c61d0d6075","Type":"ContainerStarted","Data":"cf4154490a11056c53c859d65adf2f5cdee4ea8da30b12c97244132d9e1d1c3b"} Apr 20 14:55:43.130783 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:43.130768 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-g9x87" Apr 20 14:55:43.130833 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:43.130791 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-g9x87" Apr 20 14:55:43.131003 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:43.130988 2570 scope.go:117] "RemoveContainer" containerID="731a9d51e4dc4b2c3143556dc6411e02e3f68011c5369664a64b466ce6ac851e" Apr 20 14:55:43.145956 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:43.145928 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-g9x87" Apr 20 14:55:43.979695 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:43.979664 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d4wt8" Apr 20 14:55:43.979841 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:43.979802 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-d4wt8" podUID="dbe6bf00-4b0b-4432-80f4-1085e83c9110" Apr 20 14:55:44.133701 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:44.133615 2570 generic.go:358] "Generic (PLEG): container finished" podID="4aa972e0-3242-4b0c-87e7-b4ebc421bbce" containerID="cfef14d18877b401d7ac716ab99a92f54347758df5bdc99d376b5ac0ccc5d7e1" exitCode=0 Apr 20 14:55:44.133701 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:44.133688 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lstvb" event={"ID":"4aa972e0-3242-4b0c-87e7-b4ebc421bbce","Type":"ContainerDied","Data":"cfef14d18877b401d7ac716ab99a92f54347758df5bdc99d376b5ac0ccc5d7e1"} Apr 20 14:55:44.137018 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:44.136998 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9x87_f6944a1f-03f8-4115-899e-e5c61d0d6075/ovn-acl-logging/0.log" Apr 20 14:55:44.137388 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:44.137368 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-g9x87" event={"ID":"f6944a1f-03f8-4115-899e-e5c61d0d6075","Type":"ContainerStarted","Data":"52b3a341bb9aadf9399b9ed7949c99a43f2d2e84ccbbea11bf17b7211612d661"} Apr 20 14:55:44.137754 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:44.137732 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-g9x87" Apr 20 14:55:44.152619 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:44.152429 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-g9x87" Apr 20 14:55:44.181391 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:44.181355 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-g9x87" podStartSLOduration=9.758516421 podStartE2EDuration="27.181343898s" podCreationTimestamp="2026-04-20 14:55:17 +0000 UTC" firstStartedPulling="2026-04-20 14:55:19.644775398 +0000 UTC m=+3.300823695" lastFinishedPulling="2026-04-20 14:55:37.067602892 +0000 UTC m=+20.723651172" observedRunningTime="2026-04-20 14:55:44.17990496 +0000 UTC m=+27.835953267" watchObservedRunningTime="2026-04-20 14:55:44.181343898 +0000 UTC m=+27.837392199" Apr 20 14:55:44.205816 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:44.205787 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-cnpgl"] Apr 20 14:55:44.205927 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:44.205896 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cnpgl" Apr 20 14:55:44.205984 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:44.205968 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-cnpgl" podUID="8ff97bcf-86b2-437d-aad6-c51eae0b40b1" Apr 20 14:55:44.209325 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:44.209283 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-d4wt8"] Apr 20 14:55:44.209428 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:44.209410 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d4wt8" Apr 20 14:55:44.209521 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:44.209502 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d4wt8" podUID="dbe6bf00-4b0b-4432-80f4-1085e83c9110" Apr 20 14:55:44.209897 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:44.209874 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-sjhzf"] Apr 20 14:55:44.209987 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:44.209975 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sjhzf" Apr 20 14:55:44.210070 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:44.210055 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sjhzf" podUID="6ae6c334-21b5-4f64-b2c3-68f797cd363b" Apr 20 14:55:45.141114 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:45.141076 2570 generic.go:358] "Generic (PLEG): container finished" podID="4aa972e0-3242-4b0c-87e7-b4ebc421bbce" containerID="b0e9f17590eef27268a9a31e82d8242171a2d49b4b9d0acacc21a0572a6b637d" exitCode=0 Apr 20 14:55:45.141569 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:45.141149 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lstvb" event={"ID":"4aa972e0-3242-4b0c-87e7-b4ebc421bbce","Type":"ContainerDied","Data":"b0e9f17590eef27268a9a31e82d8242171a2d49b4b9d0acacc21a0572a6b637d"} Apr 20 14:55:45.979731 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:45.979703 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d4wt8" Apr 20 14:55:45.979933 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:45.979841 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sjhzf" Apr 20 14:55:45.979933 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:45.979861 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-d4wt8" podUID="dbe6bf00-4b0b-4432-80f4-1085e83c9110" Apr 20 14:55:45.980040 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:45.979977 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sjhzf" podUID="6ae6c334-21b5-4f64-b2c3-68f797cd363b" Apr 20 14:55:45.980040 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:45.980012 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cnpgl" Apr 20 14:55:45.980150 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:45.980082 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-cnpgl" podUID="8ff97bcf-86b2-437d-aad6-c51eae0b40b1" Apr 20 14:55:47.979798 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:47.979722 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d4wt8" Apr 20 14:55:47.979798 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:47.979769 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cnpgl" Apr 20 14:55:47.980553 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:47.979882 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-d4wt8" podUID="dbe6bf00-4b0b-4432-80f4-1085e83c9110" Apr 20 14:55:47.980553 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:47.979940 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-cnpgl" podUID="8ff97bcf-86b2-437d-aad6-c51eae0b40b1" Apr 20 14:55:47.980553 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:47.979967 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sjhzf" Apr 20 14:55:47.980553 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:47.980059 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sjhzf" podUID="6ae6c334-21b5-4f64-b2c3-68f797cd363b" Apr 20 14:55:49.980092 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:49.980012 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-sjhzf" Apr 20 14:55:49.980092 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:49.980031 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cnpgl" Apr 20 14:55:49.980801 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:49.980138 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sjhzf" podUID="6ae6c334-21b5-4f64-b2c3-68f797cd363b" Apr 20 14:55:49.980801 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:49.980603 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-cnpgl" podUID="8ff97bcf-86b2-437d-aad6-c51eae0b40b1" Apr 20 14:55:49.980801 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:49.980642 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d4wt8" Apr 20 14:55:49.980801 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:49.980694 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-d4wt8" podUID="dbe6bf00-4b0b-4432-80f4-1085e83c9110" Apr 20 14:55:50.565715 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:50.565675 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6ae6c334-21b5-4f64-b2c3-68f797cd363b-metrics-certs\") pod \"network-metrics-daemon-sjhzf\" (UID: \"6ae6c334-21b5-4f64-b2c3-68f797cd363b\") " pod="openshift-multus/network-metrics-daemon-sjhzf" Apr 20 14:55:50.565901 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:50.565728 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g9wdm\" (UniqueName: \"kubernetes.io/projected/dbe6bf00-4b0b-4432-80f4-1085e83c9110-kube-api-access-g9wdm\") pod \"network-check-target-d4wt8\" (UID: \"dbe6bf00-4b0b-4432-80f4-1085e83c9110\") " pod="openshift-network-diagnostics/network-check-target-d4wt8" Apr 20 14:55:50.565901 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:50.565841 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:55:50.565901 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:50.565867 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 14:55:50.565901 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:50.565886 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 14:55:50.565901 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:50.565900 2570 projected.go:194] Error preparing data for projected volume kube-api-access-g9wdm for pod openshift-network-diagnostics/network-check-target-d4wt8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:55:50.566090 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:50.565910 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6ae6c334-21b5-4f64-b2c3-68f797cd363b-metrics-certs podName:6ae6c334-21b5-4f64-b2c3-68f797cd363b nodeName:}" failed. No retries permitted until 2026-04-20 14:56:22.565894356 +0000 UTC m=+66.221942635 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6ae6c334-21b5-4f64-b2c3-68f797cd363b-metrics-certs") pod "network-metrics-daemon-sjhzf" (UID: "6ae6c334-21b5-4f64-b2c3-68f797cd363b") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 14:55:50.566090 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:50.565948 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/dbe6bf00-4b0b-4432-80f4-1085e83c9110-kube-api-access-g9wdm podName:dbe6bf00-4b0b-4432-80f4-1085e83c9110 nodeName:}" failed. No retries permitted until 2026-04-20 14:56:22.565933105 +0000 UTC m=+66.221981401 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-g9wdm" (UniqueName: "kubernetes.io/projected/dbe6bf00-4b0b-4432-80f4-1085e83c9110-kube-api-access-g9wdm") pod "network-check-target-d4wt8" (UID: "dbe6bf00-4b0b-4432-80f4-1085e83c9110") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 14:55:51.128512 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:51.128487 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-163.ec2.internal" event="NodeReady" Apr 20 14:55:51.128803 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:51.128619 2570 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 20 14:55:51.174628 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:51.174595 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-jbtsc"] Apr 20 14:55:51.196141 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:51.196104 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-9dz42"] Apr 20 14:55:51.196357 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:51.196327 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-jbtsc" Apr 20 14:55:51.198821 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:51.198796 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-tr5vz\"" Apr 20 14:55:51.198950 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:51.198796 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 20 14:55:51.198950 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:51.198796 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 20 14:55:51.232536 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:51.232497 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-jbtsc"] Apr 20 14:55:51.232689 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:51.232547 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-9dz42"] Apr 20 14:55:51.232689 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:51.232610 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9dz42" Apr 20 14:55:51.235461 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:51.235428 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 20 14:55:51.235461 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:51.235440 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 20 14:55:51.235461 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:51.235431 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 20 14:55:51.235720 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:51.235550 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-xg95b\"" Apr 20 14:55:51.270235 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:51.270201 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfrrv\" (UniqueName: \"kubernetes.io/projected/7c674737-9de4-4df3-8cd4-de9165e6e70a-kube-api-access-zfrrv\") pod \"dns-default-jbtsc\" (UID: \"7c674737-9de4-4df3-8cd4-de9165e6e70a\") " pod="openshift-dns/dns-default-jbtsc" Apr 20 14:55:51.270235 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:51.270232 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7c674737-9de4-4df3-8cd4-de9165e6e70a-metrics-tls\") pod \"dns-default-jbtsc\" (UID: \"7c674737-9de4-4df3-8cd4-de9165e6e70a\") " pod="openshift-dns/dns-default-jbtsc" Apr 20 14:55:51.270447 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:51.270280 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7c674737-9de4-4df3-8cd4-de9165e6e70a-tmp-dir\") pod \"dns-default-jbtsc\" (UID: \"7c674737-9de4-4df3-8cd4-de9165e6e70a\") " pod="openshift-dns/dns-default-jbtsc" Apr 20 14:55:51.270447 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:51.270311 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7c674737-9de4-4df3-8cd4-de9165e6e70a-config-volume\") pod \"dns-default-jbtsc\" (UID: \"7c674737-9de4-4df3-8cd4-de9165e6e70a\") " pod="openshift-dns/dns-default-jbtsc" Apr 20 14:55:51.371340 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:51.371283 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7c674737-9de4-4df3-8cd4-de9165e6e70a-tmp-dir\") pod \"dns-default-jbtsc\" (UID: \"7c674737-9de4-4df3-8cd4-de9165e6e70a\") " pod="openshift-dns/dns-default-jbtsc" Apr 20 14:55:51.371340 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:51.371343 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7c674737-9de4-4df3-8cd4-de9165e6e70a-config-volume\") pod \"dns-default-jbtsc\" (UID: \"7c674737-9de4-4df3-8cd4-de9165e6e70a\") " pod="openshift-dns/dns-default-jbtsc" Apr 20 14:55:51.371584 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:51.371378 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfmm2\" (UniqueName: 
\"kubernetes.io/projected/7fe57737-4cb8-41e4-95a2-77878dc0e909-kube-api-access-dfmm2\") pod \"ingress-canary-9dz42\" (UID: \"7fe57737-4cb8-41e4-95a2-77878dc0e909\") " pod="openshift-ingress-canary/ingress-canary-9dz42" Apr 20 14:55:51.371584 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:51.371420 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zfrrv\" (UniqueName: \"kubernetes.io/projected/7c674737-9de4-4df3-8cd4-de9165e6e70a-kube-api-access-zfrrv\") pod \"dns-default-jbtsc\" (UID: \"7c674737-9de4-4df3-8cd4-de9165e6e70a\") " pod="openshift-dns/dns-default-jbtsc" Apr 20 14:55:51.371584 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:51.371438 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7fe57737-4cb8-41e4-95a2-77878dc0e909-cert\") pod \"ingress-canary-9dz42\" (UID: \"7fe57737-4cb8-41e4-95a2-77878dc0e909\") " pod="openshift-ingress-canary/ingress-canary-9dz42" Apr 20 14:55:51.371584 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:51.371464 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7c674737-9de4-4df3-8cd4-de9165e6e70a-metrics-tls\") pod \"dns-default-jbtsc\" (UID: \"7c674737-9de4-4df3-8cd4-de9165e6e70a\") " pod="openshift-dns/dns-default-jbtsc" Apr 20 14:55:51.371584 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:51.371562 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 14:55:51.371829 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:51.371623 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c674737-9de4-4df3-8cd4-de9165e6e70a-metrics-tls podName:7c674737-9de4-4df3-8cd4-de9165e6e70a nodeName:}" failed. No retries permitted until 2026-04-20 14:55:51.871604482 +0000 UTC m=+35.527652801 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7c674737-9de4-4df3-8cd4-de9165e6e70a-metrics-tls") pod "dns-default-jbtsc" (UID: "7c674737-9de4-4df3-8cd4-de9165e6e70a") : secret "dns-default-metrics-tls" not found Apr 20 14:55:51.371829 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:51.371635 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7c674737-9de4-4df3-8cd4-de9165e6e70a-tmp-dir\") pod \"dns-default-jbtsc\" (UID: \"7c674737-9de4-4df3-8cd4-de9165e6e70a\") " pod="openshift-dns/dns-default-jbtsc" Apr 20 14:55:51.371994 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:51.371971 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7c674737-9de4-4df3-8cd4-de9165e6e70a-config-volume\") pod \"dns-default-jbtsc\" (UID: \"7c674737-9de4-4df3-8cd4-de9165e6e70a\") " pod="openshift-dns/dns-default-jbtsc" Apr 20 14:55:51.385298 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:51.385120 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfrrv\" (UniqueName: \"kubernetes.io/projected/7c674737-9de4-4df3-8cd4-de9165e6e70a-kube-api-access-zfrrv\") pod \"dns-default-jbtsc\" (UID: \"7c674737-9de4-4df3-8cd4-de9165e6e70a\") " pod="openshift-dns/dns-default-jbtsc" Apr 20 14:55:51.472001 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:51.471912 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dfmm2\" (UniqueName: \"kubernetes.io/projected/7fe57737-4cb8-41e4-95a2-77878dc0e909-kube-api-access-dfmm2\") pod \"ingress-canary-9dz42\" (UID: \"7fe57737-4cb8-41e4-95a2-77878dc0e909\") " pod="openshift-ingress-canary/ingress-canary-9dz42" Apr 20 14:55:51.472001 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:51.471984 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7fe57737-4cb8-41e4-95a2-77878dc0e909-cert\") pod \"ingress-canary-9dz42\" (UID: \"7fe57737-4cb8-41e4-95a2-77878dc0e909\") " pod="openshift-ingress-canary/ingress-canary-9dz42" Apr 20 14:55:51.472173 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:51.472095 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 14:55:51.472173 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:51.472165 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7fe57737-4cb8-41e4-95a2-77878dc0e909-cert podName:7fe57737-4cb8-41e4-95a2-77878dc0e909 nodeName:}" failed. No retries permitted until 2026-04-20 14:55:51.972146263 +0000 UTC m=+35.628194564 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7fe57737-4cb8-41e4-95a2-77878dc0e909-cert") pod "ingress-canary-9dz42" (UID: "7fe57737-4cb8-41e4-95a2-77878dc0e909") : secret "canary-serving-cert" not found Apr 20 14:55:51.481803 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:51.481770 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfmm2\" (UniqueName: \"kubernetes.io/projected/7fe57737-4cb8-41e4-95a2-77878dc0e909-kube-api-access-dfmm2\") pod \"ingress-canary-9dz42\" (UID: \"7fe57737-4cb8-41e4-95a2-77878dc0e909\") " pod="openshift-ingress-canary/ingress-canary-9dz42" Apr 20 14:55:51.874126 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:51.874082 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7c674737-9de4-4df3-8cd4-de9165e6e70a-metrics-tls\") pod \"dns-default-jbtsc\" (UID: \"7c674737-9de4-4df3-8cd4-de9165e6e70a\") " pod="openshift-dns/dns-default-jbtsc" Apr 20 14:55:51.874326 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:51.874200 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 14:55:51.874326 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:51.874250 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c674737-9de4-4df3-8cd4-de9165e6e70a-metrics-tls podName:7c674737-9de4-4df3-8cd4-de9165e6e70a nodeName:}" failed. No retries permitted until 2026-04-20 14:55:52.874235955 +0000 UTC m=+36.530284235 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7c674737-9de4-4df3-8cd4-de9165e6e70a-metrics-tls") pod "dns-default-jbtsc" (UID: "7c674737-9de4-4df3-8cd4-de9165e6e70a") : secret "dns-default-metrics-tls" not found Apr 20 14:55:51.975282 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:51.975245 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7fe57737-4cb8-41e4-95a2-77878dc0e909-cert\") pod \"ingress-canary-9dz42\" (UID: \"7fe57737-4cb8-41e4-95a2-77878dc0e909\") " pod="openshift-ingress-canary/ingress-canary-9dz42" Apr 20 14:55:51.975457 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:51.975420 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 14:55:51.975541 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:51.975496 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7fe57737-4cb8-41e4-95a2-77878dc0e909-cert podName:7fe57737-4cb8-41e4-95a2-77878dc0e909 nodeName:}" failed. No retries permitted until 2026-04-20 14:55:52.975475593 +0000 UTC m=+36.631523874 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7fe57737-4cb8-41e4-95a2-77878dc0e909-cert") pod "ingress-canary-9dz42" (UID: "7fe57737-4cb8-41e4-95a2-77878dc0e909") : secret "canary-serving-cert" not found Apr 20 14:55:51.979336 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:51.979293 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d4wt8" Apr 20 14:55:51.979470 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:51.979296 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-sjhzf" Apr 20 14:55:51.979534 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:51.979296 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-cnpgl" Apr 20 14:55:51.981836 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:51.981812 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 20 14:55:51.981836 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:51.981815 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 20 14:55:51.982195 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:51.982179 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-6x78v\"" Apr 20 14:55:51.982854 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:51.982833 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 20 14:55:51.982854 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:51.982845 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 20 14:55:51.982854 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:51.982851 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-6rqtq\"" Apr 20 14:55:52.159743 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:52.159657 2570 generic.go:358] "Generic (PLEG): container finished" podID="4aa972e0-3242-4b0c-87e7-b4ebc421bbce" containerID="d6e8f9e227180d863ed115a717c538ca2b2c95f041df5a05b40ad7a93c5b3bec" exitCode=0 Apr 20 14:55:52.159743 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:52.159718 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lstvb" event={"ID":"4aa972e0-3242-4b0c-87e7-b4ebc421bbce","Type":"ContainerDied","Data":"d6e8f9e227180d863ed115a717c538ca2b2c95f041df5a05b40ad7a93c5b3bec"} Apr 20 14:55:52.882401 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:52.882352 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7c674737-9de4-4df3-8cd4-de9165e6e70a-metrics-tls\") pod \"dns-default-jbtsc\" (UID: \"7c674737-9de4-4df3-8cd4-de9165e6e70a\") " pod="openshift-dns/dns-default-jbtsc" Apr 20 14:55:52.882545 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:52.882506 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 14:55:52.882583 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:52.882574 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c674737-9de4-4df3-8cd4-de9165e6e70a-metrics-tls podName:7c674737-9de4-4df3-8cd4-de9165e6e70a nodeName:}" failed. No retries permitted until 2026-04-20 14:55:54.88255917 +0000 UTC m=+38.538607449 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7c674737-9de4-4df3-8cd4-de9165e6e70a-metrics-tls") pod "dns-default-jbtsc" (UID: "7c674737-9de4-4df3-8cd4-de9165e6e70a") : secret "dns-default-metrics-tls" not found Apr 20 14:55:52.983444 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:52.983418 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7fe57737-4cb8-41e4-95a2-77878dc0e909-cert\") pod \"ingress-canary-9dz42\" (UID: \"7fe57737-4cb8-41e4-95a2-77878dc0e909\") " pod="openshift-ingress-canary/ingress-canary-9dz42" Apr 20 14:55:52.983595 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:52.983555 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 14:55:52.983636 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:52.983609 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7fe57737-4cb8-41e4-95a2-77878dc0e909-cert podName:7fe57737-4cb8-41e4-95a2-77878dc0e909 nodeName:}" failed. No retries permitted until 2026-04-20 14:55:54.983595174 +0000 UTC m=+38.639643454 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7fe57737-4cb8-41e4-95a2-77878dc0e909-cert") pod "ingress-canary-9dz42" (UID: "7fe57737-4cb8-41e4-95a2-77878dc0e909") : secret "canary-serving-cert" not found Apr 20 14:55:53.165095 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:53.165014 2570 generic.go:358] "Generic (PLEG): container finished" podID="4aa972e0-3242-4b0c-87e7-b4ebc421bbce" containerID="507ddb220490cb759f921e59f5c5ac402d54a29ececd3f14eea843bc0137198d" exitCode=0 Apr 20 14:55:53.165095 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:53.165079 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lstvb" event={"ID":"4aa972e0-3242-4b0c-87e7-b4ebc421bbce","Type":"ContainerDied","Data":"507ddb220490cb759f921e59f5c5ac402d54a29ececd3f14eea843bc0137198d"} Apr 20 14:55:53.487141 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:53.487059 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8ff97bcf-86b2-437d-aad6-c51eae0b40b1-original-pull-secret\") pod \"global-pull-secret-syncer-cnpgl\" (UID: \"8ff97bcf-86b2-437d-aad6-c51eae0b40b1\") " pod="kube-system/global-pull-secret-syncer-cnpgl" Apr 20 14:55:53.490138 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:53.490114 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/8ff97bcf-86b2-437d-aad6-c51eae0b40b1-original-pull-secret\") pod \"global-pull-secret-syncer-cnpgl\" (UID: \"8ff97bcf-86b2-437d-aad6-c51eae0b40b1\") " pod="kube-system/global-pull-secret-syncer-cnpgl" Apr 20 14:55:53.500132 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:53.500113 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-cnpgl" Apr 20 14:55:53.638988 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:53.638962 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-cnpgl"] Apr 20 14:55:53.642585 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:53.642549 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ff97bcf_86b2_437d_aad6_c51eae0b40b1.slice/crio-dd6d6e4f048723fff1144cc3477bf8be6a9346702f49fcc83e530cbb1818977f WatchSource:0}: Error finding container dd6d6e4f048723fff1144cc3477bf8be6a9346702f49fcc83e530cbb1818977f: Status 404 returned error can't find the container with id dd6d6e4f048723fff1144cc3477bf8be6a9346702f49fcc83e530cbb1818977f Apr 20 14:55:54.168088 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:54.168052 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-cnpgl" event={"ID":"8ff97bcf-86b2-437d-aad6-c51eae0b40b1","Type":"ContainerStarted","Data":"dd6d6e4f048723fff1144cc3477bf8be6a9346702f49fcc83e530cbb1818977f"} Apr 20 14:55:54.170718 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:54.170693 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-lstvb" event={"ID":"4aa972e0-3242-4b0c-87e7-b4ebc421bbce","Type":"ContainerStarted","Data":"64b9c01ab4a1d23dd87e40a194374c91a7c43945f1b9fa0a6d74c6768e1d12b4"} Apr 20 14:55:54.193432 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:54.193387 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-lstvb" podStartSLOduration=5.84193942 podStartE2EDuration="37.193374162s" podCreationTimestamp="2026-04-20 14:55:17 +0000 UTC" firstStartedPulling="2026-04-20 14:55:19.647588779 +0000 UTC m=+3.303637058" lastFinishedPulling="2026-04-20 14:55:50.999023519 +0000 UTC m=+34.655071800" observedRunningTime="2026-04-20 14:55:54.191629101 +0000 UTC m=+37.847677403" watchObservedRunningTime="2026-04-20 14:55:54.193374162 +0000 UTC m=+37.849422463" Apr 20 14:55:54.898991 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:54.898948 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7c674737-9de4-4df3-8cd4-de9165e6e70a-metrics-tls\") pod \"dns-default-jbtsc\" (UID: \"7c674737-9de4-4df3-8cd4-de9165e6e70a\") " pod="openshift-dns/dns-default-jbtsc" Apr 20 14:55:54.899163 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:54.899111 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 14:55:54.899210 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:54.899188 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c674737-9de4-4df3-8cd4-de9165e6e70a-metrics-tls podName:7c674737-9de4-4df3-8cd4-de9165e6e70a nodeName:}" failed. No retries permitted until 2026-04-20 14:55:58.899169543 +0000 UTC m=+42.555217823 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7c674737-9de4-4df3-8cd4-de9165e6e70a-metrics-tls") pod "dns-default-jbtsc" (UID: "7c674737-9de4-4df3-8cd4-de9165e6e70a") : secret "dns-default-metrics-tls" not found Apr 20 14:55:55.000212 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:55.000168 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7fe57737-4cb8-41e4-95a2-77878dc0e909-cert\") pod \"ingress-canary-9dz42\" (UID: \"7fe57737-4cb8-41e4-95a2-77878dc0e909\") " pod="openshift-ingress-canary/ingress-canary-9dz42" Apr 20 14:55:55.000415 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:55.000333 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 14:55:55.000415 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:55.000404 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7fe57737-4cb8-41e4-95a2-77878dc0e909-cert podName:7fe57737-4cb8-41e4-95a2-77878dc0e909 nodeName:}" failed. No retries permitted until 2026-04-20 14:55:59.000383855 +0000 UTC m=+42.656432136 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7fe57737-4cb8-41e4-95a2-77878dc0e909-cert") pod "ingress-canary-9dz42" (UID: "7fe57737-4cb8-41e4-95a2-77878dc0e909") : secret "canary-serving-cert" not found Apr 20 14:55:56.934089 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:56.934046 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dt42z"] Apr 20 14:55:56.939104 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:56.939085 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dt42z" Apr 20 14:55:56.941825 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:56.941805 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 20 14:55:56.941918 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:56.941835 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 20 14:55:56.942010 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:56.941988 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-svjsh\"" Apr 20 14:55:56.948397 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:56.948373 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dt42z"] Apr 20 14:55:57.016594 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:57.016560 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5dq7\" (UniqueName: \"kubernetes.io/projected/a319ad70-d05f-4ed2-a056-f1fe50c202fc-kube-api-access-x5dq7\") pod \"volume-data-source-validator-7c6cbb6c87-dt42z\" (UID: \"a319ad70-d05f-4ed2-a056-f1fe50c202fc\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dt42z" Apr 20 14:55:57.034226 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:57.033439 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-sj2fv"] Apr 20 14:55:57.037253 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:57.037233 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-67d496fbdd-b9cws"] Apr 20 14:55:57.037428 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:57.037401 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-sj2fv" Apr 20 14:55:57.039997 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:57.039974 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 20 14:55:57.040090 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:57.040016 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 20 14:55:57.040090 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:57.040017 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 20 14:55:57.040169 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:57.039975 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-9lsdg\"" Apr 20 14:55:57.040368 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:57.040259 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 20 14:55:57.040968 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:57.040945 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-6fcf5597b7-ngxsq"] Apr 20 14:55:57.041099 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:57.041085 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-67d496fbdd-b9cws" Apr 20 14:55:57.043594 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:57.043519 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 20 14:55:57.043692 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:57.043647 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 20 14:55:57.045829 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:57.044678 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-qk7sc\"" Apr 20 14:55:57.045829 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:57.044845 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-sj2fv"] Apr 20 14:55:57.045829 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:57.044881 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 20 14:55:57.045829 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:57.044942 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6fcf5597b7-ngxsq" Apr 20 14:55:57.045829 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:57.044968 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 20 14:55:57.045829 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:57.045216 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 20 14:55:57.045829 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:57.045371 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 20 14:55:57.047493 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:57.047415 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 20 14:55:57.048160 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:57.048141 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 20 14:55:57.048255 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:57.048214 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-fgmdb\"" Apr 20 14:55:57.048544 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:57.048383 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 20 14:55:57.049279 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:57.049247 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-67d496fbdd-b9cws"] Apr 20 14:55:57.050481 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:57.050351 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6fcf5597b7-ngxsq"] Apr 20 14:55:57.053174 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:57.053153 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 20 14:55:57.117910 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:57.117875 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/95d5e76e-7689-4825-ba12-28c88aebccda-service-ca-bundle\") pod \"router-default-67d496fbdd-b9cws\" (UID: \"95d5e76e-7689-4825-ba12-28c88aebccda\") " pod="openshift-ingress/router-default-67d496fbdd-b9cws" Apr 20 14:55:57.118059 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:57.117924 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x5dq7\" (UniqueName: \"kubernetes.io/projected/a319ad70-d05f-4ed2-a056-f1fe50c202fc-kube-api-access-x5dq7\") pod \"volume-data-source-validator-7c6cbb6c87-dt42z\" (UID: \"a319ad70-d05f-4ed2-a056-f1fe50c202fc\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dt42z" Apr 20 14:55:57.118059 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:57.117945 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfm56\" (UniqueName: \"kubernetes.io/projected/95d5e76e-7689-4825-ba12-28c88aebccda-kube-api-access-lfm56\") pod \"router-default-67d496fbdd-b9cws\" (UID: \"95d5e76e-7689-4825-ba12-28c88aebccda\") " 
pod="openshift-ingress/router-default-67d496fbdd-b9cws" Apr 20 14:55:57.118059 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:57.117969 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2baebef1-0abd-4dc6-a4f1-5cf8fe74d376-serving-cert\") pod \"service-ca-operator-d6fc45fc5-sj2fv\" (UID: \"2baebef1-0abd-4dc6-a4f1-5cf8fe74d376\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-sj2fv" Apr 20 14:55:57.118196 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:57.118105 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/95d5e76e-7689-4825-ba12-28c88aebccda-metrics-certs\") pod \"router-default-67d496fbdd-b9cws\" (UID: \"95d5e76e-7689-4825-ba12-28c88aebccda\") " pod="openshift-ingress/router-default-67d496fbdd-b9cws" Apr 20 14:55:57.118196 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:57.118146 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2baebef1-0abd-4dc6-a4f1-5cf8fe74d376-config\") pod \"service-ca-operator-d6fc45fc5-sj2fv\" (UID: \"2baebef1-0abd-4dc6-a4f1-5cf8fe74d376\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-sj2fv" Apr 20 14:55:57.118196 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:57.118179 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24lk7\" (UniqueName: \"kubernetes.io/projected/2baebef1-0abd-4dc6-a4f1-5cf8fe74d376-kube-api-access-24lk7\") pod \"service-ca-operator-d6fc45fc5-sj2fv\" (UID: \"2baebef1-0abd-4dc6-a4f1-5cf8fe74d376\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-sj2fv" Apr 20 14:55:57.118319 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:57.118276 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/95d5e76e-7689-4825-ba12-28c88aebccda-default-certificate\") pod \"router-default-67d496fbdd-b9cws\" (UID: \"95d5e76e-7689-4825-ba12-28c88aebccda\") " pod="openshift-ingress/router-default-67d496fbdd-b9cws" Apr 20 14:55:57.118354 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:57.118326 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/95d5e76e-7689-4825-ba12-28c88aebccda-stats-auth\") pod \"router-default-67d496fbdd-b9cws\" (UID: \"95d5e76e-7689-4825-ba12-28c88aebccda\") " pod="openshift-ingress/router-default-67d496fbdd-b9cws" Apr 20 14:55:57.128238 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:57.128218 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5dq7\" (UniqueName: \"kubernetes.io/projected/a319ad70-d05f-4ed2-a056-f1fe50c202fc-kube-api-access-x5dq7\") pod \"volume-data-source-validator-7c6cbb6c87-dt42z\" (UID: \"a319ad70-d05f-4ed2-a056-f1fe50c202fc\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dt42z" Apr 20 14:55:57.218945 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:57.218856 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/95d5e76e-7689-4825-ba12-28c88aebccda-metrics-certs\") pod \"router-default-67d496fbdd-b9cws\" (UID: 
\"95d5e76e-7689-4825-ba12-28c88aebccda\") " pod="openshift-ingress/router-default-67d496fbdd-b9cws" Apr 20 14:55:57.218945 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:57.218910 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2baebef1-0abd-4dc6-a4f1-5cf8fe74d376-config\") pod \"service-ca-operator-d6fc45fc5-sj2fv\" (UID: \"2baebef1-0abd-4dc6-a4f1-5cf8fe74d376\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-sj2fv" Apr 20 14:55:57.218945 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:57.218940 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-24lk7\" (UniqueName: \"kubernetes.io/projected/2baebef1-0abd-4dc6-a4f1-5cf8fe74d376-kube-api-access-24lk7\") pod \"service-ca-operator-d6fc45fc5-sj2fv\" (UID: \"2baebef1-0abd-4dc6-a4f1-5cf8fe74d376\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-sj2fv" Apr 20 14:55:57.219164 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:57.218984 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6495ee23-3ff8-4b72-b73d-ec3a22a198c2-bound-sa-token\") pod \"image-registry-6fcf5597b7-ngxsq\" (UID: \"6495ee23-3ff8-4b72-b73d-ec3a22a198c2\") " pod="openshift-image-registry/image-registry-6fcf5597b7-ngxsq" Apr 20 14:55:57.219164 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:57.218990 2570 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 20 14:55:57.219164 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:57.219030 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6495ee23-3ff8-4b72-b73d-ec3a22a198c2-trusted-ca\") pod \"image-registry-6fcf5597b7-ngxsq\" (UID: \"6495ee23-3ff8-4b72-b73d-ec3a22a198c2\") " pod="openshift-image-registry/image-registry-6fcf5597b7-ngxsq" Apr 20 14:55:57.219164 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:57.219055 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/95d5e76e-7689-4825-ba12-28c88aebccda-metrics-certs podName:95d5e76e-7689-4825-ba12-28c88aebccda nodeName:}" failed. No retries permitted until 2026-04-20 14:55:57.719034503 +0000 UTC m=+41.375082802 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/95d5e76e-7689-4825-ba12-28c88aebccda-metrics-certs") pod "router-default-67d496fbdd-b9cws" (UID: "95d5e76e-7689-4825-ba12-28c88aebccda") : secret "router-metrics-certs-default" not found Apr 20 14:55:57.219164 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:57.219096 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6495ee23-3ff8-4b72-b73d-ec3a22a198c2-registry-tls\") pod \"image-registry-6fcf5597b7-ngxsq\" (UID: \"6495ee23-3ff8-4b72-b73d-ec3a22a198c2\") " pod="openshift-image-registry/image-registry-6fcf5597b7-ngxsq" Apr 20 14:55:57.219164 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:57.219127 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/95d5e76e-7689-4825-ba12-28c88aebccda-default-certificate\") pod \"router-default-67d496fbdd-b9cws\" (UID: \"95d5e76e-7689-4825-ba12-28c88aebccda\") " pod="openshift-ingress/router-default-67d496fbdd-b9cws" Apr 20 14:55:57.219580 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:57.219439 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/95d5e76e-7689-4825-ba12-28c88aebccda-stats-auth\") pod \"router-default-67d496fbdd-b9cws\" (UID: \"95d5e76e-7689-4825-ba12-28c88aebccda\") " pod="openshift-ingress/router-default-67d496fbdd-b9cws" Apr 20 14:55:57.219637 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:57.219619 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/6495ee23-3ff8-4b72-b73d-ec3a22a198c2-image-registry-private-configuration\") pod \"image-registry-6fcf5597b7-ngxsq\" (UID: \"6495ee23-3ff8-4b72-b73d-ec3a22a198c2\") " pod="openshift-image-registry/image-registry-6fcf5597b7-ngxsq" Apr 20 14:55:57.219685 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:57.219644 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6495ee23-3ff8-4b72-b73d-ec3a22a198c2-ca-trust-extracted\") pod \"image-registry-6fcf5597b7-ngxsq\" (UID: \"6495ee23-3ff8-4b72-b73d-ec3a22a198c2\") " pod="openshift-image-registry/image-registry-6fcf5597b7-ngxsq" Apr 20 14:55:57.219685 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:57.219674 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6495ee23-3ff8-4b72-b73d-ec3a22a198c2-installation-pull-secrets\") pod \"image-registry-6fcf5597b7-ngxsq\" (UID: \"6495ee23-3ff8-4b72-b73d-ec3a22a198c2\") " pod="openshift-image-registry/image-registry-6fcf5597b7-ngxsq" Apr 20 14:55:57.219777 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:57.219696 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pjvg\" (UniqueName: \"kubernetes.io/projected/6495ee23-3ff8-4b72-b73d-ec3a22a198c2-kube-api-access-9pjvg\") pod \"image-registry-6fcf5597b7-ngxsq\" (UID: \"6495ee23-3ff8-4b72-b73d-ec3a22a198c2\") " pod="openshift-image-registry/image-registry-6fcf5597b7-ngxsq" Apr 20 14:55:57.219777 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:57.219723 2570 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/95d5e76e-7689-4825-ba12-28c88aebccda-service-ca-bundle\") pod \"router-default-67d496fbdd-b9cws\" (UID: \"95d5e76e-7689-4825-ba12-28c88aebccda\") " pod="openshift-ingress/router-default-67d496fbdd-b9cws" Apr 20 14:55:57.219777 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:57.219763 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lfm56\" (UniqueName: \"kubernetes.io/projected/95d5e76e-7689-4825-ba12-28c88aebccda-kube-api-access-lfm56\") pod \"router-default-67d496fbdd-b9cws\" (UID: \"95d5e76e-7689-4825-ba12-28c88aebccda\") " pod="openshift-ingress/router-default-67d496fbdd-b9cws" Apr 20 14:55:57.219777 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:57.219770 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2baebef1-0abd-4dc6-a4f1-5cf8fe74d376-config\") pod \"service-ca-operator-d6fc45fc5-sj2fv\" (UID: \"2baebef1-0abd-4dc6-a4f1-5cf8fe74d376\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-sj2fv" Apr 20 14:55:57.219946 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:57.219781 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2baebef1-0abd-4dc6-a4f1-5cf8fe74d376-serving-cert\") pod \"service-ca-operator-d6fc45fc5-sj2fv\" (UID: \"2baebef1-0abd-4dc6-a4f1-5cf8fe74d376\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-sj2fv" Apr 20 14:55:57.219946 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:57.219850 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6495ee23-3ff8-4b72-b73d-ec3a22a198c2-registry-certificates\") pod \"image-registry-6fcf5597b7-ngxsq\" (UID: \"6495ee23-3ff8-4b72-b73d-ec3a22a198c2\") " pod="openshift-image-registry/image-registry-6fcf5597b7-ngxsq" Apr 20 14:55:57.220404 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:57.220378 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/95d5e76e-7689-4825-ba12-28c88aebccda-service-ca-bundle podName:95d5e76e-7689-4825-ba12-28c88aebccda nodeName:}" failed. No retries permitted until 2026-04-20 14:55:57.720363677 +0000 UTC m=+41.376411957 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/95d5e76e-7689-4825-ba12-28c88aebccda-service-ca-bundle") pod "router-default-67d496fbdd-b9cws" (UID: "95d5e76e-7689-4825-ba12-28c88aebccda") : configmap references non-existent config key: service-ca.crt Apr 20 14:55:57.223648 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:57.223623 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/95d5e76e-7689-4825-ba12-28c88aebccda-default-certificate\") pod \"router-default-67d496fbdd-b9cws\" (UID: \"95d5e76e-7689-4825-ba12-28c88aebccda\") " pod="openshift-ingress/router-default-67d496fbdd-b9cws" Apr 20 14:55:57.224801 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:57.224771 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/95d5e76e-7689-4825-ba12-28c88aebccda-stats-auth\") pod \"router-default-67d496fbdd-b9cws\" (UID: \"95d5e76e-7689-4825-ba12-28c88aebccda\") " pod="openshift-ingress/router-default-67d496fbdd-b9cws" Apr 20 14:55:57.226023 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:57.226001 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2baebef1-0abd-4dc6-a4f1-5cf8fe74d376-serving-cert\") pod \"service-ca-operator-d6fc45fc5-sj2fv\" (UID: \"2baebef1-0abd-4dc6-a4f1-5cf8fe74d376\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-sj2fv" Apr 20 14:55:57.228350 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:57.228284 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-24lk7\" (UniqueName: \"kubernetes.io/projected/2baebef1-0abd-4dc6-a4f1-5cf8fe74d376-kube-api-access-24lk7\") pod \"service-ca-operator-d6fc45fc5-sj2fv\" (UID: \"2baebef1-0abd-4dc6-a4f1-5cf8fe74d376\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-sj2fv" Apr 20 14:55:57.231328 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:57.229542 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfm56\" (UniqueName: \"kubernetes.io/projected/95d5e76e-7689-4825-ba12-28c88aebccda-kube-api-access-lfm56\") pod \"router-default-67d496fbdd-b9cws\" (UID: \"95d5e76e-7689-4825-ba12-28c88aebccda\") " pod="openshift-ingress/router-default-67d496fbdd-b9cws" Apr 20 14:55:57.251089 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:57.251068 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dt42z" Apr 20 14:55:57.321208 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:57.321176 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6495ee23-3ff8-4b72-b73d-ec3a22a198c2-registry-tls\") pod \"image-registry-6fcf5597b7-ngxsq\" (UID: \"6495ee23-3ff8-4b72-b73d-ec3a22a198c2\") " pod="openshift-image-registry/image-registry-6fcf5597b7-ngxsq" Apr 20 14:55:57.321356 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:57.321229 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/6495ee23-3ff8-4b72-b73d-ec3a22a198c2-image-registry-private-configuration\") pod \"image-registry-6fcf5597b7-ngxsq\" (UID: \"6495ee23-3ff8-4b72-b73d-ec3a22a198c2\") " pod="openshift-image-registry/image-registry-6fcf5597b7-ngxsq" Apr 20 14:55:57.321356 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:57.321256 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6495ee23-3ff8-4b72-b73d-ec3a22a198c2-ca-trust-extracted\") pod \"image-registry-6fcf5597b7-ngxsq\" (UID: \"6495ee23-3ff8-4b72-b73d-ec3a22a198c2\") " pod="openshift-image-registry/image-registry-6fcf5597b7-ngxsq" Apr 20 14:55:57.321356 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:57.321280 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6495ee23-3ff8-4b72-b73d-ec3a22a198c2-installation-pull-secrets\") pod \"image-registry-6fcf5597b7-ngxsq\" (UID: \"6495ee23-3ff8-4b72-b73d-ec3a22a198c2\") " pod="openshift-image-registry/image-registry-6fcf5597b7-ngxsq" Apr 20 14:55:57.321356 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:57.321320 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9pjvg\" (UniqueName: \"kubernetes.io/projected/6495ee23-3ff8-4b72-b73d-ec3a22a198c2-kube-api-access-9pjvg\") pod \"image-registry-6fcf5597b7-ngxsq\" (UID: \"6495ee23-3ff8-4b72-b73d-ec3a22a198c2\") " pod="openshift-image-registry/image-registry-6fcf5597b7-ngxsq" Apr 20 14:55:57.321356 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:57.321333 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 14:55:57.321356 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:57.321354 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6fcf5597b7-ngxsq: secret "image-registry-tls" not found Apr 20 14:55:57.321642 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:57.321376 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6495ee23-3ff8-4b72-b73d-ec3a22a198c2-registry-certificates\") pod \"image-registry-6fcf5597b7-ngxsq\" (UID: \"6495ee23-3ff8-4b72-b73d-ec3a22a198c2\") " pod="openshift-image-registry/image-registry-6fcf5597b7-ngxsq" Apr 20 14:55:57.321642 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:57.321416 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6495ee23-3ff8-4b72-b73d-ec3a22a198c2-registry-tls podName:6495ee23-3ff8-4b72-b73d-ec3a22a198c2 nodeName:}" failed. 
No retries permitted until 2026-04-20 14:55:57.821394856 +0000 UTC m=+41.477443139 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/6495ee23-3ff8-4b72-b73d-ec3a22a198c2-registry-tls") pod "image-registry-6fcf5597b7-ngxsq" (UID: "6495ee23-3ff8-4b72-b73d-ec3a22a198c2") : secret "image-registry-tls" not found Apr 20 14:55:57.321642 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:57.321557 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6495ee23-3ff8-4b72-b73d-ec3a22a198c2-bound-sa-token\") pod \"image-registry-6fcf5597b7-ngxsq\" (UID: \"6495ee23-3ff8-4b72-b73d-ec3a22a198c2\") " pod="openshift-image-registry/image-registry-6fcf5597b7-ngxsq" Apr 20 14:55:57.321642 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:57.321606 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6495ee23-3ff8-4b72-b73d-ec3a22a198c2-trusted-ca\") pod \"image-registry-6fcf5597b7-ngxsq\" (UID: \"6495ee23-3ff8-4b72-b73d-ec3a22a198c2\") " pod="openshift-image-registry/image-registry-6fcf5597b7-ngxsq" Apr 20 14:55:57.321880 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:57.321663 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6495ee23-3ff8-4b72-b73d-ec3a22a198c2-ca-trust-extracted\") pod \"image-registry-6fcf5597b7-ngxsq\" (UID: \"6495ee23-3ff8-4b72-b73d-ec3a22a198c2\") " pod="openshift-image-registry/image-registry-6fcf5597b7-ngxsq" Apr 20 14:55:57.322030 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:57.322009 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6495ee23-3ff8-4b72-b73d-ec3a22a198c2-registry-certificates\") pod \"image-registry-6fcf5597b7-ngxsq\" (UID: \"6495ee23-3ff8-4b72-b73d-ec3a22a198c2\") " pod="openshift-image-registry/image-registry-6fcf5597b7-ngxsq" Apr 20 14:55:57.322438 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:57.322422 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6495ee23-3ff8-4b72-b73d-ec3a22a198c2-trusted-ca\") pod \"image-registry-6fcf5597b7-ngxsq\" (UID: \"6495ee23-3ff8-4b72-b73d-ec3a22a198c2\") " pod="openshift-image-registry/image-registry-6fcf5597b7-ngxsq" Apr 20 14:55:57.323887 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:57.323863 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/6495ee23-3ff8-4b72-b73d-ec3a22a198c2-image-registry-private-configuration\") pod \"image-registry-6fcf5597b7-ngxsq\" (UID: \"6495ee23-3ff8-4b72-b73d-ec3a22a198c2\") " pod="openshift-image-registry/image-registry-6fcf5597b7-ngxsq" Apr 20 14:55:57.324140 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:57.324123 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6495ee23-3ff8-4b72-b73d-ec3a22a198c2-installation-pull-secrets\") pod \"image-registry-6fcf5597b7-ngxsq\" (UID: \"6495ee23-3ff8-4b72-b73d-ec3a22a198c2\") " pod="openshift-image-registry/image-registry-6fcf5597b7-ngxsq" Apr 20 14:55:57.330251 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:57.330217 2570 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-9pjvg\" (UniqueName: \"kubernetes.io/projected/6495ee23-3ff8-4b72-b73d-ec3a22a198c2-kube-api-access-9pjvg\") pod \"image-registry-6fcf5597b7-ngxsq\" (UID: \"6495ee23-3ff8-4b72-b73d-ec3a22a198c2\") " pod="openshift-image-registry/image-registry-6fcf5597b7-ngxsq" Apr 20 14:55:57.330559 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:57.330540 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6495ee23-3ff8-4b72-b73d-ec3a22a198c2-bound-sa-token\") pod \"image-registry-6fcf5597b7-ngxsq\" (UID: \"6495ee23-3ff8-4b72-b73d-ec3a22a198c2\") " pod="openshift-image-registry/image-registry-6fcf5597b7-ngxsq" Apr 20 14:55:57.359475 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:57.358423 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-sj2fv" Apr 20 14:55:57.427208 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:57.427139 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dt42z"] Apr 20 14:55:57.432515 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:57.432480 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda319ad70_d05f_4ed2_a056_f1fe50c202fc.slice/crio-b91aca523e4606eea2da5c40b3bb45b4cc0e86f1f1e9d4249251512a719eed47 WatchSource:0}: Error finding container b91aca523e4606eea2da5c40b3bb45b4cc0e86f1f1e9d4249251512a719eed47: Status 404 returned error can't find the container with id b91aca523e4606eea2da5c40b3bb45b4cc0e86f1f1e9d4249251512a719eed47 Apr 20 14:55:57.503676 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:57.503655 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-sj2fv"] Apr 20 14:55:57.515338 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:55:57.515299 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2baebef1_0abd_4dc6_a4f1_5cf8fe74d376.slice/crio-401970f37e58c778b4c8ef82f76b74cae0eaeb3a275801f6447c4c0d6743db1e WatchSource:0}: Error finding container 401970f37e58c778b4c8ef82f76b74cae0eaeb3a275801f6447c4c0d6743db1e: Status 404 returned error can't find the container with id 401970f37e58c778b4c8ef82f76b74cae0eaeb3a275801f6447c4c0d6743db1e Apr 20 14:55:57.726127 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:57.726095 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/95d5e76e-7689-4825-ba12-28c88aebccda-service-ca-bundle\") pod \"router-default-67d496fbdd-b9cws\" (UID: \"95d5e76e-7689-4825-ba12-28c88aebccda\") " pod="openshift-ingress/router-default-67d496fbdd-b9cws" Apr 20 14:55:57.726287 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:57.726159 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/95d5e76e-7689-4825-ba12-28c88aebccda-metrics-certs\") pod \"router-default-67d496fbdd-b9cws\" (UID: \"95d5e76e-7689-4825-ba12-28c88aebccda\") " pod="openshift-ingress/router-default-67d496fbdd-b9cws" Apr 20 14:55:57.726287 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:57.726245 2570 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/95d5e76e-7689-4825-ba12-28c88aebccda-service-ca-bundle podName:95d5e76e-7689-4825-ba12-28c88aebccda nodeName:}" failed. No retries permitted until 2026-04-20 14:55:58.726227071 +0000 UTC m=+42.382275351 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/95d5e76e-7689-4825-ba12-28c88aebccda-service-ca-bundle") pod "router-default-67d496fbdd-b9cws" (UID: "95d5e76e-7689-4825-ba12-28c88aebccda") : configmap references non-existent config key: service-ca.crt Apr 20 14:55:57.726287 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:57.726270 2570 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 20 14:55:57.726457 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:57.726329 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/95d5e76e-7689-4825-ba12-28c88aebccda-metrics-certs podName:95d5e76e-7689-4825-ba12-28c88aebccda nodeName:}" failed. No retries permitted until 2026-04-20 14:55:58.726295298 +0000 UTC m=+42.382343578 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/95d5e76e-7689-4825-ba12-28c88aebccda-metrics-certs") pod "router-default-67d496fbdd-b9cws" (UID: "95d5e76e-7689-4825-ba12-28c88aebccda") : secret "router-metrics-certs-default" not found Apr 20 14:55:57.826832 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:57.826801 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6495ee23-3ff8-4b72-b73d-ec3a22a198c2-registry-tls\") pod \"image-registry-6fcf5597b7-ngxsq\" (UID: \"6495ee23-3ff8-4b72-b73d-ec3a22a198c2\") " pod="openshift-image-registry/image-registry-6fcf5597b7-ngxsq" Apr 20 14:55:57.826991 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:57.826946 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 14:55:57.826991 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:57.826961 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6fcf5597b7-ngxsq: secret "image-registry-tls" not found Apr 20 14:55:57.827102 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:57.827011 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6495ee23-3ff8-4b72-b73d-ec3a22a198c2-registry-tls podName:6495ee23-3ff8-4b72-b73d-ec3a22a198c2 nodeName:}" failed. No retries permitted until 2026-04-20 14:55:58.826994172 +0000 UTC m=+42.483042454 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/6495ee23-3ff8-4b72-b73d-ec3a22a198c2-registry-tls") pod "image-registry-6fcf5597b7-ngxsq" (UID: "6495ee23-3ff8-4b72-b73d-ec3a22a198c2") : secret "image-registry-tls" not found Apr 20 14:55:58.181282 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:58.181201 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-cnpgl" event={"ID":"8ff97bcf-86b2-437d-aad6-c51eae0b40b1","Type":"ContainerStarted","Data":"7f03114212f7f0a483f4d7d8481f41cca46aada5035172e84b3b93e8dfbefe6f"} Apr 20 14:55:58.182635 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:58.182592 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-sj2fv" event={"ID":"2baebef1-0abd-4dc6-a4f1-5cf8fe74d376","Type":"ContainerStarted","Data":"401970f37e58c778b4c8ef82f76b74cae0eaeb3a275801f6447c4c0d6743db1e"} Apr 20 14:55:58.183954 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:58.183926 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dt42z" event={"ID":"a319ad70-d05f-4ed2-a056-f1fe50c202fc","Type":"ContainerStarted","Data":"b91aca523e4606eea2da5c40b3bb45b4cc0e86f1f1e9d4249251512a719eed47"} Apr 20 14:55:58.198108 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:58.198059 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-cnpgl" podStartSLOduration=33.454398446 podStartE2EDuration="37.198047647s" podCreationTimestamp="2026-04-20 14:55:21 +0000 UTC" firstStartedPulling="2026-04-20 14:55:53.644290977 +0000 UTC m=+37.300339257" lastFinishedPulling="2026-04-20 14:55:57.387940163 +0000 UTC m=+41.043988458" observedRunningTime="2026-04-20 14:55:58.196942996 +0000 UTC m=+41.852991300" watchObservedRunningTime="2026-04-20 14:55:58.198047647 +0000 UTC m=+41.854095950" Apr 20 14:55:58.734001 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:58.733972 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/95d5e76e-7689-4825-ba12-28c88aebccda-service-ca-bundle\") pod \"router-default-67d496fbdd-b9cws\" (UID: \"95d5e76e-7689-4825-ba12-28c88aebccda\") " pod="openshift-ingress/router-default-67d496fbdd-b9cws" Apr 20 14:55:58.734146 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:58.734035 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/95d5e76e-7689-4825-ba12-28c88aebccda-metrics-certs\") pod \"router-default-67d496fbdd-b9cws\" (UID: \"95d5e76e-7689-4825-ba12-28c88aebccda\") " pod="openshift-ingress/router-default-67d496fbdd-b9cws" Apr 20 14:55:58.734146 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:58.734124 2570 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 20 14:55:58.734233 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:58.734146 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/95d5e76e-7689-4825-ba12-28c88aebccda-service-ca-bundle podName:95d5e76e-7689-4825-ba12-28c88aebccda nodeName:}" failed. No retries permitted until 2026-04-20 14:56:00.734129609 +0000 UTC m=+44.390177888 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/95d5e76e-7689-4825-ba12-28c88aebccda-service-ca-bundle") pod "router-default-67d496fbdd-b9cws" (UID: "95d5e76e-7689-4825-ba12-28c88aebccda") : configmap references non-existent config key: service-ca.crt Apr 20 14:55:58.734233 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:58.734175 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/95d5e76e-7689-4825-ba12-28c88aebccda-metrics-certs podName:95d5e76e-7689-4825-ba12-28c88aebccda nodeName:}" failed. No retries permitted until 2026-04-20 14:56:00.734162305 +0000 UTC m=+44.390210585 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/95d5e76e-7689-4825-ba12-28c88aebccda-metrics-certs") pod "router-default-67d496fbdd-b9cws" (UID: "95d5e76e-7689-4825-ba12-28c88aebccda") : secret "router-metrics-certs-default" not found Apr 20 14:55:58.834747 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:58.834716 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6495ee23-3ff8-4b72-b73d-ec3a22a198c2-registry-tls\") pod \"image-registry-6fcf5597b7-ngxsq\" (UID: \"6495ee23-3ff8-4b72-b73d-ec3a22a198c2\") " pod="openshift-image-registry/image-registry-6fcf5597b7-ngxsq" Apr 20 14:55:58.834905 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:58.834834 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 14:55:58.834905 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:58.834845 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6fcf5597b7-ngxsq: secret "image-registry-tls" not found Apr 20 14:55:58.834905 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:58.834893 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6495ee23-3ff8-4b72-b73d-ec3a22a198c2-registry-tls podName:6495ee23-3ff8-4b72-b73d-ec3a22a198c2 nodeName:}" failed. No retries permitted until 2026-04-20 14:56:00.834879589 +0000 UTC m=+44.490927868 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/6495ee23-3ff8-4b72-b73d-ec3a22a198c2-registry-tls") pod "image-registry-6fcf5597b7-ngxsq" (UID: "6495ee23-3ff8-4b72-b73d-ec3a22a198c2") : secret "image-registry-tls" not found Apr 20 14:55:58.935557 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:58.935521 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7c674737-9de4-4df3-8cd4-de9165e6e70a-metrics-tls\") pod \"dns-default-jbtsc\" (UID: \"7c674737-9de4-4df3-8cd4-de9165e6e70a\") " pod="openshift-dns/dns-default-jbtsc" Apr 20 14:55:58.935685 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:58.935670 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 14:55:58.935742 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:58.935731 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c674737-9de4-4df3-8cd4-de9165e6e70a-metrics-tls podName:7c674737-9de4-4df3-8cd4-de9165e6e70a nodeName:}" failed. No retries permitted until 2026-04-20 14:56:06.935715599 +0000 UTC m=+50.591763880 (durationBeforeRetry 8s). 
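
The durationBeforeRetry values in the nestedpendingoperations entries double on each consecutive failure of the same volume operation (1s and 2s above; 4s, 8s, and 16s further below). A minimal Go sketch of that doubling policy follows; the type name and the cap are illustrative assumptions, not the kubelet's actual identifiers or limit.

    package main

    import (
        "fmt"
        "time"
    )

    // backoff mirrors the doubling delay visible in the log:
    // 1s -> 2s -> 4s -> 8s -> 16s -> ... up to an assumed cap.
    type backoff struct {
        delay time.Duration
    }

    func (b *backoff) next() time.Duration {
        if b.delay == 0 {
            b.delay = time.Second // first durationBeforeRetry
        } else {
            b.delay *= 2 // double on each consecutive failure
        }
        if limit := 2 * time.Minute; b.delay > limit { // cap is an assumption
            b.delay = limit
        }
        return b.delay
    }

    func main() {
        var b backoff
        for i := 0; i < 5; i++ {
            fmt.Println(b.next()) // 1s 2s 4s 8s 16s
        }
    }
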
Apr 20 14:55:59.036184 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:59.036157 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7fe57737-4cb8-41e4-95a2-77878dc0e909-cert\") pod \"ingress-canary-9dz42\" (UID: \"7fe57737-4cb8-41e4-95a2-77878dc0e909\") " pod="openshift-ingress-canary/ingress-canary-9dz42"
Apr 20 14:55:59.036352 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:59.036319 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 14:55:59.036407 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:55:59.036381 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7fe57737-4cb8-41e4-95a2-77878dc0e909-cert podName:7fe57737-4cb8-41e4-95a2-77878dc0e909 nodeName:}" failed. No retries permitted until 2026-04-20 14:56:07.036366552 +0000 UTC m=+50.692414832 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7fe57737-4cb8-41e4-95a2-77878dc0e909-cert") pod "ingress-canary-9dz42" (UID: "7fe57737-4cb8-41e4-95a2-77878dc0e909") : secret "canary-serving-cert" not found
Apr 20 14:55:59.187915 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:59.187834 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dt42z" event={"ID":"a319ad70-d05f-4ed2-a056-f1fe50c202fc","Type":"ContainerStarted","Data":"9dd71074e57fad139c45fe321d1543ad8d5e7b2a49bcfad243393d42c99a05ed"}
Apr 20 14:55:59.202420 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:55:59.202381 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-dt42z" podStartSLOduration=2.129148443 podStartE2EDuration="3.202367625s" podCreationTimestamp="2026-04-20 14:55:56 +0000 UTC" firstStartedPulling="2026-04-20 14:55:57.434657656 +0000 UTC m=+41.090705935" lastFinishedPulling="2026-04-20 14:55:58.507876822 +0000 UTC m=+42.163925117" observedRunningTime="2026-04-20 14:55:59.201944479 +0000 UTC m=+42.857992794" watchObservedRunningTime="2026-04-20 14:55:59.202367625 +0000 UTC m=+42.858415906"
Apr 20 14:56:00.190785 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:00.190743 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-sj2fv" event={"ID":"2baebef1-0abd-4dc6-a4f1-5cf8fe74d376","Type":"ContainerStarted","Data":"2d52832da8ec3628ccfdfb88cc72bce211658f458c107ecb20c56ed0a8289860"}
Apr 20 14:56:00.750575 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:00.750536 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/95d5e76e-7689-4825-ba12-28c88aebccda-service-ca-bundle\") pod \"router-default-67d496fbdd-b9cws\" (UID: \"95d5e76e-7689-4825-ba12-28c88aebccda\") " pod="openshift-ingress/router-default-67d496fbdd-b9cws"
Apr 20 14:56:00.750731 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:00.750602 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/95d5e76e-7689-4825-ba12-28c88aebccda-metrics-certs\") pod \"router-default-67d496fbdd-b9cws\" (UID: \"95d5e76e-7689-4825-ba12-28c88aebccda\") " pod="openshift-ingress/router-default-67d496fbdd-b9cws"
Apr 20 14:56:00.750731 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:56:00.750686 2570 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 20 14:56:00.750731 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:56:00.750695 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/95d5e76e-7689-4825-ba12-28c88aebccda-service-ca-bundle podName:95d5e76e-7689-4825-ba12-28c88aebccda nodeName:}" failed. No retries permitted until 2026-04-20 14:56:04.750668658 +0000 UTC m=+48.406716946 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/95d5e76e-7689-4825-ba12-28c88aebccda-service-ca-bundle") pod "router-default-67d496fbdd-b9cws" (UID: "95d5e76e-7689-4825-ba12-28c88aebccda") : configmap references non-existent config key: service-ca.crt
Apr 20 14:56:00.750731 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:56:00.750726 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/95d5e76e-7689-4825-ba12-28c88aebccda-metrics-certs podName:95d5e76e-7689-4825-ba12-28c88aebccda nodeName:}" failed. No retries permitted until 2026-04-20 14:56:04.750715326 +0000 UTC m=+48.406763606 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/95d5e76e-7689-4825-ba12-28c88aebccda-metrics-certs") pod "router-default-67d496fbdd-b9cws" (UID: "95d5e76e-7689-4825-ba12-28c88aebccda") : secret "router-metrics-certs-default" not found
Apr 20 14:56:00.851789 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:00.851754 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6495ee23-3ff8-4b72-b73d-ec3a22a198c2-registry-tls\") pod \"image-registry-6fcf5597b7-ngxsq\" (UID: \"6495ee23-3ff8-4b72-b73d-ec3a22a198c2\") " pod="openshift-image-registry/image-registry-6fcf5597b7-ngxsq"
Apr 20 14:56:00.851932 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:56:00.851865 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 14:56:00.851932 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:56:00.851878 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6fcf5597b7-ngxsq: secret "image-registry-tls" not found
Apr 20 14:56:00.851932 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:56:00.851931 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6495ee23-3ff8-4b72-b73d-ec3a22a198c2-registry-tls podName:6495ee23-3ff8-4b72-b73d-ec3a22a198c2 nodeName:}" failed. No retries permitted until 2026-04-20 14:56:04.851913941 +0000 UTC m=+48.507962221 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/6495ee23-3ff8-4b72-b73d-ec3a22a198c2-registry-tls") pod "image-registry-6fcf5597b7-ngxsq" (UID: "6495ee23-3ff8-4b72-b73d-ec3a22a198c2") : secret "image-registry-tls" not found
Apr 20 14:56:03.090349 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:03.090270 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-sj2fv" podStartSLOduration=3.903288867 podStartE2EDuration="6.090254932s" podCreationTimestamp="2026-04-20 14:55:57 +0000 UTC" firstStartedPulling="2026-04-20 14:55:57.517129089 +0000 UTC m=+41.173177368" lastFinishedPulling="2026-04-20 14:55:59.704095151 +0000 UTC m=+43.360143433" observedRunningTime="2026-04-20 14:56:00.205678679 +0000 UTC m=+43.861726981" watchObservedRunningTime="2026-04-20 14:56:03.090254932 +0000 UTC m=+46.746303295"
Apr 20 14:56:03.090770 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:03.090498 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-wh8t5"]
Apr 20 14:56:03.102788 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:03.102761 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-wh8t5"]
Apr 20 14:56:03.102899 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:03.102862 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-wh8t5"
Apr 20 14:56:03.105394 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:03.105373 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\""
Apr 20 14:56:03.105491 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:03.105394 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\""
Apr 20 14:56:03.106488 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:03.106470 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-hlwq8\""
Apr 20 14:56:03.106582 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:03.106516 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\""
Apr 20 14:56:03.106780 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:03.106754 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\""
Apr 20 14:56:03.268589 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:03.268556 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/974a2b40-3b4b-4b0b-a10d-dfa7f4596c85-signing-cabundle\") pod \"service-ca-865cb79987-wh8t5\" (UID: \"974a2b40-3b4b-4b0b-a10d-dfa7f4596c85\") " pod="openshift-service-ca/service-ca-865cb79987-wh8t5"
Apr 20 14:56:03.268742 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:03.268664 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhk74\" (UniqueName: \"kubernetes.io/projected/974a2b40-3b4b-4b0b-a10d-dfa7f4596c85-kube-api-access-hhk74\") pod \"service-ca-865cb79987-wh8t5\" (UID: \"974a2b40-3b4b-4b0b-a10d-dfa7f4596c85\") " pod="openshift-service-ca/service-ca-865cb79987-wh8t5"
Apr 20 14:56:03.268742 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:03.268707 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/974a2b40-3b4b-4b0b-a10d-dfa7f4596c85-signing-key\") pod \"service-ca-865cb79987-wh8t5\" (UID: \"974a2b40-3b4b-4b0b-a10d-dfa7f4596c85\") " pod="openshift-service-ca/service-ca-865cb79987-wh8t5"
Apr 20 14:56:03.369461 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:03.369395 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/974a2b40-3b4b-4b0b-a10d-dfa7f4596c85-signing-key\") pod \"service-ca-865cb79987-wh8t5\" (UID: \"974a2b40-3b4b-4b0b-a10d-dfa7f4596c85\") " pod="openshift-service-ca/service-ca-865cb79987-wh8t5"
Apr 20 14:56:03.369461 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:03.369454 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/974a2b40-3b4b-4b0b-a10d-dfa7f4596c85-signing-cabundle\") pod \"service-ca-865cb79987-wh8t5\" (UID: \"974a2b40-3b4b-4b0b-a10d-dfa7f4596c85\") " pod="openshift-service-ca/service-ca-865cb79987-wh8t5"
Apr 20 14:56:03.369621 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:03.369543 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hhk74\" (UniqueName: \"kubernetes.io/projected/974a2b40-3b4b-4b0b-a10d-dfa7f4596c85-kube-api-access-hhk74\") pod \"service-ca-865cb79987-wh8t5\" (UID: \"974a2b40-3b4b-4b0b-a10d-dfa7f4596c85\") " pod="openshift-service-ca/service-ca-865cb79987-wh8t5"
Apr 20 14:56:03.370084 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:03.370064 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/974a2b40-3b4b-4b0b-a10d-dfa7f4596c85-signing-cabundle\") pod \"service-ca-865cb79987-wh8t5\" (UID: \"974a2b40-3b4b-4b0b-a10d-dfa7f4596c85\") " pod="openshift-service-ca/service-ca-865cb79987-wh8t5"
Apr 20 14:56:03.371877 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:03.371859 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/974a2b40-3b4b-4b0b-a10d-dfa7f4596c85-signing-key\") pod \"service-ca-865cb79987-wh8t5\" (UID: \"974a2b40-3b4b-4b0b-a10d-dfa7f4596c85\") " pod="openshift-service-ca/service-ca-865cb79987-wh8t5"
Apr 20 14:56:03.377668 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:03.377644 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhk74\" (UniqueName: \"kubernetes.io/projected/974a2b40-3b4b-4b0b-a10d-dfa7f4596c85-kube-api-access-hhk74\") pod \"service-ca-865cb79987-wh8t5\" (UID: \"974a2b40-3b4b-4b0b-a10d-dfa7f4596c85\") " pod="openshift-service-ca/service-ca-865cb79987-wh8t5"
Apr 20 14:56:03.411896 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:03.411856 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-wh8t5"
Apr 20 14:56:03.527850 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:03.527822 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-wh8t5"]
Apr 20 14:56:03.530880 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:56:03.530854 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod974a2b40_3b4b_4b0b_a10d_dfa7f4596c85.slice/crio-130b8e43021e38d8f9589f9940a361c9144398594d5a2b8b9ae5c4997d315dfa WatchSource:0}: Error finding container 130b8e43021e38d8f9589f9940a361c9144398594d5a2b8b9ae5c4997d315dfa: Status 404 returned error can't find the container with id 130b8e43021e38d8f9589f9940a361c9144398594d5a2b8b9ae5c4997d315dfa
Apr 20 14:56:04.200953 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:04.200912 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-wh8t5" event={"ID":"974a2b40-3b4b-4b0b-a10d-dfa7f4596c85","Type":"ContainerStarted","Data":"6d32f9dd3533acac07940c8f8605386d04fca2a4efe3f30f3080432b7005d8ba"}
Apr 20 14:56:04.200953 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:04.200959 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-wh8t5" event={"ID":"974a2b40-3b4b-4b0b-a10d-dfa7f4596c85","Type":"ContainerStarted","Data":"130b8e43021e38d8f9589f9940a361c9144398594d5a2b8b9ae5c4997d315dfa"}
Apr 20 14:56:04.782714 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:04.782684 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/95d5e76e-7689-4825-ba12-28c88aebccda-service-ca-bundle\") pod \"router-default-67d496fbdd-b9cws\" (UID: \"95d5e76e-7689-4825-ba12-28c88aebccda\") " pod="openshift-ingress/router-default-67d496fbdd-b9cws"
Apr 20 14:56:04.782875 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:04.782745 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/95d5e76e-7689-4825-ba12-28c88aebccda-metrics-certs\") pod \"router-default-67d496fbdd-b9cws\" (UID: \"95d5e76e-7689-4825-ba12-28c88aebccda\") " pod="openshift-ingress/router-default-67d496fbdd-b9cws"
Apr 20 14:56:04.782875 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:56:04.782854 2570 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 20 14:56:04.782875 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:56:04.782866 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/95d5e76e-7689-4825-ba12-28c88aebccda-service-ca-bundle podName:95d5e76e-7689-4825-ba12-28c88aebccda nodeName:}" failed. No retries permitted until 2026-04-20 14:56:12.782847866 +0000 UTC m=+56.438896166 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/95d5e76e-7689-4825-ba12-28c88aebccda-service-ca-bundle") pod "router-default-67d496fbdd-b9cws" (UID: "95d5e76e-7689-4825-ba12-28c88aebccda") : configmap references non-existent config key: service-ca.crt
Apr 20 14:56:04.782982 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:56:04.782902 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/95d5e76e-7689-4825-ba12-28c88aebccda-metrics-certs podName:95d5e76e-7689-4825-ba12-28c88aebccda nodeName:}" failed. No retries permitted until 2026-04-20 14:56:12.782890734 +0000 UTC m=+56.438939014 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/95d5e76e-7689-4825-ba12-28c88aebccda-metrics-certs") pod "router-default-67d496fbdd-b9cws" (UID: "95d5e76e-7689-4825-ba12-28c88aebccda") : secret "router-metrics-certs-default" not found
Apr 20 14:56:04.883941 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:04.883915 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6495ee23-3ff8-4b72-b73d-ec3a22a198c2-registry-tls\") pod \"image-registry-6fcf5597b7-ngxsq\" (UID: \"6495ee23-3ff8-4b72-b73d-ec3a22a198c2\") " pod="openshift-image-registry/image-registry-6fcf5597b7-ngxsq"
Apr 20 14:56:04.884067 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:56:04.884043 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 14:56:04.884067 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:56:04.884058 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6fcf5597b7-ngxsq: secret "image-registry-tls" not found
Apr 20 14:56:04.884130 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:56:04.884103 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/6495ee23-3ff8-4b72-b73d-ec3a22a198c2-registry-tls podName:6495ee23-3ff8-4b72-b73d-ec3a22a198c2 nodeName:}" failed. No retries permitted until 2026-04-20 14:56:12.8840887 +0000 UTC m=+56.540136981 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/6495ee23-3ff8-4b72-b73d-ec3a22a198c2-registry-tls") pod "image-registry-6fcf5597b7-ngxsq" (UID: "6495ee23-3ff8-4b72-b73d-ec3a22a198c2") : secret "image-registry-tls" not found
Apr 20 14:56:06.999753 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:06.999718 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7c674737-9de4-4df3-8cd4-de9165e6e70a-metrics-tls\") pod \"dns-default-jbtsc\" (UID: \"7c674737-9de4-4df3-8cd4-de9165e6e70a\") " pod="openshift-dns/dns-default-jbtsc"
Apr 20 14:56:07.000189 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:56:06.999847 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 14:56:07.000189 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:56:06.999916 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c674737-9de4-4df3-8cd4-de9165e6e70a-metrics-tls podName:7c674737-9de4-4df3-8cd4-de9165e6e70a nodeName:}" failed. No retries permitted until 2026-04-20 14:56:22.999900991 +0000 UTC m=+66.655949272 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/7c674737-9de4-4df3-8cd4-de9165e6e70a-metrics-tls") pod "dns-default-jbtsc" (UID: "7c674737-9de4-4df3-8cd4-de9165e6e70a") : secret "dns-default-metrics-tls" not found
Apr 20 14:56:07.100207 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:07.100173 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7fe57737-4cb8-41e4-95a2-77878dc0e909-cert\") pod \"ingress-canary-9dz42\" (UID: \"7fe57737-4cb8-41e4-95a2-77878dc0e909\") " pod="openshift-ingress-canary/ingress-canary-9dz42"
Apr 20 14:56:07.100404 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:56:07.100344 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 14:56:07.100469 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:56:07.100429 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7fe57737-4cb8-41e4-95a2-77878dc0e909-cert podName:7fe57737-4cb8-41e4-95a2-77878dc0e909 nodeName:}" failed. No retries permitted until 2026-04-20 14:56:23.100408436 +0000 UTC m=+66.756456730 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7fe57737-4cb8-41e4-95a2-77878dc0e909-cert") pod "ingress-canary-9dz42" (UID: "7fe57737-4cb8-41e4-95a2-77878dc0e909") : secret "canary-serving-cert" not found
Apr 20 14:56:07.695896 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:07.695871 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-zb7gn_f6c014f8-befe-4916-a8ed-bc592d3baacf/dns-node-resolver/0.log"
Apr 20 14:56:08.495139 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:08.495115 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-h82ph_3661ad3f-53ca-47ec-8a9b-15e3d3f054bd/node-ca/0.log"
Apr 20 14:56:12.846907 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:12.846867 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/95d5e76e-7689-4825-ba12-28c88aebccda-metrics-certs\") pod \"router-default-67d496fbdd-b9cws\" (UID: \"95d5e76e-7689-4825-ba12-28c88aebccda\") " pod="openshift-ingress/router-default-67d496fbdd-b9cws"
Apr 20 14:56:12.847380 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:12.846947 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/95d5e76e-7689-4825-ba12-28c88aebccda-service-ca-bundle\") pod \"router-default-67d496fbdd-b9cws\" (UID: \"95d5e76e-7689-4825-ba12-28c88aebccda\") " pod="openshift-ingress/router-default-67d496fbdd-b9cws"
Apr 20 14:56:12.847508 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:12.847490 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/95d5e76e-7689-4825-ba12-28c88aebccda-service-ca-bundle\") pod \"router-default-67d496fbdd-b9cws\" (UID: \"95d5e76e-7689-4825-ba12-28c88aebccda\") " pod="openshift-ingress/router-default-67d496fbdd-b9cws"
Apr 20 14:56:12.849470 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:12.849447 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/95d5e76e-7689-4825-ba12-28c88aebccda-metrics-certs\") pod \"router-default-67d496fbdd-b9cws\" (UID: \"95d5e76e-7689-4825-ba12-28c88aebccda\") " pod="openshift-ingress/router-default-67d496fbdd-b9cws"
Apr 20 14:56:12.947838 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:12.947810 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6495ee23-3ff8-4b72-b73d-ec3a22a198c2-registry-tls\") pod \"image-registry-6fcf5597b7-ngxsq\" (UID: \"6495ee23-3ff8-4b72-b73d-ec3a22a198c2\") " pod="openshift-image-registry/image-registry-6fcf5597b7-ngxsq"
Apr 20 14:56:12.950149 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:12.950128 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6495ee23-3ff8-4b72-b73d-ec3a22a198c2-registry-tls\") pod \"image-registry-6fcf5597b7-ngxsq\" (UID: \"6495ee23-3ff8-4b72-b73d-ec3a22a198c2\") " pod="openshift-image-registry/image-registry-6fcf5597b7-ngxsq"
Apr 20 14:56:12.959019 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:12.958996 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-67d496fbdd-b9cws"
Apr 20 14:56:12.964859 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:12.964828 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6fcf5597b7-ngxsq"
Apr 20 14:56:13.097431 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:13.097208 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-wh8t5" podStartSLOduration=10.097187245 podStartE2EDuration="10.097187245s" podCreationTimestamp="2026-04-20 14:56:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 14:56:04.220002382 +0000 UTC m=+47.876050684" watchObservedRunningTime="2026-04-20 14:56:13.097187245 +0000 UTC m=+56.753235548"
Apr 20 14:56:13.098020 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:13.097996 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-67d496fbdd-b9cws"]
Apr 20 14:56:13.101092 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:13.101064 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6fcf5597b7-ngxsq"]
Apr 20 14:56:13.101539 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:56:13.101514 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95d5e76e_7689_4825_ba12_28c88aebccda.slice/crio-9c614abb638efa36000dde19de99945c16a929bb1d52adcf00bb941407aec091 WatchSource:0}: Error finding container 9c614abb638efa36000dde19de99945c16a929bb1d52adcf00bb941407aec091: Status 404 returned error can't find the container with id 9c614abb638efa36000dde19de99945c16a929bb1d52adcf00bb941407aec091
Apr 20 14:56:13.103831 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:56:13.103806 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6495ee23_3ff8_4b72_b73d_ec3a22a198c2.slice/crio-cd993ddc468b53a802968654854fc6ab24fcf7eb77ba89fbe18bd0264094b00e WatchSource:0}: Error finding container cd993ddc468b53a802968654854fc6ab24fcf7eb77ba89fbe18bd0264094b00e: Status 404 returned error can't find the container with id cd993ddc468b53a802968654854fc6ab24fcf7eb77ba89fbe18bd0264094b00e
Apr 20 14:56:13.221103 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:13.221075 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6fcf5597b7-ngxsq" event={"ID":"6495ee23-3ff8-4b72-b73d-ec3a22a198c2","Type":"ContainerStarted","Data":"b6a60cccae5e031c2d3cc7c106d9bed41f55c04c0c77a4c2a9a13f651e6051fe"}
Apr 20 14:56:13.221187 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:13.221114 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6fcf5597b7-ngxsq" event={"ID":"6495ee23-3ff8-4b72-b73d-ec3a22a198c2","Type":"ContainerStarted","Data":"cd993ddc468b53a802968654854fc6ab24fcf7eb77ba89fbe18bd0264094b00e"}
Apr 20 14:56:13.222591 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:13.222563 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-67d496fbdd-b9cws" event={"ID":"95d5e76e-7689-4825-ba12-28c88aebccda","Type":"ContainerStarted","Data":"e295546bebae069eb3463892c045adb73c06e3cbffc183a84d9a524a26c6dce6"}
Apr 20 14:56:13.222682 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:13.222593 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-67d496fbdd-b9cws" event={"ID":"95d5e76e-7689-4825-ba12-28c88aebccda","Type":"ContainerStarted","Data":"9c614abb638efa36000dde19de99945c16a929bb1d52adcf00bb941407aec091"}
Apr 20 14:56:13.244090 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:13.244041 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-67d496fbdd-b9cws" podStartSLOduration=16.244024984 podStartE2EDuration="16.244024984s" podCreationTimestamp="2026-04-20 14:55:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 14:56:13.242621486 +0000 UTC m=+56.898669779" watchObservedRunningTime="2026-04-20 14:56:13.244024984 +0000 UTC m=+56.900073286"
Apr 20 14:56:13.959937 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:13.959902 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-67d496fbdd-b9cws"
Apr 20 14:56:13.962412 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:13.962390 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-67d496fbdd-b9cws"
Apr 20 14:56:14.226267 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:14.226187 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-67d496fbdd-b9cws"
Apr 20 14:56:14.227290 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:14.227271 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-67d496fbdd-b9cws"
Apr 20 14:56:14.244036 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:14.243999 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-6fcf5597b7-ngxsq" podStartSLOduration=17.24398814 podStartE2EDuration="17.24398814s" podCreationTimestamp="2026-04-20 14:55:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 14:56:14.243818246 +0000 UTC m=+57.899866548" watchObservedRunningTime="2026-04-20 14:56:14.24398814 +0000 UTC m=+57.900036439"
Apr 20 14:56:16.154820 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:16.154789 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-g9x87"
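
The pod_startup_latency_tracker entries report two figures: podStartE2EDuration, roughly observedRunningTime minus podCreationTimestamp, and podStartSLOduration, which appears to be that same interval minus image-pull time (lastFinishedPulling minus firstStartedPulling); pods whose images were already present show zero-value pull timestamps and identical figures. Checking that reading against the global-pull-secret-syncer entry earlier in this excerpt (the results agree with the reported values to within a few milliseconds, since the tracker takes its own timestamps):

    package main

    import (
        "fmt"
        "time"
    )

    // layout matches Go's default time.Time string form used in the log.
    const layout = "2006-01-02 15:04:05.999999999 -0700 MST"

    func mustParse(s string) time.Time {
        t, err := time.Parse(layout, s)
        if err != nil {
            panic(err)
        }
        return t
    }

    func main() {
        created := mustParse("2026-04-20 14:55:21 +0000 UTC")
        running := mustParse("2026-04-20 14:55:58.196942996 +0000 UTC")
        pullStart := mustParse("2026-04-20 14:55:53.644290977 +0000 UTC")
        pullEnd := mustParse("2026-04-20 14:55:57.387940163 +0000 UTC")

        e2e := running.Sub(created)         // ~37.197s vs reported "37.198047647s"
        slo := e2e - pullEnd.Sub(pullStart) // ~33.453s vs reported 33.454398446
        fmt.Println(e2e, slo)
    }
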
pod="openshift-ovn-kubernetes/ovnkube-node-g9x87" Apr 20 14:56:22.386450 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:22.386418 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6fcf5597b7-ngxsq"] Apr 20 14:56:22.386866 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:22.386821 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-6fcf5597b7-ngxsq" Apr 20 14:56:22.398296 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:22.398273 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-tjwf7"] Apr 20 14:56:22.403170 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:22.403150 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-tjwf7" Apr 20 14:56:22.407087 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:22.407071 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-vllz9\"" Apr 20 14:56:22.407196 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:22.407073 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 20 14:56:22.407353 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:22.407339 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 20 14:56:22.424141 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:22.424034 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-tjwf7"] Apr 20 14:56:22.424836 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:22.424812 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-gktqt"] Apr 20 14:56:22.428723 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:22.428687 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-gktqt" Apr 20 14:56:22.431365 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:22.431345 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 20 14:56:22.431987 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:22.431970 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-n4j2j\"" Apr 20 14:56:22.432065 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:22.431619 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 20 14:56:22.432065 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:22.431855 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 20 14:56:22.432174 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:22.431439 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 20 14:56:22.440720 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:22.440701 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-gktqt"] Apr 20 14:56:22.525138 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:22.525110 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdr8j\" (UniqueName: \"kubernetes.io/projected/764028e9-7c8b-4dc6-9392-a96033a5f59d-kube-api-access-hdr8j\") pod \"downloads-6bcc868b7-tjwf7\" (UID: \"764028e9-7c8b-4dc6-9392-a96033a5f59d\") " pod="openshift-console/downloads-6bcc868b7-tjwf7" Apr 20 14:56:22.625612 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:22.625579 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6ae6c334-21b5-4f64-b2c3-68f797cd363b-metrics-certs\") pod \"network-metrics-daemon-sjhzf\" (UID: \"6ae6c334-21b5-4f64-b2c3-68f797cd363b\") " pod="openshift-multus/network-metrics-daemon-sjhzf" Apr 20 14:56:22.625740 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:22.625646 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hdr8j\" (UniqueName: \"kubernetes.io/projected/764028e9-7c8b-4dc6-9392-a96033a5f59d-kube-api-access-hdr8j\") pod \"downloads-6bcc868b7-tjwf7\" (UID: \"764028e9-7c8b-4dc6-9392-a96033a5f59d\") " pod="openshift-console/downloads-6bcc868b7-tjwf7" Apr 20 14:56:22.625740 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:22.625682 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/07670bb4-e63a-4c79-930c-288b4bffcda3-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-gktqt\" (UID: \"07670bb4-e63a-4c79-930c-288b4bffcda3\") " pod="openshift-insights/insights-runtime-extractor-gktqt" Apr 20 14:56:22.625740 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:22.625725 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/07670bb4-e63a-4c79-930c-288b4bffcda3-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-gktqt\" (UID: \"07670bb4-e63a-4c79-930c-288b4bffcda3\") " 
pod="openshift-insights/insights-runtime-extractor-gktqt" Apr 20 14:56:22.625896 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:22.625752 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/07670bb4-e63a-4c79-930c-288b4bffcda3-data-volume\") pod \"insights-runtime-extractor-gktqt\" (UID: \"07670bb4-e63a-4c79-930c-288b4bffcda3\") " pod="openshift-insights/insights-runtime-extractor-gktqt" Apr 20 14:56:22.625896 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:22.625780 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g9wdm\" (UniqueName: \"kubernetes.io/projected/dbe6bf00-4b0b-4432-80f4-1085e83c9110-kube-api-access-g9wdm\") pod \"network-check-target-d4wt8\" (UID: \"dbe6bf00-4b0b-4432-80f4-1085e83c9110\") " pod="openshift-network-diagnostics/network-check-target-d4wt8" Apr 20 14:56:22.625896 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:22.625812 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nqhg\" (UniqueName: \"kubernetes.io/projected/07670bb4-e63a-4c79-930c-288b4bffcda3-kube-api-access-8nqhg\") pod \"insights-runtime-extractor-gktqt\" (UID: \"07670bb4-e63a-4c79-930c-288b4bffcda3\") " pod="openshift-insights/insights-runtime-extractor-gktqt" Apr 20 14:56:22.625896 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:22.625844 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/07670bb4-e63a-4c79-930c-288b4bffcda3-crio-socket\") pod \"insights-runtime-extractor-gktqt\" (UID: \"07670bb4-e63a-4c79-930c-288b4bffcda3\") " pod="openshift-insights/insights-runtime-extractor-gktqt" Apr 20 14:56:22.628119 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:22.628100 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 20 14:56:22.628178 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:22.628162 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 20 14:56:22.635601 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:22.635557 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdr8j\" (UniqueName: \"kubernetes.io/projected/764028e9-7c8b-4dc6-9392-a96033a5f59d-kube-api-access-hdr8j\") pod \"downloads-6bcc868b7-tjwf7\" (UID: \"764028e9-7c8b-4dc6-9392-a96033a5f59d\") " pod="openshift-console/downloads-6bcc868b7-tjwf7" Apr 20 14:56:22.638275 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:22.638214 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6ae6c334-21b5-4f64-b2c3-68f797cd363b-metrics-certs\") pod \"network-metrics-daemon-sjhzf\" (UID: \"6ae6c334-21b5-4f64-b2c3-68f797cd363b\") " pod="openshift-multus/network-metrics-daemon-sjhzf" Apr 20 14:56:22.638415 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:22.638395 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 20 14:56:22.648686 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:22.648668 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9wdm\" (UniqueName: 
\"kubernetes.io/projected/dbe6bf00-4b0b-4432-80f4-1085e83c9110-kube-api-access-g9wdm\") pod \"network-check-target-d4wt8\" (UID: \"dbe6bf00-4b0b-4432-80f4-1085e83c9110\") " pod="openshift-network-diagnostics/network-check-target-d4wt8" Apr 20 14:56:22.711373 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:22.711349 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-tjwf7" Apr 20 14:56:22.727205 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:22.727175 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/07670bb4-e63a-4c79-930c-288b4bffcda3-crio-socket\") pod \"insights-runtime-extractor-gktqt\" (UID: \"07670bb4-e63a-4c79-930c-288b4bffcda3\") " pod="openshift-insights/insights-runtime-extractor-gktqt" Apr 20 14:56:22.727336 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:22.727287 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/07670bb4-e63a-4c79-930c-288b4bffcda3-crio-socket\") pod \"insights-runtime-extractor-gktqt\" (UID: \"07670bb4-e63a-4c79-930c-288b4bffcda3\") " pod="openshift-insights/insights-runtime-extractor-gktqt" Apr 20 14:56:22.727336 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:22.727292 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/07670bb4-e63a-4c79-930c-288b4bffcda3-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-gktqt\" (UID: \"07670bb4-e63a-4c79-930c-288b4bffcda3\") " pod="openshift-insights/insights-runtime-extractor-gktqt" Apr 20 14:56:22.727447 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:22.727384 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/07670bb4-e63a-4c79-930c-288b4bffcda3-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-gktqt\" (UID: \"07670bb4-e63a-4c79-930c-288b4bffcda3\") " pod="openshift-insights/insights-runtime-extractor-gktqt" Apr 20 14:56:22.727447 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:22.727413 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/07670bb4-e63a-4c79-930c-288b4bffcda3-data-volume\") pod \"insights-runtime-extractor-gktqt\" (UID: \"07670bb4-e63a-4c79-930c-288b4bffcda3\") " pod="openshift-insights/insights-runtime-extractor-gktqt" Apr 20 14:56:22.727447 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:22.727443 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8nqhg\" (UniqueName: \"kubernetes.io/projected/07670bb4-e63a-4c79-930c-288b4bffcda3-kube-api-access-8nqhg\") pod \"insights-runtime-extractor-gktqt\" (UID: \"07670bb4-e63a-4c79-930c-288b4bffcda3\") " pod="openshift-insights/insights-runtime-extractor-gktqt" Apr 20 14:56:22.727829 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:22.727808 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/07670bb4-e63a-4c79-930c-288b4bffcda3-data-volume\") pod \"insights-runtime-extractor-gktqt\" (UID: \"07670bb4-e63a-4c79-930c-288b4bffcda3\") " pod="openshift-insights/insights-runtime-extractor-gktqt" Apr 20 14:56:22.728840 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:22.728804 2570 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/07670bb4-e63a-4c79-930c-288b4bffcda3-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-gktqt\" (UID: \"07670bb4-e63a-4c79-930c-288b4bffcda3\") " pod="openshift-insights/insights-runtime-extractor-gktqt" Apr 20 14:56:22.730316 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:22.730280 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/07670bb4-e63a-4c79-930c-288b4bffcda3-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-gktqt\" (UID: \"07670bb4-e63a-4c79-930c-288b4bffcda3\") " pod="openshift-insights/insights-runtime-extractor-gktqt" Apr 20 14:56:22.740491 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:22.740472 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nqhg\" (UniqueName: \"kubernetes.io/projected/07670bb4-e63a-4c79-930c-288b4bffcda3-kube-api-access-8nqhg\") pod \"insights-runtime-extractor-gktqt\" (UID: \"07670bb4-e63a-4c79-930c-288b4bffcda3\") " pod="openshift-insights/insights-runtime-extractor-gktqt" Apr 20 14:56:22.826989 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:22.826958 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-tjwf7"] Apr 20 14:56:22.830856 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:56:22.830822 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod764028e9_7c8b_4dc6_9392_a96033a5f59d.slice/crio-a223644fcd49ee6c293ecbe1fe07bb48a908b3c282fac01c190fe068dbc7e7f9 WatchSource:0}: Error finding container a223644fcd49ee6c293ecbe1fe07bb48a908b3c282fac01c190fe068dbc7e7f9: Status 404 returned error can't find the container with id a223644fcd49ee6c293ecbe1fe07bb48a908b3c282fac01c190fe068dbc7e7f9 Apr 20 14:56:22.892226 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:22.892144 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-6x78v\"" Apr 20 14:56:22.897244 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:22.897227 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-6rqtq\"" Apr 20 14:56:22.900048 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:22.900029 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-d4wt8" Apr 20 14:56:22.905814 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:22.905794 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-sjhzf" Apr 20 14:56:23.021772 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:23.021749 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-d4wt8"] Apr 20 14:56:23.024134 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:56:23.024108 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddbe6bf00_4b0b_4432_80f4_1085e83c9110.slice/crio-a027c986d0ba065e7a355bc9f278df6f6e39df8ce3f9581012d8e2cf708a0500 WatchSource:0}: Error finding container a027c986d0ba065e7a355bc9f278df6f6e39df8ce3f9581012d8e2cf708a0500: Status 404 returned error can't find the container with id a027c986d0ba065e7a355bc9f278df6f6e39df8ce3f9581012d8e2cf708a0500 Apr 20 14:56:23.030668 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:23.030645 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7c674737-9de4-4df3-8cd4-de9165e6e70a-metrics-tls\") pod \"dns-default-jbtsc\" (UID: \"7c674737-9de4-4df3-8cd4-de9165e6e70a\") " pod="openshift-dns/dns-default-jbtsc" Apr 20 14:56:23.032966 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:23.032945 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7c674737-9de4-4df3-8cd4-de9165e6e70a-metrics-tls\") pod \"dns-default-jbtsc\" (UID: \"7c674737-9de4-4df3-8cd4-de9165e6e70a\") " pod="openshift-dns/dns-default-jbtsc" Apr 20 14:56:23.037574 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:23.037549 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-gktqt" Apr 20 14:56:23.038892 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:23.038872 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-sjhzf"] Apr 20 14:56:23.042724 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:56:23.042700 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ae6c334_21b5_4f64_b2c3_68f797cd363b.slice/crio-90eaea1fd01e49031d18fd3feef74b12fa158020e87c1625b7e21bd56a2608a2 WatchSource:0}: Error finding container 90eaea1fd01e49031d18fd3feef74b12fa158020e87c1625b7e21bd56a2608a2: Status 404 returned error can't find the container with id 90eaea1fd01e49031d18fd3feef74b12fa158020e87c1625b7e21bd56a2608a2 Apr 20 14:56:23.131879 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:23.131843 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7fe57737-4cb8-41e4-95a2-77878dc0e909-cert\") pod \"ingress-canary-9dz42\" (UID: \"7fe57737-4cb8-41e4-95a2-77878dc0e909\") " pod="openshift-ingress-canary/ingress-canary-9dz42" Apr 20 14:56:23.134410 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:23.134382 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7fe57737-4cb8-41e4-95a2-77878dc0e909-cert\") pod \"ingress-canary-9dz42\" (UID: \"7fe57737-4cb8-41e4-95a2-77878dc0e909\") " pod="openshift-ingress-canary/ingress-canary-9dz42" Apr 20 14:56:23.158221 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:23.158080 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-gktqt"] Apr 20 14:56:23.160459 ip-10-0-133-163 
kubenswrapper[2570]: W0420 14:56:23.160428 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07670bb4_e63a_4c79_930c_288b4bffcda3.slice/crio-9c16d0ed99ffbefd61b8f55c7e10a2bc64a1b0fbfb829ad9ce36e135bb90cb3b WatchSource:0}: Error finding container 9c16d0ed99ffbefd61b8f55c7e10a2bc64a1b0fbfb829ad9ce36e135bb90cb3b: Status 404 returned error can't find the container with id 9c16d0ed99ffbefd61b8f55c7e10a2bc64a1b0fbfb829ad9ce36e135bb90cb3b Apr 20 14:56:23.249062 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:23.249026 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-gktqt" event={"ID":"07670bb4-e63a-4c79-930c-288b4bffcda3","Type":"ContainerStarted","Data":"d80d149c75147d739fafd7085d5052b2053b00b60c5057fd0b88f5a524aa66f3"} Apr 20 14:56:23.249062 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:23.249063 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-gktqt" event={"ID":"07670bb4-e63a-4c79-930c-288b4bffcda3","Type":"ContainerStarted","Data":"9c16d0ed99ffbefd61b8f55c7e10a2bc64a1b0fbfb829ad9ce36e135bb90cb3b"} Apr 20 14:56:23.250050 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:23.250023 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-d4wt8" event={"ID":"dbe6bf00-4b0b-4432-80f4-1085e83c9110","Type":"ContainerStarted","Data":"a027c986d0ba065e7a355bc9f278df6f6e39df8ce3f9581012d8e2cf708a0500"} Apr 20 14:56:23.251023 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:23.251000 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-tjwf7" event={"ID":"764028e9-7c8b-4dc6-9392-a96033a5f59d","Type":"ContainerStarted","Data":"a223644fcd49ee6c293ecbe1fe07bb48a908b3c282fac01c190fe068dbc7e7f9"} Apr 20 14:56:23.251946 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:23.251929 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-sjhzf" event={"ID":"6ae6c334-21b5-4f64-b2c3-68f797cd363b","Type":"ContainerStarted","Data":"90eaea1fd01e49031d18fd3feef74b12fa158020e87c1625b7e21bd56a2608a2"} Apr 20 14:56:23.309921 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:23.309884 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-tr5vz\"" Apr 20 14:56:23.317926 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:23.317899 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-jbtsc" Apr 20 14:56:23.359326 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:23.359277 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-xg95b\"" Apr 20 14:56:23.367473 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:23.367444 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-9dz42" Apr 20 14:56:23.454287 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:23.454242 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-jbtsc"] Apr 20 14:56:23.459578 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:56:23.459536 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c674737_9de4_4df3_8cd4_de9165e6e70a.slice/crio-a84b3dcf9aef76dc830969b208fc4e00ebe5aec2d10a1571d75ca46cfaf725eb WatchSource:0}: Error finding container a84b3dcf9aef76dc830969b208fc4e00ebe5aec2d10a1571d75ca46cfaf725eb: Status 404 returned error can't find the container with id a84b3dcf9aef76dc830969b208fc4e00ebe5aec2d10a1571d75ca46cfaf725eb Apr 20 14:56:23.527682 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:23.527654 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-9dz42"] Apr 20 14:56:24.257636 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:24.257600 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-9dz42" event={"ID":"7fe57737-4cb8-41e4-95a2-77878dc0e909","Type":"ContainerStarted","Data":"fb4b8da4aa246cd8d43616a61fcfd20e88c9981a2197ec104414559cf4223296"} Apr 20 14:56:24.259064 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:24.259030 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jbtsc" event={"ID":"7c674737-9de4-4df3-8cd4-de9165e6e70a","Type":"ContainerStarted","Data":"a84b3dcf9aef76dc830969b208fc4e00ebe5aec2d10a1571d75ca46cfaf725eb"} Apr 20 14:56:25.265701 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:25.265664 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-gktqt" event={"ID":"07670bb4-e63a-4c79-930c-288b4bffcda3","Type":"ContainerStarted","Data":"56a98b975779d46d9785bbd2ae5ad29b78687f584eeb05de4448ad60237b2317"} Apr 20 14:56:25.268591 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:25.268562 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-sjhzf" event={"ID":"6ae6c334-21b5-4f64-b2c3-68f797cd363b","Type":"ContainerStarted","Data":"e04cb087920294a1411cc394029f5be8b30826fb99a19d26f485f1b13f5e2ab6"} Apr 20 14:56:25.268706 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:25.268600 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-sjhzf" event={"ID":"6ae6c334-21b5-4f64-b2c3-68f797cd363b","Type":"ContainerStarted","Data":"728b7a2b35ecafa19b6723824e33fcf01ad662143d34b379bde45c56ffb6f6c0"} Apr 20 14:56:25.285649 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:25.285598 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-sjhzf" podStartSLOduration=67.019241721 podStartE2EDuration="1m8.285583023s" podCreationTimestamp="2026-04-20 14:55:17 +0000 UTC" firstStartedPulling="2026-04-20 14:56:23.044598122 +0000 UTC m=+66.700646402" lastFinishedPulling="2026-04-20 14:56:24.310939418 +0000 UTC m=+67.966987704" observedRunningTime="2026-04-20 14:56:25.284958416 +0000 UTC m=+68.941006719" watchObservedRunningTime="2026-04-20 14:56:25.285583023 +0000 UTC m=+68.941631326" Apr 20 14:56:29.282714 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:29.282678 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-gktqt" 
Apr 20 14:56:29.282714 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:29.282678 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-gktqt" event={"ID":"07670bb4-e63a-4c79-930c-288b4bffcda3","Type":"ContainerStarted","Data":"0a31eace485d84e78177ec26e93f7f9be473ecadc5f613329ed9da5c21e340fd"}
Apr 20 14:56:29.284204 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:29.284170 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-d4wt8" event={"ID":"dbe6bf00-4b0b-4432-80f4-1085e83c9110","Type":"ContainerStarted","Data":"c40df656e39125e339fb5e5f762d3a6339cbcfe8ee0e342bda9c0dae2122126d"}
Apr 20 14:56:29.284351 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:29.284317 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-d4wt8"
Apr 20 14:56:29.285732 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:29.285706 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-9dz42" event={"ID":"7fe57737-4cb8-41e4-95a2-77878dc0e909","Type":"ContainerStarted","Data":"02429b30b9fa47a43c80626da3dc9c2929852415b9cc100dfe4c36946aaba8ae"}
Apr 20 14:56:29.287188 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:29.287164 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jbtsc" event={"ID":"7c674737-9de4-4df3-8cd4-de9165e6e70a","Type":"ContainerStarted","Data":"b9a732c9c03919ac0261106d1a03b7502c40b1c24bd5cd24854dd36b77e9eb7e"}
Apr 20 14:56:29.287188 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:29.287188 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-jbtsc" event={"ID":"7c674737-9de4-4df3-8cd4-de9165e6e70a","Type":"ContainerStarted","Data":"3b70eecbf75e083d4023a638e68e69f2066637cb5114a1ca09ed89abca55f6a0"}
Apr 20 14:56:29.287385 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:29.287342 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-jbtsc"
Apr 20 14:56:29.301497 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:29.301453 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-gktqt" podStartSLOduration=1.859308768 podStartE2EDuration="7.3014382s" podCreationTimestamp="2026-04-20 14:56:22 +0000 UTC" firstStartedPulling="2026-04-20 14:56:23.217399169 +0000 UTC m=+66.873447449" lastFinishedPulling="2026-04-20 14:56:28.659528595 +0000 UTC m=+72.315576881" observedRunningTime="2026-04-20 14:56:29.299577107 +0000 UTC m=+72.955625412" watchObservedRunningTime="2026-04-20 14:56:29.3014382 +0000 UTC m=+72.957486505"
Apr 20 14:56:29.314809 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:29.314768 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-d4wt8" podStartSLOduration=66.681167036 podStartE2EDuration="1m12.314756488s" podCreationTimestamp="2026-04-20 14:55:17 +0000 UTC" firstStartedPulling="2026-04-20 14:56:23.026002805 +0000 UTC m=+66.682051088" lastFinishedPulling="2026-04-20 14:56:28.659592261 +0000 UTC m=+72.315640540" observedRunningTime="2026-04-20 14:56:29.313751728 +0000 UTC m=+72.969800031" watchObservedRunningTime="2026-04-20 14:56:29.314756488 +0000 UTC m=+72.970804803"
Apr 20 14:56:29.344286 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:29.344238 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-jbtsc" podStartSLOduration=33.147345937 podStartE2EDuration="38.344225241s" podCreationTimestamp="2026-04-20 14:55:51 +0000 UTC" firstStartedPulling="2026-04-20 14:56:23.462486848 +0000 UTC m=+67.118535130" lastFinishedPulling="2026-04-20 14:56:28.659366143 +0000 UTC m=+72.315414434" observedRunningTime="2026-04-20 14:56:29.343595287 +0000 UTC m=+72.999643589" watchObservedRunningTime="2026-04-20 14:56:29.344225241 +0000 UTC m=+73.000273544"
Apr 20 14:56:29.378785 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:29.378739 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-9dz42" podStartSLOduration=33.246574657 podStartE2EDuration="38.378724383s" podCreationTimestamp="2026-04-20 14:55:51 +0000 UTC" firstStartedPulling="2026-04-20 14:56:23.532853969 +0000 UTC m=+67.188902260" lastFinishedPulling="2026-04-20 14:56:28.6650037 +0000 UTC m=+72.321051986" observedRunningTime="2026-04-20 14:56:29.377839978 +0000 UTC m=+73.033888281" watchObservedRunningTime="2026-04-20 14:56:29.378724383 +0000 UTC m=+73.034772688"
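
The pod_startup_latency_tracker entries above are internally consistent: podStartE2EDuration equals watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration appears to be the E2E figure minus the image-pull window (lastFinishedPulling - firstStartedPulling). Re-deriving the network-metrics-daemon-sjhzf numbers from its entry above; Python datetimes carry only microseconds while the kubelet subtracts the nanosecond monotonic m=+ offsets, so the check agrees to about a microsecond:

    from datetime import datetime, timezone

    def ts(s):
        # "2026-04-20 14:56:25.285583023 +0000 UTC" -> aware datetime, truncated to us
        date, time, _off, _tz = s.split()
        if "." in time:
            whole, frac = time.split(".")
            time = f"{whole}.{frac[:6]}"
        else:
            time += ".000000"
        return datetime.strptime(f"{date} {time}",
                                 "%Y-%m-%d %H:%M:%S.%f").replace(tzinfo=timezone.utc)

    created    = ts("2026-04-20 14:55:17 +0000 UTC")            # podCreationTimestamp
    running    = ts("2026-04-20 14:56:25.285583023 +0000 UTC")  # watchObservedRunningTime
    pull_start = ts("2026-04-20 14:56:23.044598122 +0000 UTC")  # firstStartedPulling
    pull_end   = ts("2026-04-20 14:56:24.310939418 +0000 UTC")  # lastFinishedPulling

    e2e = (running - created).total_seconds()
    slo = e2e - (pull_end - pull_start).total_seconds()
    print(f"E2E {e2e:.6f}s  (logged 1m8.285583023s)")
    print(f"SLO {slo:.6f}s  (logged 67.019241721s)")

The same arithmetic reproduces the insights-runtime-extractor figures (7.30s E2E, 1.86s SLO), which is why a pod created at 14:55:17 but whose image pull could only begin at 14:56:23 still reports a 67s SLO duration: the tracker charges almost the whole wait to the pod, not to the pull itself.
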
Apr 20 14:56:31.432696 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:31.431629 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-k7lvd"]
Apr 20 14:56:31.435821 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:31.435786 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-k7lvd"
Apr 20 14:56:31.438442 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:31.438217 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\""
Apr 20 14:56:31.438442 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:31.438334 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-dshqb\""
Apr 20 14:56:31.441602 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:31.441580 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-k7lvd"]
Apr 20 14:56:31.497274 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:31.494044 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/368b3ec8-ae66-4571-9538-90802a0710c3-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-k7lvd\" (UID: \"368b3ec8-ae66-4571-9538-90802a0710c3\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-k7lvd"
Apr 20 14:56:31.594992 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:31.594963 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/368b3ec8-ae66-4571-9538-90802a0710c3-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-k7lvd\" (UID: \"368b3ec8-ae66-4571-9538-90802a0710c3\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-k7lvd"
Apr 20 14:56:31.595120 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:56:31.595106 2570 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found
Apr 20 14:56:31.595187 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:56:31.595177 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/368b3ec8-ae66-4571-9538-90802a0710c3-tls-certificates podName:368b3ec8-ae66-4571-9538-90802a0710c3 nodeName:}" failed. No retries permitted until 2026-04-20 14:56:32.095155524 +0000 UTC m=+75.751203819 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/368b3ec8-ae66-4571-9538-90802a0710c3-tls-certificates") pod "prometheus-operator-admission-webhook-57cf98b594-k7lvd" (UID: "368b3ec8-ae66-4571-9538-90802a0710c3") : secret "prometheus-operator-admission-webhook-tls" not found
Apr 20 14:56:32.099470 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:32.099435 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/368b3ec8-ae66-4571-9538-90802a0710c3-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-k7lvd\" (UID: \"368b3ec8-ae66-4571-9538-90802a0710c3\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-k7lvd"
Apr 20 14:56:32.099656 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:56:32.099603 2570 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found
Apr 20 14:56:32.099707 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:56:32.099673 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/368b3ec8-ae66-4571-9538-90802a0710c3-tls-certificates podName:368b3ec8-ae66-4571-9538-90802a0710c3 nodeName:}" failed. No retries permitted until 2026-04-20 14:56:33.099651451 +0000 UTC m=+76.755699732 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/368b3ec8-ae66-4571-9538-90802a0710c3-tls-certificates") pod "prometheus-operator-admission-webhook-57cf98b594-k7lvd" (UID: "368b3ec8-ae66-4571-9538-90802a0710c3") : secret "prometheus-operator-admission-webhook-tls" not found
Apr 20 14:56:32.392036 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:32.391967 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-6fcf5597b7-ngxsq"
Apr 20 14:56:33.108291 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:33.108258 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/368b3ec8-ae66-4571-9538-90802a0710c3-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-k7lvd\" (UID: \"368b3ec8-ae66-4571-9538-90802a0710c3\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-k7lvd"
Apr 20 14:56:33.110938 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:33.110904 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/368b3ec8-ae66-4571-9538-90802a0710c3-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-k7lvd\" (UID: \"368b3ec8-ae66-4571-9538-90802a0710c3\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-k7lvd"
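
The two nestedpendingoperations.go:348 errors just above show the volume manager's retry policy: the pod is scheduled before its TLS secret exists, the first failed SetUp is retried after durationBeforeRetry 500ms, the next after 1s, i.e. the delay doubles per failure until the secret lands and the mount succeeds at 14:56:33.110. A minimal sketch of that doubling; the cap value is an assumption, since this excerpt resolves before any cap is reached:

    def backoff_delays(initial=0.5, factor=2.0, cap=120.0):
        """Yield retry delays: 0.5s, 1s, 2s, ... (cap assumed, not shown in the log)."""
        delay = initial
        while True:
            yield delay
            delay = min(delay * factor, cap)

    for attempt, delay in zip(range(1, 6), backoff_delays()):
        print(f"attempt {attempt}: retry in {delay:g}s")

So "secret ... not found" here is ordering noise between the operator creating the secret and the kubelet mounting it, not a persistent failure; it would only matter if the retries kept doubling without a matching SetUp succeeded line.
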
Apr 20 14:56:33.248228 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:33.248200 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-k7lvd"
Apr 20 14:56:33.384940 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:33.384864 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-k7lvd"]
Apr 20 14:56:33.388225 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:56:33.388195 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod368b3ec8_ae66_4571_9538_90802a0710c3.slice/crio-96ba421e4c0e68d7a0843ce760756deb2b798b267fde0fbcdc472ee8567382d6 WatchSource:0}: Error finding container 96ba421e4c0e68d7a0843ce760756deb2b798b267fde0fbcdc472ee8567382d6: Status 404 returned error can't find the container with id 96ba421e4c0e68d7a0843ce760756deb2b798b267fde0fbcdc472ee8567382d6
Apr 20 14:56:34.305712 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:34.305673 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-k7lvd" event={"ID":"368b3ec8-ae66-4571-9538-90802a0710c3","Type":"ContainerStarted","Data":"96ba421e4c0e68d7a0843ce760756deb2b798b267fde0fbcdc472ee8567382d6"}
Apr 20 14:56:39.295556 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:39.295431 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-jbtsc"
Apr 20 14:56:41.326603 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:41.326563 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-k7lvd" event={"ID":"368b3ec8-ae66-4571-9538-90802a0710c3","Type":"ContainerStarted","Data":"2dfedb43644373edf70b729c1388acbf88e9420cf00e5e8875790cb214166a40"}
Apr 20 14:56:41.327062 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:41.326955 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-k7lvd"
Apr 20 14:56:41.328863 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:41.328422 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-tjwf7" event={"ID":"764028e9-7c8b-4dc6-9392-a96033a5f59d","Type":"ContainerStarted","Data":"596c00b0f7273a68593f643a7d9d98a8bb744d2d2a77f29f46e5fe093877aa55"}
Apr 20 14:56:41.328863 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:41.328703 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-tjwf7"
Apr 20 14:56:41.332594 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:41.332568 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-k7lvd"
Apr 20 14:56:41.339426 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:41.339405 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-tjwf7"
Apr 20 14:56:41.347722 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:41.347681 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-k7lvd" podStartSLOduration=2.808684852 podStartE2EDuration="10.347667004s" podCreationTimestamp="2026-04-20 14:56:31 +0000 UTC" firstStartedPulling="2026-04-20 14:56:33.390564979 +0000 UTC m=+77.046613260" lastFinishedPulling="2026-04-20 14:56:40.929547132 +0000 UTC m=+84.585595412" observedRunningTime="2026-04-20 14:56:41.345055227 +0000 UTC m=+85.001103530" watchObservedRunningTime="2026-04-20 14:56:41.347667004 +0000 UTC m=+85.003715306"
Apr 20 14:56:41.366492 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:41.366447 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-tjwf7" podStartSLOduration=1.2258782990000001 podStartE2EDuration="19.366434886s" podCreationTimestamp="2026-04-20 14:56:22 +0000 UTC" firstStartedPulling="2026-04-20 14:56:22.832622298 +0000 UTC m=+66.488670578" lastFinishedPulling="2026-04-20 14:56:40.973178884 +0000 UTC m=+84.629227165" observedRunningTime="2026-04-20 14:56:41.365601265 +0000 UTC m=+85.021649564" watchObservedRunningTime="2026-04-20 14:56:41.366434886 +0000 UTC m=+85.022483187"
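
Each timestamp pair in these entries, e.g. "2026-04-20 14:56:41.345055227 +0000 UTC m=+85.001103530", carries both a wall-clock reading and (this is how Go prints a time.Time with a monotonic reading) the offset in seconds since the process obtained its clock, i.e. since this kubelet started. Subtracting the offset from the wall clock should therefore give the same instant for every entry, and across this log it does, pinning kubelet start at roughly 14:55:16.344. A quick consistency check under that assumption, using pairs copied from the entries above:

    from datetime import datetime, timedelta, timezone

    samples = [  # (wall clock, m=+ offset) pairs taken from entries above
        ("2026-04-20 14:56:23.044598122", 66.700646402),
        ("2026-04-20 14:56:29.3014382",   72.957486505),
        ("2026-04-20 14:56:41.345055227", 85.001103530),
    ]
    for wall, mono in samples:
        t = datetime.strptime(wall[:26], "%Y-%m-%d %H:%M:%S.%f").replace(tzinfo=timezone.utc)
        print(wall, "-> process started", (t - timedelta(seconds=mono)).time())

All three print about 14:55:16.34395, so the m=+ offsets are a reliable way to order events in this log even if the wall clock were stepped mid-boot.
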
Apr 20 14:56:41.551055 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:41.551015 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-scrln"]
Apr 20 14:56:41.555446 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:41.555424 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-scrln"
Apr 20 14:56:41.559191 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:41.559152 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Apr 20 14:56:41.559500 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:41.559478 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 20 14:56:41.559722 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:41.559707 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 20 14:56:41.559892 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:41.559879 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Apr 20 14:56:41.559991 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:41.559974 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-wbfr6\""
Apr 20 14:56:41.560119 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:41.560102 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 20 14:56:41.570873 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:41.570834 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-scrln"]
Apr 20 14:56:41.670410 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:41.670293 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/7d4fa613-c16b-4696-a031-8643149ab3a6-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-scrln\" (UID: \"7d4fa613-c16b-4696-a031-8643149ab3a6\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-scrln"
Apr 20 14:56:41.670541 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:41.670453 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7d4fa613-c16b-4696-a031-8643149ab3a6-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-scrln\" (UID: \"7d4fa613-c16b-4696-a031-8643149ab3a6\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-scrln"
Apr 20 14:56:41.670541 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:41.670483 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzhk7\" (UniqueName: \"kubernetes.io/projected/7d4fa613-c16b-4696-a031-8643149ab3a6-kube-api-access-xzhk7\") pod \"prometheus-operator-5676c8c784-scrln\" (UID: \"7d4fa613-c16b-4696-a031-8643149ab3a6\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-scrln"
Apr 20 14:56:41.670541 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:41.670515 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7d4fa613-c16b-4696-a031-8643149ab3a6-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-scrln\" (UID: \"7d4fa613-c16b-4696-a031-8643149ab3a6\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-scrln"
Apr 20 14:56:41.771145 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:41.771106 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7d4fa613-c16b-4696-a031-8643149ab3a6-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-scrln\" (UID: \"7d4fa613-c16b-4696-a031-8643149ab3a6\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-scrln"
Apr 20 14:56:41.771342 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:41.771174 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/7d4fa613-c16b-4696-a031-8643149ab3a6-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-scrln\" (UID: \"7d4fa613-c16b-4696-a031-8643149ab3a6\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-scrln"
Apr 20 14:56:41.771342 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:41.771253 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7d4fa613-c16b-4696-a031-8643149ab3a6-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-scrln\" (UID: \"7d4fa613-c16b-4696-a031-8643149ab3a6\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-scrln"
Apr 20 14:56:41.771342 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:41.771284 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xzhk7\" (UniqueName: \"kubernetes.io/projected/7d4fa613-c16b-4696-a031-8643149ab3a6-kube-api-access-xzhk7\") pod \"prometheus-operator-5676c8c784-scrln\" (UID: \"7d4fa613-c16b-4696-a031-8643149ab3a6\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-scrln"
Apr 20 14:56:41.772947 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:41.772914 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7d4fa613-c16b-4696-a031-8643149ab3a6-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-scrln\" (UID: \"7d4fa613-c16b-4696-a031-8643149ab3a6\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-scrln"
Apr 20 14:56:41.774134 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:41.774086 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7d4fa613-c16b-4696-a031-8643149ab3a6-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-scrln\" (UID: \"7d4fa613-c16b-4696-a031-8643149ab3a6\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-scrln"
Apr 20 14:56:41.774298 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:41.774273 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/7d4fa613-c16b-4696-a031-8643149ab3a6-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-scrln\" (UID: \"7d4fa613-c16b-4696-a031-8643149ab3a6\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-scrln"
Apr 20 14:56:41.784066 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:41.784032 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzhk7\" (UniqueName: \"kubernetes.io/projected/7d4fa613-c16b-4696-a031-8643149ab3a6-kube-api-access-xzhk7\") pod \"prometheus-operator-5676c8c784-scrln\" (UID: \"7d4fa613-c16b-4696-a031-8643149ab3a6\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-scrln"
Apr 20 14:56:41.871065 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:41.871031 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-scrln"
Apr 20 14:56:42.011484 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:42.011446 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-scrln"]
Apr 20 14:56:42.014380 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:56:42.014350 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d4fa613_c16b_4696_a031_8643149ab3a6.slice/crio-c2842f2b9967dd79807b0ff99429cbbbcdc95491633db84ecd3991733d2970bb WatchSource:0}: Error finding container c2842f2b9967dd79807b0ff99429cbbbcdc95491633db84ecd3991733d2970bb: Status 404 returned error can't find the container with id c2842f2b9967dd79807b0ff99429cbbbcdc95491633db84ecd3991733d2970bb
Apr 20 14:56:42.332967 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:42.332926 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-scrln" event={"ID":"7d4fa613-c16b-4696-a031-8643149ab3a6","Type":"ContainerStarted","Data":"c2842f2b9967dd79807b0ff99429cbbbcdc95491633db84ecd3991733d2970bb"}
Apr 20 14:56:44.341389 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:44.341345 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-scrln" event={"ID":"7d4fa613-c16b-4696-a031-8643149ab3a6","Type":"ContainerStarted","Data":"46e978ee5ad5b8cfcb97d66ad2b9967c9bd74ce4d3c11f186b672d2a39bafe2a"}
Apr 20 14:56:44.341389 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:44.341381 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-scrln" event={"ID":"7d4fa613-c16b-4696-a031-8643149ab3a6","Type":"ContainerStarted","Data":"45bf0f61dcd85bcff0987ff3b1b1e048e7581a89135f7f0ced18006213e7e5fa"}
Apr 20 14:56:44.361842 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:44.361797 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-scrln" podStartSLOduration=1.65209077 podStartE2EDuration="3.36178387s" podCreationTimestamp="2026-04-20 14:56:41 +0000 UTC" firstStartedPulling="2026-04-20 14:56:42.016597254 +0000 UTC m=+85.672645533" lastFinishedPulling="2026-04-20 14:56:43.726290354 +0000 UTC m=+87.382338633" observedRunningTime="2026-04-20 14:56:44.360036261 +0000 UTC m=+88.016084564" watchObservedRunningTime="2026-04-20 14:56:44.36178387 +0000 UTC m=+88.017832180"
Apr 20 14:56:45.898986 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:45.898953 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-gqhb4"]
Apr 20 14:56:45.947450 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:45.947212 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-gqhb4"]
Apr 20 14:56:45.947686 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:45.947674 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-rt627"]
Apr 20 14:56:45.947876 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:45.947423 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gqhb4"
Apr 20 14:56:45.950532 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:45.950497 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-b94mk\""
Apr 20 14:56:45.950692 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:45.950672 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\""
Apr 20 14:56:45.951008 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:45.950994 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\""
Apr 20 14:56:45.967864 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:45.967845 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-sb9k2"]
Apr 20 14:56:45.972741 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:45.969707 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-rt627"
Apr 20 14:56:45.978699 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:45.978677 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\""
Apr 20 14:56:45.978808 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:45.978739 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\""
Apr 20 14:56:45.978808 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:45.978801 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\""
Apr 20 14:56:45.978928 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:45.978878 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-cscl5\""
Apr 20 14:56:45.993982 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:45.993960 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-rt627"]
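
The volume lines that dominate the rest of this boot follow a fixed three-step lifecycle per volume: reconciler_common.go:251 "VerifyControllerAttachedVolume started", then reconciler_common.go:224 "MountVolume started", then operation_generator.go:615 "MountVolume.SetUp succeeded", with the secret.go:189 / nestedpendingoperations.go:348 error pair interleaved when a referenced Secret doesn't exist yet. A sketch that reduces a journal excerpt to each volume's furthest step, assuming only those message shapes; it keys by volume name alone, which is good enough for a single-node excerpt even though a name like metrics-client-ca recurs across pods:

    import re
    import sys

    STEPS = [  # (substring to look for, state label), later lines overwrite earlier
        ("VerifyControllerAttachedVolume started", "attached"),
        ("operationExecutor.MountVolume started", "mounting"),
        ("MountVolume.SetUp succeeded", "mounted"),
        ("MountVolume.SetUp failed", "retrying"),
    ]
    # volume names appear both as \"name\" (escaped) and "name" (plain) in this log
    VOLUME = re.compile(r'for volume \\?"([^"\\]+)\\?"')

    state = {}
    for line in sys.stdin:
        for needle, label in STEPS:
            if needle in line and (m := VOLUME.search(line)):
                state[m.group(1)] = label
                break
    for volume, label in sorted(state.items()):
        print(f"{volume:55s} {label}")

Anything still "mounting" or "retrying" at the end of the excerpt is where to look first; in this log every volume eventually reaches "mounted".
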
Apr 20 14:56:45.994096 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:45.994082 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-sb9k2"
Apr 20 14:56:46.000067 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:45.998610 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 20 14:56:46.000067 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:45.998804 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 20 14:56:46.000067 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:45.999412 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-cx2mf\""
Apr 20 14:56:46.000067 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:45.999605 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 20 14:56:46.004288 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:46.004267 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/92d39451-f35b-4d2a-88da-a4769e1eaae5-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-rt627\" (UID: \"92d39451-f35b-4d2a-88da-a4769e1eaae5\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rt627"
Apr 20 14:56:46.004491 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:46.004473 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/14881dac-c7a8-45ea-bd59-230b8e9811af-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-gqhb4\" (UID: \"14881dac-c7a8-45ea-bd59-230b8e9811af\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gqhb4"
Apr 20 14:56:46.004647 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:46.004631 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d78kk\" (UniqueName: \"kubernetes.io/projected/14881dac-c7a8-45ea-bd59-230b8e9811af-kube-api-access-d78kk\") pod \"openshift-state-metrics-9d44df66c-gqhb4\" (UID: \"14881dac-c7a8-45ea-bd59-230b8e9811af\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gqhb4"
Apr 20 14:56:46.004790 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:46.004774 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/92d39451-f35b-4d2a-88da-a4769e1eaae5-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-rt627\" (UID: \"92d39451-f35b-4d2a-88da-a4769e1eaae5\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rt627"
Apr 20 14:56:46.004943 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:46.004907 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/14881dac-c7a8-45ea-bd59-230b8e9811af-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-gqhb4\" (UID: \"14881dac-c7a8-45ea-bd59-230b8e9811af\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gqhb4"
Apr 20 14:56:46.005068 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:46.005053 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/92d39451-f35b-4d2a-88da-a4769e1eaae5-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-rt627\" (UID: \"92d39451-f35b-4d2a-88da-a4769e1eaae5\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rt627"
Apr 20 14:56:46.005172 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:46.005156 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhfvj\" (UniqueName: \"kubernetes.io/projected/92d39451-f35b-4d2a-88da-a4769e1eaae5-kube-api-access-hhfvj\") pod \"kube-state-metrics-69db897b98-rt627\" (UID: \"92d39451-f35b-4d2a-88da-a4769e1eaae5\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rt627"
Apr 20 14:56:46.005289 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:46.005272 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/92d39451-f35b-4d2a-88da-a4769e1eaae5-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-rt627\" (UID: \"92d39451-f35b-4d2a-88da-a4769e1eaae5\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rt627"
Apr 20 14:56:46.005427 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:46.005411 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/14881dac-c7a8-45ea-bd59-230b8e9811af-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-gqhb4\" (UID: \"14881dac-c7a8-45ea-bd59-230b8e9811af\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gqhb4"
Apr 20 14:56:46.005532 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:46.005516 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/92d39451-f35b-4d2a-88da-a4769e1eaae5-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-rt627\" (UID: \"92d39451-f35b-4d2a-88da-a4769e1eaae5\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rt627"
Apr 20 14:56:46.106209 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:46.106175 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/937a5c5b-de08-42bb-9cb1-0086ff30299e-node-exporter-wtmp\") pod \"node-exporter-sb9k2\" (UID: \"937a5c5b-de08-42bb-9cb1-0086ff30299e\") " pod="openshift-monitoring/node-exporter-sb9k2"
Apr 20 14:56:46.106409 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:46.106387 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/937a5c5b-de08-42bb-9cb1-0086ff30299e-sys\") pod \"node-exporter-sb9k2\" (UID: \"937a5c5b-de08-42bb-9cb1-0086ff30299e\") " pod="openshift-monitoring/node-exporter-sb9k2"
Apr 20 14:56:46.106499 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:46.106436 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/937a5c5b-de08-42bb-9cb1-0086ff30299e-node-exporter-textfile\") pod \"node-exporter-sb9k2\" (UID: \"937a5c5b-de08-42bb-9cb1-0086ff30299e\") " pod="openshift-monitoring/node-exporter-sb9k2"
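
Across the node-exporter / kube-state-metrics / openshift-state-metrics trio, the UniqueName field spells out which volume plugin backs each mount: kubernetes.io/secret, kubernetes.io/configmap, kubernetes.io/projected (the kube-api-access-* service-account tokens), kubernetes.io/empty-dir, and, only for node-exporter, kubernetes.io/host-path volumes for sys, root and wtmp, which a host metrics agent needs. A quick tally of mentions by plugin over a journal excerpt, assuming only the UniqueName quoting shown above:

    import re
    import sys
    from collections import Counter

    UNIQUE = re.compile(r'UniqueName: \\?"(kubernetes\.io/[a-z-]+)/')

    kinds = Counter(m.group(1) for line in sys.stdin for m in UNIQUE.finditer(line))
    for plugin, count in kinds.most_common():
        print(f"{plugin:30s} {count}")
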
Apr 20 14:56:46.106499 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:46.106461 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/937a5c5b-de08-42bb-9cb1-0086ff30299e-metrics-client-ca\") pod \"node-exporter-sb9k2\" (UID: \"937a5c5b-de08-42bb-9cb1-0086ff30299e\") " pod="openshift-monitoring/node-exporter-sb9k2"
Apr 20 14:56:46.106595 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:46.106534 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/92d39451-f35b-4d2a-88da-a4769e1eaae5-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-rt627\" (UID: \"92d39451-f35b-4d2a-88da-a4769e1eaae5\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rt627"
Apr 20 14:56:46.106629 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:46.106595 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/937a5c5b-de08-42bb-9cb1-0086ff30299e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-sb9k2\" (UID: \"937a5c5b-de08-42bb-9cb1-0086ff30299e\") " pod="openshift-monitoring/node-exporter-sb9k2"
Apr 20 14:56:46.106663 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:46.106652 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/14881dac-c7a8-45ea-bd59-230b8e9811af-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-gqhb4\" (UID: \"14881dac-c7a8-45ea-bd59-230b8e9811af\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gqhb4"
Apr 20 14:56:46.106697 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:46.106681 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d78kk\" (UniqueName: \"kubernetes.io/projected/14881dac-c7a8-45ea-bd59-230b8e9811af-kube-api-access-d78kk\") pod \"openshift-state-metrics-9d44df66c-gqhb4\" (UID: \"14881dac-c7a8-45ea-bd59-230b8e9811af\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gqhb4"
Apr 20 14:56:46.106741 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:46.106726 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/92d39451-f35b-4d2a-88da-a4769e1eaae5-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-rt627\" (UID: \"92d39451-f35b-4d2a-88da-a4769e1eaae5\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rt627"
Apr 20 14:56:46.106784 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:46.106771 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/14881dac-c7a8-45ea-bd59-230b8e9811af-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-gqhb4\" (UID: \"14881dac-c7a8-45ea-bd59-230b8e9811af\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gqhb4"
Apr 20 14:56:46.106817 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:46.106803 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/937a5c5b-de08-42bb-9cb1-0086ff30299e-node-exporter-tls\") pod \"node-exporter-sb9k2\" (UID: \"937a5c5b-de08-42bb-9cb1-0086ff30299e\") " pod="openshift-monitoring/node-exporter-sb9k2"
Apr 20 14:56:46.106849 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:46.106834 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/92d39451-f35b-4d2a-88da-a4769e1eaae5-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-rt627\" (UID: \"92d39451-f35b-4d2a-88da-a4769e1eaae5\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rt627"
Apr 20 14:56:46.106897 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:46.106861 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hhfvj\" (UniqueName: \"kubernetes.io/projected/92d39451-f35b-4d2a-88da-a4769e1eaae5-kube-api-access-hhfvj\") pod \"kube-state-metrics-69db897b98-rt627\" (UID: \"92d39451-f35b-4d2a-88da-a4769e1eaae5\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rt627"
Apr 20 14:56:46.106948 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:46.106891 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/937a5c5b-de08-42bb-9cb1-0086ff30299e-node-exporter-accelerators-collector-config\") pod \"node-exporter-sb9k2\" (UID: \"937a5c5b-de08-42bb-9cb1-0086ff30299e\") " pod="openshift-monitoring/node-exporter-sb9k2"
Apr 20 14:56:46.106948 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:46.106927 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/92d39451-f35b-4d2a-88da-a4769e1eaae5-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-rt627\" (UID: \"92d39451-f35b-4d2a-88da-a4769e1eaae5\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rt627"
Apr 20 14:56:46.107089 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:46.106954 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/14881dac-c7a8-45ea-bd59-230b8e9811af-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-gqhb4\" (UID: \"14881dac-c7a8-45ea-bd59-230b8e9811af\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gqhb4"
Apr 20 14:56:46.107089 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:46.106982 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdq4k\" (UniqueName: \"kubernetes.io/projected/937a5c5b-de08-42bb-9cb1-0086ff30299e-kube-api-access-zdq4k\") pod \"node-exporter-sb9k2\" (UID: \"937a5c5b-de08-42bb-9cb1-0086ff30299e\") " pod="openshift-monitoring/node-exporter-sb9k2"
Apr 20 14:56:46.107089 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:46.107011 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/937a5c5b-de08-42bb-9cb1-0086ff30299e-root\") pod \"node-exporter-sb9k2\" (UID: \"937a5c5b-de08-42bb-9cb1-0086ff30299e\") " pod="openshift-monitoring/node-exporter-sb9k2"
Apr 20 14:56:46.107089 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:46.107042 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/92d39451-f35b-4d2a-88da-a4769e1eaae5-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-rt627\" (UID: \"92d39451-f35b-4d2a-88da-a4769e1eaae5\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rt627"
Apr 20 14:56:46.108134 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:46.107400 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/92d39451-f35b-4d2a-88da-a4769e1eaae5-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-rt627\" (UID: \"92d39451-f35b-4d2a-88da-a4769e1eaae5\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rt627"
Apr 20 14:56:46.108134 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:56:46.107455 2570 secret.go:189] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found
Apr 20 14:56:46.108134 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:56:46.107538 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14881dac-c7a8-45ea-bd59-230b8e9811af-openshift-state-metrics-tls podName:14881dac-c7a8-45ea-bd59-230b8e9811af nodeName:}" failed. No retries permitted until 2026-04-20 14:56:46.60751642 +0000 UTC m=+90.263564714 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/14881dac-c7a8-45ea-bd59-230b8e9811af-openshift-state-metrics-tls") pod "openshift-state-metrics-9d44df66c-gqhb4" (UID: "14881dac-c7a8-45ea-bd59-230b8e9811af") : secret "openshift-state-metrics-tls" not found
Apr 20 14:56:46.108134 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:46.107600 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/92d39451-f35b-4d2a-88da-a4769e1eaae5-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-rt627\" (UID: \"92d39451-f35b-4d2a-88da-a4769e1eaae5\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rt627"
Apr 20 14:56:46.108134 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:46.108024 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/92d39451-f35b-4d2a-88da-a4769e1eaae5-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-rt627\" (UID: \"92d39451-f35b-4d2a-88da-a4769e1eaae5\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rt627"
Apr 20 14:56:46.108134 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:46.108073 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/14881dac-c7a8-45ea-bd59-230b8e9811af-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-gqhb4\" (UID: \"14881dac-c7a8-45ea-bd59-230b8e9811af\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gqhb4"
Apr 20 14:56:46.109625 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:46.109603 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/92d39451-f35b-4d2a-88da-a4769e1eaae5-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-rt627\" (UID: \"92d39451-f35b-4d2a-88da-a4769e1eaae5\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rt627"
Apr 20 14:56:46.111184 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:46.111161 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/92d39451-f35b-4d2a-88da-a4769e1eaae5-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-rt627\" (UID: \"92d39451-f35b-4d2a-88da-a4769e1eaae5\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rt627"
Apr 20 14:56:46.115674 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:46.115642 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/14881dac-c7a8-45ea-bd59-230b8e9811af-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-gqhb4\" (UID: \"14881dac-c7a8-45ea-bd59-230b8e9811af\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gqhb4"
Apr 20 14:56:46.119760 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:46.119730 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d78kk\" (UniqueName: \"kubernetes.io/projected/14881dac-c7a8-45ea-bd59-230b8e9811af-kube-api-access-d78kk\") pod \"openshift-state-metrics-9d44df66c-gqhb4\" (UID: \"14881dac-c7a8-45ea-bd59-230b8e9811af\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gqhb4"
Apr 20 14:56:46.120216 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:46.120192 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhfvj\" (UniqueName: \"kubernetes.io/projected/92d39451-f35b-4d2a-88da-a4769e1eaae5-kube-api-access-hhfvj\") pod \"kube-state-metrics-69db897b98-rt627\" (UID: \"92d39451-f35b-4d2a-88da-a4769e1eaae5\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rt627"
Apr 20 14:56:46.208075 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:46.207999 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/937a5c5b-de08-42bb-9cb1-0086ff30299e-node-exporter-wtmp\") pod \"node-exporter-sb9k2\" (UID: \"937a5c5b-de08-42bb-9cb1-0086ff30299e\") " pod="openshift-monitoring/node-exporter-sb9k2"
Apr 20 14:56:46.208075 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:46.208043 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/937a5c5b-de08-42bb-9cb1-0086ff30299e-sys\") pod \"node-exporter-sb9k2\" (UID: \"937a5c5b-de08-42bb-9cb1-0086ff30299e\") " pod="openshift-monitoring/node-exporter-sb9k2"
Apr 20 14:56:46.208075 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:46.208072 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/937a5c5b-de08-42bb-9cb1-0086ff30299e-node-exporter-textfile\") pod \"node-exporter-sb9k2\" (UID: \"937a5c5b-de08-42bb-9cb1-0086ff30299e\") " pod="openshift-monitoring/node-exporter-sb9k2"
Apr 20 14:56:46.208339 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:46.208096 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/937a5c5b-de08-42bb-9cb1-0086ff30299e-metrics-client-ca\") pod \"node-exporter-sb9k2\" (UID: \"937a5c5b-de08-42bb-9cb1-0086ff30299e\") " pod="openshift-monitoring/node-exporter-sb9k2"
Apr 20 14:56:46.208339 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:46.208123 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/937a5c5b-de08-42bb-9cb1-0086ff30299e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-sb9k2\" (UID: \"937a5c5b-de08-42bb-9cb1-0086ff30299e\") " pod="openshift-monitoring/node-exporter-sb9k2"
Apr 20 14:56:46.208339 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:46.208206 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/937a5c5b-de08-42bb-9cb1-0086ff30299e-node-exporter-tls\") pod \"node-exporter-sb9k2\" (UID: \"937a5c5b-de08-42bb-9cb1-0086ff30299e\") " pod="openshift-monitoring/node-exporter-sb9k2"
Apr 20 14:56:46.208339 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:46.208214 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/937a5c5b-de08-42bb-9cb1-0086ff30299e-node-exporter-wtmp\") pod \"node-exporter-sb9k2\" (UID: \"937a5c5b-de08-42bb-9cb1-0086ff30299e\") " pod="openshift-monitoring/node-exporter-sb9k2"
Apr 20 14:56:46.208339 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:46.208236 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/937a5c5b-de08-42bb-9cb1-0086ff30299e-node-exporter-accelerators-collector-config\") pod \"node-exporter-sb9k2\" (UID: \"937a5c5b-de08-42bb-9cb1-0086ff30299e\") " pod="openshift-monitoring/node-exporter-sb9k2"
Apr 20 14:56:46.208339 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:46.208126 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/937a5c5b-de08-42bb-9cb1-0086ff30299e-sys\") pod \"node-exporter-sb9k2\" (UID: \"937a5c5b-de08-42bb-9cb1-0086ff30299e\") " pod="openshift-monitoring/node-exporter-sb9k2"
Apr 20 14:56:46.208339 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:46.208288 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zdq4k\" (UniqueName: \"kubernetes.io/projected/937a5c5b-de08-42bb-9cb1-0086ff30299e-kube-api-access-zdq4k\") pod \"node-exporter-sb9k2\" (UID: \"937a5c5b-de08-42bb-9cb1-0086ff30299e\") " pod="openshift-monitoring/node-exporter-sb9k2"
Apr 20 14:56:46.208679 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:46.208343 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/937a5c5b-de08-42bb-9cb1-0086ff30299e-root\") pod \"node-exporter-sb9k2\" (UID: \"937a5c5b-de08-42bb-9cb1-0086ff30299e\") " pod="openshift-monitoring/node-exporter-sb9k2"
Apr 20 14:56:46.208679 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:46.208441 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/937a5c5b-de08-42bb-9cb1-0086ff30299e-root\") pod \"node-exporter-sb9k2\" (UID: \"937a5c5b-de08-42bb-9cb1-0086ff30299e\") " pod="openshift-monitoring/node-exporter-sb9k2"
Apr 20 14:56:46.208679 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:56:46.208564 2570 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 20 14:56:46.208679 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:56:46.208656 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/937a5c5b-de08-42bb-9cb1-0086ff30299e-node-exporter-tls podName:937a5c5b-de08-42bb-9cb1-0086ff30299e nodeName:}" failed. No retries permitted until 2026-04-20 14:56:46.708635327 +0000 UTC m=+90.364683626 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/937a5c5b-de08-42bb-9cb1-0086ff30299e-node-exporter-tls") pod "node-exporter-sb9k2" (UID: "937a5c5b-de08-42bb-9cb1-0086ff30299e") : secret "node-exporter-tls" not found
Apr 20 14:56:46.218011 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:46.217984 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/937a5c5b-de08-42bb-9cb1-0086ff30299e-node-exporter-textfile\") pod \"node-exporter-sb9k2\" (UID: \"937a5c5b-de08-42bb-9cb1-0086ff30299e\") " pod="openshift-monitoring/node-exporter-sb9k2"
Apr 20 14:56:46.218331 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:46.218292 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/937a5c5b-de08-42bb-9cb1-0086ff30299e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-sb9k2\" (UID: \"937a5c5b-de08-42bb-9cb1-0086ff30299e\") " pod="openshift-monitoring/node-exporter-sb9k2"
Apr 20 14:56:46.218431 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:46.218337 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/937a5c5b-de08-42bb-9cb1-0086ff30299e-metrics-client-ca\") pod \"node-exporter-sb9k2\" (UID: \"937a5c5b-de08-42bb-9cb1-0086ff30299e\") " pod="openshift-monitoring/node-exporter-sb9k2"
Apr 20 14:56:46.218661 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:46.218640 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdq4k\" (UniqueName: \"kubernetes.io/projected/937a5c5b-de08-42bb-9cb1-0086ff30299e-kube-api-access-zdq4k\") pod \"node-exporter-sb9k2\" (UID: \"937a5c5b-de08-42bb-9cb1-0086ff30299e\") " pod="openshift-monitoring/node-exporter-sb9k2"
Apr 20 14:56:46.219000 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:46.218978 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/937a5c5b-de08-42bb-9cb1-0086ff30299e-node-exporter-accelerators-collector-config\") pod \"node-exporter-sb9k2\" (UID: \"937a5c5b-de08-42bb-9cb1-0086ff30299e\") " pod="openshift-monitoring/node-exporter-sb9k2"
Apr 20 14:56:46.288690 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:46.288659 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-rt627"
Apr 20 14:56:46.442638 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:46.442565 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-rt627"]
Apr 20 14:56:46.444987 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:56:46.444955 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92d39451_f35b_4d2a_88da_a4769e1eaae5.slice/crio-197eac29ec391fff729a3b3802f5c81b7f3a978666ee64483c139600f0ca003c WatchSource:0}: Error finding container 197eac29ec391fff729a3b3802f5c81b7f3a978666ee64483c139600f0ca003c: Status 404 returned error can't find the container with id 197eac29ec391fff729a3b3802f5c81b7f3a978666ee64483c139600f0ca003c
Apr 20 14:56:46.611746 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:46.611716 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/14881dac-c7a8-45ea-bd59-230b8e9811af-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-gqhb4\" (UID: \"14881dac-c7a8-45ea-bd59-230b8e9811af\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gqhb4"
Apr 20 14:56:46.614498 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:46.614475 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/14881dac-c7a8-45ea-bd59-230b8e9811af-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-gqhb4\" (UID: \"14881dac-c7a8-45ea-bd59-230b8e9811af\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gqhb4"
Apr 20 14:56:46.712871 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:46.712843 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/937a5c5b-de08-42bb-9cb1-0086ff30299e-node-exporter-tls\") pod \"node-exporter-sb9k2\" (UID: \"937a5c5b-de08-42bb-9cb1-0086ff30299e\") " pod="openshift-monitoring/node-exporter-sb9k2"
Apr 20 14:56:46.715539 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:46.715512 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/937a5c5b-de08-42bb-9cb1-0086ff30299e-node-exporter-tls\") pod \"node-exporter-sb9k2\" (UID: \"937a5c5b-de08-42bb-9cb1-0086ff30299e\") " pod="openshift-monitoring/node-exporter-sb9k2"
Apr 20 14:56:46.862510 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:46.862430 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gqhb4"
Need to start a new one" pod="openshift-monitoring/node-exporter-sb9k2" Apr 20 14:56:46.916615 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:56:46.916574 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod937a5c5b_de08_42bb_9cb1_0086ff30299e.slice/crio-2cd1bf9cb935a003639399b54b7f1e54f2ff20b8cce5f45b513f7c7792f655d3 WatchSource:0}: Error finding container 2cd1bf9cb935a003639399b54b7f1e54f2ff20b8cce5f45b513f7c7792f655d3: Status 404 returned error can't find the container with id 2cd1bf9cb935a003639399b54b7f1e54f2ff20b8cce5f45b513f7c7792f655d3 Apr 20 14:56:46.977349 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:46.976832 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 14:56:47.001892 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:47.001864 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 14:56:47.002155 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:47.002118 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:56:47.006102 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:47.006075 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 20 14:56:47.006375 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:47.006356 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 20 14:56:47.006799 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:47.006780 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 20 14:56:47.006898 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:47.006882 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 20 14:56:47.006987 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:47.006972 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 20 14:56:47.007083 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:47.007068 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 20 14:56:47.007157 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:47.007143 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-kgm4r\"" Apr 20 14:56:47.008107 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:47.007327 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 20 14:56:47.008107 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:47.007446 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 20 14:56:47.008107 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:47.007512 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 20 14:56:47.037137 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:47.031129 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-gqhb4"] Apr 20 14:56:47.037137 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:56:47.035890 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14881dac_c7a8_45ea_bd59_230b8e9811af.slice/crio-0a6f995382d00ef4c07826be3f93e51eead07897a3d2450f01f49b6e05aa5254 WatchSource:0}: Error finding container 0a6f995382d00ef4c07826be3f93e51eead07897a3d2450f01f49b6e05aa5254: Status 404 returned error can't find the container with id 0a6f995382d00ef4c07826be3f93e51eead07897a3d2450f01f49b6e05aa5254 Apr 20 14:56:47.117800 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:47.117269 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb-tls-assets\") pod \"alertmanager-main-0\" (UID: \"f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:56:47.117800 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:47.117363 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:56:47.117800 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:47.117407 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:56:47.117800 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:47.117444 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:56:47.117800 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:47.117490 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crzjc\" (UniqueName: \"kubernetes.io/projected/f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb-kube-api-access-crzjc\") pod \"alertmanager-main-0\" (UID: \"f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:56:47.117800 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:47.117532 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:56:47.117800 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:47.117564 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb-config-out\") pod \"alertmanager-main-0\" (UID: \"f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:56:47.117800 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:47.117597 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:56:47.117800 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:47.117629 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:56:47.117800 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:47.117671 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:56:47.118433 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:47.117940 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb-web-config\") pod \"alertmanager-main-0\" (UID: \"f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:56:47.118433 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:47.118011 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb-config-volume\") pod \"alertmanager-main-0\" (UID: \"f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:56:47.118433 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:47.118050 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:56:47.220084 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:47.219322 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-crzjc\" (UniqueName: \"kubernetes.io/projected/f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb-kube-api-access-crzjc\") pod \"alertmanager-main-0\" (UID: \"f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:56:47.220084 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:47.219374 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: 
\"f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:56:47.220084 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:47.219407 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb-config-out\") pod \"alertmanager-main-0\" (UID: \"f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:56:47.220084 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:47.219435 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:56:47.220084 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:47.219452 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:56:47.220084 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:47.219478 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:56:47.220084 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:47.219518 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb-web-config\") pod \"alertmanager-main-0\" (UID: \"f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:56:47.220084 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:47.219547 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb-config-volume\") pod \"alertmanager-main-0\" (UID: \"f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:56:47.220084 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:47.219573 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:56:47.220084 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:47.219607 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb-tls-assets\") pod \"alertmanager-main-0\" (UID: \"f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:56:47.220084 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:47.219645 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: 
\"kubernetes.io/secret/f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:56:47.220084 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:47.219673 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:56:47.220084 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:47.219698 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:56:47.222783 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:47.222636 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:56:47.223585 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:47.223538 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:56:47.223686 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:47.223651 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:56:47.225225 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:56:47.224932 2570 secret.go:189] Couldn't get secret openshift-monitoring/alertmanager-main-tls: secret "alertmanager-main-tls" not found Apr 20 14:56:47.225225 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:56:47.225058 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb-secret-alertmanager-main-tls podName:f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb nodeName:}" failed. No retries permitted until 2026-04-20 14:56:47.725035213 +0000 UTC m=+91.381083494 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "secret-alertmanager-main-tls" (UniqueName: "kubernetes.io/secret/f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb-secret-alertmanager-main-tls") pod "alertmanager-main-0" (UID: "f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb") : secret "alertmanager-main-tls" not found Apr 20 14:56:47.226452 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:47.226385 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb-config-volume\") pod \"alertmanager-main-0\" (UID: \"f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:56:47.226701 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:47.226678 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb-tls-assets\") pod \"alertmanager-main-0\" (UID: \"f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:56:47.228794 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:47.228749 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:56:47.228888 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:47.228801 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:56:47.229418 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:47.229371 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:56:47.230243 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:47.230211 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb-web-config\") pod \"alertmanager-main-0\" (UID: \"f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:56:47.235687 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:47.235637 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:56:47.237724 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:47.237701 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb-config-out\") pod \"alertmanager-main-0\" (UID: \"f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 20 14:56:47.247157 ip-10-0-133-163 
Apr 20 14:56:47.357398 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:47.357365 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-sb9k2" event={"ID":"937a5c5b-de08-42bb-9cb1-0086ff30299e","Type":"ContainerStarted","Data":"2cd1bf9cb935a003639399b54b7f1e54f2ff20b8cce5f45b513f7c7792f655d3"}
Apr 20 14:56:47.365432 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:47.362493 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gqhb4" event={"ID":"14881dac-c7a8-45ea-bd59-230b8e9811af","Type":"ContainerStarted","Data":"3ed5e1e619d2ddf99b400ad05623a484702a33560260e5436509b40d552afd4f"}
Apr 20 14:56:47.365432 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:47.362529 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gqhb4" event={"ID":"14881dac-c7a8-45ea-bd59-230b8e9811af","Type":"ContainerStarted","Data":"0a6f995382d00ef4c07826be3f93e51eead07897a3d2450f01f49b6e05aa5254"}
Apr 20 14:56:47.372950 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:47.369608 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-rt627" event={"ID":"92d39451-f35b-4d2a-88da-a4769e1eaae5","Type":"ContainerStarted","Data":"197eac29ec391fff729a3b3802f5c81b7f3a978666ee64483c139600f0ca003c"}
Apr 20 14:56:47.406199 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:47.406134 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-6fcf5597b7-ngxsq" podUID="6495ee23-3ff8-4b72-b73d-ec3a22a198c2" containerName="registry" containerID="cri-o://b6a60cccae5e031c2d3cc7c106d9bed41f55c04c0c77a4c2a9a13f651e6051fe" gracePeriod=30
Apr 20 14:56:47.725816 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:47.725732 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 14:56:47.729012 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:47.728981 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 20 14:56:47.921817 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:47.921777 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 20 14:56:48.082011 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:48.081989 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6fcf5597b7-ngxsq"
Need to start a new one" pod="openshift-image-registry/image-registry-6fcf5597b7-ngxsq" Apr 20 14:56:48.130041 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:48.129986 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6495ee23-3ff8-4b72-b73d-ec3a22a198c2-installation-pull-secrets\") pod \"6495ee23-3ff8-4b72-b73d-ec3a22a198c2\" (UID: \"6495ee23-3ff8-4b72-b73d-ec3a22a198c2\") " Apr 20 14:56:48.130699 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:48.130678 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6495ee23-3ff8-4b72-b73d-ec3a22a198c2-ca-trust-extracted\") pod \"6495ee23-3ff8-4b72-b73d-ec3a22a198c2\" (UID: \"6495ee23-3ff8-4b72-b73d-ec3a22a198c2\") " Apr 20 14:56:48.130819 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:48.130734 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6495ee23-3ff8-4b72-b73d-ec3a22a198c2-registry-tls\") pod \"6495ee23-3ff8-4b72-b73d-ec3a22a198c2\" (UID: \"6495ee23-3ff8-4b72-b73d-ec3a22a198c2\") " Apr 20 14:56:48.130819 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:48.130780 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6495ee23-3ff8-4b72-b73d-ec3a22a198c2-bound-sa-token\") pod \"6495ee23-3ff8-4b72-b73d-ec3a22a198c2\" (UID: \"6495ee23-3ff8-4b72-b73d-ec3a22a198c2\") " Apr 20 14:56:48.130938 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:48.130850 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/6495ee23-3ff8-4b72-b73d-ec3a22a198c2-image-registry-private-configuration\") pod \"6495ee23-3ff8-4b72-b73d-ec3a22a198c2\" (UID: \"6495ee23-3ff8-4b72-b73d-ec3a22a198c2\") " Apr 20 14:56:48.130938 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:48.130877 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6495ee23-3ff8-4b72-b73d-ec3a22a198c2-trusted-ca\") pod \"6495ee23-3ff8-4b72-b73d-ec3a22a198c2\" (UID: \"6495ee23-3ff8-4b72-b73d-ec3a22a198c2\") " Apr 20 14:56:48.130938 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:48.130934 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6495ee23-3ff8-4b72-b73d-ec3a22a198c2-registry-certificates\") pod \"6495ee23-3ff8-4b72-b73d-ec3a22a198c2\" (UID: \"6495ee23-3ff8-4b72-b73d-ec3a22a198c2\") " Apr 20 14:56:48.131077 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:48.130974 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9pjvg\" (UniqueName: \"kubernetes.io/projected/6495ee23-3ff8-4b72-b73d-ec3a22a198c2-kube-api-access-9pjvg\") pod \"6495ee23-3ff8-4b72-b73d-ec3a22a198c2\" (UID: \"6495ee23-3ff8-4b72-b73d-ec3a22a198c2\") " Apr 20 14:56:48.132753 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:48.132492 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6495ee23-3ff8-4b72-b73d-ec3a22a198c2-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "6495ee23-3ff8-4b72-b73d-ec3a22a198c2" (UID: "6495ee23-3ff8-4b72-b73d-ec3a22a198c2"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 14:56:48.133222 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:48.133074 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6495ee23-3ff8-4b72-b73d-ec3a22a198c2-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "6495ee23-3ff8-4b72-b73d-ec3a22a198c2" (UID: "6495ee23-3ff8-4b72-b73d-ec3a22a198c2"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 14:56:48.134093 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:48.133957 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6495ee23-3ff8-4b72-b73d-ec3a22a198c2-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "6495ee23-3ff8-4b72-b73d-ec3a22a198c2" (UID: "6495ee23-3ff8-4b72-b73d-ec3a22a198c2"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 14:56:48.138259 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:48.137949 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6495ee23-3ff8-4b72-b73d-ec3a22a198c2-kube-api-access-9pjvg" (OuterVolumeSpecName: "kube-api-access-9pjvg") pod "6495ee23-3ff8-4b72-b73d-ec3a22a198c2" (UID: "6495ee23-3ff8-4b72-b73d-ec3a22a198c2"). InnerVolumeSpecName "kube-api-access-9pjvg". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 14:56:48.138382 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:48.138365 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6495ee23-3ff8-4b72-b73d-ec3a22a198c2-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "6495ee23-3ff8-4b72-b73d-ec3a22a198c2" (UID: "6495ee23-3ff8-4b72-b73d-ec3a22a198c2"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 14:56:48.138529 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:48.138505 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6495ee23-3ff8-4b72-b73d-ec3a22a198c2-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "6495ee23-3ff8-4b72-b73d-ec3a22a198c2" (UID: "6495ee23-3ff8-4b72-b73d-ec3a22a198c2"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 14:56:48.138989 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:48.138964 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6495ee23-3ff8-4b72-b73d-ec3a22a198c2-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "6495ee23-3ff8-4b72-b73d-ec3a22a198c2" (UID: "6495ee23-3ff8-4b72-b73d-ec3a22a198c2"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 14:56:48.142317 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:48.142279 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6495ee23-3ff8-4b72-b73d-ec3a22a198c2-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "6495ee23-3ff8-4b72-b73d-ec3a22a198c2" (UID: "6495ee23-3ff8-4b72-b73d-ec3a22a198c2"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 14:56:48.231898 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:48.231860 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9pjvg\" (UniqueName: \"kubernetes.io/projected/6495ee23-3ff8-4b72-b73d-ec3a22a198c2-kube-api-access-9pjvg\") on node \"ip-10-0-133-163.ec2.internal\" DevicePath \"\"" Apr 20 14:56:48.231898 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:48.231894 2570 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/6495ee23-3ff8-4b72-b73d-ec3a22a198c2-installation-pull-secrets\") on node \"ip-10-0-133-163.ec2.internal\" DevicePath \"\"" Apr 20 14:56:48.232083 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:48.231910 2570 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/6495ee23-3ff8-4b72-b73d-ec3a22a198c2-ca-trust-extracted\") on node \"ip-10-0-133-163.ec2.internal\" DevicePath \"\"" Apr 20 14:56:48.232083 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:48.231925 2570 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/6495ee23-3ff8-4b72-b73d-ec3a22a198c2-registry-tls\") on node \"ip-10-0-133-163.ec2.internal\" DevicePath \"\"" Apr 20 14:56:48.232083 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:48.231939 2570 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/6495ee23-3ff8-4b72-b73d-ec3a22a198c2-bound-sa-token\") on node \"ip-10-0-133-163.ec2.internal\" DevicePath \"\"" Apr 20 14:56:48.232083 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:48.231953 2570 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/6495ee23-3ff8-4b72-b73d-ec3a22a198c2-image-registry-private-configuration\") on node \"ip-10-0-133-163.ec2.internal\" DevicePath \"\"" Apr 20 14:56:48.232083 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:48.231967 2570 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6495ee23-3ff8-4b72-b73d-ec3a22a198c2-trusted-ca\") on node \"ip-10-0-133-163.ec2.internal\" DevicePath \"\"" Apr 20 14:56:48.232083 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:48.231985 2570 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/6495ee23-3ff8-4b72-b73d-ec3a22a198c2-registry-certificates\") on node \"ip-10-0-133-163.ec2.internal\" DevicePath \"\"" Apr 20 14:56:48.374248 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:48.374169 2570 generic.go:358] "Generic (PLEG): container finished" podID="6495ee23-3ff8-4b72-b73d-ec3a22a198c2" containerID="b6a60cccae5e031c2d3cc7c106d9bed41f55c04c0c77a4c2a9a13f651e6051fe" exitCode=0 Apr 20 14:56:48.374248 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:48.374236 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6fcf5597b7-ngxsq" Apr 20 14:56:48.374491 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:48.374262 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6fcf5597b7-ngxsq" event={"ID":"6495ee23-3ff8-4b72-b73d-ec3a22a198c2","Type":"ContainerDied","Data":"b6a60cccae5e031c2d3cc7c106d9bed41f55c04c0c77a4c2a9a13f651e6051fe"} Apr 20 14:56:48.374491 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:48.374321 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6fcf5597b7-ngxsq" event={"ID":"6495ee23-3ff8-4b72-b73d-ec3a22a198c2","Type":"ContainerDied","Data":"cd993ddc468b53a802968654854fc6ab24fcf7eb77ba89fbe18bd0264094b00e"} Apr 20 14:56:48.374491 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:48.374346 2570 scope.go:117] "RemoveContainer" containerID="b6a60cccae5e031c2d3cc7c106d9bed41f55c04c0c77a4c2a9a13f651e6051fe" Apr 20 14:56:48.377477 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:48.377453 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gqhb4" event={"ID":"14881dac-c7a8-45ea-bd59-230b8e9811af","Type":"ContainerStarted","Data":"8f575f262baaa6ac6937cf16157cddbccd59b517a9eed1b92009dfd657ebc08c"} Apr 20 14:56:48.397441 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:48.397417 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6fcf5597b7-ngxsq"] Apr 20 14:56:48.402499 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:48.402476 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-6fcf5597b7-ngxsq"] Apr 20 14:56:48.554017 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:48.553994 2570 scope.go:117] "RemoveContainer" containerID="b6a60cccae5e031c2d3cc7c106d9bed41f55c04c0c77a4c2a9a13f651e6051fe" Apr 20 14:56:48.554437 ip-10-0-133-163 kubenswrapper[2570]: E0420 14:56:48.554399 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6a60cccae5e031c2d3cc7c106d9bed41f55c04c0c77a4c2a9a13f651e6051fe\": container with ID starting with b6a60cccae5e031c2d3cc7c106d9bed41f55c04c0c77a4c2a9a13f651e6051fe not found: ID does not exist" containerID="b6a60cccae5e031c2d3cc7c106d9bed41f55c04c0c77a4c2a9a13f651e6051fe" Apr 20 14:56:48.554554 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:48.554442 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6a60cccae5e031c2d3cc7c106d9bed41f55c04c0c77a4c2a9a13f651e6051fe"} err="failed to get container status \"b6a60cccae5e031c2d3cc7c106d9bed41f55c04c0c77a4c2a9a13f651e6051fe\": rpc error: code = NotFound desc = could not find container \"b6a60cccae5e031c2d3cc7c106d9bed41f55c04c0c77a4c2a9a13f651e6051fe\": container with ID starting with b6a60cccae5e031c2d3cc7c106d9bed41f55c04c0c77a4c2a9a13f651e6051fe not found: ID does not exist" Apr 20 14:56:48.992530 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:48.974888 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-78d4476d99-54g5t"] Apr 20 14:56:48.992530 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:48.975389 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6495ee23-3ff8-4b72-b73d-ec3a22a198c2" containerName="registry" Apr 20 14:56:48.998014 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:48.997988 2570 
state_mem.go:107] "Deleted CPUSet assignment" podUID="6495ee23-3ff8-4b72-b73d-ec3a22a198c2" containerName="registry" Apr 20 14:56:48.999117 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:48.998147 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="6495ee23-3ff8-4b72-b73d-ec3a22a198c2" containerName="registry" Apr 20 14:56:49.040454 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:49.040373 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-78d4476d99-54g5t" Apr 20 14:56:49.051075 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:49.049395 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\"" Apr 20 14:56:49.051075 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:49.049589 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-dlhnmn21n20iv\"" Apr 20 14:56:49.051075 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:49.049727 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\"" Apr 20 14:56:49.051075 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:49.049863 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\"" Apr 20 14:56:49.051075 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:49.050003 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\"" Apr 20 14:56:49.051075 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:49.050496 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\"" Apr 20 14:56:49.051075 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:49.050585 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-mwqb2\"" Apr 20 14:56:49.051075 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:49.050681 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6495ee23-3ff8-4b72-b73d-ec3a22a198c2" path="/var/lib/kubelet/pods/6495ee23-3ff8-4b72-b73d-ec3a22a198c2/volumes" Apr 20 14:56:49.058092 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:49.055643 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-78d4476d99-54g5t"] Apr 20 14:56:49.058092 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:49.055677 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 20 14:56:49.144859 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:49.143671 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/7bef6867-224c-4525-bca7-c1f04fe94c83-secret-thanos-querier-tls\") pod \"thanos-querier-78d4476d99-54g5t\" (UID: \"7bef6867-224c-4525-bca7-c1f04fe94c83\") " pod="openshift-monitoring/thanos-querier-78d4476d99-54g5t" Apr 20 14:56:49.144859 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:49.143735 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9msqh\" (UniqueName: \"kubernetes.io/projected/7bef6867-224c-4525-bca7-c1f04fe94c83-kube-api-access-9msqh\") pod 
\"thanos-querier-78d4476d99-54g5t\" (UID: \"7bef6867-224c-4525-bca7-c1f04fe94c83\") " pod="openshift-monitoring/thanos-querier-78d4476d99-54g5t" Apr 20 14:56:49.144859 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:49.143810 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/7bef6867-224c-4525-bca7-c1f04fe94c83-secret-grpc-tls\") pod \"thanos-querier-78d4476d99-54g5t\" (UID: \"7bef6867-224c-4525-bca7-c1f04fe94c83\") " pod="openshift-monitoring/thanos-querier-78d4476d99-54g5t" Apr 20 14:56:49.144859 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:49.143852 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7bef6867-224c-4525-bca7-c1f04fe94c83-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-78d4476d99-54g5t\" (UID: \"7bef6867-224c-4525-bca7-c1f04fe94c83\") " pod="openshift-monitoring/thanos-querier-78d4476d99-54g5t" Apr 20 14:56:49.144859 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:49.143914 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7bef6867-224c-4525-bca7-c1f04fe94c83-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-78d4476d99-54g5t\" (UID: \"7bef6867-224c-4525-bca7-c1f04fe94c83\") " pod="openshift-monitoring/thanos-querier-78d4476d99-54g5t" Apr 20 14:56:49.144859 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:49.143942 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7bef6867-224c-4525-bca7-c1f04fe94c83-metrics-client-ca\") pod \"thanos-querier-78d4476d99-54g5t\" (UID: \"7bef6867-224c-4525-bca7-c1f04fe94c83\") " pod="openshift-monitoring/thanos-querier-78d4476d99-54g5t" Apr 20 14:56:49.144859 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:49.144004 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/7bef6867-224c-4525-bca7-c1f04fe94c83-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-78d4476d99-54g5t\" (UID: \"7bef6867-224c-4525-bca7-c1f04fe94c83\") " pod="openshift-monitoring/thanos-querier-78d4476d99-54g5t" Apr 20 14:56:49.144859 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:49.144035 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/7bef6867-224c-4525-bca7-c1f04fe94c83-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-78d4476d99-54g5t\" (UID: \"7bef6867-224c-4525-bca7-c1f04fe94c83\") " pod="openshift-monitoring/thanos-querier-78d4476d99-54g5t" Apr 20 14:56:49.246933 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:49.245451 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/7bef6867-224c-4525-bca7-c1f04fe94c83-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-78d4476d99-54g5t\" (UID: \"7bef6867-224c-4525-bca7-c1f04fe94c83\") " pod="openshift-monitoring/thanos-querier-78d4476d99-54g5t" Apr 20 14:56:49.246933 ip-10-0-133-163 
Apr 20 14:56:49.246933 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:49.245540 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/7bef6867-224c-4525-bca7-c1f04fe94c83-secret-thanos-querier-tls\") pod \"thanos-querier-78d4476d99-54g5t\" (UID: \"7bef6867-224c-4525-bca7-c1f04fe94c83\") " pod="openshift-monitoring/thanos-querier-78d4476d99-54g5t"
Apr 20 14:56:49.246933 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:49.245583 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9msqh\" (UniqueName: \"kubernetes.io/projected/7bef6867-224c-4525-bca7-c1f04fe94c83-kube-api-access-9msqh\") pod \"thanos-querier-78d4476d99-54g5t\" (UID: \"7bef6867-224c-4525-bca7-c1f04fe94c83\") " pod="openshift-monitoring/thanos-querier-78d4476d99-54g5t"
Apr 20 14:56:49.246933 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:49.245636 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/7bef6867-224c-4525-bca7-c1f04fe94c83-secret-grpc-tls\") pod \"thanos-querier-78d4476d99-54g5t\" (UID: \"7bef6867-224c-4525-bca7-c1f04fe94c83\") " pod="openshift-monitoring/thanos-querier-78d4476d99-54g5t"
Apr 20 14:56:49.246933 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:49.245672 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7bef6867-224c-4525-bca7-c1f04fe94c83-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-78d4476d99-54g5t\" (UID: \"7bef6867-224c-4525-bca7-c1f04fe94c83\") " pod="openshift-monitoring/thanos-querier-78d4476d99-54g5t"
Apr 20 14:56:49.246933 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:49.245724 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7bef6867-224c-4525-bca7-c1f04fe94c83-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-78d4476d99-54g5t\" (UID: \"7bef6867-224c-4525-bca7-c1f04fe94c83\") " pod="openshift-monitoring/thanos-querier-78d4476d99-54g5t"
Apr 20 14:56:49.246933 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:49.245754 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7bef6867-224c-4525-bca7-c1f04fe94c83-metrics-client-ca\") pod \"thanos-querier-78d4476d99-54g5t\" (UID: \"7bef6867-224c-4525-bca7-c1f04fe94c83\") " pod="openshift-monitoring/thanos-querier-78d4476d99-54g5t"
Apr 20 14:56:49.246933 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:49.246584 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7bef6867-224c-4525-bca7-c1f04fe94c83-metrics-client-ca\") pod \"thanos-querier-78d4476d99-54g5t\" (UID: \"7bef6867-224c-4525-bca7-c1f04fe94c83\") " pod="openshift-monitoring/thanos-querier-78d4476d99-54g5t"
Apr 20 14:56:49.255021 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:49.254867 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/7bef6867-224c-4525-bca7-c1f04fe94c83-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-78d4476d99-54g5t\" (UID: \"7bef6867-224c-4525-bca7-c1f04fe94c83\") " pod="openshift-monitoring/thanos-querier-78d4476d99-54g5t"
Apr 20 14:56:49.255848 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:49.255824 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9msqh\" (UniqueName: \"kubernetes.io/projected/7bef6867-224c-4525-bca7-c1f04fe94c83-kube-api-access-9msqh\") pod \"thanos-querier-78d4476d99-54g5t\" (UID: \"7bef6867-224c-4525-bca7-c1f04fe94c83\") " pod="openshift-monitoring/thanos-querier-78d4476d99-54g5t"
Apr 20 14:56:49.257060 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:49.256993 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7bef6867-224c-4525-bca7-c1f04fe94c83-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-78d4476d99-54g5t\" (UID: \"7bef6867-224c-4525-bca7-c1f04fe94c83\") " pod="openshift-monitoring/thanos-querier-78d4476d99-54g5t"
Apr 20 14:56:49.257060 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:49.257007 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7bef6867-224c-4525-bca7-c1f04fe94c83-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-78d4476d99-54g5t\" (UID: \"7bef6867-224c-4525-bca7-c1f04fe94c83\") " pod="openshift-monitoring/thanos-querier-78d4476d99-54g5t"
Apr 20 14:56:49.257363 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:49.257335 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/7bef6867-224c-4525-bca7-c1f04fe94c83-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-78d4476d99-54g5t\" (UID: \"7bef6867-224c-4525-bca7-c1f04fe94c83\") " pod="openshift-monitoring/thanos-querier-78d4476d99-54g5t"
Apr 20 14:56:49.259793 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:49.259749 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/7bef6867-224c-4525-bca7-c1f04fe94c83-secret-thanos-querier-tls\") pod \"thanos-querier-78d4476d99-54g5t\" (UID: \"7bef6867-224c-4525-bca7-c1f04fe94c83\") " pod="openshift-monitoring/thanos-querier-78d4476d99-54g5t"
Apr 20 14:56:49.263726 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:49.260946 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/7bef6867-224c-4525-bca7-c1f04fe94c83-secret-grpc-tls\") pod \"thanos-querier-78d4476d99-54g5t\" (UID: \"7bef6867-224c-4525-bca7-c1f04fe94c83\") " pod="openshift-monitoring/thanos-querier-78d4476d99-54g5t"
Apr 20 14:56:49.360574 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:49.360505 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-78d4476d99-54g5t"
Need to start a new one" pod="openshift-monitoring/thanos-querier-78d4476d99-54g5t" Apr 20 14:56:49.382983 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:49.382935 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb","Type":"ContainerStarted","Data":"3b77cd74ec1303284f324fd350efb3fafdd13c08d8ef35680ccc1a09e8d174b8"} Apr 20 14:56:49.385658 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:49.385631 2570 generic.go:358] "Generic (PLEG): container finished" podID="937a5c5b-de08-42bb-9cb1-0086ff30299e" containerID="60052c0f9f120aa8db93c038a99ca8d70b673eeaf30aaeab132b61e724ab1e64" exitCode=0 Apr 20 14:56:49.385788 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:49.385724 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-sb9k2" event={"ID":"937a5c5b-de08-42bb-9cb1-0086ff30299e","Type":"ContainerDied","Data":"60052c0f9f120aa8db93c038a99ca8d70b673eeaf30aaeab132b61e724ab1e64"} Apr 20 14:56:49.390020 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:49.389378 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-rt627" event={"ID":"92d39451-f35b-4d2a-88da-a4769e1eaae5","Type":"ContainerStarted","Data":"546a73e1188702855aaafcf59b2d03a637691d56c3a96b2c47a370bcbc816131"} Apr 20 14:56:49.390020 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:49.389409 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-rt627" event={"ID":"92d39451-f35b-4d2a-88da-a4769e1eaae5","Type":"ContainerStarted","Data":"031d32613ea67f6f75de8d36703ad0738a83e9047028043e74f6ee3949a443fc"} Apr 20 14:56:49.390020 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:49.389425 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-rt627" event={"ID":"92d39451-f35b-4d2a-88da-a4769e1eaae5","Type":"ContainerStarted","Data":"909aa65278f816c74ba9dea8eae0b64a0b87d9ac018631a620ae6c8d2d2f41c1"} Apr 20 14:56:49.422922 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:49.422866 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-rt627" podStartSLOduration=2.000974929 podStartE2EDuration="4.422849557s" podCreationTimestamp="2026-04-20 14:56:45 +0000 UTC" firstStartedPulling="2026-04-20 14:56:46.447361468 +0000 UTC m=+90.103409766" lastFinishedPulling="2026-04-20 14:56:48.8692361 +0000 UTC m=+92.525284394" observedRunningTime="2026-04-20 14:56:49.421763895 +0000 UTC m=+93.077812200" watchObservedRunningTime="2026-04-20 14:56:49.422849557 +0000 UTC m=+93.078897861" Apr 20 14:56:49.822830 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:49.822803 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-78d4476d99-54g5t"] Apr 20 14:56:49.825861 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:56:49.825836 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7bef6867_224c_4525_bca7_c1f04fe94c83.slice/crio-0bfa68dd10618d4677665bc8e190be52d48d9f97e3a1bf92c4ad59cdf91d0d89 WatchSource:0}: Error finding container 0bfa68dd10618d4677665bc8e190be52d48d9f97e3a1bf92c4ad59cdf91d0d89: Status 404 returned error can't find the container with id 0bfa68dd10618d4677665bc8e190be52d48d9f97e3a1bf92c4ad59cdf91d0d89 Apr 20 14:56:50.397572 ip-10-0-133-163 kubenswrapper[2570]: 
Apr 20 14:56:50.399278 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:50.399240 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-78d4476d99-54g5t" event={"ID":"7bef6867-224c-4525-bca7-c1f04fe94c83","Type":"ContainerStarted","Data":"0bfa68dd10618d4677665bc8e190be52d48d9f97e3a1bf92c4ad59cdf91d0d89"}
Apr 20 14:56:50.402041 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:50.402013 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-sb9k2" event={"ID":"937a5c5b-de08-42bb-9cb1-0086ff30299e","Type":"ContainerStarted","Data":"fa0ff41ed5dc978f1f2298d8318d0c4643655f95fcd504a295d0ce70df2dee4b"}
Apr 20 14:56:50.402215 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:50.402049 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-sb9k2" event={"ID":"937a5c5b-de08-42bb-9cb1-0086ff30299e","Type":"ContainerStarted","Data":"1db9ce004c30d2457fdbcd7d30292d0c99a4bd6b1e3b74042bad8731a50a57c1"}
Apr 20 14:56:50.409674 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:50.409651 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-6d767b4bfd-nqwbp"]
Apr 20 14:56:50.423651 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:50.423012 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-gqhb4" podStartSLOduration=2.706831219 podStartE2EDuration="5.422996868s" podCreationTimestamp="2026-04-20 14:56:45 +0000 UTC" firstStartedPulling="2026-04-20 14:56:47.384960142 +0000 UTC m=+91.041008444" lastFinishedPulling="2026-04-20 14:56:50.1011258 +0000 UTC m=+93.757174093" observedRunningTime="2026-04-20 14:56:50.421270453 +0000 UTC m=+94.077318756" watchObservedRunningTime="2026-04-20 14:56:50.422996868 +0000 UTC m=+94.079045174"
Apr 20 14:56:50.426747 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:50.426581 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-6d767b4bfd-nqwbp"]
Apr 20 14:56:50.426747 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:50.426709 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-6d767b4bfd-nqwbp"
Need to start a new one" pod="openshift-monitoring/metrics-server-6d767b4bfd-nqwbp" Apr 20 14:56:50.432393 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:50.431278 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 20 14:56:50.432393 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:50.431529 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-4p5n2\"" Apr 20 14:56:50.432393 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:50.431781 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 20 14:56:50.432393 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:50.431965 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 20 14:56:50.432393 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:50.432180 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 20 14:56:50.432393 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:50.432350 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-3bbppomlrts5a\"" Apr 20 14:56:50.441892 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:50.441854 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-sb9k2" podStartSLOduration=3.488946871 podStartE2EDuration="5.441841758s" podCreationTimestamp="2026-04-20 14:56:45 +0000 UTC" firstStartedPulling="2026-04-20 14:56:46.918554144 +0000 UTC m=+90.574602438" lastFinishedPulling="2026-04-20 14:56:48.871449036 +0000 UTC m=+92.527497325" observedRunningTime="2026-04-20 14:56:50.441371261 +0000 UTC m=+94.097419566" watchObservedRunningTime="2026-04-20 14:56:50.441841758 +0000 UTC m=+94.097890060" Apr 20 14:56:50.557476 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:50.557443 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frgq8\" (UniqueName: \"kubernetes.io/projected/77d517a8-5191-4606-8bdd-d236599c3b5b-kube-api-access-frgq8\") pod \"metrics-server-6d767b4bfd-nqwbp\" (UID: \"77d517a8-5191-4606-8bdd-d236599c3b5b\") " pod="openshift-monitoring/metrics-server-6d767b4bfd-nqwbp" Apr 20 14:56:50.557476 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:50.557487 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/77d517a8-5191-4606-8bdd-d236599c3b5b-secret-metrics-server-client-certs\") pod \"metrics-server-6d767b4bfd-nqwbp\" (UID: \"77d517a8-5191-4606-8bdd-d236599c3b5b\") " pod="openshift-monitoring/metrics-server-6d767b4bfd-nqwbp" Apr 20 14:56:50.557761 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:50.557518 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77d517a8-5191-4606-8bdd-d236599c3b5b-client-ca-bundle\") pod \"metrics-server-6d767b4bfd-nqwbp\" (UID: \"77d517a8-5191-4606-8bdd-d236599c3b5b\") " pod="openshift-monitoring/metrics-server-6d767b4bfd-nqwbp" Apr 20 14:56:50.557761 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:50.557550 2570 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/77d517a8-5191-4606-8bdd-d236599c3b5b-metrics-server-audit-profiles\") pod \"metrics-server-6d767b4bfd-nqwbp\" (UID: \"77d517a8-5191-4606-8bdd-d236599c3b5b\") " pod="openshift-monitoring/metrics-server-6d767b4bfd-nqwbp" Apr 20 14:56:50.557761 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:50.557597 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77d517a8-5191-4606-8bdd-d236599c3b5b-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6d767b4bfd-nqwbp\" (UID: \"77d517a8-5191-4606-8bdd-d236599c3b5b\") " pod="openshift-monitoring/metrics-server-6d767b4bfd-nqwbp" Apr 20 14:56:50.557761 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:50.557620 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/77d517a8-5191-4606-8bdd-d236599c3b5b-audit-log\") pod \"metrics-server-6d767b4bfd-nqwbp\" (UID: \"77d517a8-5191-4606-8bdd-d236599c3b5b\") " pod="openshift-monitoring/metrics-server-6d767b4bfd-nqwbp" Apr 20 14:56:50.557761 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:50.557677 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/77d517a8-5191-4606-8bdd-d236599c3b5b-secret-metrics-server-tls\") pod \"metrics-server-6d767b4bfd-nqwbp\" (UID: \"77d517a8-5191-4606-8bdd-d236599c3b5b\") " pod="openshift-monitoring/metrics-server-6d767b4bfd-nqwbp" Apr 20 14:56:50.660358 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:50.658995 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-frgq8\" (UniqueName: \"kubernetes.io/projected/77d517a8-5191-4606-8bdd-d236599c3b5b-kube-api-access-frgq8\") pod \"metrics-server-6d767b4bfd-nqwbp\" (UID: \"77d517a8-5191-4606-8bdd-d236599c3b5b\") " pod="openshift-monitoring/metrics-server-6d767b4bfd-nqwbp" Apr 20 14:56:50.660358 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:50.659046 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/77d517a8-5191-4606-8bdd-d236599c3b5b-secret-metrics-server-client-certs\") pod \"metrics-server-6d767b4bfd-nqwbp\" (UID: \"77d517a8-5191-4606-8bdd-d236599c3b5b\") " pod="openshift-monitoring/metrics-server-6d767b4bfd-nqwbp" Apr 20 14:56:50.660358 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:50.659079 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77d517a8-5191-4606-8bdd-d236599c3b5b-client-ca-bundle\") pod \"metrics-server-6d767b4bfd-nqwbp\" (UID: \"77d517a8-5191-4606-8bdd-d236599c3b5b\") " pod="openshift-monitoring/metrics-server-6d767b4bfd-nqwbp" Apr 20 14:56:50.660358 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:50.659119 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/77d517a8-5191-4606-8bdd-d236599c3b5b-metrics-server-audit-profiles\") pod \"metrics-server-6d767b4bfd-nqwbp\" (UID: \"77d517a8-5191-4606-8bdd-d236599c3b5b\") " pod="openshift-monitoring/metrics-server-6d767b4bfd-nqwbp" Apr 20 14:56:50.660358 
ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:50.659165 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77d517a8-5191-4606-8bdd-d236599c3b5b-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6d767b4bfd-nqwbp\" (UID: \"77d517a8-5191-4606-8bdd-d236599c3b5b\") " pod="openshift-monitoring/metrics-server-6d767b4bfd-nqwbp" Apr 20 14:56:50.660358 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:50.659188 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/77d517a8-5191-4606-8bdd-d236599c3b5b-audit-log\") pod \"metrics-server-6d767b4bfd-nqwbp\" (UID: \"77d517a8-5191-4606-8bdd-d236599c3b5b\") " pod="openshift-monitoring/metrics-server-6d767b4bfd-nqwbp" Apr 20 14:56:50.660358 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:50.659251 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/77d517a8-5191-4606-8bdd-d236599c3b5b-secret-metrics-server-tls\") pod \"metrics-server-6d767b4bfd-nqwbp\" (UID: \"77d517a8-5191-4606-8bdd-d236599c3b5b\") " pod="openshift-monitoring/metrics-server-6d767b4bfd-nqwbp" Apr 20 14:56:50.663807 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:50.661906 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/77d517a8-5191-4606-8bdd-d236599c3b5b-metrics-server-audit-profiles\") pod \"metrics-server-6d767b4bfd-nqwbp\" (UID: \"77d517a8-5191-4606-8bdd-d236599c3b5b\") " pod="openshift-monitoring/metrics-server-6d767b4bfd-nqwbp" Apr 20 14:56:50.663807 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:50.663227 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/77d517a8-5191-4606-8bdd-d236599c3b5b-audit-log\") pod \"metrics-server-6d767b4bfd-nqwbp\" (UID: \"77d517a8-5191-4606-8bdd-d236599c3b5b\") " pod="openshift-monitoring/metrics-server-6d767b4bfd-nqwbp" Apr 20 14:56:50.663807 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:50.663748 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/77d517a8-5191-4606-8bdd-d236599c3b5b-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6d767b4bfd-nqwbp\" (UID: \"77d517a8-5191-4606-8bdd-d236599c3b5b\") " pod="openshift-monitoring/metrics-server-6d767b4bfd-nqwbp" Apr 20 14:56:50.663807 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:50.663754 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/77d517a8-5191-4606-8bdd-d236599c3b5b-secret-metrics-server-tls\") pod \"metrics-server-6d767b4bfd-nqwbp\" (UID: \"77d517a8-5191-4606-8bdd-d236599c3b5b\") " pod="openshift-monitoring/metrics-server-6d767b4bfd-nqwbp" Apr 20 14:56:50.664671 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:50.664629 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/77d517a8-5191-4606-8bdd-d236599c3b5b-secret-metrics-server-client-certs\") pod \"metrics-server-6d767b4bfd-nqwbp\" (UID: \"77d517a8-5191-4606-8bdd-d236599c3b5b\") " pod="openshift-monitoring/metrics-server-6d767b4bfd-nqwbp" Apr 20 14:56:50.676164 ip-10-0-133-163 
kubenswrapper[2570]: I0420 14:56:50.676138 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/77d517a8-5191-4606-8bdd-d236599c3b5b-client-ca-bundle\") pod \"metrics-server-6d767b4bfd-nqwbp\" (UID: \"77d517a8-5191-4606-8bdd-d236599c3b5b\") " pod="openshift-monitoring/metrics-server-6d767b4bfd-nqwbp" Apr 20 14:56:50.681023 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:50.680978 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-frgq8\" (UniqueName: \"kubernetes.io/projected/77d517a8-5191-4606-8bdd-d236599c3b5b-kube-api-access-frgq8\") pod \"metrics-server-6d767b4bfd-nqwbp\" (UID: \"77d517a8-5191-4606-8bdd-d236599c3b5b\") " pod="openshift-monitoring/metrics-server-6d767b4bfd-nqwbp" Apr 20 14:56:50.746543 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:50.746143 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-6d767b4bfd-nqwbp" Apr 20 14:56:51.041834 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:51.041077 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-6d767b4bfd-nqwbp"] Apr 20 14:56:51.053043 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:56:51.053003 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77d517a8_5191_4606_8bdd_d236599c3b5b.slice/crio-6925da3e09983fac3dd59b203903dae7a8597fff66115c1c1b1f7aaf5cc453d9 WatchSource:0}: Error finding container 6925da3e09983fac3dd59b203903dae7a8597fff66115c1c1b1f7aaf5cc453d9: Status 404 returned error can't find the container with id 6925da3e09983fac3dd59b203903dae7a8597fff66115c1c1b1f7aaf5cc453d9 Apr 20 14:56:51.406808 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:51.406778 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb","Type":"ContainerStarted","Data":"a3f7912a401034c80cd92999bc07b06b9891aa08d2832d3902fa74b6fc601e4f"} Apr 20 14:56:51.408096 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:51.408058 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-6d767b4bfd-nqwbp" event={"ID":"77d517a8-5191-4606-8bdd-d236599c3b5b","Type":"ContainerStarted","Data":"6925da3e09983fac3dd59b203903dae7a8597fff66115c1c1b1f7aaf5cc453d9"} Apr 20 14:56:52.141118 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:52.138670 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 14:56:52.157100 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:52.157071 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:52.160589 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:52.160564 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 20 14:56:52.160720 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:52.160637 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 20 14:56:52.160824 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:52.160806 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-tfj95\"" Apr 20 14:56:52.161502 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:52.161482 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 20 14:56:52.161609 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:52.161555 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 20 14:56:52.161836 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:52.161816 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-2j9cstkbahbam\"" Apr 20 14:56:52.162120 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:52.162101 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 20 14:56:52.162120 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:52.162110 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 20 14:56:52.162277 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:52.162154 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 20 14:56:52.162543 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:52.162527 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 20 14:56:52.162839 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:52.162821 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 20 14:56:52.163045 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:52.163029 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 20 14:56:52.168265 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:52.168237 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 20 14:56:52.175668 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:52.175644 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 20 14:56:52.176330 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:52.176279 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 20 14:56:52.276990 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:52.276963 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/948924e3-21b3-473a-9035-d51cb2d5f65e-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"948924e3-21b3-473a-9035-d51cb2d5f65e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:52.277127 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:52.277010 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/948924e3-21b3-473a-9035-d51cb2d5f65e-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"948924e3-21b3-473a-9035-d51cb2d5f65e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:52.277127 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:52.277031 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/948924e3-21b3-473a-9035-d51cb2d5f65e-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"948924e3-21b3-473a-9035-d51cb2d5f65e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:52.277127 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:52.277072 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/948924e3-21b3-473a-9035-d51cb2d5f65e-config-out\") pod \"prometheus-k8s-0\" (UID: \"948924e3-21b3-473a-9035-d51cb2d5f65e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:52.277127 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:52.277091 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/948924e3-21b3-473a-9035-d51cb2d5f65e-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"948924e3-21b3-473a-9035-d51cb2d5f65e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:52.277127 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:52.277111 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/948924e3-21b3-473a-9035-d51cb2d5f65e-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"948924e3-21b3-473a-9035-d51cb2d5f65e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:52.277397 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:52.277165 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/948924e3-21b3-473a-9035-d51cb2d5f65e-web-config\") pod \"prometheus-k8s-0\" (UID: \"948924e3-21b3-473a-9035-d51cb2d5f65e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:52.277397 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:52.277188 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/948924e3-21b3-473a-9035-d51cb2d5f65e-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"948924e3-21b3-473a-9035-d51cb2d5f65e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:52.277397 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:52.277228 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/948924e3-21b3-473a-9035-d51cb2d5f65e-config\") pod \"prometheus-k8s-0\" (UID: \"948924e3-21b3-473a-9035-d51cb2d5f65e\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:52.277397 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:52.277249 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/948924e3-21b3-473a-9035-d51cb2d5f65e-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"948924e3-21b3-473a-9035-d51cb2d5f65e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:52.277397 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:52.277327 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/948924e3-21b3-473a-9035-d51cb2d5f65e-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"948924e3-21b3-473a-9035-d51cb2d5f65e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:52.277397 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:52.277373 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrmnn\" (UniqueName: \"kubernetes.io/projected/948924e3-21b3-473a-9035-d51cb2d5f65e-kube-api-access-jrmnn\") pod \"prometheus-k8s-0\" (UID: \"948924e3-21b3-473a-9035-d51cb2d5f65e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:52.277649 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:52.277402 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/948924e3-21b3-473a-9035-d51cb2d5f65e-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"948924e3-21b3-473a-9035-d51cb2d5f65e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:52.277649 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:52.277457 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/948924e3-21b3-473a-9035-d51cb2d5f65e-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"948924e3-21b3-473a-9035-d51cb2d5f65e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:52.277649 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:52.277488 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/948924e3-21b3-473a-9035-d51cb2d5f65e-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"948924e3-21b3-473a-9035-d51cb2d5f65e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:52.277649 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:52.277506 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/948924e3-21b3-473a-9035-d51cb2d5f65e-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"948924e3-21b3-473a-9035-d51cb2d5f65e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:52.277649 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:52.277529 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/948924e3-21b3-473a-9035-d51cb2d5f65e-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"948924e3-21b3-473a-9035-d51cb2d5f65e\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:52.277649 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:52.277567 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/948924e3-21b3-473a-9035-d51cb2d5f65e-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"948924e3-21b3-473a-9035-d51cb2d5f65e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:52.378047 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:52.378007 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/948924e3-21b3-473a-9035-d51cb2d5f65e-config-out\") pod \"prometheus-k8s-0\" (UID: \"948924e3-21b3-473a-9035-d51cb2d5f65e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:52.378191 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:52.378061 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/948924e3-21b3-473a-9035-d51cb2d5f65e-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"948924e3-21b3-473a-9035-d51cb2d5f65e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:52.378191 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:52.378088 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/948924e3-21b3-473a-9035-d51cb2d5f65e-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"948924e3-21b3-473a-9035-d51cb2d5f65e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:52.378191 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:52.378120 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/948924e3-21b3-473a-9035-d51cb2d5f65e-web-config\") pod \"prometheus-k8s-0\" (UID: \"948924e3-21b3-473a-9035-d51cb2d5f65e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:52.378191 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:52.378152 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/948924e3-21b3-473a-9035-d51cb2d5f65e-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"948924e3-21b3-473a-9035-d51cb2d5f65e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:52.378551 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:52.378208 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/948924e3-21b3-473a-9035-d51cb2d5f65e-config\") pod \"prometheus-k8s-0\" (UID: \"948924e3-21b3-473a-9035-d51cb2d5f65e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:52.378551 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:52.378235 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/948924e3-21b3-473a-9035-d51cb2d5f65e-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"948924e3-21b3-473a-9035-d51cb2d5f65e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:52.378551 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:52.378263 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/948924e3-21b3-473a-9035-d51cb2d5f65e-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"948924e3-21b3-473a-9035-d51cb2d5f65e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:52.378551 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:52.378291 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jrmnn\" (UniqueName: \"kubernetes.io/projected/948924e3-21b3-473a-9035-d51cb2d5f65e-kube-api-access-jrmnn\") pod \"prometheus-k8s-0\" (UID: \"948924e3-21b3-473a-9035-d51cb2d5f65e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:52.378551 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:52.378340 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/948924e3-21b3-473a-9035-d51cb2d5f65e-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"948924e3-21b3-473a-9035-d51cb2d5f65e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:52.378551 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:52.378385 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/948924e3-21b3-473a-9035-d51cb2d5f65e-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"948924e3-21b3-473a-9035-d51cb2d5f65e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:52.378551 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:52.378410 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/948924e3-21b3-473a-9035-d51cb2d5f65e-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"948924e3-21b3-473a-9035-d51cb2d5f65e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:52.378551 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:52.378442 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/948924e3-21b3-473a-9035-d51cb2d5f65e-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"948924e3-21b3-473a-9035-d51cb2d5f65e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:52.378551 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:52.378469 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/948924e3-21b3-473a-9035-d51cb2d5f65e-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"948924e3-21b3-473a-9035-d51cb2d5f65e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:52.378551 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:52.378491 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/948924e3-21b3-473a-9035-d51cb2d5f65e-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"948924e3-21b3-473a-9035-d51cb2d5f65e\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 20 14:56:52.378551 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:52.378523 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/948924e3-21b3-473a-9035-d51cb2d5f65e-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"948924e3-21b3-473a-9035-d51cb2d5f65e\") " pod="openshift-monitoring/prometheus-k8s-0" 
Apr 20 14:56:52.378551 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:52.378551 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/948924e3-21b3-473a-9035-d51cb2d5f65e-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"948924e3-21b3-473a-9035-d51cb2d5f65e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 14:56:52.379211 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:52.378570 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/948924e3-21b3-473a-9035-d51cb2d5f65e-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"948924e3-21b3-473a-9035-d51cb2d5f65e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 14:56:52.379920 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:52.379888 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/948924e3-21b3-473a-9035-d51cb2d5f65e-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"948924e3-21b3-473a-9035-d51cb2d5f65e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 14:56:52.380028 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:52.379995 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/948924e3-21b3-473a-9035-d51cb2d5f65e-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"948924e3-21b3-473a-9035-d51cb2d5f65e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 14:56:52.381001 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:52.380945 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/948924e3-21b3-473a-9035-d51cb2d5f65e-config-out\") pod \"prometheus-k8s-0\" (UID: \"948924e3-21b3-473a-9035-d51cb2d5f65e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 14:56:52.382209 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:52.381509 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/948924e3-21b3-473a-9035-d51cb2d5f65e-web-config\") pod \"prometheus-k8s-0\" (UID: \"948924e3-21b3-473a-9035-d51cb2d5f65e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 14:56:52.382720 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:52.382402 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/948924e3-21b3-473a-9035-d51cb2d5f65e-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"948924e3-21b3-473a-9035-d51cb2d5f65e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 14:56:52.385683 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:52.384538 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/948924e3-21b3-473a-9035-d51cb2d5f65e-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"948924e3-21b3-473a-9035-d51cb2d5f65e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 14:56:52.385683 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:52.384796 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/948924e3-21b3-473a-9035-d51cb2d5f65e-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"948924e3-21b3-473a-9035-d51cb2d5f65e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 14:56:52.385683 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:52.385375 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/948924e3-21b3-473a-9035-d51cb2d5f65e-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"948924e3-21b3-473a-9035-d51cb2d5f65e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 14:56:52.387013 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:52.386972 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/948924e3-21b3-473a-9035-d51cb2d5f65e-config\") pod \"prometheus-k8s-0\" (UID: \"948924e3-21b3-473a-9035-d51cb2d5f65e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 14:56:52.388864 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:52.388793 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/948924e3-21b3-473a-9035-d51cb2d5f65e-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"948924e3-21b3-473a-9035-d51cb2d5f65e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 14:56:52.390667 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:52.390632 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrmnn\" (UniqueName: \"kubernetes.io/projected/948924e3-21b3-473a-9035-d51cb2d5f65e-kube-api-access-jrmnn\") pod \"prometheus-k8s-0\" (UID: \"948924e3-21b3-473a-9035-d51cb2d5f65e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 14:56:52.391554 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:52.391438 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/948924e3-21b3-473a-9035-d51cb2d5f65e-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"948924e3-21b3-473a-9035-d51cb2d5f65e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 14:56:52.397021 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:52.396927 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/948924e3-21b3-473a-9035-d51cb2d5f65e-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"948924e3-21b3-473a-9035-d51cb2d5f65e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 14:56:52.400947 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:52.400631 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/948924e3-21b3-473a-9035-d51cb2d5f65e-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"948924e3-21b3-473a-9035-d51cb2d5f65e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 14:56:52.401105 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:52.401078 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/948924e3-21b3-473a-9035-d51cb2d5f65e-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"948924e3-21b3-473a-9035-d51cb2d5f65e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 14:56:52.401189 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:52.401077 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/948924e3-21b3-473a-9035-d51cb2d5f65e-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"948924e3-21b3-473a-9035-d51cb2d5f65e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 14:56:52.401579 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:52.401556 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/948924e3-21b3-473a-9035-d51cb2d5f65e-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"948924e3-21b3-473a-9035-d51cb2d5f65e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 14:56:52.403058 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:52.403021 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/948924e3-21b3-473a-9035-d51cb2d5f65e-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"948924e3-21b3-473a-9035-d51cb2d5f65e\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 14:56:52.413881 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:52.413693 2570 generic.go:358] "Generic (PLEG): container finished" podID="f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb" containerID="a3f7912a401034c80cd92999bc07b06b9891aa08d2832d3902fa74b6fc601e4f" exitCode=0
Apr 20 14:56:52.413881 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:52.413753 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb","Type":"ContainerDied","Data":"a3f7912a401034c80cd92999bc07b06b9891aa08d2832d3902fa74b6fc601e4f"}
Apr 20 14:56:52.419617 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:52.419593 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-78d4476d99-54g5t" event={"ID":"7bef6867-224c-4525-bca7-c1f04fe94c83","Type":"ContainerStarted","Data":"aab77a543b67371a8d7eff94ae6ec056aa650405a912abe3640b58233d9dd10c"}
Apr 20 14:56:52.472052 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:52.472018 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 14:56:52.639786 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:52.639728 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 20 14:56:52.644636 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:56:52.644601 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod948924e3_21b3_473a_9035_d51cb2d5f65e.slice/crio-9104684aa38531f3d65107996d56b68518b56c02ac175cf897631bdc696f9e41 WatchSource:0}: Error finding container 9104684aa38531f3d65107996d56b68518b56c02ac175cf897631bdc696f9e41: Status 404 returned error can't find the container with id 9104684aa38531f3d65107996d56b68518b56c02ac175cf897631bdc696f9e41
Apr 20 14:56:53.425296 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:53.425258 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-78d4476d99-54g5t" event={"ID":"7bef6867-224c-4525-bca7-c1f04fe94c83","Type":"ContainerStarted","Data":"e4f5618fb1582222d42ccb137eb839f58129287f73b5e8161cf8aa347b54832c"}
Apr 20 14:56:53.425709 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:53.425299 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-78d4476d99-54g5t" event={"ID":"7bef6867-224c-4525-bca7-c1f04fe94c83","Type":"ContainerStarted","Data":"cec840b5f14a0d7e3d0e35ff76fc1fa8f3b66fd863ae68ac913d03e4bd6b818b"}
Apr 20 14:56:53.426581 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:53.426559 2570 generic.go:358] "Generic (PLEG): container finished" podID="948924e3-21b3-473a-9035-d51cb2d5f65e" containerID="10ddb407c6a20d4d9b363ceba4f4ee7a814449f799dda9e129ffd660d05900b6" exitCode=0
Apr 20 14:56:53.426669 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:53.426615 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"948924e3-21b3-473a-9035-d51cb2d5f65e","Type":"ContainerDied","Data":"10ddb407c6a20d4d9b363ceba4f4ee7a814449f799dda9e129ffd660d05900b6"}
Apr 20 14:56:53.426669 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:53.426633 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"948924e3-21b3-473a-9035-d51cb2d5f65e","Type":"ContainerStarted","Data":"9104684aa38531f3d65107996d56b68518b56c02ac175cf897631bdc696f9e41"}
Apr 20 14:56:55.438937 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:55.438902 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb","Type":"ContainerStarted","Data":"05d3b3ae6ad121da8a4e450be532c0e098200aed121e444bef88654728460350"}
Apr 20 14:56:55.439415 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:55.438948 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb","Type":"ContainerStarted","Data":"d04a2bd5e0809ac660b3378d930ba14c3ab565229e4f7cd9c67bdcd6ee7f58e8"}
Apr 20 14:56:55.439415 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:55.438965 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb","Type":"ContainerStarted","Data":"1d2c3f19be2433141f46b66cae5195d1aa3e908825097cf125aae69b31f70307"}
Apr 20 14:56:55.439415 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:55.438978 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb","Type":"ContainerStarted","Data":"03ef6bfe95011bd5f9c632125467828fb181c168edec97cdc20c76f950afce7b"}
Apr 20 14:56:55.439415 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:55.438990 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb","Type":"ContainerStarted","Data":"7e5c30c8a69da1fd18a17be06b7c3bbd86d51669fc01442f5dc48b0db36be17c"}
Apr 20 14:56:55.440748 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:55.440722 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-6d767b4bfd-nqwbp" event={"ID":"77d517a8-5191-4606-8bdd-d236599c3b5b","Type":"ContainerStarted","Data":"3e213cb3a9f54911bd9213b2a6d8a121728556b0d09da563dffb162d978daf11"}
Apr 20 14:56:55.446338 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:55.445706 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-78d4476d99-54g5t" event={"ID":"7bef6867-224c-4525-bca7-c1f04fe94c83","Type":"ContainerStarted","Data":"5dfdce526128e19b5fe3732549cfdc3caf303b815d93d7ef2c2a5487244c2b11"}
Apr 20 14:56:55.446338 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:55.445739 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-78d4476d99-54g5t" event={"ID":"7bef6867-224c-4525-bca7-c1f04fe94c83","Type":"ContainerStarted","Data":"4026496323fd007e81f31ea1e04de1b4e2877a3224a69620050e04a89aeffe4f"}
Apr 20 14:56:55.446338 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:55.445752 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-78d4476d99-54g5t" event={"ID":"7bef6867-224c-4525-bca7-c1f04fe94c83","Type":"ContainerStarted","Data":"8dbda144fe8b413c3001e0350ac0239c8b94f13a9da37251490a676d898e7656"}
Apr 20 14:56:55.446338 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:55.446027 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-78d4476d99-54g5t"
Apr 20 14:56:55.458505 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:55.458450 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-6d767b4bfd-nqwbp" podStartSLOduration=2.380773446 podStartE2EDuration="5.458428111s" podCreationTimestamp="2026-04-20 14:56:50 +0000 UTC" firstStartedPulling="2026-04-20 14:56:51.209927582 +0000 UTC m=+94.865975864" lastFinishedPulling="2026-04-20 14:56:54.287582249 +0000 UTC m=+97.943630529" observedRunningTime="2026-04-20 14:56:55.458187115 +0000 UTC m=+99.114235427" watchObservedRunningTime="2026-04-20 14:56:55.458428111 +0000 UTC m=+99.114476414"
Apr 20 14:56:55.481059 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:55.481006 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-78d4476d99-54g5t" podStartSLOduration=2.602118976 podStartE2EDuration="7.480989241s" podCreationTimestamp="2026-04-20 14:56:48 +0000 UTC" firstStartedPulling="2026-04-20 14:56:49.827838568 +0000 UTC m=+93.483886848" lastFinishedPulling="2026-04-20 14:56:54.706708832 +0000 UTC m=+98.362757113" observedRunningTime="2026-04-20 14:56:55.478590821 +0000 UTC m=+99.134639147" watchObservedRunningTime="2026-04-20 14:56:55.480989241 +0000 UTC m=+99.137037545"
Apr 20 14:56:56.453048 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:56.452997 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb","Type":"ContainerStarted","Data":"2ded373e6aa05a9803201305ffa8bc534a16931ed4e53a7ebd7d1d867a0b7ba4"}
Apr 20 14:56:56.481154 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:56.481106 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=4.836831325 podStartE2EDuration="10.48108868s" podCreationTimestamp="2026-04-20 14:56:46 +0000 UTC" firstStartedPulling="2026-04-20 14:56:49.061094265 +0000 UTC m=+92.717142551" lastFinishedPulling="2026-04-20 14:56:54.705351615 +0000 UTC m=+98.361399906" observedRunningTime="2026-04-20 14:56:56.48053153 +0000 UTC m=+100.136579845" watchObservedRunningTime="2026-04-20 14:56:56.48108868 +0000 UTC m=+100.137136982"
Apr 20 14:56:58.462919 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:58.462836 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"948924e3-21b3-473a-9035-d51cb2d5f65e","Type":"ContainerStarted","Data":"a686937a4924055ad058fe347810b97bb2e2d808d29217ce80bd3d37729ee877"}
Apr 20 14:56:58.462919 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:58.462873 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"948924e3-21b3-473a-9035-d51cb2d5f65e","Type":"ContainerStarted","Data":"9b921f982f710b7f4be6a71926d8f99919da944ab31de49f3a73d2e995287c09"}
Apr 20 14:56:58.462919 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:58.462882 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"948924e3-21b3-473a-9035-d51cb2d5f65e","Type":"ContainerStarted","Data":"84843cbf69d805e3c2ca19ee80ccdf2f36aa11c77da67b1d176bdb9b9c1b8729"}
Apr 20 14:56:58.462919 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:58.462891 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"948924e3-21b3-473a-9035-d51cb2d5f65e","Type":"ContainerStarted","Data":"ea681461f3e8666860524c3fa80045b9adcc2bfb701f2bc99bd139efe5bbaf9f"}
Apr 20 14:56:58.462919 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:58.462899 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"948924e3-21b3-473a-9035-d51cb2d5f65e","Type":"ContainerStarted","Data":"cd170329d2eb8c5ad7cf9e89c57f480bc06dfc0d2193636ffe5eebb554361129"}
Apr 20 14:56:58.462919 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:58.462910 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"948924e3-21b3-473a-9035-d51cb2d5f65e","Type":"ContainerStarted","Data":"2753f760302f4d8456edd42876078a8a3877d29f8b3dec08320c1e5ae8ce01dc"}
Apr 20 14:56:58.490143 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:56:58.490084 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.438005538 podStartE2EDuration="6.490069604s" podCreationTimestamp="2026-04-20 14:56:52 +0000 UTC" firstStartedPulling="2026-04-20 14:56:53.427771124 +0000 UTC m=+97.083819407" lastFinishedPulling="2026-04-20 14:56:57.479835189 +0000 UTC m=+101.135883473" observedRunningTime="2026-04-20 14:56:58.487128493 +0000 UTC m=+102.143176844" watchObservedRunningTime="2026-04-20 14:56:58.490069604 +0000 UTC m=+102.146117950"
Apr 20 14:57:00.294861 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:57:00.294832 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-d4wt8"
Apr 20 14:57:01.459168 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:57:01.459140 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-78d4476d99-54g5t"
Apr 20 14:57:02.472537 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:57:02.472497 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 14:57:10.746634 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:57:10.746604 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-6d767b4bfd-nqwbp"
Apr 20 14:57:10.746634 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:57:10.746638 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-6d767b4bfd-nqwbp"
Apr 20 14:57:15.517125 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:57:15.517090 2570 generic.go:358] "Generic (PLEG): container finished" podID="2baebef1-0abd-4dc6-a4f1-5cf8fe74d376" containerID="2d52832da8ec3628ccfdfb88cc72bce211658f458c107ecb20c56ed0a8289860" exitCode=0
Apr 20 14:57:15.517525 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:57:15.517180 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-sj2fv" event={"ID":"2baebef1-0abd-4dc6-a4f1-5cf8fe74d376","Type":"ContainerDied","Data":"2d52832da8ec3628ccfdfb88cc72bce211658f458c107ecb20c56ed0a8289860"}
Apr 20 14:57:15.517741 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:57:15.517721 2570 scope.go:117] "RemoveContainer" containerID="2d52832da8ec3628ccfdfb88cc72bce211658f458c107ecb20c56ed0a8289860"
Apr 20 14:57:16.521896 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:57:16.521862 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-sj2fv" event={"ID":"2baebef1-0abd-4dc6-a4f1-5cf8fe74d376","Type":"ContainerStarted","Data":"4176da4d4eb83e5dd0db375ce6c3eff7e4b926d62febc04e29c682ecd2266e38"}
Apr 20 14:57:30.752581 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:57:30.752547 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-6d767b4bfd-nqwbp"
Apr 20 14:57:30.756312 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:57:30.756279 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-6d767b4bfd-nqwbp"
Apr 20 14:57:52.472663 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:57:52.472620 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 14:57:52.495882 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:57:52.495853 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 14:57:52.656393 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:57:52.656063 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 20 14:58:10.227105 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:58:10.227025 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-76848cdc47-r5v66"]
Apr 20 14:58:10.230568 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:58:10.230535 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-76848cdc47-r5v66"
Apr 20 14:58:10.234923 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:58:10.234893 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\""
Apr 20 14:58:10.235063 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:58:10.235030 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-7fgk7\""
Apr 20 14:58:10.235898 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:58:10.235880 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\""
Apr 20 14:58:10.236006 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:58:10.235917 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\""
Apr 20 14:58:10.236006 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:58:10.235949 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\""
Apr 20 14:58:10.236175 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:58:10.235998 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\""
Apr 20 14:58:10.240082 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:58:10.240064 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\""
Apr 20 14:58:10.250013 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:58:10.249993 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-76848cdc47-r5v66"]
Apr 20 14:58:10.310357 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:58:10.310320 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/ae298d00-8803-4cc4-8ff7-acfdc73593c0-federate-client-tls\") pod \"telemeter-client-76848cdc47-r5v66\" (UID: \"ae298d00-8803-4cc4-8ff7-acfdc73593c0\") " pod="openshift-monitoring/telemeter-client-76848cdc47-r5v66"
Apr 20 14:58:10.310357 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:58:10.310359 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w89gz\" (UniqueName: \"kubernetes.io/projected/ae298d00-8803-4cc4-8ff7-acfdc73593c0-kube-api-access-w89gz\") pod \"telemeter-client-76848cdc47-r5v66\" (UID: \"ae298d00-8803-4cc4-8ff7-acfdc73593c0\") " pod="openshift-monitoring/telemeter-client-76848cdc47-r5v66"
Apr 20 14:58:10.310551 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:58:10.310385 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/ae298d00-8803-4cc4-8ff7-acfdc73593c0-telemeter-client-tls\") pod \"telemeter-client-76848cdc47-r5v66\" (UID: \"ae298d00-8803-4cc4-8ff7-acfdc73593c0\") " pod="openshift-monitoring/telemeter-client-76848cdc47-r5v66"
Apr 20 14:58:10.310551 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:58:10.310463 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ae298d00-8803-4cc4-8ff7-acfdc73593c0-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-76848cdc47-r5v66\" (UID: \"ae298d00-8803-4cc4-8ff7-acfdc73593c0\") " pod="openshift-monitoring/telemeter-client-76848cdc47-r5v66"
Apr 20 14:58:10.310551 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:58:10.310541 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae298d00-8803-4cc4-8ff7-acfdc73593c0-serving-certs-ca-bundle\") pod \"telemeter-client-76848cdc47-r5v66\" (UID: \"ae298d00-8803-4cc4-8ff7-acfdc73593c0\") " pod="openshift-monitoring/telemeter-client-76848cdc47-r5v66"
Apr 20 14:58:10.310672 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:58:10.310585 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ae298d00-8803-4cc4-8ff7-acfdc73593c0-metrics-client-ca\") pod \"telemeter-client-76848cdc47-r5v66\" (UID: \"ae298d00-8803-4cc4-8ff7-acfdc73593c0\") " pod="openshift-monitoring/telemeter-client-76848cdc47-r5v66"
Apr 20 14:58:10.310672 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:58:10.310602 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae298d00-8803-4cc4-8ff7-acfdc73593c0-telemeter-trusted-ca-bundle\") pod \"telemeter-client-76848cdc47-r5v66\" (UID: \"ae298d00-8803-4cc4-8ff7-acfdc73593c0\") " pod="openshift-monitoring/telemeter-client-76848cdc47-r5v66"
Apr 20 14:58:10.310672 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:58:10.310632 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/ae298d00-8803-4cc4-8ff7-acfdc73593c0-secret-telemeter-client\") pod \"telemeter-client-76848cdc47-r5v66\" (UID: \"ae298d00-8803-4cc4-8ff7-acfdc73593c0\") " pod="openshift-monitoring/telemeter-client-76848cdc47-r5v66"
Apr 20 14:58:10.411668 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:58:10.411636 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/ae298d00-8803-4cc4-8ff7-acfdc73593c0-federate-client-tls\") pod \"telemeter-client-76848cdc47-r5v66\" (UID: \"ae298d00-8803-4cc4-8ff7-acfdc73593c0\") " pod="openshift-monitoring/telemeter-client-76848cdc47-r5v66"
Apr 20 14:58:10.411825 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:58:10.411683 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w89gz\" (UniqueName: \"kubernetes.io/projected/ae298d00-8803-4cc4-8ff7-acfdc73593c0-kube-api-access-w89gz\") pod \"telemeter-client-76848cdc47-r5v66\" (UID: \"ae298d00-8803-4cc4-8ff7-acfdc73593c0\") " pod="openshift-monitoring/telemeter-client-76848cdc47-r5v66"
Apr 20 14:58:10.411825 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:58:10.411713 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/ae298d00-8803-4cc4-8ff7-acfdc73593c0-telemeter-client-tls\") pod \"telemeter-client-76848cdc47-r5v66\" (UID: \"ae298d00-8803-4cc4-8ff7-acfdc73593c0\") " pod="openshift-monitoring/telemeter-client-76848cdc47-r5v66"
Apr 20 14:58:10.411825 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:58:10.411747 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ae298d00-8803-4cc4-8ff7-acfdc73593c0-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-76848cdc47-r5v66\" (UID: \"ae298d00-8803-4cc4-8ff7-acfdc73593c0\") " pod="openshift-monitoring/telemeter-client-76848cdc47-r5v66"
Apr 20 14:58:10.411825 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:58:10.411794 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae298d00-8803-4cc4-8ff7-acfdc73593c0-serving-certs-ca-bundle\") pod \"telemeter-client-76848cdc47-r5v66\" (UID: \"ae298d00-8803-4cc4-8ff7-acfdc73593c0\") " pod="openshift-monitoring/telemeter-client-76848cdc47-r5v66"
Apr 20 14:58:10.412023 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:58:10.411834 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ae298d00-8803-4cc4-8ff7-acfdc73593c0-metrics-client-ca\") pod \"telemeter-client-76848cdc47-r5v66\" (UID: \"ae298d00-8803-4cc4-8ff7-acfdc73593c0\") " pod="openshift-monitoring/telemeter-client-76848cdc47-r5v66"
Apr 20 14:58:10.412023 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:58:10.411946 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae298d00-8803-4cc4-8ff7-acfdc73593c0-telemeter-trusted-ca-bundle\") pod \"telemeter-client-76848cdc47-r5v66\" (UID: \"ae298d00-8803-4cc4-8ff7-acfdc73593c0\") " pod="openshift-monitoring/telemeter-client-76848cdc47-r5v66"
Apr 20 14:58:10.412023 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:58:10.412005 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/ae298d00-8803-4cc4-8ff7-acfdc73593c0-secret-telemeter-client\") pod \"telemeter-client-76848cdc47-r5v66\" (UID: \"ae298d00-8803-4cc4-8ff7-acfdc73593c0\") " pod="openshift-monitoring/telemeter-client-76848cdc47-r5v66"
Apr 20 14:58:10.412925 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:58:10.412895 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ae298d00-8803-4cc4-8ff7-acfdc73593c0-metrics-client-ca\") pod \"telemeter-client-76848cdc47-r5v66\" (UID: \"ae298d00-8803-4cc4-8ff7-acfdc73593c0\") " pod="openshift-monitoring/telemeter-client-76848cdc47-r5v66"
Apr 20 14:58:10.412925 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:58:10.412922 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae298d00-8803-4cc4-8ff7-acfdc73593c0-telemeter-trusted-ca-bundle\") pod \"telemeter-client-76848cdc47-r5v66\" (UID: \"ae298d00-8803-4cc4-8ff7-acfdc73593c0\") " pod="openshift-monitoring/telemeter-client-76848cdc47-r5v66"
Apr 20 14:58:10.413072 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:58:10.413003 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ae298d00-8803-4cc4-8ff7-acfdc73593c0-serving-certs-ca-bundle\") pod \"telemeter-client-76848cdc47-r5v66\" (UID: \"ae298d00-8803-4cc4-8ff7-acfdc73593c0\") " pod="openshift-monitoring/telemeter-client-76848cdc47-r5v66"
Apr 20 14:58:10.414562 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:58:10.414536
2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/ae298d00-8803-4cc4-8ff7-acfdc73593c0-telemeter-client-tls\") pod \"telemeter-client-76848cdc47-r5v66\" (UID: \"ae298d00-8803-4cc4-8ff7-acfdc73593c0\") " pod="openshift-monitoring/telemeter-client-76848cdc47-r5v66" Apr 20 14:58:10.414765 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:58:10.414741 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/ae298d00-8803-4cc4-8ff7-acfdc73593c0-secret-telemeter-client\") pod \"telemeter-client-76848cdc47-r5v66\" (UID: \"ae298d00-8803-4cc4-8ff7-acfdc73593c0\") " pod="openshift-monitoring/telemeter-client-76848cdc47-r5v66" Apr 20 14:58:10.414810 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:58:10.414781 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/ae298d00-8803-4cc4-8ff7-acfdc73593c0-federate-client-tls\") pod \"telemeter-client-76848cdc47-r5v66\" (UID: \"ae298d00-8803-4cc4-8ff7-acfdc73593c0\") " pod="openshift-monitoring/telemeter-client-76848cdc47-r5v66" Apr 20 14:58:10.414810 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:58:10.414799 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ae298d00-8803-4cc4-8ff7-acfdc73593c0-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-76848cdc47-r5v66\" (UID: \"ae298d00-8803-4cc4-8ff7-acfdc73593c0\") " pod="openshift-monitoring/telemeter-client-76848cdc47-r5v66" Apr 20 14:58:10.420532 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:58:10.420510 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w89gz\" (UniqueName: \"kubernetes.io/projected/ae298d00-8803-4cc4-8ff7-acfdc73593c0-kube-api-access-w89gz\") pod \"telemeter-client-76848cdc47-r5v66\" (UID: \"ae298d00-8803-4cc4-8ff7-acfdc73593c0\") " pod="openshift-monitoring/telemeter-client-76848cdc47-r5v66" Apr 20 14:58:10.540723 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:58:10.540700 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-76848cdc47-r5v66" Apr 20 14:58:10.673348 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:58:10.673323 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-76848cdc47-r5v66"] Apr 20 14:58:10.675362 ip-10-0-133-163 kubenswrapper[2570]: W0420 14:58:10.675332 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae298d00_8803_4cc4_8ff7_acfdc73593c0.slice/crio-6bb7949e043442ae472f270f4c6ef334b16c724387d040c2a04206aa7cddf6af WatchSource:0}: Error finding container 6bb7949e043442ae472f270f4c6ef334b16c724387d040c2a04206aa7cddf6af: Status 404 returned error can't find the container with id 6bb7949e043442ae472f270f4c6ef334b16c724387d040c2a04206aa7cddf6af Apr 20 14:58:10.689851 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:58:10.689820 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-76848cdc47-r5v66" event={"ID":"ae298d00-8803-4cc4-8ff7-acfdc73593c0","Type":"ContainerStarted","Data":"6bb7949e043442ae472f270f4c6ef334b16c724387d040c2a04206aa7cddf6af"} Apr 20 14:58:12.698134 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:58:12.698042 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-76848cdc47-r5v66" event={"ID":"ae298d00-8803-4cc4-8ff7-acfdc73593c0","Type":"ContainerStarted","Data":"94ff0eacb413074e1596fc7a09fafc693f161194539a96ccdcbd13a4ced176bf"} Apr 20 14:58:12.698134 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:58:12.698084 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-76848cdc47-r5v66" event={"ID":"ae298d00-8803-4cc4-8ff7-acfdc73593c0","Type":"ContainerStarted","Data":"693ce1889d25cfdd9f1b74111c44cf87528699b9e02590dd17306962c0c64d51"} Apr 20 14:58:12.698134 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:58:12.698094 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-76848cdc47-r5v66" event={"ID":"ae298d00-8803-4cc4-8ff7-acfdc73593c0","Type":"ContainerStarted","Data":"3327614970a1edcb164fc21541743caa028e1d59c41020958c24d17854f89437"} Apr 20 14:58:12.722697 ip-10-0-133-163 kubenswrapper[2570]: I0420 14:58:12.722648 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-76848cdc47-r5v66" podStartSLOduration=1.054828528 podStartE2EDuration="2.722631665s" podCreationTimestamp="2026-04-20 14:58:10 +0000 UTC" firstStartedPulling="2026-04-20 14:58:10.677220407 +0000 UTC m=+174.333268687" lastFinishedPulling="2026-04-20 14:58:12.345023544 +0000 UTC m=+176.001071824" observedRunningTime="2026-04-20 14:58:12.720702447 +0000 UTC m=+176.376750774" watchObservedRunningTime="2026-04-20 14:58:12.722631665 +0000 UTC m=+176.378679966" Apr 20 15:00:16.838335 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:00:16.838290 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9x87_f6944a1f-03f8-4115-899e-e5c61d0d6075/ovn-acl-logging/0.log" Apr 20 15:00:16.840117 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:00:16.840097 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9x87_f6944a1f-03f8-4115-899e-e5c61d0d6075/ovn-acl-logging/0.log" Apr 20 15:00:16.841666 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:00:16.841648 2570 kubelet.go:1628] "Image garbage collection succeeded" Apr 20 
15:01:24.625996 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:01:24.625937 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-99ff97f7d-hxndh"] Apr 20 15:01:24.633371 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:01:24.633346 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-99ff97f7d-hxndh" Apr 20 15:01:24.635924 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:01:24.635892 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 20 15:01:24.636060 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:01:24.635925 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-vsz6j\"" Apr 20 15:01:24.636060 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:01:24.636040 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 20 15:01:24.636261 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:01:24.636247 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 20 15:01:24.636367 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:01:24.636349 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 20 15:01:24.650265 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:01:24.650241 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-99ff97f7d-hxndh"] Apr 20 15:01:24.743388 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:01:24.743368 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ss2bc\" (UniqueName: \"kubernetes.io/projected/ad8ba0ee-5509-4f32-96e5-6b0de0b47177-kube-api-access-ss2bc\") pod \"opendatahub-operator-controller-manager-99ff97f7d-hxndh\" (UID: \"ad8ba0ee-5509-4f32-96e5-6b0de0b47177\") " pod="opendatahub/opendatahub-operator-controller-manager-99ff97f7d-hxndh" Apr 20 15:01:24.743499 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:01:24.743403 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ad8ba0ee-5509-4f32-96e5-6b0de0b47177-apiservice-cert\") pod \"opendatahub-operator-controller-manager-99ff97f7d-hxndh\" (UID: \"ad8ba0ee-5509-4f32-96e5-6b0de0b47177\") " pod="opendatahub/opendatahub-operator-controller-manager-99ff97f7d-hxndh" Apr 20 15:01:24.743499 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:01:24.743437 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ad8ba0ee-5509-4f32-96e5-6b0de0b47177-webhook-cert\") pod \"opendatahub-operator-controller-manager-99ff97f7d-hxndh\" (UID: \"ad8ba0ee-5509-4f32-96e5-6b0de0b47177\") " pod="opendatahub/opendatahub-operator-controller-manager-99ff97f7d-hxndh" Apr 20 15:01:24.844464 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:01:24.844436 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ss2bc\" (UniqueName: \"kubernetes.io/projected/ad8ba0ee-5509-4f32-96e5-6b0de0b47177-kube-api-access-ss2bc\") pod 
\"opendatahub-operator-controller-manager-99ff97f7d-hxndh\" (UID: \"ad8ba0ee-5509-4f32-96e5-6b0de0b47177\") " pod="opendatahub/opendatahub-operator-controller-manager-99ff97f7d-hxndh" Apr 20 15:01:24.844568 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:01:24.844473 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ad8ba0ee-5509-4f32-96e5-6b0de0b47177-apiservice-cert\") pod \"opendatahub-operator-controller-manager-99ff97f7d-hxndh\" (UID: \"ad8ba0ee-5509-4f32-96e5-6b0de0b47177\") " pod="opendatahub/opendatahub-operator-controller-manager-99ff97f7d-hxndh" Apr 20 15:01:24.844688 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:01:24.844668 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ad8ba0ee-5509-4f32-96e5-6b0de0b47177-webhook-cert\") pod \"opendatahub-operator-controller-manager-99ff97f7d-hxndh\" (UID: \"ad8ba0ee-5509-4f32-96e5-6b0de0b47177\") " pod="opendatahub/opendatahub-operator-controller-manager-99ff97f7d-hxndh" Apr 20 15:01:24.847202 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:01:24.847180 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ad8ba0ee-5509-4f32-96e5-6b0de0b47177-webhook-cert\") pod \"opendatahub-operator-controller-manager-99ff97f7d-hxndh\" (UID: \"ad8ba0ee-5509-4f32-96e5-6b0de0b47177\") " pod="opendatahub/opendatahub-operator-controller-manager-99ff97f7d-hxndh" Apr 20 15:01:24.847278 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:01:24.847180 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ad8ba0ee-5509-4f32-96e5-6b0de0b47177-apiservice-cert\") pod \"opendatahub-operator-controller-manager-99ff97f7d-hxndh\" (UID: \"ad8ba0ee-5509-4f32-96e5-6b0de0b47177\") " pod="opendatahub/opendatahub-operator-controller-manager-99ff97f7d-hxndh" Apr 20 15:01:24.852334 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:01:24.852295 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ss2bc\" (UniqueName: \"kubernetes.io/projected/ad8ba0ee-5509-4f32-96e5-6b0de0b47177-kube-api-access-ss2bc\") pod \"opendatahub-operator-controller-manager-99ff97f7d-hxndh\" (UID: \"ad8ba0ee-5509-4f32-96e5-6b0de0b47177\") " pod="opendatahub/opendatahub-operator-controller-manager-99ff97f7d-hxndh" Apr 20 15:01:24.945034 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:01:24.944971 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-99ff97f7d-hxndh" Apr 20 15:01:25.067647 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:01:25.067621 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-99ff97f7d-hxndh"] Apr 20 15:01:25.070595 ip-10-0-133-163 kubenswrapper[2570]: W0420 15:01:25.070565 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad8ba0ee_5509_4f32_96e5_6b0de0b47177.slice/crio-54b05bff03decc2e00668b6dd35fa5e5b3a8aabdf1be47ce75702e6ab02d42e0 WatchSource:0}: Error finding container 54b05bff03decc2e00668b6dd35fa5e5b3a8aabdf1be47ce75702e6ab02d42e0: Status 404 returned error can't find the container with id 54b05bff03decc2e00668b6dd35fa5e5b3a8aabdf1be47ce75702e6ab02d42e0 Apr 20 15:01:25.072844 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:01:25.072826 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 15:01:25.292281 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:01:25.292247 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-99ff97f7d-hxndh" event={"ID":"ad8ba0ee-5509-4f32-96e5-6b0de0b47177","Type":"ContainerStarted","Data":"54b05bff03decc2e00668b6dd35fa5e5b3a8aabdf1be47ce75702e6ab02d42e0"} Apr 20 15:01:28.304903 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:01:28.304868 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-99ff97f7d-hxndh" event={"ID":"ad8ba0ee-5509-4f32-96e5-6b0de0b47177","Type":"ContainerStarted","Data":"eec6dfc2bc9d78de5037b10042213bbb2e421896be9c20730d253843f1c18ce4"} Apr 20 15:01:28.305399 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:01:28.305032 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-99ff97f7d-hxndh" Apr 20 15:01:28.327975 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:01:28.327925 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-99ff97f7d-hxndh" podStartSLOduration=1.922864407 podStartE2EDuration="4.327912744s" podCreationTimestamp="2026-04-20 15:01:24 +0000 UTC" firstStartedPulling="2026-04-20 15:01:25.072944823 +0000 UTC m=+368.728993102" lastFinishedPulling="2026-04-20 15:01:27.477993157 +0000 UTC m=+371.134041439" observedRunningTime="2026-04-20 15:01:28.325671553 +0000 UTC m=+371.981719856" watchObservedRunningTime="2026-04-20 15:01:28.327912744 +0000 UTC m=+371.983961046" Apr 20 15:01:39.310677 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:01:39.310646 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-99ff97f7d-hxndh" Apr 20 15:01:45.029598 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:01:45.029564 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-6687ffb5c6-v6xtg"] Apr 20 15:01:45.035373 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:01:45.035352 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-6687ffb5c6-v6xtg" Apr 20 15:01:45.038425 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:01:45.038389 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 20 15:01:45.039829 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:01:45.039804 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 20 15:01:45.039964 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:01:45.039859 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 20 15:01:45.039964 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:01:45.039873 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 20 15:01:45.039964 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:01:45.039811 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 20 15:01:45.040136 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:01:45.040107 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-nplmq\"" Apr 20 15:01:45.041606 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:01:45.041583 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-6687ffb5c6-v6xtg"] Apr 20 15:01:45.207244 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:01:45.207218 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdg2v\" (UniqueName: \"kubernetes.io/projected/6660b740-f666-473b-a2c9-b5ced164f05b-kube-api-access-vdg2v\") pod \"lws-controller-manager-6687ffb5c6-v6xtg\" (UID: \"6660b740-f666-473b-a2c9-b5ced164f05b\") " pod="openshift-lws-operator/lws-controller-manager-6687ffb5c6-v6xtg" Apr 20 15:01:45.207402 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:01:45.207248 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/6660b740-f666-473b-a2c9-b5ced164f05b-manager-config\") pod \"lws-controller-manager-6687ffb5c6-v6xtg\" (UID: \"6660b740-f666-473b-a2c9-b5ced164f05b\") " pod="openshift-lws-operator/lws-controller-manager-6687ffb5c6-v6xtg" Apr 20 15:01:45.207402 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:01:45.207268 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/6660b740-f666-473b-a2c9-b5ced164f05b-metrics-cert\") pod \"lws-controller-manager-6687ffb5c6-v6xtg\" (UID: \"6660b740-f666-473b-a2c9-b5ced164f05b\") " pod="openshift-lws-operator/lws-controller-manager-6687ffb5c6-v6xtg" Apr 20 15:01:45.207402 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:01:45.207334 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6660b740-f666-473b-a2c9-b5ced164f05b-cert\") pod \"lws-controller-manager-6687ffb5c6-v6xtg\" (UID: \"6660b740-f666-473b-a2c9-b5ced164f05b\") " pod="openshift-lws-operator/lws-controller-manager-6687ffb5c6-v6xtg" Apr 20 15:01:45.308540 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:01:45.308471 2570 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6660b740-f666-473b-a2c9-b5ced164f05b-cert\") pod \"lws-controller-manager-6687ffb5c6-v6xtg\" (UID: \"6660b740-f666-473b-a2c9-b5ced164f05b\") " pod="openshift-lws-operator/lws-controller-manager-6687ffb5c6-v6xtg" Apr 20 15:01:45.308662 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:01:45.308544 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vdg2v\" (UniqueName: \"kubernetes.io/projected/6660b740-f666-473b-a2c9-b5ced164f05b-kube-api-access-vdg2v\") pod \"lws-controller-manager-6687ffb5c6-v6xtg\" (UID: \"6660b740-f666-473b-a2c9-b5ced164f05b\") " pod="openshift-lws-operator/lws-controller-manager-6687ffb5c6-v6xtg" Apr 20 15:01:45.308662 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:01:45.308566 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/6660b740-f666-473b-a2c9-b5ced164f05b-manager-config\") pod \"lws-controller-manager-6687ffb5c6-v6xtg\" (UID: \"6660b740-f666-473b-a2c9-b5ced164f05b\") " pod="openshift-lws-operator/lws-controller-manager-6687ffb5c6-v6xtg" Apr 20 15:01:45.308771 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:01:45.308654 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/6660b740-f666-473b-a2c9-b5ced164f05b-metrics-cert\") pod \"lws-controller-manager-6687ffb5c6-v6xtg\" (UID: \"6660b740-f666-473b-a2c9-b5ced164f05b\") " pod="openshift-lws-operator/lws-controller-manager-6687ffb5c6-v6xtg" Apr 20 15:01:45.309130 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:01:45.309105 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/6660b740-f666-473b-a2c9-b5ced164f05b-manager-config\") pod \"lws-controller-manager-6687ffb5c6-v6xtg\" (UID: \"6660b740-f666-473b-a2c9-b5ced164f05b\") " pod="openshift-lws-operator/lws-controller-manager-6687ffb5c6-v6xtg" Apr 20 15:01:45.311029 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:01:45.311007 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6660b740-f666-473b-a2c9-b5ced164f05b-cert\") pod \"lws-controller-manager-6687ffb5c6-v6xtg\" (UID: \"6660b740-f666-473b-a2c9-b5ced164f05b\") " pod="openshift-lws-operator/lws-controller-manager-6687ffb5c6-v6xtg" Apr 20 15:01:45.311163 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:01:45.311145 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/6660b740-f666-473b-a2c9-b5ced164f05b-metrics-cert\") pod \"lws-controller-manager-6687ffb5c6-v6xtg\" (UID: \"6660b740-f666-473b-a2c9-b5ced164f05b\") " pod="openshift-lws-operator/lws-controller-manager-6687ffb5c6-v6xtg" Apr 20 15:01:45.316917 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:01:45.316894 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdg2v\" (UniqueName: \"kubernetes.io/projected/6660b740-f666-473b-a2c9-b5ced164f05b-kube-api-access-vdg2v\") pod \"lws-controller-manager-6687ffb5c6-v6xtg\" (UID: \"6660b740-f666-473b-a2c9-b5ced164f05b\") " pod="openshift-lws-operator/lws-controller-manager-6687ffb5c6-v6xtg" Apr 20 15:01:45.345907 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:01:45.345882 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-6687ffb5c6-v6xtg" Apr 20 15:01:45.469071 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:01:45.469047 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-6687ffb5c6-v6xtg"] Apr 20 15:01:45.471030 ip-10-0-133-163 kubenswrapper[2570]: W0420 15:01:45.471001 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6660b740_f666_473b_a2c9_b5ced164f05b.slice/crio-abcc907fc8e1f5a04e2b64a68725b17f0c8fe0396f589d695c8ca1b16450677e WatchSource:0}: Error finding container abcc907fc8e1f5a04e2b64a68725b17f0c8fe0396f589d695c8ca1b16450677e: Status 404 returned error can't find the container with id abcc907fc8e1f5a04e2b64a68725b17f0c8fe0396f589d695c8ca1b16450677e Apr 20 15:01:46.371114 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:01:46.371056 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-6687ffb5c6-v6xtg" event={"ID":"6660b740-f666-473b-a2c9-b5ced164f05b","Type":"ContainerStarted","Data":"abcc907fc8e1f5a04e2b64a68725b17f0c8fe0396f589d695c8ca1b16450677e"} Apr 20 15:01:49.384913 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:01:49.384873 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-6687ffb5c6-v6xtg" event={"ID":"6660b740-f666-473b-a2c9-b5ced164f05b","Type":"ContainerStarted","Data":"a2a61742f375394d1b5eef49bd2bf4713dbdafb2a34d1765eb2d19d9967029be"} Apr 20 15:01:49.385292 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:01:49.384996 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-6687ffb5c6-v6xtg" Apr 20 15:01:49.402932 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:01:49.402877 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-6687ffb5c6-v6xtg" podStartSLOduration=1.13510299 podStartE2EDuration="4.402863311s" podCreationTimestamp="2026-04-20 15:01:45 +0000 UTC" firstStartedPulling="2026-04-20 15:01:45.472808325 +0000 UTC m=+389.128856606" lastFinishedPulling="2026-04-20 15:01:48.740568637 +0000 UTC m=+392.396616927" observedRunningTime="2026-04-20 15:01:49.402011947 +0000 UTC m=+393.058060248" watchObservedRunningTime="2026-04-20 15:01:49.402863311 +0000 UTC m=+393.058911614" Apr 20 15:02:00.391125 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:00.391094 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-6687ffb5c6-v6xtg" Apr 20 15:02:36.704711 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:36.704636 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5g899w"] Apr 20 15:02:36.707208 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:36.707185 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5g899w" Apr 20 15:02:36.710091 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:36.710064 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"data-science-gateway-data-science-gateway-class-dockercfg-mc7kh\"" Apr 20 15:02:36.710437 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:36.710413 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 20 15:02:36.724331 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:36.724284 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5g899w"] Apr 20 15:02:36.825471 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:36.825445 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/714943ea-06f1-4dd7-892a-657beda4992a-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5g899w\" (UID: \"714943ea-06f1-4dd7-892a-657beda4992a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5g899w" Apr 20 15:02:36.825579 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:36.825482 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/714943ea-06f1-4dd7-892a-657beda4992a-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5g899w\" (UID: \"714943ea-06f1-4dd7-892a-657beda4992a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5g899w" Apr 20 15:02:36.825579 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:36.825513 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/714943ea-06f1-4dd7-892a-657beda4992a-istio-data\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5g899w\" (UID: \"714943ea-06f1-4dd7-892a-657beda4992a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5g899w" Apr 20 15:02:36.825579 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:36.825565 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxvc9\" (UniqueName: \"kubernetes.io/projected/714943ea-06f1-4dd7-892a-657beda4992a-kube-api-access-sxvc9\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5g899w\" (UID: \"714943ea-06f1-4dd7-892a-657beda4992a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5g899w" Apr 20 15:02:36.825732 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:36.825588 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/714943ea-06f1-4dd7-892a-657beda4992a-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5g899w\" (UID: \"714943ea-06f1-4dd7-892a-657beda4992a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5g899w" Apr 20 15:02:36.825732 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:36.825627 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: 
\"kubernetes.io/empty-dir/714943ea-06f1-4dd7-892a-657beda4992a-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5g899w\" (UID: \"714943ea-06f1-4dd7-892a-657beda4992a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5g899w" Apr 20 15:02:36.825732 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:36.825644 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/714943ea-06f1-4dd7-892a-657beda4992a-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5g899w\" (UID: \"714943ea-06f1-4dd7-892a-657beda4992a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5g899w" Apr 20 15:02:36.825732 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:36.825681 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/714943ea-06f1-4dd7-892a-657beda4992a-istio-token\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5g899w\" (UID: \"714943ea-06f1-4dd7-892a-657beda4992a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5g899w" Apr 20 15:02:36.825732 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:36.825719 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/714943ea-06f1-4dd7-892a-657beda4992a-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5g899w\" (UID: \"714943ea-06f1-4dd7-892a-657beda4992a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5g899w" Apr 20 15:02:36.926102 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:36.926077 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/714943ea-06f1-4dd7-892a-657beda4992a-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5g899w\" (UID: \"714943ea-06f1-4dd7-892a-657beda4992a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5g899w" Apr 20 15:02:36.926215 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:36.926114 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/714943ea-06f1-4dd7-892a-657beda4992a-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5g899w\" (UID: \"714943ea-06f1-4dd7-892a-657beda4992a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5g899w" Apr 20 15:02:36.926215 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:36.926136 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/714943ea-06f1-4dd7-892a-657beda4992a-istio-data\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5g899w\" (UID: \"714943ea-06f1-4dd7-892a-657beda4992a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5g899w" Apr 20 15:02:36.926350 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:36.926242 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sxvc9\" (UniqueName: \"kubernetes.io/projected/714943ea-06f1-4dd7-892a-657beda4992a-kube-api-access-sxvc9\") pod 
\"data-science-gateway-data-science-gateway-class-85d9c6bcd5g899w\" (UID: \"714943ea-06f1-4dd7-892a-657beda4992a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5g899w" Apr 20 15:02:36.926350 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:36.926277 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/714943ea-06f1-4dd7-892a-657beda4992a-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5g899w\" (UID: \"714943ea-06f1-4dd7-892a-657beda4992a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5g899w" Apr 20 15:02:36.926458 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:36.926362 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/714943ea-06f1-4dd7-892a-657beda4992a-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5g899w\" (UID: \"714943ea-06f1-4dd7-892a-657beda4992a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5g899w" Apr 20 15:02:36.926458 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:36.926391 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/714943ea-06f1-4dd7-892a-657beda4992a-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5g899w\" (UID: \"714943ea-06f1-4dd7-892a-657beda4992a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5g899w" Apr 20 15:02:36.926458 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:36.926416 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/714943ea-06f1-4dd7-892a-657beda4992a-istio-token\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5g899w\" (UID: \"714943ea-06f1-4dd7-892a-657beda4992a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5g899w" Apr 20 15:02:36.926458 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:36.926441 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/714943ea-06f1-4dd7-892a-657beda4992a-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5g899w\" (UID: \"714943ea-06f1-4dd7-892a-657beda4992a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5g899w" Apr 20 15:02:36.926674 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:36.926619 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/714943ea-06f1-4dd7-892a-657beda4992a-istio-data\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5g899w\" (UID: \"714943ea-06f1-4dd7-892a-657beda4992a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5g899w" Apr 20 15:02:36.926724 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:36.926688 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/714943ea-06f1-4dd7-892a-657beda4992a-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5g899w\" (UID: \"714943ea-06f1-4dd7-892a-657beda4992a\") " 
pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5g899w" Apr 20 15:02:36.926908 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:36.926866 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/714943ea-06f1-4dd7-892a-657beda4992a-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5g899w\" (UID: \"714943ea-06f1-4dd7-892a-657beda4992a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5g899w" Apr 20 15:02:36.927186 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:36.927006 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/714943ea-06f1-4dd7-892a-657beda4992a-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5g899w\" (UID: \"714943ea-06f1-4dd7-892a-657beda4992a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5g899w" Apr 20 15:02:36.927186 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:36.927060 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/714943ea-06f1-4dd7-892a-657beda4992a-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5g899w\" (UID: \"714943ea-06f1-4dd7-892a-657beda4992a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5g899w" Apr 20 15:02:36.928560 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:36.928537 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/714943ea-06f1-4dd7-892a-657beda4992a-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5g899w\" (UID: \"714943ea-06f1-4dd7-892a-657beda4992a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5g899w" Apr 20 15:02:36.929278 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:36.929255 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/714943ea-06f1-4dd7-892a-657beda4992a-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5g899w\" (UID: \"714943ea-06f1-4dd7-892a-657beda4992a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5g899w" Apr 20 15:02:36.938171 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:36.938149 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxvc9\" (UniqueName: \"kubernetes.io/projected/714943ea-06f1-4dd7-892a-657beda4992a-kube-api-access-sxvc9\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5g899w\" (UID: \"714943ea-06f1-4dd7-892a-657beda4992a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5g899w" Apr 20 15:02:36.938290 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:36.938269 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/714943ea-06f1-4dd7-892a-657beda4992a-istio-token\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd5g899w\" (UID: \"714943ea-06f1-4dd7-892a-657beda4992a\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5g899w" Apr 20 15:02:36.963254 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:36.963199 2570 kubelet.go:2537] "SyncLoop 
ADD" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8zr2h"] Apr 20 15:02:36.966377 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:36.966359 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8zr2h" Apr 20 15:02:36.977071 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:36.977050 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8zr2h"] Apr 20 15:02:37.019493 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:37.019458 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5g899w" Apr 20 15:02:37.027498 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:37.027475 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/04721783-0eb5-4a58-b550-5e18f6b0d95d-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf8zr2h\" (UID: \"04721783-0eb5-4a58-b550-5e18f6b0d95d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8zr2h" Apr 20 15:02:37.027592 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:37.027508 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/04721783-0eb5-4a58-b550-5e18f6b0d95d-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf8zr2h\" (UID: \"04721783-0eb5-4a58-b550-5e18f6b0d95d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8zr2h" Apr 20 15:02:37.027592 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:37.027537 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/04721783-0eb5-4a58-b550-5e18f6b0d95d-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf8zr2h\" (UID: \"04721783-0eb5-4a58-b550-5e18f6b0d95d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8zr2h" Apr 20 15:02:37.027678 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:37.027602 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/04721783-0eb5-4a58-b550-5e18f6b0d95d-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf8zr2h\" (UID: \"04721783-0eb5-4a58-b550-5e18f6b0d95d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8zr2h" Apr 20 15:02:37.027678 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:37.027666 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/04721783-0eb5-4a58-b550-5e18f6b0d95d-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf8zr2h\" (UID: \"04721783-0eb5-4a58-b550-5e18f6b0d95d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8zr2h" Apr 20 15:02:37.027760 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:37.027694 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: 
\"kubernetes.io/configmap/04721783-0eb5-4a58-b550-5e18f6b0d95d-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf8zr2h\" (UID: \"04721783-0eb5-4a58-b550-5e18f6b0d95d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8zr2h" Apr 20 15:02:37.027760 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:37.027741 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/04721783-0eb5-4a58-b550-5e18f6b0d95d-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf8zr2h\" (UID: \"04721783-0eb5-4a58-b550-5e18f6b0d95d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8zr2h" Apr 20 15:02:37.027837 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:37.027776 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gx8gf\" (UniqueName: \"kubernetes.io/projected/04721783-0eb5-4a58-b550-5e18f6b0d95d-kube-api-access-gx8gf\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf8zr2h\" (UID: \"04721783-0eb5-4a58-b550-5e18f6b0d95d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8zr2h" Apr 20 15:02:37.027837 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:37.027807 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/04721783-0eb5-4a58-b550-5e18f6b0d95d-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf8zr2h\" (UID: \"04721783-0eb5-4a58-b550-5e18f6b0d95d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8zr2h" Apr 20 15:02:37.129186 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:37.129154 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/04721783-0eb5-4a58-b550-5e18f6b0d95d-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf8zr2h\" (UID: \"04721783-0eb5-4a58-b550-5e18f6b0d95d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8zr2h" Apr 20 15:02:37.129329 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:37.129194 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/04721783-0eb5-4a58-b550-5e18f6b0d95d-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf8zr2h\" (UID: \"04721783-0eb5-4a58-b550-5e18f6b0d95d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8zr2h" Apr 20 15:02:37.129374 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:37.129354 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/04721783-0eb5-4a58-b550-5e18f6b0d95d-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf8zr2h\" (UID: \"04721783-0eb5-4a58-b550-5e18f6b0d95d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8zr2h" Apr 20 15:02:37.129407 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:37.129396 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/04721783-0eb5-4a58-b550-5e18f6b0d95d-istio-envoy\") pod 
\"data-science-gateway-data-science-gateway-class-5cb8b776cf8zr2h\" (UID: \"04721783-0eb5-4a58-b550-5e18f6b0d95d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8zr2h" Apr 20 15:02:37.129439 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:37.129429 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/04721783-0eb5-4a58-b550-5e18f6b0d95d-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf8zr2h\" (UID: \"04721783-0eb5-4a58-b550-5e18f6b0d95d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8zr2h" Apr 20 15:02:37.129473 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:37.129445 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/04721783-0eb5-4a58-b550-5e18f6b0d95d-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf8zr2h\" (UID: \"04721783-0eb5-4a58-b550-5e18f6b0d95d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8zr2h" Apr 20 15:02:37.129513 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:37.129488 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/04721783-0eb5-4a58-b550-5e18f6b0d95d-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf8zr2h\" (UID: \"04721783-0eb5-4a58-b550-5e18f6b0d95d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8zr2h" Apr 20 15:02:37.129556 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:37.129510 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/04721783-0eb5-4a58-b550-5e18f6b0d95d-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf8zr2h\" (UID: \"04721783-0eb5-4a58-b550-5e18f6b0d95d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8zr2h" Apr 20 15:02:37.129556 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:37.129529 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gx8gf\" (UniqueName: \"kubernetes.io/projected/04721783-0eb5-4a58-b550-5e18f6b0d95d-kube-api-access-gx8gf\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf8zr2h\" (UID: \"04721783-0eb5-4a58-b550-5e18f6b0d95d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8zr2h" Apr 20 15:02:37.129655 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:37.129560 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/04721783-0eb5-4a58-b550-5e18f6b0d95d-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf8zr2h\" (UID: \"04721783-0eb5-4a58-b550-5e18f6b0d95d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8zr2h" Apr 20 15:02:37.129655 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:37.129565 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/04721783-0eb5-4a58-b550-5e18f6b0d95d-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf8zr2h\" (UID: \"04721783-0eb5-4a58-b550-5e18f6b0d95d\") " 
pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8zr2h" Apr 20 15:02:37.129773 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:37.129753 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/04721783-0eb5-4a58-b550-5e18f6b0d95d-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf8zr2h\" (UID: \"04721783-0eb5-4a58-b550-5e18f6b0d95d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8zr2h" Apr 20 15:02:37.129843 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:37.129822 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/04721783-0eb5-4a58-b550-5e18f6b0d95d-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf8zr2h\" (UID: \"04721783-0eb5-4a58-b550-5e18f6b0d95d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8zr2h" Apr 20 15:02:37.130416 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:37.130389 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/04721783-0eb5-4a58-b550-5e18f6b0d95d-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf8zr2h\" (UID: \"04721783-0eb5-4a58-b550-5e18f6b0d95d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8zr2h" Apr 20 15:02:37.131842 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:37.131822 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/04721783-0eb5-4a58-b550-5e18f6b0d95d-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf8zr2h\" (UID: \"04721783-0eb5-4a58-b550-5e18f6b0d95d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8zr2h" Apr 20 15:02:37.131937 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:37.131920 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/04721783-0eb5-4a58-b550-5e18f6b0d95d-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf8zr2h\" (UID: \"04721783-0eb5-4a58-b550-5e18f6b0d95d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8zr2h" Apr 20 15:02:37.142757 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:37.142739 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/04721783-0eb5-4a58-b550-5e18f6b0d95d-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf8zr2h\" (UID: \"04721783-0eb5-4a58-b550-5e18f6b0d95d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8zr2h" Apr 20 15:02:37.144205 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:37.144185 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gx8gf\" (UniqueName: \"kubernetes.io/projected/04721783-0eb5-4a58-b550-5e18f6b0d95d-kube-api-access-gx8gf\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf8zr2h\" (UID: \"04721783-0eb5-4a58-b550-5e18f6b0d95d\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8zr2h" Apr 20 15:02:37.154753 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:37.154687 2570 kubelet.go:2544] "SyncLoop UPDATE" 
source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5g899w"] Apr 20 15:02:37.157114 ip-10-0-133-163 kubenswrapper[2570]: W0420 15:02:37.157092 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod714943ea_06f1_4dd7_892a_657beda4992a.slice/crio-a3a7d4deeb43048883d010b85edef30cdc3713b5b29ad6c97808404d46f68c00 WatchSource:0}: Error finding container a3a7d4deeb43048883d010b85edef30cdc3713b5b29ad6c97808404d46f68c00: Status 404 returned error can't find the container with id a3a7d4deeb43048883d010b85edef30cdc3713b5b29ad6c97808404d46f68c00 Apr 20 15:02:37.277652 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:37.277623 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8zr2h" Apr 20 15:02:37.394291 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:37.394266 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8zr2h"] Apr 20 15:02:37.396395 ip-10-0-133-163 kubenswrapper[2570]: W0420 15:02:37.396367 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04721783_0eb5_4a58_b550_5e18f6b0d95d.slice/crio-b6c6b0a80aaf1032b1cc178732801719c66be18e3318246207ce357e9b5a2d35 WatchSource:0}: Error finding container b6c6b0a80aaf1032b1cc178732801719c66be18e3318246207ce357e9b5a2d35: Status 404 returned error can't find the container with id b6c6b0a80aaf1032b1cc178732801719c66be18e3318246207ce357e9b5a2d35 Apr 20 15:02:37.547176 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:37.547098 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5g899w" event={"ID":"714943ea-06f1-4dd7-892a-657beda4992a","Type":"ContainerStarted","Data":"a3a7d4deeb43048883d010b85edef30cdc3713b5b29ad6c97808404d46f68c00"} Apr 20 15:02:37.548180 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:37.548156 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8zr2h" event={"ID":"04721783-0eb5-4a58-b550-5e18f6b0d95d","Type":"ContainerStarted","Data":"b6c6b0a80aaf1032b1cc178732801719c66be18e3318246207ce357e9b5a2d35"} Apr 20 15:02:39.787683 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:39.787643 2570 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892156Ki","pods":"250"} Apr 20 15:02:39.788021 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:39.787714 2570 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892156Ki","pods":"250"} Apr 20 15:02:39.788021 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:39.787740 2570 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892156Ki","pods":"250"} Apr 20 15:02:39.793163 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:39.793137 2570 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892156Ki","pods":"250"} Apr 20 15:02:39.793239 ip-10-0-133-163 
kubenswrapper[2570]: I0420 15:02:39.793226 2570 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892156Ki","pods":"250"} Apr 20 15:02:39.793278 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:39.793257 2570 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892156Ki","pods":"250"} Apr 20 15:02:40.562315 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:40.562276 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5g899w" event={"ID":"714943ea-06f1-4dd7-892a-657beda4992a","Type":"ContainerStarted","Data":"07950bae76a899d18899fbd31de91bee71622e851cc8947602248211788afa2a"} Apr 20 15:02:40.563676 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:40.563649 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8zr2h" event={"ID":"04721783-0eb5-4a58-b550-5e18f6b0d95d","Type":"ContainerStarted","Data":"87dd4b928c50712fc0fa9fe4028bdd688b2caa138c3dc96e079e451dd359bde5"} Apr 20 15:02:40.588769 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:40.588721 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5g899w" podStartSLOduration=1.960328469 podStartE2EDuration="4.588709972s" podCreationTimestamp="2026-04-20 15:02:36 +0000 UTC" firstStartedPulling="2026-04-20 15:02:37.159022233 +0000 UTC m=+440.815070512" lastFinishedPulling="2026-04-20 15:02:39.787403735 +0000 UTC m=+443.443452015" observedRunningTime="2026-04-20 15:02:40.586338567 +0000 UTC m=+444.242386868" watchObservedRunningTime="2026-04-20 15:02:40.588709972 +0000 UTC m=+444.244758322" Apr 20 15:02:41.020578 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:41.020558 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5g899w" Apr 20 15:02:41.021479 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:41.021450 2570 patch_prober.go:28] interesting pod/data-science-gateway-data-science-gateway-class-85d9c6bcd5g899w container/istio-proxy namespace/openshift-ingress: Startup probe status=failure output="Get \"http://10.133.0.26:15021/healthz/ready\": dial tcp 10.133.0.26:15021: connect: connection refused" start-of-body= Apr 20 15:02:41.021599 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:41.021513 2570 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5g899w" podUID="714943ea-06f1-4dd7-892a-657beda4992a" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.133.0.26:15021/healthz/ready\": dial tcp 10.133.0.26:15021: connect: connection refused" Apr 20 15:02:41.278256 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:41.278186 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8zr2h" Apr 20 15:02:41.282394 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:41.282372 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8zr2h" Apr 20 15:02:41.306249 ip-10-0-133-163 kubenswrapper[2570]: 
I0420 15:02:41.306202 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8zr2h" podStartSLOduration=2.9114441490000003 podStartE2EDuration="5.306192272s" podCreationTimestamp="2026-04-20 15:02:36 +0000 UTC" firstStartedPulling="2026-04-20 15:02:37.398178841 +0000 UTC m=+441.054227120" lastFinishedPulling="2026-04-20 15:02:39.792926962 +0000 UTC m=+443.448975243" observedRunningTime="2026-04-20 15:02:40.615611167 +0000 UTC m=+444.271659468" watchObservedRunningTime="2026-04-20 15:02:41.306192272 +0000 UTC m=+444.962240572" Apr 20 15:02:41.567793 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:41.567729 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8zr2h" Apr 20 15:02:41.568623 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:41.568607 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8zr2h" Apr 20 15:02:41.619824 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:41.619800 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5g899w"] Apr 20 15:02:42.020184 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:42.020138 2570 patch_prober.go:28] interesting pod/data-science-gateway-data-science-gateway-class-85d9c6bcd5g899w container/istio-proxy namespace/openshift-ingress: Startup probe status=failure output="Get \"http://10.133.0.26:15021/healthz/ready\": dial tcp 10.133.0.26:15021: connect: connection refused" start-of-body= Apr 20 15:02:42.020348 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:42.020211 2570 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5g899w" podUID="714943ea-06f1-4dd7-892a-657beda4992a" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.133.0.26:15021/healthz/ready\": dial tcp 10.133.0.26:15021: connect: connection refused" Apr 20 15:02:42.756112 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:42.756077 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-pdm42"] Apr 20 15:02:42.758984 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:42.758963 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-pdm42" Apr 20 15:02:42.761847 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:42.761825 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 20 15:02:42.761952 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:42.761925 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 20 15:02:42.762957 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:42.762940 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-catalog-dockercfg-mf7xx\"" Apr 20 15:02:42.767714 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:42.767690 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-pdm42"] Apr 20 15:02:42.878617 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:42.878589 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7rsj\" (UniqueName: \"kubernetes.io/projected/1e93e3d6-14a3-4191-a1e3-70121f7ff12d-kube-api-access-h7rsj\") pod \"kuadrant-operator-catalog-pdm42\" (UID: \"1e93e3d6-14a3-4191-a1e3-70121f7ff12d\") " pod="kuadrant-system/kuadrant-operator-catalog-pdm42" Apr 20 15:02:42.979757 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:42.979718 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h7rsj\" (UniqueName: \"kubernetes.io/projected/1e93e3d6-14a3-4191-a1e3-70121f7ff12d-kube-api-access-h7rsj\") pod \"kuadrant-operator-catalog-pdm42\" (UID: \"1e93e3d6-14a3-4191-a1e3-70121f7ff12d\") " pod="kuadrant-system/kuadrant-operator-catalog-pdm42" Apr 20 15:02:42.987619 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:42.987597 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7rsj\" (UniqueName: \"kubernetes.io/projected/1e93e3d6-14a3-4191-a1e3-70121f7ff12d-kube-api-access-h7rsj\") pod \"kuadrant-operator-catalog-pdm42\" (UID: \"1e93e3d6-14a3-4191-a1e3-70121f7ff12d\") " pod="kuadrant-system/kuadrant-operator-catalog-pdm42" Apr 20 15:02:43.020324 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:43.020281 2570 patch_prober.go:28] interesting pod/data-science-gateway-data-science-gateway-class-85d9c6bcd5g899w container/istio-proxy namespace/openshift-ingress: Startup probe status=failure output="Get \"http://10.133.0.26:15021/healthz/ready\": dial tcp 10.133.0.26:15021: connect: connection refused" start-of-body= Apr 20 15:02:43.020398 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:43.020354 2570 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5g899w" podUID="714943ea-06f1-4dd7-892a-657beda4992a" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.133.0.26:15021/healthz/ready\": dial tcp 10.133.0.26:15021: connect: connection refused" Apr 20 15:02:43.068819 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:43.068799 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-pdm42" Apr 20 15:02:43.122442 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:43.122410 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-pdm42"] Apr 20 15:02:43.187888 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:43.187864 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-pdm42"] Apr 20 15:02:43.189518 ip-10-0-133-163 kubenswrapper[2570]: W0420 15:02:43.189489 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e93e3d6_14a3_4191_a1e3_70121f7ff12d.slice/crio-27863837159cbca28b523ed0e589f65389aefee75b2e31bef768d4853578857e WatchSource:0}: Error finding container 27863837159cbca28b523ed0e589f65389aefee75b2e31bef768d4853578857e: Status 404 returned error can't find the container with id 27863837159cbca28b523ed0e589f65389aefee75b2e31bef768d4853578857e Apr 20 15:02:43.320466 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:43.320404 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-47b2b"] Apr 20 15:02:43.323451 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:43.323433 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-47b2b" Apr 20 15:02:43.330442 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:43.330253 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-47b2b"] Apr 20 15:02:43.484085 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:43.484061 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5pxd\" (UniqueName: \"kubernetes.io/projected/ccdfb83f-9afc-4949-8be1-324eb63e1b9a-kube-api-access-m5pxd\") pod \"kuadrant-operator-catalog-47b2b\" (UID: \"ccdfb83f-9afc-4949-8be1-324eb63e1b9a\") " pod="kuadrant-system/kuadrant-operator-catalog-47b2b" Apr 20 15:02:43.578737 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:43.578674 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-pdm42" event={"ID":"1e93e3d6-14a3-4191-a1e3-70121f7ff12d","Type":"ContainerStarted","Data":"27863837159cbca28b523ed0e589f65389aefee75b2e31bef768d4853578857e"} Apr 20 15:02:43.578978 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:43.578954 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5g899w" podUID="714943ea-06f1-4dd7-892a-657beda4992a" containerName="istio-proxy" containerID="cri-o://07950bae76a899d18899fbd31de91bee71622e851cc8947602248211788afa2a" gracePeriod=30 Apr 20 15:02:43.584522 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:43.584501 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m5pxd\" (UniqueName: \"kubernetes.io/projected/ccdfb83f-9afc-4949-8be1-324eb63e1b9a-kube-api-access-m5pxd\") pod \"kuadrant-operator-catalog-47b2b\" (UID: \"ccdfb83f-9afc-4949-8be1-324eb63e1b9a\") " pod="kuadrant-system/kuadrant-operator-catalog-47b2b" Apr 20 15:02:43.592386 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:43.592366 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5pxd\" (UniqueName: \"kubernetes.io/projected/ccdfb83f-9afc-4949-8be1-324eb63e1b9a-kube-api-access-m5pxd\") pod 
\"kuadrant-operator-catalog-47b2b\" (UID: \"ccdfb83f-9afc-4949-8be1-324eb63e1b9a\") " pod="kuadrant-system/kuadrant-operator-catalog-47b2b" Apr 20 15:02:43.633381 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:43.633353 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-47b2b" Apr 20 15:02:43.748999 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:43.748976 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-47b2b"] Apr 20 15:02:43.750879 ip-10-0-133-163 kubenswrapper[2570]: W0420 15:02:43.750853 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podccdfb83f_9afc_4949_8be1_324eb63e1b9a.slice/crio-6ec9345c7b0f22be2618223ed37e58c6c77fc3baad318a4013e54a07c77bb9c1 WatchSource:0}: Error finding container 6ec9345c7b0f22be2618223ed37e58c6c77fc3baad318a4013e54a07c77bb9c1: Status 404 returned error can't find the container with id 6ec9345c7b0f22be2618223ed37e58c6c77fc3baad318a4013e54a07c77bb9c1 Apr 20 15:02:44.592397 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:44.592333 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-47b2b" event={"ID":"ccdfb83f-9afc-4949-8be1-324eb63e1b9a","Type":"ContainerStarted","Data":"6ec9345c7b0f22be2618223ed37e58c6c77fc3baad318a4013e54a07c77bb9c1"} Apr 20 15:02:45.597742 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:45.597678 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-pdm42" event={"ID":"1e93e3d6-14a3-4191-a1e3-70121f7ff12d","Type":"ContainerStarted","Data":"2025c06725703cc29e2e82d2613805f23a9e9faf5a89bd9d1dba2bc066e49262"} Apr 20 15:02:45.597742 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:45.597733 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-catalog-pdm42" podUID="1e93e3d6-14a3-4191-a1e3-70121f7ff12d" containerName="registry-server" containerID="cri-o://2025c06725703cc29e2e82d2613805f23a9e9faf5a89bd9d1dba2bc066e49262" gracePeriod=2 Apr 20 15:02:45.599125 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:45.599095 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-47b2b" event={"ID":"ccdfb83f-9afc-4949-8be1-324eb63e1b9a","Type":"ContainerStarted","Data":"ac58444c35b6eaae30328a1c0cb94db99990993a0f625dd97fe7eceb1024120d"} Apr 20 15:02:45.613321 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:45.613251 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-pdm42" podStartSLOduration=1.359240991 podStartE2EDuration="3.613235406s" podCreationTimestamp="2026-04-20 15:02:42 +0000 UTC" firstStartedPulling="2026-04-20 15:02:43.19095259 +0000 UTC m=+446.847000869" lastFinishedPulling="2026-04-20 15:02:45.444946993 +0000 UTC m=+449.100995284" observedRunningTime="2026-04-20 15:02:45.612345185 +0000 UTC m=+449.268393486" watchObservedRunningTime="2026-04-20 15:02:45.613235406 +0000 UTC m=+449.269283706" Apr 20 15:02:45.627713 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:45.627669 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-47b2b" podStartSLOduration=0.932343249 podStartE2EDuration="2.627657682s" podCreationTimestamp="2026-04-20 15:02:43 +0000 UTC" firstStartedPulling="2026-04-20 15:02:43.752355759 
+0000 UTC m=+447.408404038" lastFinishedPulling="2026-04-20 15:02:45.447670189 +0000 UTC m=+449.103718471" observedRunningTime="2026-04-20 15:02:45.625819271 +0000 UTC m=+449.281867573" watchObservedRunningTime="2026-04-20 15:02:45.627657682 +0000 UTC m=+449.283705982" Apr 20 15:02:45.831673 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:45.831653 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-pdm42" Apr 20 15:02:45.905639 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:45.905618 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h7rsj\" (UniqueName: \"kubernetes.io/projected/1e93e3d6-14a3-4191-a1e3-70121f7ff12d-kube-api-access-h7rsj\") pod \"1e93e3d6-14a3-4191-a1e3-70121f7ff12d\" (UID: \"1e93e3d6-14a3-4191-a1e3-70121f7ff12d\") " Apr 20 15:02:45.907718 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:45.907696 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e93e3d6-14a3-4191-a1e3-70121f7ff12d-kube-api-access-h7rsj" (OuterVolumeSpecName: "kube-api-access-h7rsj") pod "1e93e3d6-14a3-4191-a1e3-70121f7ff12d" (UID: "1e93e3d6-14a3-4191-a1e3-70121f7ff12d"). InnerVolumeSpecName "kube-api-access-h7rsj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 15:02:46.006598 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:46.006525 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-h7rsj\" (UniqueName: \"kubernetes.io/projected/1e93e3d6-14a3-4191-a1e3-70121f7ff12d-kube-api-access-h7rsj\") on node \"ip-10-0-133-163.ec2.internal\" DevicePath \"\"" Apr 20 15:02:46.603237 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:46.603209 2570 generic.go:358] "Generic (PLEG): container finished" podID="1e93e3d6-14a3-4191-a1e3-70121f7ff12d" containerID="2025c06725703cc29e2e82d2613805f23a9e9faf5a89bd9d1dba2bc066e49262" exitCode=0 Apr 20 15:02:46.603597 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:46.603262 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-pdm42" Apr 20 15:02:46.603597 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:46.603291 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-pdm42" event={"ID":"1e93e3d6-14a3-4191-a1e3-70121f7ff12d","Type":"ContainerDied","Data":"2025c06725703cc29e2e82d2613805f23a9e9faf5a89bd9d1dba2bc066e49262"} Apr 20 15:02:46.603597 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:46.603338 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-pdm42" event={"ID":"1e93e3d6-14a3-4191-a1e3-70121f7ff12d","Type":"ContainerDied","Data":"27863837159cbca28b523ed0e589f65389aefee75b2e31bef768d4853578857e"} Apr 20 15:02:46.603597 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:46.603356 2570 scope.go:117] "RemoveContainer" containerID="2025c06725703cc29e2e82d2613805f23a9e9faf5a89bd9d1dba2bc066e49262" Apr 20 15:02:46.612418 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:46.612403 2570 scope.go:117] "RemoveContainer" containerID="2025c06725703cc29e2e82d2613805f23a9e9faf5a89bd9d1dba2bc066e49262" Apr 20 15:02:46.612644 ip-10-0-133-163 kubenswrapper[2570]: E0420 15:02:46.612628 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2025c06725703cc29e2e82d2613805f23a9e9faf5a89bd9d1dba2bc066e49262\": container with ID starting with 2025c06725703cc29e2e82d2613805f23a9e9faf5a89bd9d1dba2bc066e49262 not found: ID does not exist" containerID="2025c06725703cc29e2e82d2613805f23a9e9faf5a89bd9d1dba2bc066e49262" Apr 20 15:02:46.612695 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:46.612651 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2025c06725703cc29e2e82d2613805f23a9e9faf5a89bd9d1dba2bc066e49262"} err="failed to get container status \"2025c06725703cc29e2e82d2613805f23a9e9faf5a89bd9d1dba2bc066e49262\": rpc error: code = NotFound desc = could not find container \"2025c06725703cc29e2e82d2613805f23a9e9faf5a89bd9d1dba2bc066e49262\": container with ID starting with 2025c06725703cc29e2e82d2613805f23a9e9faf5a89bd9d1dba2bc066e49262 not found: ID does not exist" Apr 20 15:02:46.626642 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:46.626617 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-pdm42"] Apr 20 15:02:46.628339 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:46.628320 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-pdm42"] Apr 20 15:02:46.983716 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:46.983689 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e93e3d6-14a3-4191-a1e3-70121f7ff12d" path="/var/lib/kubelet/pods/1e93e3d6-14a3-4191-a1e3-70121f7ff12d/volumes" Apr 20 15:02:48.830239 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:48.830218 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5g899w" Apr 20 15:02:48.928967 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:48.928908 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/714943ea-06f1-4dd7-892a-657beda4992a-workload-socket\") pod \"714943ea-06f1-4dd7-892a-657beda4992a\" (UID: \"714943ea-06f1-4dd7-892a-657beda4992a\") " Apr 20 15:02:48.928967 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:48.928937 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/714943ea-06f1-4dd7-892a-657beda4992a-istio-envoy\") pod \"714943ea-06f1-4dd7-892a-657beda4992a\" (UID: \"714943ea-06f1-4dd7-892a-657beda4992a\") " Apr 20 15:02:48.928967 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:48.928958 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/714943ea-06f1-4dd7-892a-657beda4992a-credential-socket\") pod \"714943ea-06f1-4dd7-892a-657beda4992a\" (UID: \"714943ea-06f1-4dd7-892a-657beda4992a\") " Apr 20 15:02:48.929259 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:48.929005 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/714943ea-06f1-4dd7-892a-657beda4992a-istio-data\") pod \"714943ea-06f1-4dd7-892a-657beda4992a\" (UID: \"714943ea-06f1-4dd7-892a-657beda4992a\") " Apr 20 15:02:48.929259 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:48.929034 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxvc9\" (UniqueName: \"kubernetes.io/projected/714943ea-06f1-4dd7-892a-657beda4992a-kube-api-access-sxvc9\") pod \"714943ea-06f1-4dd7-892a-657beda4992a\" (UID: \"714943ea-06f1-4dd7-892a-657beda4992a\") " Apr 20 15:02:48.929259 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:48.929065 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/714943ea-06f1-4dd7-892a-657beda4992a-workload-certs\") pod \"714943ea-06f1-4dd7-892a-657beda4992a\" (UID: \"714943ea-06f1-4dd7-892a-657beda4992a\") " Apr 20 15:02:48.929259 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:48.929109 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/714943ea-06f1-4dd7-892a-657beda4992a-istio-podinfo\") pod \"714943ea-06f1-4dd7-892a-657beda4992a\" (UID: \"714943ea-06f1-4dd7-892a-657beda4992a\") " Apr 20 15:02:48.929259 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:48.929152 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/714943ea-06f1-4dd7-892a-657beda4992a-workload-socket" (OuterVolumeSpecName: "workload-socket") pod "714943ea-06f1-4dd7-892a-657beda4992a" (UID: "714943ea-06f1-4dd7-892a-657beda4992a"). InnerVolumeSpecName "workload-socket". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 15:02:48.929259 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:48.929201 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/714943ea-06f1-4dd7-892a-657beda4992a-istio-token\") pod \"714943ea-06f1-4dd7-892a-657beda4992a\" (UID: \"714943ea-06f1-4dd7-892a-657beda4992a\") " Apr 20 15:02:48.929259 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:48.929225 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/714943ea-06f1-4dd7-892a-657beda4992a-istiod-ca-cert\") pod \"714943ea-06f1-4dd7-892a-657beda4992a\" (UID: \"714943ea-06f1-4dd7-892a-657beda4992a\") " Apr 20 15:02:48.929799 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:48.929370 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/714943ea-06f1-4dd7-892a-657beda4992a-istio-data" (OuterVolumeSpecName: "istio-data") pod "714943ea-06f1-4dd7-892a-657beda4992a" (UID: "714943ea-06f1-4dd7-892a-657beda4992a"). InnerVolumeSpecName "istio-data". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 15:02:48.929799 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:48.929382 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/714943ea-06f1-4dd7-892a-657beda4992a-credential-socket" (OuterVolumeSpecName: "credential-socket") pod "714943ea-06f1-4dd7-892a-657beda4992a" (UID: "714943ea-06f1-4dd7-892a-657beda4992a"). InnerVolumeSpecName "credential-socket". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 15:02:48.929799 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:48.929575 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/714943ea-06f1-4dd7-892a-657beda4992a-workload-certs" (OuterVolumeSpecName: "workload-certs") pod "714943ea-06f1-4dd7-892a-657beda4992a" (UID: "714943ea-06f1-4dd7-892a-657beda4992a"). InnerVolumeSpecName "workload-certs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 15:02:48.929799 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:48.929633 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/714943ea-06f1-4dd7-892a-657beda4992a-istiod-ca-cert" (OuterVolumeSpecName: "istiod-ca-cert") pod "714943ea-06f1-4dd7-892a-657beda4992a" (UID: "714943ea-06f1-4dd7-892a-657beda4992a"). InnerVolumeSpecName "istiod-ca-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 15:02:48.929799 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:48.929701 2570 reconciler_common.go:299] "Volume detached for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/714943ea-06f1-4dd7-892a-657beda4992a-workload-socket\") on node \"ip-10-0-133-163.ec2.internal\" DevicePath \"\"" Apr 20 15:02:48.929799 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:48.929719 2570 reconciler_common.go:299] "Volume detached for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/714943ea-06f1-4dd7-892a-657beda4992a-credential-socket\") on node \"ip-10-0-133-163.ec2.internal\" DevicePath \"\"" Apr 20 15:02:48.929799 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:48.929733 2570 reconciler_common.go:299] "Volume detached for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/714943ea-06f1-4dd7-892a-657beda4992a-istio-data\") on node \"ip-10-0-133-163.ec2.internal\" DevicePath \"\"" Apr 20 15:02:48.929799 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:48.929746 2570 reconciler_common.go:299] "Volume detached for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/714943ea-06f1-4dd7-892a-657beda4992a-workload-certs\") on node \"ip-10-0-133-163.ec2.internal\" DevicePath \"\"" Apr 20 15:02:48.931387 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:48.931355 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/714943ea-06f1-4dd7-892a-657beda4992a-kube-api-access-sxvc9" (OuterVolumeSpecName: "kube-api-access-sxvc9") pod "714943ea-06f1-4dd7-892a-657beda4992a" (UID: "714943ea-06f1-4dd7-892a-657beda4992a"). InnerVolumeSpecName "kube-api-access-sxvc9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 15:02:48.931387 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:48.931360 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/714943ea-06f1-4dd7-892a-657beda4992a-istio-envoy" (OuterVolumeSpecName: "istio-envoy") pod "714943ea-06f1-4dd7-892a-657beda4992a" (UID: "714943ea-06f1-4dd7-892a-657beda4992a"). InnerVolumeSpecName "istio-envoy". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 15:02:48.931526 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:48.931388 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/714943ea-06f1-4dd7-892a-657beda4992a-istio-podinfo" (OuterVolumeSpecName: "istio-podinfo") pod "714943ea-06f1-4dd7-892a-657beda4992a" (UID: "714943ea-06f1-4dd7-892a-657beda4992a"). InnerVolumeSpecName "istio-podinfo". PluginName "kubernetes.io/downward-api", VolumeGIDValue "" Apr 20 15:02:48.931714 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:48.931691 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/714943ea-06f1-4dd7-892a-657beda4992a-istio-token" (OuterVolumeSpecName: "istio-token") pod "714943ea-06f1-4dd7-892a-657beda4992a" (UID: "714943ea-06f1-4dd7-892a-657beda4992a"). InnerVolumeSpecName "istio-token". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 15:02:49.030336 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:49.030291 2570 reconciler_common.go:299] "Volume detached for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/714943ea-06f1-4dd7-892a-657beda4992a-istio-envoy\") on node \"ip-10-0-133-163.ec2.internal\" DevicePath \"\"" Apr 20 15:02:49.030464 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:49.030346 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sxvc9\" (UniqueName: \"kubernetes.io/projected/714943ea-06f1-4dd7-892a-657beda4992a-kube-api-access-sxvc9\") on node \"ip-10-0-133-163.ec2.internal\" DevicePath \"\"" Apr 20 15:02:49.030464 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:49.030356 2570 reconciler_common.go:299] "Volume detached for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/714943ea-06f1-4dd7-892a-657beda4992a-istio-podinfo\") on node \"ip-10-0-133-163.ec2.internal\" DevicePath \"\"" Apr 20 15:02:49.030464 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:49.030366 2570 reconciler_common.go:299] "Volume detached for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/714943ea-06f1-4dd7-892a-657beda4992a-istio-token\") on node \"ip-10-0-133-163.ec2.internal\" DevicePath \"\"" Apr 20 15:02:49.030464 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:49.030374 2570 reconciler_common.go:299] "Volume detached for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/714943ea-06f1-4dd7-892a-657beda4992a-istiod-ca-cert\") on node \"ip-10-0-133-163.ec2.internal\" DevicePath \"\"" Apr 20 15:02:49.615999 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:49.615967 2570 generic.go:358] "Generic (PLEG): container finished" podID="714943ea-06f1-4dd7-892a-657beda4992a" containerID="07950bae76a899d18899fbd31de91bee71622e851cc8947602248211788afa2a" exitCode=0 Apr 20 15:02:49.616179 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:49.616046 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5g899w" Apr 20 15:02:49.616179 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:49.616050 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5g899w" event={"ID":"714943ea-06f1-4dd7-892a-657beda4992a","Type":"ContainerDied","Data":"07950bae76a899d18899fbd31de91bee71622e851cc8947602248211788afa2a"} Apr 20 15:02:49.616179 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:49.616087 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5g899w" event={"ID":"714943ea-06f1-4dd7-892a-657beda4992a","Type":"ContainerDied","Data":"a3a7d4deeb43048883d010b85edef30cdc3713b5b29ad6c97808404d46f68c00"} Apr 20 15:02:49.616179 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:49.616102 2570 scope.go:117] "RemoveContainer" containerID="07950bae76a899d18899fbd31de91bee71622e851cc8947602248211788afa2a" Apr 20 15:02:49.624255 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:49.624219 2570 scope.go:117] "RemoveContainer" containerID="07950bae76a899d18899fbd31de91bee71622e851cc8947602248211788afa2a" Apr 20 15:02:49.624529 ip-10-0-133-163 kubenswrapper[2570]: E0420 15:02:49.624510 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07950bae76a899d18899fbd31de91bee71622e851cc8947602248211788afa2a\": container with ID starting with 07950bae76a899d18899fbd31de91bee71622e851cc8947602248211788afa2a not found: ID does not exist" containerID="07950bae76a899d18899fbd31de91bee71622e851cc8947602248211788afa2a" Apr 20 15:02:49.624603 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:49.624535 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07950bae76a899d18899fbd31de91bee71622e851cc8947602248211788afa2a"} err="failed to get container status \"07950bae76a899d18899fbd31de91bee71622e851cc8947602248211788afa2a\": rpc error: code = NotFound desc = could not find container \"07950bae76a899d18899fbd31de91bee71622e851cc8947602248211788afa2a\": container with ID starting with 07950bae76a899d18899fbd31de91bee71622e851cc8947602248211788afa2a not found: ID does not exist" Apr 20 15:02:49.633509 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:49.633480 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5g899w"] Apr 20 15:02:49.637213 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:49.637196 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd5g899w"] Apr 20 15:02:50.984316 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:50.984279 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="714943ea-06f1-4dd7-892a-657beda4992a" path="/var/lib/kubelet/pods/714943ea-06f1-4dd7-892a-657beda4992a/volumes" Apr 20 15:02:53.633737 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:53.633707 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kuadrant-system/kuadrant-operator-catalog-47b2b" Apr 20 15:02:53.634129 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:53.633788 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-catalog-47b2b" Apr 20 15:02:53.655236 ip-10-0-133-163 
kubenswrapper[2570]: I0420 15:02:53.655213 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kuadrant-system/kuadrant-operator-catalog-47b2b" Apr 20 15:02:54.654744 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:02:54.654718 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-catalog-47b2b" Apr 20 15:03:14.853970 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:14.853937 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-v86vx"] Apr 20 15:03:14.854349 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:14.854326 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="714943ea-06f1-4dd7-892a-657beda4992a" containerName="istio-proxy" Apr 20 15:03:14.854349 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:14.854337 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="714943ea-06f1-4dd7-892a-657beda4992a" containerName="istio-proxy" Apr 20 15:03:14.854349 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:14.854349 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1e93e3d6-14a3-4191-a1e3-70121f7ff12d" containerName="registry-server" Apr 20 15:03:14.854463 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:14.854355 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e93e3d6-14a3-4191-a1e3-70121f7ff12d" containerName="registry-server" Apr 20 15:03:14.854463 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:14.854433 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="1e93e3d6-14a3-4191-a1e3-70121f7ff12d" containerName="registry-server" Apr 20 15:03:14.854463 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:14.854447 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="714943ea-06f1-4dd7-892a-657beda4992a" containerName="istio-proxy" Apr 20 15:03:14.862886 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:14.862867 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-v86vx" Apr 20 15:03:14.867387 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:14.867366 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-mdn6b\"" Apr 20 15:03:14.870742 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:14.870717 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-v86vx"] Apr 20 15:03:14.947265 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:14.947227 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7vxd\" (UniqueName: \"kubernetes.io/projected/d3f24fda-bb0c-4879-9a62-fa11c7b08a11-kube-api-access-v7vxd\") pod \"limitador-operator-controller-manager-85c4996f8c-v86vx\" (UID: \"d3f24fda-bb0c-4879-9a62-fa11c7b08a11\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-v86vx" Apr 20 15:03:15.048009 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:15.047968 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v7vxd\" (UniqueName: \"kubernetes.io/projected/d3f24fda-bb0c-4879-9a62-fa11c7b08a11-kube-api-access-v7vxd\") pod \"limitador-operator-controller-manager-85c4996f8c-v86vx\" (UID: \"d3f24fda-bb0c-4879-9a62-fa11c7b08a11\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-v86vx" Apr 20 15:03:15.058592 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:15.058568 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7vxd\" (UniqueName: \"kubernetes.io/projected/d3f24fda-bb0c-4879-9a62-fa11c7b08a11-kube-api-access-v7vxd\") pod \"limitador-operator-controller-manager-85c4996f8c-v86vx\" (UID: \"d3f24fda-bb0c-4879-9a62-fa11c7b08a11\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-v86vx" Apr 20 15:03:15.174100 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:15.174008 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-v86vx" Apr 20 15:03:15.299273 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:15.299244 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-v86vx"] Apr 20 15:03:15.302296 ip-10-0-133-163 kubenswrapper[2570]: W0420 15:03:15.302268 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3f24fda_bb0c_4879_9a62_fa11c7b08a11.slice/crio-4084e7e1389e1de7963f6ece0cac78b976bcbf67a3e975f7501a6d154afaa397 WatchSource:0}: Error finding container 4084e7e1389e1de7963f6ece0cac78b976bcbf67a3e975f7501a6d154afaa397: Status 404 returned error can't find the container with id 4084e7e1389e1de7963f6ece0cac78b976bcbf67a3e975f7501a6d154afaa397 Apr 20 15:03:15.710258 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:15.710218 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-v86vx" event={"ID":"d3f24fda-bb0c-4879-9a62-fa11c7b08a11","Type":"ContainerStarted","Data":"4084e7e1389e1de7963f6ece0cac78b976bcbf67a3e975f7501a6d154afaa397"} Apr 20 15:03:17.721584 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:17.721543 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-v86vx" event={"ID":"d3f24fda-bb0c-4879-9a62-fa11c7b08a11","Type":"ContainerStarted","Data":"4cfcf526af80e1a61ba8a6d6defb2b2c1c5bce102b73ecb513cafdbd40c85916"} Apr 20 15:03:17.721943 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:17.721677 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-v86vx" Apr 20 15:03:17.740453 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:17.740410 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-v86vx" podStartSLOduration=2.079157363 podStartE2EDuration="3.740397031s" podCreationTimestamp="2026-04-20 15:03:14 +0000 UTC" firstStartedPulling="2026-04-20 15:03:15.304545081 +0000 UTC m=+478.960593361" lastFinishedPulling="2026-04-20 15:03:16.965784746 +0000 UTC m=+480.621833029" observedRunningTime="2026-04-20 15:03:17.738428714 +0000 UTC m=+481.394477016" watchObservedRunningTime="2026-04-20 15:03:17.740397031 +0000 UTC m=+481.396445332" Apr 20 15:03:18.279560 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:18.279526 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-b5xb7"] Apr 20 15:03:18.283151 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:18.283130 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-b5xb7" Apr 20 15:03:18.286403 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:18.286382 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-7p2lx\"" Apr 20 15:03:18.286484 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:18.286382 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\"" Apr 20 15:03:18.294150 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:18.294129 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-b5xb7"] Apr 20 15:03:18.378496 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:18.378473 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktzlx\" (UniqueName: \"kubernetes.io/projected/ab365834-adc7-4d59-9a5c-c2c739687f62-kube-api-access-ktzlx\") pod \"dns-operator-controller-manager-648d5c98bc-b5xb7\" (UID: \"ab365834-adc7-4d59-9a5c-c2c739687f62\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-b5xb7" Apr 20 15:03:18.479796 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:18.479768 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ktzlx\" (UniqueName: \"kubernetes.io/projected/ab365834-adc7-4d59-9a5c-c2c739687f62-kube-api-access-ktzlx\") pod \"dns-operator-controller-manager-648d5c98bc-b5xb7\" (UID: \"ab365834-adc7-4d59-9a5c-c2c739687f62\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-b5xb7" Apr 20 15:03:18.489621 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:18.489602 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktzlx\" (UniqueName: \"kubernetes.io/projected/ab365834-adc7-4d59-9a5c-c2c739687f62-kube-api-access-ktzlx\") pod \"dns-operator-controller-manager-648d5c98bc-b5xb7\" (UID: \"ab365834-adc7-4d59-9a5c-c2c739687f62\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-b5xb7" Apr 20 15:03:18.594472 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:18.594404 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-b5xb7" Apr 20 15:03:18.720817 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:18.720786 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-b5xb7"] Apr 20 15:03:18.722330 ip-10-0-133-163 kubenswrapper[2570]: W0420 15:03:18.722281 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab365834_adc7_4d59_9a5c_c2c739687f62.slice/crio-8c64907eefbd6f15a4c1ea7beb03de10bce8a500c0ddb9650e38362fc6631d3f WatchSource:0}: Error finding container 8c64907eefbd6f15a4c1ea7beb03de10bce8a500c0ddb9650e38362fc6631d3f: Status 404 returned error can't find the container with id 8c64907eefbd6f15a4c1ea7beb03de10bce8a500c0ddb9650e38362fc6631d3f Apr 20 15:03:19.729703 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:19.729669 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-b5xb7" event={"ID":"ab365834-adc7-4d59-9a5c-c2c739687f62","Type":"ContainerStarted","Data":"8c64907eefbd6f15a4c1ea7beb03de10bce8a500c0ddb9650e38362fc6631d3f"} Apr 20 15:03:21.737845 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:21.737806 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-b5xb7" event={"ID":"ab365834-adc7-4d59-9a5c-c2c739687f62","Type":"ContainerStarted","Data":"ca84bf481a24b974d15ff53564cf9bdb4b249a4913e9b5f8caed8770b3e83c1f"} Apr 20 15:03:21.738244 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:21.737934 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-b5xb7" Apr 20 15:03:21.760159 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:21.760112 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-b5xb7" podStartSLOduration=1.295529717 podStartE2EDuration="3.760101322s" podCreationTimestamp="2026-04-20 15:03:18 +0000 UTC" firstStartedPulling="2026-04-20 15:03:18.724375249 +0000 UTC m=+482.380423532" lastFinishedPulling="2026-04-20 15:03:21.188946854 +0000 UTC m=+484.844995137" observedRunningTime="2026-04-20 15:03:21.758627745 +0000 UTC m=+485.414676072" watchObservedRunningTime="2026-04-20 15:03:21.760101322 +0000 UTC m=+485.416149623" Apr 20 15:03:23.106971 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:23.106937 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-mrqcb"] Apr 20 15:03:23.110445 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:23.110428 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-mrqcb" Apr 20 15:03:23.114085 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:23.114064 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-n2fng\"" Apr 20 15:03:23.129068 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:23.129046 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-mrqcb"] Apr 20 15:03:23.223261 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:23.223235 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/f3b144d7-7054-4752-a61e-f81236dc65d4-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-mrqcb\" (UID: \"f3b144d7-7054-4752-a61e-f81236dc65d4\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-mrqcb" Apr 20 15:03:23.223399 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:23.223284 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tg82w\" (UniqueName: \"kubernetes.io/projected/f3b144d7-7054-4752-a61e-f81236dc65d4-kube-api-access-tg82w\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-mrqcb\" (UID: \"f3b144d7-7054-4752-a61e-f81236dc65d4\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-mrqcb" Apr 20 15:03:23.323708 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:23.323684 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/f3b144d7-7054-4752-a61e-f81236dc65d4-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-mrqcb\" (UID: \"f3b144d7-7054-4752-a61e-f81236dc65d4\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-mrqcb" Apr 20 15:03:23.323824 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:23.323734 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tg82w\" (UniqueName: \"kubernetes.io/projected/f3b144d7-7054-4752-a61e-f81236dc65d4-kube-api-access-tg82w\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-mrqcb\" (UID: \"f3b144d7-7054-4752-a61e-f81236dc65d4\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-mrqcb" Apr 20 15:03:23.324028 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:23.324010 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/f3b144d7-7054-4752-a61e-f81236dc65d4-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-mrqcb\" (UID: \"f3b144d7-7054-4752-a61e-f81236dc65d4\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-mrqcb" Apr 20 15:03:23.368689 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:23.368636 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tg82w\" (UniqueName: \"kubernetes.io/projected/f3b144d7-7054-4752-a61e-f81236dc65d4-kube-api-access-tg82w\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-mrqcb\" (UID: \"f3b144d7-7054-4752-a61e-f81236dc65d4\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-mrqcb" Apr 20 15:03:23.419988 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:23.419964 2570 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-mrqcb" Apr 20 15:03:23.547244 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:23.547222 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-mrqcb"] Apr 20 15:03:23.549159 ip-10-0-133-163 kubenswrapper[2570]: W0420 15:03:23.549124 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3b144d7_7054_4752_a61e_f81236dc65d4.slice/crio-1e92379dc6ad237db0f99e951c3f665d8a3339d12040062ebdd6fc1f3303c74f WatchSource:0}: Error finding container 1e92379dc6ad237db0f99e951c3f665d8a3339d12040062ebdd6fc1f3303c74f: Status 404 returned error can't find the container with id 1e92379dc6ad237db0f99e951c3f665d8a3339d12040062ebdd6fc1f3303c74f Apr 20 15:03:23.746486 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:23.746407 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-mrqcb" event={"ID":"f3b144d7-7054-4752-a61e-f81236dc65d4","Type":"ContainerStarted","Data":"1e92379dc6ad237db0f99e951c3f665d8a3339d12040062ebdd6fc1f3303c74f"} Apr 20 15:03:28.728078 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:28.728054 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-v86vx" Apr 20 15:03:28.774012 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:28.773981 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-mrqcb" event={"ID":"f3b144d7-7054-4752-a61e-f81236dc65d4","Type":"ContainerStarted","Data":"18e34c1415315c7cff41f0b85300d7b5b5d27e270f2ec167f09af8737c89ee41"} Apr 20 15:03:28.774131 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:28.774085 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-mrqcb" Apr 20 15:03:28.798569 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:28.798506 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-mrqcb" podStartSLOduration=0.659464259 podStartE2EDuration="5.798493542s" podCreationTimestamp="2026-04-20 15:03:23 +0000 UTC" firstStartedPulling="2026-04-20 15:03:23.551588588 +0000 UTC m=+487.207636868" lastFinishedPulling="2026-04-20 15:03:28.690617862 +0000 UTC m=+492.346666151" observedRunningTime="2026-04-20 15:03:28.796743643 +0000 UTC m=+492.452791985" watchObservedRunningTime="2026-04-20 15:03:28.798493542 +0000 UTC m=+492.454541873" Apr 20 15:03:32.744335 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:32.744284 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-b5xb7" Apr 20 15:03:39.780537 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:39.780506 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-mrqcb" Apr 20 15:03:40.621144 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:40.621097 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-mrqcb"] Apr 20 15:03:40.621421 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:40.621371 2570 
kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-mrqcb" podUID="f3b144d7-7054-4752-a61e-f81236dc65d4" containerName="manager" containerID="cri-o://18e34c1415315c7cff41f0b85300d7b5b5d27e270f2ec167f09af8737c89ee41" gracePeriod=2 Apr 20 15:03:40.633017 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:40.632987 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-mrqcb"] Apr 20 15:03:40.652116 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:40.652087 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-v86vx"] Apr 20 15:03:40.652448 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:40.652419 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-v86vx" podUID="d3f24fda-bb0c-4879-9a62-fa11c7b08a11" containerName="manager" containerID="cri-o://4cfcf526af80e1a61ba8a6d6defb2b2c1c5bce102b73ecb513cafdbd40c85916" gracePeriod=2 Apr 20 15:03:40.654026 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:40.654000 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-whkm5"] Apr 20 15:03:40.654500 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:40.654483 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f3b144d7-7054-4752-a61e-f81236dc65d4" containerName="manager" Apr 20 15:03:40.654614 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:40.654502 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3b144d7-7054-4752-a61e-f81236dc65d4" containerName="manager" Apr 20 15:03:40.654674 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:40.654621 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="f3b144d7-7054-4752-a61e-f81236dc65d4" containerName="manager" Apr 20 15:03:40.654728 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:40.654675 2570 status_manager.go:895] "Failed to get status for pod" podUID="f3b144d7-7054-4752-a61e-f81236dc65d4" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-mrqcb" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-mrqcb\" is forbidden: User \"system:node:ip-10-0-133-163.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-133-163.ec2.internal' and this object" Apr 20 15:03:40.657818 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:40.657799 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-whkm5" Apr 20 15:03:40.663167 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:40.663143 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-v86vx"] Apr 20 15:03:40.667962 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:40.667933 2570 status_manager.go:895] "Failed to get status for pod" podUID="d3f24fda-bb0c-4879-9a62-fa11c7b08a11" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-v86vx" err="pods \"limitador-operator-controller-manager-85c4996f8c-v86vx\" is forbidden: User \"system:node:ip-10-0-133-163.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-133-163.ec2.internal' and this object" Apr 20 15:03:40.670682 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:40.670486 2570 status_manager.go:895] "Failed to get status for pod" podUID="f3b144d7-7054-4752-a61e-f81236dc65d4" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-mrqcb" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-mrqcb\" is forbidden: User \"system:node:ip-10-0-133-163.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-133-163.ec2.internal' and this object" Apr 20 15:03:40.672748 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:40.672524 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-whkm5"] Apr 20 15:03:40.686582 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:40.686563 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-bvn5v"] Apr 20 15:03:40.687041 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:40.687026 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d3f24fda-bb0c-4879-9a62-fa11c7b08a11" containerName="manager" Apr 20 15:03:40.687087 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:40.687044 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3f24fda-bb0c-4879-9a62-fa11c7b08a11" containerName="manager" Apr 20 15:03:40.687140 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:40.687130 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="d3f24fda-bb0c-4879-9a62-fa11c7b08a11" containerName="manager" Apr 20 15:03:40.690032 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:40.690017 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-bvn5v" Apr 20 15:03:40.704821 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:40.704767 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-bvn5v"] Apr 20 15:03:40.707055 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:40.707031 2570 status_manager.go:895] "Failed to get status for pod" podUID="f3b144d7-7054-4752-a61e-f81236dc65d4" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-mrqcb" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-mrqcb\" is forbidden: User \"system:node:ip-10-0-133-163.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-133-163.ec2.internal' and this object" Apr 20 15:03:40.709072 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:40.709049 2570 status_manager.go:895] "Failed to get status for pod" podUID="d3f24fda-bb0c-4879-9a62-fa11c7b08a11" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-v86vx" err="pods \"limitador-operator-controller-manager-85c4996f8c-v86vx\" is forbidden: User \"system:node:ip-10-0-133-163.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-133-163.ec2.internal' and this object" Apr 20 15:03:40.739001 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:40.738979 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-hhl49"] Apr 20 15:03:40.742453 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:40.742436 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-hhl49" Apr 20 15:03:40.760316 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:40.760275 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-hhl49"] Apr 20 15:03:40.773744 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:40.773718 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtqfs\" (UniqueName: \"kubernetes.io/projected/fa7a90a9-4a60-41f1-9bf6-51d11c2214b0-kube-api-access-jtqfs\") pod \"limitador-operator-controller-manager-85c4996f8c-bvn5v\" (UID: \"fa7a90a9-4a60-41f1-9bf6-51d11c2214b0\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-bvn5v" Apr 20 15:03:40.773851 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:40.773796 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/f8b31739-e044-4a63-ac9d-dc4693db537c-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-whkm5\" (UID: \"f8b31739-e044-4a63-ac9d-dc4693db537c\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-whkm5" Apr 20 15:03:40.773918 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:40.773889 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glrr2\" (UniqueName: \"kubernetes.io/projected/f8b31739-e044-4a63-ac9d-dc4693db537c-kube-api-access-glrr2\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-whkm5\" (UID: \"f8b31739-e044-4a63-ac9d-dc4693db537c\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-whkm5" Apr 20 15:03:40.797285 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:40.796953 2570 status_manager.go:895] "Failed to get status for pod" podUID="f3b144d7-7054-4752-a61e-f81236dc65d4" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-mrqcb" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-mrqcb\" is forbidden: User \"system:node:ip-10-0-133-163.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-133-163.ec2.internal' and this object" Apr 20 15:03:40.800815 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:40.800777 2570 status_manager.go:895] "Failed to get status for pod" podUID="d3f24fda-bb0c-4879-9a62-fa11c7b08a11" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-v86vx" err="pods \"limitador-operator-controller-manager-85c4996f8c-v86vx\" is forbidden: User \"system:node:ip-10-0-133-163.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-133-163.ec2.internal' and this object" Apr 20 15:03:40.822029 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:40.822004 2570 generic.go:358] "Generic (PLEG): container finished" podID="d3f24fda-bb0c-4879-9a62-fa11c7b08a11" containerID="4cfcf526af80e1a61ba8a6d6defb2b2c1c5bce102b73ecb513cafdbd40c85916" exitCode=0 Apr 20 15:03:40.824211 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:40.824187 2570 generic.go:358] "Generic (PLEG): container finished" podID="f3b144d7-7054-4752-a61e-f81236dc65d4" containerID="18e34c1415315c7cff41f0b85300d7b5b5d27e270f2ec167f09af8737c89ee41" exitCode=0 Apr 20 15:03:40.875428 ip-10-0-133-163 kubenswrapper[2570]: 
I0420 15:03:40.875297 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jtqfs\" (UniqueName: \"kubernetes.io/projected/fa7a90a9-4a60-41f1-9bf6-51d11c2214b0-kube-api-access-jtqfs\") pod \"limitador-operator-controller-manager-85c4996f8c-bvn5v\" (UID: \"fa7a90a9-4a60-41f1-9bf6-51d11c2214b0\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-bvn5v" Apr 20 15:03:40.875520 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:40.875490 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txqq4\" (UniqueName: \"kubernetes.io/projected/168e11ea-36bd-4438-8e61-adfa38f077a3-kube-api-access-txqq4\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-hhl49\" (UID: \"168e11ea-36bd-4438-8e61-adfa38f077a3\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-hhl49" Apr 20 15:03:40.875579 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:40.875540 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/168e11ea-36bd-4438-8e61-adfa38f077a3-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-hhl49\" (UID: \"168e11ea-36bd-4438-8e61-adfa38f077a3\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-hhl49" Apr 20 15:03:40.875630 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:40.875592 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/f8b31739-e044-4a63-ac9d-dc4693db537c-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-whkm5\" (UID: \"f8b31739-e044-4a63-ac9d-dc4693db537c\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-whkm5" Apr 20 15:03:40.875753 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:40.875734 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-glrr2\" (UniqueName: \"kubernetes.io/projected/f8b31739-e044-4a63-ac9d-dc4693db537c-kube-api-access-glrr2\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-whkm5\" (UID: \"f8b31739-e044-4a63-ac9d-dc4693db537c\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-whkm5" Apr 20 15:03:40.875990 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:40.875971 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/f8b31739-e044-4a63-ac9d-dc4693db537c-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-whkm5\" (UID: \"f8b31739-e044-4a63-ac9d-dc4693db537c\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-whkm5" Apr 20 15:03:40.890876 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:40.890858 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-mrqcb" Apr 20 15:03:40.894146 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:40.894130 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-v86vx" Apr 20 15:03:40.894226 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:40.894153 2570 status_manager.go:895] "Failed to get status for pod" podUID="f3b144d7-7054-4752-a61e-f81236dc65d4" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-mrqcb" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-mrqcb\" is forbidden: User \"system:node:ip-10-0-133-163.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-133-163.ec2.internal' and this object" Apr 20 15:03:40.896146 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:40.896122 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-glrr2\" (UniqueName: \"kubernetes.io/projected/f8b31739-e044-4a63-ac9d-dc4693db537c-kube-api-access-glrr2\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-whkm5\" (UID: \"f8b31739-e044-4a63-ac9d-dc4693db537c\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-whkm5" Apr 20 15:03:40.896426 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:40.896403 2570 status_manager.go:895] "Failed to get status for pod" podUID="d3f24fda-bb0c-4879-9a62-fa11c7b08a11" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-v86vx" err="pods \"limitador-operator-controller-manager-85c4996f8c-v86vx\" is forbidden: User \"system:node:ip-10-0-133-163.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-133-163.ec2.internal' and this object" Apr 20 15:03:40.896525 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:40.896510 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtqfs\" (UniqueName: \"kubernetes.io/projected/fa7a90a9-4a60-41f1-9bf6-51d11c2214b0-kube-api-access-jtqfs\") pod \"limitador-operator-controller-manager-85c4996f8c-bvn5v\" (UID: \"fa7a90a9-4a60-41f1-9bf6-51d11c2214b0\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-bvn5v" Apr 20 15:03:40.898740 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:40.898719 2570 status_manager.go:895] "Failed to get status for pod" podUID="d3f24fda-bb0c-4879-9a62-fa11c7b08a11" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-v86vx" err="pods \"limitador-operator-controller-manager-85c4996f8c-v86vx\" is forbidden: User \"system:node:ip-10-0-133-163.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-133-163.ec2.internal' and this object" Apr 20 15:03:40.900938 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:40.900919 2570 status_manager.go:895] "Failed to get status for pod" podUID="f3b144d7-7054-4752-a61e-f81236dc65d4" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-mrqcb" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-mrqcb\" is forbidden: User \"system:node:ip-10-0-133-163.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-133-163.ec2.internal' and this object" Apr 20 15:03:40.976059 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:40.976036 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: 
\"kubernetes.io/empty-dir/f3b144d7-7054-4752-a61e-f81236dc65d4-extensions-socket-volume\") pod \"f3b144d7-7054-4752-a61e-f81236dc65d4\" (UID: \"f3b144d7-7054-4752-a61e-f81236dc65d4\") " Apr 20 15:03:40.976146 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:40.976084 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tg82w\" (UniqueName: \"kubernetes.io/projected/f3b144d7-7054-4752-a61e-f81236dc65d4-kube-api-access-tg82w\") pod \"f3b144d7-7054-4752-a61e-f81236dc65d4\" (UID: \"f3b144d7-7054-4752-a61e-f81236dc65d4\") " Apr 20 15:03:40.976356 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:40.976335 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-txqq4\" (UniqueName: \"kubernetes.io/projected/168e11ea-36bd-4438-8e61-adfa38f077a3-kube-api-access-txqq4\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-hhl49\" (UID: \"168e11ea-36bd-4438-8e61-adfa38f077a3\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-hhl49" Apr 20 15:03:40.976429 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:40.976386 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/168e11ea-36bd-4438-8e61-adfa38f077a3-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-hhl49\" (UID: \"168e11ea-36bd-4438-8e61-adfa38f077a3\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-hhl49" Apr 20 15:03:40.976560 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:40.976535 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f3b144d7-7054-4752-a61e-f81236dc65d4-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "f3b144d7-7054-4752-a61e-f81236dc65d4" (UID: "f3b144d7-7054-4752-a61e-f81236dc65d4"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 15:03:40.976691 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:40.976671 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/168e11ea-36bd-4438-8e61-adfa38f077a3-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-hhl49\" (UID: \"168e11ea-36bd-4438-8e61-adfa38f077a3\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-hhl49" Apr 20 15:03:40.978166 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:40.978146 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3b144d7-7054-4752-a61e-f81236dc65d4-kube-api-access-tg82w" (OuterVolumeSpecName: "kube-api-access-tg82w") pod "f3b144d7-7054-4752-a61e-f81236dc65d4" (UID: "f3b144d7-7054-4752-a61e-f81236dc65d4"). InnerVolumeSpecName "kube-api-access-tg82w". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 15:03:40.983889 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:40.983868 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3b144d7-7054-4752-a61e-f81236dc65d4" path="/var/lib/kubelet/pods/f3b144d7-7054-4752-a61e-f81236dc65d4/volumes" Apr 20 15:03:41.006941 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:41.006920 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-txqq4\" (UniqueName: \"kubernetes.io/projected/168e11ea-36bd-4438-8e61-adfa38f077a3-kube-api-access-txqq4\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-hhl49\" (UID: \"168e11ea-36bd-4438-8e61-adfa38f077a3\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-hhl49" Apr 20 15:03:41.070955 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:41.070936 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-whkm5" Apr 20 15:03:41.077723 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:41.077706 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7vxd\" (UniqueName: \"kubernetes.io/projected/d3f24fda-bb0c-4879-9a62-fa11c7b08a11-kube-api-access-v7vxd\") pod \"d3f24fda-bb0c-4879-9a62-fa11c7b08a11\" (UID: \"d3f24fda-bb0c-4879-9a62-fa11c7b08a11\") " Apr 20 15:03:41.077864 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:41.077851 2570 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/f3b144d7-7054-4752-a61e-f81236dc65d4-extensions-socket-volume\") on node \"ip-10-0-133-163.ec2.internal\" DevicePath \"\"" Apr 20 15:03:41.077933 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:41.077866 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tg82w\" (UniqueName: \"kubernetes.io/projected/f3b144d7-7054-4752-a61e-f81236dc65d4-kube-api-access-tg82w\") on node \"ip-10-0-133-163.ec2.internal\" DevicePath \"\"" Apr 20 15:03:41.079647 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:41.079628 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3f24fda-bb0c-4879-9a62-fa11c7b08a11-kube-api-access-v7vxd" (OuterVolumeSpecName: "kube-api-access-v7vxd") pod "d3f24fda-bb0c-4879-9a62-fa11c7b08a11" (UID: "d3f24fda-bb0c-4879-9a62-fa11c7b08a11"). InnerVolumeSpecName "kube-api-access-v7vxd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 15:03:41.081771 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:41.081754 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-bvn5v" Apr 20 15:03:41.088588 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:41.088569 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-hhl49" Apr 20 15:03:41.180865 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:41.180838 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-v7vxd\" (UniqueName: \"kubernetes.io/projected/d3f24fda-bb0c-4879-9a62-fa11c7b08a11-kube-api-access-v7vxd\") on node \"ip-10-0-133-163.ec2.internal\" DevicePath \"\"" Apr 20 15:03:41.228319 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:41.228271 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-whkm5"] Apr 20 15:03:41.228546 ip-10-0-133-163 kubenswrapper[2570]: W0420 15:03:41.228526 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8b31739_e044_4a63_ac9d_dc4693db537c.slice/crio-c6551df43d1b747e5d0f8d4d1c5def0aaea8f947ace3e58560b17d58c542d114 WatchSource:0}: Error finding container c6551df43d1b747e5d0f8d4d1c5def0aaea8f947ace3e58560b17d58c542d114: Status 404 returned error can't find the container with id c6551df43d1b747e5d0f8d4d1c5def0aaea8f947ace3e58560b17d58c542d114 Apr 20 15:03:41.447208 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:41.447187 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-bvn5v"] Apr 20 15:03:41.454596 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:41.454569 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-hhl49"] Apr 20 15:03:41.457639 ip-10-0-133-163 kubenswrapper[2570]: W0420 15:03:41.457609 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod168e11ea_36bd_4438_8e61_adfa38f077a3.slice/crio-4b924260263244bdb21baba61e79d60cc441d89cb848ba0d35513c354a5f329a WatchSource:0}: Error finding container 4b924260263244bdb21baba61e79d60cc441d89cb848ba0d35513c354a5f329a: Status 404 returned error can't find the container with id 4b924260263244bdb21baba61e79d60cc441d89cb848ba0d35513c354a5f329a Apr 20 15:03:41.829765 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:41.829728 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-bvn5v" event={"ID":"fa7a90a9-4a60-41f1-9bf6-51d11c2214b0","Type":"ContainerStarted","Data":"8edf17aef74abd644614c7e55c4db1926e885030e62c5f41b2053c9ae2c84227"} Apr 20 15:03:41.830192 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:41.829774 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-bvn5v" event={"ID":"fa7a90a9-4a60-41f1-9bf6-51d11c2214b0","Type":"ContainerStarted","Data":"e20d6847ec52a58e2df9e73ffa9f843c6e2ebb2eb76e391929492d8bceccb645"} Apr 20 15:03:41.830192 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:41.829806 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-bvn5v" Apr 20 15:03:41.831247 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:41.831220 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-hhl49" event={"ID":"168e11ea-36bd-4438-8e61-adfa38f077a3","Type":"ContainerStarted","Data":"35f00bc359bbdf0c7c3e67f5ca6d38206e5d55c55568367b7415ac65e125b4ba"} Apr 20 15:03:41.831247 
ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:41.831248 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-hhl49" event={"ID":"168e11ea-36bd-4438-8e61-adfa38f077a3","Type":"ContainerStarted","Data":"4b924260263244bdb21baba61e79d60cc441d89cb848ba0d35513c354a5f329a"} Apr 20 15:03:41.831460 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:41.831284 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-hhl49" Apr 20 15:03:41.832376 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:41.832344 2570 scope.go:117] "RemoveContainer" containerID="4cfcf526af80e1a61ba8a6d6defb2b2c1c5bce102b73ecb513cafdbd40c85916" Apr 20 15:03:41.832485 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:41.832379 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-v86vx" Apr 20 15:03:41.833532 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:41.833464 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-mrqcb" Apr 20 15:03:41.835069 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:41.835045 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-whkm5" event={"ID":"f8b31739-e044-4a63-ac9d-dc4693db537c","Type":"ContainerStarted","Data":"84d4919555156ea0f79c2c054d377ad151f18c012883c2aa91cc0364a0e594a9"} Apr 20 15:03:41.835174 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:41.835073 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-whkm5" event={"ID":"f8b31739-e044-4a63-ac9d-dc4693db537c","Type":"ContainerStarted","Data":"c6551df43d1b747e5d0f8d4d1c5def0aaea8f947ace3e58560b17d58c542d114"} Apr 20 15:03:41.835249 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:41.835229 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-whkm5" Apr 20 15:03:41.841272 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:41.841255 2570 scope.go:117] "RemoveContainer" containerID="18e34c1415315c7cff41f0b85300d7b5b5d27e270f2ec167f09af8737c89ee41" Apr 20 15:03:41.857576 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:41.857537 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-bvn5v" podStartSLOduration=1.857526926 podStartE2EDuration="1.857526926s" podCreationTimestamp="2026-04-20 15:03:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 15:03:41.857071529 +0000 UTC m=+505.513119842" watchObservedRunningTime="2026-04-20 15:03:41.857526926 +0000 UTC m=+505.513575228" Apr 20 15:03:41.877272 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:41.877221 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-whkm5" podStartSLOduration=1.877204603 podStartE2EDuration="1.877204603s" podCreationTimestamp="2026-04-20 15:03:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 15:03:41.875033392 +0000 
UTC m=+505.531081768" watchObservedRunningTime="2026-04-20 15:03:41.877204603 +0000 UTC m=+505.533252906" Apr 20 15:03:41.894170 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:41.894135 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-hhl49" podStartSLOduration=1.89412467 podStartE2EDuration="1.89412467s" podCreationTimestamp="2026-04-20 15:03:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 15:03:41.893016306 +0000 UTC m=+505.549064623" watchObservedRunningTime="2026-04-20 15:03:41.89412467 +0000 UTC m=+505.550172971" Apr 20 15:03:42.984578 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:42.984549 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3f24fda-bb0c-4879-9a62-fa11c7b08a11" path="/var/lib/kubelet/pods/d3f24fda-bb0c-4879-9a62-fa11c7b08a11/volumes" Apr 20 15:03:52.842465 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:52.842436 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-whkm5" Apr 20 15:03:52.842951 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:52.842489 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-bvn5v" Apr 20 15:03:52.842951 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:52.842801 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-hhl49" Apr 20 15:03:52.935674 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:52.935643 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-whkm5"] Apr 20 15:03:52.935942 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:52.935885 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-whkm5" podUID="f8b31739-e044-4a63-ac9d-dc4693db537c" containerName="manager" containerID="cri-o://84d4919555156ea0f79c2c054d377ad151f18c012883c2aa91cc0364a0e594a9" gracePeriod=10 Apr 20 15:03:53.185194 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:53.185174 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-whkm5" Apr 20 15:03:53.283686 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:53.283658 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/f8b31739-e044-4a63-ac9d-dc4693db537c-extensions-socket-volume\") pod \"f8b31739-e044-4a63-ac9d-dc4693db537c\" (UID: \"f8b31739-e044-4a63-ac9d-dc4693db537c\") " Apr 20 15:03:53.283826 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:53.283781 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-glrr2\" (UniqueName: \"kubernetes.io/projected/f8b31739-e044-4a63-ac9d-dc4693db537c-kube-api-access-glrr2\") pod \"f8b31739-e044-4a63-ac9d-dc4693db537c\" (UID: \"f8b31739-e044-4a63-ac9d-dc4693db537c\") " Apr 20 15:03:53.284000 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:53.283976 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8b31739-e044-4a63-ac9d-dc4693db537c-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "f8b31739-e044-4a63-ac9d-dc4693db537c" (UID: "f8b31739-e044-4a63-ac9d-dc4693db537c"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 20 15:03:53.285701 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:53.285681 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8b31739-e044-4a63-ac9d-dc4693db537c-kube-api-access-glrr2" (OuterVolumeSpecName: "kube-api-access-glrr2") pod "f8b31739-e044-4a63-ac9d-dc4693db537c" (UID: "f8b31739-e044-4a63-ac9d-dc4693db537c"). InnerVolumeSpecName "kube-api-access-glrr2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 15:03:53.385122 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:53.385070 2570 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/f8b31739-e044-4a63-ac9d-dc4693db537c-extensions-socket-volume\") on node \"ip-10-0-133-163.ec2.internal\" DevicePath \"\"" Apr 20 15:03:53.385122 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:53.385093 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-glrr2\" (UniqueName: \"kubernetes.io/projected/f8b31739-e044-4a63-ac9d-dc4693db537c-kube-api-access-glrr2\") on node \"ip-10-0-133-163.ec2.internal\" DevicePath \"\"" Apr 20 15:03:53.882586 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:53.882546 2570 generic.go:358] "Generic (PLEG): container finished" podID="f8b31739-e044-4a63-ac9d-dc4693db537c" containerID="84d4919555156ea0f79c2c054d377ad151f18c012883c2aa91cc0364a0e594a9" exitCode=0 Apr 20 15:03:53.882984 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:53.882616 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-whkm5" Apr 20 15:03:53.882984 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:53.882633 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-whkm5" event={"ID":"f8b31739-e044-4a63-ac9d-dc4693db537c","Type":"ContainerDied","Data":"84d4919555156ea0f79c2c054d377ad151f18c012883c2aa91cc0364a0e594a9"} Apr 20 15:03:53.882984 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:53.882681 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-whkm5" event={"ID":"f8b31739-e044-4a63-ac9d-dc4693db537c","Type":"ContainerDied","Data":"c6551df43d1b747e5d0f8d4d1c5def0aaea8f947ace3e58560b17d58c542d114"} Apr 20 15:03:53.882984 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:53.882698 2570 scope.go:117] "RemoveContainer" containerID="84d4919555156ea0f79c2c054d377ad151f18c012883c2aa91cc0364a0e594a9" Apr 20 15:03:53.892150 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:53.892134 2570 scope.go:117] "RemoveContainer" containerID="84d4919555156ea0f79c2c054d377ad151f18c012883c2aa91cc0364a0e594a9" Apr 20 15:03:53.892757 ip-10-0-133-163 kubenswrapper[2570]: E0420 15:03:53.892741 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84d4919555156ea0f79c2c054d377ad151f18c012883c2aa91cc0364a0e594a9\": container with ID starting with 84d4919555156ea0f79c2c054d377ad151f18c012883c2aa91cc0364a0e594a9 not found: ID does not exist" containerID="84d4919555156ea0f79c2c054d377ad151f18c012883c2aa91cc0364a0e594a9" Apr 20 15:03:53.892808 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:53.892764 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84d4919555156ea0f79c2c054d377ad151f18c012883c2aa91cc0364a0e594a9"} err="failed to get container status \"84d4919555156ea0f79c2c054d377ad151f18c012883c2aa91cc0364a0e594a9\": rpc error: code = NotFound desc = could not find container \"84d4919555156ea0f79c2c054d377ad151f18c012883c2aa91cc0364a0e594a9\": container with ID starting with 84d4919555156ea0f79c2c054d377ad151f18c012883c2aa91cc0364a0e594a9 not found: ID does not exist" Apr 20 15:03:53.910086 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:53.910064 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-whkm5"] Apr 20 15:03:53.913982 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:53.913960 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-whkm5"] Apr 20 15:03:54.984384 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:03:54.984352 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8b31739-e044-4a63-ac9d-dc4693db537c" path="/var/lib/kubelet/pods/f8b31739-e044-4a63-ac9d-dc4693db537c/volumes" Apr 20 15:04:09.180279 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:04:09.180203 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-7n9cp"] Apr 20 15:04:09.180851 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:04:09.180824 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f8b31739-e044-4a63-ac9d-dc4693db537c" containerName="manager" Apr 20 15:04:09.180851 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:04:09.180849 2570 
state_mem.go:107] "Deleted CPUSet assignment" podUID="f8b31739-e044-4a63-ac9d-dc4693db537c" containerName="manager" Apr 20 15:04:09.180982 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:04:09.180961 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="f8b31739-e044-4a63-ac9d-dc4693db537c" containerName="manager" Apr 20 15:04:09.183462 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:04:09.183438 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-7n9cp" Apr 20 15:04:09.186062 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:04:09.186037 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"maas-default-gateway-openshift-default-dockercfg-jvq84\"" Apr 20 15:04:09.197284 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:04:09.196965 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-7n9cp"] Apr 20 15:04:09.213864 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:04:09.213839 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/f82db5b3-f3e6-43f9-bb28-f03c4db40696-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-7n9cp\" (UID: \"f82db5b3-f3e6-43f9-bb28-f03c4db40696\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-7n9cp" Apr 20 15:04:09.213975 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:04:09.213873 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/f82db5b3-f3e6-43f9-bb28-f03c4db40696-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-7n9cp\" (UID: \"f82db5b3-f3e6-43f9-bb28-f03c4db40696\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-7n9cp" Apr 20 15:04:09.213975 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:04:09.213902 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/f82db5b3-f3e6-43f9-bb28-f03c4db40696-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-7n9cp\" (UID: \"f82db5b3-f3e6-43f9-bb28-f03c4db40696\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-7n9cp" Apr 20 15:04:09.213975 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:04:09.213936 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/f82db5b3-f3e6-43f9-bb28-f03c4db40696-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-7n9cp\" (UID: \"f82db5b3-f3e6-43f9-bb28-f03c4db40696\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-7n9cp" Apr 20 15:04:09.214072 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:04:09.214005 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/f82db5b3-f3e6-43f9-bb28-f03c4db40696-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-7n9cp\" (UID: \"f82db5b3-f3e6-43f9-bb28-f03c4db40696\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-7n9cp" Apr 20 15:04:09.214072 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:04:09.214045 2570 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/f82db5b3-f3e6-43f9-bb28-f03c4db40696-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-7n9cp\" (UID: \"f82db5b3-f3e6-43f9-bb28-f03c4db40696\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-7n9cp" Apr 20 15:04:09.214156 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:04:09.214139 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/f82db5b3-f3e6-43f9-bb28-f03c4db40696-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-7n9cp\" (UID: \"f82db5b3-f3e6-43f9-bb28-f03c4db40696\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-7n9cp" Apr 20 15:04:09.214200 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:04:09.214172 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94xqr\" (UniqueName: \"kubernetes.io/projected/f82db5b3-f3e6-43f9-bb28-f03c4db40696-kube-api-access-94xqr\") pod \"maas-default-gateway-openshift-default-58b6f876-7n9cp\" (UID: \"f82db5b3-f3e6-43f9-bb28-f03c4db40696\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-7n9cp" Apr 20 15:04:09.214259 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:04:09.214214 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/f82db5b3-f3e6-43f9-bb28-f03c4db40696-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-7n9cp\" (UID: \"f82db5b3-f3e6-43f9-bb28-f03c4db40696\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-7n9cp" Apr 20 15:04:09.315250 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:04:09.315219 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/f82db5b3-f3e6-43f9-bb28-f03c4db40696-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-7n9cp\" (UID: \"f82db5b3-f3e6-43f9-bb28-f03c4db40696\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-7n9cp" Apr 20 15:04:09.315250 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:04:09.315255 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-94xqr\" (UniqueName: \"kubernetes.io/projected/f82db5b3-f3e6-43f9-bb28-f03c4db40696-kube-api-access-94xqr\") pod \"maas-default-gateway-openshift-default-58b6f876-7n9cp\" (UID: \"f82db5b3-f3e6-43f9-bb28-f03c4db40696\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-7n9cp" Apr 20 15:04:09.315483 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:04:09.315412 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/f82db5b3-f3e6-43f9-bb28-f03c4db40696-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-7n9cp\" (UID: \"f82db5b3-f3e6-43f9-bb28-f03c4db40696\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-7n9cp" Apr 20 15:04:09.315521 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:04:09.315491 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/f82db5b3-f3e6-43f9-bb28-f03c4db40696-credential-socket\") pod 
\"maas-default-gateway-openshift-default-58b6f876-7n9cp\" (UID: \"f82db5b3-f3e6-43f9-bb28-f03c4db40696\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-7n9cp" Apr 20 15:04:09.315568 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:04:09.315519 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/f82db5b3-f3e6-43f9-bb28-f03c4db40696-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-7n9cp\" (UID: \"f82db5b3-f3e6-43f9-bb28-f03c4db40696\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-7n9cp" Apr 20 15:04:09.315568 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:04:09.315551 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/f82db5b3-f3e6-43f9-bb28-f03c4db40696-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-7n9cp\" (UID: \"f82db5b3-f3e6-43f9-bb28-f03c4db40696\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-7n9cp" Apr 20 15:04:09.315670 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:04:09.315587 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/f82db5b3-f3e6-43f9-bb28-f03c4db40696-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-7n9cp\" (UID: \"f82db5b3-f3e6-43f9-bb28-f03c4db40696\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-7n9cp" Apr 20 15:04:09.315670 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:04:09.315630 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/f82db5b3-f3e6-43f9-bb28-f03c4db40696-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-7n9cp\" (UID: \"f82db5b3-f3e6-43f9-bb28-f03c4db40696\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-7n9cp" Apr 20 15:04:09.315765 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:04:09.315674 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/f82db5b3-f3e6-43f9-bb28-f03c4db40696-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-7n9cp\" (UID: \"f82db5b3-f3e6-43f9-bb28-f03c4db40696\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-7n9cp" Apr 20 15:04:09.315945 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:04:09.315916 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/f82db5b3-f3e6-43f9-bb28-f03c4db40696-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-7n9cp\" (UID: \"f82db5b3-f3e6-43f9-bb28-f03c4db40696\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-7n9cp" Apr 20 15:04:09.316246 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:04:09.315973 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/f82db5b3-f3e6-43f9-bb28-f03c4db40696-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-7n9cp\" (UID: \"f82db5b3-f3e6-43f9-bb28-f03c4db40696\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-7n9cp" Apr 20 15:04:09.316246 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:04:09.316064 2570 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/f82db5b3-f3e6-43f9-bb28-f03c4db40696-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-7n9cp\" (UID: \"f82db5b3-f3e6-43f9-bb28-f03c4db40696\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-7n9cp" Apr 20 15:04:09.316246 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:04:09.316116 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/f82db5b3-f3e6-43f9-bb28-f03c4db40696-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-7n9cp\" (UID: \"f82db5b3-f3e6-43f9-bb28-f03c4db40696\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-7n9cp" Apr 20 15:04:09.316459 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:04:09.316278 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/f82db5b3-f3e6-43f9-bb28-f03c4db40696-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-7n9cp\" (UID: \"f82db5b3-f3e6-43f9-bb28-f03c4db40696\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-7n9cp" Apr 20 15:04:09.317892 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:04:09.317867 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/f82db5b3-f3e6-43f9-bb28-f03c4db40696-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-7n9cp\" (UID: \"f82db5b3-f3e6-43f9-bb28-f03c4db40696\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-7n9cp" Apr 20 15:04:09.318017 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:04:09.317999 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/f82db5b3-f3e6-43f9-bb28-f03c4db40696-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-7n9cp\" (UID: \"f82db5b3-f3e6-43f9-bb28-f03c4db40696\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-7n9cp" Apr 20 15:04:09.323397 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:04:09.323372 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/f82db5b3-f3e6-43f9-bb28-f03c4db40696-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-7n9cp\" (UID: \"f82db5b3-f3e6-43f9-bb28-f03c4db40696\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-7n9cp" Apr 20 15:04:09.323608 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:04:09.323592 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-94xqr\" (UniqueName: \"kubernetes.io/projected/f82db5b3-f3e6-43f9-bb28-f03c4db40696-kube-api-access-94xqr\") pod \"maas-default-gateway-openshift-default-58b6f876-7n9cp\" (UID: \"f82db5b3-f3e6-43f9-bb28-f03c4db40696\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-7n9cp" Apr 20 15:04:09.495265 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:04:09.495177 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-7n9cp" Apr 20 15:04:09.623756 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:04:09.623726 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-7n9cp"] Apr 20 15:04:09.626834 ip-10-0-133-163 kubenswrapper[2570]: W0420 15:04:09.626800 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf82db5b3_f3e6_43f9_bb28_f03c4db40696.slice/crio-aff5e679d18aae39d1a1638d4fef05bfb03392ffe6aa3942c06e2290f54b93a1 WatchSource:0}: Error finding container aff5e679d18aae39d1a1638d4fef05bfb03392ffe6aa3942c06e2290f54b93a1: Status 404 returned error can't find the container with id aff5e679d18aae39d1a1638d4fef05bfb03392ffe6aa3942c06e2290f54b93a1 Apr 20 15:04:09.628985 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:04:09.628956 2570 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892156Ki","pods":"250"} Apr 20 15:04:09.629069 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:04:09.629020 2570 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892156Ki","pods":"250"} Apr 20 15:04:09.629069 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:04:09.629047 2570 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892156Ki","pods":"250"} Apr 20 15:04:09.943831 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:04:09.943796 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-7n9cp" event={"ID":"f82db5b3-f3e6-43f9-bb28-f03c4db40696","Type":"ContainerStarted","Data":"dedc70c7aa332caea4b6221c80fe9187754b48aea480d2a3f3e15d6e7fc094f7"} Apr 20 15:04:09.943831 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:04:09.943834 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-7n9cp" event={"ID":"f82db5b3-f3e6-43f9-bb28-f03c4db40696","Type":"ContainerStarted","Data":"aff5e679d18aae39d1a1638d4fef05bfb03392ffe6aa3942c06e2290f54b93a1"} Apr 20 15:04:09.963190 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:04:09.963143 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-7n9cp" podStartSLOduration=0.963130602 podStartE2EDuration="963.130602ms" podCreationTimestamp="2026-04-20 15:04:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 15:04:09.961644403 +0000 UTC m=+533.617692706" watchObservedRunningTime="2026-04-20 15:04:09.963130602 +0000 UTC m=+533.619178903" Apr 20 15:04:10.495966 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:04:10.495930 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-7n9cp" Apr 20 15:04:10.500659 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:04:10.500629 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-7n9cp" Apr 20 15:04:10.947544 ip-10-0-133-163 kubenswrapper[2570]: 
Apr 20 15:04:10.948552 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:04:10.948525 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-7n9cp"
Apr 20 15:04:14.012655 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:04:14.012628 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"]
Apr 20 15:04:14.015585 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:04:14.015565 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-pn2zj"
Apr 20 15:04:14.018052 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:04:14.018030 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\""
Apr 20 15:04:14.018188 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:04:14.018164 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-jwvf8\""
Apr 20 15:04:14.023795 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:04:14.023679 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"]
Apr 20 15:04:14.049162 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:04:14.049139 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"]
Apr 20 15:04:14.057605 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:04:14.057583 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/3b11d86c-92f4-4313-a3a6-421820d49702-config-file\") pod \"limitador-limitador-78c99df468-pn2zj\" (UID: \"3b11d86c-92f4-4313-a3a6-421820d49702\") " pod="kuadrant-system/limitador-limitador-78c99df468-pn2zj"
Apr 20 15:04:14.057682 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:04:14.057653 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-649mx\" (UniqueName: \"kubernetes.io/projected/3b11d86c-92f4-4313-a3a6-421820d49702-kube-api-access-649mx\") pod \"limitador-limitador-78c99df468-pn2zj\" (UID: \"3b11d86c-92f4-4313-a3a6-421820d49702\") " pod="kuadrant-system/limitador-limitador-78c99df468-pn2zj"
Apr 20 15:04:14.158860 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:04:14.158836 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-649mx\" (UniqueName: \"kubernetes.io/projected/3b11d86c-92f4-4313-a3a6-421820d49702-kube-api-access-649mx\") pod \"limitador-limitador-78c99df468-pn2zj\" (UID: \"3b11d86c-92f4-4313-a3a6-421820d49702\") " pod="kuadrant-system/limitador-limitador-78c99df468-pn2zj"
Apr 20 15:04:14.158962 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:04:14.158902 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/3b11d86c-92f4-4313-a3a6-421820d49702-config-file\") pod \"limitador-limitador-78c99df468-pn2zj\" (UID: \"3b11d86c-92f4-4313-a3a6-421820d49702\") " pod="kuadrant-system/limitador-limitador-78c99df468-pn2zj"
Apr 20 15:04:14.159452 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:04:14.159436 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/3b11d86c-92f4-4313-a3a6-421820d49702-config-file\") pod \"limitador-limitador-78c99df468-pn2zj\" (UID: \"3b11d86c-92f4-4313-a3a6-421820d49702\") " pod="kuadrant-system/limitador-limitador-78c99df468-pn2zj"
Apr 20 15:04:14.167741 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:04:14.167718 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-649mx\" (UniqueName: \"kubernetes.io/projected/3b11d86c-92f4-4313-a3a6-421820d49702-kube-api-access-649mx\") pod \"limitador-limitador-78c99df468-pn2zj\" (UID: \"3b11d86c-92f4-4313-a3a6-421820d49702\") " pod="kuadrant-system/limitador-limitador-78c99df468-pn2zj"
Apr 20 15:04:14.327738 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:04:14.327682 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-pn2zj"
Apr 20 15:04:14.459215 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:04:14.459189 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"]
Apr 20 15:04:14.462037 ip-10-0-133-163 kubenswrapper[2570]: W0420 15:04:14.462004 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b11d86c_92f4_4313_a3a6_421820d49702.slice/crio-7994fef9421b08005ed22589de99a94a00ace9d59bdd99597b232a3cd7954927 WatchSource:0}: Error finding container 7994fef9421b08005ed22589de99a94a00ace9d59bdd99597b232a3cd7954927: Status 404 returned error can't find the container with id 7994fef9421b08005ed22589de99a94a00ace9d59bdd99597b232a3cd7954927
Apr 20 15:04:14.972808 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:04:14.972767 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-pn2zj" event={"ID":"3b11d86c-92f4-4313-a3a6-421820d49702","Type":"ContainerStarted","Data":"7994fef9421b08005ed22589de99a94a00ace9d59bdd99597b232a3cd7954927"}
Apr 20 15:04:17.986998 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:04:17.986955 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-pn2zj" event={"ID":"3b11d86c-92f4-4313-a3a6-421820d49702","Type":"ContainerStarted","Data":"97a295b8262ad423541f9dd14b9a7177f64fe32dd8dd0e025f09d6a6750055a7"}
Apr 20 15:04:17.987411 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:04:17.987040 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-78c99df468-pn2zj"
Apr 20 15:04:18.004532 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:04:18.004489 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-78c99df468-pn2zj" podStartSLOduration=2.506072929 podStartE2EDuration="5.004474127s" podCreationTimestamp="2026-04-20 15:04:13 +0000 UTC" firstStartedPulling="2026-04-20 15:04:14.464333824 +0000 UTC m=+538.120382104" lastFinishedPulling="2026-04-20 15:04:16.962735018 +0000 UTC m=+540.618783302" observedRunningTime="2026-04-20 15:04:18.002789219 +0000 UTC m=+541.658837520" watchObservedRunningTime="2026-04-20 15:04:18.004474127 +0000 UTC m=+541.660522428"
Apr 20 15:04:28.991439 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:04:28.991413 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-78c99df468-pn2zj"
Apr 20 15:05:16.867232 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:05:16.867205 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9x87_f6944a1f-03f8-4115-899e-e5c61d0d6075/ovn-acl-logging/0.log"
Apr 20 15:05:16.867232 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:05:16.867217 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9x87_f6944a1f-03f8-4115-899e-e5c61d0d6075/ovn-acl-logging/0.log"
Apr 20 15:05:47.380623 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:05:47.380536 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-64dd4bd954-sq47p"]
Apr 20 15:05:47.384031 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:05:47.383992 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-64dd4bd954-sq47p"
Apr 20 15:05:47.386747 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:05:47.386723 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\""
Apr 20 15:05:47.386858 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:05:47.386754 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-vbgkq\""
Apr 20 15:05:47.395336 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:05:47.395292 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-64dd4bd954-sq47p"]
Apr 20 15:05:47.489219 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:05:47.489185 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wp44t\" (UniqueName: \"kubernetes.io/projected/2a64ab8e-b1d4-4197-bfb2-5837f427460d-kube-api-access-wp44t\") pod \"maas-controller-64dd4bd954-sq47p\" (UID: \"2a64ab8e-b1d4-4197-bfb2-5837f427460d\") " pod="opendatahub/maas-controller-64dd4bd954-sq47p"
Apr 20 15:05:47.590470 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:05:47.590434 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wp44t\" (UniqueName: \"kubernetes.io/projected/2a64ab8e-b1d4-4197-bfb2-5837f427460d-kube-api-access-wp44t\") pod \"maas-controller-64dd4bd954-sq47p\" (UID: \"2a64ab8e-b1d4-4197-bfb2-5837f427460d\") " pod="opendatahub/maas-controller-64dd4bd954-sq47p"
Apr 20 15:05:47.600109 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:05:47.600077 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wp44t\" (UniqueName: \"kubernetes.io/projected/2a64ab8e-b1d4-4197-bfb2-5837f427460d-kube-api-access-wp44t\") pod \"maas-controller-64dd4bd954-sq47p\" (UID: \"2a64ab8e-b1d4-4197-bfb2-5837f427460d\") " pod="opendatahub/maas-controller-64dd4bd954-sq47p"
Apr 20 15:05:47.696789 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:05:47.696679 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-64dd4bd954-sq47p"
Apr 20 15:05:47.831353 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:05:47.831289 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-64dd4bd954-sq47p"]
Apr 20 15:05:47.833747 ip-10-0-133-163 kubenswrapper[2570]: W0420 15:05:47.833719 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a64ab8e_b1d4_4197_bfb2_5837f427460d.slice/crio-dace57529a9fa4f42e4fc66817e189bddfb9328ef1971663eafca4e455898895 WatchSource:0}: Error finding container dace57529a9fa4f42e4fc66817e189bddfb9328ef1971663eafca4e455898895: Status 404 returned error can't find the container with id dace57529a9fa4f42e4fc66817e189bddfb9328ef1971663eafca4e455898895
Apr 20 15:05:47.850772 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:05:47.850593 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"]
Apr 20 15:05:48.230196 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:05:48.230165 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-f5bbffd8f-km7qf"]
Apr 20 15:05:48.234757 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:05:48.234734 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-f5bbffd8f-km7qf"
Apr 20 15:05:48.237418 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:05:48.237393 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-gn9x5\""
Apr 20 15:05:48.237526 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:05:48.237458 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\""
Apr 20 15:05:48.241560 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:05:48.241531 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-f5bbffd8f-km7qf"]
Apr 20 15:05:48.299048 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:05:48.299002 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6mgb\" (UniqueName: \"kubernetes.io/projected/c5aa3aa8-6607-4f7f-8c1f-96b110395961-kube-api-access-k6mgb\") pod \"maas-api-f5bbffd8f-km7qf\" (UID: \"c5aa3aa8-6607-4f7f-8c1f-96b110395961\") " pod="opendatahub/maas-api-f5bbffd8f-km7qf"
Apr 20 15:05:48.299220 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:05:48.299095 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/c5aa3aa8-6607-4f7f-8c1f-96b110395961-maas-api-tls\") pod \"maas-api-f5bbffd8f-km7qf\" (UID: \"c5aa3aa8-6607-4f7f-8c1f-96b110395961\") " pod="opendatahub/maas-api-f5bbffd8f-km7qf"
Apr 20 15:05:48.315697 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:05:48.315662 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-64dd4bd954-sq47p" event={"ID":"2a64ab8e-b1d4-4197-bfb2-5837f427460d","Type":"ContainerStarted","Data":"dace57529a9fa4f42e4fc66817e189bddfb9328ef1971663eafca4e455898895"}
Apr 20 15:05:48.400746 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:05:48.400701 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k6mgb\" (UniqueName: \"kubernetes.io/projected/c5aa3aa8-6607-4f7f-8c1f-96b110395961-kube-api-access-k6mgb\") pod \"maas-api-f5bbffd8f-km7qf\" (UID: \"c5aa3aa8-6607-4f7f-8c1f-96b110395961\") " pod="opendatahub/maas-api-f5bbffd8f-km7qf"
Apr 20 15:05:48.401196 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:05:48.400762 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/c5aa3aa8-6607-4f7f-8c1f-96b110395961-maas-api-tls\") pod \"maas-api-f5bbffd8f-km7qf\" (UID: \"c5aa3aa8-6607-4f7f-8c1f-96b110395961\") " pod="opendatahub/maas-api-f5bbffd8f-km7qf"
Apr 20 15:05:48.403718 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:05:48.403687 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/c5aa3aa8-6607-4f7f-8c1f-96b110395961-maas-api-tls\") pod \"maas-api-f5bbffd8f-km7qf\" (UID: \"c5aa3aa8-6607-4f7f-8c1f-96b110395961\") " pod="opendatahub/maas-api-f5bbffd8f-km7qf"
Apr 20 15:05:48.409160 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:05:48.409128 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6mgb\" (UniqueName: \"kubernetes.io/projected/c5aa3aa8-6607-4f7f-8c1f-96b110395961-kube-api-access-k6mgb\") pod \"maas-api-f5bbffd8f-km7qf\" (UID: \"c5aa3aa8-6607-4f7f-8c1f-96b110395961\") " pod="opendatahub/maas-api-f5bbffd8f-km7qf"
Apr 20 15:05:48.546855 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:05:48.546817 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-f5bbffd8f-km7qf"
Apr 20 15:05:48.728857 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:05:48.728831 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-f5bbffd8f-km7qf"]
Apr 20 15:05:48.731405 ip-10-0-133-163 kubenswrapper[2570]: W0420 15:05:48.731347 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5aa3aa8_6607_4f7f_8c1f_96b110395961.slice/crio-1eb0c44debd5740c08a4cbbd62c94b51358181f69081382d459a18ba6a321f80 WatchSource:0}: Error finding container 1eb0c44debd5740c08a4cbbd62c94b51358181f69081382d459a18ba6a321f80: Status 404 returned error can't find the container with id 1eb0c44debd5740c08a4cbbd62c94b51358181f69081382d459a18ba6a321f80
Apr 20 15:05:49.323804 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:05:49.323736 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-f5bbffd8f-km7qf" event={"ID":"c5aa3aa8-6607-4f7f-8c1f-96b110395961","Type":"ContainerStarted","Data":"1eb0c44debd5740c08a4cbbd62c94b51358181f69081382d459a18ba6a321f80"}
Apr 20 15:05:52.343672 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:05:52.343638 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-64dd4bd954-sq47p" event={"ID":"2a64ab8e-b1d4-4197-bfb2-5837f427460d","Type":"ContainerStarted","Data":"fb4c9ff1c146e6d08d0ee9b5c2332fe29895c76b71735505a8d49fb45bbd26fe"}
Apr 20 15:05:52.344092 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:05:52.343786 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-64dd4bd954-sq47p"
Apr 20 15:05:52.345110 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:05:52.345083 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-f5bbffd8f-km7qf" event={"ID":"c5aa3aa8-6607-4f7f-8c1f-96b110395961","Type":"ContainerStarted","Data":"561312ddc19aedde72a9c7fda058d06a4d1e410f208252e8e4466d9e7e248579"}
Apr 20 15:05:52.345233 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:05:52.345214 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-f5bbffd8f-km7qf"
Apr 20 15:05:52.359363 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:05:52.359292 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-64dd4bd954-sq47p" podStartSLOduration=1.893587853 podStartE2EDuration="5.359280363s" podCreationTimestamp="2026-04-20 15:05:47 +0000 UTC" firstStartedPulling="2026-04-20 15:05:47.835105563 +0000 UTC m=+631.491153843" lastFinishedPulling="2026-04-20 15:05:51.300798074 +0000 UTC m=+634.956846353" observedRunningTime="2026-04-20 15:05:52.35822809 +0000 UTC m=+636.014276396" watchObservedRunningTime="2026-04-20 15:05:52.359280363 +0000 UTC m=+636.015328664"
Apr 20 15:05:52.374185 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:05:52.374139 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-f5bbffd8f-km7qf" podStartSLOduration=1.8043982669999998 podStartE2EDuration="4.374130162s" podCreationTimestamp="2026-04-20 15:05:48 +0000 UTC" firstStartedPulling="2026-04-20 15:05:48.733759096 +0000 UTC m=+632.389807389" lastFinishedPulling="2026-04-20 15:05:51.303491002 +0000 UTC m=+634.959539284" observedRunningTime="2026-04-20 15:05:52.373294884 +0000 UTC m=+636.029343210" watchObservedRunningTime="2026-04-20 15:05:52.374130162 +0000 UTC m=+636.030178463"
Apr 20 15:05:58.353502 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:05:58.353472 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-f5bbffd8f-km7qf"
Apr 20 15:06:03.353982 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:06:03.353954 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-64dd4bd954-sq47p"
Apr 20 15:06:16.838468 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:06:16.838432 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-64dd4bd954-sq47p"]
Apr 20 15:06:16.838860 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:06:16.838718 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-64dd4bd954-sq47p" podUID="2a64ab8e-b1d4-4197-bfb2-5837f427460d" containerName="manager" containerID="cri-o://fb4c9ff1c146e6d08d0ee9b5c2332fe29895c76b71735505a8d49fb45bbd26fe" gracePeriod=10
Apr 20 15:06:17.089892 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:06:17.089835 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-64dd4bd954-sq47p"
Apr 20 15:06:17.172682 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:06:17.172643 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wp44t\" (UniqueName: \"kubernetes.io/projected/2a64ab8e-b1d4-4197-bfb2-5837f427460d-kube-api-access-wp44t\") pod \"2a64ab8e-b1d4-4197-bfb2-5837f427460d\" (UID: \"2a64ab8e-b1d4-4197-bfb2-5837f427460d\") "
Apr 20 15:06:17.174940 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:06:17.174911 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a64ab8e-b1d4-4197-bfb2-5837f427460d-kube-api-access-wp44t" (OuterVolumeSpecName: "kube-api-access-wp44t") pod "2a64ab8e-b1d4-4197-bfb2-5837f427460d" (UID: "2a64ab8e-b1d4-4197-bfb2-5837f427460d"). InnerVolumeSpecName "kube-api-access-wp44t". PluginName "kubernetes.io/projected", VolumeGIDValue ""
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 15:06:17.273492 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:06:17.273469 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wp44t\" (UniqueName: \"kubernetes.io/projected/2a64ab8e-b1d4-4197-bfb2-5837f427460d-kube-api-access-wp44t\") on node \"ip-10-0-133-163.ec2.internal\" DevicePath \"\"" Apr 20 15:06:17.439099 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:06:17.438965 2570 generic.go:358] "Generic (PLEG): container finished" podID="2a64ab8e-b1d4-4197-bfb2-5837f427460d" containerID="fb4c9ff1c146e6d08d0ee9b5c2332fe29895c76b71735505a8d49fb45bbd26fe" exitCode=0 Apr 20 15:06:17.439099 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:06:17.439025 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-64dd4bd954-sq47p" event={"ID":"2a64ab8e-b1d4-4197-bfb2-5837f427460d","Type":"ContainerDied","Data":"fb4c9ff1c146e6d08d0ee9b5c2332fe29895c76b71735505a8d49fb45bbd26fe"} Apr 20 15:06:17.439099 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:06:17.439068 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-64dd4bd954-sq47p" Apr 20 15:06:17.439099 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:06:17.439084 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-64dd4bd954-sq47p" event={"ID":"2a64ab8e-b1d4-4197-bfb2-5837f427460d","Type":"ContainerDied","Data":"dace57529a9fa4f42e4fc66817e189bddfb9328ef1971663eafca4e455898895"} Apr 20 15:06:17.439099 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:06:17.439099 2570 scope.go:117] "RemoveContainer" containerID="fb4c9ff1c146e6d08d0ee9b5c2332fe29895c76b71735505a8d49fb45bbd26fe" Apr 20 15:06:17.455495 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:06:17.455472 2570 scope.go:117] "RemoveContainer" containerID="fb4c9ff1c146e6d08d0ee9b5c2332fe29895c76b71735505a8d49fb45bbd26fe" Apr 20 15:06:17.455984 ip-10-0-133-163 kubenswrapper[2570]: E0420 15:06:17.455958 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb4c9ff1c146e6d08d0ee9b5c2332fe29895c76b71735505a8d49fb45bbd26fe\": container with ID starting with fb4c9ff1c146e6d08d0ee9b5c2332fe29895c76b71735505a8d49fb45bbd26fe not found: ID does not exist" containerID="fb4c9ff1c146e6d08d0ee9b5c2332fe29895c76b71735505a8d49fb45bbd26fe" Apr 20 15:06:17.456186 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:06:17.456148 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb4c9ff1c146e6d08d0ee9b5c2332fe29895c76b71735505a8d49fb45bbd26fe"} err="failed to get container status \"fb4c9ff1c146e6d08d0ee9b5c2332fe29895c76b71735505a8d49fb45bbd26fe\": rpc error: code = NotFound desc = could not find container \"fb4c9ff1c146e6d08d0ee9b5c2332fe29895c76b71735505a8d49fb45bbd26fe\": container with ID starting with fb4c9ff1c146e6d08d0ee9b5c2332fe29895c76b71735505a8d49fb45bbd26fe not found: ID does not exist" Apr 20 15:06:17.470075 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:06:17.470048 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-64dd4bd954-sq47p"] Apr 20 15:06:17.471882 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:06:17.471861 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-64dd4bd954-sq47p"] Apr 20 15:06:18.984534 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:06:18.984502 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="2a64ab8e-b1d4-4197-bfb2-5837f427460d" path="/var/lib/kubelet/pods/2a64ab8e-b1d4-4197-bfb2-5837f427460d/volumes" Apr 20 15:07:10.639730 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:07:10.639697 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"] Apr 20 15:07:13.840406 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:07:13.840373 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-8454f99c75-dm8dc"] Apr 20 15:07:13.840790 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:07:13.840774 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2a64ab8e-b1d4-4197-bfb2-5837f427460d" containerName="manager" Apr 20 15:07:13.840859 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:07:13.840793 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a64ab8e-b1d4-4197-bfb2-5837f427460d" containerName="manager" Apr 20 15:07:13.840904 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:07:13.840895 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="2a64ab8e-b1d4-4197-bfb2-5837f427460d" containerName="manager" Apr 20 15:07:13.844102 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:07:13.844082 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-dm8dc" Apr 20 15:07:13.848074 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:07:13.848051 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" Apr 20 15:07:13.848074 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:07:13.848068 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-2-simulated-kserve-self-signed-certs\"" Apr 20 15:07:13.848244 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:07:13.848094 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" Apr 20 15:07:13.848244 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:07:13.848057 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-z9d6w\"" Apr 20 15:07:13.854679 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:07:13.854657 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-8454f99c75-dm8dc"] Apr 20 15:07:13.880286 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:07:13.880263 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8edfbca3-f1d1-4f45-91b3-ee182252838c-dshm\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-dm8dc\" (UID: \"8edfbca3-f1d1-4f45-91b3-ee182252838c\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-dm8dc" Apr 20 15:07:13.880388 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:07:13.880348 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bf8ch\" (UniqueName: \"kubernetes.io/projected/8edfbca3-f1d1-4f45-91b3-ee182252838c-kube-api-access-bf8ch\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-dm8dc\" (UID: \"8edfbca3-f1d1-4f45-91b3-ee182252838c\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-dm8dc" Apr 20 15:07:13.880435 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:07:13.880386 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/8edfbca3-f1d1-4f45-91b3-ee182252838c-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-dm8dc\" (UID: \"8edfbca3-f1d1-4f45-91b3-ee182252838c\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-dm8dc" Apr 20 15:07:13.880470 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:07:13.880458 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8edfbca3-f1d1-4f45-91b3-ee182252838c-model-cache\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-dm8dc\" (UID: \"8edfbca3-f1d1-4f45-91b3-ee182252838c\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-dm8dc" Apr 20 15:07:13.880526 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:07:13.880509 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8edfbca3-f1d1-4f45-91b3-ee182252838c-home\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-dm8dc\" (UID: \"8edfbca3-f1d1-4f45-91b3-ee182252838c\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-dm8dc" Apr 20 15:07:13.880579 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:07:13.880565 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8edfbca3-f1d1-4f45-91b3-ee182252838c-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-dm8dc\" (UID: \"8edfbca3-f1d1-4f45-91b3-ee182252838c\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-dm8dc" Apr 20 15:07:13.981230 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:07:13.981201 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8edfbca3-f1d1-4f45-91b3-ee182252838c-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-dm8dc\" (UID: \"8edfbca3-f1d1-4f45-91b3-ee182252838c\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-dm8dc" Apr 20 15:07:13.981363 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:07:13.981243 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8edfbca3-f1d1-4f45-91b3-ee182252838c-dshm\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-dm8dc\" (UID: \"8edfbca3-f1d1-4f45-91b3-ee182252838c\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-dm8dc" Apr 20 15:07:13.981363 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:07:13.981298 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bf8ch\" (UniqueName: \"kubernetes.io/projected/8edfbca3-f1d1-4f45-91b3-ee182252838c-kube-api-access-bf8ch\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-dm8dc\" (UID: \"8edfbca3-f1d1-4f45-91b3-ee182252838c\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-dm8dc" Apr 20 15:07:13.981454 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:07:13.981379 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8edfbca3-f1d1-4f45-91b3-ee182252838c-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-dm8dc\" (UID: \"8edfbca3-f1d1-4f45-91b3-ee182252838c\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-dm8dc" Apr 20 15:07:13.981500 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:07:13.981444 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" 
(UniqueName: \"kubernetes.io/empty-dir/8edfbca3-f1d1-4f45-91b3-ee182252838c-model-cache\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-dm8dc\" (UID: \"8edfbca3-f1d1-4f45-91b3-ee182252838c\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-dm8dc" Apr 20 15:07:13.981500 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:07:13.981485 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8edfbca3-f1d1-4f45-91b3-ee182252838c-home\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-dm8dc\" (UID: \"8edfbca3-f1d1-4f45-91b3-ee182252838c\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-dm8dc" Apr 20 15:07:13.981860 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:07:13.981829 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8edfbca3-f1d1-4f45-91b3-ee182252838c-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-dm8dc\" (UID: \"8edfbca3-f1d1-4f45-91b3-ee182252838c\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-dm8dc" Apr 20 15:07:13.981962 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:07:13.981831 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8edfbca3-f1d1-4f45-91b3-ee182252838c-model-cache\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-dm8dc\" (UID: \"8edfbca3-f1d1-4f45-91b3-ee182252838c\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-dm8dc" Apr 20 15:07:13.981962 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:07:13.981865 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8edfbca3-f1d1-4f45-91b3-ee182252838c-home\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-dm8dc\" (UID: \"8edfbca3-f1d1-4f45-91b3-ee182252838c\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-dm8dc" Apr 20 15:07:13.983654 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:07:13.983628 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8edfbca3-f1d1-4f45-91b3-ee182252838c-dshm\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-dm8dc\" (UID: \"8edfbca3-f1d1-4f45-91b3-ee182252838c\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-dm8dc" Apr 20 15:07:13.983910 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:07:13.983895 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8edfbca3-f1d1-4f45-91b3-ee182252838c-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-dm8dc\" (UID: \"8edfbca3-f1d1-4f45-91b3-ee182252838c\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-dm8dc" Apr 20 15:07:13.989906 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:07:13.989885 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bf8ch\" (UniqueName: \"kubernetes.io/projected/8edfbca3-f1d1-4f45-91b3-ee182252838c-kube-api-access-bf8ch\") pod \"e2e-distinct-2-simulated-kserve-8454f99c75-dm8dc\" (UID: \"8edfbca3-f1d1-4f45-91b3-ee182252838c\") " pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-dm8dc" Apr 20 15:07:14.156245 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:07:14.156183 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-dm8dc" Apr 20 15:07:14.292291 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:07:14.292265 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-8454f99c75-dm8dc"] Apr 20 15:07:14.294966 ip-10-0-133-163 kubenswrapper[2570]: W0420 15:07:14.294937 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8edfbca3_f1d1_4f45_91b3_ee182252838c.slice/crio-609d5cb1142d719e63994101bb2fbb7e74f7cdff1358e8dbc9d0c20fc7d3e52a WatchSource:0}: Error finding container 609d5cb1142d719e63994101bb2fbb7e74f7cdff1358e8dbc9d0c20fc7d3e52a: Status 404 returned error can't find the container with id 609d5cb1142d719e63994101bb2fbb7e74f7cdff1358e8dbc9d0c20fc7d3e52a Apr 20 15:07:14.297115 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:07:14.297096 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 15:07:14.425500 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:07:14.425436 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"] Apr 20 15:07:14.650382 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:07:14.650349 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-dm8dc" event={"ID":"8edfbca3-f1d1-4f45-91b3-ee182252838c","Type":"ContainerStarted","Data":"609d5cb1142d719e63994101bb2fbb7e74f7cdff1358e8dbc9d0c20fc7d3e52a"} Apr 20 15:07:15.136972 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:07:15.136939 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-simulated-kserve-69d7bf476b-thgh7"] Apr 20 15:07:15.140800 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:07:15.140776 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-thgh7" Apr 20 15:07:15.143282 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:07:15.143259 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-simulated-kserve-self-signed-certs\"" Apr 20 15:07:15.149283 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:07:15.149258 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-69d7bf476b-thgh7"] Apr 20 15:07:15.192497 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:07:15.192097 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/58bd1ef8-316e-4000-9e0e-5f678b3ff521-dshm\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-thgh7\" (UID: \"58bd1ef8-316e-4000-9e0e-5f678b3ff521\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-thgh7" Apr 20 15:07:15.192497 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:07:15.192155 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/58bd1ef8-316e-4000-9e0e-5f678b3ff521-model-cache\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-thgh7\" (UID: \"58bd1ef8-316e-4000-9e0e-5f678b3ff521\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-thgh7" Apr 20 15:07:15.192497 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:07:15.192223 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/58bd1ef8-316e-4000-9e0e-5f678b3ff521-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-thgh7\" (UID: \"58bd1ef8-316e-4000-9e0e-5f678b3ff521\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-thgh7" Apr 20 15:07:15.192497 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:07:15.192258 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/58bd1ef8-316e-4000-9e0e-5f678b3ff521-home\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-thgh7\" (UID: \"58bd1ef8-316e-4000-9e0e-5f678b3ff521\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-thgh7" Apr 20 15:07:15.192497 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:07:15.192354 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhv2f\" (UniqueName: \"kubernetes.io/projected/58bd1ef8-316e-4000-9e0e-5f678b3ff521-kube-api-access-dhv2f\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-thgh7\" (UID: \"58bd1ef8-316e-4000-9e0e-5f678b3ff521\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-thgh7" Apr 20 15:07:15.192497 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:07:15.192402 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/58bd1ef8-316e-4000-9e0e-5f678b3ff521-tls-certs\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-thgh7\" (UID: \"58bd1ef8-316e-4000-9e0e-5f678b3ff521\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-thgh7" Apr 20 15:07:15.293384 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:07:15.293343 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/58bd1ef8-316e-4000-9e0e-5f678b3ff521-kserve-provision-location\") pod 
\"e2e-distinct-simulated-kserve-69d7bf476b-thgh7\" (UID: \"58bd1ef8-316e-4000-9e0e-5f678b3ff521\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-thgh7" Apr 20 15:07:15.293573 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:07:15.293407 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/58bd1ef8-316e-4000-9e0e-5f678b3ff521-home\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-thgh7\" (UID: \"58bd1ef8-316e-4000-9e0e-5f678b3ff521\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-thgh7" Apr 20 15:07:15.293573 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:07:15.293483 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dhv2f\" (UniqueName: \"kubernetes.io/projected/58bd1ef8-316e-4000-9e0e-5f678b3ff521-kube-api-access-dhv2f\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-thgh7\" (UID: \"58bd1ef8-316e-4000-9e0e-5f678b3ff521\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-thgh7" Apr 20 15:07:15.293573 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:07:15.293520 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/58bd1ef8-316e-4000-9e0e-5f678b3ff521-tls-certs\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-thgh7\" (UID: \"58bd1ef8-316e-4000-9e0e-5f678b3ff521\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-thgh7" Apr 20 15:07:15.293573 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:07:15.293553 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/58bd1ef8-316e-4000-9e0e-5f678b3ff521-dshm\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-thgh7\" (UID: \"58bd1ef8-316e-4000-9e0e-5f678b3ff521\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-thgh7" Apr 20 15:07:15.293780 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:07:15.293589 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/58bd1ef8-316e-4000-9e0e-5f678b3ff521-model-cache\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-thgh7\" (UID: \"58bd1ef8-316e-4000-9e0e-5f678b3ff521\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-thgh7" Apr 20 15:07:15.293876 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:07:15.293829 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/58bd1ef8-316e-4000-9e0e-5f678b3ff521-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-thgh7\" (UID: \"58bd1ef8-316e-4000-9e0e-5f678b3ff521\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-thgh7" Apr 20 15:07:15.293961 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:07:15.293833 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/58bd1ef8-316e-4000-9e0e-5f678b3ff521-home\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-thgh7\" (UID: \"58bd1ef8-316e-4000-9e0e-5f678b3ff521\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-thgh7" Apr 20 15:07:15.293961 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:07:15.293883 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/58bd1ef8-316e-4000-9e0e-5f678b3ff521-model-cache\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-thgh7\" (UID: \"58bd1ef8-316e-4000-9e0e-5f678b3ff521\") " 
pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-thgh7" Apr 20 15:07:15.296427 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:07:15.296401 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/58bd1ef8-316e-4000-9e0e-5f678b3ff521-dshm\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-thgh7\" (UID: \"58bd1ef8-316e-4000-9e0e-5f678b3ff521\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-thgh7" Apr 20 15:07:15.296788 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:07:15.296765 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/58bd1ef8-316e-4000-9e0e-5f678b3ff521-tls-certs\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-thgh7\" (UID: \"58bd1ef8-316e-4000-9e0e-5f678b3ff521\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-thgh7" Apr 20 15:07:15.301297 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:07:15.301266 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhv2f\" (UniqueName: \"kubernetes.io/projected/58bd1ef8-316e-4000-9e0e-5f678b3ff521-kube-api-access-dhv2f\") pod \"e2e-distinct-simulated-kserve-69d7bf476b-thgh7\" (UID: \"58bd1ef8-316e-4000-9e0e-5f678b3ff521\") " pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-thgh7" Apr 20 15:07:15.455396 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:07:15.455298 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-thgh7" Apr 20 15:07:15.663381 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:07:15.663355 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-69d7bf476b-thgh7"] Apr 20 15:07:15.666277 ip-10-0-133-163 kubenswrapper[2570]: W0420 15:07:15.666217 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58bd1ef8_316e_4000_9e0e_5f678b3ff521.slice/crio-47296f17cb7656be5042014b2cc5cc722a9da719c8c4829a181f55bd43f191c9 WatchSource:0}: Error finding container 47296f17cb7656be5042014b2cc5cc722a9da719c8c4829a181f55bd43f191c9: Status 404 returned error can't find the container with id 47296f17cb7656be5042014b2cc5cc722a9da719c8c4829a181f55bd43f191c9 Apr 20 15:07:16.631200 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:07:16.631161 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"] Apr 20 15:07:16.663113 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:07:16.663068 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-thgh7" event={"ID":"58bd1ef8-316e-4000-9e0e-5f678b3ff521","Type":"ContainerStarted","Data":"47296f17cb7656be5042014b2cc5cc722a9da719c8c4829a181f55bd43f191c9"} Apr 20 15:07:20.684148 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:07:20.684109 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-dm8dc" event={"ID":"8edfbca3-f1d1-4f45-91b3-ee182252838c","Type":"ContainerStarted","Data":"02d555ed99ff72d8aeb49f7369f016df357c8bd48d9a69f784369dad73d2a3be"} Apr 20 15:07:20.685893 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:07:20.685865 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-thgh7" event={"ID":"58bd1ef8-316e-4000-9e0e-5f678b3ff521","Type":"ContainerStarted","Data":"c5277343f2b7836d2a821185559ac840da56465207c43e2970f958fbe79ad815"} Apr 20 
15:07:25.706980 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:07:25.706900 2570 generic.go:358] "Generic (PLEG): container finished" podID="8edfbca3-f1d1-4f45-91b3-ee182252838c" containerID="02d555ed99ff72d8aeb49f7369f016df357c8bd48d9a69f784369dad73d2a3be" exitCode=0 Apr 20 15:07:25.707416 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:07:25.706973 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-dm8dc" event={"ID":"8edfbca3-f1d1-4f45-91b3-ee182252838c","Type":"ContainerDied","Data":"02d555ed99ff72d8aeb49f7369f016df357c8bd48d9a69f784369dad73d2a3be"} Apr 20 15:07:25.708591 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:07:25.708368 2570 generic.go:358] "Generic (PLEG): container finished" podID="58bd1ef8-316e-4000-9e0e-5f678b3ff521" containerID="c5277343f2b7836d2a821185559ac840da56465207c43e2970f958fbe79ad815" exitCode=0 Apr 20 15:07:25.708591 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:07:25.708452 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-thgh7" event={"ID":"58bd1ef8-316e-4000-9e0e-5f678b3ff521","Type":"ContainerDied","Data":"c5277343f2b7836d2a821185559ac840da56465207c43e2970f958fbe79ad815"} Apr 20 15:07:31.739995 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:07:31.739907 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-dm8dc" event={"ID":"8edfbca3-f1d1-4f45-91b3-ee182252838c","Type":"ContainerStarted","Data":"8da144a737e6ad90dce07c665d24bbf95218825fad663a3d7494e87b28198963"} Apr 20 15:07:31.740481 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:07:31.740138 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-dm8dc" Apr 20 15:07:31.741613 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:07:31.741592 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-thgh7" event={"ID":"58bd1ef8-316e-4000-9e0e-5f678b3ff521","Type":"ContainerStarted","Data":"d25aec0ff9bdbf275d8d469805bbc52212e2444d16dd1e315334c6cdba12694e"} Apr 20 15:07:31.741798 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:07:31.741782 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-thgh7" Apr 20 15:07:31.770193 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:07:31.770140 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-dm8dc" podStartSLOduration=1.7284892790000002 podStartE2EDuration="18.770128083s" podCreationTimestamp="2026-04-20 15:07:13 +0000 UTC" firstStartedPulling="2026-04-20 15:07:14.297223458 +0000 UTC m=+717.953271738" lastFinishedPulling="2026-04-20 15:07:31.338862258 +0000 UTC m=+734.994910542" observedRunningTime="2026-04-20 15:07:31.769401647 +0000 UTC m=+735.425449975" watchObservedRunningTime="2026-04-20 15:07:31.770128083 +0000 UTC m=+735.426176385" Apr 20 15:07:31.807814 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:07:31.807767 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-thgh7" podStartSLOduration=1.151364016 podStartE2EDuration="16.807755094s" podCreationTimestamp="2026-04-20 15:07:15 +0000 UTC" firstStartedPulling="2026-04-20 15:07:15.669193232 +0000 UTC m=+719.325241520" lastFinishedPulling="2026-04-20 15:07:31.325584318 +0000 UTC m=+734.981632598" observedRunningTime="2026-04-20 
15:07:31.80586739 +0000 UTC m=+735.461915693" watchObservedRunningTime="2026-04-20 15:07:31.807755094 +0000 UTC m=+735.463803395" Apr 20 15:07:42.132548 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:07:42.132520 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"] Apr 20 15:07:42.758435 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:07:42.758410 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-2-simulated-kserve-8454f99c75-dm8dc" Apr 20 15:07:42.759198 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:07:42.759178 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-simulated-kserve-69d7bf476b-thgh7" Apr 20 15:08:04.373418 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:08:04.373383 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"] Apr 20 15:08:51.926561 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:08:51.926528 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"] Apr 20 15:09:00.527651 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:09:00.527618 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"] Apr 20 15:09:31.330696 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:09:31.330663 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"] Apr 20 15:09:46.430714 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:09:46.430684 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"] Apr 20 15:10:16.898340 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:10:16.898284 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9x87_f6944a1f-03f8-4115-899e-e5c61d0d6075/ovn-acl-logging/0.log" Apr 20 15:10:16.898901 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:10:16.898881 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9x87_f6944a1f-03f8-4115-899e-e5c61d0d6075/ovn-acl-logging/0.log" Apr 20 15:10:24.731095 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:10:24.731064 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"] Apr 20 15:10:41.934784 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:10:41.934735 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"] Apr 20 15:10:56.225095 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:10:56.225053 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"] Apr 20 15:11:11.132036 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:11:11.131996 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"] Apr 20 15:11:15.031600 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:11:15.031563 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"] Apr 20 15:11:36.739326 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:11:36.739235 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"] Apr 20 15:11:42.329776 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:11:42.329733 2570 kubelet.go:2544] "SyncLoop UPDATE" 
source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"] Apr 20 15:12:04.441804 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:12:04.441767 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"] Apr 20 15:12:13.236954 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:12:13.236916 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"] Apr 20 15:12:29.594435 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:12:29.594399 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"] Apr 20 15:12:38.834264 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:12:38.834228 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"] Apr 20 15:12:56.128164 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:12:56.128128 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"] Apr 20 15:13:03.981260 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:13:03.981222 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"] Apr 20 15:13:36.325529 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:13:36.325488 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"] Apr 20 15:13:45.327858 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:13:45.327818 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"] Apr 20 15:13:53.529762 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:13:53.529718 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"] Apr 20 15:14:02.231686 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:14:02.231645 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"] Apr 20 15:14:10.732257 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:14:10.732220 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"] Apr 20 15:14:27.323493 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:14:27.323459 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"] Apr 20 15:14:37.931268 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:14:37.931183 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"] Apr 20 15:15:16.927612 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:15:16.927585 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9x87_f6944a1f-03f8-4115-899e-e5c61d0d6075/ovn-acl-logging/0.log" Apr 20 15:15:16.928616 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:15:16.928598 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9x87_f6944a1f-03f8-4115-899e-e5c61d0d6075/ovn-acl-logging/0.log" Apr 20 15:15:25.925866 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:15:25.925826 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"] Apr 20 15:15:34.140431 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:15:34.140390 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"] Apr 20 15:15:42.328518 
Apr 20 15:15:50.838289 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:15:50.838251 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"]
Apr 20 15:16:00.036462 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:16:00.036424 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"]
Apr 20 15:16:08.931970 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:16:08.931882 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"]
Apr 20 15:16:17.937819 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:16:17.937780 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"]
Apr 20 15:16:26.237723 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:16:26.237686 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"]
Apr 20 15:16:35.432716 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:16:35.432680 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"]
Apr 20 15:16:44.230295 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:16:44.230257 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"]
Apr 20 15:16:53.432282 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:16:53.432245 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"]
Apr 20 15:17:01.139806 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:17:01.139766 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"]
Apr 20 15:17:10.338012 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:17:10.337973 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"]
Apr 20 15:17:18.829953 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:17:18.829914 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"]
Apr 20 15:17:28.531696 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:17:28.531650 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"]
Apr 20 15:17:36.239328 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:17:36.239232 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"]
Apr 20 15:17:45.534653 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:17:45.534618 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"]
Apr 20 15:17:53.539040 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:17:53.538990 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"]
Apr 20 15:19:04.034596 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:19:04.034556 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"]
Apr 20 15:19:08.640428 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:19:08.640345 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"]
Apr 20 15:19:18.936086 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:19:18.936045 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"]
Apr 20 15:19:49.725929 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:19:49.725892 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"]
Apr 20 15:20:16.960682 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:20:16.960653 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9x87_f6944a1f-03f8-4115-899e-e5c61d0d6075/ovn-acl-logging/0.log"
Apr 20 15:20:16.962141 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:20:16.962119 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9x87_f6944a1f-03f8-4115-899e-e5c61d0d6075/ovn-acl-logging/0.log"
Apr 20 15:20:32.427835 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:20:32.427796 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"]
Apr 20 15:20:40.737795 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:20:40.737698 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"]
Apr 20 15:20:49.336576 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:20:49.336538 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"]
Apr 20 15:20:57.537156 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:20:57.537119 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"]
Apr 20 15:21:06.826561 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:21:06.826512 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"]
Apr 20 15:21:15.235584 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:21:15.235540 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"]
Apr 20 15:21:23.526903 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:21:23.526867 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"]
Apr 20 15:21:31.829890 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:21:31.829842 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"]
Apr 20 15:21:40.924761 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:21:40.924719 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"]
Apr 20 15:21:49.026530 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:21:49.026480 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"]
Apr 20 15:21:57.036330 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:21:57.036268 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"]
Apr 20 15:22:05.620515 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:22:05.620471 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"]
Apr 20 15:22:22.432783 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:22:22.432742 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"]
Apr 20 15:22:31.223629 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:22:31.223587 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"]
source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"] Apr 20 15:22:39.728193 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:22:39.728144 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"] Apr 20 15:22:48.429295 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:22:48.429254 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"] Apr 20 15:23:05.624348 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:23:05.624313 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"] Apr 20 15:23:13.821448 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:23:13.821412 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"] Apr 20 15:23:22.033543 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:23:22.033509 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"] Apr 20 15:23:30.631361 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:23:30.631320 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"] Apr 20 15:23:39.940394 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:23:39.940315 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"] Apr 20 15:23:47.933813 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:23:47.933772 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"] Apr 20 15:23:58.824649 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:23:58.824613 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"] Apr 20 15:24:13.725988 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:24:13.725946 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"] Apr 20 15:24:23.026385 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:24:23.026342 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"] Apr 20 15:24:40.634527 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:24:40.634475 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"] Apr 20 15:24:49.140470 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:24:49.140427 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"] Apr 20 15:24:57.128124 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:24:57.128087 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"] Apr 20 15:25:05.335566 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:25:05.335529 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"] Apr 20 15:25:13.822507 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:25:13.822471 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"] Apr 20 15:25:16.993032 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:25:16.992999 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9x87_f6944a1f-03f8-4115-899e-e5c61d0d6075/ovn-acl-logging/0.log" Apr 20 15:25:16.995084 ip-10-0-133-163 kubenswrapper[2570]: I0420 
Apr 20 15:25:30.629378 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:25:30.629335 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"]
Apr 20 15:25:39.229969 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:25:39.229931 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"]
Apr 20 15:25:47.337042 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:25:47.336995 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"]
Apr 20 15:25:55.132536 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:25:55.132496 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"]
Apr 20 15:26:18.826866 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:26:18.826830 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"]
Apr 20 15:26:31.621816 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:26:31.621781 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-pn2zj"]
Apr 20 15:29:17.689187 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:17.689154 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-f5bbffd8f-km7qf_c5aa3aa8-6607-4f7f-8c1f-96b110395961/maas-api/0.log"
Apr 20 15:29:18.164615 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:18.164581 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-99ff97f7d-hxndh_ad8ba0ee-5509-4f32-96e5-6b0de0b47177/manager/0.log"
Apr 20 15:29:19.886663 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:19.886630 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-b5xb7_ab365834-adc7-4d59-9a5c-c2c739687f62/manager/0.log"
Apr 20 15:29:20.101183 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:20.101148 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-47b2b_ccdfb83f-9afc-4949-8be1-324eb63e1b9a/registry-server/0.log"
Apr 20 15:29:20.214682 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:20.214604 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6bc9f4c76f-hhl49_168e11ea-36bd-4438-8e61-adfa38f077a3/manager/0.log"
Apr 20 15:29:20.321241 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:20.321215 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-pn2zj_3b11d86c-92f4-4313-a3a6-421820d49702/limitador/0.log"
Apr 20 15:29:20.437383 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:20.437354 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-bvn5v_fa7a90a9-4a60-41f1-9bf6-51d11c2214b0/manager/0.log"
Apr 20 15:29:20.778418 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:20.778384 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-5cb8b776cf8zr2h_04721783-0eb5-4a58-b550-5e18f6b0d95d/istio-proxy/0.log"
Apr 20 15:29:21.235070 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:21.234992 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-58b6f876-7n9cp_f82db5b3-f3e6-43f9-bb28-f03c4db40696/istio-proxy/0.log"
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-58b6f876-7n9cp_f82db5b3-f3e6-43f9-bb28-f03c4db40696/istio-proxy/0.log" Apr 20 15:29:21.346194 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:21.346167 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-67d496fbdd-b9cws_95d5e76e-7689-4825-ba12-28c88aebccda/router/0.log" Apr 20 15:29:21.674606 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:21.674574 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-8454f99c75-dm8dc_8edfbca3-f1d1-4f45-91b3-ee182252838c/storage-initializer/0.log" Apr 20 15:29:21.681233 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:21.681196 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-8454f99c75-dm8dc_8edfbca3-f1d1-4f45-91b3-ee182252838c/main/0.log" Apr 20 15:29:21.792383 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:21.792347 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-69d7bf476b-thgh7_58bd1ef8-316e-4000-9e0e-5f678b3ff521/storage-initializer/0.log" Apr 20 15:29:21.798560 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:21.798536 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-69d7bf476b-thgh7_58bd1ef8-316e-4000-9e0e-5f678b3ff521/main/0.log" Apr 20 15:29:29.009852 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:29.009823 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-cnpgl_8ff97bcf-86b2-437d-aad6-c51eae0b40b1/global-pull-secret-syncer/0.log" Apr 20 15:29:29.092077 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:29.092049 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-9vtqc_50d12be2-afb1-4257-895a-8f2eed4865c3/konnectivity-agent/0.log" Apr 20 15:29:29.194659 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:29.194633 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-133-163.ec2.internal_8147dca2f1846ffe58ac40c8a9cdfc0b/haproxy/0.log" Apr 20 15:29:33.575025 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:33.574983 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-b5xb7_ab365834-adc7-4d59-9a5c-c2c739687f62/manager/0.log" Apr 20 15:29:33.643628 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:33.643540 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-47b2b_ccdfb83f-9afc-4949-8be1-324eb63e1b9a/registry-server/0.log" Apr 20 15:29:33.748055 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:33.748011 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6bc9f4c76f-hhl49_168e11ea-36bd-4438-8e61-adfa38f077a3/manager/0.log" Apr 20 15:29:33.767789 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:33.767759 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-pn2zj_3b11d86c-92f4-4313-a3a6-421820d49702/limitador/0.log" Apr 20 15:29:33.821236 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:33.821206 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-bvn5v_fa7a90a9-4a60-41f1-9bf6-51d11c2214b0/manager/0.log" Apr 20 15:29:35.153508 
Apr 20 15:29:35.174251 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:35.174225 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb/config-reloader/0.log"
Apr 20 15:29:35.199022 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:35.198998 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb/kube-rbac-proxy-web/0.log"
Apr 20 15:29:35.220393 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:35.220368 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb/kube-rbac-proxy/0.log"
Apr 20 15:29:35.242090 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:35.242064 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb/kube-rbac-proxy-metric/0.log"
Apr 20 15:29:35.262008 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:35.261985 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb/prom-label-proxy/0.log"
Apr 20 15:29:35.284583 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:35.284562 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_f7f08eba-59dc-40e3-8dc6-9ab29e1c1fbb/init-config-reloader/0.log"
Apr 20 15:29:35.346216 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:35.346180 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-rt627_92d39451-f35b-4d2a-88da-a4769e1eaae5/kube-state-metrics/0.log"
Apr 20 15:29:35.365529 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:35.365504 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-rt627_92d39451-f35b-4d2a-88da-a4769e1eaae5/kube-rbac-proxy-main/0.log"
Apr 20 15:29:35.386658 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:35.386608 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-rt627_92d39451-f35b-4d2a-88da-a4769e1eaae5/kube-rbac-proxy-self/0.log"
Apr 20 15:29:35.411719 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:35.411624 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-6d767b4bfd-nqwbp_77d517a8-5191-4606-8bdd-d236599c3b5b/metrics-server/0.log"
Apr 20 15:29:35.620708 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:35.620684 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-sb9k2_937a5c5b-de08-42bb-9cb1-0086ff30299e/node-exporter/0.log"
Apr 20 15:29:35.643785 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:35.643753 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-sb9k2_937a5c5b-de08-42bb-9cb1-0086ff30299e/kube-rbac-proxy/0.log"
Apr 20 15:29:35.665339 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:35.665239 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-sb9k2_937a5c5b-de08-42bb-9cb1-0086ff30299e/init-textfile/0.log"
Apr 20 15:29:35.691145 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:35.691120 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-gqhb4_14881dac-c7a8-45ea-bd59-230b8e9811af/kube-rbac-proxy-main/0.log"
Apr 20 15:29:35.709273 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:35.709245 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-gqhb4_14881dac-c7a8-45ea-bd59-230b8e9811af/kube-rbac-proxy-self/0.log"
Apr 20 15:29:35.728062 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:35.728038 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-gqhb4_14881dac-c7a8-45ea-bd59-230b8e9811af/openshift-state-metrics/0.log"
Apr 20 15:29:35.763251 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:35.763216 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_948924e3-21b3-473a-9035-d51cb2d5f65e/prometheus/0.log"
Apr 20 15:29:35.783153 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:35.783120 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_948924e3-21b3-473a-9035-d51cb2d5f65e/config-reloader/0.log"
Apr 20 15:29:35.805331 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:35.805260 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_948924e3-21b3-473a-9035-d51cb2d5f65e/thanos-sidecar/0.log"
Apr 20 15:29:35.825721 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:35.825693 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_948924e3-21b3-473a-9035-d51cb2d5f65e/kube-rbac-proxy-web/0.log"
Apr 20 15:29:35.854284 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:35.854251 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_948924e3-21b3-473a-9035-d51cb2d5f65e/kube-rbac-proxy/0.log"
Apr 20 15:29:35.878425 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:35.878385 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_948924e3-21b3-473a-9035-d51cb2d5f65e/kube-rbac-proxy-thanos/0.log"
Apr 20 15:29:35.901439 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:35.901415 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_948924e3-21b3-473a-9035-d51cb2d5f65e/init-config-reloader/0.log"
Apr 20 15:29:35.925994 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:35.925905 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-scrln_7d4fa613-c16b-4696-a031-8643149ab3a6/prometheus-operator/0.log"
Apr 20 15:29:35.943363 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:35.943331 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-scrln_7d4fa613-c16b-4696-a031-8643149ab3a6/kube-rbac-proxy/0.log"
Apr 20 15:29:35.975221 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:35.975163 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-k7lvd_368b3ec8-ae66-4571-9538-90802a0710c3/prometheus-operator-admission-webhook/0.log"
Apr 20 15:29:36.001787 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:36.001764 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-76848cdc47-r5v66_ae298d00-8803-4cc4-8ff7-acfdc73593c0/telemeter-client/0.log"
path="/var/log/pods/openshift-monitoring_telemeter-client-76848cdc47-r5v66_ae298d00-8803-4cc4-8ff7-acfdc73593c0/telemeter-client/0.log" Apr 20 15:29:36.021343 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:36.021298 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-76848cdc47-r5v66_ae298d00-8803-4cc4-8ff7-acfdc73593c0/reload/0.log" Apr 20 15:29:36.041672 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:36.041651 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-76848cdc47-r5v66_ae298d00-8803-4cc4-8ff7-acfdc73593c0/kube-rbac-proxy/0.log" Apr 20 15:29:36.073444 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:36.073416 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-78d4476d99-54g5t_7bef6867-224c-4525-bca7-c1f04fe94c83/thanos-query/0.log" Apr 20 15:29:36.095134 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:36.095110 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-78d4476d99-54g5t_7bef6867-224c-4525-bca7-c1f04fe94c83/kube-rbac-proxy-web/0.log" Apr 20 15:29:36.115256 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:36.115221 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-78d4476d99-54g5t_7bef6867-224c-4525-bca7-c1f04fe94c83/kube-rbac-proxy/0.log" Apr 20 15:29:36.134960 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:36.134936 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-78d4476d99-54g5t_7bef6867-224c-4525-bca7-c1f04fe94c83/prom-label-proxy/0.log" Apr 20 15:29:36.154565 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:36.154530 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-78d4476d99-54g5t_7bef6867-224c-4525-bca7-c1f04fe94c83/kube-rbac-proxy-rules/0.log" Apr 20 15:29:36.180022 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:36.179934 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-78d4476d99-54g5t_7bef6867-224c-4525-bca7-c1f04fe94c83/kube-rbac-proxy-metrics/0.log" Apr 20 15:29:37.855210 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:37.855177 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-8mfh8/perf-node-gather-daemonset-kcqq9"] Apr 20 15:29:37.859080 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:37.859057 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-8mfh8/perf-node-gather-daemonset-kcqq9" Apr 20 15:29:37.862698 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:37.862668 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-8mfh8\"/\"openshift-service-ca.crt\"" Apr 20 15:29:37.862842 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:37.862762 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-8mfh8\"/\"default-dockercfg-xm2g5\"" Apr 20 15:29:37.862842 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:37.862795 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-8mfh8\"/\"kube-root-ca.crt\"" Apr 20 15:29:37.869387 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:37.869360 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8mfh8/perf-node-gather-daemonset-kcqq9"] Apr 20 15:29:37.957519 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:37.957481 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/f65f59cc-dc4b-45e7-a0f3-d58f09436d1b-podres\") pod \"perf-node-gather-daemonset-kcqq9\" (UID: \"f65f59cc-dc4b-45e7-a0f3-d58f09436d1b\") " pod="openshift-must-gather-8mfh8/perf-node-gather-daemonset-kcqq9" Apr 20 15:29:37.957681 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:37.957539 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f65f59cc-dc4b-45e7-a0f3-d58f09436d1b-lib-modules\") pod \"perf-node-gather-daemonset-kcqq9\" (UID: \"f65f59cc-dc4b-45e7-a0f3-d58f09436d1b\") " pod="openshift-must-gather-8mfh8/perf-node-gather-daemonset-kcqq9" Apr 20 15:29:37.957681 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:37.957608 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f65f59cc-dc4b-45e7-a0f3-d58f09436d1b-sys\") pod \"perf-node-gather-daemonset-kcqq9\" (UID: \"f65f59cc-dc4b-45e7-a0f3-d58f09436d1b\") " pod="openshift-must-gather-8mfh8/perf-node-gather-daemonset-kcqq9" Apr 20 15:29:37.957878 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:37.957677 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/f65f59cc-dc4b-45e7-a0f3-d58f09436d1b-proc\") pod \"perf-node-gather-daemonset-kcqq9\" (UID: \"f65f59cc-dc4b-45e7-a0f3-d58f09436d1b\") " pod="openshift-must-gather-8mfh8/perf-node-gather-daemonset-kcqq9" Apr 20 15:29:37.957878 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:37.957742 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6l88\" (UniqueName: \"kubernetes.io/projected/f65f59cc-dc4b-45e7-a0f3-d58f09436d1b-kube-api-access-t6l88\") pod \"perf-node-gather-daemonset-kcqq9\" (UID: \"f65f59cc-dc4b-45e7-a0f3-d58f09436d1b\") " pod="openshift-must-gather-8mfh8/perf-node-gather-daemonset-kcqq9" Apr 20 15:29:38.058481 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:38.058441 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/f65f59cc-dc4b-45e7-a0f3-d58f09436d1b-proc\") pod \"perf-node-gather-daemonset-kcqq9\" (UID: \"f65f59cc-dc4b-45e7-a0f3-d58f09436d1b\") " 
pod="openshift-must-gather-8mfh8/perf-node-gather-daemonset-kcqq9" Apr 20 15:29:38.058672 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:38.058500 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t6l88\" (UniqueName: \"kubernetes.io/projected/f65f59cc-dc4b-45e7-a0f3-d58f09436d1b-kube-api-access-t6l88\") pod \"perf-node-gather-daemonset-kcqq9\" (UID: \"f65f59cc-dc4b-45e7-a0f3-d58f09436d1b\") " pod="openshift-must-gather-8mfh8/perf-node-gather-daemonset-kcqq9" Apr 20 15:29:38.058672 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:38.058557 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/f65f59cc-dc4b-45e7-a0f3-d58f09436d1b-podres\") pod \"perf-node-gather-daemonset-kcqq9\" (UID: \"f65f59cc-dc4b-45e7-a0f3-d58f09436d1b\") " pod="openshift-must-gather-8mfh8/perf-node-gather-daemonset-kcqq9" Apr 20 15:29:38.058672 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:38.058572 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/f65f59cc-dc4b-45e7-a0f3-d58f09436d1b-proc\") pod \"perf-node-gather-daemonset-kcqq9\" (UID: \"f65f59cc-dc4b-45e7-a0f3-d58f09436d1b\") " pod="openshift-must-gather-8mfh8/perf-node-gather-daemonset-kcqq9" Apr 20 15:29:38.058672 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:38.058589 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f65f59cc-dc4b-45e7-a0f3-d58f09436d1b-lib-modules\") pod \"perf-node-gather-daemonset-kcqq9\" (UID: \"f65f59cc-dc4b-45e7-a0f3-d58f09436d1b\") " pod="openshift-must-gather-8mfh8/perf-node-gather-daemonset-kcqq9" Apr 20 15:29:38.058823 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:38.058671 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f65f59cc-dc4b-45e7-a0f3-d58f09436d1b-sys\") pod \"perf-node-gather-daemonset-kcqq9\" (UID: \"f65f59cc-dc4b-45e7-a0f3-d58f09436d1b\") " pod="openshift-must-gather-8mfh8/perf-node-gather-daemonset-kcqq9" Apr 20 15:29:38.058823 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:38.058707 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/f65f59cc-dc4b-45e7-a0f3-d58f09436d1b-podres\") pod \"perf-node-gather-daemonset-kcqq9\" (UID: \"f65f59cc-dc4b-45e7-a0f3-d58f09436d1b\") " pod="openshift-must-gather-8mfh8/perf-node-gather-daemonset-kcqq9" Apr 20 15:29:38.058823 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:38.058706 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f65f59cc-dc4b-45e7-a0f3-d58f09436d1b-lib-modules\") pod \"perf-node-gather-daemonset-kcqq9\" (UID: \"f65f59cc-dc4b-45e7-a0f3-d58f09436d1b\") " pod="openshift-must-gather-8mfh8/perf-node-gather-daemonset-kcqq9" Apr 20 15:29:38.058823 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:38.058762 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f65f59cc-dc4b-45e7-a0f3-d58f09436d1b-sys\") pod \"perf-node-gather-daemonset-kcqq9\" (UID: \"f65f59cc-dc4b-45e7-a0f3-d58f09436d1b\") " pod="openshift-must-gather-8mfh8/perf-node-gather-daemonset-kcqq9" Apr 20 15:29:38.066480 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:38.066454 2570 operation_generator.go:615] "MountVolume.SetUp 
Apr 20 15:29:38.170597 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:38.170497 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-8mfh8/perf-node-gather-daemonset-kcqq9"
Apr 20 15:29:38.243841 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:38.243812 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-tjwf7_764028e9-7c8b-4dc6-9392-a96033a5f59d/download-server/0.log"
Apr 20 15:29:38.309829 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:38.309800 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-8mfh8/perf-node-gather-daemonset-kcqq9"]
Apr 20 15:29:38.311836 ip-10-0-133-163 kubenswrapper[2570]: W0420 15:29:38.311812 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podf65f59cc_dc4b_45e7_a0f3_d58f09436d1b.slice/crio-5747ba523dffd6e2523556a8b2f20d069d85b9cdeaf243c3e9f1335bc1bdea5f WatchSource:0}: Error finding container 5747ba523dffd6e2523556a8b2f20d069d85b9cdeaf243c3e9f1335bc1bdea5f: Status 404 returned error can't find the container with id 5747ba523dffd6e2523556a8b2f20d069d85b9cdeaf243c3e9f1335bc1bdea5f
Apr 20 15:29:38.313703 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:38.313687 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 15:29:38.755464 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:38.755423 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-dt42z_a319ad70-d05f-4ed2-a056-f1fe50c202fc/volume-data-source-validator/0.log"
Apr 20 15:29:38.777471 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:38.777432 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8mfh8/perf-node-gather-daemonset-kcqq9" event={"ID":"f65f59cc-dc4b-45e7-a0f3-d58f09436d1b","Type":"ContainerStarted","Data":"859032d9aea95818900da6f197e04f5950d205e8c0e86c440cabc28401a14dd4"}
Apr 20 15:29:38.777471 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:38.777474 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-8mfh8/perf-node-gather-daemonset-kcqq9" event={"ID":"f65f59cc-dc4b-45e7-a0f3-d58f09436d1b","Type":"ContainerStarted","Data":"5747ba523dffd6e2523556a8b2f20d069d85b9cdeaf243c3e9f1335bc1bdea5f"}
Apr 20 15:29:38.777710 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:38.777571 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-8mfh8/perf-node-gather-daemonset-kcqq9"
Apr 20 15:29:38.794815 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:38.794764 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-8mfh8/perf-node-gather-daemonset-kcqq9" podStartSLOduration=1.794748186 podStartE2EDuration="1.794748186s" podCreationTimestamp="2026-04-20 15:29:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 15:29:38.79342525 +0000 UTC m=+2062.449473553" watchObservedRunningTime="2026-04-20 15:29:38.794748186 +0000 UTC m=+2062.450796511"
Apr 20 15:29:39.577868 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:39.577839 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-jbtsc_7c674737-9de4-4df3-8cd4-de9165e6e70a/dns/0.log"
Apr 20 15:29:39.596248 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:39.596224 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-jbtsc_7c674737-9de4-4df3-8cd4-de9165e6e70a/kube-rbac-proxy/0.log"
Apr 20 15:29:39.705095 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:39.705068 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-zb7gn_f6c014f8-befe-4916-a8ed-bc592d3baacf/dns-node-resolver/0.log"
Apr 20 15:29:40.163471 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:40.163424 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-h82ph_3661ad3f-53ca-47ec-8a9b-15e3d3f054bd/node-ca/0.log"
Apr 20 15:29:40.963715 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:40.963678 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-5cb8b776cf8zr2h_04721783-0eb5-4a58-b550-5e18f6b0d95d/istio-proxy/0.log"
Apr 20 15:29:41.255743 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:41.255639 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-58b6f876-7n9cp_f82db5b3-f3e6-43f9-bb28-f03c4db40696/istio-proxy/0.log"
Apr 20 15:29:41.279707 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:41.279681 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-67d496fbdd-b9cws_95d5e76e-7689-4825-ba12-28c88aebccda/router/0.log"
Apr 20 15:29:41.823163 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:41.823134 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-9dz42_7fe57737-4cb8-41e4-95a2-77878dc0e909/serve-healthcheck-canary/0.log"
Apr 20 15:29:42.353015 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:42.352986 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-gktqt_07670bb4-e63a-4c79-930c-288b4bffcda3/kube-rbac-proxy/0.log"
Apr 20 15:29:42.371421 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:42.371398 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-gktqt_07670bb4-e63a-4c79-930c-288b4bffcda3/exporter/0.log"
Apr 20 15:29:42.391023 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:42.390999 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-gktqt_07670bb4-e63a-4c79-930c-288b4bffcda3/extractor/0.log"
Apr 20 15:29:44.423068 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:44.423035 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-f5bbffd8f-km7qf_c5aa3aa8-6607-4f7f-8c1f-96b110395961/maas-api/0.log"
Apr 20 15:29:44.596228 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:44.596193 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-99ff97f7d-hxndh_ad8ba0ee-5509-4f32-96e5-6b0de0b47177/manager/0.log"
Apr 20 15:29:44.792575 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:44.792541 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-8mfh8/perf-node-gather-daemonset-kcqq9"
Apr 20 15:29:45.869489 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:45.869465 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-6687ffb5c6-v6xtg_6660b740-f666-473b-a2c9-b5ced164f05b/manager/0.log"
Apr 20 15:29:51.742189 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:51.742157 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lstvb_4aa972e0-3242-4b0c-87e7-b4ebc421bbce/kube-multus-additional-cni-plugins/0.log"
Apr 20 15:29:51.762724 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:51.762688 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lstvb_4aa972e0-3242-4b0c-87e7-b4ebc421bbce/egress-router-binary-copy/0.log"
Apr 20 15:29:51.786180 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:51.786149 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lstvb_4aa972e0-3242-4b0c-87e7-b4ebc421bbce/cni-plugins/0.log"
Apr 20 15:29:51.810332 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:51.810280 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lstvb_4aa972e0-3242-4b0c-87e7-b4ebc421bbce/bond-cni-plugin/0.log"
Apr 20 15:29:51.829932 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:51.829907 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lstvb_4aa972e0-3242-4b0c-87e7-b4ebc421bbce/routeoverride-cni/0.log"
Apr 20 15:29:51.849908 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:51.849881 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lstvb_4aa972e0-3242-4b0c-87e7-b4ebc421bbce/whereabouts-cni-bincopy/0.log"
Apr 20 15:29:51.868660 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:51.868639 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-lstvb_4aa972e0-3242-4b0c-87e7-b4ebc421bbce/whereabouts-cni/0.log"
Apr 20 15:29:51.900257 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:51.900235 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-chk28_2db4df8a-cdb6-4503-9793-bc14f5983e3e/kube-multus/0.log"
Apr 20 15:29:52.054013 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:52.053985 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-sjhzf_6ae6c334-21b5-4f64-b2c3-68f797cd363b/network-metrics-daemon/0.log"
Apr 20 15:29:52.072182 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:52.072140 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-sjhzf_6ae6c334-21b5-4f64-b2c3-68f797cd363b/kube-rbac-proxy/0.log"
Apr 20 15:29:53.115690 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:53.115661 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9x87_f6944a1f-03f8-4115-899e-e5c61d0d6075/ovn-controller/0.log"
Apr 20 15:29:53.134886 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:53.134856 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9x87_f6944a1f-03f8-4115-899e-e5c61d0d6075/ovn-acl-logging/0.log"
Apr 20 15:29:53.145763 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:53.145738 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9x87_f6944a1f-03f8-4115-899e-e5c61d0d6075/ovn-acl-logging/1.log"
Apr 20 15:29:53.161185 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:53.161154 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9x87_f6944a1f-03f8-4115-899e-e5c61d0d6075/kube-rbac-proxy-node/0.log"
Apr 20 15:29:53.184015 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:53.183981 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9x87_f6944a1f-03f8-4115-899e-e5c61d0d6075/kube-rbac-proxy-ovn-metrics/0.log"
Apr 20 15:29:53.203082 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:53.203058 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9x87_f6944a1f-03f8-4115-899e-e5c61d0d6075/northd/0.log"
Apr 20 15:29:53.221630 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:53.221603 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9x87_f6944a1f-03f8-4115-899e-e5c61d0d6075/nbdb/0.log"
Apr 20 15:29:53.240549 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:53.240525 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9x87_f6944a1f-03f8-4115-899e-e5c61d0d6075/sbdb/0.log"
Apr 20 15:29:53.338512 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:53.338477 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-g9x87_f6944a1f-03f8-4115-899e-e5c61d0d6075/ovnkube-controller/0.log"
Apr 20 15:29:54.715318 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:54.715282 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-d4wt8_dbe6bf00-4b0b-4432-80f4-1085e83c9110/network-check-target-container/0.log"
Apr 20 15:29:55.839451 ip-10-0-133-163 kubenswrapper[2570]: I0420 15:29:55.839421 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-s775f_db4252bf-5e13-4727-a83a-7f87874cf5c4/iptables-alerter/0.log"